WorldWideScience

Sample records for bivariate normal distribution

  1. A note on finding peakedness in bivariate normal distribution using Mathematica

    Directory of Open Access Journals (Sweden)

    Anwer Khurshid

    2007-07-01

    Full Text Available Peakedness measures the concentration of a distribution around its central value. The classical measure of peakedness is kurtosis, the degree of peakedness of a probability distribution. In view of the inconsistency of kurtosis in measuring peakedness, Horn (1983) proposed a measure of peakedness for symmetric unimodal distributions. The objective of this paper is two-fold: first, to extend Horn’s method to the bivariate normal distribution; secondly, to show that the computer algebra system Mathematica can be an extremely useful tool for all sorts of computation related to the bivariate normal distribution. Mathematica programs are also provided.

  2. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the

  3. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    Science.gov (United States)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired values of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
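
    A minimal Python sketch of the general idea (not the paper's FORTRAN routine): two independent standard normal variates are transformed so that the resulting pair has the requested means, standard deviations, and correlation coefficient.

    ```python
    import numpy as np

    def bivariate_normal_pairs(n, mu1, mu2, sigma1, sigma2, rho, seed=None):
        """Generate n correlated pairs (X, Y) from a bivariate normal distribution.

        Uses the standard transformation of independent standard normals:
        X = mu1 + sigma1*Z1,  Y = mu2 + sigma2*(rho*Z1 + sqrt(1 - rho**2)*Z2).
        """
        rng = np.random.default_rng(seed)
        z1 = rng.standard_normal(n)
        z2 = rng.standard_normal(n)
        x = mu1 + sigma1 * z1
        y = mu2 + sigma2 * (rho * z1 + np.sqrt(1.0 - rho**2) * z2)
        return x, y

    # Quick check: the sample correlation should be close to the requested rho.
    x, y = bivariate_normal_pairs(100_000, 1.0, -2.0, 2.0, 0.5, rho=0.8, seed=42)
    print(np.corrcoef(x, y)[0, 1])   # ~0.8
    ```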

  4. Probabilistic modeling using bivariate normal distributions for identification of flow and displacement intervals in longwall overburden

    Energy Technology Data Exchange (ETDEWEB)

    Karacan, C.O.; Goodman, G.V.R. [NIOSH, Pittsburgh, PA (United States). Off Mine Safety & Health Research

    2011-01-15

    Gob gas ventholes (GGV) are used to control methane emissions in longwall mines by capturing methane within the overlying fractured strata before it enters the work environment. In order for GGVs to effectively capture more methane and less mine air, the length of the slotted sections and their proximity to the top of the coal bed should be designed based on the potential gas sources and their locations, as well as the displacements in the overburden that will create potential flow paths for the gas. In this paper, an approach to determine the conditional probabilities of depth-displacement, depth-flow percentage, depth-formation and depth-gas content of the formations was developed using bivariate normal distributions. The flow percentage, displacement and formation data as a function of distance from the coal bed used in this study were obtained from a series of borehole experiments contracted by the former US Bureau of Mines as part of a research project. Each of these parameters was tested for normality and was modeled using bivariate normal distributions to determine all tail probabilities. In addition, the probability of coal bed gas content as a function of depth was determined using the same techniques. The tail probabilities at various depths were used to calculate conditional probabilities for each of the parameters. The conditional probabilities predicted for various values of the critical parameters can be used with the measurements of flow and methane percentage at gob gas ventholes to optimize their performance.
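
    As an illustration of the kind of tail and conditional probabilities described above, the sketch below evaluates them for a hypothetical bivariate normal model of depth and displacement; the means, standard deviations, correlation and thresholds are invented for the example and are not the paper's borehole data.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal, norm

    # Hypothetical bivariate normal model for (depth, displacement); all numbers
    # below are illustrative and are not taken from the paper.
    mu = np.array([150.0, 2.0])            # mean depth (m), mean displacement (cm)
    sd = np.array([60.0, 0.8])
    rho = -0.6
    cov = np.array([[sd[0]**2, rho * sd[0] * sd[1]],
                    [rho * sd[0] * sd[1], sd[1]**2]])
    bvn = multivariate_normal(mean=mu, cov=cov)

    d0, u0 = 100.0, 2.5                    # thresholds: depth <= d0, displacement > u0

    p_depth = norm.cdf(d0, loc=mu[0], scale=sd[0])   # marginal P(depth <= d0)
    p_joint = p_depth - bvn.cdf([d0, u0])            # P(depth <= d0, displacement > u0)
    p_cond = p_joint / p_depth                       # P(displacement > u0 | depth <= d0)
    print(p_joint, p_cond)
    ```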

  5. Bivariate Rayleigh Distribution and its Properties

    Directory of Open Access Journals (Sweden)

    Ahmad Saeed Akhter

    2007-01-01

    Full Text Available Rayleigh (1880) observed that sea waves follow no law because of the complexities of the sea, but it has been seen that the probability distributions of wave heights, wave lengths, wave-induced pitch, and the heave motions of ships follow the Rayleigh distribution. At present, several different quantities are in use for describing the state of the sea; for example, the mean height of the waves, the root mean square height, the height of the “significant waves” (the mean height of the highest one-third of all the waves), the maximum height over a given interval of time, and so on. At present, the shipbuilding industry knows less than any other construction industry about the service conditions under which its products must operate. Only small efforts have been made to establish the stresses and motions and to incorporate the results of such studies into design. This is due to the complexity of the problem caused by the extensive variability of the sea and the corresponding response of the ships. Although the problem appears complex, it is possible to predict service conditions for ships in an orderly and relatively simple manner. Rayleigh (1880) derived this distribution from the amplitude of sound resulting from many independent sources. This distribution is also connected with random walks in one or two dimensions and is sometimes referred to as the “random walk” frequency distribution. The Rayleigh distribution can be derived from the bivariate normal distribution when the variates are independent and random with equal variances. We construct a bivariate Rayleigh distribution with Rayleigh marginal distribution functions and discuss its fundamental properties.
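
    A short simulation check (not from the paper) of the statement above: the radius of a zero-mean bivariate normal with independent, equal-variance components follows a Rayleigh distribution.

    ```python
    import numpy as np
    from scipy.stats import rayleigh, kstest

    # If X and Y are independent N(0, sigma^2) components of a bivariate normal,
    # then R = sqrt(X^2 + Y^2) is Rayleigh distributed with scale sigma.
    rng = np.random.default_rng(0)
    sigma = 2.0
    x = rng.normal(0.0, sigma, size=200_000)
    y = rng.normal(0.0, sigma, size=200_000)
    r = np.hypot(x, y)

    print(r.mean(), sigma * np.sqrt(np.pi / 2))   # empirical vs theoretical mean
    print(kstest(r, rayleigh(scale=sigma).cdf))   # KS test should not reject
    ```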

  6. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...

  7. Newton Leibniz integration for ket-bra operators in quantum mechanics (V)—Deriving normally ordered bivariate-normal-distribution form of density operators and developing their phase space formalism

    Science.gov (United States)

    Fan, Hong-yi

    2008-06-01

    We show that Newton-Leibniz integration over Dirac's ket-bra projection operators with continuum variables, which can be performed by the technique of integration within ordered product (IWOP) of operators [Hong-yi Fan, Hai-liang Lu, Yue Fan, Ann. Phys. 321 (2006) 480], can directly recast density operators and generalized Wigner operators into normally ordered bivariate-normal-distribution form, which has resemblance in statistics. In this way the phase space formalism of quantum mechanics can be developed. The Husimi operator, entangled Husimi operator and entangled Wigner operator for entangled particles with different masses are naturally introduced by virtue of the IWOP technique, and their physical meanings are explained.

  8. Reliability for some bivariate beta distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models there has been a large amount of work as regards estimation of the reliability R = Pr(X < Y) when X and Y are independent random variables belonging to the same univariate family. In this paper, we consider forms of R when (X, Y) follows a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate beta. The calculations involve the use of special functions.

  9. Reliability for some bivariate gamma distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models, there has been a large amount of work as regards estimation of the reliability R = Pr(X < Y) when X and Y are independent random variables belonging to the same univariate family. In this paper, we consider forms of R when (X, Y) follows a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate gamma. The calculations involve the use of special functions.
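
    The paper derives explicit expressions in terms of special functions; the sketch below merely estimates R = Pr(X < Y) by Monte Carlo for one assumed dependence structure, a Gaussian copula with gamma marginals, which is an illustrative construction rather than the paper's bivariate gamma family.

    ```python
    import numpy as np
    from scipy.stats import norm, gamma

    def reliability_mc(a1, a2, rho, n=200_000, seed=0):
        """Monte Carlo estimate of R = Pr(X < Y) for dependent gamma marginals.

        Dependence is induced with a Gaussian copula with correlation rho; this
        is an illustrative construction, not the paper's bivariate gamma model.
        """
        rng = np.random.default_rng(seed)
        z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
        u = norm.cdf(z)                      # copula samples in (0, 1)^2
        x = gamma.ppf(u[:, 0], a=a1)         # gamma marginal for X (shape a1)
        y = gamma.ppf(u[:, 1], a=a2)         # gamma marginal for Y (shape a2)
        return np.mean(x < y)

    print(reliability_mc(a1=2.0, a2=3.0, rho=0.5))
    ```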

  10. STUDI PERBANDINGAN ANTARA ALGORITMA BIVARIATE MARGINAL DISTRIBUTION DENGAN ALGORITMA GENETIKA

    Directory of Open Access Journals (Sweden)

    Chastine Fatichah

    2006-01-01

    Full Text Available The Bivariate Marginal Distribution Algorithm is an extension of the Estimation of Distribution Algorithm. This heuristic algorithm introduces a new approach to recombination for generating new individuals, without the crossover and mutation operators of a genetic algorithm. Instead, the Bivariate Marginal Distribution Algorithm uses the connectivity between pairs of gene variables to generate new individuals, and this connectivity is discovered during the optimization process. In this research, the performance of a genetic algorithm with one-point crossover is compared with that of the Bivariate Marginal Distribution Algorithm on the Onemax problem, the De Jong F2 function, and the Traveling Salesman Problem. The experimental results show that the performance of both algorithms depends on their respective parameters and on the population size used. For small Onemax instances, the genetic algorithm performs better, requiring fewer iterations and less time to reach the optimum. However, the Bivariate Marginal Distribution Algorithm gives better optimization results for large Onemax instances. For the De Jong F2 function, the genetic algorithm outperforms the Bivariate Marginal Distribution Algorithm in both the number of iterations and the time required. For the Traveling Salesman Problem, the Bivariate Marginal Distribution Algorithm yields better optimization results than the genetic algorithm.

  11. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is derived when (X1, X2) follows a general bivariate distribution. Such distributions include the bivariate compound Weibull, bivariate compound Gompertz, and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and of the reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.

  12. Concomitants of Order Statistics from Bivariate Inverse Rayleigh Distribution

    Directory of Open Access Journals (Sweden)

    Muhammad Aleem

    2006-01-01

    Full Text Available The probability density function (pdf) of the rth concomitant, 1 ≤ r ≤ n, and the joint pdf of the rth and sth concomitants, 1 ≤ r < s ≤ n, of order statistics from the Bivariate Inverse Rayleigh Distribution are obtained, along with their moments and product moments. Its percentiles are also obtained.

  13. Comparison between two bivariate Poisson distributions through the ...

    African Journals Online (AJOL)

    To remedy this problem, Berkhout and Plug proposed a bivariate Poisson distribution that allows the correlation to be negative, zero, or positive. In this paper, we show that these models are almost everywhere asymptotically equal. Since the φ-divergence converges toward zero, both models are ...

  14. Selection effects in the bivariate brightness distribution for spiral galaxies

    International Nuclear Information System (INIS)

    Phillipps, S.; Disney, M.

    1986-01-01

    The joint distribution of total luminosity and characteristic surface brightness (the bivariate brightness distribution) is investigated for a complete sample of spiral galaxies in the Virgo cluster. The influence of selection and physical limits of various kinds on the apparent distribution are detailed. While the distribution of surface brightness for bright galaxies may be genuinely fairly narrow, faint galaxies exist right across the (quite small) range of accessible surface brightnesses so no statement can be made about the true extent of the distribution. The lack of high surface brightness bright galaxies in the Virgo sample relative to an overall RC2 sample (mostly field galaxies) supports the contention that the star-formation rate is reduced in the inner region of the cluster for environmental reasons. (author)

  15. Bivariate generalized Pareto distribution for extreme atmospheric particulate matter

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-02-01

    The high particulate matter (PM10) level is a prominent issue causing various impacts on human health and seriously affecting the economy. The asymptotic theory of extreme values is applied to analyze the relation between extreme PM10 data from two nearby air quality monitoring stations. The series of daily maximum PM10 for the Johor Bahru and Pasir Gudang stations are considered for the years 2001 to 2010. The 85% and 95% marginal quantiles are applied to determine the threshold values and hence construct the series of exceedances over the chosen threshold. The logistic, asymmetric logistic, negative logistic and asymmetric negative logistic models are considered as the dependence function for the joint distribution of a bivariate observation. Maximum likelihood estimation is employed for parameter estimation. The best fitted model is chosen based on the Akaike Information Criterion and quantile plots. It is found that the asymmetric logistic model gives the best fit for the bivariate extreme PM10 data and shows weak dependence between the two stations.
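
    A sketch of the marginal step described above (threshold at a high quantile, then fit a generalized Pareto distribution to the exceedances); the PM10 series is simulated and the bivariate logistic dependence models are not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    # Stand-in daily maxima (simulated); real data would be the station series.
    rng = np.random.default_rng(4)
    pm10 = rng.gamma(shape=4.0, scale=15.0, size=3650)

    threshold = np.quantile(pm10, 0.95)                # the 95% marginal quantile
    excess = pm10[pm10 > threshold] - threshold        # exceedances over threshold

    shape, loc, scale = genpareto.fit(excess, floc=0)  # MLE of the GPD parameters
    print(threshold, shape, scale)
    ```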

  16. A non-parametric conditional bivariate reference region with an application to height/weight measurements on normal girls

    DEFF Research Database (Denmark)

    Petersen, Jørgen Holm

    2009-01-01

    A conceptually simple two-dimensional conditional reference curve is described. The curve gives a decision basis for determining whether a bivariate response from an individual is "normal" or "abnormal" when taking into account that a third (conditioning) variable may influence the bivariate response. The reference curve is not only characterized analytically but also by geometric properties that are easily communicated to medical doctors - the users of such curves. The reference curve estimator is completely non-parametric, so no distributional assumptions are needed about the two-dimensional response. An example that will serve to motivate and illustrate the reference is the study of the height/weight distribution of 7-8-year-old Danish school girls born in 1930, 1950, or 1970....

  17. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Directory of Open Access Journals (Sweden)

    Jose Javier Gorgoso-Varela

    2016-04-01

    Full Text Available Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass.
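
    A rough two-stage (IFM-style) sketch of the copula idea, with Weibull marginals and a Normal (Gaussian) copula; the data are simulated, and the Plackett copula and Logit-logistic fits reported above are not reproduced.

    ```python
    import numpy as np
    from scipy.stats import weibull_min, norm

    def fit_gaussian_copula_weibull(d, h):
        """Two-stage sketch: Weibull marginals plus a Gaussian copula parameter."""
        # 1) Fit the marginal Weibull distributions (location fixed at zero).
        c_d, _, s_d = weibull_min.fit(d, floc=0)
        c_h, _, s_h = weibull_min.fit(h, floc=0)

        # 2) Probability-integral transform to (0, 1), then to normal scores.
        u = weibull_min.cdf(d, c_d, scale=s_d)
        v = weibull_min.cdf(h, c_h, scale=s_h)
        z = norm.ppf(np.column_stack([u, v]).clip(1e-10, 1 - 1e-10))

        # 3) The Gaussian copula parameter is the correlation of the normal scores.
        rho = np.corrcoef(z[:, 0], z[:, 1])[0, 1]
        return (c_d, s_d), (c_h, s_h), rho

    # Hypothetical diameter/height data with a crude allometric link:
    rng = np.random.default_rng(1)
    d = weibull_min.rvs(3.0, scale=25.0, size=500, random_state=rng)
    h = (1.3 + 0.8 * d) * rng.lognormal(mean=0.0, sigma=0.05, size=500)
    print(fit_gaussian_copula_weibull(d, h))
    ```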

  18. Comparing Johnson’s SBB, Weibull and Logit-Logistic bivariate distributions for modeling tree diameters and heights using copulas

    Energy Technology Data Exchange (ETDEWEB)

    Cardil Forradellas, A.; Molina Terrén, D.M.; Oliveres, J.; Castellnou, M.

    2016-07-01

    Aim of study: In this study we compare the accuracy of three bivariate distributions: Johnson’s SBB, Weibull-2P and LL-2P functions for characterizing the joint distribution of tree diameters and heights. Area of study: North-West of Spain. Material and methods: Diameter and height measurements of 128 plots of pure and even-aged Tasmanian blue gum (Eucalyptus globulus Labill.) stands located in the North-west of Spain were considered in the present study. The SBB bivariate distribution was obtained from SB marginal distributions using a Normal Copula based on a four-parameter logistic transformation. The Plackett Copula was used to obtain the bivariate models from the Weibull and Logit-logistic univariate marginal distributions. The negative logarithm of the maximum likelihood function was used to compare the results and the Wilcoxon signed-rank test was used to compare the related samples of these logarithms calculated for each sample plot and each distribution. Main results: The best results were obtained by using the Plackett copula and the best marginal distribution was the Logit-logistic. Research highlights: The copulas used in this study have shown a good performance for modeling the joint distribution of tree diameters and heights. They could be easily extended for modelling multivariate distributions involving other tree variables, such as tree volume or biomass. (Author)

  19. On the Construction of Bivariate Exponential Distributions with an Arbitrary Correlation Coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    2010-01-01

    In this article we use the concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation...

  20. DBH Prediction Using Allometry Described by Bivariate Copula Distribution

    Science.gov (United States)

    Xu, Q.; Hou, Z.; Li, B.; Greenberg, J. A.

    2017-12-01

    Forest biomass mapping based on single tree detection from the airborne laser scanning (ALS) usually depends on an allometric equation that relates diameter at breast height (DBH) with per-tree aboveground biomass. The incapability of the ALS technology in directly measuring DBH leads to the need to predict DBH with other ALS-measured tree-level structural parameters. A copula-based method is proposed in the study to predict DBH with the ALS-measured tree height and crown diameter using a dataset measured in the Lassen National Forest in California. Instead of exploring an explicit mathematical equation that explains the underlying relationship between DBH and other structural parameters, the copula-based prediction method utilizes the dependency between cumulative distributions of these variables, and solves the DBH based on an assumption that for a single tree, the cumulative probability of each structural parameter is identical. Results show that compared with the bench-marking least-square linear regression and the k-MSN imputation, the copula-based method obtains better accuracy in the DBH for the Lassen National Forest. To assess the generalization of the proposed method, prediction uncertainty is quantified using bootstrapping techniques that examine the variability of the RMSE of the predicted DBH. We find that the copula distribution is reliable in describing the allometric relationship between tree-level structural parameters, and it contributes to the reduction of prediction uncertainty.
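
    A simplified sketch of the matching-probability assumption stated above (a tree's height and DBH share the same cumulative probability), using Weibull marginals as an illustrative choice and simulated data rather than the Lassen National Forest measurements.

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    def predict_dbh(height_new, dbh_train, height_train):
        """Predict DBH by matching cumulative probabilities across marginals."""
        c_h, _, s_h = weibull_min.fit(height_train, floc=0)
        c_d, _, s_d = weibull_min.fit(dbh_train, floc=0)
        p = weibull_min.cdf(height_new, c_h, scale=s_h)   # F_height(h)
        return weibull_min.ppf(p, c_d, scale=s_d)         # F_DBH^{-1}(p)

    # Hypothetical training data (heights in m, DBH in cm):
    rng = np.random.default_rng(7)
    height = weibull_min.rvs(4.0, scale=30.0, size=300, random_state=rng)
    dbh = weibull_min.rvs(3.0, scale=45.0, size=300, random_state=rng)
    print(predict_dbh(np.array([20.0, 28.0, 35.0]), dbh, height))
    ```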

  1. On the construction of bivariate exponential distributions with an arbitrary correlation coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    In this paper we use a concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation coefficient (also negative). Secondly, the class satisfies that any linear combination (projection) of the marginal random variables is a phase-type distribution; the latter property is potentially important for the development of hypothesis testing in linear models. Thirdly, it is very easy to simulate...

  2. On minimum divergence adaptation of discrete bivariate distributions to given marginals

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; van der Meulen, E. C.

    2005-01-01

    Roč. 51, č. 1 (2005), s. 313-320 ISSN 0018-9448 R&D Projects: GA ČR GA201/02/1391; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : approximation of contingency tables * bivariate discrete distributions * minimization of divergences Subject RIV: BD - Theory of Information Impact factor: 2.183, year: 2005

  3. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using Fisher’s linear discriminant analysis, the support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide a total classification accuracy of 86.67% and an area (Az) of 0.9096 under the receiver operating characteristic curve, which were superior to the results obtained by either Fisher’s linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.
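
    A compact sketch of the bivariate kernel density estimation plus maximal posterior probability decision rule described above; the two-dimensional feature data are simulated stand-ins for the VAG features.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    def max_posterior_classifier(X_normal, X_abnormal):
        """Bivariate KDE per class combined with the maximal posterior decision rule.

        X_normal and X_abnormal are (n, 2) arrays of a two-dimensional feature pair.
        """
        kde0 = gaussian_kde(X_normal.T)        # class-conditional density p(x | normal)
        kde1 = gaussian_kde(X_abnormal.T)      # class-conditional density p(x | abnormal)
        p0 = len(X_normal) / (len(X_normal) + len(X_abnormal))   # prior Pr(normal)
        p1 = 1.0 - p0

        def classify(X_new):
            post0 = p0 * kde0(X_new.T)
            post1 = p1 * kde1(X_new.T)
            return (post1 > post0).astype(int)  # 1 = abnormal, 0 = normal
        return classify

    # Hypothetical two-dimensional feature data:
    rng = np.random.default_rng(3)
    Xn = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=80)
    Xa = rng.multivariate_normal([1.5, 1.0], [[1, -0.2], [-0.2, 1]], size=70)
    clf = max_posterior_classifier(Xn, Xa)
    print(clf(np.array([[0.1, -0.2], [1.8, 1.2]])))   # expect [0 1] for these points
    ```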

  4. The Normal Distribution From Binomial to Normal

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 6. The Normal Distribution From Binomial to Normal. S Ramasubramanian. Series Article Volume 2 Issue 6 June 1997 pp 15-24. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/002/06/0015-0024 ...

  5. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu. edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  6. Ventilation-perfusion distribution in normal subjects.

    Science.gov (United States)

    Beck, Kenneth C; Johnson, Bruce D; Olson, Thomas P; Wilson, Theodore A

    2012-09-01

    Functional values of LogSD of the ventilation distribution (σ(V)) have been reported previously, but functional values of LogSD of the perfusion distribution (σ(q)) and the coefficient of correlation between ventilation and perfusion (ρ) have not been measured in humans. Here, we report values for σ(V), σ(q), and ρ obtained from wash-in data for three gases, helium and two soluble gases, acetylene and dimethyl ether. Normal subjects inspired gas containing the test gases, and the concentrations of the gases at end-expiration during the first 10 breaths were measured with the subjects at rest and at increasing levels of exercise. The regional distribution of ventilation and perfusion was described by a bivariate log-normal distribution with parameters σ(V), σ(q), and ρ, and these parameters were evaluated by matching the values of expired gas concentrations calculated for this distribution to the measured values. Values of cardiac output and LogSD ventilation/perfusion (Va/Q) were obtained. At rest, σ(q) is high (1.08 ± 0.12). With the onset of ventilation, σ(q) decreases to 0.85 ± 0.09 but remains higher than σ(V) (0.43 ± 0.09) at all exercise levels. Rho increases to 0.87 ± 0.07, and the value of LogSD Va/Q for light and moderate exercise is primarily the result of the difference between the magnitudes of σ(q) and σ(V). With known values for the parameters, the bivariate distribution describes the comprehensive distribution of ventilation and perfusion that underlies the distribution of the Va/Q ratio.

  7. On the Folded Normal Distribution

    Directory of Open Access Journals (Sweden)

    Michail Tsagris

    2014-02-01

    Full Text Available The characteristic function of the folded normal distribution and its moment function are derived. The entropy of the folded normal distribution and its Kullback–Leibler divergences from the normal and half-normal distributions are approximated using Taylor series. The accuracy of the results is also assessed using different criteria. The maximum likelihood estimates and confidence intervals for the parameters are obtained using asymptotic theory and the bootstrap method. The coverage of the confidence intervals is also examined.
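
    SciPy's foldnorm implements the folded normal (shape c = |mu|/sigma, scale = sigma); the brief sketch below, on simulated data rather than anything from the paper, recovers the parameters by maximum likelihood.

    ```python
    import numpy as np
    from scipy.stats import foldnorm

    # Simulated folded-normal sample: take absolute values of N(mu, sigma^2) draws.
    rng = np.random.default_rng(5)
    mu, sigma = 1.0, 2.0
    x = np.abs(rng.normal(mu, sigma, size=50_000))

    c_hat, loc_hat, scale_hat = foldnorm.fit(x, floc=0)   # maximum likelihood fit
    print(c_hat * scale_hat, scale_hat)                   # estimates of |mu| and sigma
    print(foldnorm.mean(c_hat, scale=scale_hat), x.mean())
    ```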

  8. The Normal Distribution

    Indian Academy of Sciences (India)

    tion in statistics, velocity distribution of an ideal gas, and the phenomenon of Brownian motion is briefly illustrated. Introduction. To compensate for the hard work done in part I of this series, we basically pontificate in this article. Mathematical details are side-stepped and we indulge in a lot of 'hand-waving', especially in the ...

  9. The Normal Distribution

    Indian Academy of Sciences (India)

    An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. Introduction. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and the DeMoivre-Laplace theorem ...

  10. Meta-analysis for diagnostic accuracy studies: a new statistical model using beta-binomial distributions and bivariate copulas.

    Science.gov (United States)

    Kuss, Oliver; Hoyer, Annika; Solms, Alexander

    2014-01-15

    There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis for diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed likelihood function and a larger flexibility for the correlation structure of sensitivity and specificity. In a simulation study, which compares three copula models and two implementations of the standard model, the Plackett and the Gauss copula do rarely perform worse but frequently better than the standard model. We use an example from a meta-analysis to judge the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer for illustration. Copyright © 2013 John Wiley & Sons, Ltd.

  11. Bivariate value-at-risk

    Directory of Open Access Journals (Sweden)

    Giuseppe Arbia

    2007-10-01

    Full Text Available In this paper we extend the concept of Value-at-Risk (VaR) to bivariate return distributions in order to obtain measures of the market risk of an asset that take into account additional features linked to downside risk exposure. We first present a general definition of risk as the probability of an adverse event over a random distribution, and we then introduce a measure of market risk (β-VaR) that admits the traditional β of an asset in portfolio management as a special case when asset returns are normally distributed. Empirical evidence is provided using Italian stock market data.

  12. Improving the modelling of redshift-space distortions - I. A bivariate Gaussian description for the galaxy pairwise velocity distributions

    Science.gov (United States)

    Bianchi, Davide; Chiesa, Matteo; Guzzo, Luigi

    2015-01-01

    As a step towards a more accurate modelling of redshift-space distortions (RSD) in galaxy surveys, we develop a general description of the probability distribution function of galaxy pairwise velocities within the framework of the so-called streaming model. For a given galaxy separation r, such function can be described as a superposition of virtually infinite local distributions. We characterize these in terms of their moments and then consider the specific case in which they are Gaussian functions, each with its own mean μ and dispersion σ. Based on physical considerations, we make the further crucial assumption that these two parameters are in turn distributed according to a bivariate Gaussian, with its own mean and covariance matrix. Tests using numerical simulations explicitly show that with this compact description one can correctly model redshift-space distortions on all scales, fully capturing the overall linear and non-linear dynamics of the galaxy flow at different separations. In particular, we naturally obtain Gaussian/exponential, skewed/unskewed distribution functions, depending on separation as observed in simulations and data. Also, the recently proposed single-Gaussian description of RSD is included in this model as a limiting case, when the bivariate Gaussian is collapsed to a two-dimensional Dirac delta function. We also show how this description naturally allows for the Taylor expansion of 1 + ξS(s) around 1 + ξR(r), which leads to the Kaiser linear formula when truncated to second order, explicating its connection with the moments of the velocity distribution functions. More work is needed, but these results indicate a very promising path to make definitive progress in our programme to improve RSD estimators.

  13. Explicit expressions for European option pricing under a generalized skew normal distribution

    OpenAIRE

    Doostparast, Mahdi

    2017-01-01

    Under a generalized skew normal distribution we consider the problem of European option pricing. Existence of the martingale measure is proved. An explicit expression for a given European option price is presented in terms of the cumulative distribution function of the univariate skew normal and the bivariate standard normal distributions. Some special cases are investigated in a greater detail. To carry out the sensitivity of the option price to the skew parameters, numerical methods are app...

  14. Understanding a Normal Distribution of Data.

    Science.gov (United States)

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?

  15. Quantiles for Finite Mixtures of Normal Distributions

    Science.gov (United States)

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
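
    A short sketch of computing a quantile of a finite mixture of normals by numerically inverting the mixture CDF; it also reflects the distinction emphasized above, since a weighted sum of normal densities (a mixture) is not itself normal, unlike a weighted sum of independent normal random variables.

    ```python
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import brentq

    def mixture_quantile(p, weights, means, sds):
        """Quantile of a finite mixture of normal distributions.

        The mixture CDF is the weighted sum of the component CDFs; the quantile
        is found by root-finding, since no closed form exists in general.
        """
        weights, means, sds = map(np.asarray, (weights, means, sds))
        f = lambda x: np.sum(weights * norm.cdf(x, loc=means, scale=sds)) - p
        lo = float(np.min(means - 10 * sds))
        hi = float(np.max(means + 10 * sds))
        return brentq(f, lo, hi)

    # Example: median of an equal-weight mixture of N(-1, 1) and N(3, 2).
    print(mixture_quantile(0.5, [0.5, 0.5], [-1.0, 3.0], [1.0, 2.0]))
    ```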

  16. A locally adaptive normal distribution

    DEFF Research Database (Denmark)

    Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren

    2016-01-01

    entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models...

  17. NORMAL DISTRIBUTION LAW IN MEDICAL RESEARCH

    Directory of Open Access Journals (Sweden)

    М. A. Ivanchuk

    2013-05-01

    Full Text Available The main methods for assessing normality were described. As an example, multiple samples from clinical research were tested for normality using graphical (the histogram and the normal probability plot) and statistical methods. The majority of the clinical samples (60%) were not normally distributed. Practical recommendations were provided.
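
    A small illustrative check along the same lines (simulated data, not the clinical samples): the Shapiro-Wilk test together with the quantities needed for a normal probability (Q-Q) plot.

    ```python
    import numpy as np
    from scipy import stats

    # Simulated, clearly non-normal sample standing in for a clinical variable.
    rng = np.random.default_rng(11)
    sample = rng.lognormal(mean=0.0, sigma=0.5, size=120)

    w, p_value = stats.shapiro(sample)                      # Shapiro-Wilk test
    print(f"Shapiro-Wilk W = {w:.3f}, p = {p_value:.4f}")   # small p => reject normality

    # Quantities for a normal probability (Q-Q) plot; with matplotlib one would
    # plot osm against osr and inspect the linearity of the points.
    (osm, osr), (slope, intercept, r) = stats.probplot(sample, dist="norm")
    print(f"Q-Q correlation r = {r:.3f}")
    ```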

  18. A New Distribution-Random Limit Normal Distribution

    OpenAIRE

    Gong, Xiaolin; Yang, Shuzhen

    2013-01-01

    This paper introduces a new distribution to improve tail risk modeling. Based on the classical normal distribution, we define a new distribution by a series of heat equations. Then, we use market data to verify our model.

  19. The use of bivariate spatial modeling of questionnaire and parasitology data to predict the distribution of Schistosoma haematobium in Coastal Kenya.

    Directory of Open Access Journals (Sweden)

    Hugh J W Sturrock

    Full Text Available Questionnaires of reported blood in urine (BIU) distributed through the existing school system provide a rapid and reliable method to classify schools according to the prevalence of Schistosoma haematobium, thereby helping in the targeting of schistosomiasis control. However, not all schools return questionnaires and it is unclear whether treatment is warranted in such schools. This study investigates the use of bivariate spatial modelling of available and multiple data sources to predict the prevalence of S. haematobium at every school along the Kenyan coast. Data from a questionnaire survey conducted by the Kenya Ministry of Education in Coast Province in 2009 were combined with available parasitological and environmental data in a Bayesian bivariate spatial model. This modeled the relationship between BIU data and environmental covariates, as well as the relationship between BIU and S. haematobium infection prevalence, to predict S. haematobium infection prevalence at all schools in the study region. Validation procedures were implemented to assess the predictive accuracy of endemicity classification. The prevalence of BIU was negatively correlated with distance to the nearest river and there was considerable residual spatial correlation at small (~15 km) spatial scales. There was a predictable relationship between the prevalence of reported BIU and S. haematobium infection. The final model exhibited excellent sensitivity (0.94) but moderate specificity (0.69) in identifying low (<10%) prevalence schools, and had poor performance in differentiating between moderate and high prevalence schools (sensitivity 0.5, specificity 1). Schistosomiasis is highly focal and there is a need to target treatment on a school-by-school basis. The use of bivariate spatial modelling can supplement questionnaire data to identify schools requiring mass treatment, but is unable to distinguish between moderate and high prevalence schools.

  20. Mast cell distribution in normal adult skin

    NARCIS (Netherlands)

    A.S. Janssens (Artiena Soe); R. Heide (Rogier); J.C. den Hollander (Jan); P.G.M. Mulder (P. G M); B. Tank (Bhupendra); A.P. Oranje (Arnold)

    2005-01-01

    AIMS: To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. METHODS: Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults.

  1. Reliability Implications in Wood Systems of a Bivariate Gaussian-Weibull Distribution and the Associated Univariate Pseudo-truncated Weibull

    Science.gov (United States)

    Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield

    2014-01-01

    Two important wood properties are the modulus of elasticity (MOE) and the modulus of rupture (MOR). In the past, the statistical distribution of the MOE has often been modeled as Gaussian, and that of the MOR as lognormal or as a two- or three-parameter Weibull distribution. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior...

  2. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    Science.gov (United States)

    Ghosh, Indranil

    2011-01-01

    Consider a discrete bivariate random variable (X, Y) with possible values x[subscript 1], x[subscript 2],..., x[subscript I] for X and y[subscript 1], y[subscript 2],..., y[subscript J] for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y for given values of X are available. We…

  3. Mast cell distribution in normal adult skin.

    Science.gov (United States)

    Janssens, A S; Heide, R; den Hollander, J C; Mulder, P G M; Tank, B; Oranje, A P

    2005-03-01

    To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults. There was an uneven distribution of MCs in different body sites using the anti-tryptase monoclonal antibody technique. Numbers of MCs on the trunk, upper arm, and upper leg were similar, but were significantly different from those found on the lower leg and forearm. Two distinct groups were formed--proximal and distal. There were 77.0 MCs/mm2 at proximal body sites and 108.2 MCs/mm2 at distal sites. Adjusted for the adjacent diagnosis and age, this difference was consistent. The numbers of MCs in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders were not different from those in the control group. Differences in the numbers of MCs between the distal and the proximal body sites must be considered when MCs are counted for a reliable diagnosis of mastocytosis. A pilot study in patients with mastocytosis underlined the variation in the numbers of MCs in mastocytosis and normal skin, but showed a considerable overlap. The observed numbers of MCs in adults cannot be extrapolated to children. MC numbers varied significantly between proximal and distal body sites and these differences must be considered when MCs are counted for a reliable diagnosis of mastocytosis. There was a considerable overlap between the numbers of MCs in mastocytosis and normal skin.

  4. Bivariate Kumaraswamy Models via Modified FGM Copulas: Properties and Applications

    Directory of Open Access Journals (Sweden)

    Indranil Ghosh

    2017-11-01

    Full Text Available A copula is a useful tool for constructing bivariate and/or multivariate distributions. In this article, we consider a new modified class of FGM (Farlie–Gumbel–Morgenstern) bivariate copulas for constructing several different bivariate Kumaraswamy type copulas and discuss their structural properties, including dependence structures. It is established that construction of bivariate distributions by this method allows for greater flexibility in the values of Spearman’s correlation coefficient ρ and Kendall’s τ.

  5. Choosing the Right Skew Normal Distribution: the Macroeconomist’ Dilemma

    OpenAIRE

    Wojciech Charemza; Carlos Díaz; Svetlana Makarova

    2015-01-01

    The paper discusses the consequences of possible misspecification in fitting skew normal distributions to empirical data. It is shown, through numerical experiments, that it is easy to choose a distribution which is different from that which generated the sample, if the minimum distance criterion is used. The distributions compared are the two-piece normal, weighted skew normal and the generalized Balakrishnan skew normal distribution which covers a variety of other skew normal distributions,...

  6. Ordinal Bivariate Inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    2016-01-01

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2 × 2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  7. Ordinal bivariate inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2x2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  8. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on SO(3) group, which are used in texture analysis. Those NDs are: Fisher normal distribution (FND), Bunge normal distribution (BND), central normal distribution (CND) and wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on SO(3) group. CND is a subcase for normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy’s central limit theorem). WND is motivated by CLT in R 3 and mapped to SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both CND and WND. All of the NDs mentioned above are used for modeling different components of crystallites orientation distribution function in texture analysis.

  9. A Generalization of the Skew-Normal Distribution: The Beta Skew-Normal

    OpenAIRE

    Mameli, Valentina; Musio, Monica

    2011-01-01

    The aim of this article is to introduce a new family of distributions, which generalizes the skew normal distribution (SN). This new family, called Beta skew-normal (BSN), arises naturally when we consider the distributions of order statistics of the SN. The BSN can also be obtained as a special case of the Beta generated distribution (Jones (2004)). In this work we pay attention to three other generalizations of the SN distribution: the Balakrishnan skew-normal (SNB) (Balakrishnan (2002), as...

  10. Some properties of normal moment distribution | Olosunde | Ife ...

    African Journals Online (AJOL)

    This paper provides an introductory overview of a portion of distribution theory in which we propose a new family of an extended form of the normal distribution, called the normal moment distribution; some of its properties are obtained. The cumulative distribution function is not in closed form, but the table of the approximate ...

  11. Determining Normal-Distribution Tolerance Bounds Graphically

    Science.gov (United States)

    Mezzacappa, M. A.

    1983-01-01

    Graphical method requires calculations and table lookup. Distribution is established from only three points: the upper and lower confidence bounds of the mean and the lower confidence bound of the standard deviation. Method requires only a few calculations with simple equations. Graphical procedure establishes a best-fit line for the measured data and bounds for the selected confidence level and any distribution percentile.

  12. Monitoring bivariate process

    Directory of Open Access Journals (Sweden)

    Marcela A. G. Machado

    2009-12-01

    Full Text Available The T² chart and the generalized variance |S| chart are the usual tools for monitoring the mean vector and the covariance matrix of multivariate processes. The main drawback of these charts is the difficulty of obtaining and interpreting the values of their monitoring statistics. In this paper, we study control charts for monitoring bivariate processes that require only the computation of sample means (the ZMAX chart) for monitoring the mean vector, sample variances (the VMAX chart) for monitoring the covariance matrix, or both sample means and sample variances (the MCMAX chart) in the case of joint control of the mean vector and the covariance matrix.

  13. One Criteria of Consent of Normal Distribution Law

    Directory of Open Access Journals (Sweden)

    Serezha N. Sandryan

    2013-01-01

    Full Text Available According to the central limit theorem, the normal probability distribution law is the one most often encountered in random phenomena. In this work a linear goodness-of-fit criterion is developed for testing the statistical hypothesis that a statistical population follows the normal distribution law.

  14. Comparing normal, lognormal and Weibull distributions for fitting ...

    African Journals Online (AJOL)

    Statistical probability density functions are widely used to model tree diameter distributions and to describe stand structure. The objective of this study was to compare the performance of the normal, logarithmic-normal and three-parameter Weibull distributions for fitting diameter data from Akashmoni (Acacia auriculiformis A.
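
    One way such a comparison can be sketched with SciPy, using the maximized log-likelihood as an illustrative criterion (the study itself may rely on other goodness-of-fit statistics); the diameter sample is simulated.

    ```python
    import numpy as np
    from scipy.stats import norm, lognorm, weibull_min

    def compare_diameter_fits(d):
        """Fit normal, lognormal and three-parameter Weibull models to diameters
        and compare them by maximized log-likelihood."""
        fits = {
            "normal": (norm, norm.fit(d)),
            "lognormal": (lognorm, lognorm.fit(d, floc=0)),
            "weibull-3p": (weibull_min, weibull_min.fit(d)),
        }
        return {name: float(np.sum(dist.logpdf(d, *params)))
                for name, (dist, params) in fits.items()}

    # Hypothetical diameter sample (cm):
    rng = np.random.default_rng(2)
    d = weibull_min.rvs(2.5, loc=5.0, scale=20.0, size=400, random_state=rng)
    print(compare_diameter_fits(d))
    ```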

  15. Modified Normal Demand Distributions in (R,S)-Inventory Models

    NARCIS (Netherlands)

    Strijbosch, L.W.G.; Moors, J.J.A.

    2003-01-01

    To model demand, the normal distribution is by far the most popular; the disadvantage that it takes negative values is taken for granted. This paper proposes two modifications of the normal distribution, both taking non-negative values only. Safety factors and order-up-to-levels for the familiar (R,

  16. Inheritance of Properties of Normal and Non-Normal Distributions after Transformation of Scores to Ranks

    Science.gov (United States)

    Zimmerman, Donald W.

    2011-01-01

    This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…

  17. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGM) where the software failure time follows a normal distribution. The proposed model is mathematically tractable and has sufficient ability of fitting to the software failure data. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple for implementation as software application. Numerical experiment is devoted to investigating the fitting ability of the SRGMs with normal distribution through 16 types of failure time data collected in real software projects

  18. Financial Applications of Bivariate Markov Processes

    OpenAIRE

    Ortobelli Lozza, Sergio; Angelelli, Enrico; Bianchi, Annamaria

    2011-01-01

    This paper describes a methodology to approximate a bivariate Markov process by means of a proper Markov chain and presents possible financial applications in portfolio theory, option pricing and risk management. In particular, we first show how to model the joint distribution between market stochastic bounds and future wealth and propose an application to large-scale portfolio problems. Secondly, we examine an application to VaR estimation. Finally, we propose a methodology...

  19. Improved Root Normal Size Distributions for Liquid Atomization

    Science.gov (United States)

    2015-11-01

    [Figure captions only: the parameters a and σ required so that traditional (Type I) root normal size distributions have the correct count mean diameter, and so that Type II root normal size distributions (assuming m = 3) have the correct mass mean diameter; includes a plot of Equation (45).]

  20. Application of a truncated normal failure distribution in reliability testing

    Science.gov (United States)

    Groves, C., Jr.

    1968-01-01

    Statistical truncated normal distribution function is applied as a time-to-failure distribution function in equipment reliability estimations. Age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
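
    A sketch, with illustrative numbers rather than the report's, of the reliability and hazard functions implied by a normal time-to-failure distribution truncated at zero.

    ```python
    import numpy as np
    from scipy.stats import truncnorm

    # Normal time-to-failure model truncated at t = 0; mu and sigma are illustrative.
    mu, sigma = 1000.0, 400.0                    # hours
    a, b = (0.0 - mu) / sigma, np.inf            # standardized truncation bounds
    ttf = truncnorm(a, b, loc=mu, scale=sigma)

    t = np.array([250.0, 500.0, 1000.0, 1500.0])
    reliability = ttf.sf(t)                      # R(t) = Pr(T > t)
    hazard = ttf.pdf(t) / ttf.sf(t)              # increases with age (wear-out behavior)
    print(reliability)
    print(hazard)
    ```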

  1. Scale and shape mixtures of multivariate skew-normal distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2018-02-26

    We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down the theoretical foundations for subsequent inference with this model. In particular, we study linear transformations, marginal distributions, selection representations, stochastic representations and hierarchical representations. We also describe an EM-type algorithm for maximum likelihood estimation of the parameters of the model and demonstrate its implementation on a wind dataset. Our family of multivariate distributions unifies and extends many existing models of the literature that can be seen as submodels of our proposal.

  2. Covariate analysis of bivariate survival data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey was analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  3. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To more accurately quantify the predictability of water availability, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic conditions. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range for the same global dataset, especially in regions with higher NDVI values, highlighting the importance of developing the joint distribution by taking into account the dependence structure of the parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.

  4. Confidence bounds for normal and lognormal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill

    2003-01-01

    This paper compares the so-called exact approach for obtaining confidence intervals on normal distribution coefficients of variation to approximate methods. Approximate approaches were found to perform less well than the exact approach for large coefficients of variation and small sample sizes. Web-based computer programs are described for calculating confidence...

  5. Multivariate stochastic simulation with subjective multivariate normal distributions

    Science.gov (United States)

    P. J. Ince; J. Buongiorno

    1991-01-01

    In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, in most simulations it has been assumed that random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assesed multivariate normal...
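
    A hedged sketch of the general idea (not the report's own code): Monte Carlo draws from a subjectively specified multivariate normal, with elicited means, standard deviations and correlations combined into a covariance matrix. All numbers below are assumptions.

      # Sketch: Monte Carlo draws from a subjectively assessed multivariate normal.
      import numpy as np

      rng = np.random.default_rng(2)

      mean = np.array([10.0, 5.0, 2.0])            # elicited means (assumed values)
      sd = np.array([2.0, 1.0, 0.5])               # elicited standard deviations
      corr = np.array([[1.0, 0.6, 0.2],
                       [0.6, 1.0, 0.4],
                       [0.2, 0.4, 1.0]])           # elicited correlations

      cov = np.outer(sd, sd) * corr                # covariance from sd and correlation
      draws = rng.multivariate_normal(mean, cov, size=10000)

      print("sample means:", draws.mean(axis=0).round(2))
      print("sample correlations:\n", np.corrcoef(draws, rowvar=False).round(2))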

  6. Sketching Curves for Normal Distributions--Geometric Connections

    Science.gov (United States)

    Bosse, Michael J.

    2006-01-01

    Within statistics instruction, students are often requested to sketch the curve representing a normal distribution with a given mean and standard deviation. Unfortunately, these sketches are often notoriously imprecise. Poor sketches are usually the result of missing mathematical knowledge. This paper considers relationships which exist among…

  7. Evaluating Transfer Entropy for Normal and y-Order Normal Distributions

    Czech Academy of Sciences Publication Activity Database

    Hlaváčková-Schindler, Kateřina; Toulias, T. L.; Kitsos, C. P.

    2016-01-01

    Roč. 17, č. 5 (2016), s. 1-20 ISSN 2231-0851 Institutional support: RVO:67985556 Keywords : Transfer entropy * time series * Kullback-Leibler divergence * causality * generalized normal distribution Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2016/AS/hlavackova-schindler-0461261.pdf

  8. Percentile estimation using the normal and lognormal probability distribution

    International Nuclear Information System (INIS)

    Bement, T.R.

    1980-01-01

    Implicitly or explicitly percentile estimation is an important aspect of the analysis of aerial radiometric survey data. Standard deviation maps are produced for quadrangles which are surveyed as part of the National Uranium Resource Evaluation. These maps show where variables differ from their mean values by more than one, two or three standard deviations. Data may or may not be log-transformed prior to analysis. These maps have specific percentile interpretations only when proper distributional assumptions are met. Monte Carlo results are presented in this paper which show the consequences of estimating percentiles by: (1) assuming normality when the data are really from a lognormal distribution; and (2) assuming lognormality when the data are really from a normal distribution
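
    The kind of Monte Carlo check described can be sketched as below, under assumed settings; the "true" lognormal, the sample size and the choice of the 97.5th percentile are illustrative, not the paper's.

      # Sketch: consequence of assuming normality when data are really lognormal,
      # for estimating an upper percentile.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      p, n, reps = 0.975, 50, 2000

      true_dist = stats.lognorm(s=0.5, scale=np.exp(1.0))   # assumed "true" distribution
      true_q = true_dist.ppf(p)

      est_normal, est_lognormal = [], []
      for _ in range(reps):
          x = true_dist.rvs(size=n, random_state=rng)
          # percentile estimate assuming normality
          est_normal.append(x.mean() + stats.norm.ppf(p) * x.std(ddof=1))
          # percentile estimate assuming lognormality (fit on the log scale)
          lx = np.log(x)
          est_lognormal.append(np.exp(lx.mean() + stats.norm.ppf(p) * lx.std(ddof=1)))

      print(f"true 97.5th percentile        : {true_q:.2f}")
      print(f"normal assumption (mean est.) : {np.mean(est_normal):.2f}")
      print(f"lognormal assumption (mean)   : {np.mean(est_lognormal):.2f}")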

  9. Spectral density regression for bivariate extremes

    KAUST Repository

    Castro Camilo, Daniela

    2016-05-11

    We introduce a density regression model for the spectral density of a bivariate extreme value distribution, that allows us to assess how extremal dependence can change over a covariate. Inference is performed through a double kernel estimator, which can be seen as an extension of the Nadaraya–Watson estimator where the usual scalar responses are replaced by mean constrained densities on the unit interval. Numerical experiments with the methods illustrate their resilience in a variety of contexts of practical interest. An extreme temperature dataset is used to illustrate our methods. © 2016 Springer-Verlag Berlin Heidelberg

  10. Transformation of an empirical distribution to normal distribution by the use of Johnson system of translation and symmetrical quantile method

    OpenAIRE

    Ludvík Friebel; Jana Friebelová

    2006-01-01

    This article deals with approximation of empirical distribution to standard normal distribution using Johnson transformation. This transformation enables us to approximate wide spectrum of continuous distributions with a normal distribution. The estimation of parameters of transformation formulas is based on percentiles of empirical distribution. There are derived theoretical probability distribution functions of random variable obtained on the base of backward transformation standard normal ...

  11. Distributive justice and cognitive enhancement in lower, normal intelligence.

    Science.gov (United States)

    Dunlop, Mikael; Savulescu, Julian

    2014-01-01

    There exists a significant disparity within society between individuals in terms of intelligence. While intelligence varies naturally throughout society, the extent to which this impacts on the life opportunities it affords to each individual is greatly undervalued. Intelligence appears to have a prominent effect over a broad range of social and economic life outcomes. Many key determinants of well-being correlate highly with the results of IQ tests, and other measures of intelligence, and an IQ of 75 is generally accepted as the most important threshold in modern life. The ability to enhance our cognitive capacities offers an exciting opportunity to correct disabling natural variation and inequality in intelligence. Pharmaceutical cognitive enhancers, such as modafinil and methylphenidate, have been shown to have the capacity to enhance cognition in normal, healthy individuals. Perhaps of most relevance is the presence of an 'inverted U effect' for most pharmaceutical cognitive enhancers, whereby the degree of enhancement increases as intelligence levels deviate further below the mean. Although enhancement, including cognitive enhancement, has been much debated recently, we argue that there are egalitarian reasons to enhance individuals with low but normal intelligence. Under egalitarianism, cognitive enhancement has the potential to reduce opportunity inequality and contribute to relative income and welfare equality in the lower, normal intelligence subgroup. Cognitive enhancement use is justifiable under prioritarianism through various means of distribution; selective access to the lower, normal intelligence subgroup, universal access, or paradoxically through access primarily to the average and above average intelligence subgroups. Similarly, an aggregate increase in social well-being is achieved through similar means of distribution under utilitarianism. In addition, the use of cognitive enhancement within the lower, normal intelligence subgroup negates, or at

  12. Distribution of Dendritic Cells in Normal Human Salivary Glands

    International Nuclear Information System (INIS)

    Le, An; Saverin, Michele; Hand, Arthur R.

    2011-01-01

    Dendritic cells (DC) are believed to contribute to development of autoimmune sialadenitis, but little is known about their distribution in normal salivary glands. In this study, DC were identified and their distribution was determined in normal human parotid and submandibular glands. For light microscopy, salivary gland sections were stained with H&E or immunocytochemically using antibodies to DC markers. Transmission electron microscopy (TEM) was used to evaluate the ultrastructural characteristics of DC. In H&E sections, elongated, irregularly shaped nuclei were occasionally seen in the striated and excretory duct epithelium. Immunolabeling with anti-HLA-DR, anti-CD11c and anti-S100 revealed DC with numerous processes extending between ductal epithelial cells, often close to the lumen. Morphometric analyses indicated that HLA-DR-positive DC occupied approximately 4–11% of the duct wall volume. Similar reactive cells were present in acini, intercalated ducts and interstitial tissues. TEM observations revealed cells with indented nuclei containing dense chromatin, pale cytoplasm with few organelles, and lacking junctional attachments to adjacent cells. These results indicate that DC are abundant constituents of normal human salivary glands. Their location within ductal and acinar epithelium suggests a role in responding to foreign antigens and/or maintaining immunological tolerance to salivary proteins

  13. Distribution of normal superficial ocular vessels in digital images.

    Science.gov (United States)

    Banaee, Touka; Ehsaei, Asieh; Pourreza, Hamidreza; Khajedaluee, Mohammad; Abrishami, Mojtaba; Basiri, Mohsen; Daneshvar Kakhki, Ramin; Pourreza, Reza

    2014-02-01

    To investigate the distribution of different-sized vessels in the digital images of the ocular surface, an endeavor which may provide useful information for future studies. This study included 295 healthy individuals. From each participant, four digital photographs of the superior and inferior conjunctivae of both eyes, with a fixed succession of photography (right upper, right lower, left upper, left lower), were taken with a slit lamp mounted camera. Photographs were then analyzed by a previously described algorithm for vessel detection in the digital images. The area (of the image) occupied by vessels (AOV) of different sizes was measured. Height, weight, fasting blood sugar (FBS) and hemoglobin levels were also measured and the relationship between these parameters and the AOV was investigated. These findings indicated a statistically significant difference in the distribution of the AOV among the four conjunctival areas. No significant correlations were noted between the AOV of each conjunctival area and the different demographic and biometric factors. Medium-sized vessels were the most abundant vessels in the photographs of the four investigated conjunctival areas. The AOV of the different sizes of vessels follows a normal distribution curve in the four areas of the conjunctiva. The distribution of the vessels in successive photographs changes in a specific manner, with the mean AOV becoming larger as the photos were taken from the right upper to the left lower area. The AOV of vessel sizes has a normal distribution curve and medium-sized vessels occupy the largest area of the photograph. Copyright © 2013 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  14. Computation of distribution of minimum resolution for log-normal distribution of chromatographic peak heights.

    Science.gov (United States)

    Davis, Joe M

    2011-10-28

    General equations are derived for the distribution of minimum resolution between two chromatographic peaks, when peak heights in a multi-component chromatogram follow a continuous statistical distribution. The derivation draws on published theory by relating the area under the distribution of minimum resolution to the area under the distribution of the ratio of peak heights, which in turn is derived from the peak-height distribution. Two procedures are proposed for the equations' numerical solution. The procedures are applied to the log-normal distribution, which recently was reported to describe the distribution of component concentrations in three complex natural mixtures. For published statistical parameters of these mixtures, the distribution of minimum resolution is similar to that for the commonly assumed exponential distribution of peak heights used in statistical-overlap theory. However, these two distributions of minimum resolution can differ markedly, depending on the scale parameter of the log-normal distribution. Theory for the computation of the distribution of minimum resolution is extended to other cases of interest. With the log-normal distribution of peak heights as an example, the distribution of minimum resolution is computed when small peaks are lost due to noise or detection limits, and when the height of at least one peak is less than an upper limit. The distribution of minimum resolution shifts slightly to lower resolution values in the first case and to markedly larger resolution values in the second one. The theory and numerical procedure are confirmed by Monte Carlo simulation. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Characteristic functions of scale mixtures of multivariate skew-normal distributions

    KAUST Repository

    Kim, Hyoung-Moon

    2011-08-01

    We obtain the characteristic function of scale mixtures of skew-normal distributions both in the univariate and multivariate cases. The derivation uses the simple stochastic relationship between skew-normal distributions and scale mixtures of skew-normal distributions. In particular, we describe the characteristic function of skew-normal, skew-t, and other related distributions. © 2011 Elsevier Inc.

  16. Concentration distribution of trace elements: from normal distribution to Levy flights

    Energy Technology Data Exchange (ETDEWEB)

    Kubala-Kukus, A. E-mail: aldona.kubala-kukus@pu.kielce.pl; Banas, D.; Braziewicz, J.; Majewska, U.; Pajek, M

    2003-04-18

    The paper discusses a nature of concentration distributions of trace elements in biomedical samples, which were measured by using the X-ray fluorescence techniques (XRF, TXRF). Our earlier observation, that the lognormal distribution well describes the measured concentration distribution is explained here on a more general ground. Particularly, the role of random multiplicative process, which models the concentration distributions of trace elements in biomedical samples, is discussed in detail. It is demonstrated that the lognormal distribution, appearing when the multiplicative process is driven by normal distribution, can be generalized to the so-called log-stable distribution. Such distribution describes the random multiplicative process, which is driven, instead of normal distribution, by more general stable distribution, being known as the Levy flights. The presented ideas are exemplified by the results of the study of trace element concentration distributions in selected biomedical samples, obtained by using the conventional (XRF) and (TXRF) X-ray fluorescence methods. Particularly, the first observation of log-stable concentration distribution of trace elements is reported and discussed here in detail.
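
    A minimal, hedged sketch of the mechanism discussed: a random multiplicative process whose log-factors are normally distributed produces an exactly lognormal result, so the log-concentrations pass a normality test; other driving distributions give the heavier-tailed, Levy-type behaviour mentioned in the abstract. The factor count and spread below are invented, and this is not the XRF/TXRF data.

      # Sketch: multiplicative process driven by normal log-factors -> lognormal outcome.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      n_samples, n_steps = 20000, 50

      factors = np.exp(rng.normal(loc=0.0, scale=0.15, size=(n_samples, n_steps)))
      concentration = factors.prod(axis=1)

      # log-concentrations should then look Gaussian
      log_c = np.log(concentration)
      w, pval = stats.shapiro(log_c[:5000])    # Shapiro-Wilk on a subsample
      print(f"Shapiro-Wilk on log-concentrations: W = {w:.3f}, p = {pval:.3f}")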

  17. Concentration distribution of trace elements: from normal distribution to Levy flights

    International Nuclear Information System (INIS)

    Kubala-Kukus, A.; Banas, D.; Braziewicz, J.; Majewska, U.; Pajek, M.

    2003-01-01

    The paper discusses a nature of concentration distributions of trace elements in biomedical samples, which were measured by using the X-ray fluorescence techniques (XRF, TXRF). Our earlier observation, that the lognormal distribution well describes the measured concentration distribution is explained here on a more general ground. Particularly, the role of random multiplicative process, which models the concentration distributions of trace elements in biomedical samples, is discussed in detail. It is demonstrated that the lognormal distribution, appearing when the multiplicative process is driven by normal distribution, can be generalized to the so-called log-stable distribution. Such distribution describes the random multiplicative process, which is driven, instead of normal distribution, by more general stable distribution, being known as the Levy flights. The presented ideas are exemplified by the results of the study of trace element concentration distributions in selected biomedical samples, obtained by using the conventional (XRF) and (TXRF) X-ray fluorescence methods. Particularly, the first observation of log-stable concentration distribution of trace elements is reported and discussed here in detail

  18. Robust bivariate error detection in skewed data with application to historical radiosonde winds

    KAUST Repository

    Sun, Ying

    2017-01-18

    The global historical radiosonde archives date back to the 1920s and contain the only directly observed measurements of temperature, wind, and moisture in the upper atmosphere, but they contain many random errors. Most of the focus on cleaning these large datasets has been on temperatures, but winds are important inputs to climate models and in studies of wind climatology. The bivariate distribution of the wind vector does not have elliptical contours but is skewed and heavy-tailed, so we develop two methods for outlier detection based on the bivariate skew-t (BST) distribution, using either distance-based or contour-based approaches to flag observations as potential outliers. We develop a framework to robustly estimate the parameters of the BST and then show how the tuning parameter to get these estimates is chosen. In simulation, we compare our methods with one based on a bivariate normal distribution and a nonparametric approach based on the bagplot. We then apply all four methods to the winds observed for over 35,000 radiosonde launches at a single station and demonstrate differences in the number of observations flagged across eight pressure levels and through time. In this pilot study, the method based on the BST contours performs very well.

  19. Basic study on radiation distribution sensing with normal optical fiber

    International Nuclear Information System (INIS)

    Naka, R.; Kawarabayashi, J.; Uritani, A.; Iguchi, T.; Kaneko, J.; Takeuchi, H.; Kakuta, T.

    2000-01-01

    Recently, some methods of radiation distribution sensing with optical fibers have been proposed. These methods employ scintillating fibers or scintillators with wavelength-shifting fibers. The positions of radiation interactions are detected by applying a time-of-flight (TOF) technique to the scintillation photon propagation. In the former method, the attenuation length for the scintillation photons in the scintillating fiber is relatively short, so that the operating length of the sensor is limited to several meters. In the latter method, a radiation distribution cannot be obtained continuously but only discretely. To improve these shortcomings, a normal optical fiber made of polymethyl methacrylate (PMMA) is used in this study. Although the scintillation efficiency of PMMA is very low, several photons are emitted through interaction with radiation. The fiber is transparent to the emitted photons, allowing a relatively long operating length, and a radiation distribution can be obtained continuously. This paper describes the principle of the position sensing method based on the time-of-flight technique and preliminary results obtained for 90Sr-90Y beta rays, 137Cs gamma rays, and 14 MeV neutrons. The spatial resolutions for these three kinds of radiation are 0.30 m, 0.37 m, and 0.13 m, and the detection efficiencies are 1.1 x 10^-3, 1.6 x 10^-7, and 5.4 x 10^-6, respectively, with a 10 m operating length. The results of a spectroscopic study on the optical properties of the fiber are also described. (author)

  20. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    DEFF Research Database (Denmark)

    Sadri, Sara; Madsen, Henrik; Mikkelsen, Peter Steen

    2009-01-01

    … situated near Copenhagen in Denmark. Rainfall series are extracted in three ways: (1) by maximum mean intensity; (2) by depth and duration … with duration for a given return period and name them DDF (depth-duration-frequency) curves. The copula approach does not assume the rainfall variables are independent or jointly normally distributed. For rainfall extracted using method 2, the marginal distribution of depth was found to fit the Generalized Pareto distribution while duration was found to fit the Gamma distribution, using the method of L-moments. The volume was fit with a generalized Pareto … distribution and the duration was fit with a Pearson type III distribution for rainfall extracted using method 3. The Clayton copula was found to be appropriate for bivariate analysis of rainfall depth and duration for both methods 2 and 3. DDF curves derived using the Clayton copula for depth and duration …

  1. The Nickel Mass Distribution of Normal Type II Supernovae

    Science.gov (United States)

    Müller, Tomás; Prieto, José L.; Pejcha, Ondřej; Clocchiatti, Alejandro

    2017-06-01

    Core-collapse supernova (SN) explosions expose the structure and environment of massive stars at the moment of their death. We use the global fitting technique of Pejcha & Prieto to estimate a set of physical parameters of 19 normal SNe II, such as their distance moduli, reddenings, 56Ni masses M_Ni, and explosion energies E_exp, from multicolor light curves and photospheric velocity curves. We confirm and characterize known correlations between M_Ni and bolometric luminosity at 50 days after the explosion, and between M_Ni and E_exp. We pay special attention to the observed distribution of M_Ni coming from a joint sample of 38 SNe II, which can be described as a skewed-Gaussian-like distribution between 0.005 M_sun and 0.280 M_sun, with a median of 0.031 M_sun, mean of 0.046 M_sun, standard deviation of 0.048 M_sun, and skewness of 3.050. We use a two-sample Kolmogorov-Smirnov test and a two-sample Anderson-Darling test to compare the observed distribution of M_Ni to results from theoretical hydrodynamical codes of core-collapse explosions with the neutrino mechanism presented in the literature. Our results show that the theoretical distributions obtained from the codes tested in this work, KEPLER and Prometheus Hot Bubble, are compatible with the observations irrespective of different pre-SN calibrations and different maximum mass of the progenitors.

  2. Modeling Electronic Skin Response to Normal Distributed Force

    Directory of Open Access Journals (Sweden)

    Lucia Seminara

    2018-02-01

    Full Text Available The reference electronic skin is a sensor array based on PVDF (Polyvinylidene fluoride piezoelectric polymers, coupled to a rigid substrate and covered by an elastomer layer. It is first evaluated how a distributed normal force (Hertzian distribution is transmitted to an extended PVDF sensor through the elastomer layer. A simplified approach based on Boussinesq’s half-space assumption is used to get a qualitative picture and extensive FEM simulations allow determination of the quantitative response for the actual finite elastomer layer. The ultimate use of the present model is to estimate the electrical sensor output from a measure of a basic mechanical action at the skin surface. However this requires that the PVDF piezoelectric coefficient be known a-priori. This was not the case in the present investigation. However, the numerical model has been used to fit experimental data from a real skin prototype and to estimate the sensor piezoelectric coefficient. It turned out that this value depends on the preload and decreases as a result of PVDF aging and fatigue. This framework contains all the fundamental ingredients of a fully predictive model, suggesting a number of future developments potentially useful for skin design and validation of the fabrication technology.
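
    As a simplified, hedged companion to the half-space argument mentioned above (a point load in place of the distributed Hertzian load, and invented numbers), the classical Boussinesq vertical-stress formula can be evaluated directly:

      # Sketch: Boussinesq vertical stress under a point load on an elastic half-space,
      # sigma_z = 3*F*z^3 / (2*pi*(r^2 + z^2)^(5/2)); a crude stand-in for the
      # distributed load transmitted through the elastomer layer.
      import numpy as np

      def boussinesq_sigma_z(force_n, r_m, z_m):
          """Vertical normal stress (Pa) at radial offset r and depth z below a point load."""
          return 3.0 * force_n * z_m**3 / (2.0 * np.pi * (r_m**2 + z_m**2) ** 2.5)

      force = 1.0                      # 1 N contact force (assumed)
      depth = 2e-3                     # sensor assumed 2 mm below the surface
      radii = np.linspace(0.0, 10e-3, 6)

      for r in radii:
          print(f"r = {r*1e3:4.1f} mm  sigma_z = {boussinesq_sigma_z(force, r, depth):8.1f} Pa")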

  3. Modelling of Uncertainty and Bi-Variable Maps

    Science.gov (United States)

    Nánásiová, Ol'ga; Pykacz, Jarosław

    2016-05-01

    The paper gives an overview and compares various bi-variable maps from orthomodular lattices into the unit interval. It focuses mainly on such bi-variable maps that may be used for constructing joint probability distributions for random variables which are not defined on the same Boolean algebra.

  4. Visualizing Tensor Normal Distributions at Multiple Levels of Detail.

    Science.gov (United States)

    Abbasloo, Amin; Wiens, Vitalis; Hermann, Max; Schultz, Thomas

    2016-01-01

    Despite the widely recognized importance of symmetric second order tensor fields in medicine and engineering, the visualization of data uncertainty in tensor fields is still in its infancy. A recently proposed tensorial normal distribution, involving a fourth order covariance tensor, provides a mathematical description of how different aspects of the tensor field, such as trace, anisotropy, or orientation, vary and covary at each point. However, this wealth of information is far too rich for a human analyst to take in at a single glance, and no suitable visualization tools are available. We propose a novel approach that facilitates visual analysis of tensor covariance at multiple levels of detail. We start with a visual abstraction that uses slice views and direct volume rendering to indicate large-scale changes in the covariance structure, and locations with high overall variance. We then provide tools for interactive exploration, making it possible to drill down into different types of variability, such as in shape or orientation. Finally, we allow the analyst to focus on specific locations of the field, and provide tensor glyph animations and overlays that intuitively depict confidence intervals at those points. Our system is demonstrated by investigating the effects of measurement noise on diffusion tensor MRI, and by analyzing two ensembles of stress tensor fields from solid mechanics.

  5. Testing for bivariate spherical symmetry

    NARCIS (Netherlands)

    Einmahl, J.H.J.; Gantner, M.

    2012-01-01

    An omnibus test for spherical symmetry in R2 is proposed, employing localized empirical likelihood. The thus obtained test statistic is distribution free under the null hypothesis. The asymptotic null distribution is established and critical values for typical sample sizes, as well as the asymptotic

  6. Some case studies of skewed (and other ab-normal) data distributions arising in low-level environmental research

    International Nuclear Information System (INIS)

    Currie, L.A.

    2001-01-01

    Three general classes of skewed data distributions have been encountered in research on background radiation, chemical and radiochemical blanks, and low levels of 85Kr and 14C in the atmosphere and the cryosphere. The first class of skewed data can be considered to be theoretically, or fundamentally, skewed. It is typified by the exponential distribution of inter-arrival times for nuclear counting events in a Poisson process. As part of a study of the nature of low-level (anti-coincidence) Geiger-Mueller counter background radiation, tests were performed on the Poisson distribution of counts, the uniform distribution of arrival times, and the exponential distribution of inter-arrival times. The real laboratory system, of course, failed the (inter-arrival time) test - for very interesting reasons, linked to the physics of the measurement process. The second, computationally skewed, class relates to skewness induced by non-linear transformations. It is illustrated by non-linear concentration estimates from inverse calibration, and by bivariate blank corrections for low-level 14C-12C aerosol data that led to highly asymmetric uncertainty intervals for the biomass carbon contribution to urban ''soot''. The third, environmentally skewed, data class relates to a universal problem for the detection of excursions above blank or baseline levels: namely, the widespread occurrence of ab-normal distributions of environmental and laboratory blanks. This is illustrated by the search for fundamental factors that lurk behind skewed frequency distributions of sulfur laboratory blanks and 85Kr environmental baselines, and by the application of robust statistical procedures for reliable detection decisions in the face of skewed isotopic carbon procedural blanks with few degrees of freedom. (orig.)
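
    A minimal sketch of the first, "fundamentally skewed" case: simulate an ideal Poisson counting process and test the exponential distribution of inter-arrival times. The background rate below is an assumption; a real counter may well fail such a test, as the abstract notes.

      # Sketch: for an ideal Poisson counting process, inter-arrival times are
      # exponential; a Kolmogorov-Smirnov test should then not reject.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      rate = 0.02                                   # counts per second (assumed background rate)
      inter_arrival = rng.exponential(scale=1.0 / rate, size=2000)

      # KS test of exponentiality against the known rate used in the simulation
      d, p = stats.kstest(inter_arrival, 'expon', args=(0.0, 1.0 / rate))
      print(f"KS statistic = {d:.3f}, p-value = {p:.3f}")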

  7. Joint association analysis of bivariate quantitative and qualitative traits.

    Science.gov (United States)

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with not too small MAF.
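
    A hedged sketch of the data-generating model assumed in this abstract: the quantitative trait and the latent liability are bivariate normal, and the qualitative trait is the liability thresholded (probit). The correlation and threshold are invented, and the likelihood-ratio tests themselves are not reproduced here.

      # Sketch: quantitative trait Q and latent liability L are bivariate normal;
      # the qualitative trait D is 1 when L exceeds a threshold (probit model).
      import numpy as np

      rng = np.random.default_rng(6)
      n, rho, threshold = 5000, 0.5, 1.0

      cov = np.array([[1.0, rho],
                      [rho, 1.0]])
      q, latent = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
      d = (latent > threshold).astype(int)          # observed qualitative trait

      print(f"prevalence of D = 1: {d.mean():.3f}")
      print(f"mean Q in cases vs controls: {q[d == 1].mean():.2f} vs {q[d == 0].mean():.2f}")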

  8. Testing for Bivariate Spherical Symmetry

    NARCIS (Netherlands)

    Einmahl, J.H.J.; Gantner, M.

    2010-01-01

    An omnibus test for spherical symmetry in R2 is proposed, employing localized empirical likelihood. The thus obtained test statistic is distri- bution-free under the null hypothesis. The asymptotic null distribution is established and critical values for typical sample sizes, as well as the

  9. A new family of skewed slash distributions generated by the normal kernel

    Directory of Open Access Journals (Sweden)

    Bindu Punathumparambath

    2013-05-01

    Full Text Available The present paper is a generalization of the recent paper by Nadarajah and Kotz (2003) (Skewed distributions generated by the normal kernel, "Statistics & Probability Letters", 65, pp. 269-277). The new family of univariate skewed slash distributions generated by the normal kernel arises as the ratio of a skewed distribution generated by the normal kernel and an independent uniform power function distribution. The properties of the resulting distributions are studied. The normal, skew normal, slash (slash normal) and skew slash distributions are special cases of this new family. The normal distribution belongs to this family, since when the skewness parameter is zero and the tail parameter tends to infinity the skew slash distribution generated by the normal kernel reduces to the normal distribution. The slash normal family also belongs to this family when the skewness parameter is zero. These distributions provide alternative choices in simulation studies and, in particular, in fitting skewed data sets with heavy tails. We believe that the new class will be useful for analyzing data sets having skewness and heavy tails. Heavy-tailed distributions are commonly found in complex multi-component systems in fields such as ecology, microarray analysis, biometry, economics, sociology, internet traffic, finance, and business. We are working on maximum likelihood estimation of the parameters using the EM algorithm and on applying our models to the analysis of genetic data sets.
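
    A minimal sketch of the construction described, with assumed parameter values (this is not the authors' code): a skew-slash draw is generated as a skew-normal variate divided by an independent power-function variable U**(1/q), where q is the tail parameter.

      # Sketch: draws from a skew-slash distribution (skew-normal kernel / power-function denominator).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n, alpha, q = 20000, 3.0, 2.0     # skewness alpha and tail parameter q are assumptions

      z = stats.skewnorm.rvs(a=alpha, size=n, random_state=rng)  # skew-normal kernel
      v = rng.uniform(size=n) ** (1.0 / q)                       # power-function denominator
      x = z / v                                                  # skew-slash draws: heavy tails

      print(f"sample median: {np.median(x):.2f}")
      print("0.5% and 99.5% quantiles:", np.quantile(x, [0.005, 0.995]).round(1))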

  10. Testing for bivariate spherical symmetry

    OpenAIRE

    Einmahl, J.H.J.; Gantner, M.

    2012-01-01

    An omnibus test for spherical symmetry in R2 is proposed, employing localized empirical likelihood. The thus obtained test statistic is distri- bution-free under the null hypothesis. The asymptotic null distribution is established and critical values for typical sample sizes, as well as the asymptotic ones, are presented. In a simulation study, the good perfor- mance of the test is demonstrated. Furthermore, a real data example is presented.

  11. Bivariate hard thresholding in wavelet function estimation

    OpenAIRE

    Piotr Fryzlewicz

    2007-01-01

    We propose a generic bivariate hard thresholding estimator of the discrete wavelet coefficients of a function contaminated with i.i.d. Gaussian noise. We demonstrate its good risk properties in a motivating example, and derive upper bounds for its mean-square error. Motivated by the clustering of large wavelet coefficients in real-life signals, we propose two wavelet denoising algorithms, both of which use specific instances of our bivariate estimator. The BABTE algorithm uses basis averaging...

  12. Normal plantar weight distribution pattern and its variations with ...

    African Journals Online (AJOL)

    Early osteoarthritic changes at the knee result in altered plantar weight distribution pattern during stand, minisquat, squat and one leg stand positions. To study and quantify these plantar weight distribution variations with changes in static functional position, a cross-sectional study was conducted. A total of 202 subjects, ...

  13. Bivariate extreme value with application to PM10 concentration analysis

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-05-01

    This study focuses on bivariate extremes of renormalized componentwise maxima with generalized extreme value distributions as marginal functions. The limiting joint distributions of several parametric models are presented. Maximum likelihood estimation is employed for parameter estimation, and the best model is selected based on the Akaike Information Criterion. The weekly and monthly componentwise maxima series are extracted from the original observations of daily maximum PM10 data for two air quality monitoring stations located in Pasir Gudang and Johor Bahru. Ten years of data are considered for both stations, from 2001 to 2010. The asymmetric negative logistic model is found to be the best-fitting bivariate extreme model for both the weekly and monthly maxima componentwise series. However, the dependence parameters show that the variables in the weekly maxima series are more dependent on each other than those in the monthly maxima series.
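
    As a hedged first step of such an analysis (synthetic maxima, not the Pasir Gudang or Johor Bahru data), a GEV marginal can be fitted to a componentwise maxima series with scipy:

      # Sketch: fitting a GEV marginal to componentwise (e.g. monthly) maxima, as a
      # first step before joint bivariate extreme-value modelling. Synthetic data only.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)

      # stand-in for monthly maxima of daily PM10 at one station (120 months)
      monthly_max = stats.genextreme.rvs(c=-0.1, loc=80.0, scale=15.0,
                                         size=120, random_state=rng)

      shape, loc, scale = stats.genextreme.fit(monthly_max)
      rl_100 = stats.genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)

      print(f"fitted shape={shape:.2f}, loc={loc:.1f}, scale={scale:.1f}")
      print(f"level exceeded on average once per 100 months: {rl_100:.1f}")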

  14. Efficient algorithms for estimating the width of nearly normal distributions

    International Nuclear Information System (INIS)

    Akerlof, C.W.

    1983-01-01

    Typical physics data samples often conform to Gaussian distributions with admixtures of more slowly varying backgrounds. Under such circumstances the standard deviation is known to be a poor statistical measure of distribution width. As an alternative, the performance of Gini's mean difference is compared with the standard deviation and the mean deviation. Variants which sum over subsets of all possible pairs are shown to have statistical efficiencies comparable to the mean difference and mean deviation but do not require extensive data storage or a priori knowledge of the sample mean. These statistics are reasonable candidates for monitoring the distribution width of a real time data stream. (orig.)
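
    A hedged sketch comparing the three width statistics on a Gaussian peak with a flat background admixture. The full pairwise Gini mean difference is computed here, not the subset-sum variants discussed in the abstract, and the mixture fractions are invented.

      # Sketch: standard deviation, mean (absolute) deviation and Gini's mean difference
      # for a Gaussian peak sitting on a slowly varying (uniform) background.
      import numpy as np

      rng = np.random.default_rng(9)

      signal = rng.normal(loc=0.0, scale=1.0, size=900)
      background = rng.uniform(-10.0, 10.0, size=100)     # 10% flat background admixture
      x = np.concatenate([signal, background])

      std = x.std(ddof=1)
      mean_dev = np.abs(x - x.mean()).mean()
      # Gini's mean difference: average |x_i - x_j| over all ordered pairs i != j
      diffs = np.abs(x[:, None] - x[None, :])
      gini_md = diffs.sum() / (len(x) * (len(x) - 1))

      print(f"standard deviation  : {std:.2f}")
      print(f"mean deviation      : {mean_dev:.2f}")
      print(f"Gini mean difference: {gini_md:.2f}")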

  15. A method to dynamic stochastic multicriteria decision making with log-normally distributed random variables.

    Science.gov (United States)

    Wang, Xin-Fan; Wang, Jian-Qiang; Deng, Sheng-Yue

    2013-01-01

    We investigate dynamic stochastic multicriteria decision making (SMCDM) problems in which the criterion values take the form of log-normally distributed random variables and the argument information is collected from different periods. We propose two new geometric aggregation operators, the log-normal distribution weighted geometric (LNDWG) operator and the dynamic log-normal distribution weighted geometric (DLNDWG) operator, and develop a method for dynamic SMCDM with log-normally distributed random variables. This method uses the DLNDWG and LNDWG operators to aggregate the log-normally distributed criterion values, uses Shannon's entropy model to generate the time weight vector, and uses the expectations and variances of the log-normal distributions to rank the alternatives and select the best one. Finally, an example is given to illustrate the feasibility and effectiveness of the developed method.

  16. Elk Distributions Relative to Spring Normalized Difference Vegetation Index Values

    OpenAIRE

    Samuel T. Smallidge; Terrell T. Baker; Dawn VanLeeuwen; William R. Gould; Bruce C. Thompson

    2010-01-01

    Rocky Mountain elk (Cervus elaphus) that winter near San Antonio Mountain in northern New Mexico provide important recreational and economic benefits while creating management challenges related to temporospatial variation in their spring movements. Our objective was to examine spring distributions of elk in relation to vegetative emergence as it progresses across the landscape as measured by remote sensing. Spring distributions of elk were closely associated with greater photosynthetic activ...

  17. Normalization.

    Science.gov (United States)

    Cuevas, Eduardo J.

    1997-01-01

    Discusses cornerstone of Montessori theory, normalization, which asserts that if a child is placed in an optimum prepared environment where inner impulses match external opportunities, the undeviated self emerges, a being totally in harmony with its surroundings. Makes distinctions regarding normalization, normalized, and normality, indicating how…

  18. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
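
    A rough, hedged Monte Carlo in the spirit of this claim: sums of positive, right-skewed summands are compared on the raw and log scales. The choice of lognormal summands and the number of terms are assumptions for illustration, not the paper's exact class of variables.

      # Sketch: a sum of positive, right-skewed summands; skewness on the raw scale
      # versus after taking logs (moment check only; summand choice is illustrative).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)
      n_terms, n_sums = 25, 50000

      summands = rng.lognormal(mean=0.0, sigma=1.0, size=(n_sums, n_terms))
      s = summands.sum(axis=1)

      print(f"skewness of the sums       : {stats.skew(s):.2f}")
      print(f"skewness after taking logs : {stats.skew(np.log(s)):.2f}")
      # A Gaussian limit would drive both toward 0; here log(s) is typically much
      # closer to symmetric, i.e. the sums look closer to lognormal than to normal.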

  19. Stellar Distributions and NIR Colours of Normal Galaxies

    NARCIS (Netherlands)

    Peletier, R. F.; Grijs, R. de

    1997-01-01

    Abstract: We discuss some results of a morphological study of edge-on galaxies, based on optical and especially near-infrared surface photometry. We find that the vertical surface brightness distributions of galaxies are fitted very well by exponential profiles, much better than by isothermal

  20. Different distribution of adriamycin in normal and leukaemic rats

    NARCIS (Netherlands)

    Sonneveld, P.; Bekkum, D.W. van

    1981-01-01

    Adriamycin (ADR) accumulates in well-perfused organs in the rat. This effect is especially evident for long periods in marrow and spleen of healthy animals. In rats bearing the Brown Norway Acute Myeloid Leukaemia (BNML) the in vitro distribution is significantly different. Maximum ADR levels in

  1. The retest distribution of the visual field summary index mean deviation is close to normal.

    Science.gov (United States)

    Anderson, Andrew J; Cheng, Allan C Y; Lau, Samantha; Le-Pham, Anne; Liu, Victor; Rahman, Farahnaz

    2016-09-01

    When modelling optimum strategies for how best to determine visual field progression in glaucoma, it is commonly assumed that the summary index mean deviation (MD) is normally distributed on repeated testing. Here we tested whether this assumption is correct. We obtained 42 reliable 24-2 Humphrey Field Analyzer SITA standard visual fields from one eye of each of five healthy young observers, with the first two fields excluded from analysis. Previous work has shown that although MD variability is higher in glaucoma, the shape of the MD distribution is similar to that found in normal visual fields. A Shapiro-Wilks test determined any deviation from normality. Kurtosis values for the distributions were also calculated. Data from each observer passed the Shapiro-Wilks normality test. Bootstrapped 95% confidence intervals for kurtosis encompassed the value for a normal distribution in four of five observers. When examined with quantile-quantile plots, distributions were close to normal and showed no consistent deviations across observers. The retest distribution of MD is not significantly different from normal in healthy observers, and so is likely also normally distributed - or nearly so - in those with glaucoma. Our results increase our confidence in the results of influential modelling studies where a normal distribution for MD was assumed. © 2016 The Authors Ophthalmic & Physiological Optics © 2016 The College of Optometrists.
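
    A hedged sketch of the kind of check described, on simulated MD values for a single observer (40 values; the mean and spread are invented, not the study's data): a Shapiro-Wilk test plus a bootstrap confidence interval for kurtosis.

      # Sketch: normality check of repeated mean-deviation (MD) values for one observer.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)
      md = rng.normal(loc=-0.5, scale=0.6, size=40)     # stand-in for 40 retest MD values (dB)

      w, p = stats.shapiro(md)
      print(f"Shapiro-Wilk: W = {w:.3f}, p = {p:.3f}")

      boot_kurt = [stats.kurtosis(rng.choice(md, size=md.size, replace=True), fisher=False)
                   for _ in range(2000)]
      lo, hi = np.percentile(boot_kurt, [2.5, 97.5])
      print(f"bootstrap 95% CI for kurtosis: ({lo:.2f}, {hi:.2f})   (normal value is 3)")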

  2. Penalized Maximum Likelihood Estimation for univariate normal mixture distributions

    International Nuclear Information System (INIS)

    Ridolfi, A.; Idier, J.

    2001-01-01

    Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill posed optimization problem. Ill posedness is solved by penalizing the likelihood function. In the Bayesian framework, it amounts to incorporating an inverted gamma prior in the likelihood function. A penalized version of the EM algorithm is derived, which is still explicit and which intrinsically assures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test

  3. Normal distribution of standing balance for healthy Danish children

    DEFF Research Database (Denmark)

    Pedersen, Line Kjeldgaard; Ghasemi, Habib; Rahbek, Ole

    2013-01-01

    … in children with orthopedic disabilities undergoing surgical procedures. Recent technology provides extremely usable sway analysis of balance parameters, but a normal material for the standing balance of healthy children is lacking. Purpose/Aim of Study: First, to assess standing balance in healthy Danish … compared to open eyes (p = 0.0000), and this difference was strongest in the lower grades. Girls had a significantly better balance than boys with open and closed eyes, especially in the higher grades. Conclusions: In this study we measured the effect of BMI, age, gender and visual information on standing

  4. On the matched pairs sign test using bivariate ranked set sampling ...

    African Journals Online (AJOL)

    A matched pairs sign test based on bivariate ranked set sampling (BVRSS) is introduced and investigated. We show that this test is asymptotically more efficient than its counterpart sign test based on a bivariate simple random sample (BVSRS). The asymptotic null distribution and the efficiency of the test are derived.

  5. Elk Distributions Relative to Spring Normalized Difference Vegetation Index Values

    International Nuclear Information System (INIS)

    Smallidge, S.T.; Baker, T.T.; VanLeeuwen, D.; Gould, W.R.; Thompson, B.C.

    2010-01-01

    Rocky Mountain elk (Cervus elaphus) that winter near San Antonio Mountain in northern New Mexico provide important recreational and economic benefits while creating management challenges related to temporospatial variation in their spring movements. Our objective was to examine spring distributions of elk in relation to vegetative emergence as it progresses across the landscape as measured by remote sensing. Spring distributions of elk were closely associated with greater photosynthetic activity of spring vegetation in 2 of 3 years as determined using NDVI values derived from AVHRR datasets. Observed elk locations were up to 271% greater than expected in the category representing the most photosynthetic activity. This association was not observed when analyses at a finer geographic scale were conducted. Managers facing challenges involving human-wildlife interactions and land-use issues should consider environmental conditions that may influence variation in elk association with greener portions of the landscape.

  6. Elk Distributions Relative to Spring Normalized Difference Vegetation Index Values

    Directory of Open Access Journals (Sweden)

    Samuel T. Smallidge

    2010-01-01

    Full Text Available Rocky Mountain elk (Cervus elaphus that winter near San Antonio Mountain in northern New Mexico provide important recreational and economic benefits while creating management challenges related to temporospatial variation in their spring movements. Our objective was to examine spring distributions of elk in relation to vegetative emergence as it progresses across the landscape as measured by remote sensing. Spring distributions of elk were closely associated with greater photosynthetic activity of spring vegetation in 2 of 3 years as determined using NDVI values derived from AVHRR datasets. Observed elk locations were up to 271% greater than expected in the category representing the most photosynthetic activity. This association was not observed when analyses at a finer geographic scale were conducted. Managers facing challenges involving human-wildlife interactions and land-use issues should consider environmental conditions that may influence variation in elk association with greener portions of the landscape.

  7. Superconductor-normal-superconductor with distributed Sharvin point contacts

    Science.gov (United States)

    Holcomb, Matthew J.; Little, William A.

    1994-01-01

    A non-linear superconducting junction device comprising a layer of high transition temperature superconducting material which is superconducting at an operating temperature, a layer of metal in contact with the layer of high temperature superconducting material and which remains non-superconducting at the operating temperature, and a metal material which is superconducting at the operating temperature and which forms distributed Sharvin point contacts with the metal layer.

  8. A Novel Generalized Normal Distribution for Human Longevity and other Negatively Skewed Data

    Science.gov (United States)

    Robertson, Henry T.; Allison, David B.

    2012-01-01

    Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose an alternative approach of a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by 1. An intuitively straightforward genesis; 2. Closed forms for the pdf, cdf, mode, quantile, and hazard functions; and 3. Accessibility to non-statisticians, based on its close relationship to the normal distribution. PMID:22623974

  9. A novel generalized normal distribution for human longevity and other negatively skewed data.

    Science.gov (United States)

    Robertson, Henry T; Allison, David B

    2012-01-01

    Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose an alternative approach of a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by 1. An intuitively straightforward genesis; 2. Closed forms for the pdf, cdf, mode, quantile, and hazard functions; and 3. Accessibility to non-statisticians, based on its close relationship to the normal distribution.

  10. Annual rainfall statistics for stations in the Top End of Australia: normal and log-normal distribution analysis

    International Nuclear Information System (INIS)

    Vardavas, I.M.

    1992-01-01

    A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
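
    A hedged sketch of the end product of such an analysis, namely the value with a specified average exceedance probability under normal and log-normal fits, using synthetic annual totals; the paper's χ²-test and ζ-test are not reproduced here.

      # Sketch: value at a specified exceedance probability under normal and
      # log-normal fits to annual rainfall totals (synthetic data only).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(12)
      rain_mm = rng.lognormal(mean=7.2, sigma=0.25, size=60)   # stand-in for 60 annual totals

      p_exceed = 0.01                                          # e.g. 1-in-100-year level

      # normal fit
      mu, sd = rain_mm.mean(), rain_mm.std(ddof=1)
      q_normal = stats.norm.ppf(1.0 - p_exceed, loc=mu, scale=sd)

      # log-normal fit (fit on the log scale)
      lmu, lsd = np.log(rain_mm).mean(), np.log(rain_mm).std(ddof=1)
      q_lognormal = float(np.exp(stats.norm.ppf(1.0 - p_exceed, loc=lmu, scale=lsd)))

      print(f"normal fit    : {q_normal:8.1f} mm")
      print(f"log-normal fit: {q_lognormal:8.1f} mm")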

  11. Normal and Student's t distributions and their applications

    CERN Document Server

    Ahsanullah, Mohammad; Shakil, Mohammad

    2014-01-01

    The most important properties of normal and Student t-distributions are presented. A number of applications of these properties are demonstrated. New related results dealing with the distributions of the sum, product and ratio of the independent normal and Student distributions are presented. The materials will be useful to the advanced undergraduate and graduate students and practitioners in the various fields of science and engineering.

  12. Kullback–Leibler Divergence of the γ–ordered Normal over t–distribution

    OpenAIRE

    Toulias, T-L.; Kitsos, C-P.

    2012-01-01

    The aim of this paper is to evaluate and study the Kullback–Leibler divergence of the γ–ordered Normal distribution, a generalization of the Normal distribution that emerges from the generalized Fisher's information measure, over the scaled t–distribution. We investigate this evaluation through a series of bounds and approximations, while the asymptotic behavior of the divergence is also studied. Moreover, we obtain a generalization of the known Kullback–Leibler information measure betwe...

  13. Evaluation of Kurtosis into the product of two normally distributed variables

    Science.gov (United States)

    Oliveira, Amílcar; Oliveira, Teresa; Seijas-Macías, Antonio

    2016-06-01

    Kurtosis (κ) is a measure of the "peakedness" of the distribution of a real-valued random variable. We study the behaviour of the kurtosis of the product of two normally distributed variables. The product of two normal variables arises in many areas of study, such as physics, economics and psychology. Normal variables have a constant value of kurtosis (κ = 3), independently of the values of the two parameters, mean and variance. In fact, the excess kurtosis is defined as κ − 3, so the normal distribution has zero excess kurtosis. The kurtosis of the product of two normally distributed variables is a function of the parameters of the two variables and of the correlation between them, and its range is [0, 6] for independent variables and [0, 12] when correlation between them is allowed.
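
    A quick, hedged empirical check of these statements; the means are set to zero here and the values are sample-based, so they are approximate.

      # Sketch: Monte Carlo kurtosis of the product of two standard normal variables,
      # for independent and for correlated pairs.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(13)
      n = 1_000_000

      for rho in (0.0, 0.5, 0.9):
          cov = [[1.0, rho], [rho, 1.0]]
          x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
          k = stats.kurtosis(x * y, fisher=True)          # excess kurtosis
          print(f"rho = {rho:.1f}: excess kurtosis of X*Y ~ {k:.2f}")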

  14. Asymptotic normality of conditional distribution estimation in the single index model

    Directory of Open Access Journals (Sweden)

    Hamdaoui Diaa Eddine

    2017-08-01

    Full Text Available This paper deals with the estimation of the conditional distribution function based on the single-index model. The asymptotic normality of the conditional distribution estimator is established. Moreover, as an application, the asymptotic (1 − γ) confidence interval of the conditional distribution function is given for 0 < γ < 1.

  15. Asymptotic normality of conditional distribution estimation in the single index model

    OpenAIRE

    Hamdaoui Diaa Eddine; Bouchentouf Amina Angelika; Rabhi Abbes; Guendouzi Toufik

    2017-01-01

    This paper deals with the estimation of conditional distribution function based on the single-index model. The asymptotic normality of the conditional distribution estimator is established. Moreover, as an application, the asymptotic (1 − γ) confidence interval of the conditional distribution function is given for 0 < γ < 1.

  16. The law of distribution of light beam direction fluctuations in telescopes. [normal density functions

    Science.gov (United States)

    Divinskiy, M. L.; Kolchinskiy, I. G.

    1974-01-01

    The distribution of deviations from mean star trail directions was studied on the basis of 105 star trails. It was found that about 93% of the trails yield a distribution in agreement with the normal law. About 4% of the star trails agree with the Charlier distribution.

  17. Bivariate flow cytometric analysis and sorting of different types of maize starch grains.

    Science.gov (United States)

    Zhang, Xudong; Feng, Jiaojiao; Wang, Heng; Zhu, Jianchu; Zhong, Yuyue; Liu, Linsan; Xu, Shutu; Zhang, Renhe; Zhang, Xinghua; Xue, Jiquan; Guo, Dongwei

    2018-02-01

    Particle-size distribution, granular structure, and composition significantly affect the physicochemical properties, rheological properties, and nutritional function of starch. Flow cytometry and flow sorting are widely considered convenient and efficient ways of classifying and separating natural biological particles or other substances into subpopulations, respectively, based on the differential response of each component to stimulation by a light beam; the results allow for the correlation analysis of parameters. In this study, different types of starches isolated from waxy maize, sweet maize, high-amylose maize, pop maize, and normal maize were initially classified into various subgroups by flow cytometer and then collected through flow sorting to observe their morphology and particle-size distribution. The results showed that a 0.25% Gelzan solution served as an optimal reagent for keeping individual starch particles homogeneously dispersed in suspension for a relatively long time. The bivariate flow cytometric population distributions indicated that the starches of normal maize, sweet maize, and pop maize were divided into two subgroups, whereas high-amylose maize starch had only one subgroup. Waxy maize starch, conversely, showed three subpopulations. The subgroups sorted by flow cytometer were determined and verified in terms of morphology and granule size by scanning electron microscopy and laser particle distribution analyzer. Results showed that flow cytometry can be regarded as a novel method for classifying and sorting starch granules. © 2017 International Society for Advancement of Cytometry. © 2017 International Society for Advancement of Cytometry.

  18. Limit distributions for the terms of central order statistics under power normalization

    OpenAIRE

    El Sayed M. Nigm

    2007-01-01

    In this paper the limiting distributions for sequences of central terms under power nonrandom normalization are obtained. The classes of the limit types having domain of L-attraction are investigated.

  19. Limit distributions for the terms of central order statistics under power normalization

    Directory of Open Access Journals (Sweden)

    El Sayed M. Nigm

    2007-12-01

    Full Text Available In this paper the limiting distributions for sequences of central terms under power nonrandom normalization are obtained. The classes of the limit types having domain of L-attraction are investigated.

  20. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations.

  1. Semi-automated detection of aberrant chromosomes in bivariate flow karyotypes

    NARCIS (Netherlands)

    Boschman, G. A.; Manders, E. M.; Rens, W.; Slater, R.; Aten, J. A.

    1992-01-01

    A method is described that is designed to compare, in a standardized procedure, bivariate flow karyotypes of Hoechst 33258 (HO)/Chromomycin A3 (CA) stained human chromosomes from cells with aberrations with a reference flow karyotype of normal chromosomes. In addition to uniform normalization of

  2. An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis

    Science.gov (United States)

    Diwakar, Rekha

    2017-01-01

    Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data which does not contain negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…

  3. Distribution of 51Cr labeled leukemia cells in mice: Comparison with representative normal cells

    International Nuclear Information System (INIS)

    Boranic, M.; Radacic, M.

    1978-01-01

    Cells of two transplantable leukemias of mice, one myeloid and one lymphoid, were labeled with 51 Cr in order to follow their distribution in hemopoietic and parenchymatous organs and blood of syngeneic recipients. Distribution of myeloid leukemia cells was compared with that of regenerating bone marrow cells and normal spleen cells. The organ distribution of myeloid leukemia cells was essentially different from that of cells of regenerating bone marrow, and both were different from that of normal spleen cells. Cells of lymphoid leukemia, which are presumably of B-lymphocyte origin, were compared with a B-lymphocyte enriched population, obtained from the lymph nodes of so-called TIR mice (thymectomized, irradiated, and reconstituted with syngeneic bone marrow), and with spleen cells of normal mice. The three patterns of organ distribution were different. It is concluded that the two leukemias studied each have a specific and characteristic distribution. (author)

  4. On Robustness of the Normal-Theory Based Asymptotic Distributions of Three Reliability Coefficient Estimates.

    Science.gov (United States)

    Yuan, Ke-Hai; Bentler, Peter M.

    2002-01-01

    Examined the asymptotic distributions of three reliability coefficient estimates: (1) sample coefficient alpha; (2) reliability estimate of a composite score following factor analysis; and (3) maximal reliability of a linear combination of item scores after factor analysis. Findings show that normal theory based asymptotic distributions for these…

  5. A study of the up-and-down method for non-normal distribution functions

    DEFF Research Database (Denmark)

    Vibholm, Svend; Thyregod, Poul

    1988-01-01

    The assessment of breakdown probabilities is examined by the up-and-down method. The exact maximum-likelihood estimates for a number of response patterns are calculated for three different distribution functions and are compared with the estimates corresponding to the normal distribution. Estimates...

  6. Computer program determines exact two-sided tolerance limits for normal distributions

    Science.gov (United States)

    Friedman, H. A.; Webb, S. R.

    1968-01-01

    Computer program determines by numerical integration the exact statistical two-sided tolerance limits, when the proportion between the limits is at least a specified number. The program is limited to situations in which the underlying probability distribution for the population sampled is the normal distribution with unknown mean and variance.

  7. Total LDH and LDH isoenzyme distribution in the serum of normal children

    NARCIS (Netherlands)

    Heiden, C.V.D.; Desplanque, J.; Stoop, J.W.; Wadman, S.K.

    1968-01-01

    Total LDH activity and LDH isoenzyme distribution were determined in sera of normal children aged 4 to 13 years and compared to a control group of adult sera. It was found that in children the level of total LDH activity and the isoenzyme distribution did not differ significantly from that in

  8. Comparison of CSF Distribution between Idiopathic Normal Pressure Hydrocephalus and Alzheimer Disease.

    Science.gov (United States)

    Yamada, S; Ishikawa, M; Yamamoto, K

    2016-07-01

    CSF volumes in the basal cistern and Sylvian fissure are increased in both idiopathic normal pressure hydrocephalus and Alzheimer disease, though the differences in these volumes in idiopathic normal pressure hydrocephalus and Alzheimer disease have not been well-described. Using CSF segmentation and volume quantification, we compared the distribution of CSF in idiopathic normal pressure hydrocephalus and Alzheimer disease. CSF volumes were extracted from T2-weighted 3D spin-echo sequences on 3T MR imaging and quantified semi-automatically. We compared the volumes and ratios of the ventricles and subarachnoid spaces after classification in 30 patients diagnosed with idiopathic normal pressure hydrocephalus, 10 with concurrent idiopathic normal pressure hydrocephalus and Alzheimer disease, 18 with Alzheimer disease, and 26 control subjects 60 years of age or older. Brain to ventricle ratios at the anterior and posterior commissure levels and 3D volumetric convexity cistern to ventricle ratios were useful indices for the differential diagnosis of idiopathic normal pressure hydrocephalus or idiopathic normal pressure hydrocephalus with Alzheimer disease from Alzheimer disease, similar to the z-Evans index and callosal angle. The most distinctive characteristics of the CSF distribution in idiopathic normal pressure hydrocephalus were small convexity subarachnoid spaces and the large volume of the basal cistern and Sylvian fissure. The distribution of the subarachnoid spaces in the idiopathic normal pressure hydrocephalus with Alzheimer disease group was the most deformed among these 3 groups, though the mean ventricular volume of the idiopathic normal pressure hydrocephalus with Alzheimer disease group was intermediate between that of the idiopathic normal pressure hydrocephalus and Alzheimer disease groups. The z-axial expansion of the lateral ventricle and compression of the brain just above the ventricle were the common findings in the parameters for differentiating

  9. DBNorm: normalizing high-density oligonucleotide microarray data based on distributions.

    Science.gov (United States)

    Meng, Qinxue; Catchpoole, Daniel; Skillicorn, David; Kennedy, Paul J

    2017-11-29

    Data from patients with rare diseases are often produced using different platforms and probe sets because patients are widely distributed in space and time. Aggregating such data requires a method of normalization that makes patient records comparable. This paper proposes DBNorm, implemented as an R package, an algorithm that normalizes arbitrarily distributed data to a common, comparable form. Specifically, DBNorm merges data distributions by fitting functions to each of them and using the probability of each element under the fitted distribution to merge it into a global distribution. DBNorm contains state-of-the-art fitting functions, including polynomial, Fourier and Gaussian fits, and also allows users to define their own fitting functions if required. The performance of DBNorm is compared with z-score, average difference, quantile normalization and ComBat on a set of datasets, including several that are publicly available. The performance of these normalization methods is compared using statistics, visualization, and classification when class labels are known, based on a number of self-generated and public microarray datasets. The experimental results show that DBNorm achieves better normalization results than conventional methods. Finally, the approach has the potential to be applicable outside bioinformatics analysis.
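
    The distribution-merging step described above can be illustrated with a minimal Python sketch (this is not the DBNorm R package itself): each batch is fitted with a parametric distribution, every value is converted to its probability under that fit, and the probability is mapped into a common reference distribution. The Gaussian fit, the standard-normal reference, and the function name below are assumptions made only for illustration.

```python
import numpy as np
from scipy import stats

def normalize_to_reference(samples, reference=stats.norm(loc=0, scale=1)):
    """Illustrative distribution-based normalization (not the DBNorm package).

    Fit a Gaussian to each batch, convert every value to its probability
    under that fit, then map the probability into a common reference
    distribution so the merged records become comparable.
    """
    merged = []
    for x in samples:
        mu, sigma = stats.norm.fit(x)          # fit a distribution to this batch
        p = stats.norm.cdf(x, mu, sigma)       # probability of each element
        p = np.clip(p, 1e-6, 1 - 1e-6)         # keep the inverse CDF finite
        merged.append(reference.ppf(p))        # place it in the global distribution
    return merged

# toy example: two batches measured on very different scales
batch_a = np.random.default_rng(0).normal(10.0, 2.0, size=200)
batch_b = np.random.default_rng(1).normal(-3.0, 0.5, size=150)
combined = normalize_to_reference([batch_a, batch_b])
print([np.round(c.mean(), 2) for c in combined])  # both batches now centred near 0
```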

  10. Central limit theorems for classical likelihood ratio tests for high-dimensional normal distributions

    OpenAIRE

    Jiang, Tiefeng; Yang, Fan

    2013-01-01

    For random samples of size $n$ obtained from $p$-variate normal distributions, we consider the classical likelihood ratio tests (LRT) for their means and covariance matrices in the high-dimensional setting. These test statistics have been extensively studied in multivariate analysis, and their limiting distributions under the null hypothesis were proved to be chi-square distributions as $n$ goes to infinity and $p$ remains fixed. In this paper, we consider the high-dimensional case where both...

  11. The rank of a normally distributed matrix and positive definiteness of a noncentral Wishart distributed matrix

    NARCIS (Netherlands)

    Steerneman, A. G. M.; van Perlo-ten Kleij, Frederieke

    2008-01-01

    If X ~ N_{n×k}(M, I_n ⊗ Σ), then S = X'X has the noncentral Wishart distribution W_k'(n, Σ; Λ), where Λ = M'M. Here Σ is allowed to be singular. It is well known that if Λ = 0, then S has a (central) Wishart distribution and S is positive definite with

  12. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Full Text Available Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk, and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretations of process capability. In the literature, various surrogate process capability indices have been proposed for non-normality, but few sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data on the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
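
    As a rough illustration of one of the transformation-based approaches reviewed above, the sketch below Box-Cox-transforms skewed data together with its specification limits and then computes Cp and Cpk on the transformed scale. The simulated resistivity values and the specification limits are assumptions; this is a sketch of the general technique, not the paper's own code.

```python
import numpy as np
from scipy import stats, special

rng = np.random.default_rng(42)
resistivity = rng.lognormal(mean=1.0, sigma=0.35, size=200)   # skewed quality data
LSL, USL = 1.0, 8.0                                           # assumed specification limits

# transform data and spec limits to approximate normality
transformed, lam = stats.boxcox(resistivity)
t_lsl, t_usl = special.boxcox(LSL, lam), special.boxcox(USL, lam)

mu, sigma = transformed.mean(), transformed.std(ddof=1)
cp = (t_usl - t_lsl) / (6 * sigma)                 # potential capability
cpk = min(t_usl - mu, mu - t_lsl) / (3 * sigma)    # actual capability
print(f"lambda={lam:.3f}  Cp={cp:.2f}  Cpk={cpk:.2f}")
```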

  13. Robust modeling of differential gene expression data using normal/independent distributions: a Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Mojtaba Ganjali

    Full Text Available In this paper, the problem of identifying differentially expressed genes under different conditions using gene expression microarray data, in the presence of outliers, is discussed. For this purpose, the robust modeling of gene expression data using some powerful distributions known as normal/independent distributions is considered. These distributions include the Student's t and normal distributions which have been used previously, but also include extensions such as the slash, the contaminated normal and the Laplace distributions. The purpose of this paper is to identify differentially expressed genes by considering these distributional assumptions instead of the normal distribution. A Bayesian approach using the Markov Chain Monte Carlo method is adopted for parameter estimation. Two publicly available gene expression data sets are analyzed using the proposed approach. The use of the robust models for detecting differentially expressed genes is investigated. This investigation shows that the choice of model for differentiating gene expression data is very important. This is due to the small number of replicates for each gene and the existence of outlying data. Comparison of the performance of these models is made using different statistical criteria and the ROC curve. The method is illustrated using some simulation studies. We demonstrate the flexibility of these robust models in identifying differentially expressed genes.

  14. Optimal transformations leading to normal distributions of positron emission tomography standardized uptake values

    Science.gov (United States)

    Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert

    2018-02-01

    The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions. This limits utilization of powerful parametric statistical models for analyzing SUV measurements. An ad-hoc approach, which is frequently used in practice, is to blindly use a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors and assess the effects of therapy on the optimal transformations. Methods. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions at both pre and post treatment. This study included 57 patients who underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (publicly available dataset). In addition, to test the generality of our transformation methodology, we included analysis of 27 patients who underwent 18F-Fluorothymidine (18F-FLT) PET scans at our institution. Results. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. There were types of distributions encountered for both 18F-FDG and 18F-FLT where a log transformation was not optimal for providing normal SUV distributions. Conclusion. Optimization of the Box-Cox transformation offers a solution for identifying normal SUV transformations for when
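
    The transformation-selection step lends itself to a compact sketch: scan over candidate Box-Cox λ values, transform the SUVs, and keep the λ whose Shapiro-Wilk p-value is largest. The SUV values below are simulated and the λ grid is an assumption; this is only an illustration of the selection idea described in the record.

```python
import numpy as np
from scipy import stats

def optimal_boxcox_lambda(values, lambdas=np.linspace(-2, 2, 81)):
    """Return the Box-Cox lambda that maximizes the Shapiro-Wilk p-value."""
    best_lam, best_p = None, -1.0
    for lam in lambdas:
        transformed = stats.boxcox(values, lmbda=lam)   # lam = 0 gives the log transform
        p = stats.shapiro(transformed).pvalue
        if p > best_p:
            best_lam, best_p = lam, p
    return best_lam, best_p

# simulated, skewed SUVmax values standing in for the patient data
suv_max = np.random.default_rng(7).lognormal(mean=1.2, sigma=0.6, size=57)
lam, p = optimal_boxcox_lambda(suv_max)
print(f"optimal lambda = {lam:.2f}, Shapiro-Wilk p = {p:.3f}")
```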

  15. LOG-NORMAL DISTRIBUTION OF COSMIC VOIDS IN SIMULATIONS AND MOCKS

    Energy Technology Data Exchange (ETDEWEB)

    Russell, E.; Pycke, J.-R., E-mail: er111@nyu.edu, E-mail: jrp15@nyu.edu [Division of Science and Mathematics, New York University Abu Dhabi, P.O. Box 129188, Abu Dhabi (United Arab Emirates)

    2017-01-20

    Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets has a critical importance. If the number of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.
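
    The three-parameter log-normal fit at the core of this analysis can be sketched with scipy, which estimates a shape, location (shift), and scale parameter together. The synthetic "void radii" below are only a stand-in for the catalog data, and all parameter values are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# stand-in for void effective radii; real values come from the Cosmic Void Catalog
radii = stats.lognorm.rvs(s=0.5, loc=2.0, scale=8.0, size=1000, random_state=rng)

shape, loc, scale = stats.lognorm.fit(radii)       # three-parameter log-normal fit
print(f"shape(sigma)={shape:.3f}  loc={loc:.3f}  scale(e^mu)={scale:.3f}")

# skewness of the fitted distribution, the quantity the record relates to tree depth
print("fitted skewness:", stats.lognorm.stats(shape, loc, scale, moments='s'))
```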

  16. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation and load. This fact increases the number of stochastic inputs, and dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues, and a new approach is needed; probabilistic analysis provides a better one. Moreover, as distribution systems consist of small areas, the dependence between stochastic inputs should be considered. In this paper, probabilistic analysis based on Monte Carlo simulation is described and applied to a real system.

  17. Validation of MCDS by comparison of predicted with experimental velocity distribution functions in rarefied normal shocks

    Science.gov (United States)

    Pham-Van-diep, Gerald C.; Erwin, Daniel A.

    1989-01-01

    Velocity distribution functions in normal shock waves in argon and helium are calculated using Monte Carlo direct simulation. These are compared with experimental results for argon at M = 7.18 and for helium at M = 1.59 and 20. For both argon and helium, the variable-hard-sphere (VHS) model is used for the elastic scattering cross section, with the velocity dependence derived from a viscosity-temperature power-law relationship in the way normally used by Bird (1976).

  18. Solving Bivariate Polynomial Systems on a GPU

    International Nuclear Information System (INIS)

    Moreno Maza, Marc; Pan Wei

    2012-01-01

    We present a CUDA implementation of dense multivariate polynomial arithmetic based on Fast Fourier Transforms over finite fields. Our core routine computes on the device (GPU) the subresultant chain of two polynomials with respect to a given variable. This subresultant chain is encoded by values on a FFT grid and is manipulated from the host (CPU) in higher-level procedures. We have realized a bivariate polynomial system solver supported by our GPU code. Our experimental results (including detailed profiling information and benchmarks against a serial polynomial system solver implementing the same algorithm) demonstrate that our strategy is well suited for GPU implementation and provides large speedup factors with respect to pure CPU code.
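
    On a much smaller scale, and on the CPU rather than the GPU, the elimination idea behind this solver can be sketched with a computer algebra system: take the resultant of the two polynomials with respect to one variable, solve the resulting univariate polynomial, and back-substitute. The SymPy code below illustrates only the mathematical step, not the CUDA/FFT implementation described in the record; the example system is invented.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**2 - 5          # a small bivariate system
g = x*y - 2

# Eliminate y: the resultant is a univariate polynomial in x whose roots
# are the x-coordinates of the system's solutions.
res_x = sp.resultant(f, g, y)

solutions = []
for xv in sp.roots(sp.Poly(res_x, x)):
    for yv in sp.roots(sp.Poly(f.subs(x, xv), y)):
        if sp.simplify(g.subs({x: xv, y: yv})) == 0:   # keep only common roots
            solutions.append((xv, yv))
print(solutions)   # the four real solutions: (1, 2), (2, 1), (-1, -2), (-2, -1)
```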

  19. Austenite Grain Size Estimation from Chord Lengths of Logarithmic-Normal Distribution

    Directory of Open Access Journals (Sweden)

    Adrian H.

    2017-12-01

    Full Text Available A linear section of grains in a polyhedral material microstructure is a system of chords. The mean length of the chords is the linear grain size of the microstructure. For the prior austenite grains of low-alloy structural steels, the chord length is a random variable with a gamma or logarithmic-normal distribution. Statistical grain size estimation belongs to the quantitative metallographic problems. The so-called point estimation is a well-known procedure. The interval estimation (grain size confidence interval) for the gamma distribution was given elsewhere; for the logarithmic-normal distribution it is the subject of the present contribution. The statistical analysis is analogous to the one for the gamma distribution.
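
    For the logarithmic-normal case, a point estimate of the mean chord length (the linear grain size) and an approximate confidence interval can be sketched as follows. The chord sample is simulated, and the simple normal-approximation interval on the log scale (which ignores the uncertainty in the log-scale variance) stands in for the exact interval derived in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
chords = rng.lognormal(mean=3.0, sigma=0.4, size=250)   # chord lengths, micrometres

log_c = np.log(chords)
m, s, n = log_c.mean(), log_c.std(ddof=1), len(log_c)

# Mean of a log-normal variable: exp(mu + sigma^2 / 2)
grain_size = np.exp(m + s**2 / 2)

# Approximate 95% CI for mu, propagated to the mean chord length
half_width = stats.t.ppf(0.975, n - 1) * s / np.sqrt(n)
ci = (np.exp(m - half_width + s**2 / 2), np.exp(m + half_width + s**2 / 2))
print(f"mean chord length ~ {grain_size:.1f} um, approx. 95% CI ({ci[0]:.1f}, {ci[1]:.1f}) um")
```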

  20. Efficient estimation of semiparametric copula models for bivariate survival data

    KAUST Repository

    Cheng, Guang

    2014-01-01

    A semiparametric copula model for bivariate survival data is characterized by a parametric copula model of dependence and nonparametric models of two marginal survival functions. Efficient estimation for the semiparametric copula model has been recently studied for the complete data case. When the survival data are censored, semiparametric efficient estimation has only been considered for some specific copula models such as the Gaussian copulas. In this paper, we obtain the semiparametric efficiency bound and efficient estimation for general semiparametric copula models for possibly censored data. We construct an approximate maximum likelihood estimator by approximating the log baseline hazard functions with spline functions. We show that our estimates of the copula dependence parameter and the survival functions are asymptotically normal and efficient. Simple consistent covariance estimators are also provided. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2013 Elsevier Inc.

  1. Elastin distribution in the normal uterus, uterine leiomyomas, adenomyosis and adenomyomas: a comparison.

    Science.gov (United States)

    Zheng, Wei-Qiang; Ma, Rong; Zheng, Jian-Ming; Gong, Zhi-Jing

    2006-04-01

    To describe the histologic distribution of elastin in the nonpregnant human uterus, uterine leiomyomas, adenomyosis and adenomyomas. Uteri were obtained from women undergoing hysterectomy for benign conditions, including 26 cases of uterine leiomyomas, 24 cases of adenomyosis, 18 adenomyomas and 6 cases of autopsy specimens. Specific histochemical staining techniques were employed in order to demonstrate the distribution of elastin. The distribution of elastin components in the uterus was markedly uneven and showed a decreasing gradient from outer to inner myometrium. No elastin was present within leiomyomas, adenomyomas or adenomyosis. The distribution of elastin may help explain the normal function of the myometrium in labor. It implies that the uneven distribution of elastin components and absence of elastin within leiomyomas, adenomyomas and adenomyosis could be of some clinical significance. The altered elastin distribution in disease states may help explain such symptoms as dysmenorrhea in uterine endometriosis.

  2. The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.

    Science.gov (United States)

    Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica

    2014-05-01

    The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increases, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
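
    The contrast between the symmetric normal-theory interval and the asymmetric distribution-of-the-product interval can be reproduced with a small Monte Carlo sketch. The coefficient values and standard errors below are invented for illustration, and the Monte Carlo draws stand in for the exact distribution of the product.

```python
import numpy as np

a, se_a = 0.40, 0.15     # path X -> M (assumed values)
b, se_b = 0.35, 0.12     # path M -> Y adjusted for X (assumed values)
ab = a * b               # indirect effect

# Normal-theory (Sobel-type) interval: symmetric by construction
se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
sobel = (ab - 1.96 * se_ab, ab + 1.96 * se_ab)

# Monte Carlo approximation to the distribution of the product: asymmetric
rng = np.random.default_rng(0)
prod = rng.normal(a, se_a, 200_000) * rng.normal(b, se_b, 200_000)
mc = tuple(np.percentile(prod, [2.5, 97.5]))

print(f"ab = {ab:.3f}")
print(f"normal-theory 95% CI: ({sobel[0]:.3f}, {sobel[1]:.3f})")
print(f"distribution-of-product 95% CI: ({mc[0]:.3f}, {mc[1]:.3f})")
```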

  3. Distribution of the anticancer drugs doxorubicin, mitoxantrone and topotecan in tumors and normal tissues.

    Science.gov (United States)

    Patel, Krupa J; Trédan, Olivier; Tannock, Ian F

    2013-07-01

    Pharmacokinetic analyses estimate the mean concentration of drug within a given tissue as a function of time, but do not give information about the spatial distribution of drugs within that tissue. Here, we compare the time-dependent spatial distribution of three anticancer drugs within tumors, heart, kidney, liver and brain. Mice bearing various xenografts were treated with doxorubicin, mitoxantrone or topotecan. At various times after injection, tumors and samples of heart, kidney, liver and brain were excised. Within solid tumors, the distribution of doxorubicin, mitoxantrone and topotecan was limited to perivascular regions at 10 min after administration and the distance from blood vessels at which drug intensity fell to half was ~25-75 μm. Although drug distribution improved after 3 and 24 h, there remained a significant decrease in drug fluorescence with increasing distance from tumor blood vessels. Drug distribution was relatively uniform in the heart, kidney and liver with substantially greater perivascular drug uptake than in tumors. There was significantly higher total drug fluorescence in the liver than in tumors after 10 min, 3 and 24 h. Little to no drug fluorescence was observed in the brain. There are marked differences in the spatial distributions of three anticancer drugs within tumor tissue and normal tissues over time, with greater exposure to most normal tissues and limited drug distribution to many cells in tumors. Studies of the spatial distribution of drugs are required to complement pharmacokinetic data in order to better understand and predict drug effects and toxicities.

  4. Optimization of b-value distribution for biexponential diffusion-weighted MR imaging of normal prostate.

    Science.gov (United States)

    Jambor, Ivan; Merisaari, Harri; Aronen, Hannu J; Järvinen, Jukka; Saunavaara, Jani; Kauko, Tommi; Borra, Ronald; Pesola, Marko

    2014-05-01

    To determine the optimal b-value distribution for biexponential diffusion-weighted imaging (DWI) of normal prostate using both a computer modeling approach and in vivo measurements. Optimal b-value distributions for the fit of three parameters (fast diffusion Df, slow diffusion Ds, and fraction of fast diffusion f) were determined using Monte-Carlo simulations. The optimal b-value distribution was calculated using four individual optimization methods. Eight healthy volunteers underwent four repeated 3 Tesla prostate DWI scans using both 16 equally distributed b-values and an optimized b-value distribution obtained from the simulations. The b-value distributions were compared in terms of measurement reliability and repeatability using Shrout-Fleiss analysis. Using low noise levels, the optimal b-value distribution formed three separate clusters at low (0-400 s/mm2), mid-range (650-1200 s/mm2), and high b-values (1700-2000 s/mm2). Higher noise levels resulted into less pronounced clustering of b-values. The clustered optimized b-value distribution demonstrated better measurement reliability and repeatability in Shrout-Fleiss analysis compared with 16 equally distributed b-values. The optimal b-value distribution was found to be a clustered distribution with b-values concentrated in the low, mid, and high ranges and was shown to improve the estimation quality of biexponential DWI parameters of in vivo experiments. Copyright © 2013 Wiley Periodicals, Inc.
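
    The biexponential model that the b-value optimization targets, S(b)/S(0) = f·exp(−b·Df) + (1 − f)·exp(−b·Ds), can be fitted as in the sketch below. The b-value scheme, the "true" parameter values, and the noise level are assumptions chosen only to make the example self-contained; the fit itself uses ordinary nonlinear least squares rather than the paper's Monte Carlo optimization.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(b, f, Df, Ds):
    """Normalized biexponential DWI signal."""
    return f * np.exp(-b * Df) + (1 - f) * np.exp(-b * Ds)

# a clustered b-value scheme (s/mm^2), loosely following the low/mid/high idea
b_values = np.array([0, 100, 200, 400, 650, 900, 1200, 1700, 1850, 2000], float)

rng = np.random.default_rng(5)
true_f, true_Df, true_Ds = 0.7, 2.2e-3, 0.4e-3          # diffusivities in mm^2/s
signal = biexp(b_values, true_f, true_Df, true_Ds)
signal += rng.normal(0, 0.01, signal.size)              # measurement noise

popt, _ = curve_fit(biexp, b_values, signal,
                    p0=[0.5, 1e-3, 1e-4],
                    bounds=([0, 1e-4, 1e-5], [1, 1e-2, 1e-3]))
print(dict(zip(["f", "Df", "Ds"], popt)))
```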

  5. Parameter Estimation under Constraints for Multivariate Normal Distributions with Incomplete Data.

    Science.gov (United States)

    Zoppe, Alice; Buu, Yuh-Pey Anne; Flury, Bernard

    2001-01-01

    Presents an application of the EM-algorithm to two problems of estimation and testing in a multivariate normal distribution with missing data. The two models are tested applying the log-likelihood ratio test. Solves the problem of different and nonmonotone patterns of missing data by introducing suitable transformations and partitions of the data…

  6. Normal forms for sub-Lorentzian metrics supported on Engel type distributions

    Science.gov (United States)

    Grochowski, Marek

    2014-06-01

    We construct normal forms for Lorentzian metrics on Engel distributions under the assumption that abnormal curves are timelike future directed Hamiltonian geodesics. Then we indicate some cases in which the abnormal timelike future directed curve initiating at the origin is geometrically optimal. We also give certain estimates for reachable sets from a point.

  7. Normal Approximations to the Distributions of the Wilcoxon Statistics: Accurate to What "N"? Graphical Insights

    Science.gov (United States)

    Bellera, Carine A.; Julien, Marilyse; Hanley, James A.

    2010-01-01

    The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…

  8. The Weight of Euro Coins: Its Distribution Might Not Be as Normal as You Would Expect

    Science.gov (United States)

    Shkedy, Ziv; Aerts, Marc; Callaert, Herman

    2006-01-01

    Classical regression models, ANOVA models and linear mixed models are just three examples (out of many) in which the normal distribution of the response is an essential assumption of the model. In this paper we use a dataset of 2000 euro coins containing information (up to the milligram) about the weight of each coin, to illustrate that the…

  9. Using an APOS Framework to Understand Teachers' Responses to Questions on the Normal Distribution

    Science.gov (United States)

    Bansilal, Sarah

    2014-01-01

    This study is an exploration of teachers' engagement with concepts embedded in the normal distribution. The participants were a group of 290 in-service teachers enrolled in a teacher development program. The research instrument was an assessment task that can be described as an "unknown percentage" problem, which required the application…

  10. Reply to: Are There More Gifted People than Would Be Expected on a Normal Distribution?

    Science.gov (United States)

    Gallagher, James J.

    2014-01-01

    The author responds to the article by Warne, Godwin, and Smith (2013) on the question of whether there are more gifted people than would be expected in a Gaussian normal distribution. He asserts that the answer to this question is yes, based on (a) data that he and his colleagues have collected, (b) data that are already available and quoted by…

  11. Confidence Intervals for True Scores Using the Skew-Normal Distribution

    Science.gov (United States)

    Garcia-Perez, Miguel A.

    2010-01-01

    A recent comparative analysis of alternative interval estimation approaches and procedures has shown that confidence intervals (CIs) for true raw scores determined with the Score method--which uses the normal approximation to the binomial distribution--have actual coverage probabilities that are closest to their nominal level. It has also recently…

  12. Sample size determination for logistic regression on a logit-normal distribution.

    Science.gov (United States)

    Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance

    2017-06-01

    Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for the coefficient of determination for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.
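
    The paper's own logit-normal method is not reproduced here. As a generic, hedged alternative, the sketch below checks candidate sample sizes for simple logistic regression by simulation: draw a normal covariate, generate outcomes through the logit link, and estimate the power to detect an assumed effect size. The effect size, significance level, and use of statsmodels are all assumptions made for illustration.

```python
import numpy as np
import statsmodels.api as sm

def simulated_power(n, beta1, beta0=0.0, alpha=0.05, n_sim=500, seed=0):
    """Fraction of simulated trials in which the covariate effect is detected."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        x = rng.normal(size=n)
        p = 1.0 / (1.0 + np.exp(-(beta0 + beta1 * x)))   # logit link
        y = rng.binomial(1, p)
        model = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
        hits += model.pvalues[1] < alpha                  # test of the slope
    return hits / n_sim

# increase n until the estimated power reaches the desired level
for n in (100, 200, 400):
    print(n, simulated_power(n, beta1=0.4))
```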

  13. Estimating Non-Normal Latent Trait Distributions within Item Response Theory Using True and Estimated Item Parameters

    Science.gov (United States)

    Sass, D. A.; Schmitt, T. A.; Walker, C. M.

    2008-01-01

    Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…

  14. Distributional Assumptions in Educational Assessments Analysis: Normal Distributions versus Generalized Beta Distribution in Modeling the Phenomenon of Learning

    Science.gov (United States)

    Campos, Jose Alejandro Gonzalez; Moraga, Paulina Saavedra; Del Pozo, Manuel Freire

    2013-01-01

    This paper introduces the generalized beta (GB) model as a new modeling tool in the educational assessment area and evaluation analysis, specifically. Unlike normal model, GB model allows us to capture some real characteristics of data and it is an important tool for understanding the phenomenon of learning. This paper develops a contrast with the…

  15. Distribution of Different Sized Ocular Surface Vessels in Diabetics and Normal Individuals.

    Science.gov (United States)

    Banaee, Touka; Pourreza, Hamidreza; Doosti, Hassan; Abrishami, Mojtaba; Ehsaei, Asieh; Basiry, Mohsen; Pourreza, Reza

    2017-01-01

    To compare the distribution of different sized vessels using digital photographs of the ocular surface of diabetic and normal individuals. In this cross-sectional study, red-free conjunctival photographs of diabetic and normal individuals, aged 30-60 years, were taken under defined conditions and analyzed using a Radon transform-based algorithm for vascular segmentation. The image areas occupied by vessels (AOV) of different diameters were calculated. The main outcome measure was the distribution curve of mean AOV of different sized vessels. Secondary outcome measures included total AOV and standard deviation (SD) of AOV of different sized vessels. Two hundred and sixty-eight diabetic patients and 297 normal (control) individuals were included, differing in age (45.50 ± 5.19 vs. 40.38 ± 6.19 years). The distribution curves of mean AOV differed between patients and controls (smaller AOV for larger vessels in patients), and patients showed a different distribution curve of vessels compared to controls. Presence of diabetes mellitus is associated with contraction of larger vessels in the conjunctiva. Smaller vessels dilate with diabetic retinopathy. These findings may be useful in the photographic screening of diabetes mellitus and retinopathy.

  16. A new derivative with normal distribution kernel: Theory, methods and applications

    Science.gov (United States)

    Atangana, Abdon; Gómez-Aguilar, J. F.

    2017-06-01

    A new approach to the fractional derivative with a new local kernel is suggested in this paper. The kernel introduced in this work is the well-known normal distribution, a very common continuous probability distribution. This distribution is very important in statistics and is also widely used in the natural and social sciences to portray real-valued random variables whose distributions are not known. Two definitions are suggested, namely Atangana-Gómez averaging in the Liouville-Caputo and Riemann-Liouville sense. We present some relationships with existing integral transform operators. Numerical approximations of first and second order are derived in detail. Applications of the new mathematical tools to some real-world problems are presented in detail. This opens a new door for the fields of statistics and the natural and social sciences.

  17. An inductive sensor for real-time measurement of plantar normal and shear forces distribution.

    Science.gov (United States)

    Du, Li; Zhu, Xiaoliang; Zhe, Jiang

    2015-05-01

    The objective of this paper is to demonstrate a multiplexed inductive force sensor for simultaneously measuring normal force and shear forces on a foot. The sensor measures the normal force and shear forces by monitoring the inductance changes of three planar sensing coils. Resonance frequency division multiplexing was applied to signals from the multiple sensing coils, making it feasible to simultaneously measure the three forces (normal force, shear forces in x- and y-axis) on a foot using only one set of measurement electronics with high sensitivity and resolution. The testing results of the prototype sensor have shown that the sensor is capable of measuring normal force ranging from 0 to 800 N and shear forces ranging from 0 to 130 N in real time. With its high resolution, high sensitivity, and the capability of monitoring forces at different positions of a foot simultaneously, this sensor can be potentially used for real-time measurement of plantar normal force and shear forces distribution on diabetes patient's foot. Real-time monitoring of the normal force and shear forces on diabetes patient's foot can provide useful information for physicians and diabetes patients to take actions in preventing foot ulceration.

  18. The Effects of Selection Strategies for Bivariate Loglinear Smoothing Models on NEAT Equating Functions

    Science.gov (United States)

    Moses, Tim; Holland, Paul W.

    2010-01-01

    In this study, eight statistical strategies were evaluated for selecting the parameterizations of loglinear models for smoothing the bivariate test score distributions used in nonequivalent groups with anchor test (NEAT) equating. Four of the strategies were based on significance tests of chi-square statistics (Likelihood Ratio, Pearson,…

  19. First-order dominance: stronger characterization and a bivariate checking algorithm

    DEFF Research Database (Denmark)

    Range, Troels Martin; Østerdal, Lars Peter Raahave

    2018-01-01

    distributions. Utilizing that this problem can be formulated as a transportation problem with a special structure, we provide a stronger characterization of multivariate first-order dominance and develop a linear time complexity checking algorithm for the bivariate case. We illustrate the use of the checking...

  20. Size distribution of interstellar particles. III. Peculiar extinctions and normal infrared extinction

    International Nuclear Information System (INIS)

    Mathis, J.S.; Wallenhorst, S.G.

    1981-01-01

    The effect of changing the upper and lower size limits of a distribution of bare graphite and silicate particles with n(a) ∝ a^(-q) is investigated. Mathis, Rumpl, and Nordsieck showed that the normal extinction is matched very well by having the small-size cutoff, a-, roughly 0.005 or 0.01 μm, the large-size cutoff, a+, about 0.25 μm, and q = 3.5 for both substances. We consider the progressively peculiar extinctions exhibited by the well-observed stars sigma Sco, rho Oph, and theta 1 Ori C, with values of R_V (≡ A_V/E(B-V)) of 3.4, 4.4, and 5.5 compared to the normal 3.1. Two (sigma Sco, rho Oph) are in a neutral dense cloud; theta 1 Ori C is in the Orion Nebula. We find that sigma Sco has a normal graphite distribution but has had its small silicate particles removed, so that a-(sil) ≈ 0.04 μm if q = 3.5, or q(sil) = 2.6 if the size limits are fixed. However, the upper size limit on silicates remains normal. In rho Oph, the graphite is still normal, but both a-(sil) and a+(sil) are increased, to about 0.04 μm and 0.4 or 0.5 μm, respectively, if q = 3.5, or q(sil) ≈ 1.3 if the size limits are fixed. In theta 1 Ori, the small limit on graphite has increased to about 0.04 μm, or q(gra) ≈ 3, while the silicates are about like those in rho Oph. The calculated λ2175 bump is broader than the observed, but normal foreground extinction probably contributes appreciably to the observed bump. The absolute amount of extinction per H atom for rho Oph is not explained. The column density of H is so large that systematic effects might be present. Very large graphite particles (a > 3 μm) are required to "hide" the graphite without overly affecting the visual extinction, but a normal (small) graphite size distribution is required by the λ2175 bump. We feel that it is unlikely that such a bimodal distribution exists.

  1. On the possible "normalization" of experimental curves of 230Th vertical distribution in abyssal oceanic sediments

    International Nuclear Information System (INIS)

    Kuznetsov, Yu.V.; Al'terman, Eh.I.; Lisitsyn, A.P.; AN SSSR, Moscow. Inst. Okeanologii)

    1981-01-01

    The possibilities of the method of normalizing experimental ionic curves for dating abyssal sediments and establishing their accumulation rates are studied. The method is based on the correlation between the extrema of the ionic curves and variations of the Fe, Mn, organic C, and P contents in abyssal oceanic sediments. It has been found that the method can be successfully applied to correct 230Th vertical distribution data obtained by low-background γ-spectrometry. The method gives the most reliable results when the vertical distribution curves of the elements that concentrate 230Th vary in parallel with one another. In many cases, normalization of the experimental ionic curves makes it possible to establish the age stratification of the sediment.

  2. American Option Pricing using GARCH models and the Normal Inverse Gaussian distribution

    DEFF Research Database (Denmark)

    Stentoft, Lars Peter

    In this paper we propose a feasible way to price American options in a model with time-varying volatility and conditional skewness and leptokurtosis, using GARCH processes and the Normal Inverse Gaussian distribution. We show how the risk-neutral dynamics can be obtained in this model, we interpret the effect of the risk neutralization, and we derive approximation procedures which allow for a computationally efficient implementation of the model. When the model is estimated on financial returns data, the results indicate that, compared to the Gaussian case, the extension is important. A study of the model … In particular, improvements are found when considering the smile in implied standard deviations.

  3. Comparing of Normal Stress Distribution in Static and Dynamic Soil-Structure Interaction Analyses

    International Nuclear Information System (INIS)

    Kholdebarin, Alireza; Massumi, Ali; Davoodi, Mohammad; Tabatabaiefar, Hamid Reza

    2008-01-01

    It is important to consider the vertical component of earthquake loading and the inertia force in soil-structure interaction analyses. In most circumstances, design engineers are primarily concerned with analyzing the behavior of foundations subjected to earthquake-induced forces transmitted from the bedrock. In this research, a single rigid foundation with designated geometrical parameters located on sandy-clay soil has been modeled in FLAC software with the finite difference method and subjected to three different vertical components of earthquake records. In these cases, it is important to evaluate the effect of the footing on the underlying soil and to consider the normal stress in the soil with and without the footing. The distribution of normal stress under the footing in the static and dynamic states has been studied and compared. This comparison indicated that the increase in normal stress under the footing caused by the vertical component of ground excitation decreased the dynamic vertical settlement in comparison with the static state

  4. The distribution of YKL-40 in osteoarthritic and normal human articular cartilage

    DEFF Research Database (Denmark)

    Volck, B; Ostergaard, K; Johansen, J S

    1999-01-01

    YKL-40, also called human cartilage glycoprotein-39, is a major secretory protein of human chondrocytes in cell culture. YKL-40 mRNA is expressed by cartilage from patients with rheumatoid arthritis, but is not detectable in normal human cartilage. The aim was to investigate the distribution of YKL-40 in osteoarthritic (n=9) and macroscopically normal (n=5) human articular cartilage, collected from 12 pre-selected areas of the femoral head, to discover a potential role for YKL-40 in cartilage remodelling in osteoarthritis. Immunohistochemical analysis showed that YKL-40 staining was found … staining for YKL-40 was in general low in normal cartilage. The present findings, together with previous observations, suggest that YKL-40 may be of importance in cartilage remodelling/degradation of osteoarthritic joints.

  5. Spatial arrangement and size distribution of normal faults, Buckskin detachment upper plate, Western Arizona

    Science.gov (United States)

    Laubach, S. E.; Hundley, T. H.; Hooker, J. N.; Marrett, R. A.

    2018-03-01

    Fault arrays typically include a wide range of fault sizes, and those faults may be randomly located, clustered together, or regularly or periodically located in a rock volume. Here, we investigate the size distribution and spatial arrangement of normal faults using rigorous size-scaling methods and normalized correlation count (NCC). Outcrop data from Miocene sedimentary rocks in the immediate upper plate of the regional Buckskin detachment, a low-angle normal fault, have differing patterns of spatial arrangement as a function of displacement (offset). Using lower size-thresholds of 1, 0.1, 0.01, and 0.001 m, displacements range over 5 orders of magnitude and have power-law frequency distributions spanning ∼ four orders of magnitude from less than 0.001 m to more than 100 m, with exponents of -0.6 and -0.9. The largest faults with >1 m displacement have a shallower size-distribution slope and regular spacing of about 20 m. In contrast, smaller faults have steep size-distribution slopes and irregular spacing, with NCC plateau patterns indicating imposed clustering. Cluster widths are 15 m for the 0.1-m threshold, 14 m for the 0.01-m threshold, and 1 m for the 0.001-m displacement threshold faults. Results demonstrate that normalized correlation count effectively characterizes the spatial arrangement patterns of these faults. Our example from a high-strain fault pattern above a detachment is compatible with size and spatial organization that was influenced primarily by boundary conditions such as fault shape, mechanical unit thickness and internal stratigraphy on a range of scales, rather than purely by interaction among faults during their propagation.
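
    A minimal sketch of the size-scaling part of such an analysis: rank fault displacements above a lower size-threshold and fit a power-law exponent to the cumulative frequency distribution on log-log axes. The simulated displacements and the simple least-squares fit are assumptions standing in for the rigorous size-scaling methods used in the study.

```python
import numpy as np

rng = np.random.default_rng(9)
# simulated fault displacements (m) spanning several orders of magnitude
displacements = (rng.pareto(0.8, size=500) + 1) * 1e-3

threshold = 1e-3
d = np.sort(displacements[displacements >= threshold])[::-1]
cum_freq = np.arange(1, d.size + 1)              # cumulative number >= each size

# slope of log(cumulative frequency) vs log(displacement) = power-law exponent
exponent, intercept = np.polyfit(np.log10(d), np.log10(cum_freq), 1)
print(f"power-law exponent ~ {exponent:.2f}")    # close to -0.8 for these simulated data
```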

  6. Parameter estimation and statistical test of geographically weighted bivariate Poisson inverse Gaussian regression models

    Science.gov (United States)

    Amalia, Junita; Purhadi, Otok, Bambang Widjanarko

    2017-11-01

    The Poisson distribution is a discrete distribution for count data with a single parameter that defines both the mean and the variance. Poisson regression therefore assumes that the mean and variance are equal (equidispersion). Nonetheless, some count data do not satisfy this assumption because the variance exceeds the mean (over-dispersion). Ignoring over-dispersion leads to underestimated standard errors and, in turn, to incorrect decisions in statistical tests. Paired count data are correlated and follow a bivariate Poisson distribution; when there is over-dispersion, simple bivariate Poisson regression is not sufficient for modeling them. The Bivariate Poisson Inverse Gaussian Regression (BPIGR) model is a mixed Poisson regression for modeling paired count data with over-dispersion, but it produces a single global model for all locations. On the other hand, each location has different geographic, social, cultural and economic conditions, so Geographically Weighted Regression (GWR) is needed; the weighting function of each location in GWR generates a different local model. The Geographically Weighted Bivariate Poisson Inverse Gaussian Regression (GWBPIGR) model is used to handle over-dispersion and to generate local models. Parameter estimation for the GWBPIGR model is obtained by the Maximum Likelihood Estimation (MLE) method, while hypothesis testing is carried out with the Maximum Likelihood Ratio Test (MLRT) method.

  7. Breast cancer subtype distribution is different in normal weight, overweight, and obese women.

    Science.gov (United States)

    Gershuni, Victoria; Li, Yun R; Williams, Austin D; So, Alycia; Steel, Laura; Carrigan, Elena; Tchou, Julia

    2017-06-01

    Obesity is associated with tumor promoting pathways related to insulin resistance and chronic low-grade inflammation which have been linked to various disease states, including cancer. Many studies have focused on the relationship between obesity and increased estrogen production, which contributes to the pathogenesis of estrogen receptor-positive breast cancers. The link between obesity and other breast cancer subtypes, such as triple-negative breast cancer (TNBC) and Her2/neu+ (Her2+) breast cancer, is less clear. We hypothesize that obesity may be associated with the pathogenesis of specific breast cancer subtypes resulting in a different subtype distribution than normal weight women. A single-institution, retrospective analysis of tumor characteristics of 848 patients diagnosed with primary operable breast cancer between 2000 and 2013 was performed to evaluate the association between BMI and clinical outcome. Patients were grouped based on their BMI at time of diagnosis stratified into three subgroups: normal weight (BMI = 18-24.9), overweight (BMI = 25-29.9), and obese (BMI > 30). The distribution of breast cancer subtypes across the three BMI subgroups was compared. Obese and overweight women were more likely to present with TNBC and normal weight women with Her2+ breast cancer (p = 0.008). We demonstrated, for the first time, that breast cancer subtype distribution varied significantly according to BMI status. Our results suggested that obesity might activate molecular pathways other than the well-known obesity/estrogen circuit in the pathogenesis of breast cancer. Future studies are needed to understand the molecular mechanisms that drive the variation in subtype distribution across BMI subgroups.

  8. A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants

    KAUST Repository

    Liang, Faming

    2013-08-01

    Simulating from distributions with intractable normalizing constants has been a long-standing problem inmachine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converges, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. TheMCMHalgorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals. © 2013 Massachusetts Institute of Technology.
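
    A toy version of the MCMH idea can be sketched as follows: for a model whose normalizing constant Z(θ) is treated as unknown, the Z(θ)/Z(θ′) factor in the acceptance ratio is replaced by an importance-sampling estimate computed from auxiliary draws at the current θ. The one-dimensional kernel exp(−θy²), the Gamma prior, and the use of exact auxiliary sampling (instead of an auxiliary MCMC run, as in the actual algorithm) are simplifying assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model with a "pretend intractable" normalizing constant:
# unnormalized kernel g(y; theta) = exp(-theta * y^2), data x_i drawn iid from it.
x = rng.normal(0.0, np.sqrt(1 / (2 * 2.5)), size=50)   # true theta = 2.5
sum_x2, n = np.sum(x**2), len(x)

def log_prior(theta):            # Gamma(2, 1) prior, assumed for illustration
    return np.log(theta) - theta if theta > 0 else -np.inf

def estimate_ratio(theta_cur, theta_prop, m=200):
    """Monte Carlo estimate of Z(theta_prop) / Z(theta_cur).

    In MCMH these draws would come from an auxiliary MCMC run targeting the
    model at theta_cur; exact sampling is used here only because it happens
    to exist for this toy kernel.
    """
    y = rng.normal(0.0, np.sqrt(1 / (2 * theta_cur)), size=m)
    return np.mean(np.exp(-(theta_prop - theta_cur) * y**2))

theta, chain = 1.0, []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.3)                  # random-walk proposal
    if prop > 0:
        log_r = (-(prop - theta) * sum_x2 + log_prior(prop) - log_prior(theta)
                 - n * np.log(estimate_ratio(theta, prop)))
        if np.log(rng.uniform()) < log_r:
            theta = prop
    chain.append(theta)

print("posterior mean of theta:", np.mean(chain[1000:]))   # roughly recovers 2.5
```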

  9. A Monte Carlo Metropolis-Hastings algorithm for sampling from distributions with intractable normalizing constants.

    Science.gov (United States)

    Liang, Faming; Jin, Ick-Hoon

    2013-08-01

    Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converges, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals.

  10. Stereology of extremes; bivariate models and computation

    Czech Academy of Sciences Publication Activity Database

    Beneš, Viktor; Bodlák, M.; Hlubinka, D.

    2003-01-01

    Roč. 5, č. 3 (2003), s. 289-308 ISSN 1387-5841 R&D Projects: GA AV ČR IAA1075201; GA ČR GA201/03/0946 Institutional research plan: CEZ:AV0Z1075907 Keywords: sample extremes * domain of attraction * normalizing constants Subject RIV: BA - General Mathematics

  11. Use of critical pathway models and log-normal frequency distributions for siting nuclear facilities

    International Nuclear Information System (INIS)

    Waite, D.A.; Denham, D.H.

    1975-01-01

    The advantages and disadvantages of potential sites for nuclear facilities are evaluated through the use of environmental pathway and log-normal distribution analysis. Environmental considerations of nuclear facility siting are necessarily geared to the identification of media believed to be significant in terms of dose to man or to be potential centres for long-term accumulation of contaminants. To aid in meeting the scope and purpose of this identification, an exposure pathway diagram must be developed. This type of diagram helps to locate pertinent environmental media, points of expected long-term contaminant accumulation, and points of population/contaminant interface for both radioactive and non-radioactive contaminants. Confirmation of facility siting conclusions drawn from pathway considerations must usually be derived from an investigatory environmental surveillance programme. Battelle's experience with environmental surveillance data interpretation using log-normal techniques indicates that this distribution has much to offer in the planning, execution and analysis phases of such a programme. How these basic principles apply to the actual siting of a nuclear facility is demonstrated for a centrifuge-type uranium enrichment facility as an example. A model facility is examined to the extent of available data in terms of potential contaminants and facility general environmental needs. A critical exposure pathway diagram is developed to the point of prescribing the characteristics of an optimum site for such a facility. Possible necessary deviations from climatic constraints are reviewed and reconciled with conclusions drawn from the exposure pathway analysis. Details of log-normal distribution analysis techniques are presented, with examples of environmental surveillance data to illustrate data manipulation techniques and interpretation procedures as they affect the investigatory environmental surveillance programme. Appropriate consideration is given these
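
    The log-normal interpretation of surveillance data described above can be sketched in a few lines: take logs, report the geometric mean and geometric standard deviation, estimate an upper percentile, and check linearity on normal probability coordinates. The concentration values below are simulated, and the 1.645 factor for the 95th percentile assumes the log-normal fit is adequate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2024)
# simulated surveillance measurements (e.g., activity concentrations)
conc = rng.lognormal(mean=np.log(0.8), sigma=0.6, size=120)

log_c = np.log(conc)
gm = np.exp(log_c.mean())                                 # geometric mean
gsd = np.exp(log_c.std(ddof=1))                           # geometric standard deviation
p95 = np.exp(log_c.mean() + 1.645 * log_c.std(ddof=1))    # ~95th percentile

# numerical analogue of checking for a straight line on log-probability paper
r = stats.probplot(log_c, dist="norm")[1][2]              # correlation of the normal Q-Q fit
print(f"GM={gm:.2f}  GSD={gsd:.2f}  95th percentile~{p95:.2f}  QQ r={r:.3f}")
```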

  12. Bivariate empirical mode decomposition for ECG-based biometric identification with emotional data.

    Science.gov (United States)

    Ferdinando, Hany; Seppanen, Tapio; Alasaarela, Esko

    2017-07-01

    Emotions modulate ECG signals such that they might affect ECG-based biometric identification in real-life applications. This motivates the search for feature extraction methods on which the emotional state of the subjects has minimal impact. This paper evaluates feature extraction based on bivariate empirical mode decomposition (BEMD) for biometric identification when emotion is considered. Using the ECG signals from the Mahnob-HCI database for affect recognition, the features were statistical distributions of the dominant frequency after applying BEMD analysis to the ECG signals. The achieved accuracy was 99.5%, with high consistency, using a kNN classifier in 10-fold cross-validation to identify 26 subjects when the emotional states of the subjects were ignored. When the emotional states of the subjects were considered, the proposed method also delivered high accuracy, around 99.4%. We conclude that the proposed method offers emotion-independent features for ECG-based biometric identification. The proposed method needs further evaluation, including testing with other classifiers and with variations in ECG signals, e.g. normal ECG vs. ECG with arrhythmias, ECG from various ages, and ECG from other affective databases.
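
    For readers who want to reproduce the classification step, the sketch below shows a generic kNN evaluation with 10-fold cross-validation in scikit-learn. It assumes the feature matrix already contains the BEMD-derived dominant-frequency statistics per recording (the BEMD step itself is not reproduced here), and the placeholder arrays and the choice of k are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholders for the real inputs: one row of BEMD-based dominant-frequency
# statistics per ECG recording, and one subject label per row (26 subjects).
X = rng.normal(size=(260, 12))        # replace with the BEMD feature matrix
y = np.repeat(np.arange(26), 10)      # replace with the true subject labels

clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
scores = cross_val_score(clf, X, y, cv=10)    # 10-fold cross-validation
print(f"mean accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```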

  13. A bivariate model for analyzing recurrent multi-type automobile failures

    Science.gov (United States)

    Sunethra, A. A.; Sooriyarachchi, M. R.

    2017-09-01

    The failure mechanism in an automobile can be defined as a system of multi-type recurrent failures, where failures can occur due to various failure modes and are repetitive, such that more than one failure can occur from each failure mode. In analysing such automobile failures, both the time and the type of the failure serve as response variables. However, these two response variables are highly correlated with each other, since the timing of failures has an association with the mode of the failure. When there is more than one correlated response variable, fitting a multivariate model is preferable to fitting separate univariate models. Therefore, a bivariate model of time and type of failure becomes appealing for such automobile failure data. When there are multiple failure observations pertaining to a single automobile, such data cannot be treated as independent, because failure instances of a single automobile are correlated with each other while failures among different automobiles can be treated as independent. Therefore, this study proposes a bivariate model consisting of time and type of failure as responses, adjusted for correlated data. The proposed model was formulated following the approaches of shared parameter models and random effects models for joining the responses and for representing the correlated data, respectively. The proposed model is applied to a sample of automobile failures with three types of failure modes and up to five failure recurrences. The parametric distributions that were suitable for the two responses of time to failure and type of failure were the Weibull distribution and the multinomial distribution, respectively. The proposed bivariate model was programmed in the SAS procedure PROC NLMIXED by specifying the appropriate likelihood functions. The performance of the bivariate model was compared with separate univariate models fitted for the two responses, and it was identified that better performance is secured by

  14. Log-Normal Distribution in a Growing System with Weighted and Multiplicatively Interacting Particles

    Science.gov (United States)

    Fujihara, Akihiro; Tanimoto, Satoshi; Yamamoto, Hiroshi; Ohtsuki, Toshiya

    2018-03-01

    A growing system with weighted and multiplicatively interacting particles is investigated. Each particle has a quantity that changes multiplicatively after a binary interaction, with its growth rate controlled by a weight parameter in a homogeneous symmetric kernel. We consider the system using moment inequalities and analytically derive the log-normal-type tail in the probability distribution function of quantities when the parameter is negative, which is different from the result for single-body multiplicative processes. We also find that the system approaches a winner-take-all state when the parameter is positive.

  15. Approximation of bivariate copulas by patched bivariate Fréchet copulas

    KAUST Repository

    Zheng, Yanting

    2011-03-01

    Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums. © 2010 Elsevier B.V.
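
    As background for the approximation scheme, the sketch below evaluates a single bivariate Fréchet copula, i.e. a convex mixture of the comonotonicity, independence and countermonotonicity copulas. The patching of local pieces described in the paper is not reproduced, and the weights used here are arbitrary illustrative values.

```python
import numpy as np

def bf_copula(u, v, w):
    """Bivariate Fréchet copula: a convex mixture of the comonotonicity copula M,
    the independence copula Pi and the countermonotonicity copula W.
    w = (w_M, w_Pi, w_W) with non-negative entries summing to one."""
    w_m, w_pi, w_w = w
    assert abs(w_m + w_pi + w_w - 1.0) < 1e-12 and min(w) >= 0
    M = np.minimum(u, v)                 # comonotonicity (Fréchet upper bound)
    Pi = u * v                           # independence
    W = np.maximum(u + v - 1.0, 0.0)     # countermonotonicity (Fréchet lower bound)
    return w_m * M + w_pi * Pi + w_w * W

# Arbitrary example weights, evaluated on a small grid of (u, v) values.
u, v = np.meshgrid(np.linspace(0.01, 0.99, 5), np.linspace(0.01, 0.99, 5))
print(bf_copula(u, v, (0.5, 0.3, 0.2)))
```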

  16. Exact, time-independent estimation of clone size distributions in normal and mutated cells.

    Science.gov (United States)

    Roshan, A; Jones, P H; Greenman, C D

    2014-10-06

    Biological tools such as genetic lineage tracing, three-dimensional confocal microscopy and next-generation DNA sequencing are providing new ways to quantify the distribution of clones of normal and mutated cells. Understanding population-wide clone size distributions in vivo is complicated by multiple cell types within observed tissues, and overlapping birth and death processes. This has led to the increased need for mathematically informed models to understand their biological significance. Standard approaches usually require knowledge of clonal age. We show that modelling on clone size independent of time is an alternative method that offers certain analytical advantages; it can help parametrize these models, and obtain distributions for counts of mutated or proliferating cells, for example. When applied to a general birth-death process common in epithelial progenitors, this takes the form of a gambler's ruin problem, the solution of which relates to counting Motzkin lattice paths. Applying this approach to mutational processes, alternative, exact, formulations of classic Luria-Delbrück-type problems emerge. This approach can be extended beyond neutral models of mutant clonal evolution. Applications of these approaches are twofold. First, we resolve the probability of progenitor cells generating proliferating or differentiating progeny in clonal lineage tracing experiments in vivo or cell culture assays where clone age is not known. Second, we model mutation frequency distributions that deep sequencing of subclonal samples produce.

  17. A Vehicle for Bivariate Data Analysis

    Science.gov (United States)

    Roscoe, Matt B.

    2016-01-01

    Instead of reserving the study of probability and statistics for special fourth-year high school courses, the Common Core State Standards for Mathematics (CCSSM) takes a "statistics for all" approach. The standards recommend that students in grades 6-8 learn to summarize and describe data distributions, understand probability, draw…

  18. Non-linear learning in online tutorial to enhance students’ knowledge on normal distribution application topic

    Science.gov (United States)

    Kartono; Suryadi, D.; Herman, T.

    2018-01-01

    This study aimed to analyze the effect of non-linear learning (NLL) in online tutorial (OT) content on students' knowledge of normal distribution application (KONDA). KONDA is a competence expected to be achieved after students have studied the topic of normal distribution application in the course Education Statistics. The analysis used a quasi-experimental study design. The subjects were divided into an experimental class given OT content in the NLL model and a control class given OT content in a conventional learning (CL) model. The data were the results of online objective tests measuring students' statistical prior knowledge (SPK) and students' pre- and post-test KONDA scores. For students with low and moderate SPK scores, the statistical analysis of KONDA gain scores showed that students who learned the OT content with the NLL model performed better than those who learned it with the CL model. For students with high SPK scores, the gain scores of the two groups were similar. Based on these findings, it can be concluded that the NLL model applied to OT content can enhance the KONDA of students at low and moderate SPK levels. A more challenging didactical situation is needed for students at the high SPK level to achieve a significant gain score.

  19. Elastic microfibril distribution in the cornea: Differences between normal and keratoconic stroma.

    Science.gov (United States)

    White, Tomas L; Lewis, Philip N; Young, Robert D; Kitazawa, Koji; Inatomi, Tsutomu; Kinoshita, Shigeru; Meek, Keith M

    2017-06-01

    The optical and biomechanical properties of the cornea are largely governed by the collagen-rich stroma, a layer that represents approximately 90% of the total thickness. Within the stroma, the specific arrangement of superimposed lamellae provides the tissue with tensile strength, whilst the spatial arrangement of individual collagen fibrils within the lamellae confers transparency. In keratoconus, this precise stromal arrangement is lost, resulting in ectasia and visual impairment. In the normal cornea, we previously characterised the three-dimensional arrangement of an elastic fiber network spanning the posterior stroma from limbus-to-limbus. In the peripheral cornea/limbus there are elastin-containing sheets or broad fibers, most of which become microfibril bundles (MBs) with little or no elastin component when reaching the central cornea. The purpose of the current study was to compare this network with the elastic fiber distribution in post-surgical keratoconic corneal buttons, using serial block face scanning electron microscopy and transmission electron microscopy. We have demonstrated that the MB distribution is very different in keratoconus. MBs are absent from a region of stroma anterior to Descemet's membrane, an area that is densely populated in normal cornea, whilst being concentrated below the epithelium, an area in which they are absent in normal cornea. We contend that these latter microfibrils are produced as a biomechanical response to provide additional strength to the anterior stroma in order to prevent tissue rupture at the apex of the cone. A lack of MBs anterior to Descemet's membrane in keratoconus would alter the biomechanical properties of the tissue, potentially contributing to the pathogenesis of the disease. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Preparation and bivariate analysis of suspensions of human chromosomes

    Energy Technology Data Exchange (ETDEWEB)

    van den Engh, G.J.; Trask, B.J.; Gray, J.W.; Langlois, R.G.; Yu, L.C.

    1985-01-01

    Chromosomes were isolated from a variety of human cell types using a HEPES-buffered hypotonic solution (pH 8.0) containing KCl, MgSO4, dithioerythritol, and RNase. The chromosomes isolated by this procedure could be stained with a variety of fluorescent stains including propidium iodide, chromomycin A3, and Hoechst 33258. Addition of sodium citrate to the stained chromosomes was found to improve the total fluorescence resolution. High-quality bivariate Hoechst vs. chromomycin fluorescence distributions were obtained for chromosomes isolated from a human fibroblast cell strain, a human colon carcinoma cell line, and human peripheral blood lymphocyte cultures. Good flow karyotypes were also obtained from primary amniotic cell cultures. The Hoechst vs. chromomycin flow karyotypes of a given cell line, made at different times and at dye concentrations varying over fourfold ranges, show little variation in the relative peak positions of the chromosomes. The size of the DNA in chromosomes isolated using this procedure ranges from 20 to 50 kilobases. The described isolation procedure is simple, it yields high-quality flow karyotypes, and it can be used to prepare chromosomes from clinical samples. 22 references, 7 figures, 1 table.

  1. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with the existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.

  2. Distribution and possible function of lysosomal enzymes in the inner ear under normal and pathophysiological conditions.

    Science.gov (United States)

    Schätzle, W

    1976-05-31

    The normal distribution of several lysosomal enzymes was studied in 20 guinea pigs. In the outer hair cells lysosomal enzymes are mainly localized at the apical cell pole, while in inner hair cells the distribution was uniform. Non-lysosomal enzymes like alkaline phosphatase are of predominantly basal localization. The concentration of some lysosomal enzymes like N-acetyl-beta-glucosaminidase was higher in outer than in inner hair cells, while others like acid phosphatase, beta-glucuronidase and sulfatase showed a stronger reaction in the inner hair cells. After 10 days of sound overstimulation with 120 dB for 1 h a day, there was an increase of lysosomal enzyme content, notably in the outer hair cells. There was no change in non-lysosomal enzymes. Under these conditions there might be a partial destruction of cellular organelles, eliminated by lysosomal activity without loss of the whole cell. In addition, the distribution and possible function of lysosomal enzymes in other labyrinthine tissues is discussed.

  3. Skewed Normal Distribution Of Return Assets In Call European Option Pricing

    Directory of Open Access Journals (Sweden)

    Evy Sulistianingsih

    2011-12-01

    An option is a security derivative. In a financial market, an option is a contract that gives its owner the right (not the obligation) to buy or sell a particular asset for a certain price at a certain time. Options can provide a guarantee against risk faced in a market. This paper studies the use of the skewed normal distribution (SN) in call European option pricing. The SN provides a flexible framework that captures the skewness of log returns. We obtain a closed-form solution for the European call option price when log returns follow the SN. We then compare the option prices obtained from the SN and from the Black-Scholes model with market option prices. Keywords: skewed normal distribution, log return, options.
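
    The Black-Scholes benchmark against which the skew-normal prices are compared can be sketched as follows. This is only the standard Black-Scholes call formula under the usual log-normal return assumption, not the paper's skew-normal pricing formula, and the parameter values are arbitrary.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def bs_call(S, K, T, r, sigma):
    """Standard Black-Scholes price of a European call (log-normal returns)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Arbitrary example inputs: spot 100, strike 105, 6 months, 3% rate, 25% volatility.
print(round(bs_call(100.0, 105.0, 0.5, 0.03, 0.25), 4))
```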

  4. A Platoon Dispersion Model Based on a Truncated Normal Distribution of Speed

    Directory of Open Access Journals (Sweden)

    Ming Wei

    2012-01-01

    Understanding platoon dispersion is critical for the coordination of traffic signal control in an urban traffic network. Assuming that platoon speed follows a truncated normal distribution, ranging from minimum speed to maximum speed, this paper develops a piecewise density function that describes platoon dispersion characteristics as the platoon moves from an upstream to a downstream intersection. Based on this density function, the expected number of cars in the platoon that pass the downstream intersection, and the expected number of cars in the platoon that do not pass the downstream point are calculated. To facilitate coordination in a traffic signal control system, dispersion models for the front and the rear of the platoon are also derived. Finally, a numeric computation for the coordination of successive signals is presented to illustrate the validity of the proposed model.
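
    The truncated-normal speed assumption can be written down directly with SciPy. The sketch below only evaluates the assumed speed density between a minimum and maximum speed, not the paper's piecewise platoon-dispersion model, and the numeric values are arbitrary.

```python
import numpy as np
from scipy.stats import truncnorm

# Assumed (illustrative) platoon speed parameters, in km/h.
mu, sigma = 40.0, 8.0          # mean and standard deviation of the untruncated speed
v_min, v_max = 20.0, 60.0      # minimum and maximum platoon speeds

# scipy's truncnorm takes the truncation bounds in standardized units.
a, b = (v_min - mu) / sigma, (v_max - mu) / sigma
speed = truncnorm(a, b, loc=mu, scale=sigma)

v = np.linspace(v_min, v_max, 5)
print(speed.pdf(v))            # density of platoon speed on [v_min, v_max]
print(speed.mean())            # mean speed after truncation
```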

  5. Financing options and economic impact: distributed generation using solar photovoltaic systems in Normal, Illinois

    Directory of Open Access Journals (Sweden)

    Jin H. Jo

    2016-04-01

    Due to increasing price volatility in fossil-fuel-produced energy, the demand for clean, renewable, and abundant energy is more prevalent than in past years. Solar photovoltaic (PV) systems have been well documented for their ability to produce electrical energy while at the same time offering support to mitigate the negative externalities associated with fossil fuel combustion. Prices for PV systems have decreased over the past few years; however, residential and commercial owners may still opt out of purchasing a system due to the overall price required for a PV system installation. Therefore, determining optimal financing options for residential and small-scale purchasers is a necessity. We report on payment methods currently used for distributed community solar projects throughout the US and suggest appropriate options for purchasers in Normal, Illinois given their economic status. We also examine the jobs and total economic impact of a PV system implementation in the case study area.

  6. Distribution Log Normal of 222 Rn in the state of Zacatecas, Mexico

    International Nuclear Information System (INIS)

    Garcia, M.L.; Mireles, F.; Quirino, L.; Davila, I.; Rios, C.; Pinedo, J.L.

    2006-01-01

    In this work the evaluation of the concentration of 222Rn in air for Zacatecas is shown. Solid State Nuclear Track Detectors were used as the technique for carrying out the measurements on a large scale, with cellulose nitrate LR-115, type 2, in open 222Rn chambers. The measurements were carried out during three months at different times of the year. The results present the log-normal distribution, arithmetic mean and geometric mean of the concentration indoors and outdoors in residential buildings, indoors in occupational buildings, and in the 57 municipal seats of the state of Zacatecas. The statistics of the concentration values showed variation with the time of year, with higher values in the winter season in both cases. The distribution of the 222Rn concentration is presented on the state map for each of the municipalities, representing the measurement places in the entire state of Zacatecas. Finally, the places where the 222Rn concentration in air is close to the limit of 148 Bq/m3 set by the EPA are presented. (Author)
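
    The summary statistics reported for such concentrations (arithmetic mean, geometric mean, log-normal fit) can be computed as in the sketch below. The data array is a simulated placeholder and the code is only a generic illustration of log-normal summaries, not the authors' analysis.

```python
import numpy as np
from scipy import stats

# Hypothetical indoor radon concentrations in Bq/m3 (illustration only).
rng = np.random.default_rng(1)
conc = rng.lognormal(mean=np.log(40.0), sigma=0.6, size=200)

arithmetic_mean = conc.mean()
geometric_mean = np.exp(np.log(conc).mean())      # GM of a log-normal sample
gsd = np.exp(np.log(conc).std(ddof=1))            # geometric standard deviation

# Fraction of dwellings expected above the EPA action level of 148 Bq/m3,
# under the fitted log-normal model.
frac_above = 1.0 - stats.lognorm.cdf(148.0, s=np.log(gsd), scale=geometric_mean)

print(f"AM = {arithmetic_mean:.1f}, GM = {geometric_mean:.1f}, GSD = {gsd:.2f}")
print(f"fraction above 148 Bq/m3 (fitted log-normal): {frac_above:.3f}")
```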

  7. [Calbindin and parvalbumin distribution in spinal cord of normal and rabies-infected mice].

    Science.gov (United States)

    Monroy-Gómez, Jeison; Torres-Fernández, Orlando

    2013-01-01

    Rabies is a fatal infectious disease of the nervous system; however, knowledge about the pathogenic neural mechanisms of rabies is scarce. In addition, there are few studies of rabies pathology of the spinal cord. To study the distribution of the calcium-binding proteins calbindin and parvalbumin and to assess the effect of rabies virus infection on their expression in the spinal cord of mice. MATERIALS AND METHODS: Mice were inoculated with rabies virus by the intracerebral or intramuscular route. The spinal cord was extracted to perform cross-sections, which were treated by immunohistochemistry with monoclonal antibodies to reveal the presence of the two proteins in normal and rabies-infected mice. We performed qualitative and quantitative analyses of the immunoreactivity of the two proteins. Calbindin and parvalbumin showed a differential distribution in the Rexed laminae. Rabies infection produced a decrease in the expression of calbindin. On the contrary, the infection caused an increased expression of parvalbumin. The effect of rabies infection on the expression of the two proteins was similar when comparing both routes of inoculation. The differential effect of rabies virus infection on the expression of calbindin and parvalbumin in the spinal cord of mice was similar to that previously reported for brain areas. This result suggests uniformity in the response to rabies infection throughout the central nervous system. This is an important contribution to the understanding of the pathogenesis of rabies.

  8. Detecting and correcting for publication bias in meta-analysis - A truncated normal distribution approach.

    Science.gov (United States)

    Zhu, Qiaohao; Carriere, K C

    2016-01-01

    Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on the detection and correction of publication bias in meta-analysis focuses on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish two major situations, in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models and derive estimators for the overall mean and the truncation proportion. These estimators are obtained using maximum likelihood estimation and the method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology and to compare it with the non-parametric Trim and Fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and in correcting publication bias under various situations.
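
    The idea of treating publication bias as truncation can be illustrated with a small maximum-likelihood fit of a left-truncated normal model. This toy sketch is not one of the estimators derived in the paper; the truncation point, data and parameter names are assumptions chosen for illustration.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)

# Hypothetical published effect sizes: studies with effects below a cutoff c
# are assumed never to be published (left truncation).
c = 0.1
true_mu, true_sigma = 0.15, 0.2
raw = rng.normal(true_mu, true_sigma, size=2000)
published = raw[raw > c]                 # only the "published" studies are observed

def neg_loglik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    # Density of a normal left-truncated at c, evaluated at the published effects.
    logpdf = stats.norm.logpdf(published, mu, sigma)
    log_tail = stats.norm.logsf(c, mu, sigma)   # P(X > c), the truncation correction
    return -(logpdf - log_tail).sum()

res = optimize.minimize(neg_loglik, x0=[published.mean(), np.log(published.std())])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"naive mean of published effects: {published.mean():.3f}")
print(f"truncation-corrected estimate of mu: {mu_hat:.3f} (true {true_mu})")
```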

  9. Fitting statistical models in bivariate allometry.

    Science.gov (United States)

    Packard, Gary C; Birchard, Geoffrey F; Boardman, Thomas J

    2011-08-01

    Several attempts have been made in recent years to formulate a general explanation for what appear to be recurring patterns of allometric variation in morphology, physiology, and ecology of both plants and animals (e.g. the Metabolic Theory of Ecology, the Allometric Cascade, the Metabolic-Level Boundaries hypothesis). However, published estimates for parameters in allometric equations often are inaccurate, owing to undetected bias introduced by the traditional method for fitting lines to empirical data. The traditional method entails fitting a straight line to logarithmic transformations of the original data and then back-transforming the resulting equation to the arithmetic scale. Because of fundamental changes in distributions attending transformation of predictor and response variables, the traditional practice may cause influential outliers to go undetected, and it may result in an underparameterized model being fitted to the data. Also, substantial bias may be introduced by the insidious rotational distortion that accompanies regression analyses performed on logarithms. Consequently, the aforementioned patterns of allometric variation may be illusions, and the theoretical explanations may be wide of the mark. Problems attending the traditional procedure can be largely avoided in future research simply by performing preliminary analyses on arithmetic values and by validating fitted equations in the arithmetic domain. The goal of most allometric research is to characterize relationships between biological variables and body size, and this is done most effectively with data expressed in the units of measurement. Back-transforming from a straight line fitted to logarithms is not a generally reliable way to estimate an allometric equation in the original scale. © 2010 The Authors. Biological Reviews © 2010 Cambridge Philosophical Society.

  10. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering

    Science.gov (United States)

    Gamayunov, K. V.; Khazanov, G. V.

    2006-01-01

    We calculate the pitch-angle diffusion coefficients using the typical wave normal distributions obtained from our self-consistent ring current-EMIC wave model, and try to quantify the effect of EMIC wave normal angle characteristics on relativistic electron scattering.

  11. mixsmsn: Fitting Finite Mixture of Scale Mixture of Skew-Normal Distributions

    Directory of Open Access Journals (Sweden)

    Marcos Oliveira Prates

    2013-09-01

    We present the R package mixsmsn, which implements routines for maximum likelihood estimation (via an expectation-maximization (EM) type algorithm) in finite mixture models with components belonging to the class of scale mixtures of the skew-normal distribution, which we call the FMSMSN models. Both univariate and multivariate responses are considered. It is possible to fix the number of components of the mixture to be fitted, but there exists an option that transfers this responsibility to an automated procedure, through the analysis of several model choice criteria. Plotting routines to generate histograms, plug-in densities and contour plots using the fitted models' output are also available. The precision of the EM estimates can be evaluated through their estimated standard deviations, which can be obtained by the provision of an approximation of the associated information matrix for each particular model in the FMSMSN family. A function to generate artificial samples from several elements of the family is also supplied. Finally, two real data sets are analyzed in order to show the usefulness of the package.

  12. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering in Outer RB

    Science.gov (United States)

    Khazanov, G. V.; Gamayunov, K. V.

    2007-01-01

    We present the equatorial and bounce-averaged pitch angle diffusion coefficients for scattering of relativistic electrons by the H+ mode of EMIC waves. Both the model (prescribed) and self-consistent distributions over the wave normal angle are considered. The main results of our calculation can be summarized as follows: First, in comparison with field-aligned waves, the intermediate and highly oblique waves reduce the pitch angle range subject to diffusion, and strongly suppress the scattering rate for low energy electrons (E less than 2 MeV). Second, for electron energies greater than 5 MeV, the |n| = 1 resonances operate only in a narrow region at large pitch angles, and despite their greatest contribution in the case of field-aligned waves, cannot cause electron diffusion into the loss cone. For those energies, oblique waves at |n| greater than 1 resonances are more effective, extending the range of pitch angle diffusion down to the loss cone boundary, and increasing diffusion at small pitch angles by orders of magnitude.

  13. A method of moments to estimate bivariate survival functions: the copula approach

    Directory of Open Access Journals (Sweden)

    Silvia Angela Osmetti

    2013-05-01

    In this paper we discuss the problem of parametric and non-parametric estimation of the distributions generated by the Marshall-Olkin copula. This copula comes from the Marshall-Olkin bivariate exponential distribution used in reliability analysis. We generalize this model through the copula and different marginal distributions to construct several bivariate survival functions. The cumulative distribution functions are not absolutely continuous, and their unknown parameters often cannot be obtained in explicit form. In order to estimate the parameters we propose an easy procedure based on the moments. This method consists of two steps: in the first step we estimate only the parameters of the marginal distributions, and in the second step we estimate only the copula parameter. This procedure can be used to estimate the parameters of complex survival functions in which it is difficult to find an explicit expression for the mixed moments. Moreover, it is preferred to the maximum likelihood approach because of its simpler mathematical form, in particular for distributions whose maximum likelihood parameter estimators cannot be obtained in explicit form.
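
    The Marshall-Olkin construction underlying the copula can be sketched directly from its shock representation. The code below only simulates the bivariate exponential version and checks the implied marginal rates; it is not the moment-based estimation procedure proposed in the paper, and the rate values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

def marshall_olkin_sample(n, lam1, lam2, lam12):
    """Marshall-Olkin bivariate exponential via independent shocks:
    X = min(E1, E12), Y = min(E2, E12), where E1, E2, E12 are independent
    exponentials with rates lam1, lam2, lam12."""
    e1 = rng.exponential(1.0 / lam1, n)
    e2 = rng.exponential(1.0 / lam2, n)
    e12 = rng.exponential(1.0 / lam12, n)
    return np.minimum(e1, e12), np.minimum(e2, e12)

x, y = marshall_olkin_sample(100_000, lam1=1.0, lam2=2.0, lam12=0.5)

# Marginals are exponential with rates lam1 + lam12 and lam2 + lam12;
# the shared shock also produces ties P(X == Y) > 0 (the singular component).
print(1.0 / x.mean(), 1.0 / y.mean())    # approximately 1.5 and 2.5
print(np.mean(x == y))                    # positive probability of exact ties
```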

  14. GIS-based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the capability of two GIS-based data-driven bivariate techniques, namely the statistical index (SI) and Dempster–Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran.

  15. Dissecting the correlation structure of a bivariate phenotype ...

    Indian Academy of Sciences (India)

    Dissecting the correlation structure of a bivariate phenotype: common genes or shared environment? ... High correlations between two quantitative traits may be either due to common genetic factors or common environmental factors or a combination of both.

  16. An assessment on the use of bivariate, multivariate and soft ...

    Indian Academy of Sciences (India)

    Conditional probability (CP), logistic regression (LR) and artificial neural networks (ANN) models representing the bivariate, multivariate and soft computing techniques were used in GIS based collapse susceptibility mapping in an area from Sivas basin (Turkey). Collapse-related factors, directly or indirectly related to the ...

  17. An assessment on the use of bivariate, multivariate and soft ...

    Indian Academy of Sciences (India)

    The paper presented herein compares and discusses the use of bivariate, multivariate and soft computing techniques for ... map is a useful tool in urban planning. Table 1 gives the frequency ratio of geological factors to collapse occurrences and the P(A/Bi) results obtained from the Conditional Probability model.

  18. About some properties of bivariate splines with shape parameters

    Science.gov (United States)

    Caliò, F.; Marchetti, E.

    2017-07-01

    The paper presents and proves geometrical properties of a particular bivariate spline function, built and algorithmically implemented in previous papers. The properties typical of this family of splines have an impact on the field of computer graphics, in particular on reverse engineering.

  19. How log-normal is your country? An analysis of the statistical distribution of the exported volumes of products

    Science.gov (United States)

    Annunziata, Mario Alberto; Petri, Alberto; Pontuale, Giorgio; Zaccaria, Andrea

    2016-10-01

    We have considered the statistical distributions of the volumes of 1131 products exported by 148 countries. We have found that the form of these distributions is not unique but heavily depends on the level of development of the nation, as expressed by macroeconomic indicators like GDP, GDP per capita, total export and a recently introduced measure of countries' economic complexity called fitness. We have identified three major classes: a) an incomplete log-normal shape, truncated on the left side, for the less developed countries; b) a complete log-normal, with a wider range of volumes, for nations characterized by an intermediate economy; and c) a strongly asymmetric shape for countries with a high degree of development. Finally, the log-normality hypothesis has been checked for the distributions of all 148 countries through different tests, Kolmogorov-Smirnov and Cramér-von Mises, confirming that it cannot be rejected only for the countries of intermediate economy.
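
    A log-normality check of the kind described can be sketched as follows: fit a log-normal to one country's export volumes and apply Kolmogorov-Smirnov and Cramér-von Mises tests to the standardized log values. The data here are simulated placeholders, not the trade data used in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Placeholder for one country's exported volumes across products (illustration only).
volumes = rng.lognormal(mean=10.0, sigma=2.0, size=1131)

# Under the log-normal hypothesis, log(volumes) should be normal.
logs = np.log(volumes)
z = (logs - logs.mean()) / logs.std(ddof=1)

# Note: because the parameters are estimated from the same data, these p-values
# are only approximate (the Lilliefors correction is ignored in this sketch).
ks = stats.kstest(z, "norm")           # Kolmogorov-Smirnov against the standard normal
cvm = stats.cramervonmises(z, "norm")  # Cramér-von Mises against the standard normal
print(f"KS: stat={ks.statistic:.3f}, p={ks.pvalue:.3f}")
print(f"CvM: stat={cvm.statistic:.3f}, p={cvm.pvalue:.3f}")
```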

  20. Bivariate copulas on the exponentially weighted moving average control chart

    Directory of Open Access Journals (Sweden)

    Sasigarn Kuvattana

    2016-10-01

    This paper proposes four types of copulas on the Exponentially Weighted Moving Average (EWMA) control chart when observations are from an exponential distribution, using a Monte Carlo simulation approach. The performance of the control chart is based on the Average Run Length (ARL), which is compared for each copula. Copula functions for specifying dependence between random variables are used and measured by Kendall's tau. The results show that the Normal copula can be used for almost all shifts.
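
    The ARL evaluation by Monte Carlo can be sketched for a single (univariate) EWMA chart on exponential observations. The copula step that couples two streams is not reproduced here, and the smoothing constant and control limits below are arbitrary illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(5)

def ewma_run_length(lam=0.1, L=2.7, mean=1.0, shift=1.0, max_n=100_000):
    """Run length of an EWMA chart on exponential observations with in-control
    mean `mean`; `shift` multiplies the mean to model an out-of-control process."""
    sigma = mean                                  # exponential: sd equals the mean
    sigma_z = sigma * np.sqrt(lam / (2.0 - lam))  # asymptotic EWMA standard deviation
    ucl, lcl = mean + L * sigma_z, max(mean - L * sigma_z, 0.0)
    z = mean
    for n in range(1, max_n + 1):
        x = rng.exponential(mean * shift)
        z = lam * x + (1.0 - lam) * z             # EWMA recursion
        if z > ucl or z < lcl:
            return n
    return max_n

# Average run length estimated over many simulated charts.
arl_in = np.mean([ewma_run_length(shift=1.0) for _ in range(2000)])
arl_out = np.mean([ewma_run_length(shift=1.5) for _ in range(2000)])
print(f"in-control ARL ~ {arl_in:.0f}, out-of-control ARL ~ {arl_out:.0f}")
```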

  1. Multiresolution transmission of the correlation modes between bivariate time series based on complex network theory

    Science.gov (United States)

    Huang, Xuan; An, Haizhong; Gao, Xiangyun; Hao, Xiaoqing; Liu, Pengpeng

    2015-06-01

    This study introduces an approach to studying the multiscale transmission characteristics of the correlation modes between bivariate time series. The correlation between the bivariate time series fluctuates over time. The transmission among the correlation modes exhibits a multiscale phenomenon, which provides richer information. To investigate the multiscale transmission of the correlation modes, this paper describes a hybrid model integrating wavelet analysis and complex network theory to decompose and reconstruct the original bivariate time series into sequences in a joint time-frequency domain and to define the correlation modes in each time-frequency domain. We chose the crude oil spot and futures prices as the sample data. The empirical results indicate that the main duration of volatility (32-64 days) for the strongly positive correlation between the crude oil spot price and the futures price provides more useful information for investors. Moreover, the weighted degree, weighted indegree and weighted outdegree of the correlation modes follow power-law distributions. The correlation fluctuation strengthens the extent of persistence over the long term, whereas persistence weakens over the short and medium term. The primary correlation modes dominating the transmission process and the major intermediary modes in the transmission process are clustered both in the short and long term.

  2. A non-stationary cost-benefit based bivariate extreme flood estimation approach

    Science.gov (United States)

    Qi, Wei; Liu, Junguo

    2018-02-01

    Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation is based on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities in both the dependence of flood variables and the marginal distributions on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-varying dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is utilized to illustrate the application of NSCOBE. Results show NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probability of exceedance calculated from copula functions and marginal distributions. This study for the first time provides a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.

  3. Distribution of CD163-positive cell and MHC class II-positive cell in the normal equine uveal tract.

    Science.gov (United States)

    Sano, Yuto; Matsuda, Kazuya; Okamoto, Minoru; Takehana, Kazushige; Hirayama, Kazuko; Taniyama, Hiroyuki

    2016-02-01

    Antigen-presenting cells (APCs) in the uveal tract participate in ocular immunity including immune homeostasis and the pathogenesis of uveitis. In horses, although uveitis is the most common ocular disorder, little is known about ocular immunity, such as the distribution of APCs. In this study, we investigated the distribution of CD163-positive and MHC II-positive cells in the normal equine uveal tract using an immunofluorescence technique. Eleven eyes from 10 Thoroughbred horses aged 1 to 24 years old were used. Indirect immunofluorescence was performed using the primary antibodies CD163, MHC class II (MHC II) and CD20. To demonstrate the site of their greatest distribution, positive cells were manually counted in 3 different parts of the uveal tract (ciliary body, iris and choroid), and their average number was assessed by statistical analysis. The distribution of pleomorphic CD163- and MHC II-expressed cells was detected throughout the equine uveal tract, but no CD20-expressed cells were detected. The statistical analysis demonstrated the distribution of CD163- and MHC II-positive cells focusing on the ciliary body. These results demonstrated that the ciliary body is the largest site of their distribution in the normal equine uveal tract, and the ciliary body is considered to play important roles in uveal and/or ocular immune homeostasis. The data provided in this study will help further understanding of equine ocular immunity in the normal state and might be beneficial for understanding of mechanisms of ocular disorders, such as equine uveitis.

  4. A Novel Multivariate Generalized Skew-Normal Distribution with Two Parameters BGSNn, m (λ1, λ2)

    Directory of Open Access Journals (Sweden)

    Fathi B.

    2014-07-01

    In this paper we first introduce a new class of multivariate generalized asymmetric skew-normal distributions with two parameters λ1, λ2, which we denote by BGSNn,m(λ1, λ2), and we then obtain some special properties of BGSNn,m(λ1, λ2).

  5. A Novel Multivariate Generalized Skew-Normal Distribution with Two Parameters BGSNn, m (λ1, λ2)

    OpenAIRE

    Fathi B.; Hasanalipour P.

    2014-01-01

    In this paper we first introduce a new class of multivariate generalized asymmetric skew-normal distributions with two parameters λ1, λ2, which we denote by BGSNn,m(λ1, λ2), and we then obtain some special properties of BGSNn,m(λ1, λ2).

  6. Unified Formulation of Single- and Multimoment Normalizations of the Raindrop Size Distribution Based on the Gamma Probability Density Function

    NARCIS (Netherlands)

    Yu, N.; Delrieu, G.; Boudevillain, Brice; Hazenberg, P.; Uijlenhoet, R.

    2014-01-01

    This study offers a unified formulation of single- and multimoment normalizations of the raindrop size distribution (DSD), which have been proposed in the framework of scaling analyses in the literature. The key point is to consider a well-defined “general distribution” g(x) as the probability

  7. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, Addendum

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1975-01-01

    New results and insights concerning a previously published iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions were discussed. It was shown that the procedure converges locally to the consistent maximum likelihood estimate as long as a specified parameter is bounded between two limits. Bound values were given to yield optimal local convergence.
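
    The iterative procedure discussed is an EM-type scheme; the sketch below is a generic EM algorithm for a univariate two-component normal mixture, included only to make the structure of such iterations concrete. It is not the specific procedure analyzed in the addendum, and the starting values and data are placeholders.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

# Placeholder data: a mixture of N(0, 1) and N(3, 0.5^2) with weight 0.3 on the first.
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(3, 0.5, 700)])

def em_two_normals(x, n_iter=200):
    """Generic EM for a two-component univariate normal mixture."""
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()])
    sd = np.array([x.std(), x.std()])
    for _ in range(n_iter):
        # E-step: posterior responsibilities of each component for each point.
        dens = w * norm.pdf(x[:, None], mu, sd)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood updates.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd

print(em_two_normals(data))
```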

  8. Are There More Gifted People Than Would Be Expected in a Normal Distribution? An Investigation of the Overabundance Hypothesis

    Science.gov (United States)

    Warne, Russell T.; Godwin, Lindsey R.; Smith, Kyle V.

    2013-01-01

    Among some gifted education researchers, advocates, and practitioners, it is sometimes believed that there is a larger number of gifted people in the general population than would be predicted from a normal distribution (e.g., Gallagher, 2008; N. M. Robinson, Zigler, & Gallagher, 2000; Silverman, 1995, 2009), a belief that we termed the…

  9. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD, named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable to be applied to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied to the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of the IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators in identifying differences in standing posture between groups.

  10. Spectrum-based estimators of the bivariate Hurst exponent

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2014-01-01

    Roč. 90, č. 6 (2014), art. 062802 ISSN 1539-3755 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords: bivariate Hurst exponent * power-law cross-correlations * estimation Subject RIV: AH - Economics Impact factor: 2.288, year: 2014 http://library.utia.cas.cz/separaty/2014/E/kristoufek-0436818.pdf

  11. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.

    2015-06-08

    Computing the probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression for the Log-normal sum distribution does not exist and remains an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive, especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.
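
    For context, the baseline that importance sampling improves on can be sketched as a crude Monte Carlo estimate of the CCDF of a sum of independent log-normal RVs. The hazard-rate twisting sampler itself is not reproduced here, and the parameters and threshold are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

def ccdf_crude_mc(threshold, mus, sigmas, n_samples=1_000_000):
    """Crude Monte Carlo estimate of P(sum_i X_i > threshold) for independent
    log-normal RVs X_i with underlying normal parameters (mus[i], sigmas[i])."""
    x = rng.lognormal(mean=np.asarray(mus), sigma=np.asarray(sigmas),
                      size=(n_samples, len(mus)))
    hits = (x.sum(axis=1) > threshold)
    p_hat = hits.mean()
    rel_err = hits.std(ddof=1) / np.sqrt(n_samples) / max(p_hat, 1e-300)
    return p_hat, rel_err

# Three non-identically distributed log-normal terms; a moderately rare threshold.
p, re = ccdf_crude_mc(threshold=60.0, mus=[0.5, 1.0, 1.5], sigmas=[0.5, 0.75, 1.0])
print(f"P(sum > 60) ~ {p:.2e} (relative error ~ {re:.2f})")
# For genuinely rare events the relative error blows up, which is exactly the
# regime where the hazard-rate-twisting importance sampler pays off.
```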

  12. Study of regional stability of 99Tcm-ECD distribution in normal brain by SPM and ROI

    International Nuclear Information System (INIS)

    Li Peiyong; Guo Wanhua; Chen Gang; Zhu Chengmo

    2001-01-01

    Objective: To quantify regional cerebral blood flow (rCBF) with repeated 99Tcm-ECD brain SPECT. Methods: Each of thirteen normal volunteers (31.2 ± 11.8 years old) underwent 12 SPECT scans starting 1 hour after injection of 99Tcm-ECD; the acquisition lasted 60 minutes. The distribution of 99Tcm-ECD in the brain was analyzed by SPM and an ROI method. Results: There was no difference in regional ECD distribution within 60 min in the cortex, basal ganglia and thalamus as shown by SPM. ROI analysis showed a very slow change of the regional gray matter/white matter (G/W) ratio over time, and the slope of the curve was almost zero. Conclusion: Regional ECD distribution is stable in the normal brain. ECD clearance from the brain is slow, and its changes within 60 minutes were not significantly different.

  13. Carbon K-shell photoionization of CO: Molecular frame angular distributions of normal and conjugate shakeup satellites

    International Nuclear Information System (INIS)

    Jahnke, T.; Titze, J.; Foucar, L.; Wallauer, R.; Osipov, T.; Benis, E.P.; Jagutzki, O.; Arnold, W.; Czasch, A.; Staudte, A.; Schoeffler, M.; Alnaser, A.; Weber, T.; Prior, M.H.; Schmidt-Boecking, H.; Doerner, R.

    2011-01-01

    We have measured the molecular frame angular distributions of photoelectrons emitted from the Carbon K-shell of fixed-in-space CO molecules for the case of simultaneous excitation of the remaining molecular ion. Normal and conjugate shakeup states are observed. Photoelectrons belonging to normal Σ-satellite lines show an angular distribution resembling that observed for the main photoline at the same electron energy. Surprisingly a similar shape is found for conjugate shakeup states with Π-symmetry. In our data we identify shake rather than electron scattering (PEVE) as the mechanism producing the conjugate lines. The angular distributions clearly show the presence of a Σ shape resonance for all of the satellite lines.

  14. A Bayesian Model For The Estimation Of Latent Interaction And Quadratic Effects When Latent Variables Are Non-Normally Distributed.

    Science.gov (United States)

    Kelava, Augustin; Nagengast, Benjamin

    2012-09-01

    Structural equation models with interaction and quadratic effects have become a standard tool for testing nonlinear hypotheses in the social sciences. Most of the current approaches assume normally distributed latent predictor variables. In this article, we present a Bayesian model for the estimation of latent nonlinear effects when the latent predictor variables are nonnormally distributed. The nonnormal predictor distribution is approximated by a finite mixture distribution. We conduct a simulation study that demonstrates the advantages of the proposed Bayesian model over contemporary approaches (Latent Moderated Structural Equations [LMS], Quasi-Maximum-Likelihood [QML], and the extended unconstrained approach) when the latent predictor variables follow a nonnormal distribution. The conventional approaches show biased estimates of the nonlinear effects; the proposed Bayesian model provides unbiased estimates. We present an empirical example from work and stress research and provide syntax for substantive researchers. Advantages and limitations of the new model are discussed.

  15. Subchondral bone density distribution of the talus in clinically normal Labrador Retrievers.

    Science.gov (United States)

    Dingemanse, W; Müller-Gerbl, M; Jonkers, I; Vander Sloten, J; van Bree, H; Gielen, I

    2016-03-15

    Bones continually adapt their morphology to their load bearing function. At the level of the subchondral bone, the density distribution is highly correlated with the loading distribution of the joint. Therefore, subchondral bone density distribution can be used to study joint biomechanics non-invasively. In addition physiological and pathological joint loading is an important aspect of orthopaedic disease, and research focusing on joint biomechanics will benefit veterinary orthopaedics. This study was conducted to evaluate density distribution in the subchondral bone of the canine talus, as a parameter reflecting the long-term joint loading in the tarsocrural joint. Two main density maxima were found, one proximally on the medial trochlear ridge and one distally on the lateral trochlear ridge. All joints showed very similar density distribution patterns and no significant differences were found in the localisation of the density maxima between left and right limbs and between dogs. Based on the density distribution the lateral trochlear ridge is most likely subjected to highest loads within the tarsocrural joint. The joint loading distribution is very similar between dogs of the same breed. In addition, the joint loading distribution supports previous suggestions of the important role of biomechanics in the development of OC lesions in the tarsus. Important benefits of computed tomographic osteoabsorptiometry (CTOAM), i.e. the possibility of in vivo imaging and temporal evaluation, make this technique a valuable addition to the field of veterinary orthopaedic research.

  16. Different percentages of false-positive results obtained using five methods for the calculation of reference change values based on simulated normal and ln-normal distributions of data

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2016-01-01

    a homeostatic set point that follows a normal (Gaussian) distribution. This set point (or baseline in steady-state) should be estimated from a set of previous samples, but, in practice, decisions based on reference change value are often based on only two consecutive results. The original reference change value......-positive results. The aim of this study was to investigate false-positive results using five different published methods for calculation of reference change value. METHODS: The five reference change value methods were examined using normally and ln-normally distributed simulated data. RESULTS: One method performed...... best in approaching the theoretical false-positive percentages on normally distributed data and another method performed best on ln-normally distributed data. The commonly used reference change value method based on two results (without use of estimated set point) performed worst both on normally...

  17. Rotational velocity distribution of A stars: Searching for intrinsic slowly rotating normal A0-A1 stars

    Science.gov (United States)

    Royer, F.; Gebran, M.; Monier, R.; Caraty, Y.; Kiliçoğlu, T.; Pintado, O.; Adelman, S.; Smalley, B.; Reiners, A.; Hill, G.; Gulliver, A.

    2012-12-01

    Royer et al. (2007) showed that the distribution of rotational velocities for A0-A1 stars is bimodal, although all known peculiar and/or binary stars had been excluded from their sample. We present here the preliminary results of the abundance analysis for 47 A0-A1 ``normal'' main sequence stars selected with v sin i lower than 65 km/s. These high signal-to-noise spectra collected with ÉLODIE and SOPHIE (OHP) will allow us to obtain a clean sample of low v sin i normal A0-A1 stars and search for intrinsic slow rotators.

  18. Linking the Value Assessment of Oil and Gas Firms to Ambidexterity Theory Using a Mixture of Normal Distributions

    Directory of Open Access Journals (Sweden)

    Casault Sébastien

    2016-05-01

    Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory: the variation in their market returns is non-Gaussian. In this paper, the nature and underlying reason for these significant deviations from expected behavior are considered. Understanding these differences in financial market behavior is important for a wide range of reasons, including: assessing investments, investor relations, decisions to raise capital, and assessment of firm and management performance. We show that using a “thicker tailed” mixture of two normal distributions offers a significantly more accurate model than the traditional Gaussian approach in describing the behavior of the value of oil and gas firms. This mixture of normal distributions is also more effective in bridging the gap between management theory and practice without the need to introduce complex time-sensitive GARCH and/or jump diffusion dynamics. The mixture distribution is consistent with ambidexterity theory, which suggests firms operate in two distinct states driven by the primary focus of the firm: an exploration state with high uncertainty and an exploitation (or production) state with lower uncertainty. The findings have direct implications for improving the accuracy of real option pricing techniques and futures analysis of risk management. Traditional option pricing models assume that commercial returns from these assets are described by a normal random walk. However, a normal random walk model discounts the possibility of large changes to the marketplace from events such as the discovery of important reserves or the introduction of new technology. The mixture distribution proves to be well suited to inherently describe the unusually large risks and opportunities associated with oil and gas production and exploration. A significance testing study of 554 oil and gas exploration and production firms empirically supports using a mixture

  19. Software Application Profile: RVPedigree: a suite of family-based rare variant association tests for normally and non-normally distributed quantitative traits.

    Science.gov (United States)

    Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt

    2016-04-01

    RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features here are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and also to appropriately adjust for related individuals, either from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally-intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  20. Contributory fault and level of personal injury to drivers involved in head-on collisions: Application of copula-based bivariate ordinal models.

    Science.gov (United States)

    Wali, Behram; Khattak, Asad J; Xu, Jingjing

    2018-01-01

    The main objective of this study is to simultaneously investigate the degree of injury severity sustained by drivers involved in head-on collisions with respect to fault status designation. This is complicated to answer due to many issues, one of which is the potential presence of correlation between injury outcomes of drivers involved in the same head-on collision. To address this concern, we present seemingly unrelated bivariate ordered response models by analyzing the joint injury severity probability distribution of at-fault and not-at-fault drivers. Moreover, the assumption of bivariate normality of residuals and the linear form of stochastic dependence implied by such models may be unduly restrictive. To test this, Archimedean copula structures and normal mixture marginals are integrated into the joint estimation framework, which can characterize complex forms of stochastic dependencies and non-normality in residual terms. The models are estimated using 2013 Virginia police-reported two-vehicle head-on collision data, where exactly one driver is at-fault. The results suggest that both at-fault and not-at-fault drivers sustained serious/fatal injuries in 8% of crashes, whereas, in 4% of the cases, the not-at-fault driver sustained a serious/fatal injury with no injury to the at-fault driver at all. Furthermore, if the at-fault driver is fatigued, apparently asleep, or has been drinking, the not-at-fault driver is more likely to sustain a severe/fatal injury, controlling for other factors and potential correlations between the injury outcomes. While not-at-fault vehicle speed affects injury severity of the at-fault driver, the effect is smaller than the effect of at-fault vehicle speed on the at-fault injury outcome. Conversely, and importantly, the effect of at-fault vehicle speed on injury severity of the not-at-fault driver is almost equal to the effect of not-at-fault vehicle speed on the injury outcome of the not-at-fault driver. Compared to traditional ordered probability
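    Purely illustrative sketch of the copula idea above (theta and the category thresholds are placeholders, not estimates from this study): dependent latent severities for the two drivers are drawn from a Clayton copula, an Archimedean family, via the gamma-frailty construction and then cut into ordinal injury categories.

      # Clayton-copula draws for two dependent ordinal injury outcomes.
      import numpy as np

      def clayton_pairs(n, theta, rng):
          v = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)   # shared gamma frailty
          e = rng.exponential(size=(n, 2))
          return (1.0 + e / v[:, None]) ** (-1.0 / theta)       # uniform margins, Clayton dependence

      rng = np.random.default_rng(1)
      u = clayton_pairs(10000, theta=2.0, rng=rng)
      cuts = [0.5, 0.85, 0.96]                                  # no injury / minor / serious / fatal
      severity = np.digitize(u, cuts)                           # ordinal category per driver
      print(np.corrcoef(severity[:, 0], severity[:, 1])[0, 1])  # positive dependence between drivers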

  1. Site-dependent distribution of macrophages in normal human extraocular muscles

    NARCIS (Netherlands)

    Schmidt, E. D.; van der Gaag, R.; Mourits, M. P.; Koornneef, L.

    1993-01-01

    PURPOSE: Clinical data indicate that extraocular muscles have different susceptibilities for some orbital immune disorders depending on their anatomic location. The resident immunocompetent cells may be important mediators in the local pathogenesis of such disorders so the distribution of these

  2. Mixtures of Distributions, Moment Inequalities and Measures of Exponentiality and Normality

    NARCIS (Netherlands)

    Keilson, Julian; Sleutel, F.W.

    1974-01-01

    The central limit theorem and limit theorems for rarity require measures of normality and exponentiality for their implementation. Simple useful measures are exhibited for these in a metric space setting, obtained from inequalities for scale mixtures and power mixtures. It is shown that the Pearson

  3. Distribution Log Normal of {sup 222} Rn in the state of Zacatecas, Mexico; Distribucion Log Normal de {sup 222} Rn en el estado de Zacatecas, Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, M.L.; Mireles, F.; Quirino, L.; Davila, I.; Rios, C.; Pinedo, J.L. [Universidad de Zacatecas, Cipres 10, Frac. La Penuela, 98068 Zacatecas (Mexico)]. e-mail: mluisagb@hotmail.com

    2006-07-01

    In this work the evaluation of the concentration of {sup 222} Rn in air for Zacatecas is shown. Solid State Nuclear Track Detectors with cellulose nitrate LR-115, type 2, in open {sup 222} Rn chambers were used as the technique for carrying out the measurements on a large scale. The measurements were carried out over three months in different times of the year. The results present the log-normal distribution, arithmetic mean and geometric mean of the concentration indoors and outdoors of residential buildings, indoors of occupational buildings, and in the 57 municipal seats of the state of Zacatecas. The statistics of the concentration values showed variation according to the time of the year, with higher values in the winter season in both cases. The distribution of the {sup 222} Rn concentration is presented on the state map for each one of the municipalities, representing the measurement places in the entire state of Zacatecas. Finally, the places where the {sup 222} Rn concentrations in air are near the limit of 148 Bq/m{sup 3} established by the EPA are presented. (Author)
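    Hedged sketch of the kind of summary reported above: for a set of indoor {sup 222} Rn measurements (the values below are made up for illustration), compute the arithmetic mean, the geometric mean and multiplicative standard deviation of a fitted log-normal model, and the modelled fraction of dwellings exceeding the 148 Bq/m{sup 3} limit.

      import numpy as np
      from scipy.stats import norm

      radon = np.array([22., 35., 48., 60., 75., 90., 120., 150., 210., 300.])  # Bq/m3, illustrative
      logs = np.log(radon)
      arithmetic_mean = radon.mean()
      geometric_mean = np.exp(logs.mean())
      geometric_sd = np.exp(logs.std(ddof=1))
      # Probability of exceeding 148 Bq/m3 under the fitted log-normal model
      p_exceed = 1.0 - norm.cdf((np.log(148.0) - logs.mean()) / logs.std(ddof=1))
      print(arithmetic_mean, geometric_mean, geometric_sd, p_exceed)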

  4. Use of log-skew-normal distribution in analysis of continuous data with a discrete component at zero

    OpenAIRE

    Chai, High Seng; Bailey, Kent R.

    2008-01-01

    The problem of analyzing a continuous variable with a discrete component is addressed within the framework of the mixture model proposed by Moulton and Halsey. The model can be generalized by the introduction of the log-skew-normal distribution for the continuous component, and the fit can be significantly improved by its use, while retaining the interpretation of regression parameter estimates. Simulation studies and application to a real data set are used for demonstration.

  5. Problems with using the normal distribution--and ways to improve quality and efficiency of data analysis.

    Directory of Open Access Journals (Sweden)

    Eckhard Limpert

    Full Text Available BACKGROUND: The Gaussian or normal distribution is the most established model to characterize quantitative variation of original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, by mean ± SD, or with the standard error of the mean, mean ± SEM. This, together with corresponding bars in graphical displays, has become the standard to characterize variation. METHODOLOGY/PRINCIPAL FINDINGS: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the "95% range check", their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to regard causes of variation. Multiplicative causes are in general far more important than additive ones, and benefit from a multiplicative (or log-normal) approach. Fortunately, quite similar to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of a new sign, x/ (times-divide), and corresponding notation. Analogous to mean ± SD, it connects the multiplicative (or geometric) mean, mean*, and the multiplicative standard deviation, s*, in the form mean* x/ s*, which is advantageous and recommended. CONCLUSIONS/SIGNIFICANCE: The corresponding shift from the symmetric to the asymmetric view will substantially increase both recognition of data distributions and interpretation quality. It will allow for savings in sample size that can be considerable. Moreover, this is in line with ethical responsibility. Adequate models will improve concepts and theories, and provide deeper insight into science and life.
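    Minimal sketch of the multiplicative summary advocated above: the geometric mean (mean*) and multiplicative standard deviation (s*) reported as mean* x/ s*, with the 68% and 95% ranges obtained by dividing and multiplying once or twice by s*. A synthetic log-normal sample stands in for real data.

      import numpy as np

      def multiplicative_summary(x):
          logs = np.log(np.asarray(x, dtype=float))
          mean_star = np.exp(logs.mean())            # geometric mean
          s_star = np.exp(logs.std(ddof=1))          # multiplicative SD, always >= 1
          return mean_star, s_star

      rng = np.random.default_rng(2)
      data = rng.lognormal(mean=1.0, sigma=0.5, size=1000)
      m, s = multiplicative_summary(data)
      print(f"mean* x/ s* = {m:.2f} x/ {s:.2f}")
      print("68% range:", (m / s, m * s))
      print("95% range:", (m / s**2, m * s**2))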

  6. Computational approach to Thornley's problem by bivariate operational calculus

    Science.gov (United States)

    Bazhlekova, E.; Dimovski, I.

    2012-10-01

    Thornley's problem is an initial-boundary value problem with a nonlocal boundary condition for a linear one-dimensional reaction-diffusion equation, used as a mathematical model of spiral phyllotaxis in botany. Applying a bivariate operational calculus, we find an explicit representation of the solution, containing two convolution products of special solutions and the arbitrary initial and boundary functions. We use a non-classical convolution with respect to the space variable, extending in this way the classical Duhamel principle. The special solutions involved are represented in the form of fast convergent series. Numerical examples are considered to show the application of the present technique and to analyze the character of the solution.

  7. A normalization method for combination of laboratory test results from different electronic healthcare databases in a distributed research network.

    Science.gov (United States)

    Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong

    2016-03-01

    Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for heterogeneous clinico-epidemiologic characteristics of the data and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN. For simplicity of comparison, age and gender were used to adjust population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests, compared with the existing methods. The SAN method is applicable in a DRN environment and should facilitate analysis of data integrated across DRN partners for retrospective observational studies. Copyright © 2015 John Wiley & Sons, Ltd.
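    A rough sketch in the spirit of the subgroup adjustment described above (the published SAN method may differ in detail): within each subgroup the values are standardized against the local mean and SD and rescaled to reference values supplied per subgroup, so that subgroup-adjusted means and SDs agree across data sources. Column names and reference numbers are illustrative only.

      import pandas as pd

      def subgroup_adjust(df, value_col, group_cols, reference):
          """reference: DataFrame indexed by the subgroup key with columns 'mean' and 'std'."""
          out = df.copy()
          for key, idx in df.groupby(group_cols).groups.items():
              local = df.loc[idx, value_col]
              ref = reference.loc[key]
              z = (local - local.mean()) / local.std(ddof=1)   # standardize within the subgroup
              out.loc[idx, value_col] = ref["mean"] + z * ref["std"]
          return out

      df = pd.DataFrame({"sex": ["F", "F", "F", "M", "M", "M"],
                         "creatinine": [0.7, 0.8, 0.9, 1.0, 1.1, 1.2]})
      ref = pd.DataFrame({"mean": [0.75, 1.05], "std": [0.12, 0.15]}, index=["F", "M"])
      print(subgroup_adjust(df, "creatinine", "sex", ref))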

  8. Topographical Distribution of Arsenic, Manganese, and Selenium in the Normal Human Brain

    DEFF Research Database (Denmark)

    Larsen, Niels Agersnap; Pakkenberg, H.; Damsgaard, Else

    1979-01-01

    The concentrations of arsenic, manganese and selenium per gram wet tissue weight were determined in samples from 24 areas of normal human brains from 5 persons with ages ranging from 15 to 81 years of age. The concentrations of the 3 elements were determined for each sample by means of neutron......% for selenium. The results seem to indicate that arsenic is associated with the lipid phase, manganese with the dry matter and selenium with the aqueous phase of brain tissue....

  9. Rough Sets and Stomped Normal Distribution for Simultaneous Segmentation and Bias Field Correction in Brain MR Images.

    Science.gov (United States)

    Banerjee, Abhirup; Maji, Pradipta

    2015-12-01

    The segmentation of brain MR images into different tissue classes is an important task in automatic image analysis, particularly due to the presence of the intensity inhomogeneity artifact in MR images. In this regard, this paper presents a novel approach for simultaneous segmentation and bias field correction in brain MR images. It judiciously integrates the concept of rough sets and the merit of a novel probability distribution, called the stomped normal (SN) distribution. The intensity distribution of a tissue class is represented by an SN distribution, where each tissue class consists of a crisp lower approximation and a probabilistic boundary region. The intensity distribution of a brain MR image is modeled as a mixture of a finite number of SN distributions and one uniform distribution. The proposed method incorporates both the expectation-maximization and hidden Markov random field frameworks to provide an accurate and robust segmentation. The performance of the proposed approach, along with a comparison with related methods, is demonstrated on a set of synthetic and real brain MR images for different bias fields and noise levels.

  10. Distributed hierarchical control architecture for integrating smart grid assets during normal and disrupted operations

    Science.gov (United States)

    Kalsi, Karan; Fuller, Jason C.; Somani, Abhishek; Pratt, Robert G.; Chassin, David P.; Hammerstrom, Donald J.

    2017-09-12

    Disclosed herein are representative embodiments of methods, apparatus, and systems for facilitating operation and control of a resource distribution system (such as a power grid). Among the disclosed embodiments is a distributed hierarchical control architecture (DHCA) that enables smart grid assets to effectively contribute to grid operations in a controllable manner, while helping to ensure system stability and equitably rewarding their contribution. Embodiments of the disclosed architecture can help unify the dispatch of these resources to provide both market-based and balancing services.

  11. Log-normal spray drop distribution...analyzed by two new computer programs

    Science.gov (United States)

    Gerald S. Walton

    1968-01-01

    Results of U.S. Forest Service research on chemical insecticides suggest that large drops are not as effective as small drops in carrying insecticides to target insects. Two new computer programs have been written to analyze size distribution properties of drops from spray nozzles. Coded in Fortran IV, the programs have been tested on both the CDC 6400 and the IBM 7094...

  12. Simulation study of pO2 distribution in induced tumour masses and normal tissues within a microcirculation environment.

    Science.gov (United States)

    Li, Mao; Li, Yan; Wen, Peng Paul

    2014-01-01

    The biological microenvironment is interrupted when tumour masses are introduced because of the strong competition for oxygen. During the period of avascular growth of tumours, pre-existing capillaries play a crucial role in supplying oxygen to both tumourous and healthy cells. Because the oxygen supply from capillaries is limited, healthy cells have to compete for oxygen with tumourous cells. In this study, an improved Krogh cylinder model, which is more realistic than the previously reported assumption that oxygen is homogeneously distributed in a microenvironment, is proposed to describe the process of oxygen diffusion from a capillary to its surrounding environment. The capillary wall permeability is also taken into account. The simulation results show that when tumour masses are implanted at the upstream part of a capillary and followed by normal tissues, the whole of the normal tissue suffers from hypoxia. In contrast, when normal tissues are ahead of tumour masses, their pO2 is sufficient. In both situations, the pO2 in the normal tissues drops significantly due to axial diffusion at the interface of normal tissues and tumourous cells. As axial oxygen diffusion cannot supply the whole tumour mass, only those tumourous cells near the interface can be partially supplied, and have a small chance to survive.

  13. Skewness of the generalized centrifugal force divergence for a joint normal distribution of strain and vorticity components

    Science.gov (United States)

    Hua, Bach Lien

    1994-09-01

    This note attempts to connect the skewness of the probability distribution function (PDF) of pressure, which is commonly observed in two-dimensional turbulence, to differences in the geometry of the strain and vorticity fields. This paper illustrates analytically the respective roles of strain and vorticity in shaping the PDF of pressure, in the particular case of a joint normal distribution of velocity gradients. The latter assumption is not valid in general in direct numerical simulations (DNS) of two-dimensional turbulence but may apply to geostrophic turbulence in presence of a differential rotation (β effect). In essence, minus the Laplacian of pressure is the difference of squared strain and vorticity, a quantity which is named the generalized centrifugal force divergence (GCFD). Square strain and vorticity distributions follow chi-square statistics with unequal numbers of degrees of freedom, when one assumes a joint normal distribution of their components. Squared strain has two degrees of freedom and squared vorticity only one, thereby causing a skewness of the PDF of GCFD and hence of pressure.
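    Worked numerical check of the statement above (illustrative, not from the paper): with independent zero-mean unit-variance normal strain components and vorticity, squared strain follows a chi-square law with 2 degrees of freedom and squared vorticity one with a single degree of freedom, so their difference (the GCFD) is skewed.

      import numpy as np
      from scipy.stats import skew

      rng = np.random.default_rng(3)
      n = 1_000_000
      s1, s2, w = rng.standard_normal((3, n))     # strain components and vorticity
      gcfd = (s1**2 + s2**2) - w**2               # chi-square(2) minus chi-square(1)
      print("mean =", gcfd.mean(), " skewness =", skew(gcfd))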

  14. Multiple imputation methods for bivariate outcomes in cluster randomised trials.

    Science.gov (United States)

    DiazOrdaz, K; Kenward, M G; Gomes, M; Grieve, R

    2016-09-10

    Missing observations are common in cluster randomised trials. The problem is exacerbated when modelling bivariate outcomes jointly, as the proportion of complete cases is often considerably smaller than the proportion having either of the outcomes fully observed. Approaches taken to handling such missing data include the following: complete case analysis, single-level multiple imputation that ignores the clustering, multiple imputation with a fixed effect for each cluster and multilevel multiple imputation. We contrasted the alternative approaches to handling missing data in a cost-effectiveness analysis that uses data from a cluster randomised trial to evaluate an exercise intervention for care home residents. We then conducted a simulation study to assess the performance of these approaches on bivariate continuous outcomes, in terms of confidence interval coverage and empirical bias in the estimated treatment effects. Missing-at-random clustered data scenarios were simulated following a full-factorial design. Across all the missing data mechanisms considered, the multiple imputation methods provided estimators with negligible bias, while complete case analysis resulted in biased treatment effect estimates in scenarios where the randomised treatment arm was associated with missingness. Confidence interval coverage was generally in excess of nominal levels (up to 99.8%) following fixed-effects multiple imputation and too low following single-level multiple imputation. Multilevel multiple imputation led to coverage levels of approximately 95% throughout. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  15. Effects of adipose tissue distribution on maximum lipid oxidation rate during exercise in normal-weight women.

    Science.gov (United States)

    Isacco, L; Thivel, D; Duclos, M; Aucouturier, J; Boisseau, N

    2014-06-01

    Fat mass localization affects lipid metabolism differently at rest and during exercise in overweight and normal-weight subjects. The aim of this study was to investigate the impact of a low vs high ratio of abdominal to lower-body fat mass (an index of adipose tissue distribution) on the exercise intensity (Lipox(max)) that elicits the maximum lipid oxidation rate in normal-weight women. Twenty-one normal-weight women (22.0 ± 0.6 years, 22.3 ± 0.1 kg.m(-2)) were separated into two groups of either a low or high abdominal to lower-body fat mass ratio [L-A/LB (n = 11) or H-A/LB (n = 10), respectively]. Lipox(max) and maximum lipid oxidation rate (MLOR) were determined during a submaximum incremental exercise test. Abdominal and lower-body fat mass were determined from DXA scans. The two groups did not differ in aerobic fitness, total fat mass, or total and localized fat-free mass. Lipox(max) and MLOR were significantly lower in H-A/LB vs L-A/LB women (43 ± 3% VO(2max) vs 54 ± 4% VO(2max), and 4.8 ± 0.6 mg min(-1) kg FFM(-1) vs 8.4 ± 0.9 mg min(-1) kg FFM(-1), respectively; both differences statistically significant). In normal-weight women, a predominantly abdominal fat mass distribution compared with a predominantly peripheral fat mass distribution is associated with a lower capacity to maximize lipid oxidation during exercise, as evidenced by their lower Lipox(max) and MLOR. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  16. Distribution of calcium pyroantimonate precipitates in Xenotoca Mauthner cells at normal and increased functional activity.

    Science.gov (United States)

    Moshkov, D A; Santalova, I M

    1995-04-01

    The pyroantimonate method was used for the ultrastructural localization of calcium ions (Ca2+) in Xenotoca Mauthner cells under normal conditions and after prolonged natural stimulation. In normal state, the highest concentration of these ions was observed as compact electron-dense precipitates inside the synaptic cleft exactly at the synaptic active zones. Some amount of dotted precipitates was revealed in the synaptic boutons. In the extracellular space and in the cytoplasm the precipitates are seen mainly as single membrane-bound dots. After prolonged stimulation significant redistribution of the precipitates was observed. They were entirely absent in the presynaptic areas, became diffuse and discontinuous or disappeared completely at the synaptic active zones. On the contrary, in the cytoplasmic organelles (subsynaptic cisternae, vacuoles, smooth reticulum, mitochondria) the precipitates were aggregated into continuous dense clusters inside the membranous compartments or on their surfaces. Also, large amounts of granules, not associated with membranes, were localized inside the cytoplasm directly at the cytoskeletal elements. It is suggested that membrane subsynaptic organelles are the primary structures which sequestrate, accumulate and retain Ca2+. Thus, these elements, together with deeper elements of smooth cytoplasmic reticulum, may control the cytoplasmic activity of Ca2+ and, as a consequence, control many physiologically significant reactions of the neurons.

  17. The pharmacokinetics, distribution and degradation of human recombinant interleukin 1 beta in normal rats

    DEFF Research Database (Denmark)

    Wogensen, L D; Welinder, B; Hejnaes, K R

    1991-01-01

    the circulation with a T1/2 alpha of 2.9 min and a T1/2 beta of 41.1 min. The central and peripheral volume of distribution was 20.7 and 19.1 ml/rat, respectively, and the metabolic clearance rate was 16.9 ml/min/kg. The kidney and liver showed the highest accumulation of tracer, and autoradiography demonstrated...

  18. Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning

    OpenAIRE

    Rachkovskij, Dmitri A.; Kussul, Ernst M.

    2001-01-01

    Distributed representations were often criticized as inappropriate for encoding of data with a complex structure. However Plate's Holographic Reduced Representations and Kanerva's Binary Spatter Codes are recent schemes that allow on-the-fly encoding of nested compositional structures by real-valued or dense binary vectors of fixed dimensionality. In this paper we consider procedures of the Context-Dependent Thinning which were developed for representation of complex hierarchical items in the...

  19. Normal loads program for aerodynamic lifting surface theory. [evaluation of spanwise and chordwise loading distributions

    Science.gov (United States)

    Medan, R. T.; Ray, K. S.

    1974-01-01

    A description of and users manual are presented for a U.S.A. FORTRAN 4 computer program which evaluates spanwise and chordwise loading distributions, lift coefficient, pitching moment coefficient, and other stability derivatives for thin wings in linearized, steady, subsonic flow. The program is based on a kernel function method lifting surface theory and is applicable to a large class of planforms including asymmetrical ones and ones with mixed straight and curved edges.

  20. The pharmacokinetics, distribution and degradation of human recombinant interleukin 1 beta in normal rats

    DEFF Research Database (Denmark)

    Reimers, J; Wogensen, L D; Welinder, B

    1991-01-01

    Based upon in vivo rat experiments it was recently suggested that interleukin 1 in the circulation may be implicated in the initial events of beta-cell destruction leading to insulin-dependent diabetes mellitus (IDDM) in humans. The aim of the present study was to estimate the half-lives of the distribution (T1/2 alpha) and elimination (T1/2 beta) phases of human recombinant interleukin 1 beta (rIL-1 beta), and its tissue distribution and cellular localization, by means of mono-labelled, biologically active 125I-rIL-1 beta. After intravenous (i.v.) injection, 125I-rIL-1 beta was eliminated from ... The route of administration was of importance for the biological effects of rIL-1 beta, as demonstrated by a reduced food intake, increased rectal temperature and blood glucose after s.c. injection of rIL-1 beta compared with i.p. The present demonstration of intact rIL-1 beta in the circulation and the islets of Langerhans ...

  1. Historical and future drought in Bangladesh using copula-based bivariate regional frequency analysis

    Science.gov (United States)

    Mortuza, Md Rubayet; Moges, Edom; Demissie, Yonas; Li, Hong-Yi

    2018-02-01

    The study aims at regional and probabilistic evaluation of bivariate drought characteristics to assess both the past and future drought duration and severity in Bangladesh. The procedures involve applying (1) standardized precipitation index to identify drought duration and severity, (2) regional frequency analysis to determine the appropriate marginal distributions for both duration and severity, (3) copula model to estimate the joint probability distribution of drought duration and severity, and (4) precipitation projections from multiple climate models to assess future drought trends. Since drought duration and severity in Bangladesh are often strongly correlated and do not follow same marginal distributions, the joint and conditional return periods of droughts are characterized using the copula-based joint distribution. The country is divided into three homogeneous regions using Fuzzy clustering and multivariate discordancy and homogeneity measures. For given severity and duration values, the joint return periods for a drought to exceed both values are on average 45% larger, while to exceed either value are 40% less than the return periods from the univariate frequency analysis, which treats drought duration and severity independently. These suggest that compared to the bivariate drought frequency analysis, the standard univariate frequency analysis under/overestimate the frequency and severity of droughts depending on how their duration and severity are related. Overall, more frequent and severe droughts are observed in the west side of the country. Future drought trend based on four climate models and two scenarios showed the possibility of less frequent drought in the future (2020-2100) than in the past (1961-2010).
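    Hedged sketch of the copula-based return periods discussed above. The Gumbel copula, its parameter theta and the mean drought interarrival time mu are placeholders for illustration (the study selects marginals and copulas from the regional data); u and v are the marginal non-exceedance probabilities of a given duration and severity.

      import numpy as np

      def gumbel_copula(u, v, theta):
          return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

      def joint_return_periods(u, v, theta, mu):
          c = gumbel_copula(u, v, theta)
          t_and = mu / (1.0 - u - v + c)   # both duration AND severity exceeded
          t_or = mu / (1.0 - c)            # duration OR severity exceeded
          return t_and, t_or

      # Example: 90th-percentile duration and severity, theta = 1.5, mean interarrival 1.2 years
      print(joint_return_periods(u=0.9, v=0.9, theta=1.5, mu=1.2))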

  2. A bivariate measurement error model for semicontinuous and continuous variables: Application to nutritional epidemiology.

    Science.gov (United States)

    Kipnis, Victor; Freedman, Laurence S; Carroll, Raymond J; Midthune, Douglas

    2016-03-01

    Semicontinuous data in the form of a mixture of a large portion of zero values and continuously distributed positive values frequently arise in many areas of biostatistics. This article is motivated by the analysis of relationships between disease outcomes and intakes of episodically consumed dietary components. An important aspect of studies in nutritional epidemiology is that true diet is unobservable and commonly evaluated by food frequency questionnaires with substantial measurement error. Following the regression calibration approach for measurement error correction, unknown individual intakes in the risk model are replaced by their conditional expectations given mismeasured intakes and other model covariates. Those regression calibration predictors are estimated using short-term unbiased reference measurements in a calibration substudy. Since dietary intakes are often "energy-adjusted," e.g., by using ratios of the intake of interest to total energy intake, the correct estimation of the regression calibration predictor for each energy-adjusted episodically consumed dietary component requires modeling short-term reference measurements of the component (a semicontinuous variable), and energy (a continuous variable) simultaneously in a bivariate model. In this article, we develop such a bivariate model, together with its application to regression calibration. We illustrate the new methodology using data from the NIH-AARP Diet and Health Study (Schatzkin et al., 2001, American Journal of Epidemiology 154, 1119-1125), and also evaluate its performance in a simulation study. © 2015, The International Biometric Society.

  3. The pharmacokinetics, distribution and degradation of human recombinant interleukin 1 beta in normal rats

    DEFF Research Database (Denmark)

    Wogensen, L D; Welinder, B; Hejnaes, K R

    1991-01-01

    the circulation with a T1/2 alpha of 2.9 min and a T1/2 beta of 41.1 min. The central and peripheral volume of distribution was 20.7 and 19.1 ml/rat, respectively, and the metabolic clearance rate was 16.9 ml/min/kg. The kidney and liver showed the highest accumulation of tracer, and autoradiography demonstrated...... of administration was of importance for the biological effects of rIL-1 beta, as demonstrated by a reduced food intake, increased rectal temperature and blood glucose after s.c. injection of rIL-1 beta compared with i.p. The present demonstration of intact rIL-1 beta in the circulation and the islets of Langerhans...

  4. A normal T cell receptor beta CDR3 length distribution in patients with APECED.

    Science.gov (United States)

    Niemi, Heikki J; Laakso, Sini; Salminen, Jukka T; Arstila, T Petteri; Tuulasvaara, Anni

    2015-06-01

    Autoimmune polyendocrinopathy-candidiasis-ectodermal dystrophy (APECED) is caused by mutations in the AIRE gene. Murine studies suggest that AIRE controls thymic expression of tissue-restricted antigens, its absence allowing nonselected autoreactive cells to escape. We tested this in humans using the TCRβ CDR3 length repertoire as a surrogate of thymic selection, as it shortens during the process. Analysis of healthy thymuses showed an altogether 1.9 base pair shortening, starting at the CD4(+)CD8(+)CD3(low) stage and continuing until the CD4(+) subset, likely encompassing both the positive and negative selection. Comparison of five APECED patients with eight healthy controls showed a skewed repertoire with oligoclonal expansions in the patients' CD4(+) and CD8(+) populations. The average CDR3 length, however, was normal and unaffected by the skewing. This was also true of the hypothesized autoreactive CD8(+)CD45RA(+) population. We failed to detect a subset with an abnormally long CDR3 repertoire, as would be predicted by a failure in selection. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Bell-Type Inequalities for Bivariate Maps on Orthomodular Lattices

    Science.gov (United States)

    Pykacz, Jarosław; Valášková, L'ubica; Nánásiová, Ol'ga

    2015-08-01

    Bell-type inequalities on orthomodular lattices, in which conjunctions of propositions are not modeled by meets but by maps for simultaneous measurements (s-maps), are studied. It is shown that the most simple of these inequalities, which involves only two propositions, is always satisfied, contrary to what happens in the case of the traditional version of this inequality in which conjunctions of propositions are modeled by meets. Equivalence of various Bell-type inequalities formulated with the aid of bivariate maps on orthomodular lattices is studied. Our investigations shed new light on the interpretation of various multivariate maps defined on orthomodular lattices already studied in the literature. The paper is concluded by showing the possibility of using s-maps and related bivariate maps to represent counterfactual conjunctions and disjunctions of non-compatible propositions about quantum systems.

  6. Use of log-skew-normal distribution in analysis of continuous data with a discrete component at zero.

    Science.gov (United States)

    Chai, High Seng; Bailey, Kent R

    2008-08-15

    The problem of analyzing a continuous variable with a discrete component is addressed within the framework of the mixture model proposed by Moulton and Halsey (Biometrics 1995; 51:1570-1578). The model can be generalized by the introduction of the log-skew-normal distribution for the continuous component, and the fit can be significantly improved by its use, while retaining the interpretation of regression parameter estimates. Simulation studies and application to a real data set are used for demonstration. 2008 John Wiley & Sons, Ltd

  7. Interaction between a normal shock wave and a turbulent boundary layer at high transonic speeds. I - Pressure distribution

    Science.gov (United States)

    Messiter, A. F.

    1980-01-01

    Asymptotic solutions are derived for the pressure distribution in the interaction of a weak normal shock wave with a turbulent boundary layer. The undisturbed boundary layer is characterized by the law of the wall and the law of the wake for compressible flow. In the limiting case considered, for 'high' transonic speeds, the sonic line is very close to the wall. Comparisons with experiment are shown, with corrections included for the effect of longitudinal wall curvature and for the boundary-layer displacement effect in a circular pipe.

  8. Bivariate Extension of the Quadrature Method of Moments for Modeling Simultaneous Coagulation and Sintering of Particle Populations.

    Science.gov (United States)

    Wright, Douglas L.; McGraw, Robert; Rosner, Daniel E.

    2001-04-15

    We extend the application of moment methods to multivariate suspended particle population problems-those for which size alone is insufficient to specify the state of a particle in the population. Specifically, a bivariate extension of the quadrature method of moments (QMOM) (R. McGraw, Aerosol Sci. Technol. 27, 255 (1997)) is presented for efficiently modeling the dynamics of a population of inorganic nanoparticles undergoing simultaneous coagulation and particle sintering. Continuum regime calculations are presented for the Koch-Friedlander-Tandon-Rosner model, which includes coagulation by Brownian diffusion (evaluated for particle fractal dimensions, D(f), in the range 1.8-3) and simultaneous sintering of the resulting aggregates (P. Tandon and D. E. Rosner, J. Colloid Interface Sci. 213, 273 (1999)). For evaluation purposes, and to demonstrate the computational efficiency of the bivariate QMOM, benchmark calculations are carried out using a high-resolution discrete method to evolve the particle distribution function n(nu, a) for short to intermediate times (where nu and a are particle volume and surface area, respectively). Time evolution of a selected set of 36 low-order mixed moments is obtained by integration of the full bivariate distribution and compared with the corresponding moments obtained directly using two different extensions of the QMOM. With the more extensive treatment, errors of less than 1% are obtained over substantial aerosol evolution, while requiring only a few minutes (rather than days) of CPU time. Longer time QMOM simulations lend support to the earlier finding of a self-preserving limit for the dimensionless joint (nu, a) particle distribution function under simultaneous coagulation and sintering (Tandon and Rosner, 1999; D. E. Rosner and S. Yu, AIChE J., 47 (2001)). We demonstrate that, even in the bivariate case, it is possible to use the QMOM to rapidly model the approach to asymptotic behavior, allowing an immediate assessment of

  9. A framework for analysis of abortive colony size distributions using a model of branching processes in irradiated normal human fibroblasts.

    Science.gov (United States)

    Sakashita, Tetsuya; Hamada, Nobuyuki; Kawaguchi, Isao; Ouchi, Noriyuki B; Hara, Takamitsu; Kobayashi, Yasuhiko; Saito, Kimiaki

    2013-01-01

    Clonogenicity gives important information about the cellular reproductive potential following ionizing irradiation, but an abortive colony that fails to continue to grow remains poorly characterized. It was recently reported that the fraction of abortive colonies increases with increasing dose. Thus, we set out to investigate the production kinetics of abortive colonies using a model of branching processes. We firstly plotted the experimentally determined colony size distribution of abortive colonies in irradiated normal human fibroblasts, and found a linear relationship on the log-linear or log-log plot. By applying the simple model of branching processes to the linear relationship, we found persistent reproductive cell death (RCD) over several generations following irradiation. To verify the estimated probability of RCD, the abortive colony size distribution (≤ 15 cells) and the surviving fraction were simulated by a Monte Carlo computational approach for colony expansion. Parameters estimated from the log-log fit showed better performance in both simulations than those from the log-linear fit. Radiation-induced RCD, i.e. excess probability, lasted over 16 generations and mainly consisted of two components, in the early (<3 generations) and late phases. Intriguingly, the survival curve was sensitive to the excess probability over 5 generations, whereas the abortive colony size distribution was robust against it. These results suggest that, whereas short-term RCD is critical to the abortive colony size distribution, long-lasting RCD is important for the dose response of the surviving fraction. Our present model provides a single framework for understanding the behavior of primary cell colonies in culture following irradiation.
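    Monte Carlo sketch of the branching-process idea, simplified relative to the paper: in each generation every cell either undergoes reproductive cell death with probability p_rcd or divides into two cells, and colonies that never grow beyond 15 cells are counted as abortive. The probability, number of generations and number of colonies below are placeholders.

      import numpy as np

      def simulate_colony(p_rcd, generations, rng):
          n = 1
          for _ in range(generations):
              if n == 0:
                  break
              survivors = rng.binomial(n, 1.0 - p_rcd)   # cells escaping reproductive cell death
              n = 2 * survivors                          # each surviving cell divides
          return n

      rng = np.random.default_rng(4)
      sizes = np.array([simulate_colony(p_rcd=0.3, generations=6, rng=rng) for _ in range(20000)])
      abortive = sizes[sizes <= 15]
      print("abortive fraction:", abortive.size / sizes.size)
      print("abortive size histogram (0..15 cells):", np.bincount(abortive, minlength=16))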

  10. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed

    2017-08-28

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.

  11. Inland dissolved salt chemistry: statistical evaluation of bivariate and ternary diagram models for surface and subsurface waters

    Directory of Open Access Journals (Sweden)

    Stephen T. THRELKELD

    2000-08-01

    Full Text Available We compared the use of ternary and bivariate diagrams to distinguish the effects of atmospheric precipitation, rock weathering, and evaporation on inland surface and subsurface water chemistry. The three processes could not be statistically differentiated using bivariate models, even if large water bodies were evaluated separately from small water bodies. Atmospheric precipitation effects were identified using ternary diagrams in water with low total dissolved salts (TDS), and evaporation effects in water with TDS greater than 1000 mg l-1. A principal components analysis showed that the variability in the relative proportions of the major ions was related to atmospheric precipitation, weathering, and evaporation. About half of the variation in the distribution of inorganic ions was related to rock weathering. By considering most of the important inorganic ions, ternary diagrams are able to distinguish the contributions of atmospheric precipitation, rock weathering, and evaporation to inland water chemistry.

  12. Global Bi-ventricular endocardial distribution of activation rate during long duration ventricular fibrillation in normal and heart failure canines.

    Science.gov (United States)

    Luo, Qingzhi; Jin, Qi; Zhang, Ning; Han, Yanxin; Wang, Yilong; Huang, Shangwei; Lin, Changjian; Ling, Tianyou; Chen, Kang; Pan, Wenqi; Wu, Liqun

    2017-04-13

    The objective of this study was to detect differences in the distribution of the left and right ventricle (LV & RV) activation rate (AR) during short-duration ventricular fibrillation (SDVF, <1 min) and long-duration ventricular fibrillation (LDVF) in normal and heart failure (HF) canine hearts. Ventricular fibrillation (VF) was electrically induced in six healthy dogs (control group) and six dogs with right ventricular pacing-induced congestive HF (HF group). Two 64-electrode basket catheters deployed in the LV and RV were used for global endocardial electrical mapping. The AR of VF was estimated by fast Fourier transform analysis from each electrode. In the control group, the LV was activated faster than the RV in the first 20 s, after which there was no detectable difference in the AR between them. When analyzing the distribution of the AR within the two ventricles at 3 min of LDVF, the posterior LV was activated fastest, while the anterior was slowest. In the HF group, a detectable AR gradient existed between the two ventricles within 3 min of VF, with the LV activating more quickly than the RV. When analyzing the distribution of the AR within the two ventricles at 3 min of LDVF, the septum of the LV was activated fastest, while the anterior was activated slowest. A global bi-ventricular endocardial AR gradient existed within the first 20 s of VF but disappeared during LDVF in healthy hearts. However, the AR gradient was always observed in both SDVF and LDVF in HF hearts. The findings of this study suggest that LDVF in HF hearts can be maintained differently from normal hearts, which accordingly should lead to the development of different management strategies for LDVF resuscitation.
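    Illustrative sketch (not the study's analysis code) of the FFT-based estimate mentioned above: the activation rate of one electrogram channel is taken as the dominant frequency of its Welch power spectrum within a physiological band. The synthetic signal, sampling rate and band limits are assumptions.

      import numpy as np
      from scipy.signal import welch

      def activation_rate(signal, fs, fmin=2.0, fmax=15.0):
          freqs, power = welch(signal, fs=fs, nperseg=min(len(signal), 4 * int(fs)))
          band = (freqs >= fmin) & (freqs <= fmax)
          return freqs[band][np.argmax(power[band])]   # dominant frequency in Hz

      fs = 1000.0                                      # 1 kHz sampling
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(5)
      sig = np.sign(np.sin(2 * np.pi * 8.0 * t)) + 0.5 * rng.standard_normal(t.size)
      print(activation_rate(sig, fs))                  # close to the 8 Hz activation rate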

  13. Prospective microglia and brain macrophage distribution pattern in normal rat brain shows age sensitive dispersal and stabilization with development.

    Science.gov (United States)

    Ghosh, Payel; Mukherjee, Nabanita; Ghosh, Krishnendu; Mallick, Suvadip; Pal, Chiranjib; Laskar, Aparna; Ghosh, Anirban

    2015-09-01

    Monocytic lineage cells in the brain, broadly speaking brain macrophages and/or microglia, show dissimilar distribution patterns, and there is disagreement regarding their origin and onset in the brain. Here, we investigated their onset and distribution/colonization pattern in the normal brain during development. Brains from early and late embryonic stages, neonates and adults were sectioned for routine H/E staining; a modified silver-gold staining was used to discriminate monocytic lineage cells in the brain; and TEM provided ultrastructural details of these cells. An immunofluorescence study with the CD11b marker revealed the distribution of active microglia/macrophage-like cells. At early embryonic day 12, a band of densely stained cells is found at the margin of the developing ventricles, and cells sprouting from there disperse towards the outer edge. With development, however, this band shrank and the dispersion trend decreased. Migration of the deeply stained macrophage-like cell population from the outer cortex to the ventricle was highest in the late embryonic days, continued at a reduced level in neonates, and settled down in adults. In adults, a few blood-borne macrophage-like cells were observed along the vascular margins. The TEM study showed less distinguishable cell features in the early embryonic brain, whereas from the late embryo to the adult the different neuroglial populations and microglia/macrophages showed distinctive features and organization. CD11b expression showed partial, though not complete, agreement with this distribution pattern, depending on the differentiation/activation status of these macrophage lineage cells. This study provides a generalized spatial and temporal pattern of macrophage/microglia distribution in the rat brain, and further indicates some intriguing areas that need to be addressed.

  14. Spatial Distribution of Iron Within the Normal Human Liver Using Dual-Source Dual-Energy CT Imaging.

    Science.gov (United States)

    Abadia, Andres F; Grant, Katharine L; Carey, Kathleen E; Bolch, Wesley E; Morin, Richard L

    2017-11-01

    Explore the potential of dual-source dual-energy (DSDE) computed tomography (CT) to retrospectively analyze the uniformity of iron distribution and establish iron concentration ranges and distribution patterns found in healthy livers. Ten mixtures consisting of an iron nitrate solution and deionized water were prepared in test tubes and scanned using a DSDE 128-slice CT system. Iron images were derived from a 3-material decomposition algorithm (optimized for the quantification of iron). A conversion factor (mg Fe/mL per Hounsfield unit) was calculated from this phantom study as the quotient of known tube concentrations and their corresponding CT values. Retrospective analysis was performed of patients who had undergone DSDE imaging for renal stones. Thirty-seven patients with normal liver function were randomly selected (mean age, 52.5 years). The examinations were processed for iron concentration. Multiple regions of interest were analyzed, and iron concentration (mg Fe/mL) and distribution was reported. The mean conversion factor obtained from the phantom study was 0.15 mg Fe/mL per Hounsfield unit. Whole-liver mean iron concentrations yielded a range of 0.0 to 2.91 mg Fe/mL, with 94.6% (35/37) of the patients exhibiting mean concentrations below 1.0 mg Fe/mL. The most important finding was that iron concentration was not uniform and patients exhibited regionally high concentrations (36/37). These regions of higher concentration were observed to be dominant in the middle-to-upper part of the liver (75%), medially (72.2%), and anteriorly (83.3%). Dual-source dual-energy CT can be used to assess the uniformity of iron distribution in healthy subjects. Applying similar techniques to unhealthy livers, future research may focus on the impact of hepatic iron content and distribution for noninvasive assessment in diseased subjects.
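    Simple worked example of the conversion described above: an iron-image region-of-interest value in Hounsfield units is multiplied by the reported factor of 0.15 mg Fe/mL per HU to give an iron concentration; the ROI value used below is made up.

      def iron_concentration(hu_iron_image, factor=0.15):
          """Convert an iron-image CT number (HU) to an iron concentration in mg Fe/mL."""
          return factor * hu_iron_image

      print(iron_concentration(6.5))   # about 0.98 mg Fe/mL, below the ~1.0 mg Fe/mL seen in most healthy livers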

  15. Ultrasound-mediated delivery and distribution of polymeric nanoparticles in the normal brain parenchyma of a metastatic brain tumour model.

    Directory of Open Access Journals (Sweden)

    Habib Baghirov

    Full Text Available The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma.

  16. Ultrasound-mediated delivery and distribution of polymeric nanoparticles in the normal brain parenchyma of a metastatic brain tumour model

    Science.gov (United States)

    Baghirov, Habib; Snipstad, Sofie; Sulheim, Einar; Berg, Sigrid; Hansen, Rune; Thorsen, Frits; Mørch, Yrr; Åslund, Andreas K. O.

    2018-01-01

    The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma. PMID:29338016

  17. A non-Gaussian multivariate distribution with all lower-dimensional Gaussians and related families

    KAUST Repository

    Dutta, Subhajit

    2014-07-28

    Several fascinating examples of non-Gaussian bivariate distributions whose marginal distribution functions are Gaussian have been proposed in the literature. These examples often clarify several properties associated with the normal distribution. In this paper, we generalize this result in the sense that we construct a p-dimensional distribution for which any proper subset of its components has the Gaussian distribution. However, the joint p-dimensional distribution is inconsistent with the distribution of these subsets because it is not Gaussian. We study the probabilistic properties of this non-Gaussian multivariate distribution in detail. Interestingly, several popular tests of multivariate normality fail to identify this p-dimensional distribution as non-Gaussian. We further extend our construction to a class of elliptically contoured distributions as well as skewed distributions arising from selections, for instance the multivariate skew-normal distribution.
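    A classical counterexample of the kind alluded to above (an illustration, not this paper's construction): take X standard normal and Y = sign(X)*|Z| with Z standard normal and independent of X. Both marginals are exactly N(0,1), yet the joint law puts all its mass in the quadrants where x and y share a sign, so it cannot be bivariate Gaussian.

      import numpy as np
      from scipy.stats import kstest

      rng = np.random.default_rng(6)
      x = rng.standard_normal(100000)
      y = np.sign(x) * np.abs(rng.standard_normal(100000))
      print(kstest(y, "norm").pvalue)   # the marginal of Y is consistent with N(0,1)
      print(np.mean(x * y > 0))         # yet X and Y always share a sign (value is 1.0)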

  18. Determination and correlation of spatial distribution of trace elements in normal and neoplastic breast tissues evaluated by {mu}-XRF

    Energy Technology Data Exchange (ETDEWEB)

    Silva, M.P.; Oliveira, M.A.; Poletti, M.E. [Universidade de Sao Paulo (USP),Ribeirao Preto, SP (Brazil)

    2012-07-01

    Full text: Some trace elements, naturally present in breast tissues, participate in a large number of biological processes, which include, among others, activation or inhibition of enzymatic reactions and changes in cell membrane permeability, suggesting that these elements may influence carcinogenic processes. Thus, knowledge of the amounts of these elements and their spatial distribution in normal and neoplastic tissues may help in understanding the role of these elements in the carcinogenic process and tumor progression of breast cancers. Concentrations of trace elements like Ca, Fe, Cu and Zn, previously studied at LNLS using TXRF and conventional XRF, were elevated in neoplastic breast tissues compared to normal tissues. In this study we determined the spatial distribution of these elements in normal and neoplastic breast tissues using the {mu}-XRF technique. We analyzed 22 samples of normal and neoplastic breast tissues (malignant and benign) obtained from paraffin blocks available for study at the Department of Pathology HC-FMRP/USP. From each block, a small amount of material was removed and histological sections 60 {mu}m thick were cut with a microtome. The slices were placed in sample holders and covered with ultralen film. Tissue samples were irradiated with a white beam of synchrotron radiation. The samples were positioned at 45 degrees with respect to the incident beam on a table with 3 degrees of freedom (x, y and z), allowing independent positioning of the sample in these directions. The white beam was collimated by a 20 {mu}m microcapillary and the samples were fully scanned. At each step, a spectrum was acquired for 10 s. The fluorescence emitted by elements present in the sample was detected by a Si (Li) detector with an energy resolution of 165 eV at 5.9 keV, placed at 90 deg with respect to the incident beam. Results reveal that the trace elements Ca-Zn and Fe-Cu could be correlated in malignant breast tissues. Quantitative results, achieved by

  19. A bivariate optimal replacement policy for a multistate repairable system

    International Nuclear Information System (INIS)

    Zhang Yuanlin; Yam, Richard C.M.; Zuo, Ming J.

    2007-01-01

    In this paper, a deteriorating simple repairable system with k+1 states, including k failure states and one working state, is studied. It is assumed that the system after repair is not 'as good as new' and the deterioration of the system is stochastic. We consider a bivariate replacement policy, denoted by (T,N), in which the system is replaced when its working age has reached T or the number of failures it has experienced has reached N, whichever occurs first. The objective is to determine the optimal replacement policy (T,N)* such that the long-run expected profit per unit time is maximized. The explicit expression of the long-run expected profit per unit time is derived and the corresponding optimal replacement policy can be determined analytically or numerically. We prove that the optimal policy (T,N)* is better than the optimal policy N* for a multistate simple repairable system. We also show that a general monotone process model for a multistate simple repairable system is equivalent to a geometric process model for a two-state simple repairable system in the sense that they have the same structure for the long-run expected profit (or cost) per unit time and the same optimal policy. Finally, a numerical example is given to illustrate the theoretical results
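    Very rough simulation sketch of a (T, N) policy of the kind analyzed above. The reward rate, repair and replacement costs and times, and the geometric shrinking of operating periods are assumptions for illustration, not the paper's model; the long-run profit per unit time is estimated by the renewal-reward ratio E[cycle profit]/E[cycle length].

      import numpy as np

      def cycle(T, N, rng, a=0.9, reward=10.0, repair_cost=3.0, replace_cost=25.0,
                repair_time=0.5, replace_time=1.0):
          age, length, profit, mean_up = 0.0, 0.0, 0.0, 5.0
          for failure in range(1, N + 1):
              up = rng.exponential(mean_up)
              if age + up >= T:                        # working age reaches T first: replace
                  profit += reward * (T - age)
                  return profit - replace_cost, length + (T - age) + replace_time
              age += up
              length += up
              profit += reward * up
              if failure == N:                         # N-th failure: replace
                  return profit - replace_cost, length + replace_time
              profit -= repair_cost                    # repair and continue
              length += repair_time
              mean_up *= a                             # geometric deterioration after each repair

      rng = np.random.default_rng(7)
      results = np.array([cycle(T=20.0, N=4, rng=rng) for _ in range(20000)])
      print("long-run expected profit per unit time:", results[:, 0].mean() / results[:, 1].mean())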

  20. Epileptic seizure prediction based on a bivariate spectral power methodology.

    Science.gov (United States)

    Bandarabadi, Mojtaba; Teixeira, Cesar A; Direito, Bruno; Dourado, Antonio

    2012-01-01

    The spectral power of 5 frequently considered frequency bands (Alpha, Beta, Gamma, Theta and Delta) for 6 EEG channels is computed, and then all the possible pairwise combinations among the 30-feature set are used to create a 435-dimensional feature space. Two new feature selection methods are introduced to choose the best candidate features among those and to reduce the dimensionality of this feature space. The selected features are then fed to Support Vector Machines (SVMs) that classify the cerebral state into preictal and non-preictal classes. The outputs of the SVM are regularized using a method that accounts for the classification dynamics of the preictal class, also known as the "Firing Power" method. The results obtained using our feature selection approaches are compared with the ones obtained using the minimum Redundancy Maximum Relevance (mRMR) feature selection method. The results in a group of 12 patients of the EPILEPSIAE database, containing 46 seizures and 787 hours of multichannel recording for out-of-sample data, indicate the efficiency of the bivariate approach as well as the two new feature selection methods. The best results presented a sensitivity of 76.09% (35 of 46 seizures predicted) and a false prediction rate of 0.15 h(-1).
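    Hedged sketch of the feature construction described above: Welch band powers in the five classical bands for six channels give 30 features, and all pairwise combinations of those 30 features give a 435-dimensional vector per window (simple ratios are used here as one plausible combination; the paper does not necessarily use ratios). The sampling rate and window length are assumptions.

      import itertools
      import numpy as np
      from scipy.signal import welch

      BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
               "beta": (13, 30), "gamma": (30, 45)}

      def band_powers(eeg, fs):
          """eeg: array of shape (n_channels, n_samples) -> 30 band-power features."""
          freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs), axis=-1)
          feats = []
          for lo, hi in BANDS.values():
              sel = (freqs >= lo) & (freqs < hi)
              feats.append(psd[:, sel].sum(axis=-1))
          return np.concatenate(feats)                 # 5 bands x 6 channels = 30 features

      def pairwise_features(f30, eps=1e-12):
          return np.array([f30[i] / (f30[j] + eps)
                           for i, j in itertools.combinations(range(f30.size), 2)])  # C(30,2) = 435

      rng = np.random.default_rng(8)
      eeg = rng.standard_normal((6, 10 * 256))         # 6 channels, 10 s at 256 Hz
      print(pairwise_features(band_powers(eeg, fs=256)).shape)   # (435,)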

  1. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  2. A framework for analysis of abortive colony size distributions using a model of branching processes in irradiated normal human fibroblasts.

    Directory of Open Access Journals (Sweden)

    Tetsuya Sakashita

    Full Text Available Clonogenicity gives important information about the cellular reproductive potential following ionizing irradiation, but an abortive colony that fails to continue to grow remains poorly characterized. It was recently reported that the fraction of abortive colonies increases with increasing dose. Thus, we set out to investigate the production kinetics of abortive colonies using a model of branching processes. We first plotted the experimentally determined colony size distribution of abortive colonies in irradiated normal human fibroblasts, and found a linear relationship on the log-linear or log-log plot. By applying the simple model of branching processes to the linear relationship, we found persistent reproductive cell death (RCD) over several generations following irradiation. To verify the estimated probability of RCD, the abortive colony size distribution (≤ 15 cells) and the surviving fraction were simulated by a Monte Carlo computational approach for colony expansion. Parameters estimated from the log-log fit performed better in both simulations than those from the log-linear fit. Radiation-induced RCD, i.e. excess probability, lasted over 16 generations and mainly consisted of two components, in the early (< 3 generations) and late phases. Intriguingly, the survival curve was sensitive to the excess probability over 5 generations, whereas the abortive colony size distribution was robust against it. These results suggest that, whereas short-term RCD is critical to the abortive colony size distribution, long-lasting RCD is important for the dose response of the surviving fraction. Our present model provides a single framework for understanding the behavior of primary cell colonies in culture following irradiation.
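
    A minimal sketch of the kind of Monte Carlo branching-process simulation described above; the generation-dependent RCD probability, the abortive-colony threshold and the number of generations are illustrative assumptions, not the values estimated in the study.

```python
# Sketch: Monte Carlo branching-process model of colony growth after irradiation.
# Assumptions: each cell in each generation independently undergoes reproductive
# cell death (RCD) with probability p_g, otherwise divides into two; a colony is
# called "abortive" here if it ends with <= 15 cells. All numbers are illustrative.
from collections import Counter

import numpy as np

rng = np.random.default_rng(1)

def grow_colony(rcd_prob, n_generations=16):
    """Return the final number of cells, starting from a single irradiated cell."""
    n_cells = 1
    for g in range(n_generations):
        if n_cells == 0:
            break
        deaths = rng.binomial(n_cells, rcd_prob(g))
        n_cells = 2 * (n_cells - deaths)       # survivors divide, dead cells are lost
    return n_cells

# Illustrative RCD probability: a large excess component in the early generations
# decaying to a small long-lasting component.
rcd = lambda g: 0.25 * np.exp(-g / 2.0) + 0.02

sizes = np.array([grow_colony(rcd) for _ in range(20000)])
abortive = Counter(int(s) for s in sizes if s <= 15)
print("abortive colony size distribution (size: count):", dict(sorted(abortive.items())))
print("fraction of colonies growing beyond 15 cells:", np.mean(sizes > 15))
```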

  3. The Normal Distribution

    Indian Academy of Sciences (India)

    of uncertainties in climatic conditions, political situation, prices of auxiliary commodities like fertilisers, etc. over the year; each of these .... ceaseless chaotic movement encountered in colloidal solu- tions; (such a movement has been observed whenever mi- croscopic particles are suspended in liquids/gases). For ex- ample ...

  4. Exciting Normal Distribution

    Science.gov (United States)

    Fuchs, Karl Josef; Simonovits, Reinhard; Thaller, Bernd

    2008-01-01

    This paper describes a high school project where the mathematics teaching and learning software M@th Desktop (MD) based on the Computer Algebra System Mathematica was used for symbolical and numerical calculations and for visualisation. The mathematics teaching and learning software M@th Desktop 2.0 (MD) contains the modules Basics including tools…

  5. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1978-01-01

    This paper addresses the problem of obtaining numerically maximum-likelihood estimates of the parameters for a mixture of normal distributions. In recent literature, a certain successive-approximations procedure, based on the likelihood equations, was shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, we introduce a general iterative procedure, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. We show that, with probability 1 as the sample size grows large, this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. We also show that the step-size which yields optimal local convergence rates for large samples is determined in a sense by the 'separation' of the component normal densities and is bounded below by a number between 1 and 2.
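
    A simplified illustration of the procedure described above, restricted to a univariate two-component mixture: one sweep of the likelihood-equation (EM-type) updates is taken as the search direction and a step-size omega in (0, 2) is applied, with omega = 1 recovering the plain successive-approximations procedure. The data, starting values and the univariate restriction are assumptions made only to keep the example short.

```python
# Sketch: successive-approximation (EM-type) updates for a two-component
# univariate normal mixture with a relaxation step-size omega; omega = 1 is the
# plain successive-approximation procedure. Values near 2 may overshoot.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1.5, 700)])

def em_sweep(pi, mu, sigma):
    """One pass of the likelihood-equation (EM) updates."""
    w1 = pi * norm.pdf(x, mu[0], sigma[0])
    w2 = (1 - pi) * norm.pdf(x, mu[1], sigma[1])
    r = w1 / (w1 + w2)                              # posterior prob. of component 1
    pi_new = r.mean()
    mu_new = np.array([np.sum(r * x) / np.sum(r),
                       np.sum((1 - r) * x) / np.sum(1 - r)])
    var_new = np.array([np.sum(r * (x - mu_new[0]) ** 2) / np.sum(r),
                        np.sum((1 - r) * (x - mu_new[1]) ** 2) / np.sum(1 - r)])
    return pi_new, mu_new, np.sqrt(var_new)

def fit(omega=1.0, n_iter=200):
    pi, mu, sigma = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
    for _ in range(n_iter):
        pi_s, mu_s, sigma_s = em_sweep(pi, mu, sigma)
        # step-size omega in (0, 2): move a fraction omega along the update direction
        pi = pi + omega * (pi_s - pi)
        mu = mu + omega * (mu_s - mu)
        sigma = sigma + omega * (sigma_s - sigma)
    return pi, mu, sigma

print("omega = 1.0:", fit(1.0))
print("omega = 1.5:", fit(1.5))
```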

  6. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, 2

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1976-01-01

    The problem of obtaining numerically maximum likelihood estimates of the parameters for a mixture of normal distributions is addressed. In recent literature, a certain successive approximations procedure, based on the likelihood equations, is shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, a general iterative procedure is introduced, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. With probability 1 as the sample size grows large, it is shown that this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. The step-size which yields optimal local convergence rates for large samples is determined in a sense by the separation of the component normal densities and is bounded below by a number between 1 and 2.

  7. Statistical Power Analysis with Microsoft Excel: Normal Tests for One or Two Means as a Prelude to Using Non-Central Distributions to Calculate Power

    Science.gov (United States)

    Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa

    2009-01-01

    This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
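
    A minimal sketch of the textbook power calculation the article builds on, for a one-sided one-sample z test with known sigma; the numbers below are illustrative only.

```python
# Sketch: power of a one-sided one-sample z test (normal distribution, known sigma).
from scipy.stats import norm

def z_test_power(mu0, mu1, sigma, n, alpha=0.05):
    """P(reject H0: mu = mu0 | true mean = mu1) for the one-sided z test."""
    z_crit = norm.ppf(1 - alpha)                  # rejection boundary in z units
    shift = (mu1 - mu0) / (sigma / n ** 0.5)      # standardized shift under H1
    return 1 - norm.cdf(z_crit - shift)

print(z_test_power(mu0=100, mu1=105, sigma=15, n=36))   # ~0.64
```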

  8. A branching process model for the analysis of abortive colony size distributions in carbon ion-irradiated normal human fibroblasts.

    Science.gov (United States)

    Sakashita, Tetsuya; Hamada, Nobuyuki; Kawaguchi, Isao; Hara, Takamitsu; Kobayashi, Yasuhiko; Saito, Kimiaki

    2014-05-01

    A single cell can form a colony, and ionizing irradiation has long been known to reduce such a cellular clonogenic potential. Analysis of abortive colonies unable to continue to grow should provide important information on the reproductive cell death (RCD) following irradiation. Our previous analysis with a branching process model showed that the RCD in normal human fibroblasts can persist over 16 generations following irradiation with low linear energy transfer (LET) γ-rays. Here we further set out to evaluate the RCD persistency in abortive colonies arising from normal human fibroblasts exposed to high-LET carbon ions (18.3 MeV/u, 108 keV/µm). We found that the abortive colony size distribution determined by biological experiments follows a linear relationship on the log-log plot, and that the Monte Carlo simulation using the RCD probability estimated from such a linear relationship well simulates the experimentally determined surviving fraction and the relative biological effectiveness (RBE). We identified the short-term phase and long-term phase for the persistent RCD following carbon-ion irradiation, which were similar to those previously identified following γ-irradiation. Taken together, our results suggest that subsequent secondary or tertiary colony formation would be invaluable for understanding the long-lasting RCD. All together, our framework for analysis with a branching process model and a colony formation assay is applicable to determination of cellular responses to low- and high-LET radiation, and suggests that the long-lasting RCD is a pivotal determinant of the surviving fraction and the RBE.

  9. A branching process model for the analysis of abortive colony size distributions in carbon ion-irradiated normal human fibroblasts

    International Nuclear Information System (INIS)

    Sakashita, Tetsuya; Kobayashi, Yasuhiko; Hamada, Nobuyuki; Kawaguchi, Isao; Hara, Takamitsu; Saito, Kimiaki

    2014-01-01

    A single cell can form a colony, and ionizing irradiation has long been known to reduce such a cellular clonogenic potential. Analysis of abortive colonies unable to continue to grow should provide important information on the reproductive cell death (RCD) following irradiation. Our previous analysis with a branching process model showed that the RCD in normal human fibroblasts can persist over 16 generations following irradiation with low linear energy transfer (LET) γ-rays. Here we further set out to evaluate the RCD persistency in abortive colonies arising from normal human fibroblasts exposed to high-LET carbon ions (18.3 MeV/u, 108 keV/μm). We found that the abortive colony size distribution determined by biological experiments follows a linear relationship on the log–log plot, and that the Monte Carlo simulation using the RCD probability estimated from such a linear relationship well simulates the experimentally determined surviving fraction and the relative biological effectiveness (RBE). We identified the short-term phase and long-term phase for the persistent RCD following carbon-ion irradiation, which were similar to those previously identified following γ-irradiation. Taken together, our results suggest that subsequent secondary or tertiary colony formation would be invaluable for understanding the long-lasting RCD. All together, our framework for analysis with a branching process model and a colony formation assay is applicable to determination of cellular responses to low- and high-LET radiation, and suggests that the long-lasting RCD is a pivotal determinant of the surviving fraction and the RBE. (author)

  10. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account for animal models. The study data is extracted from the production trait evaluation of Nordic Red dairy cattle. Genotyped bulls with daughters are used as training animals, and genotyped bulls and producing cows as candidate animals. For simplicity, the size of the data is chosen so that the full inverses of the mixed model equation coefficient matrices can be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally, the bivariate blending method was...

  11. Fluid Distribution Pattern in Adult-Onset Congenital, Idiopathic, and Secondary Normal-Pressure Hydrocephalus: Implications for Clinical Care.

    Science.gov (United States)

    Yamada, Shigeki; Ishikawa, Masatsune; Yamamoto, Kazuo

    2017-01-01

    In spite of growing evidence on idiopathic normal-pressure hydrocephalus (NPH), viewpoints about clinical care for idiopathic NPH remain controversial. This continuing divergence of viewpoints might be due to confusing classifications of idiopathic and adult-onset congenital NPH. To elucidate the classification of NPH, we propose that adult-onset congenital NPH should be explicitly distinguished from idiopathic and secondary NPH. On the basis of conventional CT scan or MRI, idiopathic NPH was defined as narrow sulci at the high convexity concurrent with enlargement of the ventricles, basal cistern and Sylvian fissure, whereas adult-onset congenital NPH was defined as huge ventricles without high-convexity tightness. We compared clinical characteristics and cerebrospinal fluid distribution among 85 patients diagnosed with idiopathic NPH, 17 patients with secondary NPH, and 7 patients with adult-onset congenital NPH. All patients underwent 3-T MRI examinations and tap tests. The volumes of the ventricles and subarachnoid spaces were measured using a 3D workstation based on T2-weighted 3D sequences. The mean intracranial volume for the patients with adult-onset congenital NPH was almost 100 mL larger than the volumes for patients with idiopathic and secondary NPH. Compared with the patients with idiopathic or secondary NPH, patients with adult-onset congenital NPH exhibited larger ventricles but normal-sized subarachnoid spaces. The mean volume ratio of the high-convexity subarachnoid space was significantly less in idiopathic NPH than in adult-onset congenital NPH, whereas the mean volume ratio of the basal cistern and Sylvian fissure in idiopathic NPH was >2 times larger than that in adult-onset congenital NPH. The symptoms of gait disturbance, cognitive impairment, and urinary incontinence in patients with adult-onset congenital NPH tended to progress more slowly compared to their progress in patients with idiopathic NPH. Cerebrospinal fluid distributions and

  12. Comparison of metabolic profile and abdominal fat distribution between karyotypically normal women with premature ovarian insufficiency and age matched controls.

    Science.gov (United States)

    Ates, Seda; Yesil, Gozde; Sevket, Osman; Molla, Taner; Yildiz, Seyma

    2014-11-01

    We designed a prospective case-control study in order to investigate the lipid profiles, insulin sensitivity, presence of metabolic syndrome (MetS) and the abdominal fat distribution in karyotypically normal women with premature ovarian insufficiency (POI). Anthropometric measurements, FSH, estradiol, total testosterone (T), sex hormone binding globulin (SHBG), free androgen index (FAI), fasting glucose and insulin, homeostatic model for insulin resistance (HOMA-IR), lipid profile, the prevalence of MetS and ultrasonographic abdominal fat measurements were assessed in 56 women with POI and 59 healthy controls in the same age range. Serum levels of T, SHBG and FAI were not significantly different between the two groups. Total cholesterol (TC) and high-density lipoprotein cholesterol (HDL-C) were higher in women with POI. There were no differences in glucose, insulin, HOMA-IR, low-density lipoprotein cholesterol (LDL-C), or triglyceride levels between the two groups. A significant positive correlation was identified between T and TG and also between FAI and LDL-C; SHBG levels were correlated inversely with FSH, and positively with HDL-C in women with POI. The presence of MetS was significantly higher in women with POI. The subcutaneous, preperitoneal and visceral fat thicknesses were not significantly different between the groups. Early cessation of ovulatory function may be associated with higher levels of serum TC and HDL-C, but does not seem to cause differences in abdominal fat distribution in women with POI. POI is associated with higher risk of MetS. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Cellulose structure and lignin distribution in normal and compression wood of the Maidenhair tree (Ginkgo biloba L.).

    Science.gov (United States)

    Andersson, Seppo; Wang, Yurong; Pönni, Raili; Hänninen, Tuomas; Mononen, Marko; Ren, Haiqing; Serimaa, Ritva; Saranpää, Pekka

    2015-04-01

    We studied in detail the mean microfibril angle and the width of cellulose crystals from the pith to the bark of a 15-year-old Maidenhair tree (Ginkgo biloba L.). The orientation of cellulose microfibrils with respect to the cell axis and the width and length of cellulose crystallites were determined using X-ray diffraction. Raman microscopy was used to compare the lignin distribution in the cell wall of normal/opposite and compression wood, which was found near the pith. Ginkgo biloba showed a relatively large mean microfibril angle, varying between 19° and 39° in the S2 layer, and the average width of cellulose crystallites was 3.1-3.2 nm. Mild compression wood without any intercellular spaces or helical cavities was observed near the pith. Slit-like bordered pit openings and a heavily lignified S2L layer confirmed the presence of compression wood. Ginkgo biloba showed typical features present in the juvenile wood of conifers. The microfibril angle remained large over the 14 annual rings. The entire stem disc, with a diameter of 18 cm, was considered to consist of juvenile wood. The properties of juvenile and compression wood as well as the cellulose orientation and crystalline width indicate that the wood formation of G. biloba is similar to that of modern conifers. © 2015 Institute of Botany, Chinese Academy of Sciences.

  14. Circularly-symmetric complex normal ratio distribution for scalar transmissibility functions. Part III: Application to statistical modal analysis

    Science.gov (United States)

    Yan, Wang-Ji; Ren, Wei-Xin

    2018-01-01

    This study applies the theoretical findings on the circularly-symmetric complex normal ratio distribution of Yan and Ren (2016) [1,2] to transmissibility-based modal analysis from a statistical viewpoint. A probabilistic model of the transmissibility function in the vicinity of the resonant frequency is formulated in the modal domain, and some insightful comments are offered. It theoretically reveals that the statistics of the transmissibility function around the resonant frequency are solely dependent on the 'noise-to-signal' ratio and the mode shapes. As a sequel to the development of the probabilistic model of the transmissibility function in the modal domain, this study poses the process of modal identification in the context of a Bayesian framework by borrowing a novel paradigm. Implementation issues unique to the proposed approach are resolved by a Lagrange multiplier approach. Also, this study explores the possibility of applying Bayesian analysis in distinguishing harmonic components from structural ones. The approaches are verified through simulated data and experimental test data. The uncertainty behavior due to variation of different factors is also discussed in detail.

  15. Comparative pharmacokinetic and tissue distribution profiles of four major bioactive components in normal and hepatic fibrosis rats after oral administration of Fuzheng Huayu recipe.

    Science.gov (United States)

    Yang, Tao; Liu, Shan; Wang, Chang-Hong; Tao, Yan-Yan; Zhou, Hua; Liu, Cheng-Hai

    2015-10-10

    Fuzheng Huayu recipe (FZHY) is a herbal product for the treatment of liver fibrosis approved by the Chinese State Food and Drug Administration (SFDA), but its pharmacokinetics and tissue distribution had not been investigated. In this study, the liver fibrotic model was induced with intraperitoneal injection of dimethylnitrosamine (DMN), and FZHY was given orally to the model and normal rats. The plasma pharmacokinetics and tissue distribution profiles of four major bioactive components from FZHY were analyzed in the normal and fibrotic rat groups using an ultrahigh performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method. Results revealed that the bioavailabilities of danshensu (DSS), salvianolic acid B (SAB) and rosmarinic acid (ROS) in liver fibrotic rats increased 1.49-, 3.31- and 2.37-fold, respectively, compared to normal rats. There was no obvious difference in the pharmacokinetics of amygdalin (AMY) between the normal and fibrotic rats. The tissue distribution of DSS, SAB, and AMY tended to be mostly in the kidney and lung. The distribution of DSS, SAB, and AMY in liver tissue of the model rats was significantly decreased compared to the normal rats. Significant differences in the pharmacokinetics and tissue distribution profiles of DSS, ROS, SAB and AMY were observed in rats with hepatic fibrosis after oral administration of FZHY. These results provide a meaningful basis for developing a clinical dosage regimen in the treatment of hepatic fibrosis by FZHY. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Generalization of the normal-exponential model: exploration of a more accurate parametrisation for the signal distribution on Illumina BeadArrays.

    Science.gov (United States)

    Plancade, Sandra; Rozenholc, Yves; Lund, Eiliv

    2012-12-11

    Illumina BeadArray technology includes non-specific negative control features that allow a precise estimation of the background noise. As an alternative to the background subtraction proposed in BeadStudio, which leads to an important loss of information by generating negative values, a background correction method modeling the observed intensities as the sum of an exponentially distributed signal and normally distributed noise has been developed. Nevertheless, Wang and Ye (2012) display a kernel-based estimator of the signal distribution on Illumina BeadArrays and suggest that a gamma distribution would represent a better modeling of the signal density. Hence, the normal-exponential modeling may not be appropriate for Illumina data and background corrections derived from this model may lead to incorrect estimates. We propose a more flexible modeling based on a gamma distributed signal and a normally distributed background noise and develop the associated background correction, implemented in the R-package NormalGamma. Our model proves to be markedly more accurate for modeling Illumina BeadArrays: on the one hand, it is shown on two types of Illumina BeadChips that this model offers a more correct fit of the observed intensities. On the other hand, the comparison of the operating characteristics of several background correction procedures on spike-in and on normal-gamma simulated data shows high similarities, reinforcing the validation of the normal-gamma modeling. The performance of the background corrections based on the normal-gamma and normal-exponential models is compared on two dilution data sets, through testing procedures which represent various experimental designs. Surprisingly, we observe that the implementation of a more accurate parametrisation in the model-based background correction does not increase the sensitivity. These results may be explained by the operating characteristics of the estimators: the normal-gamma background correction offers an improvement

  17. Effects of glucose, insulin, and insulin resistance on cerebral 18F-FDG distribution in cognitively normal older subjects.

    Science.gov (United States)

    Ishibashi, Kenji; Onishi, Airin; Fujiwara, Yoshinori; Ishiwata, Kiichi; Ishii, Kenji

    2017-01-01

    Increasing plasma glucose levels and insulin resistance can alter the distribution pattern of fluorine-18-labeled fluorodeoxyglucose (18F-FDG) in the brain and relatively reduce 18F-FDG uptake in Alzheimer's disease (AD)-related hypometabolic regions, leading to the appearance of an AD-like pattern. However, its relationship with plasma insulin levels is unclear. We aimed to compare the effects of plasma glucose levels, plasma insulin levels and insulin resistance on the appearance of the AD-like pattern in 18F-FDG images. Fifty-nine cognitively normal older subjects (age = 75.7 ± 6.4 years) underwent 18F-FDG positron emission tomography along with measurement of plasma glucose and insulin levels. As an index of insulin resistance, the Homeostasis Model Assessment of Insulin Resistance (HOMA-IR) was calculated. Plasma glucose levels, plasma insulin levels, and HOMA-IR were 102.2 ± 8.1 mg/dL, 4.1 ± 1.9 μU/mL, and 1.0 ± 0.5, respectively. Whole-brain voxelwise analysis showed a negative correlation of 18F-FDG uptake with plasma glucose levels in the precuneus and lateral parietotemporal regions (cluster-corrected), but no correlation with plasma insulin levels or HOMA-IR. In the significant cluster, 18F-FDG uptake decreased by approximately 4-5% when plasma glucose levels increased by 20 mg/dL. In the precuneus region, volume-of-interest analysis confirmed a negative correlation of 18F-FDG uptake with plasma glucose levels (r = -0.376, p = 0.002), and no correlation with plasma insulin levels (r = 0.156, p = 0.12) or HOMA-IR (r = 0.096, p = 0.24). This study suggests that, of the three parameters, plasma glucose levels have the greatest effect on the appearance of the AD-like pattern in 18F-FDG images.

  18. Dissecting the correlation structure of a bivariate phenotype ...

    Indian Academy of Sciences (India)

    Unknown

    High correlations between two quantitative traits may be either due to common genetic factors or common environmental factors or a combination ... different trait parameters and quantitative trait distributions. An application of the method .... mean vectors have components α1, β1 or –α1 and α2, β2 or –α2, for the two traits ...

  19. Dissecting the correlation structure of a bivariate phenotype ...

    Indian Academy of Sciences (India)

    Unknown

    We use Monte-Carlo simulations to evaluate the performance of the proposed test under different trait parameters and quantitative trait distributions. An application of the method is illustrated using data on two alcohol-related phenotypes from a project on the collaborative study on the genetics of alcoholism. [Ghosh S 2005 ...

  20. Bivariate frequency analysis of rainfall intensity and duration for urban stormwater infrastructure design

    Science.gov (United States)

    Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo

    2017-10-01

    This study presents a storm-event based bivariate frequency analysis approach to determine design rainfalls in which the number, intensity and duration of actual rainstorm events were considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach could give a more realistic description of the rainfall characteristics of rainstorm events and of design rainfalls. As a result, the design rainfall quantities from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This stems from the occurrence probabilities of individual rainstorm events and a different angle on rainfall frequency analysis, and could offer an alternative way of describing extreme rainfall properties and potentially help improve the hydrologic design of stormwater management facilities in urban areas.
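
    The sketch below illustrates the event-based idea described above: the joint exceedance probability of intensity and duration is obtained from assumed marginals coupled by a Gumbel-Hougaard copula and converted to a return period with an assumed mean annual number of events. The marginal families, the copula family and all parameter values are assumptions, not those fitted in the study.

```python
# Sketch: occurrence probability of a rainstorm event from the joint
# distribution of intensity and duration via a Gumbel-Hougaard copula.
import numpy as np
from scipy.stats import gamma, lognorm

theta = 2.0                        # Gumbel copula dependence parameter (>= 1), assumed
mu_events_per_year = 60.0          # assumed mean number of storm events per year

def gumbel_copula(u, v, theta=theta):
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period(intensity, duration):
    """Return period (years) of events with I > intensity AND D > duration."""
    u = gamma(a=2.0, scale=5.0).cdf(intensity)       # F_I, assumed marginal (mm/h)
    v = lognorm(s=0.8, scale=3.0).cdf(duration)      # F_D, assumed marginal (h)
    p_exceed_both = 1.0 - u - v + gumbel_copula(u, v)  # inclusion-exclusion
    return 1.0 / (mu_events_per_year * p_exceed_both)

print("T(I > 30 mm/h, D > 6 h) =", joint_return_period(30.0, 6.0), "years")
```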

  1. Bivariate spatial analysis of temperature and precipitation from general circulation models and observation proxies

    KAUST Repository

    Philbin, R.

    2015-05-22

    This study validates the near-surface temperature and precipitation output from decadal runs of eight atmospheric ocean general circulation models (AOGCMs) against observational proxy data from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis temperatures and Global Precipitation Climatology Project (GPCP) precipitation data. We model the joint distribution of these two fields with a parsimonious bivariate Matérn spatial covariance model, accounting for the two fields' spatial cross-correlation as well as their own smoothnesses. We fit output from each AOGCM (30-year seasonal averages from 1981 to 2010) to a statistical model on each of 21 land regions. Both variance and smoothness values agree for both fields over all latitude bands except southern mid-latitudes. Our results imply that temperature fields have smaller smoothness coefficients than precipitation fields, while both have decreasing smoothness coefficients with increasing latitude. Models predict fields with smaller smoothness coefficients than observational proxy data for the tropics. The estimated spatial cross-correlations of these two fields, however, are quite different for most GCMs in mid-latitudes. Model correlation estimates agree well with those for observational proxy data for Australia, at high northern latitudes across North America, Europe and Asia, as well as across the Sahara, India, and Southeast Asia, but elsewhere, little consistent agreement exists.

  2. Bivariate Gaussian bridges: directional factorization of diffusion in Brownian bridge models.

    Science.gov (United States)

    Kranstauber, Bart; Safi, Kamran; Bartumeus, Frederic

    2014-01-01

    In recent years high resolution animal tracking data has become the standard in movement ecology. The Brownian Bridge Movement Model (BBMM) is a widely adopted approach to describe animal space use from such high resolution tracks. One of the underlying assumptions of the BBMM is isotropic diffusive motion between consecutive locations, i.e. invariant with respect to the direction. Here we propose to relax this often unrealistic assumption by separating the Brownian motion variance into two directional components, one parallel and one orthogonal to the direction of the motion. Our new model, the Bivariate Gaussian bridge (BGB), tracks movement heterogeneity across time. Using the BGB and identifying directed and non-directed movement within a trajectory resulted in more accurate utilisation distributions compared to dynamic Brownian bridges, especially for trajectories with a non-isotropic diffusion, such as directed movement or Lévy-like movements. We evaluated our model with simulated trajectories and observed tracks, demonstrating that the improvement of our model scales with the directional correlation of a correlated random walk. We find that many of the animal trajectories do not adhere to the assumptions of the BBMM. The proposed model improves accuracy when describing the space use both in simulated correlated random walks as well as observed animal tracks. Our novel approach is implemented and available within the "move" package for R.

  3. Bayesian bivariate meta-analysis of diagnostic test studies using integrated nested Laplace approximations.

    Science.gov (United States)

    Paul, M; Riebler, A; Bachmann, L M; Rue, H; Held, L

    2010-05-30

    For bivariate meta-analysis of diagnostic studies, likelihood approaches are very popular. However, they often run into numerical problems with possible non-convergence. In addition, the construction of confidence intervals is controversial. Bayesian methods based on Markov chain Monte Carlo (MCMC) sampling could be used, but are often difficult to implement, and require long running times and diagnostic convergence checks. Recently, a new Bayesian deterministic inference approach for latent Gaussian models using integrated nested Laplace approximations (INLA) has been proposed. With this approach MCMC sampling becomes redundant as the posterior marginal distributions are directly and accurately approximated. By means of a real data set we investigate the influence of the prior information provided and compare the results obtained by INLA, MCMC, and the maximum likelihood procedure SAS PROC NLMIXED. Using a simulation study we further extend the comparison of INLA and SAS PROC NLMIXED by assessing their performance in terms of bias, mean-squared error, coverage probability, and convergence rate. The results indicate that INLA is more stable and gives generally better coverage probabilities for the pooled estimates and less biased estimates of variance parameters. The user-friendliness of INLA is demonstrated by documented R-code. Copyright (c) 2010 John Wiley & Sons, Ltd.

  4. Powerful bivariate genome-wide association analyses suggest the SOX6 gene influencing both obesity and osteoporosis phenotypes in males.

    Directory of Open Access Journals (Sweden)

    Yao-Zhong Liu

    2009-08-01

    Full Text Available Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits. Hence it is difficult for this approach to detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning approximately 380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82x10(-7) and 1.47x10(-6), respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the approximately 380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis.

  5. Predicting the required number of training samples. [for remotely sensed image data based on covariance matrix estimate quality criterion of normal distribution

    Science.gov (United States)

    Kalayeh, H. M.; Landgrebe, D. A.

    1983-01-01

    A criterion which measures the quality of the estimate of the covariance matrix of a multivariate normal distribution is developed. Based on this criterion, the necessary number of training samples is predicted. Experimental results which are used as a guide for determining the number of training samples are included. Previously announced in STAR as N82-28109

  6. Approximation Order from Bivariate C1-Cubics: A Counter Example.

    Science.gov (United States)

    1982-06-01

    Approved for public release; distribution unlimited. Sponsored by the U.S. Army Research Office and the National Science Foundation. ... (e.g., in [Fr]). This needlessly complicates the notation. It is sufficient to note that a permutation of the meshline families can be accomplished by ... and show how this result leads, in standard quasi-interpolant fashion, to the conclusion that dist(f, S_h) = O(h^3).

  7. Measuring early or late dependence for bivariate lifetimes of twins

    DEFF Research Database (Denmark)

    Scheike, Thomas; Holst, Klaus K; Hjelmborg, Jacob B

    2015-01-01

    -Oakes model. This model can be extended in several directions. One extension is to allow the dependence parameter to depend on covariates. Another extension is to model dependence via piecewise constant cross-hazard ratio models. We show how both these models can be implemented for large sample data, and suggest a computational solution for obtaining standard errors for such models for large registry data. In addition we consider alternative models that have some computational advantages and different dependence parameters, based on odds ratios of the survival function using the Plackett distribution...

  8. Noncentral Chi-Square versus Normal Distributions in Describing the Likelihood Ratio Statistic: The Univariate Case and Its Multivariate Implication

    Science.gov (United States)

    Yuan, Ke-Hai

    2008-01-01

    In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…

  9. Anatomic distribution of culprit lesions in patients with non-ST-segment elevation myocardial infarction and normal ECG.

    Science.gov (United States)

    Moustafa, Abdelmoniem; Abi-Saleh, Bernard; El-Baba, Mohammad; Hamoui, Omar; AlJaroudi, Wael

    2016-02-01

    In patients presenting with non-ST-elevation myocardial infarction (NSTEMI), left anterior descending (LAD) coronary artery and three-vessel disease are the most commonly encountered culprit lesions in the presence of ST depression, while one third of patients with left circumflex (LCX) artery related infarction have a normal ECG. We sought to determine the predictors of presence of a culprit lesion in NSTEMI patients based on ECG, echocardiographic, and clinical characteristics. Patients admitted to the coronary care unit with the diagnosis of NSTEMI between June 2012 and December 2013 were retrospectively identified. The admission ECG was interpreted by an electrophysiologist who was blinded to the result of the coronary angiogram. Patients were dichotomized into either a normal or an abnormal ECG group. The primary endpoint was presence of a culprit lesion. Secondary endpoints included length of stay, re-hospitalization within 60 days, and in-hospital mortality. A total of 118 patients were identified: 47 with normal and 71 with abnormal ECG. At least one culprit lesion was identified in 101 patients (86%), and significantly more among those with abnormal ECG (91.5% vs. 76.6%, P=0.041). The LAD was the most frequently detected culprit lesion in both groups. There was a higher incidence of two- and three-vessel disease in the abnormal ECG group (P=0.041). On the other hand, there was a trend of higher LCX involvement (25% vs. 13.8%, P=0.18) and more normal coronary arteries in the normal ECG group (23.4% vs. 8.5%, P=0.041). On multivariate analysis, prior history of coronary artery disease (CAD) [odds ratio (OR) 6.4 (0.8-52)], male gender [OR 5.0 (1.5-17)], and abnormal admission ECG [OR 3.6 (1.12-12)] were independent predictors of a culprit lesion. There was no difference in secondary endpoints between those with normal and abnormal ECG. Among patients presenting with NSTEMI, prior history of CAD, male gender and abnormal admission ECG were independent predictors of a culprit lesion.

  10. Bivariable analysis of ventricular late potentials in high resolution ECG records

    International Nuclear Information System (INIS)

    Orosco, L; Laciar, E

    2007-01-01

    In this study, a bivariable analysis for the detection of ventricular late potentials in high-resolution electrocardiographic records is proposed. The standard time-domain analysis and the application of the time-frequency technique to high-resolution ECG records are briefly described, as well as their corresponding results. In the proposed technique the time-domain parameter QRSD and the most significant time-frequency index, EN QRS, are used as variables. A bivariable index that combines the previous parameters is defined. The proposed technique allows evaluation of the risk of ventricular tachycardia in post-myocardial infarction patients. The results show that the bivariable index allows discrimination between the population of patients with ventricular tachycardia and the subjects of the control group. It was also found that the bivariable technique performs well as a diagnostic test. It is concluded that, comparatively, the performance of the bivariable technique as a diagnostic test is superior to that of the time-domain method and the time-frequency technique evaluated individually

  11. A Monte Carlo Study of Levene's Test of Homogeneity of Variance: Empirical Frequencies of Type I Error in Normal Distributions.

    Science.gov (United States)

    Neel, John H.; Stallings, William M.

    An influential statistics text recommends Levene's test for homogeneity of variance. A recent note suggests that Levene's test is upwardly biased for small samples. Another report shows inflated Alpha estimates and low power. Neither study utilized more than two sample sizes. This Monte Carlo study involved sampling from a normal population for…
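
    A minimal sketch of such a Monte Carlo check of the empirical Type I error of Levene's test under normality with equal variances; the sample sizes, replication count and alpha are illustrative choices.

```python
# Sketch: empirical Type I error of Levene's test when sampling from normal
# populations with equal variances, across several sample sizes.
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(3)
alpha, n_rep = 0.05, 5000

for n in (5, 10, 25, 50):
    rejections = 0
    for _ in range(n_rep):
        a, b = rng.normal(0, 1, n), rng.normal(0, 1, n)    # H0 true: equal variances
        if levene(a, b).pvalue < alpha:
            rejections += 1
    print(f"n = {n:3d}: empirical Type I error = {rejections / n_rep:.3f}")
```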

  12. A test research on ventilative well-distributivity under normal temperature for a control rod drive mechanism (Continuous article)

    International Nuclear Information System (INIS)

    Zhu Longxing

    1989-01-01

    A test for cooling of the control rod drive mechanism under normal temperature is described. The relationship between the unbalanced coefficient and the frictional resistance and wind velocity is found by comparing the ventilation in the plate top structure of the reactor with that in the global top structure of the reactor

  13. Normal spectrum of pulmonary parametric response map to differentiate lung collapsibility: distribution of densitometric classifications in healthy adult volunteers

    International Nuclear Information System (INIS)

    Silva, Mario; Nemec, Stefan F.; Dufresne, Valerie; Occhipinti, Mariaelena; Heidinger, Benedikt H.; Bankier, Alexander A.; Chamberlain, Ryan

    2016-01-01

    Pulmonary parametric response map (PRM) was proposed for quantitative densitometric phenotypization of chronic obstructive pulmonary disease. However, little is known about this technique in healthy subjects. The purpose of this study was to describe the normal spectrum of densitometric classification of pulmonary PRM in a group of healthy adults. 15 healthy volunteers underwent spirometrically monitored chest CT at total lung capacity (TLC) and functional residual capacity (FRC). The paired CT scans were analyzed by PRM for voxel-by-voxel characterization of lung parenchyma according to 4 densitometric classifications: normal lung (TLC ≥ -950 HU, FRC ≥ -856 HU); expiratory low attenuation area (LAA) (TLC ≥ -950 HU, FRC < -856 HU); dual LAA (TLC < -950 HU, FRC < -856 HU); uncharacterized (TLC < -950 HU, FRC ≥ -856 HU). PRM spectrum was 78 % ± 10 % normal lung, 20 % ± 8 % expiratory LAA, and 1 % ± 1 % dual LAA. PRM was similar between genders, there was moderate correlation between dual LAA and spirometrically assessed TLC (R = 0.531; p = 0.042), and between expiratory LAA and Vol Exp/Insp ratio (R = -0.572; p = 0.026). PRM reflects the predominance of normal lung parenchyma in a group of healthy volunteers. However, PRM also confirms the presence of physiological expiratory LAA seemingly related to air trapping and a minimal amount of dual LAA likely reflecting emphysema. (orig.)
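
    The four densitometric classes can be reproduced with a simple voxel-wise threshold rule; the sketch below applies the thresholds quoted above to synthetic attenuation arrays (the synthetic data and array handling are assumptions, not part of the study).

```python
# Sketch: voxel-wise parametric response map (PRM) classification from
# co-registered inspiratory (TLC) and expiratory (FRC) CT attenuation values (HU).
import numpy as np

def prm_classify(tlc_hu, frc_hu):
    """Return fractions of normal lung, expiratory LAA, dual LAA, uncharacterized."""
    normal     = (tlc_hu >= -950) & (frc_hu >= -856)
    expiratory = (tlc_hu >= -950) & (frc_hu <  -856)   # air trapping
    dual       = (tlc_hu <  -950) & (frc_hu <  -856)   # emphysema-like
    uncharact  = (tlc_hu <  -950) & (frc_hu >= -856)
    n = tlc_hu.size
    return {"normal": normal.sum() / n, "expiratory LAA": expiratory.sum() / n,
            "dual LAA": dual.sum() / n, "uncharacterized": uncharact.sum() / n}

rng = np.random.default_rng(4)
tlc = rng.normal(-870, 40, 100000)        # synthetic inspiratory voxel HU values
frc = tlc + rng.normal(120, 40, 100000)   # synthetic expiratory values (denser lung)
print(prm_classify(tlc, frc))
```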

  14. Pharmacokinetics and tissue distribution of five active ingredients of Eucommiae cortex in normal and ovariectomized mice by UHPLC-MS/MS.

    Science.gov (United States)

    An, Jing; Hu, Fangdi; Wang, Changhong; Zhang, Zijia; Yang, Li; Wang, Zhengtao

    2016-09-01

    1. Pinoresinol di-O-β-d-glucopyranoside (PDG), geniposide (GE), geniposidic acid (GA), aucubin (AN) and chlorogenic acid (CA) are the representative active ingredients in Eucommiae cortex (EC), which may be estrogenic. 2. The ultra high-performance liquid chromatography/tandem mass spectrometry (UHPLC-MS/MS) method for simultaneous determination of the five ingredients showed good linearity, low limits of quantification and high extraction recoveries, as well as acceptable precision, accuracy and stability in mice plasma and tissue samples (liver, spleen, kidney and uterus). It was successfully applied to the comparative study on pharmacokinetics and tissue distribution of PDG, GE, GA, AN and CA between normal and ovariectomized (OVX) mice. 3. The results indicated that, except for CA, the plasma and tissue concentrations of PDG, GE and GA in OVX mice were all greater than those in normal mice. AN could only be detected in the plasma and liver homogenate of normal mice; it was poorly absorbed in OVX mice and low in the other measured tissues. PDG, GE and GA seem to be better absorbed in OVX mice than in normal mice, as shown by the markedly increased values of AUC0-∞ and Cmax. It is beneficial that PDG, GE and GA have better plasma absorption and tissue distribution in the pathological state.

  15. Causal networks clarify productivity-richness interrelations, bivariate plots do not

    Science.gov (United States)

    Grace, James B.; Adler, Peter B.; Harpole, W. Stanley; Borer, Elizabeth T.; Seabloom, Eric W.

    2014-01-01

    Perhaps no other pair of variables in ecology has generated as much discussion as species richness and ecosystem productivity, as illustrated by the reactions by Pierce (2013) and others to Adler et al.'s (2011) report that empirical patterns are weak and inconsistent. Adler et al. (2011) argued we need to move beyond a focus on simplistic bivariate relationships and test mechanistic, multivariate causal hypotheses. We feel the continuing debate over productivity–richness relationships (PRRs) provides a focused context for illustrating the fundamental difficulties of using bivariate relationships to gain scientific understanding.

  16. Contribution to the power distribution methodology uncertainties assessment

    International Nuclear Information System (INIS)

    Svarny, J.

    2008-01-01

    The present methodology of safety margins in NPP Dukovany design power distribution calculations is based on the philosophy of engineering factors, with errors defined on the basis of a statistical approach using standard (95%) confidence intervals. At the level of the FA power distribution, the normality (normal density distribution) assumption of this approach is tested, and a comparison is provided with errors defined at the 95-percent probability and 95-percent confidence level (95%/95% in statistical shorthand). Practical applications are presented for several NPP Dukovany fuel cycles. The paper also deals briefly with the difference between confidence and tolerance intervals, with the density distribution of mechanical engineering factor variables, and with the treatment of the axial and radial error distribution as a bivariate problem. (Author)
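
    The distinction drawn above between a confidence interval and a 95%/95% tolerance interval can be illustrated numerically; the sketch below computes both one-sided limits for a synthetic sample of normally distributed errors, using the standard noncentral-t tolerance factor (the sample itself is illustrative).

```python
# Sketch: 95% confidence limit on the mean vs a one-sided 95%/95% tolerance
# limit (95% probability content with 95% confidence) for normal errors.
import numpy as np
from scipy.stats import norm, t, nct

rng = np.random.default_rng(5)
errors = rng.normal(0.0, 0.02, 30)           # synthetic relative errors of a factor
n, xbar, s = errors.size, errors.mean(), errors.std(ddof=1)

# 95% one-sided confidence limit for the mean error
conf95 = xbar + t.ppf(0.95, n - 1) * s / np.sqrt(n)

# one-sided 95%/95% tolerance limit: covers 95% of the error population with
# 95% confidence; factor k from the noncentral t distribution
k = nct.ppf(0.95, n - 1, norm.ppf(0.95) * np.sqrt(n)) / np.sqrt(n)
tol9595 = xbar + k * s

print(f"95% confidence limit on mean: {conf95:.4f}")
print(f"95%/95% tolerance limit:      {tol9595:.4f}")   # noticeably larger
```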

  17. Evaluation of the Weibull and log normal distribution functions as survival models of Escherichia coli under isothermal and non isothermal conditions.

    Science.gov (United States)

    Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2007-11-01

    Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semi-logarithmic coordinates. Some also exhibited what appeared to be a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered as a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
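
    A sketch of the Weibullian (power-law) survival model and of the momentary-rate construction described above for a non-isothermal temperature history; the b(T) dependence, the fixed shape power and the temperature ramp are assumptions chosen only to make the example run.

```python
# Sketch: Weibullian survival model, log10 S(t) = -b(T) * t**n, and its rate form
# for a non-isothermal profile: the momentary rate is the isothermal rate at the
# current temperature, evaluated at the time t* matching the momentary survival.
import numpy as np
from scipy.integrate import solve_ivp

n_shape = 1.5                                       # fixed Weibullian shape power (assumed)
b = lambda T: 1e-4 * np.exp(0.18 * (T - 50.0))      # assumed rate parameter vs temperature

def log10_survival_isothermal(t, T):
    return -b(T) * t ** n_shape

def dY_dt(t, Y, T_profile):
    """Momentary rate of Y = log10(S) under the time-varying temperature."""
    T = T_profile(t)
    t_star = (max(-Y[0], 0.0) / b(T)) ** (1.0 / n_shape)
    return [-b(T) * n_shape * t_star ** (n_shape - 1.0)]

T_profile = lambda t: 50.0 + 0.5 * t                # linear come-up, degC vs minutes

# start slightly after t = 0 to avoid the degenerate zero-rate point of the power law
t0 = 1e-3
y0 = [log10_survival_isothermal(t0, T_profile(t0))]
sol = solve_ivp(dY_dt, (t0, 60.0), y0, args=(T_profile,))
print("log10 survival after 60 min of the ramp:", sol.y[0, -1])
```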

  18. Dose Distribution in Bladder and Surrounding Normal Tissues in Relation to Bladder Volume in Conformal Radiotherapy for Bladder Cancer

    International Nuclear Information System (INIS)

    Majewski, Wojciech; Wesolowska, Iwona; Urbanczyk, Hubert; Hawrylewicz, Leszek; Schwierczok, Barbara; Miszczyk, Leszek

    2009-01-01

    Purpose: To estimate bladder movements and changes in dose distribution in the bladder and surrounding tissues associated with changes in bladder filling and to estimate the internal treatment margins. Methods and Materials: A total of 16 patients with bladder cancer underwent planning computed tomography scans with 80- and 150-mL bladder volumes. The bladder displacements associated with the change in volume were measured. Each patient had treatment plans constructed for a 'partially empty' (80 mL) and a 'partially full' (150 mL) bladder. An additional plan was constructed for tumor irradiation alone. A subsequent 9 patients underwent sequential weekly computed tomography scanning during radiotherapy to verify the bladder movements and estimate the internal margins. Results: Bladder movements were mainly observed cranially, and the estimated internal margins were nonuniform and largest (>2 cm) anteriorly and cranially. The dose distribution in the bladder worsened if the bladder increased in volume: 70% of patients (11 of 16) would have had the bladder underdosed. The dose to the surrounding normal tissues was lower with a 'partially empty' bladder (the volume that received >70%, 80%, and 90% of the prescribed dose was 23%, 20%, and 15% for the rectum and 162, 144, and 123 cm3 for the intestines, respectively) than with a 'partially full' bladder (the volume that received >70%, 80%, and 90% of the prescribed dose was 28%, 24%, and 18% for the rectum and 180, 158, and 136 cm3 for the intestines, respectively). The change in bladder filling during RT was significant for the dose distribution in the intestines. Tumor irradiation alone was significantly better than whole bladder irradiation in terms of organ sparing. Conclusion: The displacements of the bladder due to volume changes were mainly related to the upper wall. The internal margins should be nonuniform, with the largest margins cranially and anteriorly. The changes in bladder filling during RT could influence the dose distribution in the bladder and intestines. The dose distribution in the rectum and bowel was slightly better with

  19. Tritium distribution ratios between the 30 % tributyl phosphate(TBP)-normal dodecane(nDD) organic phase and uranyl nitrate-nitric acid aqueous phase

    International Nuclear Information System (INIS)

    Fujine, Sachio; Uchiyama, Gunzou; Sugikawa, Susumu; Maeda, Mitsuru; Tsujino, Takeshi.

    1989-10-01

    Tritium distribution ratios between the organic and aqueous phases were measured for the system of 30 % tributyl phosphate(TBP)-normal dodecane(nDD)/uranyl nitrate-nitric acid water. It was confirmed that tritium is extracted by TBP into the organic phase in both chemical forms of tritiated water (HTO) and tritiated nitric acid (TNO 3 ). The value of tritium distribution ratio ranged from 0.002 to 0.005 for the conditions of 0-6 mol/L nitric acid, 0.5-800 mCi/L tritium in aqueous phase, and 0-125 g-U/L uranium in organic phase. Isotopic distribution coefficient of tritium between the organic and aqueous phases was observed to be about 0.95. (author)

  20. Wealth of the world's richest publicly traded companies per industry and per employee: Gamma, Log-normal and Pareto power-law as universal distributions?

    Science.gov (United States)

    Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.

    2017-04-01

    Forbes Magazine published its list of leading or strongest publicly-traded two thousand companies in the world (G-2000) based on four independent metrics: sales or revenues, profits, assets and market value. Every one of these wealth metrics yields particular information on the corporate size or wealth size of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part, and a Pareto power-law in the higher part. These two-class structure per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto is about 49% in sales per employee, and 33% after averaging on the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be adjusted by Gamma or Log-normal distributions. On the other hand, Forbes classifies the G-2000 firms in 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of firms in each industry for the four metrics is divided by the total number of employees in that industry, then the 82 points of the aggregate wealth distribution by industry per employee can be well adjusted by quasi-exponential curves for the four metrics.
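
    The two-class structure described above can be illustrated by fitting a Log-normal body below a threshold and a Pareto tail above it; in the sketch below both the synthetic sample and the body/tail split point are assumptions.

```python
# Sketch: two-class fit of a wealth-per-employee sample - Log-normal body below
# a threshold, Pareto power-law tail above it (Hill maximum-likelihood exponent).
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(6)
wealth = np.concatenate([rng.lognormal(mean=1.0, sigma=0.8, size=1500),
                         (rng.pareto(1.3, size=500) + 1.0) * 8.0])   # synthetic sample

x_min = np.quantile(wealth, 0.70)            # illustrative body/tail split point
body, tail = wealth[wealth < x_min], wealth[wealth >= x_min]

shape, loc, scale = lognorm.fit(body, floc=0)          # Log-normal fit of the lower class
alpha_hill = tail.size / np.sum(np.log(tail / x_min))  # Pareto exponent above x_min

print(f"Log-normal body: sigma = {shape:.2f}, scale = {scale:.2f}")
print(f"Pareto tail exponent alpha = {alpha_hill:.2f} above x_min = {x_min:.2f}")
```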

  1. Comparative pharmacokinetics and tissue distribution profiles of lignan components in normal and hepatic fibrosis rats after oral administration of Fuzheng Huayu recipe.

    Science.gov (United States)

    Yang, Tao; Liu, Shan; Zheng, Tian-Hui; Tao, Yan-Yan; Liu, Cheng-Hai

    2015-05-26

    Fuzheng Huayu recipe (FZHY) is formulated on the basis of Chinese medicine theory for treating liver fibrosis. This study aimed to illuminate the influence of the pathological state of liver fibrosis on the pharmacokinetics and tissue distribution profiles of lignan components from FZHY. Male Wistar rats were randomly divided into a normal group and a hepatic fibrosis group (induced by dimethylnitrosamine). Six lignan components were detected and quantified by ultrahigh performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) in the plasma and tissue of normal and hepatic fibrosis rats. A rapid, sensitive and convenient UHPLC-MS/MS method has been developed for the simultaneous determination of six lignan components in different rat biological samples. After oral administration of FZHY at a dose of 15 g/kg, the pharmacokinetic behaviors of schizandrin A (SIA), schizandrin B (SIB), schizandrin C (SIC), schisandrol A (SOA), schisandrol B (SOB) and schisantherin A (STA) were significantly changed in hepatic fibrosis rats compared with the normal rats, and their AUC(0-t) values were increased by 235.09%, 388.44%, 223.30%, 669.30%, 295.08% and 267.63%, respectively. Tissue distribution results showed that the amounts of SIA, SIB, SOA and SOB were significantly increased in the heart, lung, spleen and kidney of hepatic fibrosis rats compared with normal rats at most of the time points, indicating differences in the distribution of lignan components between normal and hepatic fibrosis rats. The hepatic fibrosis could alter the pharmacokinetics and tissue distribution properties of lignan components in rats after administration of FZHY. The results might be helpful for guiding the clinical application of this medicine. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. Accuracy of body mass index in predicting pre-eclampsia: bivariate meta-analysis

    NARCIS (Netherlands)

    Cnossen, J. S.; Leeflang, M. M. G.; de Haan, E. E. M.; Mol, B. W. J.; van der Post, J. A. M.; Khan, K. S.; ter Riet, G.

    2007-01-01

    OBJECTIVE: The objective of this study was to determine the accuracy of body mass index (BMI) (pre-pregnancy or at booking) in predicting pre-eclampsia and to explore its potential for clinical application. DESIGN: Systematic review and bivariate meta-analysis. SETTING: Medline, Embase, Cochrane

  3. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    Science.gov (United States)

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donation and blood deferral has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donation" and "number of blood deferral": as the number of return for donation increases, so does the number of blood deferral. On the other hand, due to the fact that many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to apply the correlation and to explain the frequency of the excessive zero, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donation and number of blood deferral. The data was analyzed using the Bayesian approach applying noninformative priors at the presence and absence of covariates. Estimating the parameters of the model, that is, correlation, zero-inflation parameter, and regression coefficients, was done through MCMC simulation. Eventually double-Poisson model, bivariate Poisson model, and bivariate zero-inflated Poisson model were fitted on the data and were compared using the deviance information criteria (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
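    A minimal sketch of the kind of data-generating process behind a bivariate zero-inflated Poisson model follows. The common-shock construction and all parameter values are assumptions for illustration, not the study's fitted estimates.

```python
# Hedged sketch: simulating correlated donation/deferral counts with extra (0, 0) pairs.
import numpy as np

rng = np.random.default_rng(1)
n, p_zero = 5000, 0.4                 # sample size and zero-inflation probability (assumed)
lam1, lam2, lam0 = 2.0, 1.0, 0.5      # donation rate, deferral rate, shared component (assumed)

y0 = rng.poisson(lam0, n)             # common shock inducing positive correlation
donations = rng.poisson(lam1, n) + y0
deferrals = rng.poisson(lam2, n) + y0

inflated = rng.random(n) < p_zero     # donors who never return: structural (0, 0) pairs
donations[inflated] = 0
deferrals[inflated] = 0

print("fraction of (0, 0) pairs:", np.mean((donations == 0) & (deferrals == 0)))
print("correlation:", np.corrcoef(donations, deferrals)[0, 1])
```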

  4. Bivariate analysis of sensitivity and specificity produces informative summary measures in diagnostic reviews

    NARCIS (Netherlands)

    Reitsma, Johannes B.; Glas, Afina S.; Rutjes, Anne W. S.; Scholten, Rob J. P. M.; Bossuyt, Patrick M.; Zwinderman, Aeilko H.

    2005-01-01

    Background and Objectives: Studies of diagnostic accuracy most often report pairs of sensitivity and specificity. We demonstrate the advantage of using bivariate meta-regression models to analyze such data. Methods: We discuss the methodology of both the summary Receiver Operating Characteristic

  5. A comparison of bivariate and univariate QTL mapping in livestock populations

    Directory of Open Access Journals (Sweden)

    Sorensen Daniel

    2003-11-01

    Full Text Available Abstract This study presents a multivariate, variance component-based QTL mapping model implemented via restricted maximum likelihood (REML. The method was applied to investigate bivariate and univariate QTL mapping analyses, using simulated data. Specifically, we report results on the statistical power to detect a QTL and on the precision of parameter estimates using univariate and bivariate approaches. The model and methodology were also applied to study the effectiveness of partitioning the overall genetic correlation between two traits into a component due to many genes of small effect, and one due to the QTL. It is shown that when the QTL has a pleiotropic effect on two traits, a bivariate analysis leads to a higher statistical power of detecting the QTL and to a more precise estimate of the QTL's map position, in particular in the case when the QTL has a small effect on the trait. The increase in power is most marked in cases where the contributions of the QTL and of the polygenic components to the genetic correlation have opposite signs. The bivariate REML analysis can successfully partition the two components contributing to the genetic correlation between traits.

  6. Extreme-value limit of the convolution of exponential and multivariate normal distributions: Link to the Hüsler–Reiß distribution

    KAUST Repository

    Krupskii, Pavel

    2017-11-02

    The multivariate Hüsler–Reiß copula is obtained as a direct extreme-value limit from the convolution of a multivariate normal random vector and an exponential random variable multiplied by a vector of constants. It is shown how the set of Hüsler–Reiß parameters can be mapped to the parameters of this convolution model. Assuming there are no singular components in the Hüsler–Reiß copula, the convolution model leads to exact and approximate simulation methods. An application of simulation is to check if the Hüsler–Reiß copula with different parsimonious dependence structures provides adequate fit to some data consisting of multivariate extremes.
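    A minimal simulation sketch of the convolution model described above (a multivariate normal vector plus an exponential variable times a vector of constants) follows. The covariance matrix and constants are assumptions, and the paper's exact mapping to the Hüsler–Reiß parameters is not reproduced.

```python
# Hedged sketch: simulate the convolution model whose extreme-value limit is Husler-Reiss.
import numpy as np

rng = np.random.default_rng(2)
d, n = 3, 100_000
Sigma = np.array([[1.0, 0.5, 0.3],
                  [0.5, 1.0, 0.4],
                  [0.3, 0.4, 1.0]])   # assumed normal covariance
a = np.array([1.0, 0.8, 1.2])         # assumed vector of positive constants

X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
E = rng.exponential(scale=1.0, size=n)
Z = X + E[:, None] * a                # convolution model

# Componentwise maxima over blocks approximate the limiting max-stable dependence.
block_maxima = Z.reshape(-1, 100, d).max(axis=1)
print("empirical correlation of block maxima:\n", np.corrcoef(block_maxima, rowvar=False))
```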

  7. The distribution of trace elements in normal human liver determined by semi-automated radiochemical neutron activation analysis

    International Nuclear Information System (INIS)

    Lievens, P.; Cornelis, R.; Hoste, J.; Versieck, J.

    1977-01-01

    The eight segments of five normal human livers are analysed for 25 trace elements by radiochemical NAA. This consists of an automated wet destruction of the samples and two distillations, followed by ion exchange procedures. Ru is used as a triple comparator for the standardization. Short-lived and matrix isotopes are standardized using Bowen's kale powder. The results reveal that the coefficient of variation within the liver is smaller than 10% for the elements Cd, Cl, Cs, Cu, Fe, K, Mg, Mn, Rb, Se and Zn. The highest range observed for the elements As, Br, Co, Cr, Hg, La, Mo, Na and Sb within a liver is smaller than the range observed between the five livers. (T.G.)

  8. CELL AVERAGING CFAR DETECTOR WITH SCALE FACTOR CORRECTION THROUGH THE METHOD OF MOMENTS FOR THE LOG-NORMAL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    José Raúl Machado Fernández

    2018-01-01

    Full Text Available The new LN-MoM-CA-CFAR detector is presented, which exhibits a reduced deviation of the operational false alarm probability from the intended design value. The solution corrects a fundamental problem of CFAR processors that has been ignored in many developments. Indeed, most previously proposed schemes deal with abrupt changes in the clutter level, whereas the present solution corrects slow statistical changes in the background signal. These have been shown to have a marked influence on the selection of the multiplicative CFAR adjustment factor, and consequently on maintaining the false alarm probability. The authors took advantage of the high precision achieved in estimating the Log-Normal shape parameter with the method of moments (MoM), and of the wide application of this distribution in clutter modeling, to create an architecture that offers accurate results at low computational cost. After intensive processing of 100 million Log-Normal samples, a scheme was created that, improving the performance of the classical CA-CFAR through continuous correction of its adjustment factor, operates with excellent stability, reaching a deviation of only 0.2884% for a false alarm probability of 0.01.
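    A minimal sketch of the underlying idea follows: estimate the log-normal parameters of the background by the method of moments and set the threshold to hold the design false-alarm probability. The 64-cell window, the Pfa of 0.01 and the threshold rule are illustrative assumptions, not the paper's exact correction scheme.

```python
# Hedged sketch: method-of-moments log-normal fit of the reference cells and threshold setting.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
pfa = 0.01
reference = rng.lognormal(mean=0.0, sigma=0.7, size=64)   # reference (clutter) cells

# Method of moments for the log-normal: match the sample mean and variance.
m, v = reference.mean(), reference.var()
sigma2_hat = np.log(1.0 + v / m**2)
mu_hat = np.log(m) - sigma2_hat / 2.0

# Threshold chosen so that P(clutter sample > T) = pfa under the fitted log-normal.
threshold = np.exp(mu_hat + np.sqrt(sigma2_hat) * stats.norm.ppf(1.0 - pfa))

cell_under_test = 5.0
print("detection" if cell_under_test > threshold else "no detection", f"(T = {threshold:.2f})")
```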

  9. Interaction between a normal shock wave and a turbulent boundary layer at high transonic speeds. Part 1: Pressure distribution

    Science.gov (United States)

    Messiter, A. F.

    1979-01-01

    Analytical solutions are derived which incorporate additional physical effects as higher order terms for the case when the sonic line is very close to the wall. The functional form used for the undisturbed velocity profile is described to indicate how various parameters will be calculated for later comparison with experiment. The basic solutions for the pressure distribution are derived. Corrections are added for flow along a wall having longitudinal curvature and for flow in a circular pipe, and comparisons with available experimental data are shown.

  10. Determination of PVB interlayer’s shear modulus and its effect on normal stress distribution in laminated glass panels

    Science.gov (United States)

    Hána, T.; Eliášová, M.; Machalická, K.; Vokáč, M.

    2017-10-01

    Looking at current architecture, there are many examples of glass load-bearing members such as beams, panes, ribs, stairs or even columns. Most of these elements are made of laminated glass with panes bonded by a polymer interlayer, so the transfer of shear forces between the glass panes needs to be investigated because of the present lack of knowledge. This transfer depends on the stiffness of the polymer material, which is affected by temperature and load duration. If the exact material behaviour is not specified, it is essential to stay on the safe side by designing these members for the limit cases. There are many interlayers for structural laminated glass applications available on the market, and most of them exhibit different properties that need to be experimentally verified. This paper focuses on the tangent shear modulus of a PVB (polyvinyl butyral) interlayer and its effect on the stress distribution in the glass panes under load. This distribution may be determined experimentally or numerically. The results enable structural laminated glass members to be designed more effectively with regard to cost and safety, and thereby extend the use of laminated glass in architectural design.

  11. Distribution of intravenously administered acetylcholinesterase inhibitor and acetylcholinesterase activity in the adrenal gland: 11C-donepezil PET study in the normal rat.

    Directory of Open Access Journals (Sweden)

    Tadashi Watabe

    Full Text Available PURPOSE: Acetylcholinesterase (AChE) inhibitors have been used for patients with Alzheimer's disease. However, its pharmacokinetics in non-target organs other than the brain has not been clarified yet. The purpose of this study was to evaluate the relationship between the whole-body distribution of intravenously administered (11)C-Donepezil (DNP) and the AChE activity in the normal rat, with special focus on the adrenal glands. METHODS: The distribution of (11)C-DNP was investigated by PET/CT in 6 normal male Wistar rats (8 weeks old, body weight = 220 ± 8.9 g). A 30-min dynamic scan was started simultaneously with an intravenous bolus injection of (11)C-DNP (45.0 ± 10.7 MBq). The whole-body distribution of the (11)C-DNP PET was evaluated based on the Vt (total distribution volume) by Logan-plot analysis. A fluorometric assay was performed to quantify the AChE activity in homogenized tissue solutions of the major organs. RESULTS: The PET analysis using Vt showed that the adrenal glands had the 2nd highest level of (11)C-DNP in the body (following the liver) (13.33 ± 1.08 and 19.43 ± 1.29 ml/cm(3), respectively), indicating that the distribution of (11)C-DNP was the highest in the adrenal glands, except for that in the excretory organs. The AChE activity was the third highest in the adrenal glands (following the small intestine and the stomach) (24.9 ± 1.6, 83.1 ± 3.0, and 38.5 ± 8.1 mU/mg, respectively), indicating high activity of AChE in the adrenal glands. CONCLUSIONS: We demonstrated the whole-body distribution of (11)C-DNP by PET and the AChE activity in the major organs by fluorometric assay in the normal rat. High accumulation of (11)C-DNP was observed in the adrenal glands, which suggested the risk of enhanced cholinergic synaptic transmission by the use of AChE inhibitors.

  12. Distribution of intravenously administered acetylcholinesterase inhibitor and acetylcholinesterase activity in the adrenal gland: 11C-donepezil PET study in the normal rat.

    Science.gov (United States)

    Watabe, Tadashi; Naka, Sadahiro; Ikeda, Hayato; Horitsugi, Genki; Kanai, Yasukazu; Isohashi, Kayako; Ishibashi, Mana; Kato, Hiroki; Shimosegawa, Eku; Watabe, Hiroshi; Hatazawa, Jun

    2014-01-01

    Acetylcholinesterase (AChE) inhibitors have been used for patients with Alzheimer's disease. However, its pharmacokinetics in non-target organs other than the brain has not been clarified yet. The purpose of this study was to evaluate the relationship between the whole-body distribution of intravenously administered (11)C-Donepezil (DNP) and the AChE activity in the normal rat, with special focus on the adrenal glands. The distribution of (11)C-DNP was investigated by PET/CT in 6 normal male Wistar rats (8 weeks old, body weight  = 220 ± 8.9 g). A 30-min dynamic scan was started simultaneously with an intravenous bolus injection of (11)C-DNP (45.0 ± 10.7 MBq). The whole-body distribution of the (11)C-DNP PET was evaluated based on the Vt (total distribution volume) by Logan-plot analysis. A fluorometric assay was performed to quantify the AChE activity in homogenized tissue solutions of the major organs. The PET analysis using Vt showed that the adrenal glands had the 2nd highest level of (11)C-DNP in the body (following the liver) (13.33 ± 1.08 and 19.43 ± 1.29 ml/cm(3), respectively), indicating that the distribution of (11)C-DNP was the highest in the adrenal glands, except for that in the excretory organs. The AChE activity was the third highest in the adrenal glands (following the small intestine and the stomach) (24.9 ± 1.6, 83.1 ± 3.0, and 38.5 ± 8.1 mU/mg, respectively), indicating high activity of AChE in the adrenal glands. We demonstrated the whole-body distribution of (11)C-DNP by PET and the AChE activity in the major organs by fluorometric assay in the normal rat. High accumulation of (11)C-DNP was observed in the adrenal glands, which suggested the risk of enhanced cholinergic synaptic transmission by the use of AChE inhibitors.
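    A minimal sketch of Logan graphical analysis, the method used above to obtain Vt, follows. The input function and the one-tissue-compartment rate constants are invented for illustration; the slope of the late linear part of the Logan plot estimates the total distribution volume.

```python
# Hedged sketch: Logan plot on a simulated one-tissue-compartment time-activity curve.
import numpy as np

dt = 0.1
t = np.arange(0.0, 60.0, dt)                                       # min
c_plasma = 100 * np.exp(-0.3 * t) + 5 * np.exp(-0.01 * t)          # hypothetical input function

K1, k2 = 0.6, 0.04                                                 # assumed; true Vt = K1/k2 = 15
c_tissue = np.zeros_like(t)
for i in range(1, len(t)):                                         # Euler integration of the model
    c_tissue[i] = c_tissue[i - 1] + dt * (K1 * c_plasma[i - 1] - k2 * c_tissue[i - 1])

# Logan plot: y = integral(C_tissue)/C_tissue vs x = integral(C_plasma)/C_tissue; slope ~ Vt.
cum_p, cum_t = np.cumsum(c_plasma) * dt, np.cumsum(c_tissue) * dt
late = t > 20
x, y = cum_p[late] / c_tissue[late], cum_t[late] / c_tissue[late]
slope, intercept = np.polyfit(x, y, 1)
print(f"Logan-estimated Vt ~ {slope:.1f} (true {K1 / k2:.1f})")
```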

  13. Stromal laminin chain distribution in normal, hyperplastic and malignant oral mucosa: relation to myofibroblast occurrence and vessel formation.

    Science.gov (United States)

    Franz, Marcus; Wolheim, Anke; Richter, Petra; Umbreit, Claudia; Dahse, Regine; Driemel, Oliver; Hyckel, Peter; Virtanen, Ismo; Kosmehl, Hartwig; Berndt, Alexander

    2010-04-01

    The contribution of stromal laminin chain expression to malignant potential, tumour stroma reorganization and vessel formation in oral squamous cell carcinoma (OSCC) is not fully understood. Therefore, the expression of the laminin chains alpha2, alpha3, alpha4, alpha5 and gamma2 in the stromal compartment/vascular structures in OSCC was analysed. Frozen tissue of OSCC (9x G1, 24x G2, 8x G3) and normal (2x)/hyperplastic (11x) oral mucosa was subjected to laminin chain and alpha-smooth muscle actin (ASMA) immunohistochemistry. Results were correlated to tumour grade. The relation of laminin chain positive vessels to total vessel number was assessed by immunofluorescence double labelling with CD31. Stromal laminin alpha2 chain significantly decreases and alpha3, alpha4, alpha5 and gamma2 chains and also ASMA significantly increase with rising grade. The amount of stromal alpha3, alpha4 and gamma2 chains significantly increased with rising ASMA positivity. There is a significant decrease in alpha3 chain positive vessels with neoplastic transformation. Mediated by myofibroblasts, OSCC development is associated with a stromal up-regulation of laminin isoforms possibly contributing to a migration promoting microenvironment. A vascular basement membrane reorganization concerning alpha3 and gamma2 chain laminins during tumour angioneogenesis is suggested.

  14. UPLC-MS method for quantification of pterostilbene and its application to comparative study of bioavailability and tissue distribution in normal and Lewis lung carcinoma bearing mice.

    Science.gov (United States)

    Deng, Li; Li, Yongzhi; Zhang, Xinshi; Chen, Bo; Deng, Yulin; Li, Yujuan

    2015-10-10

    A UPLC-MS method was developed for the determination of pterostilbene (PTS) in plasma and tissues of mice. PTS was separated on an Agilent Zorbax XDB-C18 column (50 × 2.1 mm, 1.8 μm) with a gradient mobile phase at a flow rate of 0.2 ml/min. Detection was performed by negative ion electrospray ionization in multiple reaction monitoring mode. The linear calibration curves of PTS in mouse plasma and tissues ranged from 1.0 to 5000 and 0.50 to 500 ng/ml (r(2)>0.9979), respectively, and the lower limits of quantification (LLOQ) were between 0.5 and 2.0 ng/ml. The accuracy and precision of the assay were satisfactory. The validated method was applied to the study of bioavailability and tissue distribution of PTS in normal and Lewis lung carcinoma (LLC) bearing mice. The bioavailability of PTS (doses of 14, 28 and 56 mg/kg) in normal mice was 11.9%, 13.9% and 26.4%, respectively, and the maximum level (82.1 ± 14.2 μg/g) was found in the stomach (dose 28 mg/kg). The bioavailability, peak concentration (Cmax) and time to peak concentration (Tmax) of PTS in LLC mice were increased compared with normal mice. The results indicated that the UPLC-MS method is reliable and that the bioavailability and tissue distribution of PTS in normal and LLC mice were dramatically different. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Interactive Graphics System for the Study of Variance/Covariance Structures of Bivariate and Multivariate Normal Populations

    Science.gov (United States)

    1992-09-01


  16. Histomorphometric Assessment of Cancellous and Cortical Bone Material Distribution in the Proximal Humerus of Normal and Osteoporotic Individuals

    Science.gov (United States)

    Sprecher, Christoph M.; Schmidutz, Florian; Helfen, Tobias; Richards, R. Geoff; Blauth, Michael; Milz, Stefan

    2015-01-01

    Abstract Osteoporosis is a systemic disorder predominantly affecting postmenopausal women but also men at an advanced age. Both genders may suffer from low-energy fractures of, for example, the proximal humerus when reduction of the bone stock or/and quality has occurred. The aim of the current study was to compare the amount of bone in typical fracture zones of the proximal humerus in osteoporotic and non-osteoporotic individuals. The amount of bone in the proximal humerus was determined histomorphometrically in frontal plane sections. The donor bones were allocated to normal and osteoporotic groups using the T-score from distal radius DXA measurements of the same extremities. The T-score evaluation was done according to WHO criteria. Regional thickness of the subchondral plate and the metaphyseal cortical bone were measured using interactive image analysis. At all measured locations the amount of cancellous bone was significantly lower in individuals from the osteoporotic group compared to the non-osteoporotic one. The osteoporotic group showed more significant differences between regions of the same bone than the non-osteoporotic group. In both groups the subchondral cancellous bone and the subchondral plate were least affected by bone loss. In contrast, the medial metaphyseal region in the osteoporotic group exhibited higher bone loss in comparison to the lateral side. This observation may explain prevailing fracture patterns, which frequently involve compression fractures and certainly has an influence on the stability of implants placed in this medial region. It should be considered when planning the anchoring of osteosynthesis materials in osteoporotic patients with fractures of the proximal humerus. PMID:26705200

  17. Normal distribution and medullary-to-cortical shift of Nestin-expressing cells in acute renal ischemia.

    Science.gov (United States)

    Patschan, D; Michurina, T; Shi, H K; Dolff, S; Brodsky, S V; Vasilieva, T; Cohen-Gould, L; Winaver, J; Chander, P N; Enikolopov, G; Goligorsky, M S

    2007-04-01

    Nestin, a marker of multi-lineage stem and progenitor cells, is a member of intermediate filament family, which is expressed in neuroepithelial stem cells, several embryonic cell types, including mesonephric mesenchyme, endothelial cells of developing blood vessels, and in the adult kidney. We used Nestin-green fluorescent protein (GFP) transgenic mice to characterize its expression in normal and post-ischemic kidneys. Nestin-GFP-expressing cells were detected in large clusters within the papilla, along the vasa rectae, and, less prominently, in the glomeruli and juxta-glomerular arterioles. In mice subjected to 30 min bilateral renal ischemia, glomerular, endothelial, and perivascular cells showed increased Nestin expression. In the post-ischemic period, there was an increase in fluorescence intensity with no significant changes in the total number of Nestin-GFP-expressing cells. Time-lapse fluorescence microscopy performed before and after ischemia ruled out the possibility of engraftment by the circulating Nestin-expressing cells, at least within the first 3 h post-ischemia. Incubation of non-perfused kidney sections resulted in a medullary-to-cortical migration of Nestin-GFP-positive cells with the rate of expansion of their front averaging 40 microm/30 min during the first 3 h and was detectable already after 30 min of incubation. Explant matrigel cultures of the kidney and aorta exhibited sprouting angiogenesis with cells co-expressing Nestin and endothelial marker, Tie-2. In conclusion, several lines of circumstantial evidence identify a sub-population of Nestin-expressing cells with the mural cells, which are recruited in the post-ischemic period to migrate from the medulla toward the renal cortex. These migrating Nestin-positive cells may be involved in the process of post-ischemic tissue regeneration.

  18. Normal probability plots for evaluation of seeds distribution mechanisms in seeders

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Viliotti

    2012-05-01

    Full Text Available Currently, great emphasis is placed on seed metering devices that must meet increasingly rigorous demands regarding the longitudinal distribution of seeds, as well as the indices of spacing failures, broken seeds and double seeds. Evaluating these variables demands considerable time and effort, both for data acquisition and for processing. The objective of this work was to propose the use of normal probability plots to facilitate data treatment and reduce processing time. The evaluation methodology consists of counting broken seeds, spacing failures and double seeds through measurement of the spacing between seeds. Preliminary experiments were carried out with combinations of treatments whose factors of variation were the level of the seed reservoir, the leveling of the seed metering device, the travel speed and the seed dosage. The evaluation was carried out in two parts: first, preliminary experiments for the elaboration of the normal probability plots, and later, experiments with larger samples to evaluate the influence of the most important factors. A seed metering device with a rotating internal ring was evaluated, and the amount of data required for the evaluation was greatly reduced by means of the normal probability plots, which made it possible to prioritize only the significant factors. The seed dosage (factor D) was the most important factor, showing the greatest significance.

  19. Tolerance limits and tolerance intervals for ratios of normal random variables using a bootstrap calibration.

    Science.gov (United States)

    Flouri, Marilena; Zhai, Shuyan; Mathew, Thomas; Bebu, Ionut

    2017-05-01

    This paper addresses the problem of deriving one-sided tolerance limits and two-sided tolerance intervals for a ratio of two random variables that follow a bivariate normal distribution, or a lognormal/normal distribution. The methodology that is developed uses nonparametric tolerance limits based on a parametric bootstrap sample, coupled with a bootstrap calibration in order to improve accuracy. The methodology is also adopted for computing confidence limits for the median of the ratio random variable. Numerical results are reported to demonstrate the accuracy of the proposed approach. The methodology is illustrated using examples where ratio random variables are of interest: an example on the radioactivity count in reverse transcriptase assays and an example from the area of cost-effectiveness analysis in health economics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
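    A minimal sketch of the parametric-bootstrap idea follows: refit the bivariate normal, resample ratio draws, and read off a one-sided nonparametric limit. The additional bootstrap calibration step described in the paper is omitted, and all numbers are invented.

```python
# Hedged sketch: approximate one-sided tolerance limit for a ratio of bivariate normal variables.
import numpy as np

rng = np.random.default_rng(4)

# "Observed" bivariate normal data (numerator, denominator); values are illustrative.
mean, cov, n = [10.0, 5.0], [[4.0, 1.5], [1.5, 1.0]], 50
data = rng.multivariate_normal(mean, cov, size=n)

# Parametric bootstrap: estimate parameters, resample, and collect ratio draws.
mu_hat, S_hat = data.mean(axis=0), np.cov(data, rowvar=False)
boot = rng.multivariate_normal(mu_hat, S_hat, size=100_000)
ratios = boot[:, 0] / boot[:, 1]

# Limit intended to cover 90% of the ratio distribution; nominal confidence would
# additionally require the paper's calibration step.
upper_limit = np.quantile(ratios, 0.90)
print(f"approximate upper tolerance limit for the ratio: {upper_limit:.2f}")
```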

  20. The fracture load and failure types of veneered anterior zirconia crowns: an analysis of normal and Weibull distribution of complete and censored data.

    Science.gov (United States)

    Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata

    2012-05-01

    The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distribution of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9 and IPS e.max Ceram, using a layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a Universal Testing Machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to the Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05). When computed with censored data, IPS e.max Ceram again showed lower mean fracture loads compared to the other groups (GC Initial ZR: μ=1039, σ=152; VITA VM9: μ=1170, σ=166). According to the Weibull distributed data, VITA VM9 showed significantly higher fracture load (s=1228, m=9.4) than those of the other groups. Both classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all ceramic systems based on failure types is essential and brings additional information regarding the susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
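    A minimal sketch of the two summaries used above, fitting a normal (μ, σ) and a two-parameter Weibull (scale s, shape m) to complete fracture-load data, follows; the simulated loads only mimic the order of magnitude reported and are not the study's measurements.

```python
# Hedged sketch: normal and Weibull summaries of (simulated) complete fracture-load data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
loads = stats.weibull_min.rvs(c=7.8, scale=1139, size=30, random_state=rng)  # ~VITA VM9 order

m_hat, loc_hat, s_hat = stats.weibull_min.fit(loads, floc=0)  # fix location at zero
mu_hat, sigma_hat = loads.mean(), loads.std(ddof=1)           # classical normal summary

print(f"Weibull: shape m = {m_hat:.1f}, scale s = {s_hat:.0f} N")
print(f"Normal:  mean = {mu_hat:.0f} N, SD = {sigma_hat:.0f} N")
```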

  1. Bivariate functional data clustering: grouping streams based on a varying coefficient model of the stream water and air temperature relationship

    Science.gov (United States)

    H. Li; X. Deng; Andy Dolloff; E. P. Smith

    2015-01-01

    A novel clustering method for bivariate functional data is proposed to group streams based on their water–air temperature relationship. A distance measure is developed for bivariate curves by using a time-varying coefficient model and a weighting scheme. This distance is also adjusted by spatial correlation of streams via the variogram. Therefore, the proposed...

  2. Distribution and migration of aftershocks of the 2010 Mw 7.4 Ogasawara Islands intraplate normal-faulting earthquake related to a fracture zone in the Pacific plate

    Science.gov (United States)

    Obana, Koichiro; Takahashi, Tsutomu; No, Tetsuo; Kaiho, Yuka; Kodaira, Shuichi; Yamashita, Mikiya; Sato, Takeshi; Nakamura, Takeshi

    2014-04-01

    We describe the aftershocks of a Mw 7.4 intraplate normal-faulting earthquake that occurred 150 km east of the Ogasawara (Bonin) Islands, Japan, on 21 December 2010. It occurred beneath the outer trench slope of the Izu-Ogasawara trench, where the Pacific plate subducts beneath the Philippine Sea plate. Aftershock observations using ocean bottom seismographs (OBSs) began soon after the earthquake and multichannel seismic reflection surveys were conducted across the aftershock area. Aftershocks were distributed in a NW-SE belt 140 km long, oblique to the N-S trench axis. They formed three subparallel lineations along a fracture zone in the Pacific plate. The OBS observations combined with data from stations on Chichi-jima and Haha-jima Islands revealed a migration of the aftershock activity. The first hour, which likely outlines the main shock rupture, was limited to an 80 km long area in the central part of the subsequent aftershock area. The first-hour activity occurred mainly around, and appears to have been influenced by, nearby large seamounts and oceanic plateau, such as the Ogasawara Plateau and the Uyeda Ridge. Over the following days, the aftershocks expanded beyond or into these seamounts and plateau. The aftershock distribution and migration suggest that crustal heterogeneities related to a fracture zone and large seamounts and oceanic plateau in the incoming Pacific plate affected the rupture of the main shock. Such preexisting structures may influence intraplate normal-faulting earthquakes in other regions of plate flexure prior to subduction.

  3. Bivariate Developmental Relations between Calculations and Word Problems: A Latent Change Approach.

    Science.gov (United States)

    Gilbert, Jennifer K; Fuchs, Lynn S

    2017-10-01

    The relation between 2 forms of mathematical cognition, calculations and word problems, was examined. Across grades 2-3, performance of 328 children (mean starting age 7.63 [SD = 0.43]) was assessed 3 times. Comparison of a priori latent change score models indicated that a dual change model, with consistently positive but slowing growth, described development in each domain better than a constant or proportional change model. The bivariate model including change models for both calculations and word problems indicated prior calculation performance and change were not predictors of subsequent word-problem change, and prior word-problem performance and change were not predictors of subsequent calculation change. Results were comparable for boys versus girls. The bivariate model, along with correlations among intercepts and slopes, suggests that calculation and word-problem development are related, but through an external set of overlapping factors. Exploratory supplemental analyses corroborate findings and provide direction for future study.

  4. Distribution of muscarinic cholinergic receptor proteins m1 to m4 in area 17 of normal and monocularly deprived rhesus monkeys.

    Science.gov (United States)

    Tigges, M; Tigges, J; Rees, H; Rye, D; Levey, A I

    1997-11-10

    Antibodies to muscarinic cholinergic receptor proteins m1 to m4 were used in striate cortex tissue of normal rhesus monkeys to determine the laminar distribution of these proteins with special attention to geniculorecipient layers. The normal patterns were compared to those of monkeys whose ocular dominance system had been altered by visual deprivation. In normal monkeys, immunoreactivity of all four proteins was localized in complex laminar patterns; m1 was densest in layers 2, 3, and 6, followed by layer 5. In contrast, m2 reactivity was densest in lower layer 4C and in 4A; the latter exhibited a honeycomb pattern. Layers 2 and 3 displayed alternating dense and light regions; this pattern was complementary to that of cytochrome oxidase (CytOx). Laminar immunoreactivity for the m3 receptor was similar to the CytOx pattern, including a honeycomb in 4A and a pattern of alternating darker and lighter patches in layers 2/3. Antibody to m4 reacted most densely with layers 1, 2, 3, and 5, layers 2 and 3 exhibited alternating dark and light regions, and layer 4A had a faint honeycomb. Layer 4C was the lightest band. The differential distribution of these four muscarinic receptor subtypes suggests distinct roles in cholinergic modulation of visual processing in the primate striate cortex. Furthermore, all four muscarinic receptors appear to be insensitive to elimination of visual input via monocular occlusion from birth, to deprivation of pattern vision in one eye during a specific time period in adulthood, and to long-term retinal injury.

  5. Can the bivariate Hurst exponent be higher than an average of the separate Hurst exponents?

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2015-01-01

    Roč. 431, č. 1 (2015), s. 124-127 ISSN 0378-4371 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords: Correlations * Power-law cross-correlations * Bivariate Hurst exponent * Spectrum coherence Subject RIV: AH - Economics Impact factor: 1.785, year: 2015 http://library.utia.cas.cz/separaty/2015/E/kristoufek-0452314.pdf

  6. The approximation of bivariate Chlodowsky-Szász-Kantorovich-Charlier-type operators

    OpenAIRE

    Agrawal, Purshottam Narain; Baxhaku, Behar; Chauhan, Ruchi

    2017-01-01

    Abstract In this paper, we introduce a bivariate Kantorovich variant of combination of Szász and Chlodowsky operators based on Charlier polynomials. Then, we study local approximation properties for these operators. Also, we estimate the approximation order in terms of Peetre’s K-functional and partial moduli of continuity. Furthermore, we introduce the associated GBS-case (Generalized Boolean Sum) of these operators and study the degree of approximation by means of the Lipschitz class of Bög...

  7. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Science.gov (United States)

    Zscheischler, Jakob; Orth, Rene; Seneviratne, Sonia I.

    2017-07-01

    Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature-precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 %) than models relying directly on temperature and precipitation as predictors (36 %). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate-crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.

  8. Comparison of plantar pressure distribution in subjects with normal and flat feet during gait DOI: 10.5007/1980-0037.2010v12n4p290

    Directory of Open Access Journals (Sweden)

    Patrik Felipe Nazario

    2010-01-01

    Full Text Available The aim of this study was to determine the possible relationship between loss of the normal medial longitudinal arch measured by the height of the navicular bone in a static situation and variables related to plantar pressure distribution measured in a dynamic situation. Eleven men (21 ± 3 years, 74 ± 10 kg and 175 ± 4 cm) participated in the study. The Novel Emed-AT System was used for the acquisition of plantar pressure distribution data (peak pressure, mean pressure, contact area, and relative load) at a sampling rate of 50 Hz. The navicular drop test proposed by Brody (1982) was used to assess the height of the navicular bone for classification of the subjects. The results were compared by the Mann-Whitney U test, with the level of significance set at p ≤ 0.05. Differences were observed between the two groups in the mid-foot region for all variables studied, with the observation of higher mean values in subjects with flat feet. There were also significant differences in contact area, relative load, peak pressure, and mean pressure between groups. The present study demonstrates the importance of paying attention to subjects with flat feet because changes in plantar pressure distribution are associated with discomfort and injuries.

  9. Finite-size effects in transcript sequencing count distribution: its power-law correction necessarily precedes downstream normalization and comparative analysis.

    Science.gov (United States)

    Wong, Wing-Cheong; Ng, Hong-Kiat; Tantoso, Erwin; Soong, Richie; Eisenhaber, Frank

    2018-02-12

    signal-to-noise ratio by 50% and the statistical/detection sensitivity by as high as 30% regardless of the downstream mapping and normalization methods. Most importantly, the power-law correction improves concordance in significant calls among different normalization methods of a data series averagely by 22%. When presented with a higher sequence depth (4 times difference), the improvement in concordance is asymmetrical (32% for the higher sequencing depth instance versus 13% for the lower instance) and demonstrates that the simple power-law correction can increase significant detection with higher sequencing depths. Finally, the correction dramatically enhances the statistical conclusions and eludes the metastasis potential of the NUGC3 cell line against AGS of our dilution analysis. The finite-size effects due to undersampling generally plagues transcript count data with reproducibility issues but can be minimized through a simple power-law correction of the count distribution. This distribution correction has direct implication on the biological interpretation of the study and the rigor of the scientific findings. This article was reviewed by Oliviero Carugo, Thomas Dandekar and Sandor Pongor.

  10. Immunohistochemical distribution of laminin-332 and collagen type IV in the basement membrane of normal horses and horses with induced laminitis.

    Science.gov (United States)

    Visser, M B; Pollitt, C C

    2011-07-01

    The basement membrane (BM) is a thin layer of extracellular matrix that regulates cell functions as well as providing support to tissues of the body. Primary components of the BM of epithelial tissues are laminin-332 (Ln-332) and collagen type IV. Equine laminitis is a disease characterized by destruction and dislocation of the hoof lamellar BM. Immunohistochemistry was used to characterize the distribution of Ln-332 and collagen type IV in the organs of normal horses and these proteins were found to be widespread. Analysis of a panel of tissue samples from horses with experimentally-induced laminitis revealed that Ln-332 and collagen type IV degradation occurs in the skin and stomach in addition to the hoof lamellae. These findings suggest that BM degradation is common to many epithelial tissues during equine laminitis and suggests a role for systemic trigger factors in this disease. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.

  11. Comparison of GLUT1, GLUT3, and GLUT4 mRNA and the subcellular distribution of their proteins in normal human muscle

    Science.gov (United States)

    Stuart, C. A.; Wen, G.; Gustafson, W. C.; Thompson, E. A.

    2000-01-01

    Basal, "insulin-independent" glucose uptake into skeletal muscle is provided by glucose transporters positioned at the plasma membrane. The relative amount of the three glucose transporters expressed in muscle has not been previously quantified. Using a combination of qualitative and quantitative ribonuclease protection assay (RPA) methods, we found in normal human muscle that GLUT1, GLUT3, and GLUT4 mRNA were expressed at 90 +/- 10, 46 +/- 4, and 156 +/- 12 copies/ng RNA, respectively. Muscle was fractionated by DNase digestion and differential sedimentation into membrane fractions enriched in plasma membranes (PM) or low-density microsomes (LDM). GLUT1 and GLUT4 proteins were distributed 57% to 67% in LDM, whereas GLUT3 protein was at least 88% in the PM-enriched fractions. These data suggest that basal glucose uptake into resting human muscle could be provided in part by each of these three isoforms.

  12. The Normal Distribution

    Indian Academy of Sciences (India)

    By (20)-(22) one obtains (23). To determine the constant c0, let a → -∞ and b → ∞, so that the left-hand side of (23) equals unity; this gives (24). The resulting double integral over the whole plane is evaluated by passing to polar coordinates, x = r cos θ, y = r sin θ (not forgetting the Jacobian of the transformation).
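    Written out, the polar-coordinate evaluation alluded to above is the classical normalization of the standard normal density (a reconstruction of the argument, not the article's own numbered equations):

```latex
\left(\int_{-\infty}^{\infty} e^{-x^{2}/2}\,dx\right)^{2}
  = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-(x^{2}+y^{2})/2}\,dx\,dy
  = \int_{0}^{2\pi}\int_{0}^{\infty} e^{-r^{2}/2}\, r\,dr\,d\theta
  = 2\pi,
\qquad\text{hence}\qquad
c_{0}\int_{-\infty}^{\infty} e^{-x^{2}/2}\,dx = 1
\;\Longrightarrow\;
c_{0} = \frac{1}{\sqrt{2\pi}} .
```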

  13. Feasibility of quantification of the distribution of blood flow in the normal human fetal circulation using CMR: a cross-sectional study.

    Science.gov (United States)

    Seed, Mike; van Amerom, Joshua F P; Yoo, Shi-Joon; Al Nafisi, Bahiyah; Grosse-Wortmann, Lars; Jaeggi, Edgar; Jansz, Michael S; Macgowan, Christopher K

    2012-11-26

    We present the first phase contrast (PC) cardiovascular magnetic resonance (CMR) measurements of the distribution of blood flow in twelve late gestation human fetuses. These were obtained using a retrospective gating technique known as metric optimised gating (MOG). A validation experiment was performed in five adult volunteers where conventional cardiac gating was compared with MOG. Linear regression and Bland Altman plots were used to compare MOG with the gold standard of conventional gating. Measurements using MOG were then made in twelve normal fetuses at a median gestational age of 37 weeks (range 30-39 weeks). Flow was measured in the major fetal vessels and indexed to the fetal weight. There was good correlation between the conventional gated and MOG measurements in the adult validation experiment (R=0.96). Mean flows in ml/min/kg with standard deviations in the major fetal vessels were as follows: combined ventricular output (CVO) 540 ± 101, main pulmonary artery (MPA) 327 ± 68, ascending aorta (AAo) 198 ± 38, superior vena cava (SVC) 147 ± 46, ductus arteriosus (DA) 220 ± 39, pulmonary blood flow (PBF) 106 ± 59, descending aorta (DAo) 273 ± 85, umbilical vein (UV) 160 ± 62, foramen ovale (FO) 107 ± 54. Results expressed as mean percentages of the CVO with standard deviations were as follows: MPA 60 ± 4, AAo 37 ± 4, SVC 28 ± 7, DA 41 ± 8, PBF 19 ± 10, DAo 50 ± 12, UV 30 ± 9, FO 21 ± 12. This study demonstrates how PC CMR with MOG is a feasible technique for measuring the distribution of the normal human fetal circulation in late pregnancy. Our preliminary results are in keeping with findings from previous experimental work in fetal lambs.

  14. Bivariate spline solution of time dependent nonlinear PDE for a population density over irregular domains.

    Science.gov (United States)

    Gutierrez, Juan B; Lai, Ming-Jun; Slavov, George

    2015-12-01

    We study a time dependent partial differential equation (PDE) which arises from classic models in ecology involving logistic growth with Allee effect by introducing a discrete weak solution. Existence, uniqueness and stability of the discrete weak solutions are discussed. We use bivariate splines to approximate the discrete weak solution of the nonlinear PDE. A computational algorithm is designed to solve this PDE. A convergence analysis of the algorithm is presented. We present some simulations of population development over some irregular domains. Finally, we discuss applications in epidemiology and other ecological problems. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. A comparison between multivariate and bivariate analysis used in marketing research

    Directory of Open Access Journals (Sweden)

    Constantin, C.

    2012-01-01

    Full Text Available This paper presents instrumental research conducted to compare the information provided by two multivariate data analyses with that given by the usual bivariate analysis. The outcomes of the research reveal that sometimes the multivariate methods use more information from a certain variable, but sometimes they use only the part of the information considered most important for certain associations. For this reason, a researcher should use both categories of data analysis in order to obtain fully useful information.

  16. A COMPARISON OF SOME ROBUST BIVARIATE CONTROL CHARTS FOR INDIVIDUAL OBSERVATIONS

    Directory of Open Access Journals (Sweden)

    Moustafa Omar Ahmed Abu - Shawiesh

    2014-06-01

    Full Text Available This paper proposes and considers some bivariate control charts to monitor individual observations from a statistical process control setting. Usual control charts that rely on the mean and variance-covariance estimators are sensitive to outliers. We consider the following robust alternatives to the classical Hotelling's T2: T2MedMAD, T2MCD and T2MVE. A simulation study has been conducted to compare the performance of these control charts, and two real-life data sets are analyzed to illustrate the application of these robust alternatives.
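    A minimal sketch contrasting the classical Hotelling's T2 statistic with a robust variant that plugs in the minimum covariance determinant (MCD) estimator follows; the data, the contamination and the control limit are illustrative, and the T2MedMAD and T2MVE variants are not implemented here.

```python
# Hedged sketch: classical vs MCD-based Hotelling's T2 for individual bivariate observations.
import numpy as np
from scipy import stats
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(6)
X = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=50)
X[:3] += 6                                   # a few outliers that inflate the classical estimates

def t2(X, center, cov):
    """Squared Mahalanobis distance of each row from the given center/covariance."""
    d = X - center
    return np.einsum("ij,jk,ik->i", d, np.linalg.inv(cov), d)

t2_classical = t2(X, X.mean(axis=0), np.cov(X, rowvar=False))
mcd = MinCovDet(random_state=0).fit(X)
t2_robust = t2(X, mcd.location_, mcd.covariance_)

ucl = stats.chi2.ppf(0.9973, df=2)           # approximate limit for individual observations
print("classical signals:", int(np.sum(t2_classical > ucl)),
      "robust signals:", int(np.sum(t2_robust > ucl)))
```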

  17. The approximation of bivariate Chlodowsky-Szász-Kantorovich-Charlier-type operators

    Directory of Open Access Journals (Sweden)

    Purshottam Narain Agrawal

    2017-08-01

    Full Text Available Abstract In this paper, we introduce a bivariate Kantorovich variant of combination of Szász and Chlodowsky operators based on Charlier polynomials. Then, we study local approximation properties for these operators. Also, we estimate the approximation order in terms of Peetre’s K-functional and partial moduli of continuity. Furthermore, we introduce the associated GBS-case (Generalized Boolean Sum of these operators and study the degree of approximation by means of the Lipschitz class of Bögel continuous functions. Finally, we present some graphical examples to illustrate the rate of convergence of the operators under consideration.

  18. The approximation of bivariate Chlodowsky-Szász-Kantorovich-Charlier-type operators.

    Science.gov (United States)

    Agrawal, Purshottam Narain; Baxhaku, Behar; Chauhan, Ruchi

    2017-01-01

    In this paper, we introduce a bivariate Kantorovich variant of combination of Szász and Chlodowsky operators based on Charlier polynomials. Then, we study local approximation properties for these operators. Also, we estimate the approximation order in terms of Peetre's K-functional and partial moduli of continuity. Furthermore, we introduce the associated GBS-case (Generalized Boolean Sum) of these operators and study the degree of approximation by means of the Lipschitz class of Bögel continuous functions. Finally, we present some graphical examples to illustrate the rate of convergence of the operators under consideration.

  19. Bivariate Drought Analysis Using Streamflow Reconstruction with Tree Ring Indices in the Sacramento Basin, California, USA

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2016-03-01

    Full Text Available Long-term streamflow data are vital for analysis of hydrological droughts. Using an artificial neural network (ANN) model and nine tree-ring indices, this study reconstructed the annual streamflow of the Sacramento River for the period from 1560 to 1871. Using the reconstructed streamflow data, the copula method was used for bivariate drought analysis, deriving a hydrological drought return period plot for the Sacramento River basin. Results showed strong correlation among drought characteristics, and the drought with a 20-year return period (17.2 million acre-feet (MAF) per year) in the Sacramento River basin could be considered a critical level of drought for water shortages.
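    A minimal sketch of how a bivariate return period is obtained from a copula follows. The Gumbel copula, its parameter, the marginal non-exceedance probabilities and the mean interarrival time are all assumptions for illustration, not the study's fitted values.

```python
# Hedged sketch: "AND" and "OR" bivariate return periods from a Gumbel-Hougaard copula.
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula CDF C(u, v) for theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

theta = 2.0        # dependence between the two drought characteristics (assumed)
u, v = 0.95, 0.95  # marginal non-exceedance probabilities (assumed)
mu = 1.0           # mean interarrival time between drought events, years (assumed)

p_and = 1.0 - u - v + gumbel_copula(u, v, theta)   # P(U > u and V > v)
p_or = 1.0 - gumbel_copula(u, v, theta)            # P(U > u or V > v)
print(f"AND return period ~ {mu / p_and:.0f} yr, OR return period ~ {mu / p_or:.0f} yr")
```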

  20. Estimating the biomass of unevenly distributed aquatic vegetation in a lake using the normalized water-adjusted vegetation index and scale transformation method.

    Science.gov (United States)

    Gao, Yongnian; Gao, Junfeng; Wang, Jing; Wang, Shuangshuang; Li, Qin; Zhai, Shuhua; Zhou, Ya

    2017-12-01

    Satellite remote sensing is advantageous for the mapping and monitoring of aquatic vegetation biomass at large spatial scales. We proposed a scale transformation (CT) method of converting the field sampling-site biomass from the quadrat to the pixel scale and a new normalized water-adjusted vegetation index (NWAVI) based on remotely sensed imagery for the biomass estimation of aquatic vegetation (excluding emergent vegetation). We used a modeling approach based on the proposed CT method and NWAVI as well as statistical analyses including linear, quadratic, logarithmic, cubic, exponential, inverse and power regression to estimate the aquatic vegetation biomass, and we evaluated the performance of the biomass estimation. We mapped the spatial distribution and temporal change of the aquatic vegetation biomass using a geographic information system in a test lake in different months. The exponential regression models based on CT and the NWAVI had the optimal adjusted R2, F and Sig. values in both May and August 2013. The scatter plots of the observed versus the predicted biomass showed that most of the validated field sites were near the 1:1 line. The RMSE, ARE and RE values were small. The spatial distribution and change of the aquatic vegetation biomass in the study area showed clear variability. Among the NWAVI-based and other vegetation index-based models, the CT and NWAVI-based models had the largest adjusted R2 and F values and the smallest ARE values in both tests. The proposed modeling scheme is effective for the biomass estimation of aquatic vegetation in lakes. This indicates that the proposed method can provide an accurate spatial distribution map of aquatic vegetation biomass for lake ecological management. More accurate biomass maps of aquatic vegetation are essential for implementing conservation policy and for reducing uncertainties in our understanding of the lake carbon cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Bivariate tensor product [Formula: see text]-analogue of Kantorovich-type Bernstein-Stancu-Schurer operators.

    Science.gov (United States)

    Cai, Qing-Bo; Xu, Xiao-Wei; Zhou, Guorong

    2017-01-01

    In this paper, we construct a bivariate tensor product generalization of Kantorovich-type Bernstein-Stancu-Schurer operators based on the concept of [Formula: see text]-integers. We obtain moments and central moments of these operators, give the rate of convergence by using the complete modulus of continuity for the bivariate case and estimate a convergence theorem for the Lipschitz continuous functions. We also give some graphs and numerical examples to illustrate the convergence properties of these operators to certain functions.

  2. A bivariate space-time downscaler under space and time misalignment.

    Science.gov (United States)

    Berrocal, Veronica J; Gelfand, Alan E; Holland, David M

    2010-12-01

    Ozone and particulate matter PM(2.5) are co-pollutants that have long been associated with increased public health risks. Information on concentration levels for both pollutants comes from two sources: monitoring sites and output from complex numerical models that produce concentration surfaces over large spatial regions. In this paper, we offer a fully model-based approach for fusing these two sources of information for the pair of co-pollutants which is computationally feasible over large spatial regions and long periods of time. Due to the association between concentration levels of the two environmental contaminants, it is expected that information regarding one will help to improve prediction of the other. Misalignment is an obvious issue since the monitoring networks for the two contaminants only partly intersect and because the collection rate for PM(2.5) is typically less frequent than that for ozone. Extending previous work in Berrocal et al. (2009), we introduce a bivariate downscaler that provides a flexible class of bivariate space-time assimilation models. We discuss computational issues for model fitting and analyze a dataset for ozone and PM(2.5) for the ozone season during year 2002. We show a modest improvement in predictive performance, not surprising in a setting where we can anticipate only a small gain.

  3. Semiparametric probit models with univariate and bivariate current-status data.

    Science.gov (United States)

    Liu, Hao; Qin, Jing

    2018-03-01

    Multivariate current-status data are frequently encountered in biomedical and public health studies. Semiparametric regression models have been extensively studied for univariate current-status data, but most existing estimation procedures are computationally intensive, involving either penalization or smoothing techniques. It becomes more challenging for the analysis of multivariate current-status data. In this article, we study the maximum likelihood estimations for univariate and bivariate current-status data under the semiparametric probit regression models. We present a simple computational procedure combining the expectation-maximization algorithm with the pool-adjacent-violators algorithm for solving the monotone constraint on the baseline function. Asymptotic properties of the maximum likelihood estimators are investigated, including the calculation of the explicit information bound for univariate current-status data, as well as the asymptotic consistency and convergence rate for bivariate current-status data. Extensive simulation studies showed that the proposed computational procedures performed well under small or moderate sample sizes. We demonstrate the estimation procedure with two real data examples in the areas of diabetic and HIV research. © 2017, The International Biometric Society.
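    A minimal sketch of the monotone-baseline step mentioned above follows: the pool-adjacent-violators algorithm (here via scikit-learn's isotonic regression) projects working estimates onto a non-decreasing function. The surrounding EM machinery of the paper is not reproduced, and the "working responses" are simulated stand-ins.

```python
# Hedged sketch: PAVA/isotonic-regression projection enforcing a monotone baseline estimate.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(7)
exam_times = np.sort(rng.uniform(0, 10, size=200))
true_baseline = 0.1 * exam_times                        # hypothetical increasing baseline
working = true_baseline + rng.normal(0, 0.3, size=200)  # noisy working responses (assumed)

pava = IsotonicRegression(increasing=True)
baseline_hat = pava.fit_transform(exam_times, working)  # monotone projection via PAVA
print("estimate is non-decreasing:", bool(np.all(np.diff(baseline_hat) >= -1e-12)))
```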

  4. Inheritance of dermatoglyphic traits in twins: univariate and bivariate variance decomposition analysis.

    Science.gov (United States)

    Karmakar, Bibha; Malkin, Ida; Kobyliansky, Eugene

    2012-01-01

    Dermatoglyphic traits in a sample of twins were analyzed to estimate the resemblance between MZ and DZ twins and to evaluate the mode of inheritance using maximum likelihood-based variance decomposition analysis. The additive genetic variance component was significant in both sexes for four traits: PII, AB_RC, RC_HB and ATD_L. AB_RC and RC_HB had significant sex differences in means, whereas PII and ATD_L did not. The bivariate variance decomposition analysis revealed that PII and RC_HB have a significant correlation in both the genetic and residual components, and a significant correlation in the additive genetic variance between AB_RC and ATD_L was observed. The same analysis restricted to the female subsample for the three traits RBL, RBR and AB_DIS showed that the additive genetic component for RBR was significant and the sibling component for AB_DIS was not, while the other components could not be constrained to zero. The three components (additive, sibling and residual) were significantly correlated between each pair of traits, as revealed by the bivariate variance decomposition analysis.

  5. Xp21 contiguous gene syndromes: Deletion quantitation with bivariate flow karyotyping allows mapping of patient breakpoints

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, E.R.B.; Towbin, J.A. (Baylor College of Medicine, Houston, TX (United States)); Engh, G. van den; Trask, B.J. (Lawrence Livermore National Lab., CA (United States))

    1992-12-01

    Bivariate flow karyotyping was used to estimate the deletion sizes for a series of patients with Xp21 contiguous gene syndromes. The deletion estimates were used to develop an approximate scale for the genomic map in Xp21. The bivariate flow karyotype results were compared with clinical and molecular genetic information on the extent of the patients' deletions, and these various types of data were consistent. The resulting map spans >15 Mb, from the telomeric interval between DXS41 (99-6) and DXS68 (1-4) to a position centromeric to the ornithine transcarbamylase locus. The deletion sizing was considered to be accurate to ±1 Mb. The map provides information on the relative localization of genes and markers within this region. For example, the map suggests that the adrenal hypoplasia congenita and glycerol kinase genes are physically close to each other, are within 1-2 Mb of the telomeric end of the Duchenne muscular dystrophy (DMD) gene, and are nearer to the DMD locus than to the more distal marker DXS28 (C7). Information of this type is useful in developing genomic strategies for positional cloning in Xp21. These investigations demonstrate that the DNA from patients with Xp21 contiguous gene syndromes can be valuable reagents, not only for ordering loci and markers but also for providing an approximate scale to the map of the Xp21 region surrounding DMD. 44 refs., 3 figs.

  6. Bivariate Random Effects Meta-analysis of Diagnostic Studies Using Generalized Linear Mixed Models

    Science.gov (United States)

    GUO, HONGFEI; ZHOU, YIJIE

    2011-01-01

    Bivariate random effects models are currently one of the main methods recommended to synthesize diagnostic test accuracy studies. However, only the logit transformation of sensitivity and specificity has been previously considered in the literature. In this paper, we consider a bivariate generalized linear mixed model to jointly model the sensitivities and specificities, and discuss the estimation of the summary receiver operating characteristic (ROC) curve and the area under the ROC curve (AUC). As special cases of this model, we discuss the commonly used logit, probit and complementary log-log transformations. To evaluate the impact of misspecification of the link functions on the estimation, we present two case studies and a set of simulation studies. Our study suggests that point estimation of the median sensitivity and specificity, and of the AUC, is relatively robust to misspecification of the link functions. However, the misspecification of link functions has a noticeable impact on the standard error estimation and the 95% confidence interval coverage, which emphasizes the importance of choosing an appropriate link function to make statistical inference. PMID:19959794

  7. Bivariate pointing movements on large touch screens: investigating the validity of a refined Fitts' Law.

    Science.gov (United States)

    Bützler, Jennifer; Vetter, Sebastian; Jochems, Nicole; Schlick, Christopher M

    2012-01-01

    On the basis of three empirical studies, Fitts' Law was refined for bivariate pointing tasks on large touch screens. In the first study, different target width parameters were investigated. The second study considered the effect of the motion angle. Based on the results of the two studies, a refined model for movement time in human-computer interaction was formulated. A third study, which is described here in detail, concerns the validation of the refined model. For the validation study, 20 subjects had to execute a bivariate pointing task on a large touch screen. In the experimental task, 250 rectangular target objects were displayed at randomly chosen positions on the screen, covering a broad range of ID values (ID = [1.01; 4.88]). Compared to existing refinements of Fitts' Law, the new model shows the highest predictive validity. A promising field of application of the model is the ergonomic design and evaluation of project management software. By using the refined model, software designers can calculate a priori the appropriate angular position and size of buttons, menus, or icons.
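
    For orientation, the classical Shannon formulation that such refinements build on computes movement time as MT = a + b*log2(D/W + 1). The sketch below evaluates this baseline form for a rectangular target; the effective-width rule and the coefficients are illustrative assumptions and are not the refined bivariate model reported in the paper.

    import math

    def movement_time(a, b, distance, width, height=None):
        """Classical Shannon form MT = a + b * ID, with ID = log2(D/W + 1).
        For a rectangular (bivariate) target an effective width is used here
        as min(width, height) -- an illustrative choice, not the paper's model."""
        w_eff = width if height is None else min(width, height)
        index_of_difficulty = math.log2(distance / w_eff + 1.0)
        return a + b * index_of_difficulty

    # hypothetical regression coefficients (seconds): intercept a, slope b
    print(movement_time(a=0.2, b=0.15, distance=300, width=40, height=25))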

  8. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and has a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
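
    As a point of reference for what the tool automates, the frequency ratio of a factor class is usually defined as the share of hazard occurrences falling in the class divided by the share of the study area occupied by the class. A minimal Python sketch of that calculation on raster arrays is given below; the array names and random test data are hypothetical, and this is not code from the BSM tool itself.

    import numpy as np
    import pandas as pd

    def frequency_ratio(class_raster, hazard_mask):
        """FR per class = share of hazard cells in the class divided by the
        share of all cells in the class (usual bivariate-statistics definition)."""
        df = pd.DataFrame({"cls": class_raster.ravel(),
                           "hazard": hazard_mask.ravel().astype(int)})
        by_class = df.groupby("cls").agg(cells=("hazard", "size"),
                                         hazard_cells=("hazard", "sum"))
        pct_hazard = by_class["hazard_cells"] / by_class["hazard_cells"].sum()
        pct_area = by_class["cells"] / by_class["cells"].sum()
        return pct_hazard / pct_area

    rng = np.random.default_rng(0)
    classes = rng.integers(1, 4, size=(100, 100))     # e.g. a reclassified slope factor
    hazard = rng.random((100, 100)) < 0.05            # e.g. a hazard inventory grid
    print(frequency_ratio(classes, hazard))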

  9. Comparison of plantar-pressure distribution and clinical impact of anatomically shaped sandals, off-the-shelf sandals and normal walking shoes in patients with central metatarsalgia.

    Science.gov (United States)

    Schuh, Reinhard; Seegmueller, Jessica; Wanivenhaus, Axel H; Windhager, Reinhard; Sabeti-Aschraf, Manuel

    2014-11-01

    Metatarsalgia is one of the most frequent pathological conditions of the foot and ankle. Numerous studies exist on plantar-pressure characteristics in various types of shoes. However, to the best of our knowledge, plantar-pressure distribution and clinical effects in sandals have not yet been the focus of any study. Twenty-two patients (42 feet) with central metatarsalgia were assessed. Time and distance until symptom occurrence in terms of metatarsalgia were evaluated for normal walking shoes (WS), standard sandals (SS) and anatomically shaped, custom-made sandals with a metatarsal pad (AS). Pain intensity was measured with the visual analogue scale (VAS), and clinical assessment was performed with the American Orthopaedic Foot and Ankle Society (AOFAS) score for the respective shoes. Additionally, plantar-pressure distribution was assessed with the emed-at platform (Novel GmbH) and the F-scan insole system (Tekscan Inc.), respectively. The average walking distance until symptoms occurred was 1,894 m [standard deviation (SD) 1,196 m] for WS, 1,812 m (SD 1,079 m) for SS and 3,407 m (SD 1,817 m) for AS (p < 0.01). Mean duration until occurrence of symptoms was 22.3 min (SD 14.9 min) for the WS, 21.8 min (SD 13.4 min) for the SS and 42.0 min (SD 23.0 min) for the AS (p < 0.01). Plantar-pressure parameters were significantly reduced in the forefoot region for the AS compared with the other walking devices. The results of this study reveal that a modified standard sandal can significantly influence the onset of metatarsalgia, as increased walking time and distance in these patients was observed.

  10. Normal distribution of urinary polyphenol excretion among Egyptian males 7-14 years old and changes following nutritional intervention with tomato juice (Lycopersicon esculentum).

    Science.gov (United States)

    Hussein, Laila; Medina, Alexander; Barrionnevo, Ana; Lammuela-Raventos, Rosa M; Andres-Lacueva, Cristina

    2009-06-01

    Urinary flavonoids are considered a reliable biomarker for the intake of polyphenol-rich foods. The objectives were to assess the normal distribution of urinary polyphenol [PP] excretion among healthy male children and adolescents on a typical Egyptian diet, and to follow up the impact of a nutritional intervention with tomato juice on the urinary excretion of [PP]. Forty-nine male subjects 7-14 years old collected a 24-h urine sample and completed a dietary record during a 7-day period. A daily serving of 230 g of fresh tomato juice was given for 18 days to a subgroup. Total urinary [PP] excretions were measured before and after termination of the intervention program. The total urinary [PP] was analyzed, after a clean-up solid-phase extraction step, with the Folin-Ciocalteu reagent in 96-well microplates. The results were expressed as gallic acid equivalents (GAE). The urinary [PP] excretion averaged 48.6±5.5 mg GAE/24 h, equivalent to 89.5±8.4 mg GAE/g creatinine. The mean urinary [PP] excretion increased significantly at the end of the intervention with tomato juice (287.4±64.3 mg GAE/g creatinine) compared with the respective mean baseline level (94.5±8.92 mg GAE/g creatinine). Clinical laboratory reference limits for urinary polyphenols are presented for Egyptian male children and adolescents. Measuring urinary polyphenol excretion proved to be a good biomarker for dietary polyphenol intake, and the results demonstrated that tomato [PP] was highly bioavailable in the human body.

  11. Comparisons of the pharmacokinetic and tissue distribution profiles of withanolide B after intragastric administration of the effective part of Datura metel L. in normal and psoriasis guinea pigs.

    Science.gov (United States)

    Yang, Lianrong; Meng, Xin; Kuang, Haixue

    2018-04-15

    A simple, highly sensitive ultra-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (UPLC-ESI-MS/MS) method has been developed to quantify withanolide B, with obakunone as the internal standard (IS), in guinea pig plasma and tissues, and to compare the pharmacokinetics and tissue distribution of withanolide B in normal and psoriasis guinea pigs. After mixing with the IS, plasma and tissue samples were pretreated by protein precipitation with methanol. Chromatographic separation was performed on a C18 column using aqueous (0.1% formic acid) and acetonitrile (0.1% formic acid) solutions at 0.4 mL/min as the mobile phase. A gradient program was used (0-4.0 min, 2-98% B; 4.0-4.5 min, 98-2% B; and 4.5-5 min, 2% B). Detection was performed on a 4000 QTRAP UPLC-ESI-MS/MS system from AB Sciex in the multiple reaction monitoring (MRM) mode. Withanolide B and obakunone (IS) were monitored under positive ionization conditions. The optimized mass transition ion pairs (m/z) for quantitation were 455.1/109.4 for withanolide B and 455.1/161.1 for obakunone. Copyright © 2018. Published by Elsevier B.V.

  12. A Bivariate Chebyshev Spectral Collocation Quasilinearization Method for Nonlinear Evolution Parabolic Equations

    Directory of Open Access Journals (Sweden)

    S. S. Motsa

    2014-01-01

    Full Text Available This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, the highly nonlinear modified KdV equation, Fisher's equation, the Burgers-Fisher equation, the Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from the literature to confirm the accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables are presented to show the order of accuracy of the method, convergence graphs to verify its convergence, and error graphs to show the excellent agreement between the results from this study and the known results from the literature.
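
    The spatial discretization in methods of this kind rests on Chebyshev differentiation matrices evaluated at Gauss-Lobatto points. The short Python sketch below builds the standard matrix (the classical textbook construction, not the authors' code) and checks it on a smooth function.

    import numpy as np

    def cheb(n):
        """Chebyshev differentiation matrix D and Gauss-Lobatto points x
        on [-1, 1] (standard construction; returns n+1 points)."""
        if n == 0:
            return np.zeros((1, 1)), np.array([1.0])
        x = np.cos(np.pi * np.arange(n + 1) / n)
        c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
        X = np.tile(x, (n + 1, 1)).T
        dX = X - X.T
        D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))
        D -= np.diag(D.sum(axis=1))          # diagonal via negative row sums
        return D, x

    D, x = cheb(16)
    print(np.max(np.abs(D @ np.sin(x) - np.cos(x))))   # spectrally small error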

  13. A composite likelihood method for bivariate meta-analysis in diagnostic systematic reviews.

    Science.gov (United States)

    Chen, Yong; Liu, Yulun; Ning, Jing; Nie, Lei; Zhu, Hongjian; Chu, Haitao

    2017-04-01

    Diagnostic systematic review is a vital step in the evaluation of diagnostic technologies. In many applications, it involves pooling pairs of sensitivity and specificity of a dichotomized diagnostic test from multiple studies. We propose a composite likelihood (CL) method for bivariate meta-analysis in diagnostic systematic reviews. This method provides an alternative way to make inference on diagnostic measures such as sensitivity, specificity, likelihood ratios, and diagnostic odds ratio. Its main advantages over the standard likelihood method are the avoidance of the nonconvergence problem, which is nontrivial when the number of studies is relatively small, the computational simplicity, and some robustness to model misspecifications. Simulation studies show that the CL method maintains high relative efficiency compared to that of the standard likelihood method. We illustrate our method in a diagnostic review of the performance of contemporary diagnostic imaging technologies for detecting metastases in patients with melanoma.

  14. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  15. SU-C-BRC-04: Efficient Dose Calculation Algorithm for FFF IMRT with a Simplified Bivariate Gaussian Source Model

    Energy Technology Data Exchange (ETDEWEB)

    Li, F; Park, J; Barraclough, B; Lu, B; Li, J; Liu, C; Yan, G [University Florida, Gainesville, FL (United States)

    2016-06-15

    Purpose: To develop an efficient and accurate independent dose calculation algorithm with a simplified analytical source model for the quality assurance and safe delivery of Flattening Filter Free (FFF)-IMRT on an Elekta Versa HD. Methods: The source model consisted of a point source and a 2D bivariate Gaussian source, respectively modeling the primary photons and the combined effect of head scatter, monitor chamber backscatter and collimator exchange effect. The in-air fluence was firstly calculated by back-projecting the edges of beam defining devices onto the source plane and integrating the visible source distribution. The effect of the rounded MLC leaf end, tongue-and-groove and interleaf transmission was taken into account in the back-projection. The in-air fluence was then modified with a fourth degree polynomial modeling the cone-shaped dose distribution of FFF beams. Planar dose distribution was obtained by convolving the in-air fluence with a dose deposition kernel (DDK) consisting of the sum of three 2D Gaussian functions. The parameters of the source model and the DDK were commissioned using measured in-air output factors (Sc) and cross beam profiles, respectively. A novel method was used to eliminate the volume averaging effect of ion chambers in determining the DDK. Planar dose distributions of five head-and-neck FFF-IMRT plans were calculated and compared against measurements performed with a 2D diode array (MapCHECK™) to validate the accuracy of the algorithm. Results: The proposed source model predicted Sc for both 6MV and 10MV with an accuracy better than 0.1%. With a stringent gamma criterion (2%/2mm/local difference), the passing rate of the FFF-IMRT dose calculation was 97.2±2.6%. Conclusion: The removal of the flattening filter represents a simplification of the head structure which allows the use of a simpler source model for very accurate dose calculation. The proposed algorithm offers an effective way to ensure the safe delivery of FFF-IMRT.
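
    The structure of the calculation described above (an aperture fluence from a point source plus an extra-focal bivariate Gaussian source, modulated by a polynomial cone-shape factor and convolved with a sum-of-Gaussians dose deposition kernel) can be sketched in a few lines of Python. All weights, widths and polynomial coefficients below are placeholders, not the commissioned values from the paper.

    import numpy as np
    from scipy.signal import fftconvolve

    def gauss2d(x, y, sigma):
        return np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)

    # 2D grid (cm) at the isocentre plane
    ax = np.linspace(-10, 10, 201)
    X, Y = np.meshgrid(ax, ax)

    # open-field aperture (10 cm x 10 cm); a real model back-projects the MLC/jaw edges
    aperture = ((np.abs(X) <= 5) & (np.abs(Y) <= 5)).astype(float)

    # primary (point) source plus extra-focal bivariate Gaussian source -- placeholder weights
    fluence = 0.9 * aperture + 0.1 * fftconvolve(aperture, gauss2d(X, Y, 1.5), mode="same")

    # cone-shaped FFF profile as an even polynomial of off-axis distance (placeholder coefficients)
    r = np.hypot(X, Y)
    fluence *= 1.0 - 3e-3 * r**2 + 2e-6 * r**4

    # dose deposition kernel as a sum of three 2D Gaussians (placeholder weights and widths)
    ddk = sum(w * gauss2d(X, Y, s) for w, s in [(0.7, 0.3), (0.2, 1.0), (0.1, 3.0)])
    planar_dose = fftconvolve(fluence, ddk, mode="same")
    print(planar_dose.shape, planar_dose.max())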

  16. Bivariate tensor product $(p,q)$-analogue of Kantorovich-type Bernstein-Stancu-Schurer operators

    Directory of Open Access Journals (Sweden)

    Qing-Bo Cai

    2017-11-01

    Full Text Available In this paper, we construct a bivariate tensor product generalization of Kantorovich-type Bernstein-Stancu-Schurer operators based on the concept of $(p,q)$-integers. We obtain moments and central moments of these operators, give the rate of convergence by using the complete modulus of continuity for the bivariate case, and establish a convergence theorem for Lipschitz continuous functions. We also give some graphs and numerical examples to illustrate the convergence properties of these operators for certain functions.

  17. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potential of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and the Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor zone of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. The validation of the outcomes showed that the area under the curve (AUC) of the SI and DST techniques is 81.23% and 79.41%, respectively, which indicates that the SI method has a slightly better performance than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and for scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, it can be concluded that these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
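
    The statistical index weight used above is commonly defined per conditioning-factor class as the natural logarithm of the ratio of the spring density within the class to the average spring density over the whole area. The sketch below computes it on raster arrays; the variable names and random test grids are hypothetical and serve only to illustrate the formula.

    import numpy as np

    def statistical_index(class_raster, spring_mask):
        """SI per class = ln( (springs in class / cells in class)
                              / (all springs / all cells) )."""
        total_density = spring_mask.sum() / spring_mask.size
        si = {}
        for c in np.unique(class_raster):
            in_class = class_raster == c
            class_density = spring_mask[in_class].sum() / in_class.sum()
            si[c] = np.log(class_density / total_density) if class_density > 0 else np.nan
        return si

    rng = np.random.default_rng(1)
    slope_class = rng.integers(1, 5, size=(200, 200))   # e.g. a reclassified slope factor
    springs = rng.random((200, 200)) < 0.01             # spring occurrence grid
    print(statistical_index(slope_class, springs))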

  18. Perceived social support and academic achievement: cross-lagged panel and bivariate growth curve analyses.

    Science.gov (United States)

    Mackinnon, Sean P

    2012-04-01

    As students transition to post-secondary education, they experience considerable stress and declines in academic performance. Perceived social support is thought to improve academic achievement by reducing stress. Longitudinal designs with three or more waves are needed in this area because they permit stronger causal inferences and help disentangle the direction of relationships. This study uses a cross-lagged panel and a bivariate growth curve analysis with a three-wave longitudinal design. Participants include 10,445 students (56% female; 12.6% born outside of Canada) transitioning to post-secondary education from ages 15-19. Self-report measures of academic achievement and a generalized measure of perceived social support were used. An increase in average relative standing in academic achievement predicted an increase in average relative standing on perceived social support 2 years later, but the reverse was not true. High levels of perceived social support at age 15 did not protect against declines in academic achievement over time. In sum, perceived social support appears to have no bearing on adolescents' future academic performance, despite commonly held assumptions of its importance.

  19. Semiparametric bivariate zero-inflated Poisson models with application to studies of abundance for multiple species

    Science.gov (United States)

    Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.

    2012-01-01

    Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.

  20. Modeling both the number of paucibacillary and multibacillary leprosy patients by using bivariate Poisson regression

    Science.gov (United States)

    Winahju, W. S.; Mukarromah, A.; Putri, S.

    2015-03-01

    Leprosy is a chronic infectious disease caused by the leprosy bacterium (Mycobacterium leprae). Leprosy has become an important issue in Indonesia because its morbidity is quite high. Based on WHO data from 2014, in 2012 Indonesia had the highest number of new leprosy patients after India and Brazil, with a contribution of 18,994 people (8.7% of the world total). This number automatically places Indonesia as the country with the highest leprosy morbidity among ASEAN countries. The province that contributes most to the number of leprosy patients in Indonesia is East Java. There are two kinds of leprosy: paucibacillary and multibacillary. The morbidity of multibacillary leprosy is higher than that of paucibacillary leprosy. This paper discusses modeling both the number of multibacillary and paucibacillary leprosy patients as response variables. These responses are count variables, so the modeling is conducted by using the bivariate Poisson regression method. The experimental units are located in East Java, and the predictors involved are environment, demography, and poverty. The model uses data from 2012, and the result indicates that all predictors have a significant influence.
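
    A common way to build the bivariate Poisson model referred to above is the trivariate-reduction construction, in which a shared Poisson component induces positive correlation between the two counts and each rate is a log-linear function of the predictors. The simulation sketch below illustrates this construction with placeholder coefficients; it is not the fitted East Java model.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 500
    poverty = rng.normal(size=n)                 # standardized predictor (illustrative)

    # log-linear rates for the two counts and a shared component (placeholder coefficients)
    lam1 = np.exp(0.2 + 0.3 * poverty)           # paucibacillary rate
    lam2 = np.exp(0.8 + 0.5 * poverty)           # multibacillary rate
    lam0 = 0.3                                   # shared rate -> positive correlation

    y0 = rng.poisson(lam0, size=n)
    y1 = rng.poisson(lam1) + y0                  # trivariate reduction: X1 = Y1 + Y0
    y2 = rng.poisson(lam2) + y0                  #                       X2 = Y2 + Y0
    print(np.corrcoef(y1, y2)[0, 1])             # positive by construction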

  1. Improved deadzone modeling for bivariate wavelet shrinkage-based image denoising

    Science.gov (United States)

    DelMarco, Stephen

    2016-05-01

    Modern image processing performed on board low Size, Weight, and Power (SWaP) platforms must provide high performance while simultaneously reducing memory footprint, power consumption, and computational complexity. Image preprocessing, along with downstream image exploitation algorithms such as object detection and recognition, and georegistration, places a heavy burden on power and processing resources. Image preprocessing often includes image denoising to improve data quality for downstream exploitation algorithms. High-performance image denoising is typically performed in the wavelet domain, where noise generally spreads out and the wavelet transform compactly captures high information-bearing image characteristics. In this paper, we improve the modeling fidelity of a previously developed, computationally efficient wavelet-based denoising algorithm. The modeling improvements enhance denoising performance without significantly increasing computational cost, thus making the approach suitable for low-SWaP platforms. Specifically, this paper presents modeling improvements to the Sendur-Selesnick model (SSM), which implements a bivariate wavelet shrinkage denoising algorithm that exploits the interscale dependency between wavelet coefficients. We formulate optimization problems for the parameters controlling deadzone size, which leads to improved denoising performance. Two formulations are provided: one with a simple, closed-form solution, which we use for numerical result generation, and a second, integral-equation formulation involving elliptic integrals. We generate image denoising performance results over different image sets drawn from public domain imagery and investigate the effect of wavelet filter tap length on denoising performance. We demonstrate a denoising performance improvement when using the enhanced modeling over the performance obtained with the baseline SSM model.
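
    The baseline Sendur-Selesnick rule that the paper starts from shrinks a wavelet coefficient using its parent in the next coarser scale, with a deadzone whose size is sqrt(3) times the noise variance divided by the local signal standard deviation; this deadzone term is the quantity the paper then optimizes. A sketch of the baseline rule (not the improved deadzone model) is shown below with toy data.

    import numpy as np

    def bivariate_shrink(w_child, w_parent, noise_var, signal_sigma):
        """Sendur-Selesnick bivariate MAP shrinkage of a wavelet coefficient
        given its parent coefficient in the next coarser scale.
        The term sqrt(3)*noise_var/signal_sigma sets the deadzone size
        (used here in its baseline form, not the paper's optimized version)."""
        magnitude = np.sqrt(w_child**2 + w_parent**2)
        gain = np.maximum(magnitude - np.sqrt(3.0) * noise_var / signal_sigma, 0.0)
        return w_child * gain / np.maximum(magnitude, 1e-12)

    # toy usage with hypothetical coefficient arrays
    rng = np.random.default_rng(0)
    child = rng.normal(0, 1.0, size=1000)
    parent = rng.normal(0, 1.0, size=1000)
    denoised = bivariate_shrink(child, parent, noise_var=0.25, signal_sigma=1.0)
    print(denoised[:5])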

  2. Multiscale Fluctuation Features of the Dynamic Correlation between Bivariate Time Series

    Directory of Open Access Journals (Sweden)

    Meihui Jiang

    2016-01-01

    Full Text Available The fluctuation of the dynamic correlation between bivariate time series has some special features in the time-frequency domain. In order to study these fluctuation features, this paper builds dynamic correlation network models using two kinds of time series as sample data. After studying the dynamic correlation networks at different time scales, we found that the correlation between time series is a dynamic process. The correlation is strong and stable in the long term, but it is weak and unstable in the short and medium term. There are key correlation modes which can effectively indicate the trend of the correlation. The transmission characteristics of the correlation modes show that it is easier to judge the trend of the fluctuation of the correlation between time series from the short term to the long term. The evolution of the media capability of the correlation modes shows that the transmission media in the long term have higher value for predicting the trend of the correlation. This work not only proposes a new perspective for analyzing the correlation between time series but also provides important information for investors and decision makers.

  3. Neural Systems with Numerically Matched Input-Output Statistic: Isotonic Bivariate Statistical Modeling

    Directory of Open Access Journals (Sweden)

    Simone Fiori

    2007-07-01

    Full Text Available Bivariate statistical modeling from incomplete data is a useful statistical tool that allows one to discover the model underlying two data sets when the data in the two sets correspond neither in size nor in ordering. Such a situation may occur when the sizes of the two data sets do not match (i.e., there are “holes” in the data) or when the data sets have been acquired independently. Also, statistical modeling is useful when the amount of available data is enough to show relevant statistical features of the phenomenon underlying the data. We propose to tackle the problem of statistical modeling via a neural (nonlinear) system that is able to match its input-output statistic to the statistic of the available data sets. A key point of the new implementation proposed here is that it is based on look-up-table (LUT) neural systems, which guarantee a computationally advantageous way of implementing neural systems. A number of numerical experiments, performed on both synthetic and real-world data sets, illustrate the features of the proposed modeling procedure.

  4. A Basic Bivariate Structure of Personality Attributes Evident Across Nine Languages.

    Science.gov (United States)

    Saucier, Gerard; Thalmayer, Amber Gayle; Payne, Doris L; Carlson, Robert; Sanogo, Lamine; Ole-Kotikash, Leonard; Church, A Timothy; Katigbak, Marcia S; Somer, Oya; Szarota, Piotr; Szirmák, Zsofia; Zhou, Xinyue

    2014-02-01

    Here, two studies seek to characterize a parsimonious common-denominator personality structure with optimal cross-cultural replicability. Personality differences are observed in all human populations and cultures, but lexicons for personality attributes contain so many distinctions that parsimony is lacking. Models stipulating the most important attributes have been formulated by experts or by empirical studies drawing on experience in a very limited range of cultures. Factor analyses of personality lexicons of nine languages of diverse provenance (Chinese, Korean, Filipino, Turkish, Greek, Polish, Hungarian, Maasai, and Senoufo) were examined, and their common structure was compared to that of several prominent models in psychology. A parsimonious bivariate model showed evidence of substantial convergence and ubiquity across cultures. Analyses involving key markers of these dimensions in English indicate that they are broad dimensions involving the overlapping content of the interpersonal circumplex, models of communion and agency, and morality/warmth and competence. These "Big Two" dimensions-Social Self-Regulation and Dynamism-provide a common-denominator model involving the two most crucial axes of personality variation, ubiquitous across cultures. The Big Two might serve as an umbrella model serving to link diverse theoretical models and associated research literatures. © 2013 Wiley Periodicals, Inc.

  5. The Role of Wealth and Health in Insurance Choice: Bivariate Probit Analysis in China

    Directory of Open Access Journals (Sweden)

    Yiding Yue

    2014-01-01

    Full Text Available This paper captures the correlation between the choices of health insurance and pension insurance using the bivariate probit model and then studies the effect of wealth and health on insurance choice. Our empirical evidence shows that people who participate in a health care program are more likely to participate in a pension plan at the same time, while wealth and health have different effects on the choices of the health care program and the pension program. Generally, the higher an individual’s wealth level is, the more likely he is to participate in a health care program, but wealth has no effect on pension participation. Health status has opposite effects on the choices of health care programs and pension plans: the poorer an individual’s health is, the more likely he is to participate in health care programs, while the better health he enjoys, the more likely he is to participate in pension plans. When the investigation narrows down to commercial insurance, only health status has a significant effect on commercial health insurance. The commercial insurance choice and the insurance choice of the agricultural population are more complicated.
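
    The mechanics of a bivariate probit can be seen from a small simulation: two binary choices are driven by latent indices with correlated Gaussian errors, and a positive error correlation reproduces the observed tendency to hold both insurance types. The coefficients, covariates and correlation below are placeholders, not estimates from the Chinese survey data.

    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(7)
    n = 5000
    wealth = rng.normal(size=n)
    poor_health = rng.normal(size=n)

    rho = 0.4                                        # latent error correlation (placeholder)
    errs = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)

    # latent indices: health insurance and pension participation (placeholder coefficients)
    y_health = (0.5 * wealth + 0.6 * poor_health + errs[:, 0] > 0).astype(int)
    y_pension = (0.1 * wealth - 0.4 * poor_health + errs[:, 1] > 0).astype(int)

    # model-implied joint probability of holding both, at the covariate means
    # (by symmetry this equals the bivariate normal CDF evaluated at the origin)
    p_both = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]]).cdf([0.0, 0.0])
    print(y_health.mean(), y_pension.mean(), p_both)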

  6. The Analysis of Bankruptcy Risk Using the Normal Distribution Gauss-Laplace in Case of a Company is the Most Modern Romanian Sea-River Port on the Danube

    Directory of Open Access Journals (Sweden)

    Rodica Pripoaie

    2015-08-01

    Full Text Available This work presents an application of the Gauss-Laplace normal distribution to a company that operates the most modern Romanian sea-river port on the Danube, a specialized service provider with a handling capacity of approximately 20,000,000 tons/year. The Gauss-Laplace normal distribution is the best known and most widely used probability distribution, because it captures well the evolution of economic and financial phenomena. Around the mean, which has the greatest frequency, gravitate values that are more or less distant from the mean but share the same standard deviation. It should be noted that break-even analysis, although used in forecasting calculations, ignores the risk attached to operational decisions (deviations between forecasts and achievements), which may, under certain circumstances, strongly influence the activity of the company. This risk can be taken into account by carefully studying whether the evolution of turnover follows a law of probability. When no information on the probability law of turnover is available and there is no reason for one outcome to appear more often than another, then, according to Laplace's principle, we consider the outcomes to be equally likely, and in this analysis the turnover is therefore modeled with a normal distribution.
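
    In practical terms, the reasoning above amounts to treating turnover as a normally distributed variable and evaluating with the normal CDF the probability that it falls below the break-even point. The figures in the sketch below are placeholders; the company's actual forecast, standard deviation and break-even level are not given in the abstract.

    from scipy.stats import norm

    # hypothetical figures: forecast turnover (mean), its standard deviation,
    # and the break-even turnover implied by the cost structure
    mean_turnover = 20_000_000.0
    sd_turnover = 2_500_000.0
    break_even = 16_000_000.0

    # probability that realized turnover falls below break-even (a loss-risk proxy)
    risk = norm.cdf(break_even, loc=mean_turnover, scale=sd_turnover)
    print(f"P(turnover < break-even) = {risk:.3f}")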

  7. Maximum likelihood estimation of the parameters of a bivariate Gaussian-Weibull distribution from machine stress-rated data

    Science.gov (United States)

    Steve P. Verrill; David E. Kretschmann; James W. Evans

    2016-01-01

    Two important wood properties are stiffness (modulus of elasticity, MOE) and bending strength (modulus of rupture, MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two- or three-parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of wood...
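
    The paper's specific bivariate Gaussian-Weibull construction is not reproduced here; the sketch below merely simulates positively correlated (MOE, MOR) pairs with a Gaussian marginal, a Weibull marginal and a Gaussian copula, which is one standard way of obtaining such a joint distribution. All marginal parameters and the correlation are placeholders.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n = 10_000
    rho = 0.7                                    # Gaussian-copula correlation (placeholder)

    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    u = stats.norm.cdf(z)                        # uniform margins via probability integral transform

    moe = stats.norm.ppf(u[:, 0], loc=12.0, scale=2.0)          # stiffness, GPa (placeholder)
    mor = stats.weibull_min.ppf(u[:, 1], c=4.0, scale=60.0)     # strength, MPa (placeholder)

    print(np.corrcoef(moe, mor)[0, 1])           # positively correlated, as observed for lumber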

  8. Bivariate genome-wide association analyses identified genetic pleiotropic effects for bone mineral density and alcohol drinking in Caucasians.

    Science.gov (United States)

    Lu, Shan; Zhao, Lan-Juan; Chen, Xiang-Ding; Papasian, Christopher J; Wu, Ke-Hao; Tan, Li-Jun; Wang, Zhuo-Er; Pei, Yu-Fang; Tian, Qing; Deng, Hong-Wen

    2017-11-01

    Several studies indicated bone mineral density (BMD) and alcohol intake might share common genetic factors. The study aimed to explore potential SNPs/genes related to both phenotypes in US Caucasians at the genome-wide level. A bivariate genome-wide association study (GWAS) was performed in 2069 unrelated participants. Regular drinking was graded as 1, 2, 3, 4, 5, or 6, representing drinking alcohol never, less than once, once or twice, three to six times, seven to ten times, or more than ten times per week, respectively. Hip, spine, and whole body BMDs were measured. The bivariate GWAS was conducted on the basis of a bivariate linear regression model. Sex-stratified association analyses were performed in the male and female subgroups. In males, the most significant association signal was detected in SNP rs685395 in DYNC2H1 with bivariate spine BMD and alcohol drinking (P = 1.94 × 10⁻⁸). SNP rs685395 and five other SNPs, rs657752, rs614902, rs682851, rs626330, and rs689295, located in the same haplotype block in DYNC2H1, were among the top ten most significant SNPs in the bivariate GWAS in males. Additionally, two SNPs in GRIK4 in males and three SNPs in OPRM1 in females were suggestively associated with BMDs (of the hip, spine, and whole body) and alcohol drinking. Nine SNPs in IL1RN were only suggestively associated with female whole body BMD and alcohol drinking. Our study indicated that DYNC2H1 may contribute to the genetic mechanisms of both spine BMD and alcohol drinking in male Caucasians. Moreover, our study suggested potential pleiotropic roles of OPRM1 and IL1RN in females and GRIK4 in males underlying variation of both BMD and alcohol drinking.

  9. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klaauw, B.; Koning, R.H.

    2003-01-01

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  10. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klauw, B.; Koning, R.H.

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  11. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images.

    Directory of Open Access Journals (Sweden)

    Jakob Nikolas Kather

    Full Text Available Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics.

  12. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images.

    Science.gov (United States)

    Kather, Jakob Nikolas; Weis, Cleo-Aron; Marx, Alexander; Schuster, Alexander K; Schad, Lothar R; Zöllner, Frank Gerrit

    2015-01-01

    Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics.

  13. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    International Nuclear Information System (INIS)

    Costa, Valter Magalhaes; Pereira, Iraci Martinez

    2011-01-01

    The monitoring of variables and the diagnosis of sensor faults in nuclear power plants or process industries are very important, because an early diagnosis allows the fault to be corrected and thus prevents production stoppages, improving operator safety and avoiding economic losses. The objective of this work is to build, using bivariate correlation and canonical correlation, the set of input variables of an artificial neural network intended to monitor the greatest possible number of variables. This methodology was applied to the IEA-R1 Research Reactor at IPEN. Initially, for the input set of the neural network we selected the variables nuclear power, primary circuit flow rate, control/safety rod position, and pressure difference in the reactor core, because almost all of the monitored variables are related to these variables or their behavior can result from the interaction of two or more of them. The nuclear power is related to the increase and decrease of temperatures as well as to the amount of radiation due to fission of the uranium; the rods control the power and influence the amount of radiation and the increase and decrease of temperatures; the primary circuit flow rate has the function of energy transport, removing heat from the core. An artificial neural network was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 variables. (author)

  14. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Valter Magalhaes; Pereira, Iraci Martinez, E-mail: valter.costa@usp.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The monitoring of variables and the diagnosis of sensor faults in nuclear power plants or process industries are very important, because an early diagnosis allows the fault to be corrected and thus prevents production stoppages, improving operator safety and avoiding economic losses. The objective of this work is to build, using bivariate correlation and canonical correlation, the set of input variables of an artificial neural network intended to monitor the greatest possible number of variables. This methodology was applied to the IEA-R1 Research Reactor at IPEN. Initially, for the input set of the neural network we selected the variables nuclear power, primary circuit flow rate, control/safety rod position, and pressure difference in the reactor core, because almost all of the monitored variables are related to these variables or their behavior can result from the interaction of two or more of them. The nuclear power is related to the increase and decrease of temperatures as well as to the amount of radiation due to fission of the uranium; the rods control the power and influence the amount of radiation and the increase and decrease of temperatures; the primary circuit flow rate has the function of energy transport, removing heat from the core. An artificial neural network was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 variables. (author)
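
    The screening idea described in the two records above can be illustrated with ordinary Pearson correlations between candidate inputs and monitored variables, followed by canonical correlations between the two blocks. The sketch below uses simulated data and scikit-learn's CCA; the variable blocks are illustrative, not the actual IEA-R1 process tags.

    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(5)
    n = 2000
    # candidate input block: e.g. nuclear power, primary flow rate, rod position, core dP
    X = rng.normal(size=(n, 4))
    # block of monitored variables, partly driven by the inputs (illustrative)
    Y = X @ rng.normal(size=(4, 6)) + 0.5 * rng.normal(size=(n, 6))

    # bivariate screening: absolute Pearson correlation of each input with each output
    corr = np.corrcoef(X.T, Y.T)[:4, 4:]
    print(np.round(np.abs(corr), 2))

    # canonical correlations between the two blocks
    cca = CCA(n_components=4).fit(X, Y)
    Xc, Yc = cca.transform(X, Y)
    canon = [np.corrcoef(Xc[:, i], Yc[:, i])[0, 1] for i in range(4)]
    print(np.round(canon, 3))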

  15. Effect of catchment properties and flood generation regime on copula selection for bivariate flood frequency analysis

    Science.gov (United States)

    Filipova, Valeriya; Lawrence, Deborah; Klempe, Harald

    2018-02-01

    Applying copula-based bivariate flood frequency analysis is advantageous because the results provide information on both the flood peak and volume. More data are, however, required for such an analysis, and it is often the case that only data series with a limited record length are available. To overcome this issue of limited record length, data regarding climatic and geomorphological properties can be used to complement statistical methods. In this paper, we present a study of 27 catchments located throughout Norway, in which we assess whether catchment properties, flood generation processes and flood regime have an effect on the correlation between flood peak and volume and, in turn, on the selection of copulas. To achieve this, the annual maximum flood events were first classified into events generated primarily by rainfall, snowmelt or a combination of these. The catchments were then classified into flood regime, depending on the predominant flood generation process producing the annual maximum flood events. A contingency table and Fisher's exact test were used to determine the factors that affect the selection of copulas in the study area. The results show that the two-parameter copulas BB1 and BB7 are more commonly selected in catchments with high steepness, high mean annual runoff and rainfall flood regime. These findings suggest that in these types of catchments, the dependence structure between flood peak and volume is more complex and cannot be modeled effectively using a one-parameter copula. The results illustrate that by relating copula types to flood regime and catchment properties, additional information can be supplied for selecting copulas in catchments with limited data.
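
    As an illustration of the kind of bivariate analysis discussed above, the sketch below estimates the peak-volume dependence with Kendall's tau, inverts it to a Gumbel copula parameter (a one-parameter family used here only because its tau inversion is closed-form; it is not one of the BB1/BB7 copulas selected in the paper) and computes the joint "AND" return period of a design event. The flood series are simulated placeholders.

    import numpy as np
    from scipy.stats import kendalltau, rankdata

    rng = np.random.default_rng(11)
    n = 60                                           # ~60 annual maximum events (illustrative)
    peak = rng.gumbel(100, 30, size=n)
    volume = 0.8 * peak + rng.normal(0, 15, size=n)  # correlated flood volume (illustrative)

    tau, _ = kendalltau(peak, volume)
    theta = 1.0 / (1.0 - tau)                        # Gumbel copula parameter from tau

    def gumbel_copula(u, v, theta):
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

    # empirical non-exceedance probabilities of a design event (here the largest observed peak)
    idx = np.argmax(peak)
    u = rankdata(peak)[idx] / (n + 1)
    v = rankdata(volume)[idx] / (n + 1)

    # joint "AND" return period: both peak and volume exceeded (annual maxima, mu = 1 year)
    T_and = 1.0 / (1.0 - u - v + gumbel_copula(u, v, theta))
    print(tau, theta, T_and)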

  16. Distribution of the FMR1 gene in females by race/ethnicity: women with diminished ovarian reserve versus women with normal fertility (SWAN study).

    Science.gov (United States)

    Pastore, Lisa M; Young, Steven L; Manichaikul, Ani; Baker, Valerie L; Wang, Xin Q; Finkelstein, Joel S

    2017-01-01

    To study whether reported, but inconsistent, associations between the FMR1 CGG repeat lengths in the intermediate, high normal, or low normal range differentiate women diagnosed with diminished ovarian reserve (DOR) from population controls and whether associations vary by race/ethnic group. Case-control study. Academic and private fertility clinics. DOR cases (n = 129; 95 Whites, 22 Asian, 12 other) from five U.S. fertility clinics were clinically diagnosed, with regular menses and no fragile X syndrome family history. Normal fertility controls (n = 803; 386 Whites, 219 African-Americans, 102 Japanese, 96 Chinese) from the United States-based SWAN Study had one or more menstrual period in the 3 months pre-enrollment, one or more pregnancy, no history of infertility or hormone therapy, and menopause ≥46 years. Previously, the SWAN Chinese and Japanese groups had similar FMR1 CGG repeat lengths, thus they were combined. None. FMR1 CGG repeat lengths. Median CGG repeats were nearly identical by case/control group. DOR cases had fewer CGG repeats in the shorter FMR1 allele than controls among Whites, but this was not significant among Asians. White cases had fewer CGG repeats in the shorter allele than Asian cases. No significant differences were found in the high normal/intermediate range between cases and controls or by race/ethnic group within cases in the longer allele. This study refutes prior reports of an association between DOR and high normal/intermediate repeats and confirms an association between DOR and low normal repeats in Whites. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  17. Normal values of regional left ventricular myocardial thickness, mass and distribution, assessed by 320-detector computed tomography angiography in the Copenhagen General Population Study

    DEFF Research Database (Denmark)

    Hindsø, Louise; Fuchs, Andreas; Kühl, Jørgen Tobias

    2017-01-01

    ) and thinnest in the mid-ventricular anterior wall (segment 7; men = 5.6 mm; women = 4.5 mm) for both men and women. However, the regional LVMD differed between men and women, with the LV being most heterogeneous in women. The normal human LV is morphologically heterogeneous, and showed the same overall pattern...

  18. Efeitos da transformação de uma variável com distribuição normal em sua inversa sobre os parâmetros de sua distribuição usando técnicas de Monte Carlo Effects of transforming a normally distributed variable into its inverse on parameters of the distribution using Monte Carlo techniques

    Directory of Open Access Journals (Sweden)

    Mirella Leme Franco Geraldini Sirol

    2006-06-01

    Full Text Available Four simulation studies were carried out to verify the distribution of the inverse of normally distributed variables as a function of different variances, means, truncation points and sample sizes. The simulated variables were GMD, normally distributed and representing average daily gain, and DIAS, obtained as the inverse of GMD and representing days to reach a fixed body weight. In all studies, the SAS® (1990) system was used for data simulation and for the subsequent analysis of results. The sample means of DIAS depended on the standard deviations used in the simulation. Regression analyses showed a reduction in the mean and standard deviation of DIAS as the mean of GMD increased. Including a truncation point between 10 and 25% of the mean of GMD reduced the mean of GMD and increased that of DIAS when the coefficient of variation of GMD exceeded 25%. The effect of group size on the means of GMD and DIAS was not significant, but the mean sample standard deviation and coefficient of variation of GMD increased with group size. Because of the dependence between the mean and the standard deviation, and the variation observed in the standard deviations of DIAS as a function of group size, using DIAS as a selection criterion may reduce accuracy. Therefore, replacing GMD with DIAS requires an analysis method robust enough to eliminate the heterogeneity of variance.
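
    The simulation logic summarized above is easy to reproduce: draw GMD from a normal distribution, optionally truncate small values, and take the inverse to obtain DIAS. The sketch below does this for a few coefficients of variation; the fixed weight gain and the truncation fraction are illustrative choices, not the exact settings of the original studies.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_dias(mean_gmd, sd_gmd, weight_gain=100.0, n=100_000, trunc_frac=None):
        """Simulate GMD ~ Normal(mean, sd) and DIAS = weight_gain / GMD,
        optionally truncating GMD below trunc_frac * mean.
        weight_gain is an illustrative fixed gain (kg) needed to reach the target weight."""
        gmd = rng.normal(mean_gmd, sd_gmd, size=n)
        if trunc_frac is not None:
            gmd = gmd[gmd > trunc_frac * mean_gmd]
        dias = weight_gain / gmd
        return gmd.mean(), dias.mean(), dias.std()

    for cv in (0.10, 0.25, 0.40):                  # coefficient of variation of GMD (mean = 1)
        print(cv, simulate_dias(1.0, cv, trunc_frac=0.10))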

  19. Genetic determinant of trabecular bone score (TBS) and bone mineral density: A bivariate analysis.

    Science.gov (United States)

    Ho-Pham, Lan T; Hans, Didier; Doan, Minh C; Mai, Linh D; Nguyen, Tuan V

    2016-11-01

    This study sought to estimate the extent of genetic influence on the variation in trabecular bone score (TBS). We found that genetic factors accounted for ~45% of the variance in TBS, and that the covariation between TBS and bone density is partially determined by genetic factors. The trabecular bone score has emerged as an important predictor of fragility fracture, but the factors underlying individual differences in TBS have not been explored. In this study, we sought to determine the genetic contribution to the variation of TBS in the general population. The study included 556 women and 189 men from 265 families. The individuals were aged 53 years (SD 11). We measured lumbar spine bone mineral density (BMD; Hologic Horizon) and then derived the TBS from the same Hologic scan from which BMD was derived. A biometric model was applied to the data to partition the variance of TBS into two components: one due to additive genetic factors and one due to environmental factors. The index of heritability was estimated as the ratio of genetic variance to total variance of a trait. Bivariate genetic analysis was conducted to estimate the genetic correlation between TBS and BMD measurements. TBS was strongly correlated with lumbar spine BMD (r = 0.73; P < 0.001). On average, TBS in men was higher than in women, after adjusting for age and height, which are significantly associated with both TBS and lumbar spine BMD. The age- and height-adjusted index of heritability of TBS was 0.46 (95% CI, 0.39-0.54), which was not much different from that of LSBMD (0.44; 95% CI, 0.31-0.55). Moreover, the genetic correlation between TBS and LSBMD was 0.35 (95% CI, 0.21-0.46), and between TBS and femoral neck BMD it was 0.21 (95% CI, 0.10-0.33). Approximately 45% of the variance in TBS is under genetic influence, and this effect magnitude is similar to that of lumbar spine BMD. This finding provides a scientific justification for the search for specific genetic variants that may be associated with TBS and fracture risk

  20. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    Science.gov (United States)

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, is becoming increasingly popular in recent years. An attractive feature of the multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require the knowledge of within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (ie, when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of individual pooled estimates themselves, the standard variance estimator and robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (eg, m≥50). When the

  1. Distribution of normal human left ventricular myofiber stress at end diastole and end systole: a target for in silico design of heart failure treatments.

    Science.gov (United States)

    Genet, Martin; Lee, Lik Chuan; Nguyen, Rebecca; Haraldsson, Henrik; Acevedo-Bolton, Gabriel; Zhang, Zhihong; Ge, Liang; Ordovas, Karen; Kozerke, Sebastian; Guccione, Julius M

    2014-07-15

    Ventricular wall stress is believed to be responsible for many physical mechanisms taking place in the human heart, including ventricular remodeling, which is frequently associated with heart failure. Therefore, normalization of ventricular wall stress is the cornerstone of many existing and new treatments for heart failure. In this paper, we sought to construct reference maps of normal ventricular wall stress in humans that could be used as a target for in silico optimization studies of existing and potential new treatments for heart failure. To do so, we constructed personalized computational models of the left ventricles of five normal human subjects using magnetic resonance images and the finite-element method. These models were calibrated using left ventricular volume data extracted from magnetic resonance imaging (MRI) and validated through comparison with strain measurements from tagged MRI (950 ± 170 strain comparisons/subject). The calibrated passive material parameter values were C0 = 0.115 ± 0.008 kPa and B0 = 14.4 ± 3.18; the active material parameter value was Tmax = 143 ± 11.1 kPa. These values could serve as a reference for future construction of normal human left ventricular computational models. The differences between the predicted and the measured circumferential and longitudinal strains in each subject were 3.4 ± 6.3 and 0.5 ± 5.9%, respectively. The predicted end-diastolic and end-systolic myofiber stress fields for the five subjects were 2.21 ± 0.58 and 16.54 ± 4.73 kPa, respectively. Thus these stresses could serve as targets for in silico design of heart failure treatments. Copyright © 2014 the American Physiological Society.

  2. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Directory of Open Access Journals (Sweden)

    J. Zscheischler

    2017-07-01

    Full Text Available Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature–precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 %) than models relying directly on temperature and precipitation as predictors (36 %). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate–crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.
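
    A bivariate return period can be estimated in its simplest form from the empirical joint exceedance probability of a hot-and-dry condition. The Python sketch below uses synthetic seasonal data and simple counting; it is only an illustration of the concept, not the copula-based procedure that such studies typically use.

        import numpy as np

        rng = np.random.default_rng(0)
        n_years = 60
        temp = rng.normal(20.0, 1.5, n_years)      # synthetic summer mean temperature (deg C)
        precip = rng.gamma(4.0, 50.0, n_years)     # synthetic summer precipitation (mm)

        def bivariate_return_period(t0, p0, temp, precip):
            """Return period (years) of being at least as hot AND at least as dry as (t0, p0)."""
            joint_exceedance = np.mean((temp >= t0) & (precip <= p0))
            return np.inf if joint_exceedance == 0 else 1.0 / joint_exceedance

        # Empirical return period of each observed temperature-precipitation pair.
        rp = np.array([bivariate_return_period(t, p, temp, precip) for t, p in zip(temp, precip)])
        print("most extreme hot-and-dry year:", round(rp.max(), 1), "year return period")

    These return periods (or their logarithms) can then serve as predictors in a linear crop-yield model in place of raw temperature and precipitation.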

  3. Quantitative analysis of senile plaques in Alzheimer disease: observation of log-normal size distribution and molecular epidemiology of differences associated with apolipoprotein E genotype and trisomy 21 (Down syndrome).

    Science.gov (United States)

    Hyman, B T; West, H L; Rebeck, G W; Buldyrev, S V; Mantegna, R N; Ukleja, M; Havlin, S; Stanley, H E

    1995-04-11

    The discovery that the epsilon 4 allele of the apolipoprotein E (apoE) gene is a putative risk factor for Alzheimer disease (AD) in the general population has highlighted the role of genetic influences in this extremely common and disabling illness. It has long been recognized that another genetic abnormality, trisomy 21 (Down syndrome), is associated with early and severe development of AD neuropathological lesions. It remains a challenge, however, to understand how these facts relate to the pathological changes in the brains of AD patients. We used computerized image analysis to examine the size distribution of one of the characteristic neuropathological lesions in AD, deposits of A beta peptide in senile plaques (SPs). Surprisingly, we find that a log-normal distribution fits the SP size distribution quite well, motivating a porous model of SP morphogenesis. We then analyzed SP size distribution curves in genotypically defined subgroups of AD patients. The data demonstrate that both apoE epsilon 4/AD and trisomy 21/AD lead to increased amyloid deposition, but by apparently different mechanisms. The size distribution curve is shifted toward larger plaques in trisomy 21/AD, probably reflecting increased A beta production. In apoE epsilon 4/AD, the size distribution is unchanged but the number of SP is increased compared to apoE epsilon 3, suggesting increased probability of SP initiation. These results demonstrate that subgroups of AD patients defined on the basis of molecular characteristics have quantitatively different neuropathological phenotypes.
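
    As a quick illustration of checking a log-normal size distribution, the Python sketch below fits a log-normal to synthetic plaque areas and assesses the fit with a Kolmogorov-Smirnov test. The data are simulated; this is not the image-analysis pipeline used in the study.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        plaque_areas = rng.lognormal(mean=3.0, sigma=0.6, size=500)   # synthetic SP areas

        # Fit a log-normal with the location parameter fixed at zero.
        shape, loc, scale = stats.lognorm.fit(plaque_areas, floc=0)
        print(f"fitted sigma = {shape:.2f}, median size = {scale:.1f}")

        # Goodness of fit: KS test against the fitted log-normal.
        ks_stat, p_value = stats.kstest(plaque_areas, 'lognorm', args=(shape, loc, scale))
        print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")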

  4. Impact of polymorphisms in WFS1 on prediabetic phenotypes in a population-based sample of middle-aged people with normal and abnormal glucose regulation

    DEFF Research Database (Denmark)

    Sparsø, T; Andersen, G; Albrechtsen, Anders

    2008-01-01

    .025) and decreased 30-min serum insulin levels (p = 0.047) after an oral glucose load. In glucose-tolerant individuals the same allele was associated with increased fasting serum insulin concentration (p = 0.019) and homeostasis model assessment of insulin resistance (HOMA-IR; p = 0.026). To study the complex...... interaction of WFS1 rs734312 on insulin release and insulin resistance we introduced Hotelling's T² test. Assuming bivariate normal distribution, we constructed standard error ellipses of the insulinogenic index and HOMA-IR when stratified according to glucose tolerance status around the means of each WFS1...... rs734312 genotype level. The interaction term between individuals with normal glucose tolerance and abnormal glucose regulation on the insulinogenic index and HOMA-IR was significantly associated with the traits (p = 0.0017). CONCLUSIONS/INTERPRETATION: Type 2 diabetes-associated risk alleles of WFS1...
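
    The standard-error ellipse construction under a bivariate normal assumption can be sketched generically: the ellipse axes come from the eigen-decomposition of the covariance matrix of the two trait means, scaled by a quantile of the reference distribution. The Python sketch below uses made-up (insulinogenic index, HOMA-IR) data and a chi-square quantile as a large-sample stand-in for the Hotelling's T² critical value; it is not the genotype-stratified analysis of the study.

        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(2)
        # Synthetic (insulinogenic index, HOMA-IR) pairs for one genotype group.
        x = rng.multivariate_normal(mean=[12.0, 2.5], cov=[[4.0, -1.2], [-1.2, 0.8]], size=200)

        mean = x.mean(axis=0)
        cov_mean = np.cov(x, rowvar=False) / len(x)   # covariance matrix of the sample mean

        # 95% ellipse for the bivariate mean (chi-square approximation).
        evals, evecs = np.linalg.eigh(cov_mean)
        radius = np.sqrt(chi2.ppf(0.95, df=2))
        half_axes = radius * np.sqrt(evals)           # semi-axes along the eigenvector directions

        print("centre:", mean.round(2))
        print("semi-axis lengths:", half_axes.round(3))
        print("axis directions (columns):")
        print(evecs.round(3))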

  5. Normal thoracic and abdominal distribution of 2-deoxy-2-[18F]fluoro-D-glucose (18FDG) in adult cats.

    Science.gov (United States)

    LeBlanc, Amy K; Wall, Jon S; Morandi, Federica; Kennel, Stephen J; Stuckey, Alan; Jakoby, Bjoern; Townsend, David W; Daniel, Gregory B

    2009-01-01

    Positron emission tomography (PET) with 2-deoxy-2-[18F]fluoro-D-glucose (18FDG) is an important imaging modality for diagnosis and staging of human neoplastic disease. The purpose of this study is to describe the normal 18FDG uptake in adult cats. Six adult healthy female cats were used. Cats were sedated and then injected intravenously with 74.0 +/- 13.0 (mean +/- SD) MBq of 18FDG. General anesthesia was induced and cats were placed in ventral recumbency on the PET scanner's bed. Static images using multiple bed positions were acquired approximately 60-90 min after injection. A transmission scan was acquired at each bed position utilizing a 57Co point source to perform attenuation and scatter correction. Regions of interest (ROIs) were drawn over the liver, right and left renal cortices, left ventricular wall, and wall of ascending and descending colonic segments. Standardized uptake values (SUV) were calculated using an established formula. Kidneys and intestinal tract had relatively intense uptake of 18FDG; liver activity was intermediate; the spleen was not identified in any of the cats. Cardiac activity was variable but intense activity was noted in the left ventricular myocardium in most cats. No appreciable lung uptake was noted. Mean +/- SD SUV values were calculated. This study established the normal pattern of uptake of 18FDG in adult cats and provided baseline data for comparison with future studies evaluating a variety of neoplastic and nonneoplastic diseases.
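
    The "established formula" for SUV is most commonly the body-weight form: tissue activity concentration divided by injected dose per unit body weight. A minimal Python sketch with hypothetical numbers (decay correction omitted, and 1 g of tissue approximated as 1 mL):

        def suv_body_weight(tissue_kbq_per_ml, injected_dose_mbq, body_weight_g):
            """Body-weight SUV = tissue concentration / (injected dose / body weight)."""
            injected_kbq = injected_dose_mbq * 1000.0
            return tissue_kbq_per_ml / (injected_kbq / body_weight_g)

        # Hypothetical feline example: 74 MBq injected, 4.5 kg cat, liver ROI at 35 kBq/mL.
        print(round(suv_body_weight(35.0, 74.0, 4500.0), 2))   # ~2.13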

  6. Application of Airborne LiDAR Data and Geographic Information Systems (GIS) to Develop a Distributed Generation System for the Town of Normal, IL

    Directory of Open Access Journals (Sweden)

    Jin H. Jo

    2015-03-01

    Full Text Available Distributed generation allows a variety of small, modular power-generating technologies to be combined with load management and energy storage systems to improve the quality and reliability of our electricity supply. As part of the US Environmental Protection Agency's effort to reduce CO2 emissions from existing power plants by 30% by 2030, distributed generation through solar photovoltaic systems provides a viable option for mitigating the negative impacts of centralized fossil fuel plants. This study conducted a detailed analysis to identify the rooftops in a town in Central Illinois that are suitable for distributed generation solar photovoltaic systems using airborne LiDAR data and to quantify their energy generation potential with an energy performance model. By utilizing the available roof space of the 9,718 buildings in the case study area, a total of 39.27 MW of solar photovoltaic systems can provide 53,061 MWh of electrical generation annually. The unique methodology utilized for this assessment of a town's solar potential provides an effective way to invest in a more sustainable energy future and ensure economic stability.

  7. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    Science.gov (United States)

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

    In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as species spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provide a priori unbiased estimations of the spatial structure, global abundance and precision for autocorrelated data. However, their application to non-Gaussian data introduces difficulties into the analysis, together with reduced robustness or unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and, (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.

  8. Bivariate genetic analyses of stuttering and nonfluency in a large sample of 5-year old twins

    NARCIS (Netherlands)

    van Beijsterveldt, C.E.M.; Felsenfeld, S.; Boomsma, D.I.

    2010-01-01

    Purpose: Behavioral genetic studies of speech fluency have focused on participants who present with clinical stuttering. Knowledge about genetic influences on the development and regulation of normal speech fluency is limited. The primary aims of this study were to identify the heritability of

  9. Histopathological confirmation of similar intramucosal distribution of fluorescein in both intravenous administration and local mucosal application for probe-based confocal laser endomicroscopy of the normal stomach.

    Science.gov (United States)

    Nonaka, Kouichi; Ohata, Ken; Ban, Shinichi; Ichihara, Shin; Takasugi, Rumi; Minato, Yohei; Tashima, Tomoaki; Matsuyama, Yasushi; Takita, Maiko; Matsuhashi, Nobuyuki; Neumann, Helmut

    2015-12-16

    Probe-based confocal laser endomicroscopy (pCLE) is capable of acquiring in vivo magnified cross-section images of the gastric mucosa. Intravenous injection of fluorescein sodium is used for confocal imaging. However, it is still under debate if local administration of the dye to the mucosa is also effective for confocal imaging as it is not yet clear if topical application also reveals the intramucosal distribution of fluorescein. The objective of this study was to evaluate the intramucosal distribution of fluorescein sodium after topical application and to compare the distribution to the conventional intravenous injection used for confocal imaging. pCLE of the stomach uninfected with Helicobacter pylori was performed in a healthy male employing intravenous administration and local mucosal application of fluorescein. The mucosa of the lower gastric body was biopsied 1 min and 5 min after intravenous administration or local mucosal application of fluorescein, and the distribution of fluorescein in the biopsy samples was examined histologically. Green fluorescence was already observed in the cytoplasm of fundic glandular cells in the biopsied deep mucosa 1 min after local mucosal application of fluorescein. It was also observed in the foveolar lumen and inter-foveolar lamina propria, although it was noted at only a few sites. In the tissue biopsied 5 min after the local mucosal application of fluorescein, green fluorescence was more frequently noted in the cytoplasm of fundic glandular cells than in that 1 min after the local mucosal application of fluorescein, although obvious green fluorescence was not identified in the foveolar lumen or inter-foveolar lamina propria. The distribution of intravenously administered fluorescein in the cytoplasm of fundic glandular cells was also clearly observed similarly to that after local mucosal application of fluorescein. Green fluorescence was observed in more cells 5 min after intravenous administration compared

  10. Pursuing Normality

    DEFF Research Database (Denmark)

    Madsen, Louise Sofia; Handberg, Charlotte

    2018-01-01

    BACKGROUND: The present study explored the reflections on cancer survivorship care of lymphoma survivors in active treatment. Lymphoma survivors have survivorship care needs, yet their participation in cancer survivorship care programs is still reported as low. OBJECTIVE: The aim of this study...... implying an influence on whether to participate in cancer survivorship care programs. Because of "pursuing normality," 8 of 9 participants opted out of cancer survivorship care programming due to prospects of "being cured" and perceptions of cancer survivorship care as "a continuation of the disease...

  11. Vibrational spectra and potential energy distributions for 4,5-dichloro-3-hydroxypyridazine by density functional theory and normal coordinate calculations.

    Science.gov (United States)

    Krishnakumar, V; Ramasamy, R

    2005-09-01

    The solid phase FT-IR and FT-Raman spectra of 4,5-dichloro-3-hydroxypyridazine have been recorded in the regions 4000-400 cm(-1) and 3500-100 cm(-1), respectively. The spectra were interpreted with the aid of normal coordinate analysis following a full structure optimization and force field calculations based on the density functional theory (DFT) using the standard B3LYP/6-31G* and B3LYP/6-311+G** method and basis set combinations. The DFT force field transformed to natural internal coordinates was corrected by a well-established set of scale factors that were found to be transferable to the title compound. The IR and Raman spectra were predicted theoretically and compared with the experimental spectra.

  12. Distribution of myofibroblast cells and microvessels around invasive ductal carcinoma of the breast and comparing with the adjacent range of their normal-to-DCIS zones.

    Science.gov (United States)

    Dabiri, Shahriar; Talebi, Amin; Shahryari, Jahanbanoo; Meymandi, Manzoumeh Shamsi; Safizadeh, Hossein

    2013-02-01

    This study seeks to determine the relationship between the manifestation of myofibroblasts in the stromal tissue of hyperplastic pre-invasive breast lesions and invasive cancer by investigating clinicopathological data of patients, their effect on steroid receptor expression and HER2, and angiogenesis according to CD34 antigen expression. 100 cases of invasive ductal carcinoma were immunohistochemically investigated for the presence of smooth muscle actin (SMA), ER/PR, HER2, anti-CD34 antibody and microvessel count (MVC). Patients were scored in four different zones of invasive areas: invasive cancer, DCIS, fibrocystic disease ± ductal intraepithelial neoplasia (FCD ± DIN), and normal tissue. There was a significant difference in stromal myofibroblasts between all areas except for the stroma of DCIS and FCD ± DIN (P normal areas (P = 0.054). There was a significant difference in MVC observed in all areas except for DCIS and FCD ± DIN (P < 0.001). We noted significant inverse correlations between MVC, HER2 expression, and the numbers of involved lymph nodes in invasive cancer and DCIS (P < 0.001). Most MVC were present in grade I, with the least frequent observed in grade III cases in the stroma of invasive cancer, DCIS and FCD ± DIN (P < 0.001). Angiogenesis can be observed before any significant myofibroblastic changes in the pre-invasive breast lesions. The elevated content of myofibroblasts in the tumor stroma may be an adverse prognostic factor, and the steps from atypical epithelial hyperplasia to DCIS and then to invasive carcinoma do not appear to always be part of a linear progression.

  13. Primary testicular failure in Klinefelter's syndrome: the use of bivariate luteinizing hormone-testosterone reference charts

    DEFF Research Database (Denmark)

    Aksglaede, Lise; Andersson, Anna-Maria; Jørgensen, Niels

    2007-01-01

    The diagnosis of androgen deficiency is based on clinical features and confirmatory low serum testosterone levels. In early primary testicular failure, a rise in serum LH levels suggests inadequate androgen action for the individual's physiological requirements despite a serum testosterone level...... within the normal range. The combined evaluation of serum LH and testosterone levels in the evaluation of testicular failure has not been widely advocated....

  14. Influences of magnetic particle-particle interactions on orientational distributions and rheological properties for a colloidal dispersion composed of rod-like particle with a magnetic moment normal to the particle axis

    Science.gov (United States)

    Hayasaka, Ryo; Aoshima, Masayuki; Satoh, Akira

    We have investigated mainly the influences of magnetic particle-particle interactions on the orientational distribution and viscosity of a semi-dense dispersion, which is composed of rod-like particles with a magnetic moment magnetized normal to the particle axis. In addition, the influences of the magnetic field strength, shear rate, and random forces on the orientational distribution and rheological properties have been clarified. The mean field approximation has been applied to take into account magnetic interactions between rod-like particles. The basic equation of the orientational distribution function has been derived from the balance of torques and solved by the numerical analysis method. The results obtained here are summarized as follows. For a strong magnetic field, the rotational motion of the rod-like particle is restricted in a plane normal to the shearing plane since the magnetic moment of the particle is restricted in the magnetic field direction. Under circumstances of a very strong magnetic interaction between particles, the magnetic moment is strongly restricted in the magnetic field direction, so that the particle has a tendency to incline in the flow direction with the magnetic moment pointing to the magnetic field direction. For a strong shear flow, a directional characteristic of rod-like particles is enhanced, and this leads to a more significant one-peak-type distribution of the orientational distribution function. Magnetic interactions between particles do not contribute to the increase in the viscosity because the mean-field vector has only a component along the magnetic field direction.

  15. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse...... Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process...

  16. Whole-Body Distribution of Donepezil as an Acetylcholinesterase Inhibitor after Oral Administration in Normal Human Subjects: A 11C-donepezil PET Study

    Directory of Open Access Journals (Sweden)

    Ikuko Mochida

    2017-01-01

    Full Text Available Objective(s): It is difficult to investigate the whole-body distribution of an orally administered drug by means of positron emission tomography (PET), owing to the short physical half-life of radionuclides, especially when 11C-labeled compounds are tested. Therefore, we aimed to examine the whole-body distribution of donepezil (DNP) as an acetylcholinesterase inhibitor by means of 11C-DNP PET imaging, combined with the oral administration of pharmacological doses of DNP. Methods: We studied 14 healthy volunteers, divided into group A (n=4) and group B (n=10). At first, we studied four females (mean age: 57.3±4.5 y), three of whom underwent 11C-DNP PET scan at 2.5 h after the oral administration of 1 mg and 30 μg of DNP, respectively, while one patient was scanned following the oral administration of 30 μg of DNP (group A). Then, we studied five females and five males (48.3±6.1 y), who underwent 11C-DNP PET scan, without the oral administration of DNP (group B). Plasma DNP concentration upon scanning was measured by tandem mass spectrometry. Arterialized venous blood samples were collected periodically to measure plasma radioactivity and metabolites. In group A, 11C-DNP PET scan of the brain and whole body continued for 60 and 20 min, respectively. Subjects in group B underwent sequential whole-body scan for 60 min. The regional uptake of 11C-DNP was analyzed by measuring the standard uptake value (SUV) through setting regions of interest on major organs with reference CT. Results: In group A, plasma DNP concentration was significantly correlated with the orally administered dose of DNP. The mean plasma concentration was 2.00 nM (n=3) after 1 mg oral administration and 0.06 nM (n=4) after 30 μg oral administration. No significant difference in plasma radioactivity or fraction of metabolites was found between groups A and B. High 11C-DNP accumulation was found in the liver, stomach, pancreas, brain, salivary glands, bone marrow, and myocardium in groups A

  17. Whole-Body Distribution of Donepezil as an Acetylcholinesterase Inhibitor after Oral Administration in Normal Human Subjects: A 11C-donepezil PET Study

    Science.gov (United States)

    Mochida, Ikuko; Shimosegawa, Eku; Kanai, Yasukazu; Naka, Sadahiro; Matsunaga, Keiko; Isohashi, Kayako; Horitsugi, Genki; Watabe, Tadashi; Kato, Hiroki; Hatazawa, Jun

    2017-01-01

    Objective(s): It is difficult to investigate the whole-body distribution of an orally administered drug by means of positron emission tomography (PET), owing to the short physical half-life of radionuclides, especially when 11C-labeled compounds are tested. Therefore, we aimed to examine the whole-body distribution of donepezil (DNP) as an acetylcholinesterase inhibitor by means of 11C-DNP PET imaging, combined with the oral administration of pharmacological doses of DNP. Methods: We studied 14 healthy volunteers, divided into group A (n=4) and group B (n=10). At first, we studied four females (mean age: 57.3±4.5 y), three of whom underwent 11C-DNP PET scan at 2.5 h after the oral administration of 1 mg and 30 µg of DNP, respectively, while one patient was scanned following the oral administration of 30 µg of DNP (group A). Then, we studied five females and five males (48.3±6.1 y), who underwent 11C-DNP PET scan, without the oral administration of DNP (group B). Plasma DNP concentration upon scanning was measured by tandem mass spectrometry. Arterialized venous blood samples were collected periodically to measure plasma radioactivity and metabolites. In group A, 11C-DNP PET scan of the brain and whole body continued for 60 and 20 min, respectively. Subjects in group B underwent sequential whole-body scan for 60 min. The regional uptake of 11C-DNP was analyzed by measuring the standard uptake value (SUV) through setting regions of interest on major organs with reference CT. Results: In group A, plasma DNP concentration was significantly correlated with the orally administered dose of DNP. The mean plasma concentration was 2.00 nM (n=3) after 1 mg oral administration and 0.06 nM (n=4) after 30 µg oral administration. No significant difference in plasma radioactivity or fraction of metabolites was found between groups A and B. High 11C-DNP accumulation was found in the liver, stomach, pancreas, brain, salivary glands, bone marrow, and myocardium in groups A and B

  18. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    Science.gov (United States)

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidates' research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, it is more challenging, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) from animals administered CAT were then analyzed. The quantitative MSI analysis results were cross-validated by LC-MS/MS analysis data of the same tissues. The consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Predicting the Size of Sunspot Cycle 24 on the Basis of Single- and Bi-Variate Geomagnetic Precursor Methods

    Science.gov (United States)

    Wilson, Robert M.; Hathaway, David H.

    2009-01-01

    Examined are single- and bi-variate geomagnetic precursors for predicting the maximum amplitude (RM) of a sunspot cycle several years in advance. The best single-variate fit is one based on the average of the ap index 36 mo prior to cycle minimum occurrence (E(Rm)), having a coefficient of correlation (r) equal to 0.97 and a standard error of estimate (se) equal to 9.3. Presuming cycle 24 not to be a statistical outlier and its minimum in March 2008, the fit suggests cycle 24's RM to be about 69 +/- 20 (the 90% prediction interval). The weighted mean prediction of 11 statistically important single-variate fits is 116 +/- 34. The best bi-variate fit is one based on the maximum and minimum values of the 12-mma of the ap index; i.e., APM# and APm*, where # means the value post-E(RM) for the preceding cycle and * means the value in the vicinity of cycle minimum, having r = 0.98 and se = 8.2. It predicts cycle 24's RM to be about 92 +/- 27. The weighted mean prediction of 22 statistically important bi-variate fits is 112 +/- 32. Thus, cycle 24's RM is expected to lie somewhere within the range of about 82 to 144. Also examined are the late-cycle 23 behaviors of geomagnetic indices and solar wind velocity in comparison to the mean behaviors of cycles 20-23 and the geomagnetic indices of cycle 14 (RM = 64.2), the weakest sunspot cycle of the modern era.
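
    The bi-variate (two-predictor) fits and their 90% prediction intervals can be reproduced generically with ordinary least squares and the standard t-based prediction-interval formula. The Python sketch below uses synthetic precursor data, not the actual ap-index series or the paper's fitted coefficients.

        import numpy as np
        from scipy.stats import t

        rng = np.random.default_rng(3)
        n = 22                                    # number of past cycles (illustrative)
        apm_post = rng.uniform(15, 40, n)         # synthetic APM#-like precursor
        apm_min = rng.uniform(5, 20, n)           # synthetic APm*-like precursor
        rm = 20 + 2.5 * apm_post + 1.5 * apm_min + rng.normal(0, 8, n)   # synthetic cycle maxima

        X = np.column_stack([np.ones(n), apm_post, apm_min])
        beta, *_ = np.linalg.lstsq(X, rm, rcond=None)
        resid = rm - X @ beta
        dof = n - X.shape[1]
        s2 = resid @ resid / dof                  # residual variance
        XtX_inv = np.linalg.inv(X.T @ X)

        x_new = np.array([1.0, 28.0, 12.0])       # hypothetical precursor values for a new cycle
        pred = x_new @ beta
        se_pred = np.sqrt(s2 * (1.0 + x_new @ XtX_inv @ x_new))
        half_width = t.ppf(0.95, dof) * se_pred   # 90% prediction interval half-width
        print(f"predicted RM = {pred:.1f} +/- {half_width:.1f}")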

  20. Comparative tissue distribution profiles of five major bio-active components in normal and blood deficiency rats after oral administration of Danggui Buxue Decoction by UPLC-TQ/MS.

    Science.gov (United States)

    Shi, Xuqin; Tang, Yuping; Zhu, Huaxu; Li, Weixia; Li, Zhenhao; Li, Wei; Duan, Jin-ao

    2014-01-01

    Astragali Radix (AR) and Angelicae Sinensis Radix (ASR) are frequently combined and used in China as a herbal pair called Danggui Buxue Decoction (DBD) for treatment of blood deficiency syndrome, such as women's ailments. This study investigates the tissue distribution profiles of five major bio-active constituents (ferulic acid, caffeic acid, calycosin-7-O-β-glucoside, ononin and astragaloside IV) after oral administration of DBD in blood deficiency rats, and compares the difference between normal and blood deficiency rats. The blood deficiency rats were induced by bleeding from the orbit at a dosage of 5.0 mL kg(-1) every day, and the experimental period was 12 days. On the final day of the experimental period, both normal and blood deficiency rats were orally administered DBD, and then the tissue samples were collected at different time points. Ferulic acid, caffeic acid, calycosin-7-O-β-glucoside, ononin and astragaloside IV in different tissues were detected simultaneously by UPLC-TQ/MS, and the histograms were drawn. The results showed that the overall trend was C(Liver)>C(Kidney)>C(Heart)>C(Spleen)>C(Lung) and C(C-30 min)>C(M-30 min)>C(M-60 min)>C(C-5 min)>C(M-5 min)>C(C-60 min)>C(M-240 min)>C(C-240 min). The contents of the detected compounds in liver were higher than those in other tissues in both normal and blood deficiency rats. Compared to normal rats, some of the compound contents in blood deficiency rats' tissues differed significantly at certain time points. This tissue distribution investigation was conducted in blood deficiency animals induced by bleeding. The results demonstrated that the five DBD components in normal and blood deficiency rats had obvious differences in some organs and time points, suggesting that the blood flow and perfusion rate of the organs were altered in blood deficiency animals. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Socioeconomic Status and Health: A New Approach to the Measurement of Bivariate Inequality.

    Science.gov (United States)

    Erreygers, Guido; Kessels, Roselinde

    2017-06-23

    We suggest an alternative way to construct a family of indices of socioeconomic inequality of health. Our indices belong to the broad category of linear indices. In contrast to rank-dependent indices, which are defined in terms of the ranks of the socioeconomic variable and the levels of the health variable, our indices are based on the levels of both the socioeconomic and the health variable. We also indicate how the indices can be modified in order to introduce sensitivity to inequality in the socioeconomic distribution and to inequality in the health distribution. As an empirical illustration, we make a comparative study of the relation between income and well-being in 16 European countries using data from the Survey of Health, Ageing and Retirement in Europe (SHARE) Wave 4.

  2. Socioeconomic Status and Health: A New Approach to the Measurement of Bivariate Inequality

    Science.gov (United States)

    Kessels, Roselinde

    2017-01-01

    We suggest an alternative way to construct a family of indices of socioeconomic inequality of health. Our indices belong to the broad category of linear indices. In contrast to rank-dependent indices, which are defined in terms of the ranks of the socioeconomic variable and the levels of the health variable, our indices are based on the levels of both the socioeconomic and the health variable. We also indicate how the indices can be modified in order to introduce sensitivity to inequality in the socioeconomic distribution and to inequality in the health distribution. As an empirical illustration, we make a comparative study of the relation between income and well-being in 16 European countries using data from the Survey of Health, Ageing and Retirement in Europe (SHARE) Wave 4. PMID:28644405

  3. A new research paradigm for bivariate allometry: combining ANOVA and non-linear regression.

    Science.gov (United States)

    Packard, Gary C

    2018-04-06

    A novel statistical routine is presented here for exploring and comparing patterns of allometric variation in two or more groups of subjects. The routine combines elements of the analysis of variance (ANOVA) with non-linear regression to achieve the equivalent of an analysis of covariance (ANCOVA) on curvilinear data. The starting point is a three-parameter power equation to which a categorical variable has been added to identify membership by each subject in a specific group or treatment. The protocol differs from earlier ones in that different assumptions can be made about the form for random error in the full statistical model (i.e. normal and homoscedastic, normal and heteroscedastic, lognormal and heteroscedastic). The general equation and several modifications thereof were used to study allometric variation in field metabolic rates of marsupial and placental mammals. The allometric equations for both marsupials and placentals have an explicit, non-zero intercept, but the allometric exponent is higher in the equation for placentals than in that for marsupials. The approach followed here is extraordinarily versatile, and it has wider application in allometry than standard ANCOVA performed on logarithmic transformations. © 2018. Published by The Company of Biologists Ltd.
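
    The core of the routine, a three-parameter power equation extended with a categorical term so that groups can differ in (for example) the exponent, can be sketched with non-linear least squares. The Python sketch below uses synthetic data and the simplest error assumption (normal, homoscedastic); the paper also considers heteroscedastic and lognormal error structures.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(4)

        def power_model(X, a, b, c, dc):
            """y = a + b * x**(c + dc*g), where g is a 0/1 group indicator."""
            x, g = X
            return a + b * x ** (c + dc * g)

        # Synthetic body masses and field metabolic rates for two groups (e.g. two taxa).
        x = rng.uniform(10, 5000, 120)
        g = rng.integers(0, 2, 120)
        y = 5 + 2.0 * x ** (0.65 + 0.10 * g) + rng.normal(0, 20, 120)

        params, cov = curve_fit(power_model, (x, g), y, p0=[1.0, 1.0, 0.7, 0.0])
        a, b, c, dc = params
        print(f"common exponent = {c:.3f}, group difference in exponent = {dc:.3f} "
              f"+/- {np.sqrt(cov[3, 3]):.3f}")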

  4. Bivariate modelling of the financial development-fossil fuel consumption nexus in Ghana

    OpenAIRE

    Yeboah Asuamah, Samuel

    2017-01-01

    The present paper modelled the relationship between financial development and fossil fuel energy consumption in Ghana for the period 1970-2011 by applying an Autoregressive Distributed Lag (ARDL) model. The findings of the paper on the cointegration test indicate significant evidence of cointegration between fossil fuel consumption and financial development. The findings seem to suggest that financial development is an explanatory variable in fossil fuel consumption management in achieving sust...

  5. Comparative study of the distribution of the alpha-subunits of voltage-gated sodium channels in normal and axotomized rat dorsal root ganglion neurons.

    Science.gov (United States)

    Fukuoka, Tetsuo; Kobayashi, Kimiko; Yamanaka, Hiroki; Obata, Koichi; Dai, Yi; Noguchi, Koichi

    2008-09-10

    We compared the distribution of the alpha-subunit mRNAs of voltage-gated sodium channels Nav1.1-1.3 and Nav1.6-1.9 and a related channel, Nax, in histochemically identified neuronal subpopulations of the rat dorsal root ganglia (DRG). In the naïve DRG, the expression of Nav1.1 and Nav1.6 was restricted to A-fiber neurons, and they were preferentially expressed by TrkC neurons, suggesting that proprioceptive neurons possess these channels. Nav1.7, -1.8, and -1.9 mRNAs were more abundant in C-fiber neurons compared with A-fiber ones. Nax was evenly expressed in both populations. Although Nav1.8 and -1.9 were preferentially expressed by TrkA neurons, other alpha-subunits were expressed independently of TrkA expression. Actually, all IB4(+) neurons expressed both Nav1.8 and -1.9, and relatively limited subpopulations of IB4(+) neurons (3% and 12%, respectively) expressed Nav1.1 and/or Nav1.6. These findings provide useful information in interpreting the electrophysiological characteristics of some neuronal subpopulations of naïve DRG. After L5 spinal nerve ligation, Nav1.3 mRNA was up-regulated mainly in A-fiber neurons in the ipsilateral L5 DRG. Although previous studies demonstrated that nerve growth factor (NGF) and glial cell-derived neurotrophic factor (GDNF) reversed this up-regulation, the Nav1.3 induction was independent of either TrkA or GFRalpha1 expression, suggesting that the induction of Nav1.3 may be one of the common responses of axotomized DRG neurons without a direct relationship to NGF/GDNF supply. (c) 2008 Wiley-Liss, Inc.

  6. Diagnostic performance of des-γ-carboxy prothrombin (DCP) for hepatocellular carcinoma: a bivariate meta-analysis.

    Science.gov (United States)

    Gao, P; Li, M; Tian, Q B; Liu, Dian-Wu

    2012-01-01

    Serum markers need to be developed to specifically diagnose hepatocellular carcinoma (HCC). Des-γ-carboxy prothrombin (DCP) is a promising tool with limited expense and wide accessibility, but the reported results have been controversial. In order to review the performance of DCP for the diagnosis of HCC, this meta-analysis was performed. After a systematic review of relevant studies, the sensitivity, specificity, positive and negative likelihood ratios (PLR and NLR, respectively) were pooled using a bivariate meta-analysis. Potential between-study heterogeneity was explored by a meta-regression model. The post-test probability and the likelihood ratio scattergram to evaluate clinical usefulness were calculated. Based on a literature review of 20 publications, the overall sensitivity, specificity, PLR and NLR of DCP for the detection of HCC were 67% (95%CI, 58%-74%), 92% (95%CI, 88%-94%), 7.9 (95%CI, 5.6-11.2) and 0.36 (95%CI, 0.29-0.46), respectively. The area under the bivariate summary receiver operating characteristic curve was 0.89 (95%CI, 0.85-0.92). Significant heterogeneity was present. In conclusion, the major role of DCP is the moderate confirmation of HCC. More prospective studies of DCP are needed in the future.
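
    The reported likelihood ratios translate directly into post-test probabilities via Bayes' rule on the odds scale. The short Python calculation below uses the pooled sensitivity and specificity from the abstract and a hypothetical 20% pre-test probability; note that the naive PLR computed from pooled sensitivity and specificity (about 8.4) differs slightly from the bivariate pooled PLR of 7.9, as expected when the ratios are pooled jointly.

        sens, spec = 0.67, 0.92            # pooled estimates from the abstract

        plr = sens / (1 - spec)            # positive likelihood ratio
        nlr = (1 - sens) / spec            # negative likelihood ratio

        def post_test_probability(pre_test_prob, lr):
            """Convert a pre-test probability to a post-test probability: odds * LR."""
            pre_odds = pre_test_prob / (1 - pre_test_prob)
            post_odds = pre_odds * lr
            return post_odds / (1 + post_odds)

        print(f"PLR = {plr:.1f}, NLR = {nlr:.2f}")
        # Hypothetical 20% pre-test probability of HCC in a surveillance setting.
        print(f"after a positive DCP: {post_test_probability(0.20, plr):.2f}")
        print(f"after a negative DCP: {post_test_probability(0.20, nlr):.2f}")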

  7. The gravitational distribution of ventilation-perfusion ratio is more uniform in prone than supine posture in the normal human lung

    Science.gov (United States)

    Sá, Rui Carlos; Theilmann, Rebecca J.; Buxton, Richard B.; Prisk, G. Kim; Hopkins, Susan R.

    2013-01-01

    The gravitational gradient of intrapleural pressure is suggested to be less in prone posture than supine. Thus the gravitational distribution of ventilation is expected to be more uniform prone, potentially affecting regional ventilation-perfusion (V̇a/Q̇) ratio. Using a novel functional lung magnetic resonance imaging technique to measure regional V̇a/Q̇ ratio, the gravitational gradients in proton density, ventilation, perfusion, and V̇a/Q̇ ratio were measured in prone and supine posture. Data were acquired in seven healthy subjects in a single sagittal slice of the right lung at functional residual capacity. Regional specific ventilation images quantified using specific ventilation imaging and proton density images obtained using a fast gradient-echo sequence were registered and smoothed to calculate regional alveolar ventilation. Perfusion was measured using arterial spin labeling. Ventilation (ml·min−1·ml−1) images were combined on a voxel-by-voxel basis with smoothed perfusion (ml·min−1·ml−1) images to obtain regional V̇a/Q̇ ratio. Data were averaged for voxels within 1-cm gravitational planes, starting from the most gravitationally dependent lung. The slope of the relationship between alveolar ventilation and vertical height was less prone than supine (−0.17 ± 0.10 ml·min−1·ml−1·cm−1 supine, −0.040 ± 0.03 ml·min−1·ml−1·cm−1 prone, P = 0.02) as was the slope of the perfusion-height relationship (−0.14 ± 0.05 ml·min−1·ml−1·cm−1 supine, −0.08 ± 0.09 ml·min−1·ml−1·cm−1 prone, P = 0.02). There was a significant gravitational gradient in V̇a/Q̇ ratio in both postures (P < 0.05) that was less in prone (0.09 ± 0.08 cm−1 supine, 0.04 ± 0.03 cm−1 prone, P = 0.04). The gravitational gradients in ventilation, perfusion, and regional V̇a/Q̇ ratio were greater supine than prone, suggesting an interplay between thoracic cavity configuration, airway and vascular tree anatomy, and the effects of

  8. Mixed normal inference on multicointegration

    NARCIS (Netherlands)

    Boswijk, H.P.

    2009-01-01

    Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ² null distribution. The asymptotic distribution of the

  9. Fitting a Bivariate Measurement Error Model for Episodically Consumed Dietary Components

    KAUST Repository

    Zhang, Saijuan

    2011-01-06

    There has been great public health interest in estimating usual, i.e., long-term average, intake of episodically consumed dietary components that are not consumed daily by everyone, e.g., fish, red meat and whole grains. Short-term measurements of episodically consumed dietary components have zero-inflated skewed distributions. So-called two-part models have been developed for such data in order to correct for measurement error due to within-person variation and to estimate the distribution of usual intake of the dietary component in the univariate case. However, there is arguably much greater public health interest in the usual intake of an episodically consumed dietary component adjusted for energy (caloric) intake, e.g., ounces of whole grains per 1000 kilo-calories, which reflects usual dietary composition and adjusts for different total amounts of caloric intake. Because of this public health interest, it is important to have models to fit such data, and it is important that the model-fitting methods can be applied to all episodically consumed dietary components. We have recently developed a nonlinear mixed effects model (Kipnis et al., 2010), and have fit it by maximum likelihood using nonlinear mixed effects programs and methodology (the SAS NLMIXED procedure). Maximum likelihood fitting of such a nonlinear mixed model is generally slow because of 3-dimensional adaptive Gaussian quadrature, and there are times when the programs either fail to converge or converge to models with a singular covariance matrix. For these reasons, we develop a Markov chain Monte Carlo (MCMC) approach to fitting this model, which allows for both frequentist and Bayesian inference. There are technical challenges to developing this solution because one of the covariance matrices in the model is patterned. Our main application is to the National Institutes of Health (NIH)-AARP Diet and Health Study, where we illustrate our methods for modeling the energy-adjusted usual intake of fish and whole

  10. Comparisons of pharmacokinetic and tissue distribution profile of four major bioactive components after oral administration of Xiang-Fu-Si-Wu Decoction effective fraction in normal and dysmenorrheal symptom rats.

    Science.gov (United States)

    Liu, Pei; Li, Wei; Li, Zhen-hao; Qian, Da-wei; Guo, Jian-ming; Shang, Er-xin; Su, Shu-lan; Tang, Yu-ping; Duan, Jin-ao

    2014-07-03

    Xiang-Fu-Si-Wu Decoction (XFSWD) has been widely used to treat primary dysmenorrhea in clinical practice for hundreds of years and has shown great efficacy. One fraction of XFSWD, which was an elution product by macroporous adsorption resin from aqueous extract solution with 60% ethanol (XFSWE), showed a great analgesic effect. The present study was conducted to investigate the possible pharmacokinetic and tissue distribution profiles of four major bioactive constituents (berberine, protopine, tetrahydrocoptisine and tetrahydropalmatine) after oral administration of XFSWE in dysmenorrheal symptom rats, and to compare the difference between normal and dysmenorrheal symptom rats. Estradiol benzoate and oxytocin were used to produce the dysmenorrheal symptom rat model. The experimental period was seven days. On the final day of the experimental period, both normal and dysmenorrheal symptom rats were orally administered XFSWE, and then the blood and tissue samples were collected at different time points. Berberine, protopine, tetrahydrocoptisine and tetrahydropalmatine in blood and tissue samples were determined by LC-MS/MS. Pharmacokinetic parameters were calculated from the plasma concentration-time data using non-compartmental methods. The differences of pharmacokinetic parameters among groups were tested by one-way analysis of variance (ANOVA). There were statistically significant differences in pharmacokinetic parameters between normal and dysmenorrheal symptom rats that were orally administered the same dosage of XFSWE. In the tissue distribution study, the results showed that the overall trend was C(Spleen)>C(Liver)>C(Kidney)>C(Uterus)>C(Heart)>C(Lung)>C(Ovary)>C(Brain)>C(Thymus), C(M-60 min)>C(M-120 min)>C(M-30 min)>C(C-60 min)>C(C-120 min)>C(C-30 min). The contents of protopine in liver, spleen and uterus were higher than those in other tissues of dysmenorrheal symptom rats. Compared to normal rats, some of the compound contents in dysmenorrheal symptom rats' tissues at different time points had significant

  11. Flash flood susceptibility analysis and its mapping using different bivariate models in Iran: a comparison between Shannon's entropy, statistical index, and weighting factor models.

    Science.gov (United States)

    Khosravi, Khabat; Pourghasemi, Hamid Reza; Chapi, Kamran; Bahri, Masoumeh

    2016-12-01

    Flooding is a very common worldwide natural hazard causing large-scale casualties every year, and Iran is not immune to this threat. Comprehensive flood susceptibility mapping is very important to reduce losses of lives and properties. Thus, the aim of this study is to map susceptibility to flooding by different bivariate statistical methods including Shannon's entropy (SE), statistical index (SI), and weighting factor (Wf). In this regard, model performance evaluation is also carried out in Haraz Watershed, Mazandaran Province, Iran. In the first step, 211 flood locations were identified by the documentary sources and field inventories, of which 70% (151 positions) were used for flood susceptibility modeling and 30% (60 positions) for evaluation and verification of the model. In the second step, ten influential factors in flooding were chosen, namely slope angle, plan curvature, altitude, topographic wetness index (TWI), stream power index (SPI), distance from river, rainfall, geology, land use, and normalized difference vegetation index (NDVI). In the next step, flood susceptibility maps were prepared by these methods in ArcGIS. As the last step, a receiver operating characteristic (ROC) curve was drawn and the area under the curve (AUC) was calculated for quantitative assessment of each model. The results showed that the best model to estimate the susceptibility to flooding in Haraz Watershed was the SI model, with prediction and success rates of 99.71 and 98.72%, respectively, followed by the Wf and SE models with AUC values of 98.1 and 96.57% for the success rate, and 97.6 and 92.42% for the prediction rate, respectively. In the SI and Wf models, the most and least important parameters were distance from river and geology. Flood susceptibility maps are informative for managers and decision makers in Haraz Watershed in order to contemplate measures to reduce human and financial losses.
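
    The statistical index (SI) method mentioned above assigns each class of a conditioning factor a weight equal to the natural logarithm of the flood density in that class relative to the overall flood density; the per-pixel susceptibility is then the sum of those weights over all factors. A minimal Python sketch with made-up class counts (not the Haraz Watershed data):

        import numpy as np

        # Hypothetical counts for one conditioning factor (e.g. distance-from-river classes).
        class_pixels = np.array([12000, 30000, 45000, 60000])   # pixels in each class
        flood_pixels = np.array([60, 55, 25, 11])               # flood locations in each class

        class_density = flood_pixels / class_pixels
        overall_density = flood_pixels.sum() / class_pixels.sum()

        # Statistical index weight per class: ln(class flood density / overall flood density).
        si_weights = np.log(class_density / overall_density)
        print(np.round(si_weights, 3))   # positive weights flag flood-prone classes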

  12. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering Based on the Newly Developed Self-consistent RC/EMIC Waves Model by Khazanov et al. [2006]

    Science.gov (United States)

    Khazanov, G. V.; Gallagher, D. L.; Gamayunov, K.

    2007-01-01

    It is well known that the effects of EMIC waves on RC ion and RB electron dynamics strongly depend on such particle/wave characteristics as the phase-space distribution function, frequency, wave-normal angle, wave energy, and the form of wave spectral energy density. Therefore, realistic characteristics of EMIC waves should be properly determined by modeling the RC-EMIC wave evolution self-consistently. Such a self-consistent model has been progressively developed by Khazanov et al. [2002-2006]. It solves a system of two coupled kinetic equations: one equation describes the RC ion dynamics and another equation describes the energy density evolution of EMIC waves. Using this model, we present the effectiveness of relativistic electron scattering and compare our results with previous work in this area of research.

  13. Suspended sediment concentration and particle size distribution ...

    Indian Academy of Sciences (India)

    The relationship between SSC and particle size distribution (PSD) was correlated with HMC by using bivariate and multivariate regression models. The proposed models were then selected based on statistical criteria. The results showed a high correlation of dissolved and particulate chromium content with efficiency ...

  14. Landslide susceptibility analysis in central Vietnam based on an incomplete landslide inventory: Comparison of a new method to calculate weighting factors by means of bivariate statistics

    Science.gov (United States)

    Meinhardt, Markus; Fink, Manfred; Tünschel, Hannes

    2015-04-01

    Vietnam is regarded as a country strongly impacted by climate change. Population and economic growth result in additional pressures on the ecosystems in the region. In particular, changes in land use and precipitation extremes lead to a higher landslide susceptibility in the study area (approx. 12,400 km2), located in central Vietnam and impacted by a tropical monsoon climate. Hence, this natural hazard is a serious problem in the study area. A probability assessment of landslides is therefore undertaken through the use of bivariate statistics. However, the landslide inventory based only on field campaigns does not cover the whole area. To avoid a systematic bias due to the limited mapping area, the investigated regions are depicted as the viewshed in the calculations. On this basis, the distribution of the landslides is evaluated in relation to the maps of 13 parameters, showing the strongest correlation to distance to roads and precipitation increase. An additional weighting of the input parameters leads to better results, since some parameters contribute more to landslides than others. The method developed in this work is based on the validation of different parameter sets used within the statistical index method. It is called "omit error" because omitting one parameter at a time yields the weightings, which describe how strongly each parameter improves or reduces the objective function. Furthermore, this approach is used to find a better input parameter set by excluding some parameters. After this optimization, nine input parameters are left, and they are weighted by the omit error method, providing the best susceptibility map with a success rate of 92.9% and a prediction rate of 92.3%. This is an improvement of 4.4% and 4.2%, respectively, compared to the basic statistical index method with the 13 input parameters.

  15. Stochastic Frontier Models with Dependent Errors based on Normal and Exponential Margins || Modelos de frontera estocástica con errores dependientes basados en márgenes normal y exponencial

    Directory of Open Access Journals (Sweden)

    Gómez-Déniz, Emilio

    2017-06-01

    Full Text Available Following the recent work of Gómez-Déniz and Pérez-Rodríguez (2014), this paper extends the results obtained there to the normal-exponential distribution with dependence. Accordingly, the main aim of the present paper is to enhance stochastic production frontier and stochastic cost frontier modelling by proposing a bivariate distribution for dependent errors which allows us to nest the classical models. Closed-form expressions for the error term and technical efficiency are provided. An illustration using real data from the econometric literature is provided to show the applicability of the model proposed. || Continuando el reciente trabajo de Gómez-Déniz y Pérez-Rodríguez (2014), el presente artículo extiende los resultados obtenidos a la distribución normal-exponencial con dependencia. En consecuencia, el principal propósito de este artículo es mejorar el modelado de la frontera estocástica tanto de producción como de coste proponiendo para ello una distribución bivariante para errores dependientes que nos permitan encajar los modelos clásicos. Se obtienen las expresiones en forma cerrada para el término de error y la eficiencia técnica. Se ilustra la aplicabilidad del modelo propuesto usando datos reales existentes en la literatura econométrica.
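
    The classical independent normal-exponential frontier that the proposed model nests can be simulated in a few lines: the composed error is ε = v − u with symmetric Gaussian noise v and one-sided exponential inefficiency u. The Python sketch below draws that benchmark case (the paper's contribution is precisely to allow dependence between v and u, which is not modelled here).

        import numpy as np

        rng = np.random.default_rng(5)
        n = 10_000
        sigma_v, sigma_u = 0.3, 0.5

        v = rng.normal(0.0, sigma_v, n)     # symmetric noise
        u = rng.exponential(sigma_u, n)     # one-sided inefficiency, u >= 0
        eps = v - u                         # composed error of a production frontier

        # The composed error is left-skewed and has mean -sigma_u.
        print(f"mean(eps) = {eps.mean():.3f} (theory: {-sigma_u:.3f})")
        skew = ((eps - eps.mean()) ** 3).mean() / eps.std() ** 3
        print(f"skewness  = {skew:.3f} (negative, as expected)")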

  16. Modeling the probability distribution of positional errors incurred by residential address geocoding

    Directory of Open Access Journals (Sweden)

    Mazumdar Soumya

    2007-01-01

    Full Text Available Abstract Background The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (> 15 km) outliers occurred among the 60%-matched geocoding errors; outliers occurred for the other two types of geocoding errors also but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that were not capable of being fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.
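
    The paper fits mixtures of bivariate t distributions to the 2-D error vectors. As a simpler stand-in for that idea, the Python sketch below fits a two-component bivariate Gaussian mixture to synthetic positional errors with scikit-learn; a Gaussian mixture captures the "many small errors plus a diffuse component" structure but, unlike the t mixtures used in the paper, has lighter tails.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(6)
        # Synthetic 2-D positional errors (metres): a tight component plus a diffuse one.
        tight = rng.multivariate_normal([0, 0], [[900, 0], [0, 2500]], size=1200)
        diffuse = rng.multivariate_normal([0, 0], [[40000, 0], [0, 90000]], size=200)
        errors = np.vstack([tight, diffuse])

        gm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(errors)
        for k in np.argsort(gm.weights_)[::-1]:
            print(f"weight = {gm.weights_[k]:.2f}, "
                  f"sd_x = {np.sqrt(gm.covariances_[k][0, 0]):.0f} m, "
                  f"sd_y = {np.sqrt(gm.covariances_[k][1, 1]):.0f} m")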

  17. Normal Pressure Hydrocephalus (NPH)

    Science.gov (United States)

    Normal pressure hydrocephalus is a brain disorder that occurs when excess cerebrospinal fluid ...

  18. ABACUS: an entropy-based cumulative bivariate statistic robust to rare variants and different direction of genotype effect.

    Science.gov (United States)

    Di Camillo, Barbara; Sambo, Francesco; Toffolo, Gianna; Cobelli, Claudio

    2014-02-01

    In the past years, both sequencing and microarray have been widely used to search for relations between genetic variations and predisposition to complex pathologies such as diabetes or neurological disorders. These studies, however, have been able to explain only a small fraction of disease heritability, possibly because complex pathologies cannot be referred to few dysfunctional genes, but are rather heterogeneous and multicausal, as a result of a combination of rare and common variants possibly impairing multiple regulatory pathways. Rare variants, though, are difficult to detect, especially when the effects of causal variants are in different directions, i.e. with protective and detrimental effects. Here, we propose ABACUS, an Algorithm based on a BivAriate CUmulative Statistic to identify single nucleotide polymorphisms (SNPs) significantly associated with a disease within predefined sets of SNPs such as pathways or genomic regions. ABACUS is robust to the concurrent presence of SNPs with protective and detrimental effects and of common and rare variants; moreover, it is powerful even when few SNPs in the SNP-set are associated with the phenotype. We assessed ABACUS performance on simulated and real data and compared it with three state-of-the-art methods. When ABACUS was applied to type 1 and 2 diabetes data, besides observing a wide overlap with already known associations, we found a number of biologically sound pathways, which might shed light on diabetes mechanism and etiology. ABACUS is available at http://www.dei.unipd.it/∼dicamill/pagine/Software.html.

  19. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone...

  20. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    NARCIS (Netherlands)

    M.C. Medina-Gomez (Carolina); J.P. Kemp (John); Dimou, N.L. (Niki L.); Kreiner, E. (Eskil); A. Chesi (Alessandra); B.S. Zemel (Babette S.); K. Bønnelykke (Klaus); Boer, C.G. (Cindy G.); T.S. Ahluwalia (Tarunveer Singh); H. Bisgaard; E. Evangelou (Evangelos); D.H.M. Heppe (Denise); Bonewald, L.F. (Lynda F.); Gorski, J.P. (Jeffrey P.); M. Ghanbari (Mohsen); S. Demissie (Serkalem); Duque, G. (Gustavo); M.T. Maurano (Matthew T.); D.P. Kiel (Douglas P.); Y.-H. Hsu (Yi-Hsiang); B.C.J. van der Eerden (Bram); Ackert-Bicknell, C. (Cheryl); S. Reppe (Sjur); K.M. Gautvik (Kaare); Raastad, T. (Truls); D. Karasik (David); J. van de Peppel (Jeroen); V.W.V. Jaddoe (Vincent); A.G. Uitterlinden (André); J.H. Tobias (Jon); S.F.A. Grant (Struan); Bagos, P.G. (Pantelis G.); D.M. Evans (David); F. Rivadeneira Ramirez (Fernando)

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body

  1. A comparison of the effect of 5-bromodeoxyuridine substitution on 33258 Hoechst- and DAPI-fluorescence of isolated chromosomes by bivariate flow karyotyping

    NARCIS (Netherlands)

    Buys, C. H.; Mesa, J.; van der Veen, A. Y.; Aten, J. A.

    1986-01-01

    Application of the fluorescent DNA-intercalator propidium iodide for stabilization of the mitotic chromosome structure during isolation of chromosomes from V79 Chinese hamster cells and subsequent staining with the fluorochromes 33258 Hoechst or DAPI allowed bivariate flow karyotyping of isolated

  2. Normal regional distribution of cerebral blood flow in dogs: comparison between (99m)Tc-ethylcysteinate dimer and (99m)Tc-hexamethylpropylene amine oxime single photon emission computed tomography.

    Science.gov (United States)

    Adriaens, Antita; Polis, Ingeborgh; Waelbers, Tim; Vandermeulen, Eva; Dobbeleir, André; De Spiegeleer, Bart; Peremans, Kathelijne

    2013-01-01

    Functional imaging provides important insights into canine brain pathologies such as behavioral problems. Two (99m)Tc-labeled single photon emission computed tomography (SPECT) cerebral blood flow tracers, ethylcysteinate dimer (ECD) and hexamethylpropylene amine oxime (HMPAO), are commonly used in human medicine and have been used previously in dogs, but intrasubject comparison of both tracers in dogs is lacking. Therefore, this study investigated whether regional distribution differences between both tracers occur in dogs, as is reported in humans. Eight beagles underwent two SPECT examinations, first with (99m)Tc-ECD and then with (99m)Tc-HMPAO. SPECT scanning was performed with a triple-head gamma camera equipped with ultrahigh-resolution parallel-hole collimators. Images were reconstructed using filtered backprojection with a Butterworth filter. Emission data were fitted to a template permitting semiquantification using predefined regions or volumes of interest (VOIs). For each VOI, perfusion indices were calculated by normalizing the regional counts per voxel to total brain counts per voxel. The obtained perfusion indices for each region for both tracers were compared with a paired Student's t-test. Significant (P < 0.05) regional differences were seen in the subcortical region and the cerebellum. Both tracers can be used to visualize regional cerebral blood flow in dogs; however, due to the observed regional differences, they are not entirely interchangeable. © 2013 Veterinary Radiology & Ultrasound.
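
    The semiquantification step described above, a perfusion index per VOI (regional counts per voxel normalized to whole-brain counts per voxel) compared between tracers with a paired t-test, can be sketched as follows. The count values are hypothetical and this is not the authors' processing pipeline.

```python
import numpy as np
from scipy import stats

def perfusion_index(regional_counts, regional_voxels, total_counts, total_voxels):
    """Regional counts per voxel normalized to total-brain counts per voxel."""
    return (regional_counts / regional_voxels) / (total_counts / total_voxels)

# Hypothetical counts for one VOI in the same 8 dogs, scanned with both tracers.
total = np.array([2.6e6, 2.4e6, 2.5e6, 2.4e6, 2.6e6, 2.5e6, 2.5e6, 2.4e6])
ecd = perfusion_index(
    np.array([52000, 48500, 50200, 47800, 51300, 49900, 50700, 48200]), 4000, total, 200000)
hmpao = perfusion_index(
    np.array([50100, 47200, 49800, 46900, 50200, 48700, 49600, 47300]), 4000, total, 200000)

t, p = stats.ttest_rel(ecd, hmpao)  # paired Student's t-test across the 8 dogs
print(f"paired t = {t:.2f}, p = {p:.3f}")
```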

  3. Nível crítico pelo critério da distribuição normal reduzida: uma nova proposta para interpretação de análise foliar || Critical level through the reduced normal distribution approach: a new proposal for interpretation of foliar analysis

    Directory of Open Access Journals (Sweden)

    Celsemy E. Maia

    2001-05-01

    Full Text Available The objective of this work was to develop a statistically based methodology for determining the critical level of a nutrient in plant tissue from field data. Obtaining the critical level through a continuous probability distribution is a new proposal for the interpretation of foliar analysis, based on the reduced (standard) normal distribution. The method requires data on productivity (P) and on Q, where Q is defined as the ratio between P and ni (Q = P/ni) and ni is the content of the nutrient for which the critical level is to be found. First, the value of P that represents 90% of the maximum is found from P(90%) = 1.281552 s1 + x̄1, and the corresponding value of Q from Q(90%) = 1.281552 s2 + x̄2, where x̄1 and s1 are the arithmetic mean and standard deviation of P, x̄2 and s2 are the mean and standard deviation of Q, and 1.281552 is the 90th percentile of the standard normal distribution. The critical level is then obtained as NCi = (1.281552 s1 + x̄1)/(1.281552 s2 + x̄2). For the coffee crop, the critical foliar levels determined by this continuous-probability-distribution methodology fell within the reference range recommended in the literature.
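
    A minimal sketch of the critical-level calculation just described, assuming productivity values P and nutrient contents ni are available as arrays; the numbers below are invented for illustration and the use of the sample standard deviation (ddof=1) is an assumption about the original computation.

```python
import numpy as np

Z90 = 1.281552  # 90th percentile of the standard normal distribution

def critical_level(P, n):
    """Critical nutrient level NC = P(90%) / Q(90%), where Q = P / n."""
    P = np.asarray(P, float)
    Q = P / np.asarray(n, float)
    p90 = Z90 * P.std(ddof=1) + P.mean()  # value of P representing 90% of the maximum
    q90 = Z90 * Q.std(ddof=1) + Q.mean()  # corresponding value of Q
    return p90 / q90

# Hypothetical field data: yield (kg/plant) and leaf nutrient content (g/kg).
P = [2.1, 2.8, 3.4, 1.9, 2.6, 3.1, 2.4, 3.0]
n = [26.0, 29.5, 31.2, 24.8, 28.3, 30.1, 27.0, 29.9]
print("critical level:", round(critical_level(P, n), 2))
```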

  4. Quasi-bivariate variational mode decomposition as a tool of scale analysis in wall-bounded turbulence

    Science.gov (United States)

    Wang, Wenkang; Pan, Chong; Wang, Jinjun

    2018-01-01

    The identification and separation of multi-scale coherent structures is a critical task for the study of scale interaction in wall-bounded turbulence. Here, we propose a quasi-bivariate variational mode decomposition (QB-VMD) method to extract structures with various scales from instantaneous two-dimensional (2D) velocity field which has only one primary dimension. This method is developed from the one-dimensional VMD algorithm proposed by Dragomiretskiy and Zosso (IEEE Trans Signal Process 62:531-544, 2014) to cope with a quasi-2D scenario. It poses the feature of length-scale bandwidth constraint along the decomposed dimension, together with the central frequency re-balancing along the non-decomposed dimension. The feasibility of this method is tested on both a synthetic flow field and a turbulent boundary layer at moderate Reynolds number (Re_{τ } = 3458) measured by 2D particle image velocimetry (PIV). Some other popular scale separation tools, including pseudo-bi-dimensional empirical mode decomposition (PB-EMD), bi-dimensional EMD (B-EMD) and proper orthogonal decomposition (POD), are also tested for comparison. Among all these methods, QB-VMD shows advantages in both scale characterization and energy recovery. More importantly, the mode mixing problem, which degrades the performance of EMD-based methods, is avoided or minimized in QB-VMD. Finally, QB-VMD analysis of the wall-parallel plane in the log layer (at y/δ = 0.12) of the studied turbulent boundary layer shows the coexistence of large- or very large-scale motions (LSMs or VLSMs) and inner-scaled structures, which can be fully decomposed in both physical and spectral domains.

  5. Regional cerebral blood flow as assessed by principal component analysis and 99mTc-HMPAO SPET in healthy subjects at rest: normal distribution and effect of age and gender

    International Nuclear Information System (INIS)

    Pagani, M.; Salmaso, D.; Jonsson, C.; Hatherly, R.; Larsson, S.A.; Jacobsson, H.; Waegner, A.

    2002-01-01

    The increasing implementation of standardisation techniques in brain research and clinical diagnosis has highlighted the importance of reliable baseline data from normal control subjects for inter-subject analysis. In this context, knowledge of the regional cerebral blood flow (rCBF) distribution in normal ageing is a factor of the utmost importance. In the present study, rCBF was investigated in 50 healthy volunteers (25 men, 25 women), aged 31-78 years, who were examined at rest by means of single-photon emission tomography (SPET) using technetium-99m d,l-hexamethylpropylene amine oxime (HMPAO). After normalising the CBF data, 27 left and 27 right volumes of interest (VOIs) were selected and automatically outlined by standardisation software (computerised brain atlas). The heavy load of flow data thus obtained was reduced in number and grouped in factors by means of principal component analysis (PCA). PCA extracted 12 components explaining 81% of the variance and including the vast majority of cortical and subcortical regions. Analysis of variance and regression analyses were performed for rCBF, age and gender before PCA was applied and subsequently for each single extracted factor. There was a significantly higher CBF on the right side than on the left side (P<0.001). In the overall analysis, a significant decrease was found in CBF (P=0.05) with increasing age, and this decrease was particularly evident in the left hemisphere (P=0.006). When gender was specifically analysed, CBF was found to decrease significantly with increasing age in females (P=0.037) but not in males. Furthermore, a significant decrease in rCBF with increasing age was found in the brain vertex (P=0.05), left frontotemporal cortex (P=0.012) and temporocingulate cortex (P=0.003). By contrast, relative rCBF in central structures increased with age (P=0.001). The ability of standardisation software and PCA to identify functionally connected brain regions might contribute to a better
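
    The dimension-reduction step reported here, PCA over normalized VOI flow values followed by relating component scores to age, can be sketched with scikit-learn. The numbers are synthetic stand-ins for the 50-subject dataset; this is not the authors' computerised-brain-atlas pipeline.

```python
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical normalized rCBF values: 50 subjects x 54 VOIs (27 left + 27 right).
age = rng.uniform(31, 78, size=50)
rcbf = 1.0 + 0.002 * rng.standard_normal((50, 54)) - 0.0005 * age[:, None]

pca = PCA(n_components=12)           # the study retained 12 components (~81% of variance)
scores = pca.fit_transform(rcbf)
print("variance explained by 12 components:", round(pca.explained_variance_ratio_.sum(), 2))

# Regress the first component's scores on age, analogous to the age analyses above.
slope, intercept, r, p, se = stats.linregress(age, scores[:, 0])
print(f"component 1 vs age: slope = {slope:.4f}, p = {p:.3f}")
```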

  6. Testing for normality

    CERN Document Server

    Thode, Henry C

    2002-01-01

    Describes the selection, design, theory, and application of tests for normality. Covers robust estimation, test power, and univariate and multivariate normality. Contains tests for multivariate normality and both coordinate-dependent and invariant approaches.

  7. Explorations in statistics: the assumption of normality.

    Science.gov (United States)

    Curran-Everett, Douglas

    2017-09-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This twelfth installment of Explorations in Statistics explores the assumption of normality, an assumption essential to the meaningful interpretation of a t test. Although the data themselves can be consistent with a normal distribution, they need not be. Instead, it is the theoretical distribution of the sample mean or the theoretical distribution of the difference between sample means that must be roughly normal. The most versatile approach to assess normality is to bootstrap the sample mean, the difference between sample means, or t itself. We can then assess whether the distributions of these bootstrap statistics are consistent with a normal distribution by studying their normal quantile plots. If we suspect that an inference we make from a t test may not be justified-if we suspect that the theoretical distribution of the sample mean or the theoretical distribution of the difference between sample means is not normal-then we can use a permutation method to analyze our data. Copyright © 2017 the American Physiological Society.
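
    A short illustration of the approach described above: bootstrap the sample mean and inspect a normal quantile plot of the bootstrap distribution. The sample is simulated, and scipy's probplot supplies the quantile-plot coordinates; this is a sketch of the general idea, not the article's own code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
sample = rng.exponential(scale=2.0, size=30)  # skewed raw data

# Bootstrap the sample mean.
boot_means = np.array([rng.choice(sample, size=sample.size, replace=True).mean()
                       for _ in range(10_000)])

# Normal quantile plot of the bootstrap means: (theoretical, ordered) pairs plus a line fit.
(theoretical_q, ordered_means), (slope, intercept, r) = stats.probplot(boot_means, dist="norm")
print(f"correlation of the quantile plot with a straight line: r = {r:.4f}")
# r close to 1 suggests the theoretical distribution of the sample mean is roughly normal,
# so an inference based on the t test is reasonable for these data.
```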

  8. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
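
    The idea can be imitated by simulation: build per-point intervals for the ordered, standardized sample so that, under normality, all points fall inside simultaneously with probability at least 1-α. The sketch below uses a conservative Bonferroni adjustment (α/n per order statistic) for simplicity; the paper's exact construction differs.

```python
import numpy as np

def simultaneous_envelope(n, alpha=0.05, n_sim=20_000, seed=1):
    """Bonferroni-style simultaneous envelope for the ordered standardized sample."""
    rng = np.random.default_rng(seed)
    sims = rng.standard_normal((n_sim, n))
    sims = (sims - sims.mean(axis=1, keepdims=True)) / sims.std(axis=1, ddof=1, keepdims=True)
    sims.sort(axis=1)
    lo = np.quantile(sims, alpha / (2 * n), axis=0)      # lower bound per order statistic
    hi = np.quantile(sims, 1 - alpha / (2 * n), axis=0)  # upper bound per order statistic
    return lo, hi

rng = np.random.default_rng(7)
x = rng.normal(10, 2, size=25)                 # observed sample to be checked
z = np.sort((x - x.mean()) / x.std(ddof=1))    # standardized order statistics
lo, hi = simultaneous_envelope(len(x))
print("all points inside the envelope:", bool(np.all((z >= lo) & (z <= hi))))
```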

  9. Long-lead station-scale prediction of hydrological droughts in South Korea based on bivariate pattern-based downscaling

    Science.gov (United States)

    Sohn, Soo-Jin; Tam, Chi-Yung

    2016-05-01

    Capturing climatic variations in boreal winter to spring (December-May) is essential for properly predicting droughts in South Korea. This study investigates the variability and predictability of the South Korean climate during this extended season, based on observations from 60 station locations and multi-model ensemble (MME) hindcast experiments (1983/1984-2005/2006) archived at the APEC Climate Center (APCC). Multivariate empirical orthogonal function (EOF) analysis results based on observations show that the first two leading modes of winter-to-spring precipitation and temperature variability, which together account for ~80 % of the total variance, are characterized by regional-scale anomalies covering the whole South Korean territory. These modes were also closely related to some of the recurrent large-scale circulation changes in the northern hemisphere during the same season. Consistent with the above, examination of the standardized precipitation evapotranspiration index (SPEI) indicates that drought conditions in South Korea tend to be accompanied by regional-to-continental-scale circulation anomalies over East Asia to the western north Pacific. Motivated by the aforementioned findings on the spatial-temporal coherence among station-scale precipitation and temperature anomalies, a new bivariate and pattern-based downscaling method was developed. The novelty of this method is that precipitation and temperature data were first filtered using multivariate EOFs to enhance their spatial-temporal coherence, before being linked to large-scale circulation variables using canonical correlation analysis (CCA). To test its applicability and to investigate its related potential predictability, a perfect empirical model was first constructed with observed datasets as predictors. Next, a model output statistics (MOS)-type hybrid dynamical-statistical model was developed, using products from nine one-tier climate models as inputs. It was found that, with model sea
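
    The statistical core of the scheme, filtering station precipitation and temperature with multivariate EOFs and then linking the leading components to large-scale predictors through CCA, can be sketched with scikit-learn on synthetic arrays. This is a generic illustration, not the APCC multi-model ensemble configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_years, n_stations, n_grid = 23, 60, 500

# Synthetic station anomalies (precipitation and temperature stacked) and large-scale fields.
station = rng.standard_normal((n_years, 2 * n_stations))
large_scale = rng.standard_normal((n_years, n_grid))

# Step 1: multivariate EOF filtering, keeping only the leading modes of each field.
station_eof = PCA(n_components=2)
station_pcs = station_eof.fit_transform(station)
ls_eof = PCA(n_components=10)
ls_pcs = ls_eof.fit_transform(large_scale)

# Step 2: canonical correlation analysis linking large-scale modes to station modes.
cca = CCA(n_components=2)
cca.fit(ls_pcs, station_pcs)
U, V = cca.transform(ls_pcs, station_pcs)
print("canonical correlations:",
      [round(float(np.corrcoef(U[:, k], V[:, k])[0, 1]), 2) for k in range(2)])

# Step 3: downscaled prediction for a new season, mapped back to station space.
new_ls = ls_eof.transform(rng.standard_normal((1, n_grid)))
pred_station = station_eof.inverse_transform(cca.predict(new_ls))
print("predicted station anomalies, first 5:", np.round(pred_station[0, :5], 2))
```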

  10. Evaluation of Factors Affecting E-Bike Involved Crash and E-Bike License Plate Use in China Using a Bivariate Probit Model

    OpenAIRE

    Guo, Yanyong; Zhou, Jibiao; Wu, Yao; Chen, Jingxu

    2017-01-01

    The primary objective of this study is to evaluate factors affecting e-bike involved crash and license plate use in China. E-bike crashes data were collected from police database and completed through a telephone interview. Noncrash samples were collected by a questionnaire survey. A bivariate probit (BP) model was developed to simultaneously examine the significant factors associated with e-bike involved crash and e-bike license plate and to account for the correlations between them. Margina...
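
    A bivariate probit likelihood is simple to code directly: each pair of binary outcomes contributes a bivariate normal orthant probability whose correlation sign depends on the observed outcomes. The sketch below uses synthetic data and SciPy only; it is a generic illustration of the model class, not the authors' specification or dataset.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)
n = 300  # kept small so the per-observation CDF loop stays quick
x = np.column_stack([np.ones(n), rng.standard_normal(n)])  # intercept + one covariate

# Simulate two correlated binary outcomes (e.g. crash involvement and plate use).
b1_true, b2_true, rho_true = np.array([-0.3, 0.8]), np.array([0.2, -0.5]), 0.5
chol = np.linalg.cholesky(np.array([[1.0, rho_true], [rho_true, 1.0]]))
eps = rng.standard_normal((n, 2)) @ chol.T
y1 = (x @ b1_true + eps[:, 0] > 0).astype(int)
y2 = (x @ b2_true + eps[:, 1] > 0).astype(int)

def neg_loglik(theta):
    b1, b2, rho = theta[:2], theta[2:4], np.tanh(theta[4])  # tanh keeps rho in (-1, 1)
    q1, q2 = 2 * y1 - 1, 2 * y2 - 1
    ll = 0.0
    for i in range(n):  # bivariate normal CDF per observation
        r = rho * q1[i] * q2[i]
        p = stats.multivariate_normal.cdf([q1[i] * (x[i] @ b1), q2[i] * (x[i] @ b2)],
                                          mean=[0.0, 0.0], cov=[[1.0, r], [r, 1.0]])
        ll += np.log(max(p, 1e-300))
    return -ll

res = optimize.minimize(neg_loglik, np.zeros(5), method="Nelder-Mead",
                        options={"maxiter": 2000, "xatol": 1e-3, "fatol": 1e-3})
print("coefficient estimates:", np.round(res.x[:4], 2),
      " rho:", round(float(np.tanh(res.x[4])), 2))
```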

  11. A Hybrid ANN-GA Model to Prediction of Bivariate Binary Responses: Application to Joint Prediction of Occurrence of Heart Block and Death in Patients with Myocardial Infarction.

    Science.gov (United States)

    Mirian, Negin-Sadat; Sedehi, Morteza; Kheiri, Soleiman; Ahmadi, Ali

    2016-01-01

    In medical studies, when the joint occurrence of two events must be predicted, a statistical bivariate model is used. Due to the limitations of the usual statistical models, other methods such as Artificial Neural Networks (ANN) and hybrid models can be used. In this paper, we propose a hybrid Artificial Neural Network-Genetic Algorithm (ANN-GA) model to predict the occurrence of heart block and death in myocardial infarction (MI) patients simultaneously. For fitting and comparing the models, 263 new patients with a definite diagnosis of MI hospitalized in the Cardiology Ward of Hajar Hospital, Shahrekord, Iran, from March, 2014 to March, 2016 were enrolled. Occurrence of heart block and death were employed as bivariate binary outcomes. Bivariate Logistic Regression (BLR), ANN and hybrid ANN-GA models were fitted to the data. Prediction accuracy was used to compare the models. The codes were written in Matlab 2013a and the Zelig package in R 3.2.2. The prediction accuracy of the BLR, ANN and hybrid ANN-GA models was 77.7%, 83.69% and 93.85% for the training data and 78.48%, 84.81% and 96.2% for the test data, respectively. In both the training and test data sets, the hybrid ANN-GA model had better accuracy. The ANN model could be a suitable alternative for modeling and predicting bivariate binary responses when the presuppositions of statistical models are not met by the actual data. In addition, using optimization methods such as the hybrid ANN-GA model could improve the precision of the ANN model.

  12. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis

    OpenAIRE

    Shields, Katherine F.; Bain, Robert E.S.; Cronk, Ryan; Wright, Jim A.; Bartram, Jamie

    2015-01-01

    Background Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. Objectives We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. Methods We performed a bivari...

  13. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus.

    Science.gov (United States)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L; Kreiner, Eskil; Chesi, Alessandra; Zemel, Babette S; Bønnelykke, Klaus; Boer, Cindy G; Ahluwalia, Tarunveer S; Bisgaard, Hans; Evangelou, Evangelos; Heppe, Denise H M; Bonewald, Lynda F; Gorski, Jeffrey P; Ghanbari, Mohsen; Demissie, Serkalem; Duque, Gustavo; Maurano, Matthew T; Kiel, Douglas P; Hsu, Yi-Hsiang; C J van der Eerden, Bram; Ackert-Bicknell, Cheryl; Reppe, Sjur; Gautvik, Kaare M; Raastad, Truls; Karasik, David; van de Peppel, Jeroen; Jaddoe, Vincent W V; Uitterlinden, André G; Tobias, Jonathan H; Grant, Struan F A; Bagos, Pantelis G; Evans, David M; Rivadeneira, Fernando

    2017-07-25

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone mineral density (TBLH-BMD) regions in 10,414 children. The estimated SNP heritability is 43% (95% CI: 34-52%) for TBLH-BMD, and 39% (95% CI: 30-48%) for TB-LM, with a shared genetic component of 43% (95% CI: 29-56%). We identify variants with pleiotropic effects in eight loci, including seven established bone mineral density loci: WNT4, GALNT3, MEPE, CPED1/WNT16, TNFSF11, RIN3, and PPP6R3/LRP5. Variants in the TOM1L2/SREBF1 locus exert opposing effects on TB-LM and TBLH-BMD, and have a stronger association with the former trait. We show that SREBF1 is expressed in murine and human osteoblasts, as well as in human muscle tissue. This is the first bivariate GWAS meta-analysis to demonstrate genetic factors with pleiotropic effects on bone mineral density and lean mass. Bone mineral density and lean skeletal mass are heritable traits. Here, Medina-Gomez and colleagues perform bivariate GWAS analyses of total body lean mass and bone mass density in children, and show genetic loci with pleiotropic effects on both traits.

  14. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone...... mineral density (TBLH-BMD) regions in 10,414 children. The estimated SNP heritability is 43% (95% CI: 34-52%) for TBLH-BMD, and 39% (95% CI: 30-48%) for TB-LM, with a shared genetic component of 43% (95% CI: 29-56%). We identify variants with pleiotropic effects in eight loci, including seven established...... as in human muscle tissue. This is the first bivariate GWAS meta-analysis to demonstrate genetic factors with pleiotropic effects on bone mineral density and lean mass.Bone mineral density and lean skeletal mass are heritable traits. Here, Medina-Gomez and colleagues perform bivariate GWAS analyses of total...

  15. Normalized modes at selected points without normalization

    Science.gov (United States)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem | K - λM | = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well-known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕ^T M ϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but that they are actually intrinsic properties of the pair of matrices K, M, that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, and without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but it has interesting theoretical implications.
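
    The conventional route the abstract contrasts itself with, solving the generalized eigenproblem and then scaling each mode by the square root of its modal mass, looks like this with SciPy; the 3-DOF stiffness and mass matrices are made up for illustration.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical 3-DOF stiffness and mass matrices (symmetric, positive definite).
K = np.array([[ 400., -200.,    0.],
              [-200.,  400., -200.],
              [   0., -200.,  200.]])
M = np.diag([2.0, 2.0, 1.0])

# Generalized eigenproblem K phi = lambda M phi.
lam, phi = eigh(K, M)

# Explicit mass normalization: divide each mode by sqrt(phi^T M phi).
# (eigh may already return M-orthonormal vectors; this step makes that explicit.)
modal_mass = np.einsum("ij,jk,ki->i", phi.T, M, phi)
phi_n = phi / np.sqrt(modal_mass)

# Check: normalized modes satisfy phi^T M phi = I and phi^T K phi = diag(lambda).
print(np.allclose(phi_n.T @ M @ phi_n, np.eye(3)))
print(np.allclose(phi_n.T @ K @ phi_n, np.diag(lam)))
```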

  16. ESTIMATION OF THE SCALE PARAMETER FROM THE RAYLEIGH DISTRIBUTION FROM TYPE II SINGLY AND DOUBLY CENSORED DATA

    Directory of Open Access Journals (Sweden)

    Ahmad Saeed Akhter

    2009-01-01

    Full Text Available As common as the normal distribution is the Rayleigh distribution, which occurs in work on radar, properties of sine waves plus noise, etc. Rayleigh (1880) derived it for the amplitude of sound resulting from many important sources. The Rayleigh distribution is widely used in communication engineering, reliability analysis and applied statistics. Since the Rayleigh distribution has a linearly increasing failure rate, it is appropriate for components which might not have manufacturing defects but age rapidly with time. Several types of electro-vacuum devices have this feature. It is connected with the one-dimensional and two-dimensional random walk and is sometimes referred to as a random walk frequency distribution. It is a special case of the Weibull distribution (1951) of wide applicability. It can be easily derived from the bivariate normal distribution with zero means, equal variances and ρ = 0. For further applications of the Rayleigh distribution, we refer to Johnson and Kotz (1994). Adatia (1995) obtained the best linear unbiased estimator of the Rayleigh scale parameter based on fairly large censored samples. Dyer and Whisend (1973) obtained the BLUE of the scale parameter based on type II censored samples for small N. With the advance of computer technology it is now possible to obtain the BLUE for large samples. Hirai (1978) obtained estimates of the scale parameter of the Rayleigh distribution under type II censoring from the left and from the right, together with the variances of these estimates. In this paper, we estimate the scale parameter from type II singly and doubly censored data from the Rayleigh distribution using Blom's (1958) unbiased nearly best estimates and compare the efficiency of this estimate with the BLUE and the MLE.
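
    For comparison with the estimators discussed in this record, the maximum likelihood estimator of the Rayleigh scale under type II right censoring has a simple closed form, sketched below on simulated data. This is a generic textbook estimator, not the Blom-type nearly best estimate of the paper.

```python
import numpy as np

def rayleigh_scale_mle_type2(x_observed, n_total):
    """MLE of the Rayleigh scale from the r smallest of n_total observations
    (type II right censoring): sigma^2 = [sum x_i^2 + (n - r) * x_(r)^2] / (2r)."""
    x = np.sort(np.asarray(x_observed, float))
    r = x.size
    sigma_sq = (np.sum(x ** 2) + (n_total - r) * x[-1] ** 2) / (2 * r)
    return np.sqrt(sigma_sq)

rng = np.random.default_rng(0)
n, r, sigma_true = 50, 35, 2.0
full_sample = np.sort(rng.rayleigh(scale=sigma_true, size=n))
print("MLE from the 35 smallest of 50 observations:",
      round(rayleigh_scale_mle_type2(full_sample[:r], n), 3))
```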

  17. A new technique for the use of microspheres for the study of intra-cortical distribution of renal blood flow. Results for a normal and sodium overloaded rat. Report of internship performed in the Laboratoire de Physiologie Physico-Chimique (C.E.N. Saclay)

    International Nuclear Information System (INIS)

    Poujeol, P.

    1972-06-01

    This academic work reports the simultaneous study, on the same kidney, of the distribution of glomerular filtrations and the distribution of blood flow rate in the renal cortex. The author combined the technique of perfusion of sodium 14C-ferrocyanide, which allows the measurement of individual glomerular filtrations, with a technique based on the use of microspheres, which allows the assessment of blood flow distribution in the glomeruli of different nephron classes. Experiments were performed on a normal rat and on a rat submitted to a chronic NaCl overload [fr]

  18. A Skew-Normal Mixture Regression Model

    Science.gov (United States)

    Liu, Min; Lin, Tsung-I

    2014-01-01

    A challenge associated with traditional mixture regression models (MRMs), which rest on the assumption of normally distributed errors, is determining the number of unobserved groups. Specifically, even slight deviations from normality can lead to the detection of spurious classes. The current work aims to (a) examine how sensitive the commonly…

  19. Marital Satisfaction amongst Parents of Children with Attention Deficit Hyperactivity Disorder and Normal Children

    Directory of Open Access Journals (Sweden)

    Reza Rostami

    2012-09-01

    Full Text Available Objective: The aim of this study was to compare marital satisfaction between parents of children with attention deficit hyperactivity disorder (ADHD) and parents of normal children. Methods: In this study we selected 400 parents (200 parents of children with ADHD and 200 parents of normal children), whose children's age range was 6-18 years. Data were collected using the Enrich marital satisfaction questionnaire, the Kiddie Schedule for Affective Disorders and Schizophrenia, Present and Lifetime Version (K-SADS-PL), and Conner's questionnaire (parent and self-report forms). For data analysis, SPSS software (version 17), the bivariate χ2 test, and the independent t-test were used. Results: The mean marital satisfaction of parents of normal children was higher than that of parents of ADHD children. In the bivariate χ2 test, the p value was less than 0.05, and the obtained t was greater than the table t (1.96), so it can be concluded that there is a significant difference in marital satisfaction between parents of normal children and parents of ADHD children. The level of marital satisfaction (strongly agree level) was 2.8% lower among parents of ADHD children compared to parents of normal children. Conclusions: The findings indicate that parents with ADHD children have a lower level of marital satisfaction than parents with normal children.

  20. Joint distributions for movements of elements in Sattolo's and the ...

    African Journals Online (AJOL)

    In this paper, we are interested in the joint distribution of two elements j and k; we are able to compute the bivariate generating functions explicitly, although it is quite involved. From it, moments and limiting distributions can be deduced. Furthermore, we compute the probability that elements i and j ever change places in ...

  1. A Hybrid Forecasting Model Based on Bivariate Division and a Backpropagation Artificial Neural Network Optimized by Chaos Particle Swarm Optimization for Day-Ahead Electricity Price

    Directory of Open Access Journals (Sweden)

    Zhilong Wang

    2014-01-01

    Full Text Available In the electricity market, the electricity price plays an inevitable role. Nevertheless, accurate price forecasting, a vital factor affecting both government regulatory agencies and public power companies, remains a huge challenge and a critical problem. Determining how to address the accurate forecasting problem becomes an even more significant task in an era in which electricity is increasingly important. Based on chaos particle swarm optimization (CPSO), the backpropagation artificial neural network (BPANN), and the idea of bivariate division, this paper proposes a bivariate division BPANN (BD-BPANN) method and a CPSO-BD-BPANN method for forecasting the electricity price. The former method creatively transforms the electricity demand and price into a new variable, named DV, which is calculated using the division principle, and forecasts the day-ahead electricity price by multiplying the forecasted values of the DVs by the forecasted values of the demand. Next, to improve the accuracy of BD-BPANN, chaos particle swarm optimization and BD-BPANN are combined to form a novel model, CPSO-BD-BPANN. In this study, CPSO is utilized to optimize the initial parameters of BD-BPANN to make its output more stable than that of the original model. Finally, two forecasting strategies are proposed for different situations.
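
    The bivariate-division idea can be illustrated without the CPSO step: define DV as the ratio of price to demand (an assumption about the direction of the division, which the abstract leaves implicit), train a small neural network on lagged DV values, and recover the price forecast as the predicted DV times the demand forecast. The sketch uses scikit-learn's MLPRegressor on synthetic hourly series and is not the authors' BD-BPANN.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
hours = 24 * 60
demand = 500 + 100 * np.sin(2 * np.pi * np.arange(hours) / 24) + rng.normal(0, 10, hours)
price = 0.08 * demand + rng.normal(0, 2, hours)

dv = price / demand  # the division variable (assumed here to be price/demand)

# Supervised set: predict DV(t) from the previous 24 hourly DV values.
lags = 24
X = np.array([dv[t - lags:t] for t in range(lags, hours)])
y = dv[lags:hours]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:-24], y[:-24])                   # hold out the final day

dv_hat = model.predict(X[-24:])
demand_forecast = demand[-24:]                # stand-in for a separate demand forecast
price_hat = dv_hat * demand_forecast          # day-ahead price = forecast DV x forecast demand
mape = np.mean(np.abs(price_hat - price[-24:]) / price[-24:]) * 100
print(f"day-ahead price MAPE on the held-out day: {mape:.1f}%")
```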

  2. Effectiveness of enforcement levels of speed limit and drink driving laws and associated factors – Exploratory empirical analysis using a bivariate ordered probit model

    Directory of Open Access Journals (Sweden)

    Behram Wali

    2017-06-01

    Full Text Available Contemporary traffic safety research contains little information on quantifying the simultaneous association between drink driving and speeding among fatally injured drivers. The potential correlation between drivers' drink driving and speeding behavior poses a substantial methodological concern which needs investigation. This study therefore focused on investigating the simultaneous impact of socioeconomic factors, fatalities, vehicle ownership, health services and highway agency road safety policies on the enforcement levels of speed limit and drink driving laws. The effectiveness of enforcement levels of speed limit and drink driving laws was investigated through the development of a bivariate ordered probit model, using data extracted from the WHO global status report on road safety in 2013. The consistent and intuitive parameter estimates, along with the statistically significant correlation between response outcomes, confirm the statistical superiority of the bivariate ordered probit model. The results revealed that fatalities per thousand registered vehicles, hospital beds per hundred thousand population and road safety policies are associated with a likely medium or high effectiveness of enforcement levels of speed limit and drink driving laws, respectively. The model also encapsulates the effect of several other agency-related variables and socio-economic status on the response outcomes. Marginal effects are reported for analyzing the impact of such factors on the intermediate categories of the response outcomes. The results of this study are expected to provide useful insights for enforcement programs. In addition, the marginal effects of the explanatory variables may provide useful directions for formulating effective policy countermeasures to address speeding and drink driving behavior.

  3. Corners of normal matrices

    Indian Academy of Sciences (India)

    The structure of general normal matrices is far more complicated than that of two special kinds — hermitian and unitary. There are many interesting theorems for hermitian and unitary matrices whose extensions to arbitrary normal matrices have proved to be extremely recalcitrant (see e.g., [1]). The problem whose study we ...

  4. Normalized medical information visualization.

    Science.gov (United States)

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Somolinos, Roberto; Castro, Antonio; Velázquez, Iker; Moreno, Oscar; García-Pacheco, José L; Pascual, Mario; Salvador, Carlos H

    2015-01-01

    A new mark-up programming language is introduced in order to facilitate and improve the visualization of ISO/EN 13606 dual model-based normalized medical information. This is the first time that visualization of normalized medical information is addressed and the programming language is intended to be used by medical non-IT professionals.

  5. Baby Poop: What's Normal?

    Science.gov (United States)

    ... I'm breast-feeding my newborn and her bowel movements are yellow and mushy. Is this normal for baby poop? Answers from Jay L. Hoecker, M.D. Yellow, mushy bowel movements are perfectly normal for breast-fed babies. Still, ...

  6. Clinical usefulness of myocardial iodine-123-15-(p-iodophenyl)-3(R,S)-methyl-pentadecanoic acid distribution abnormality in patients with mitochondrial encephalomyopathy based on normal data file in bull's-eye polar map

    International Nuclear Information System (INIS)

    Takahashi, Nobukazu; Mitani, Isao; Sumita, Shinichi

    1998-01-01

    Visual interpretation of iodine-123-beta-15-(p-iodophenyl)-3(R,S)-methyl-pentadecanoic acid (123I-BMIPP) myocardial images cannot easily detect mild reduction in tracer uptake. Objective assessment of myocardial 123I-BMIPP maldistributions at rest was attempted using a bull's-eye map and its normal data file for detecting myocardial damage in patients with mitochondrial encephalomyopathy. Six patients, two with Kearns-Sayre syndrome and four with mitochondrial myopathy, encephalopathy, lactic acidosis, and strokelike episodes (MELAS), and 10 normal subjects were studied. Fractional myocardial uptake of 123I-BMIPP was also measured by dynamic static imaging to assess the global myocardial free fatty acid. These data were compared with the cardiothoracic ratio measured by chest radiography and left ventricular ejection fraction assessed by echocardiography. Abnormal cardiothoracic ratio and lower ejection fraction were detected in only one patient with Kearns-Sayre syndrome. Abnormal fractional myocardial uptake was detected in two patients (1.61%, 1.91%), whereas abnormal regional 123I-BMIPP uptake assessed by the bull's-eye map was detected in five patients (83%). All patients showed abnormal uptake in the anterior portion, and one showed progressive atrioventricular conduction abnormality and systolic dysfunction with extended 123I-BMIPP abnormal uptake. The results suggest that assessment based on the normal data file in a bull's-eye polar map is clinically useful for detection of myocardial damage in patients with mitochondrial encephalomyopathy. (author)

  7. ON SKEW-NORMAL MODEL FOR ECONOMICALLY ACTIVE POPULATION

    Directory of Open Access Journals (Sweden)

    OLOSUNDE AKINLOLU A

    2011-04-01

    Full Text Available The literature on skew-symmetric distributions has grown rapidly in recent years, but so far there has been no publication applying this type of probability model to the description of economically active population data. In this paper, we provide an extension of the skew-normal distribution, which also belongs to the skewed class of normal distributions but has an additional shape parameter δ. Some properties of this distribution are presented and, finally, we consider fitting it to economically active population data. The model exhibited better behaviour than the normal and skew-normal distributions.

  8. FULL-THICKNESS SMALL INTESTINE NECROSIS WITH MIDGUT VOLVULUS, DISTRIBUTED IN A PATCHY FASHION, IS REVERSIBLE WITH MODERATE BLOOD FLOW : RESUMPTION OF NORMAL FUNCTION TO NON-VIABLE INTESTINE

    OpenAIRE

    AMANO, HIZURU; UCHIDA, HIROO; KAWASHIMA, HIROSHI; TANAKA, YUJIRO; KISHIMOTO, HIROSHI

    2014-01-01

    ABSTRACT Midgut volvulus is a highly life-threatening condition that carries a high risk of short gut syndrome. We report a case of catastrophic neonatal midgut volvulus in which second-look laparotomy revealed apparently non-viable remnant small intestine but with a moderate blood supply. Full-thickness small intestine necrosis was distributed in a patchy fashion, with non-viable and necrotic areas distributed so widely that no portion of the intestine could be resected. A section of full-th...

  9. Making nuclear 'normal'

    International Nuclear Information System (INIS)

    Haehlen, Peter; Elmiger, Bruno

    2000-01-01

    The mechanics of the Swiss NPPs' 'come and see' programme 1995-1999 were illustrated in our contributions to all PIME workshops since 1996. Now, after four annual 'waves', all the country has been covered by the NPPs' invitation to dialogue. This makes PIME 2000 the right time to shed some light on one particular objective of this initiative: making nuclear 'normal'. The principal aim of the 'come and see' programme, namely to give the Swiss NPPs 'a voice of their own' by the end of the nuclear moratorium 1990-2000, has clearly been attained and was commented on during earlier PIMEs. It is, however, equally important that Swiss nuclear energy not only made progress in terms of public 'presence', but also in terms of being perceived as a normal part of industry, as a normal branch of the economy. The message that Swiss nuclear energy is nothing but a normal business involving normal people, was stressed by several components of the multi-prong campaign: - The speakers in the TV ads were real - 'normal' - visitors' guides and not actors; - The testimonials in the print ads were all real NPP visitors - 'normal' people - and not models; - The mailings inviting a very large number of associations to 'come and see' activated a typical channel of 'normal' Swiss social life; - Spending money on ads (a new activity for Swiss NPPs) appears to have resulted in being perceived by the media as a normal branch of the economy. Today we feel that the 'normality' message has well been received by the media. In the controversy dealing with antinuclear arguments brought forward by environmental organisations journalists nowadays as a rule give nuclear energy a voice - a normal right to be heard. As in a 'normal' controversy, the media again actively ask themselves questions about specific antinuclear claims, much more than before 1990 when the moratorium started. The result is that in many cases such arguments are discarded by journalists, because they are, e.g., found to be

  10. New spatial upscaling methods for multi-point measurements: From normal to p-normal

    Science.gov (United States)

    Liu, Feng; Li, Xin

    2017-12-01

    Careful attention must be given to determining whether the geophysical variables of interest are normally distributed, since the assumption of a normal distribution may not accurately reflect the probability distribution of some variables. As a generalization of the normal distribution, the p-normal distribution and its corresponding maximum likelihood estimation (the least power estimation, LPE) were introduced in upscaling methods for multi-point measurements. Six methods, including three normal-based methods, i.e., arithmetic average, least square estimation, block kriging, and three p-normal-based methods, i.e., LPE, geostatistics LPE and inverse distance weighted LPE are compared in two types of experiments: a synthetic experiment to evaluate the performance of the upscaling methods in terms of accuracy, stability and robustness, and a real-world experiment to produce real-world upscaling estimates using soil moisture data obtained from multi-scale observations. The results show that the p-normal-based methods produced lower mean absolute errors and outperformed the other techniques due to their universality and robustness. We conclude that introducing appropriate statistical parameters into an upscaling strategy can substantially improve the estimation, especially if the raw measurements are disorganized; however, further investigation is required to determine which parameter is the most effective among variance, spatial correlation information and parameter p.
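
    The least power estimation step is easy to sketch: for a chosen p, the upscaled value is the location that minimizes the sum of absolute deviations raised to the power p (p = 2 recovers the least-squares mean, p = 1 the median). The soil-moisture numbers below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lpe(values, p):
    """Least power estimate: argmin over mu of sum |x_i - mu|^p."""
    x = np.asarray(values, float)
    res = minimize_scalar(lambda mu: np.sum(np.abs(x - mu) ** p),
                          bounds=(x.min(), x.max()), method="bounded")
    return res.x

# Hypothetical multi-point soil-moisture measurements (m^3/m^3) within one pixel.
obs = [0.21, 0.19, 0.23, 0.22, 0.35, 0.20, 0.18, 0.24]  # one disorganized outlier

print("arithmetic mean:", round(float(np.mean(obs)), 3))
print("LPE, p = 1.2   :", round(lpe(obs, 1.2), 3))  # pulled less by the outlier
print("LPE, p = 2     :", round(lpe(obs, 2.0), 3))  # matches the least-squares mean
```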

  11. Normality in Analytical Psychology

    Science.gov (United States)

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  12. Normal Female Reproductive Anatomy

    Science.gov (United States)

    ... an inner lining called the endometrium. Normal female reproductive system anatomy. Topics/Categories: Anatomy -- Gynecologic Type: Color, Medical Illustration Source: National Cancer Institute Creator: Terese Winslow (Illustrator) AV Number: CDR609921 Date Created: November 17, 2014 Date Added: ...

  13. Normal growth and development

    Science.gov (United States)

    A child's growth and development can be divided into four periods: Infancy Preschool years Middle childhood years Adolescence Soon after birth, an infant normally loses about 5% to 10% of their birth weight. By about age ...

  14. Normal pressure hydrocephalus

    Science.gov (United States)

    Hydrocephalus - occult; Hydrocephalus - idiopathic; Hydrocephalus - adult; Hydrocephalus - communicating; Dementia - hydrocephalus; NPH ... Ferri FF. Normal pressure hydrocephalus. In: Ferri FF, ed. ... Elsevier; 2016:chap 648. Rosenberg GA. Brain edema and disorders ...

  15. Normal Functioning Family

    Science.gov (United States)


  16. Normal Pressure Hydrocephalus

    Science.gov (United States)

    ... improves the chance of a good recovery. Without treatment, symptoms may worsen and cause death. What research is being done? The NINDS conducts and supports research on neurological disorders, including normal pressure hydrocephalus. Research on disorders such ...

  17. Interaction between a normal shock wave and a turbulent boundary layer at high transonic speeds. Part 1: Pressure distribution. Part 2: Wall shear stress. Part 3: Simplified formulas for the prediction of surface pressures and skin friction

    Science.gov (United States)

    Adamson, T. C., Jr.; Liou, M. S.; Messiter, A. F.

    1980-01-01

    An asymptotic description is derived for the interaction between a shock wave and a turbulent boundary layer in transonic flow, for a particular limiting case. The dimensionless difference between the external flow velocity and critical sound speed is taken to be much smaller than one, but large in comparison with the dimensionless friction velocity. The basic results are derived for a flat plate, and corrections for longitudinal wall curvature and for flow in a circular pipe are also shown. Solutions are given for the wall pressure distribution and the shape of the shock wave. Solutions for the wall shear stress are obtained, and a criterion for incipient separation is derived. Simplified solutions for both the wall pressure and skin friction distributions in the interaction region are given. These results are presented in a form suitable for use in computer programs.

  18. Normal-Force and Hinge-Moment Characteristics at Transonic Speeds of Flap-Type Ailerons at Three Spanwise Locations on a 4-Percent-Thick Sweptback-Wing-Body Model and Pressure-Distribution Measurements on an Inboard Aileron

    Science.gov (United States)

    Runckel, Jack F.; Hieser, Gerald

    1961-01-01

    An investigation has been conducted at the Langley 16-foot transonic tunnel to determine the loading characteristics of flap-type ailerons located at inboard, midspan, and outboard positions on a 45 deg. sweptback-wing-body combination. Aileron normal-force and hinge-moment data have been obtained at Mach numbers from 0.80 to 1.03, at angles of attack up to about 27 deg., and at aileron deflections between approximately -15 deg. and 15 deg. Results of the investigation indicate that the loading over the ailerons was established by the wing-flow characteristics, and the loading shapes were irregular in the transonic speed range. The spanwise location of the aileron had little effect on the values of the slope of the curves of hinge-moment coefficient against aileron deflection, but the inboard aileron had the greatest value of the slope of the curves of hinge-moment coefficient against angle of attack and the outboard aileron had the least. Hinge-moment and aileron normal-force data taken with strain-gage instrumentation are compared with data obtained with pressure measurements.

  19. VolHOG: a volumetric object recognition approach based on bivariate histograms of oriented gradients for vertebra detection in cervical spine MRI.

    Science.gov (United States)

    Daenzer, Stefan; Freitag, Stefan; von Sachsen, Sandra; Steinke, Hanno; Groll, Mathias; Meixensberger, Jürgen; Leimert, Mario

    2014-08-01

    The automatic recognition of vertebrae in volumetric images is an important step toward automatic spinal diagnosis and therapy support systems. There are many applications such as the detection of pathologies and segmentation which would benefit from automatic initialization by the detection of vertebrae. One possible application is the initialization of local vertebral segmentation methods, eliminating the need for manual initialization by a human operator. Automating the initialization process would optimize the clinical workflow. However, automatic vertebra recognition in magnetic resonance (MR) images is a challenging task due to noise in images, pathological deformations of the spine, and image contrast variations. This work presents a fully automatic algorithm for 3D cervical vertebra detection in MR images. We propose a machine learning method for cervical vertebra detection based on new features combined with a linear support vector machine for classification. An algorithm for bivariate gradient orientation histogram generation from three-dimensional raster image data is introduced which allows us to describe three-dimensional objects using the authors' proposed bivariate histograms. A detailed performance evaluation on 21 T2-weighted MR images of the cervical vertebral region is given. A single model for cervical vertebrae C3-C7 is generated and evaluated. The results show that the generic model performs equally well for each of the cervical vertebrae C3-C7. The algorithm's performance is also evaluated on images containing various levels of artificial noise. The results indicate that the proposed algorithm achieves good results despite the presence of severe image noise. The proposed detection method delivers accurate locations of cervical vertebrae in MR images which can be used in diagnosis and therapy. In order to achieve absolute comparability with the results of future work, the authors are following an open data approach by making the image dataset

  20. Fitting statistical distributions the generalized lambda distribution and generalized bootstrap methods

    CERN Document Server

    Karian, Zaven A

    2000-01-01

    Throughout the physical and social sciences, researchers face the challenge of fitting statistical distributions to their data. Although the study of statistical modelling has made great strides in recent years, the number and variety of distributions to choose from-all with their own formulas, tables, diagrams, and general properties-continue to create problems. For a specific application, which of the dozens of distributions should one use? What if none of them fit well?Fitting Statistical Distributions helps answer those questions. Focusing on techniques used successfully across many fields, the authors present all of the relevant results related to the Generalized Lambda Distribution (GLD), the Generalized Bootstrap (GB), and Monte Carlo simulation (MC). They provide the tables, algorithms, and computer programs needed for fitting continuous probability distributions to data in a wide variety of circumstances-covering bivariate as well as univariate distributions, and including situations where moments do...

  1. Measurement of the normalized Z/γ* → μ+μ- transverse momentum distribution in pp̄ collisions at √s = 1.96 TeV

    International Nuclear Information System (INIS)

    2010-01-01

    We present a new measurement of the Z/γ* transverse momentum distribution in the range 0-330 GeV, in proton-antiproton collisions at √s = 1.96 TeV. The measurement uses 0.97 fb⁻¹ of integrated luminosity recorded by the D0 experiment and is the first using the Z/γ* → μ⁺μ⁻ + X channel at this center-of-mass energy. This is also the first measurement of the Z/γ* transverse momentum distribution that presents the result at the level of particles entering the detector, minimizing dependence on theoretical models. As any momentum of the Z/γ* in the plane transverse to the incoming beams must be balanced by some recoiling system, primarily the result of QCD radiation in the initial state, this variable is an excellent probe of the underlying process. Tests of the predictions of QCD calculations and current event generators show they have varied success in describing the data. Using this measurement as an input to theoretical predictions will allow for a better description of hadron collider data and hence it will increase experimental sensitivity to rare signals.

  2. Measurement of the normalized $Z/\gamma^* \to \mu^+\mu^-$ transverse momentum distribution in $p\bar{p}$ collisions at $\sqrt{s}=1.96$ TeV

    Energy Technology Data Exchange (ETDEWEB)

    Abazov, Victor Mukhamedovich; /Dubna, JINR; Abbott, Braden Keim; /Oklahoma U.; Abolins, Maris A.; /Michigan State U.; Acharya, Bannanje Sripath; /Tata Inst.; Adams, Mark Raymond; /Illinois U., Chicago; Adams, Todd; /Florida State U.; Alexeev, Guennadi D.; /Dubna, JINR; Alkhazov, Georgiy D.; /St. Petersburg, INP; Alton, Andrew K.; /Michigan U. /Augustana Coll., Sioux Falls; Alverson, George O.; /Northeastern U.; Alves, Gilvan Augusto; /Rio de Janeiro, CBPF /Nijmegen U.

    2010-06-01

    We present a new measurement of the Z/γ* transverse momentum distribution in the range 0-330 GeV, in proton-antiproton collisions at √s = 1.96 TeV. The measurement uses 0.97 fb⁻¹ of integrated luminosity recorded by the D0 experiment and is the first using the Z/γ* → μ⁺μ⁻ + X channel at this center-of-mass energy. This is also the first measurement of the Z/γ* transverse momentum distribution that presents the result at the level of particles entering the detector, minimizing dependence on theoretical models. As any momentum of the Z/γ* in the plane transverse to the incoming beams must be balanced by some recoiling system, primarily the result of QCD radiation in the initial state, this variable is an excellent probe of the underlying process. Tests of the predictions of QCD calculations and current event generators show they have varied success in describing the data. Using this measurement as an input to theoretical predictions will allow for a better description of hadron collider data and hence it will increase experimental sensitivity to rare signals.

  3. Monitoring the normal body

    DEFF Research Database (Denmark)

    Nissen, Nina Konstantin; Holm, Lotte; Baarts, Charlotte

    2015-01-01

    provides us with knowledge about how to prevent future overweight or obesity. This paper investigates body size ideals and monitoring practices among normal-weight and moderately overweight people. Methods : The study is based on in-depth interviews combined with observations. 24 participants were...... recruited by strategic sampling based on self-reported BMI 18.5-29.9 kg/m2 and socio-demographic factors. Inductive analysis was conducted. Results : Normal-weight and moderately overweight people have clear ideals for their body size. Despite being normal weight or close to this, they construct a variety...... of practices for monitoring their bodies based on different kinds of calculations of weight and body size, observations of body shape, and measurements of bodily firmness. Biometric measurements are familiar to them as are health authorities' recommendations. Despite not belonging to an extreme BMI category...

  4. Efecto Zeeman Normal

    OpenAIRE

    Calderón Chamochumbi, Carlos

    2015-01-01

    The Normal Zeeman Effect is described and a general derivation is presented of the torque experienced by a magnetic dipole due to its interaction with an external magnetic field. The calculations for the differential element of magnetic potential energy and for the conventional magnetic potential energy are standard.

  5. The normal holonomy group

    International Nuclear Information System (INIS)

    Olmos, C.

    1990-05-01

    The restricted holonomy group of a Riemannian manifold is a compact Lie group and its representation on the tangent space is a product of irreducible representations and a trivial one. Each one of the non-trivial factors is either an orthogonal representation of a connected compact Lie group which acts transitively on the unit sphere or it is the isotropy representation of a single Riemannian symmetric space of rank ≥ 2. We prove that all these properties are also true for the representation on the normal space of the restricted normal holonomy group of any submanifold of a space of constant curvature. 4 refs

  6. Medically-enhanced normality

    DEFF Research Database (Denmark)

    Møldrup, Claus; Traulsen, Janine Morgall; Almarsdóttir, Anna Birna

    2003-01-01

    Objective: To consider public perspectives on the use of medicines for non-medical purposes, a usage called medically-enhanced normality (MEN). Method: Examples from the literature were combined with empirical data derived from two Danish research projects: a Delphi internet study and a Telebus......, to optimise economic, working and family conditions. The term "doping" does not cover or explain the use of medicines as enhancement among healthy non-athletes. Conclusion: We recommend wider use of the term medically-enhanced normality as a conceptual framework for understanding and analysing perceptions...... of what is considered rational medicine use in contemporary society....

  7. Research on Normal Human Plantar Pressure Test

    Directory of Open Access Journals (Sweden)

    Liu Xi Yang

    2016-01-01

    Full Text Available An FSR400 pressure sensor, an nRF905 wireless transceiver and an MSP40 SCM are used to design an insole pressure collection system, and LabVIEW is used to build the HMI for data acquisition. A quantity of normal human plantar pressure data was collected, the pressure distribution across five stages of the swing phase during walking was analysed statistically, and the grid closeness degree was used for plantar pressure distribution pattern recognition. Algorithm simulation and experimental results demonstrate that the method is feasible.

  8. Central obesity and normal-weight central obesity among adults attending healthcare facilities in Buffalo City Metropolitan Municipality, South Africa: a cross-sectional study.

    Science.gov (United States)

    Owolabi, Eyitayo Omolara; Ter Goon, Daniel; Adeniyi, Oladele Vincent

    2017-12-28

    Central obesity (CO) confers a significant threat on the cardio-metabolic health of individuals, independently of overall obesity. Disparities in the measures of fat distribution lead to misclassification of individuals who are at risk of cardio-metabolic diseases. This study sought to determine the prevalence and correlates of central obesity and normal-weight central obesity among adults attending selected healthcare facilities in Buffalo City Metropolitan Municipality (BCMM), South Africa, assess their health risk and examine the association between central obesity and cardio-metabolic diseases among adults with normal weight, measured by body mass index (BMI). A cross-sectional survey of 998 adults was carried out at the three largest outpatient clinics in BCMM. Overall and central obesity were assessed using BMI, waist circumference (WC), waist-to-hip ratio (WHR) and waist-to-height ratio (WHTR). The WHO STEPwise questionnaire was used for data collection. Blood pressure and blood glucose were measured. Normal-weight central obesity was defined as CO among individuals with normal weight, as assessed by BMI. Health risk levels were assessed using the National Institute for Health and Clinical Excellence (NICE) BMI-WC composite index. Bivariate and multivariate analyses were used to determine the prevalence of CO, normal-weight central obesity and the predictors of CO. The mean age of participants was 42.6 (± 16.5) years. The prevalence of CO was 67.0, 58.0 and 71.0% by WC, WHR and WHTR, respectively. The prevalence of normal-weight central obesity was 26.9, 36.9 and 29.5% by WC, WHR and WHTR, respectively. About 41% of the participants had a very high health risk, 13% had increased risk or high risk and 33% had no health risk. Central obesity was significantly associated with hypertension but not associated with diabetes among those with normal weight (by BMI). Female sex, age over 30 years, marriage, secondary or tertiary level of education, non
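
    A minimal Python sketch of how such anthropometric flags can be computed is given below. The cutoff values (WC, WHR, WHtR and the normal-weight BMI band) are commonly cited defaults, not necessarily the exact thresholds used in the BCMM study, and the function name is illustrative.

    # Hedged sketch: central obesity and normal-weight central obesity flags.
    # Cutoffs below are assumed, commonly used defaults, not the study's own.
    def anthropometric_flags(weight_kg, height_m, waist_cm, hip_cm, is_male):
        bmi = weight_kg / height_m ** 2
        whr = waist_cm / hip_cm
        whtr = waist_cm / (height_m * 100.0)
        central_by_wc = waist_cm >= (94.0 if is_male else 80.0)   # assumed cutoff
        central_by_whr = whr >= (0.90 if is_male else 0.85)       # assumed cutoff
        central_by_whtr = whtr >= 0.5                             # assumed cutoff
        normal_weight = 18.5 <= bmi < 25.0
        return {
            "bmi": round(bmi, 1),
            "central_obesity_wc": central_by_wc,
            "central_obesity_whr": central_by_whr,
            "central_obesity_whtr": central_by_whtr,
            "normal_weight_central_obesity": normal_weight and central_by_wc,
        }

    print(anthropometric_flags(70.0, 1.70, 88.0, 98.0, is_male=False))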

  9. Corners of normal matrices

    Indian Academy of Sciences (India)

    We study various conditions on matrices B and C under which they can be the off-diagonal blocks of a partitioned normal matrix.

  10. Normality in Analytical Psychology

    Directory of Open Access Journals (Sweden)

    Steve Myers

    2013-11-01

    Full Text Available Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity.

  11. Normalized information distance

    NARCIS (Netherlands)

    Vitányi, P.M.B.; Balbach, F.J.; Cilibrasi, R.L.; Li, M.; Emmert-Streib, F.; Dehmer, M.

    2009-01-01

    The normalized information distance is a universal distance measure for objects of all kinds. It is based on Kolmogorov complexity and thus uncomputable, but there are ways to utilize it. First, compression algorithms can be used to approximate the Kolmogorov complexity if the objects have a string

  12. Histomorphometric Assessment of Cancellous and Cortical Bone Material Distribution in the Proximal Humerus of Normal and Osteoporotic Individuals: Significantly Reduced Bone Stock in the Metaphyseal and Subcapital Regions of Osteoporotic Individuals.

    Science.gov (United States)

    Sprecher, Christoph M; Schmidutz, Florian; Helfen, Tobias; Richards, R Geoff; Blauth, Michael; Milz, Stefan

    2015-12-01

    Osteoporosis is a systemic disorder predominantly affecting postmenopausal women but also men at an advanced age. Both genders may suffer from low-energy fractures of, for example, the proximal humerus when reduction of the bone stock and/or quality has occurred. The aim of the current study was to compare the amount of bone in typical fracture zones of the proximal humerus in osteoporotic and non-osteoporotic individuals. The amount of bone in the proximal humerus was determined histomorphometrically in frontal plane sections. The donor bones were allocated to normal and osteoporotic groups using the T-score from distal radius DXA measurements of the same extremities. The T-score evaluation was done according to WHO criteria. Regional thickness of the subchondral plate and the metaphyseal cortical bone were measured using interactive image analysis. At all measured locations the amount of cancellous bone was significantly lower in individuals from the osteoporotic group compared to the non-osteoporotic one. The osteoporotic group showed more significant differences between regions of the same bone than the non-osteoporotic group. In both groups the subchondral cancellous bone and the subchondral plate were least affected by bone loss. In contrast, the medial metaphyseal region in the osteoporotic group exhibited higher bone loss in comparison to the lateral side. This observation may explain prevailing fracture patterns, which frequently involve compression fractures, and certainly has an influence on the stability of implants placed in this medial region. It should be considered when planning the anchoring of osteosynthesis materials in osteoporotic patients with fractures of the proximal humerus.

  13. Generalized Skew-Normal Negentropy and Its Application to Fish Condition Factor Time Series

    OpenAIRE

    Reinaldo B. Arellano-Valle; Javier E. Contreras-Reyes; Milan Stehlík

    2017-01-01

    The problem of measuring the disparity of a particular probability density function from a normal one has been addressed in several recent studies. The most used technique to deal with the problem has been exact expressions using information measures over particular distributions. In this paper, we consider a class of asymmetric distributions with a normal kernel, called Generalized Skew-Normal (GSN) distributions. We measure the degrees of disparity of these distributions from the normal dis...

  14. Testing against "normal" with environmental data.

    Science.gov (United States)

    Kilgour, Bruce W; Somers, Keith M; Barrett, Timothy J; Munkittrick, Kelly R; Francis, Anthony P

    2017-01-01

    Normal ranges are some fraction of a reference distribution deemed to represent an expected condition, typically 95%. They are frequently used as the basis for generic criteria for monitoring programs designed to test whether a sample is outside of "normal," as in reference-condition approach studies. Normal ranges are also the basis for criteria for more classic environmental effects monitoring programs designed to detect differences in mean responses between reference and exposure areas. Limits on normal ranges are estimated with error that varies depending largely on sample size. Direct comparison of a sample or a mean to estimated limits of a normal range will, with some frequency, lead to incorrect conclusions about whether a sample or a mean is inside or outside the normal range when the sample or the mean is near the limit. Those errors can have significant costs and risk implications. This article describes tests based on noncentral distributions that are appropriate for quantifying the likelihood that samples or means are outside a normal range. These noncentral tests reverse the burden of evidence (assuming that the sample or mean is at or outside normal), and thereby encourage proponents to collect more robust sample sizes that will demonstrate that the sample or mean is not at the limits or beyond the normal range. These noncentral equivalence and interval tests can be applied to uni- and multivariate responses, and to simple (e.g., upstream vs downstream) or more complex (e.g., before vs after, or upstream vs downstream) study designs. Statistical procedures for the various tests are illustrated with benthic invertebrate community data collected as part of the Regional Aquatics Monitoring Program (RAMP) in the vicinity of oil sands operations in northern Alberta, Canada. An Excel workbook with functions and calculations to carry out the various tests is provided in the online Supplemental Data. Integr Environ Assess Manag 2017;13:188-197. © 2016 SETAC
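
    To illustrate the reversed burden of evidence described above, the Python sketch below runs a simple TOST-style equivalence test that a sample mean lies inside fixed normal-range limits. It is a simplified stand-in for the noncentral-distribution procedures of the article, with made-up data; the Excel workbook mentioned in the abstract remains the authors' reference implementation.

    # Hedged sketch: TOST-style equivalence test against fixed normal-range
    # limits [lower, upper]. Simplified illustration only, not the article's
    # exact noncentral procedure.
    import numpy as np
    from scipy import stats

    def equivalence_test(sample, lower, upper, alpha=0.05):
        x = np.asarray(sample, dtype=float)
        n = len(x)
        se = x.std(ddof=1) / np.sqrt(n)
        t_low = (x.mean() - lower) / se
        t_high = (x.mean() - upper) / se
        p_low = stats.t.sf(t_low, df=n - 1)     # H0: mean <= lower
        p_high = stats.t.cdf(t_high, df=n - 1)  # H0: mean >= upper
        p = max(p_low, p_high)                  # TOST decision p-value
        return p, p < alpha                     # True => demonstrably inside range

    rng = np.random.default_rng(1)
    p, inside = equivalence_test(rng.normal(10.0, 1.0, size=30), 8.0, 12.0)
    print(f"TOST p = {p:.4f}, inside normal range: {inside}")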

  15. The Absolute Normal Scores Test for Symmetry

    Science.gov (United States)

    Penfield, Douglas A.; Sachdeva, Darshan

    1976-01-01

    The absolute normal scores test is described as a test for the symmetry of a distribution of scores about a location parameter. The test is compared to the sign test and the Wilcoxon test as an alternative to the "t"-test. (Editor/RK)
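
    The construction of a normal-scores test for symmetry can be sketched in Python as follows: signed ranks of the deviations from the hypothesized centre are replaced by van der Waerden-type normal scores and standardized. This is a generic version of the idea with simulated data, not necessarily the exact statistic studied in the article.

    # Hedged sketch: normal-scores signed-rank test of symmetry about theta0.
    # Under symmetry the standardized statistic is approximately N(0, 1).
    import numpy as np
    from scipy import stats

    def normal_scores_symmetry_test(x, theta0=0.0):
        d = np.asarray(x, dtype=float) - theta0
        d = d[d != 0.0]
        n = len(d)
        ranks = stats.rankdata(np.abs(d))
        scores = stats.norm.ppf(0.5 + ranks / (2.0 * (n + 1.0)))
        t = np.sum(np.sign(d) * scores)
        z = t / np.sqrt(np.sum(scores ** 2))    # Var(T | |d|) under H0
        return z, 2.0 * stats.norm.sf(abs(z))

    rng = np.random.default_rng(0)
    print(normal_scores_symmetry_test(rng.standard_normal(50)))         # symmetric
    print(normal_scores_symmetry_test(rng.exponential(1.0, 50) - 1.0))  # skewed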

  16. Normal Weight Dyslipidemia

    DEFF Research Database (Denmark)

    Ipsen, David Hojland; Tveden-Nyborg, Pernille; Lykkesfeldt, Jens

    2016-01-01

    Objective: The liver coordinates lipid metabolism and may play a vital role in the development of dyslipidemia, even in the absence of obesity. Normal weight dyslipidemia (NWD) and patients with nonalcoholic fatty liver disease (NAFLD) who do not have obesity constitute a unique subset...... of individuals characterized by dyslipidemia and metabolic deterioration. This review examined the available literature on the role of the liver in dyslipidemia and the metabolic characteristics of patients with NAFLD who do not have obesity. Methods: PubMed was searched using the following keywords: nonobese......, dyslipidemia, NAFLD, NWD, liver, and metabolically obese/unhealthy normal weight. Additionally, article bibliographies were screened, and relevant citations were retrieved. Studies were excluded if they had not measured relevant biomarkers of dyslipidemia. Results: NWD and NAFLD without obesity share a similar...

  17. Idiopathic Normal Pressure Hydrocephalus

    Directory of Open Access Journals (Sweden)

    Basant R. Nassar BS

    2016-04-01

    Full Text Available Idiopathic normal pressure hydrocephalus (iNPH) is a potentially reversible neurodegenerative disease commonly characterized by a triad of dementia, gait, and urinary disturbance. Advancements in diagnosis and treatment have aided in properly identifying and improving symptoms in patients. However, a large proportion of iNPH patients remain either undiagnosed or misdiagnosed. Using the PubMed search engine with the keywords “normal pressure hydrocephalus,” “diagnosis,” “shunt treatment,” “biomarkers,” “gait disturbances,” “cognitive function,” “neuropsychology,” “imaging,” and “pathogenesis,” articles were obtained for this review. The majority of the articles were retrieved from the past 10 years. The purpose of this review article is to aid general practitioners in further understanding current findings on the pathogenesis, diagnosis, and treatment of iNPH.

  18. Neuroethics beyond Normal.

    Science.gov (United States)

    Shook, John R; Giordano, James

    2016-01-01

    An integrated and principled neuroethics offers ethical guidelines able to transcend conventional and medical reliance on normality standards. Elsewhere we have proposed four principles for wise guidance on human transformations. Principles like these are already urgently needed, as bio- and cyberenhancements are rapidly emerging. Context matters. Neither "treatments" nor "enhancements" are objectively identifiable apart from performance expectations, social contexts, and civic orders. Lessons learned from disability studies about enablement and inclusion suggest a fresh way to categorize modifications to the body and its performance. The term "enhancement" should be broken apart to permit recognition of enablements and augmentations, and kinds of radical augmentation for specialized performance. Augmentations affecting the self, self-worth, and self-identity of persons require heightened ethical scrutiny. Reversibility becomes the core problem, not the easy answer, as augmented persons may not cooperate with either decommissioning or displacement into unaccommodating societies. We conclude by indicating how our four principles of self-creativity, nonobsolescence, empowerment, and citizenship establish a neuroethics beyond normal that is better prepared for a future in which humans and their societies are going so far beyond normal.

  19. Ethics and "normal birth".

    Science.gov (United States)

    Lyerly, Anne Drapkin

    2012-12-01

    The concept of "normal birth" has been promoted as ideal by several international organizations, although debate about its meaning is ongoing. In this article, I examine the concept of normalcy to explore its ethical implications and raise a trio of concerns. First, in its emphasis on nonuse of technology as a goal, the concept of normalcy may marginalize women for whom medical intervention is necessary or beneficial. Second, in its emphasis on birth as a socially meaningful event, the mantra of normalcy may unintentionally avert attention to meaning in medically complicated births. Third, the emphasis on birth as a normal and healthy event may be a contributor to the long-standing tolerance for the dearth of evidence guiding the treatment of illness during pregnancy and the failure to responsibly and productively engage pregnant women in health research. Given these concerns, it is worth debating not just what "normal birth" means, but whether the term as an ideal earns its keep. © 2012, Copyright the Authors Journal compilation © 2012, Wiley Periodicals, Inc.

  20. Generalized Skew-Normal Negentropy and Its Application to Fish Condition Factor Time Series

    Directory of Open Access Journals (Sweden)

    Reinaldo B. Arellano-Valle

    2017-10-01

    Full Text Available The problem of measuring the disparity of a particular probability density function from a normal one has been addressed in several recent studies. The most used technique to deal with the problem has been exact expressions using information measures over particular distributions. In this paper, we consider a class of asymmetric distributions with a normal kernel, called Generalized Skew-Normal (GSN) distributions. We measure the degrees of disparity of these distributions from the normal distribution by using exact expressions for the GSN negentropy in terms of cumulants. Specifically, we focus on skew-normal and modified skew-normal distributions. Then, we establish the Kullback–Leibler divergences between each GSN distribution and the normal one in terms of their negentropies to develop hypothesis testing for normality. Finally, we apply this result to condition factor time series of anchovies off northern Chile.
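
    The negentropy discussed above can also be evaluated numerically for the ordinary skew-normal case, as in the Python sketch below: the differential entropy of the skew-normal law is computed by quadrature and subtracted from the entropy of a Gaussian with the same variance. This is only an illustration of the quantity; the paper derives exact cumulant-based expressions for the wider GSN class.

    # Hedged sketch: negentropy J(X) = H(Gaussian with same variance) - H(X)
    # for a skew-normal law, computed numerically (not the paper's formulas).
    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    def skewnorm_negentropy(shape, loc=0.0, scale=1.0):
        dist = stats.skewnorm(shape, loc=loc, scale=scale)
        integrand = lambda x: -dist.pdf(x) * np.log(dist.pdf(x))
        h_x, _ = quad(integrand, dist.ppf(1e-10), dist.ppf(1.0 - 1e-10))
        h_gauss = 0.5 * np.log(2.0 * np.pi * np.e * dist.var())
        return h_gauss - h_x   # >= 0, and 0 only for the Gaussian itself

    for a in (0.0, 1.0, 5.0, 20.0):
        print(f"shape={a:5.1f}  negentropy={skewnorm_negentropy(a):.5f}")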

  1. Deviation from normal Boltzmann distribution of high-lying energy levels of iron atom excited by Okamoto-cavity microwave-induced plasmas using pure nitrogen and nitrogen–oxygen gases

    International Nuclear Information System (INIS)

    Wagatsuma, Kazuaki

    2015-01-01

    This paper describes several interesting excitation phenomena occurring in a microwave-induced plasma (MIP) excited with Okamoto-cavity, especially when a small amount of oxygen was mixed with nitrogen matrix in the composition of the plasma gas. An ion-to-atom ratio of iron, which was estimated from the intensity ratio of ion to atomic lines having almost the same excitation energy, was reduced by adding oxygen gas to the nitrogen MIP, eventually contributing to an enhancement in the emission intensities of the atomic lines. Furthermore, Boltzmann plots for iron atomic lines were observed in a wide range of the excitation energy from 3.4 to 6.9 eV, indicating that plots of the atomic lines having lower excitation energies (3.4 to 4.8 eV) were well fitted on a straight line while those having more than 5.5 eV deviated upwards from the linear relationship. This overpopulation would result from any other excitation process in addition to the thermal excitation that principally determines the Boltzmann distribution. A Penning-type collision with excited species of nitrogen molecules probably explains this additional excitation mechanism, in which the resulting iron ions recombine with captured electrons, followed by cascade de-excitations between closely-spaced excited levels just below the ionization limit. As a result, these high-lying levels might be more populated than the low-lying levels of iron atom. The ionization of iron would be caused less actively in the nitrogen–oxygen plasma than in a pure nitrogen plasma, because excited species of nitrogen molecule, which can provide the ionization energy in a collision with iron atom, are consumed through collisions with oxygen molecules to cause their dissociation. It was also observed that the overpopulation occurred to a lesser extent when oxygen gas was added to the nitrogen plasma. The reason for this was also attributed to decreased number density of the excited nitrogen species due to collisions with oxygen
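
    The Boltzmann-plot diagnostics described above follow the standard relation ln(I*lambda/(g*A)) = const - E/(k*T). The Python sketch below builds such a plot from synthetic line data, fits only the low-lying levels, and reports the residuals of the high-lying levels, which would be positive when the overpopulation described in the abstract occurs. All line parameters and the temperature are placeholders, not the iron lines of the study.

    # Hedged sketch: Boltzmann plot with a fit restricted to low-lying levels.
    # Synthetic data only; an upward shift is injected above 5 eV to mimic the
    # overpopulation of high-lying levels described in the abstract.
    import numpy as np

    k_B = 8.617e-5                                            # Boltzmann constant, eV/K
    E = np.array([3.4, 3.9, 4.3, 4.8, 5.5, 6.0, 6.5, 6.9])    # upper-level energies, eV
    T_true = 6000.0                                           # assumed excitation temperature, K

    y = -E / (k_B * T_true)                          # ideal ln(I*lambda/(g*A)) up to a constant
    y[E > 5.0] += 0.8                                # injected overpopulation of high levels

    low = E <= 4.8
    slope, intercept = np.polyfit(E[low], y[low], 1) # straight-line fit to low-lying lines
    print(f"excitation temperature ~ {-1.0 / (k_B * slope):.0f} K")
    print("high-level residuals:", np.round(y[E > 5.0] - (slope * E[E > 5.0] + intercept), 2))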

  2. BIMOND3, Monotone Bivariate Interpolation

    International Nuclear Information System (INIS)

    Fritsch, F.N.; Carlson, R.E.

    2001-01-01

    1 - Description of program or function: BIMOND is a FORTRAN-77 subroutine for piecewise bi-cubic interpolation to data on a rectangular mesh, which reproduces the monotonicity of the data. A driver program, BIMOND1, is provided which reads data, computes the interpolating surface parameters, and evaluates the function on a mesh suitable for plotting. 2 - Method of solution: Monotone piecewise bi-cubic Hermite interpolation is used. 3 - Restrictions on the complexity of the problem: The current version of the program can treat data which are monotone in only one of the independent variables, but cannot handle piecewise monotone data
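
    BIMOND itself is a FORTRAN-77 subroutine; as a rough Python stand-in, monotone interpolation on a rectangular mesh can be sketched with successive one-dimensional PCHIP passes, as below. This is not the BIMOND algorithm, the data are invented, and, like the restriction noted above, the scheme only guarantees monotonicity along the directions actually interpolated.

    # Hedged sketch: monotonicity-preserving bivariate interpolation by two
    # successive 1-D PCHIP passes on a rectangular mesh (a stand-in, not BIMOND).
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    def monotone_bicubic_eval(x, y, z, xq, yq):
        """z has shape (len(x), len(y)); returns the interpolant at (xq, yq)."""
        # Pass 1: interpolate along x on each mesh line y_j.
        col = np.array([PchipInterpolator(x, z[:, j])(xq) for j in range(len(y))])
        # Pass 2: interpolate the resulting column along y.
        return PchipInterpolator(y, col)(yq)

    x = np.linspace(0.0, 1.0, 6)
    y = np.linspace(0.0, 2.0, 5)
    z = np.add.outer(x ** 2, np.sqrt(y))      # data monotone in both variables
    print(monotone_bicubic_eval(x, y, z, 0.37, 1.25))
    print(0.37 ** 2 + np.sqrt(1.25))          # value of the underlying surface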

  3. Statistical Modeling of Bivariate Data.

    Science.gov (United States)

    1982-08-01

    to one. Following Crain (1974), one may consider order-m approximators $\log f_m(x) = \sum_{k=-m}^{m} \theta_k \phi_k(x) - c(\theta)$, $a \le x \le b$ (4.4.5), and attempt to find ... literature. Consider the approximate model $\log f_n(x) = \sum_{k=-m_n}^{m_n} \theta_k \phi_k(x) + \sigma G(x)$, $a \le x \le b$ (4.4.8), where G(x) is a Gaussian process and n is a
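
    A one-dimensional version of the approximators in (4.4.5) can be fitted by maximum likelihood as in the Python sketch below, using a small Legendre basis and numerical integration for the normalizing constant c(theta). This is only a simplified illustration of the exponential-family log-density idea, with simulated data, not the report's bivariate procedure.

    # Hedged sketch: ML fit of log f(x) = sum_k theta_k*phi_k(x) - c(theta) on
    # [a, b], with Legendre polynomials as phi_k (1-D illustration only).
    import numpy as np
    from numpy.polynomial import legendre
    from scipy.integrate import trapezoid
    from scipy.optimize import minimize

    a, b, m = -1.0, 1.0, 3
    grid = np.linspace(a, b, 801)

    def basis(x):
        return np.column_stack([legendre.Legendre.basis(k, domain=[a, b])(x)
                                for k in range(1, m + 1)])

    def neg_loglik(theta, x):
        log_c = np.log(trapezoid(np.exp(basis(grid) @ theta), grid))   # c(theta)
        return -(basis(x) @ theta - log_c).sum()

    rng = np.random.default_rng(2)
    sample = np.clip(rng.normal(0.3, 0.25, size=500), a, b)
    fit = minimize(neg_loglik, np.zeros(m), args=(sample,), method="BFGS")
    print("theta_hat =", np.round(fit.x, 3))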

  4. Maximum Likelihood Estimates of Parameters in Various Types of Distribution Fitted to Important Data Cases.

    OpenAIRE

    HIROSE,Hideo

    1998-01-01

    TYPES OF THE DISTRIBUTION: Normal distribution (2-parameter); Uniform distribution (2-parameter); Exponential distribution (2-parameter); Weibull distribution (2-parameter); Gumbel distribution (2-parameter); Weibull/Frechet distribution (3-parameter); Generalized extreme-value distribution (3-parameter); Gamma distribution (3-parameter); Extended Gamma distribution (3-parameter); Log-normal distribution (3-parameter); Extended Log-normal distribution (3-parameter); Generalized ...
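
    For the 2-parameter families in this list, maximum likelihood fits are readily sketched with scipy, as below for the Weibull and log-normal cases. Parameterizations (and the 3-parameter extensions treated in the work) differ from scipy's defaults, so this is only an illustration with simulated data.

    # Hedged sketch: 2-parameter Weibull and log-normal MLE fits via scipy,
    # with the location parameter fixed at zero. Simulated data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    data = stats.weibull_min.rvs(c=1.8, scale=5.0, size=400, random_state=rng)

    c_hat, _, scale_w = stats.weibull_min.fit(data, floc=0)   # Weibull (2-parameter)
    s_hat, _, scale_l = stats.lognorm.fit(data, floc=0)       # log-normal (2-parameter)

    print(f"Weibull MLE   : shape={c_hat:.3f}, scale={scale_w:.3f}")
    print(f"Log-normal MLE: sigma={s_hat:.3f}, exp(mu)={scale_l:.3f}")

    # Compare the two fitted families by maximized log-likelihood.
    print("logL Weibull   :", stats.weibull_min.logpdf(data, c_hat, 0, scale_w).sum())
    print("logL log-normal:", stats.lognorm.logpdf(data, s_hat, 0, scale_l).sum())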

  5. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis.

    Science.gov (United States)

    Shields, Katherine F; Bain, Robert E S; Cronk, Ryan; Wright, Jim A; Bartram, Jamie

    2015-12-01

    Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. We performed a bivariate random-effects meta-analysis of 45 studies, identified through a systematic review, that reported either the proportion of samples free of fecal indicator bacteria and/or individual sample bacteria counts for source and HSW, disaggregated by supply type. Water quality deteriorated substantially between source and stored water. The mean percentage of contaminated samples (noncompliance) at the source was 46% (95% CI: 33, 60%), whereas mean noncompliance in HSW was 75% (95% CI: 64, 84%). Water supply type was significantly associated with noncompliance at the source. Water from piped supplies had significantly lower odds of contamination than non-piped water both at the source (OR = 0.2; 95% CI: 0.1, 0.5) and in HSW (OR = 0.3; 95% CI: 0.2, 0.8), potentially due to residual chlorine. Piped water is less likely to be contaminated compared with other water supply types at both the source and in HSW. A focus on upgrading water services to piped supplies may help improve safety, including for those drinking stored water.
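
    As a much-simplified stand-in for the bivariate random-effects model used above, the Python sketch below pools the log-odds of noncompliance across studies with a DerSimonian-Laird random-effects estimator for a single sampling point. The counts are invented for illustration; the paper's model additionally links source and stored-water outcomes.

    # Hedged sketch: univariate DerSimonian-Laird random-effects pooling of
    # log-odds of noncompliance (a simplified stand-in; counts are made up).
    import numpy as np

    def dersimonian_laird(events, totals):
        events, totals = np.asarray(events, float), np.asarray(totals, float)
        y = np.log((events + 0.5) / (totals - events + 0.5))   # log-odds, 0.5 correction
        v = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)
        w = 1.0 / v
        fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - fixed) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)                # between-study variance
        w_re = 1.0 / (v + tau2)
        mu = np.sum(w_re * y) / np.sum(w_re)
        return mu, np.sqrt(1.0 / np.sum(w_re)), tau2

    mu, se, tau2 = dersimonian_laird(events=[40, 55, 12, 70], totals=[100, 90, 60, 120])
    print(f"pooled noncompliance ~ {1.0 / (1.0 + np.exp(-mu)):.1%} (tau^2 = {tau2:.3f})")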

  6. Modelling the vicious circle between obesity and physical activity in children and adolescents using a bivariate probit model with endogenous regressors.

    Science.gov (United States)

    Yeh, C-Y; Chen, L-J; Ku, P-W; Chen, C-M

    2015-01-01

    The increasing prevalence of obesity in children and adolescents has become one of the most important public health issues around the world. Lack of physical activity is a risk factor for obesity, while being obese could reduce the likelihood of participating in physical activity. Failing to account for the endogeneity between obesity and physical activity would result in biased estimation. This study investigates the relationship between overweight and physical activity by taking endogeneity into consideration. It develops an endogenous bivariate probit model estimated by the maximum likelihood method. The data included 4008 boys and 4197 girls in the 5th-9th grades in Taiwan in 2007-2008. The relationship between overweight and physical activity is significantly negative in the endogenous model, but insignificant in the comparative exogenous model. This endogenous relationship presents a vicious circle in which lower levels of physical activity lead to overweight, while those who are already overweight engage in less physical activity. The results not only reveal the importance of endogenous treatment, but also demonstrate the robust negative relationship between these two factors. An emphasis should be put on overweight and obese children and adolescents in order to break the vicious circle. Promotion of physical activity by appropriate counselling programmes and peer support could be effective in reducing the prevalence of obesity in children and adolescents.
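
    The likelihood machinery behind such models can be sketched in Python as below for a plain bivariate probit: each observation contributes a bivariate normal orthant probability whose correlation carries the sign pattern of the two outcomes. The endogenous specification of the study adds structure (the overweight indicator entering the activity equation and exclusion restrictions) that is omitted here; the data are simulated and the sample is kept small so the example runs quickly.

    # Hedged sketch: maximum likelihood for a plain bivariate probit (not the
    # endogenous specification of the study). Simulated data, small n.
    import numpy as np
    from scipy.stats import multivariate_normal
    from scipy.optimize import minimize

    rng = np.random.default_rng(4)
    n = 500
    x = np.column_stack([np.ones(n), rng.standard_normal(n)])
    rho_true, b1, b2 = 0.5, np.array([0.2, 0.8]), np.array([-0.3, 0.5])
    e = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], size=n)
    y1 = (x @ b1 + e[:, 0] > 0).astype(int)
    y2 = (x @ b2 + e[:, 1] > 0).astype(int)

    def neg_loglik(params):
        beta1, beta2, rho = params[:2], params[2:4], np.tanh(params[4])
        q1, q2 = 2 * y1 - 1, 2 * y2 - 1                 # +1/-1 sign flips
        a, b = q1 * (x @ beta1), q2 * (x @ beta2)
        s = q1 * q2
        ll = 0.0
        for sign in (-1, 1):                            # group by sign of q1*q2
            m = s == sign
            if not m.any():
                continue
            cov = [[1.0, sign * rho], [sign * rho, 1.0]]
            p = multivariate_normal.cdf(np.column_stack([a[m], b[m]]),
                                        mean=[0.0, 0.0], cov=cov)
            ll += np.sum(np.log(np.clip(p, 1e-300, None)))
        return -ll

    fit = minimize(neg_loglik, np.zeros(5), method="Nelder-Mead",
                   options={"maxiter": 3000, "xatol": 1e-4, "fatol": 1e-4})
    print("rho_hat =", round(float(np.tanh(fit.x[4])), 3))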

  7. Evaluation of Factors Affecting E-Bike Involved Crash and E-Bike License Plate Use in China Using a Bivariate Probit Model

    Directory of Open Access Journals (Sweden)

    Yanyong Guo

    2017-01-01

    Full Text Available The primary objective of this study is to evaluate factors affecting e-bike involved crash and license plate use in China. E-bike crash data were collected from a police database and completed through a telephone interview. Noncrash samples were collected by a questionnaire survey. A bivariate probit (BP) model was developed to simultaneously examine the significant factors associated with e-bike involved crash and e-bike license plate and to account for the correlations between them. Marginal effects for contributory factors were calculated to quantify their impacts on the outcomes. The results show that several contributory factors, including gender, age, education level, driver license, car in household, experiences in using e-bike, law compliance, and aggressive driving behaviors, are found to have significant impacts on both e-bike involved crash and license plate use. Moreover, type of e-bike, frequency of using e-bike, impulse behavior, degree of riding experience, and risk perception scale are found to be associated with e-bike involved crash. It is also found that e-bike involved crash and e-bike license plate use are strongly correlated and are negative in direction. The results enhance our comprehension of the factors related to e-bike involved crash and e-bike license plate use.

  8. Normal radiological findings

    International Nuclear Information System (INIS)

    Moeller, T.B.

    1987-01-01

    This book is intended for learners in radiology, presenting a wealth of normal radiological findings together with a systematic guide for appraisal and interpretation, and for formulation of reports. The text examples and criteria given will help beginners in learning to 'read' a radiograph, and to verify their conclusions by means of checklists and standard reports. The case material covers numerous illustrations from the following sectors: Skeletal radiography, mammography, tomography, contrast radiography, organ examination by intravenous techniques, arthrography and angiography, and specialized radiography, (ECB) With 184 figs [de

  9. Normal Untreated Jurkat Cells

    Science.gov (United States)

    2004-01-01

    Biomedical research offers hope for a variety of medical problems, from diabetes to the replacement of damaged bone and tissues. Bioreactors, which are used to grow cells and tissue cultures, play a major role in such research and production efforts. The objective of the research was to define a way to differentiate between effects due to microgravity and those due to possible stress from non-optimal spaceflight conditions. These Jurkat cells, a human acute T-cell leukemia line, were obtained to evaluate three types of potential experimental stressors: (a) temperature elevation; (b) serum starvation; and (c) centrifugal force. The data from previous spaceflight experiments showed that actin filaments and cell shape are significantly different for the control. These normal cells serve as the baseline for future spaceflight experiments.

  10. Normal shoulder: MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kieft, G.J.; Bloem, J.L.; Obermann, W.R.; Verbout, A.J.; Rozing, P.M.; Doornbos, J.

    1986-06-01

    Relatively poor spatial resolution has been obtained in magnetic resonance (MR) imaging of the shoulder because the shoulder can only be placed in the periphery of the magnetic field. The authors have devised an anatomically shaped surface coil that enables MR to demonstrate normal shoulder anatomy in different planes with high spatial resolution. In the axial plane anatomy analogous to that seen on computed tomographic (CT) scans can be demonstrated. Variations in scapular position (produced by patient positioning) may make reproducibility of sagittal and coronal plane images difficult by changing the relationship of the plane to the shoulder anatomy. Oblique planes, for which the angle is chosen from the axial image, have the advantage of easy reproducibility. Obliquely oriented structures and relationships are best seen in oblique plane images and can be evaluated in detail.

  11. Analysing risk factors of co-occurrence of schistosomiasis haematobium and hookworm using bivariate regression models: Case study of Chikwawa, Malawi

    Directory of Open Access Journals (Sweden)

    Bruce B.W. Phiri

    2016-06-01

    Full Text Available Schistosomiasis and soil-transmitted helminth (STH) infections constitute a major public health problem in many parts of sub-Saharan Africa. In areas where prevalence of geo-helminths and schistosomes is high, co-infection with multiple parasite species is common, resulting in disproportionately elevated burden compared with single infections. Determining risk factors of co-infection intensity is important for better design of targeted interventions. In this paper, we examined risk factors of hookworm and S. haematobium co-infection intensity, in Chikwawa district, southern Malawi in 2005, using bivariate count models. Results show that hookworm and S. haematobium infections were much localised, with a small proportion of individuals harbouring more parasites, especially among school-aged children. The risk of co-intensity with both hookworm and S. haematobium was high for all ages, although this diminished with increasing age, and increased with fishing (hookworm: coef. = 12.29; 95% CI = 11.50–13.09; S. haematobium: coef. = 0.040; 95% CI = 0.0037, 3.832). Both infections were abundant in those with primary education (hookworm: coef. = 0.072; 95% CI = 0.056, 0.401; S. haematobium: coef. = 0.286; 95% CI = 0.034, 0.538). However, much lower risk was observed for those who were farmers (hookworm: coef. = −0.349, 95% CI = −0.547, −0.150; S. haematobium: coef. = −0.239, 95% CI = −0.406, −0.072). In conclusion, our findings suggest that efforts to control helminth infections should be integrated and health promotion campaigns should be aimed at school-going children and adults who are in constant contact with water.

  12. Diagnostic value of sTREM-1 in bronchoalveolar lavage fluid in ICU patients with bacterial lung infections: a bivariate meta-analysis.

    Science.gov (United States)

    Shi, Jia-Xin; Li, Jia-Shu; Hu, Rong; Li, Chun-Hua; Wen, Yan; Zheng, Hong; Zhang, Feng; Li, Qin

    2013-01-01

    The serum soluble triggering receptor expressed on myeloid cells-1 (sTREM-1) is a useful biomarker in differentiating bacterial infections from others. However, the diagnostic value of sTREM-1 in bronchoalveolar lavage fluid (BALF) in lung infections has not been well established. We performed a meta-analysis to assess the accuracy of sTREM-1 in BALF for diagnosis of bacterial lung infections in intensive care unit (ICU) patients. We searched PUBMED, EMBASE and Web of Knowledge (from January 1966 to October 2012) databases for relevant studies that reported diagnostic accuracy data of BALF sTREM-1 in the diagnosis of bacterial lung infections in ICU patients. Pooled sensitivity, specificity, and positive and negative likelihood ratios were calculated by a bivariate regression analysis. Measures of accuracy and Q point value (Q*) were calculated using summary receiver operating characteristic (SROC) curve. The potential between-studies heterogeneity was explored by subgroup analysis. Nine studies were included in the present meta-analysis. Overall, the prevalence was 50.6%; the sensitivity was 0.87 (95% confidence interval (CI), 0.72-0.95); the specificity was 0.79 (95% CI, 0.56-0.92); the positive likelihood ratio (PLR) was 4.18 (95% CI, 1.78-9.86); the negative likelihood ratio (NLR) was 0.16 (95% CI, 0.07-0.36), and the diagnostic odds ratio (DOR) was 25.60 (95% CI, 7.28-89.93). The area under the SROC curve was 0.91 (95% CI, 0.88-0.93), with a Q* of 0.83. Subgroup analysis showed that the assay method and cutoff value influenced the diagnostic accuracy of sTREM-1. BALF sTREM-1 is a useful biomarker of bacterial lung infections in ICU patients. Further studies are needed to confirm the optimized cutoff value.
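
    The summary accuracy measures reported above are related through a few standard identities, sketched in Python below for a single hypothetical 2x2 table together with the post-test probability at the reported prevalence. The pooling itself uses a bivariate regression model, which is not reproduced here; the counts in the example are made up.

    # Hedged sketch: per-study diagnostic accuracy measures from a 2x2 table
    # (counts are hypothetical) and post-test probability at a given prevalence.
    def diagnostic_measures(tp, fp, fn, tn, prevalence):
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        plr = sens / (1.0 - spec)            # positive likelihood ratio
        nlr = (1.0 - sens) / spec            # negative likelihood ratio
        dor = plr / nlr                      # diagnostic odds ratio = (tp*tn)/(fp*fn)
        pre_odds = prevalence / (1.0 - prevalence)
        post_odds = pre_odds * plr
        return sens, spec, plr, nlr, dor, post_odds / (1.0 + post_odds)

    sens, spec, plr, nlr, dor, post = diagnostic_measures(44, 10, 6, 40, prevalence=0.506)
    print(f"sens={sens:.2f} spec={spec:.2f} LR+={plr:.2f} LR-={nlr:.2f} "
          f"DOR={dor:.1f} post-test P(disease | positive)={post:.2f}")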

  13. Individual loss reserving with the Multivariate Skew Normal distribution

    NARCIS (Netherlands)

    Pigeon, M.; Antonio, K.; Denuit, M.

    2012-01-01

    The evaluation of future cash flows and solvency capital recently gained importance in general insurance. To assist in this process, our paper proposes a novel loss reserving model, designed for individual claims in discrete time. We model the occurrence of claims, as well as their reporting delay,

  14. Some properties of normal moment distribution

    African Journals Online (AJOL)


  15. Institutionalizing Normal: Rethinking Composition's Precedence in Normal Schools

    Science.gov (United States)

    Skinnell, Ryan

    2013-01-01

    Composition historians have recently worked to recover histories of composition in normal schools. This essay argues, however, that historians have inadvertently misconstrued the role of normal schools in American education by inaccurately comparing rhetorical education in normal schools to rhetorical education in colleges and universities.…

  16. Normal human bone marrow and its variations in MRI

    International Nuclear Information System (INIS)

    Vahlensieck, M.; Schmidt, H.M.

    2000-01-01

    Physiology and age-dependent changes of human bone marrow are described. The resulting normal distribution patterns of active and inactive bone marrow including the various contrasts on different MR-sequences are discussed. (orig.) [de

  17. A Comparative Dermatoglyphic Study of Autistic, Retarded, and Normal Children.

    Science.gov (United States)

    Hartin, Phillip J.; Barry, Robert J.

    1979-01-01

    Significant differences were found between the autistic and normal children for distribution of dermal patterns and ridge line disruption, but no significant differences were found for the total mean ridge counts or mean ridge count rankings. (Author)

  18. The Prevalence of the Metabolic Syndrome among normal weight ...

    African Journals Online (AJOL)

    These gave the prevalence of MS as 10.6%, 4.3% and 18.5% among all, male and female subjects respectively. MS was distributed evenly in the normal BMI range among the women while among the men more cases were found in the upper range of normal BMI. Prevalence increased with BMI. Conclusions. Individuals in ...

  19. Radiation effects in normal tissues

    International Nuclear Information System (INIS)

    Trott, K.R.; Herrmann, T.; Doerr, W.

    2002-01-01

    Knowledge of radiation effects in normal tissues is fundamental for optimal planning of radiotherapy. Therefore, this book presents a review on the following aspects: General pathogenesis of acute radiation effects in normal tissues; general pathogenesis of chronic radiation effects in normal tissues; quantification of acute and chronic radiation effects in normal tissues; pathogenesis, pathology and radiation biology of various organs and organ systems. (MG) [de

  20. Intensity distributions in fiber diffraction

    International Nuclear Information System (INIS)

    Millane, R.P.

    1990-01-01

    The probability distributions of X-ray intensities in fiber diffraction are different from those for single crystals (Wilson statistics) because of the cylindrical averaging of the diffraction data. Stubbs has recently determined the intensity distributions on a fiber diffraction pattern for a fixed number of overlapping Fourier-Bessel terms. Some properties of the amplitude and intensity distributions are derived here. It is shown that the amplitudes and intensities are approximately normally distributed (the distributions being asymptotically normal with increasing number of Fourier-Bessel terms). Improved approximations using an Edgeworth series are derived. Other statistical properties and some asymptotic expansions are also derived, and normalization of fiber diffraction amplitudes is discussed. The accuracies of the normal approximations are illustrated for particular fiber structures, and possible applications of intensity statistics in fiber diffraction are discussed. (orig.)
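
    The approach to normality with an increasing number of overlapping Fourier-Bessel terms can be illustrated by simulation, as in the Python sketch below: the cylindrically averaged intensity is modeled as a sum of squared moduli of independent complex Gaussian terms, and its skewness and excess kurtosis shrink as terms are added. This is a schematic Monte Carlo check under that simple model, not a reproduction of the article's derivations or Edgeworth corrections.

    # Hedged sketch: Monte Carlo check of asymptotic normality of a cylindrically
    # averaged intensity modeled as a sum of |complex Gaussian|^2 terms.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)

    def simulated_intensities(n_terms, n_samples=20000):
        g = (rng.standard_normal((n_samples, n_terms))
             + 1j * rng.standard_normal((n_samples, n_terms)))
        return np.sum(np.abs(g) ** 2, axis=1)    # sum over Fourier-Bessel terms

    for n_terms in (1, 2, 4, 16):
        z = simulated_intensities(n_terms)
        z = (z - z.mean()) / z.std()
        print(f"{n_terms:2d} terms: skewness={stats.skew(z):+.3f}, "
              f"excess kurtosis={stats.kurtosis(z):+.3f}")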