WorldWideScience

Sample records for normal distribution complex

  1. Preparation, distribution, stability and tumor imaging properties of [62Zn] Bleomycin complex in normal and tumor-bearing mice

    International Nuclear Information System (INIS)

    Jalilian, A.R.; Fateh, B.; Ghergherehchi, M.; Karimian, A.; Matloobi, M.; Moradkhani, S.; Kamalidehghan, M.; Tabeie, F.

    2003-01-01

Background: Bleomycin (BLM) has been labeled with radioisotopes and widely used in therapy and diagnosis. In this study BLM was labeled with [62Zn]zinc chloride for oncologic PET studies. Materials and methods: The complex was obtained in normal saline at pH 2 and 90 °C in 60 min. Radio-TLC showed an overall radiochemical yield of 95-97% (radiochemical purity >97%). Stability of the complex was checked in vitro in mice and in human plasma/urine. Results: Preliminary in vitro studies were performed to determine complex stability, and the distribution of [62Zn]BLM in normal and fibrosarcoma tumor-bearing mice was assessed by bio-distribution/imaging studies. Conclusion: [62Zn]BLM can be used in PET oncology studies due to its suitable physico-chemical properties as a diagnostic complex in higher animals.

  2. The Normal Distribution

    Indian Academy of Sciences (India)

An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. Introduction. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and the DeMoivre-Laplace theorem ...

  3. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
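As a quick illustration of the kind of correlated sampling described above, correlated multivariate normal draws can be generated with a Cholesky factor of the covariance matrix and exponentiated to obtain correlated log-normal draws; all parameter values below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) means, standard deviations and correlation
mean = np.array([1.0, 2.0])
std = np.array([0.3, 0.5])
corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])
cov = corr * np.outer(std, std)

# Correlated multivariate normal samples via the Cholesky factor of cov
L = np.linalg.cholesky(cov)
z = rng.standard_normal((10_000, 2))
normal_samples = mean + z @ L.T

# Correlated log-normal samples: exponentiate the correlated normals
lognormal_samples = np.exp(normal_samples)
```

Note that exponentiation changes the correlation coefficient: the log-normal samples are correlated, but not with exactly the normal-space value of 0.8.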

  4. Computation of distribution of minimum resolution for log-normal distribution of chromatographic peak heights.

    Science.gov (United States)

    Davis, Joe M

    2011-10-28

    General equations are derived for the distribution of minimum resolution between two chromatographic peaks, when peak heights in a multi-component chromatogram follow a continuous statistical distribution. The derivation draws on published theory by relating the area under the distribution of minimum resolution to the area under the distribution of the ratio of peak heights, which in turn is derived from the peak-height distribution. Two procedures are proposed for the equations' numerical solution. The procedures are applied to the log-normal distribution, which recently was reported to describe the distribution of component concentrations in three complex natural mixtures. For published statistical parameters of these mixtures, the distribution of minimum resolution is similar to that for the commonly assumed exponential distribution of peak heights used in statistical-overlap theory. However, these two distributions of minimum resolution can differ markedly, depending on the scale parameter of the log-normal distribution. Theory for the computation of the distribution of minimum resolution is extended to other cases of interest. With the log-normal distribution of peak heights as an example, the distribution of minimum resolution is computed when small peaks are lost due to noise or detection limits, and when the height of at least one peak is less than an upper limit. The distribution of minimum resolution shifts slightly to lower resolution values in the first case and to markedly larger resolution values in the second one. The theory and numerical procedure are confirmed by Monte Carlo simulation. Copyright © 2011 Elsevier B.V. All rights reserved.
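The pivotal intermediate in the derivation above, the distribution of the peak-height ratio, is easy to probe by Monte Carlo when peak heights are log-normal: the ratio of two independent log-normal heights is itself log-normal, with scale parameter sqrt(2) times that of the heights. A minimal sketch (the scale value is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

sigma = 1.0  # illustrative scale parameter of the log-normal peak heights
heights = rng.lognormal(mean=0.0, sigma=sigma, size=(100_000, 2))

# Ratio of two adjacent peak heights; its log is a difference of two
# independent normals, hence normal with std sqrt(2) * sigma
log_ratio = np.log(heights[:, 0] / heights[:, 1])
```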

  5. A New Distribution: Random Limit Normal Distribution

    OpenAIRE

    Gong, Xiaolin; Yang, Shuzhen

    2013-01-01

    This paper introduces a new distribution to improve tail risk modeling. Based on the classical normal distribution, we define a new distribution by a series of heat equations. Then, we use market data to verify our model.

  6. Understanding a Normal Distribution of Data.

    Science.gov (United States)

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?

  7. The exp-normal distribution is infinitely divisible

    OpenAIRE

    Pinelis, Iosif

    2018-01-01

Let $Z$ be a standard normal random variable (r.v.). It is shown that the distribution of the r.v. $\ln|Z|$ is infinitely divisible; equivalently, the standard normal distribution considered as the distribution on the multiplicative group over $\mathbb{R}\setminus\{0\}$ is infinitely divisible.

  8. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. Those NDs are: Fisher normal distribution (FND), Bunge normal distribution (BND), central normal distribution (CND) and wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. CND is a subcase of the normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). WND is motivated by the CLT in R³ and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both CND and WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.

  9. Quantiles for Finite Mixtures of Normal Distributions

    Science.gov (United States)

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
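The distinction emphasized above matters computationally: a quantile of a mixture is not a weighted combination of component quantiles, so the mixture CDF has to be inverted numerically. A sketch with illustrative weights and parameters:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

# Illustrative two-component normal mixture
w = [0.3, 0.7]
mu = [0.0, 3.0]
sd = [1.0, 0.5]

def mixture_cdf(x):
    return sum(wi * norm.cdf(x, mi, si) for wi, mi, si in zip(w, mu, sd))

def mixture_quantile(p):
    # Invert the mixture CDF by root-finding; the bracket is chosen
    # wide enough to contain any quantile of interest here.
    return brentq(lambda x: mixture_cdf(x) - p, -10.0, 10.0)

median = mixture_quantile(0.5)
```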

  10. Characteristic functions of scale mixtures of multivariate skew-normal distributions

    KAUST Repository

    Kim, Hyoung-Moon

    2011-08-01

    We obtain the characteristic function of scale mixtures of skew-normal distributions both in the univariate and multivariate cases. The derivation uses the simple stochastic relationship between skew-normal distributions and scale mixtures of skew-normal distributions. In particular, we describe the characteristic function of skew-normal, skew-t, and other related distributions. © 2011 Elsevier Inc.

  11. Concentration distribution of trace elements: from normal distribution to Levy flights

    International Nuclear Information System (INIS)

    Kubala-Kukus, A.; Banas, D.; Braziewicz, J.; Majewska, U.; Pajek, M.

    2003-01-01

    The paper discusses a nature of concentration distributions of trace elements in biomedical samples, which were measured by using the X-ray fluorescence techniques (XRF, TXRF). Our earlier observation, that the lognormal distribution well describes the measured concentration distribution is explained here on a more general ground. Particularly, the role of random multiplicative process, which models the concentration distributions of trace elements in biomedical samples, is discussed in detail. It is demonstrated that the lognormal distribution, appearing when the multiplicative process is driven by normal distribution, can be generalized to the so-called log-stable distribution. Such distribution describes the random multiplicative process, which is driven, instead of normal distribution, by more general stable distribution, being known as the Levy flights. The presented ideas are exemplified by the results of the study of trace element concentration distributions in selected biomedical samples, obtained by using the conventional (XRF) and (TXRF) X-ray fluorescence methods. Particularly, the first observation of log-stable concentration distribution of trace elements is reported and discussed here in detail
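The random multiplicative process invoked above is easy to demonstrate numerically: a product of many positive random factors is approximately log-normal, because the log of the product is a sum that obeys the central limit theorem. A sketch with arbitrary uniform factors:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(2)

# Multiplicative process: each "concentration" is a product of many
# independent positive random factors (the factor range is arbitrary).
n_steps, n_samples = 200, 5_000
factors = rng.uniform(0.5, 1.5, size=(n_samples, n_steps))
concentration = factors.prod(axis=1)

# log(product) = sum of logs -> CLT -> approximately normal,
# so the product itself is approximately log-normal.
skew_raw = skew(concentration)          # strongly right-skewed
skew_log = skew(np.log(concentration))  # close to zero
```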

  12. An inexact log-normal distribution-based stochastic chance-constrained model for agricultural water quality management

    Science.gov (United States)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2018-05-01

    In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution. Possible deviations in solutions caused by irrational parameter assumptions were avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.

  13. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
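The FORTRAN procedures themselves are not reproduced in the record, but the acceptance-rejection idea they rely on can be sketched with the classic exponential-envelope sampler for the standard normal (a textbook scheme, not necessarily the report's exact subregion method):

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_normal_rejection(n):
    """Standard normal samples by acceptance-rejection:
    propose |Z| from an exponential(1) envelope, accept with
    probability exp(-(x - 1)^2 / 2), then attach a random sign."""
    out = []
    while len(out) < n:
        x = rng.exponential(1.0)
        if rng.random() <= np.exp(-0.5 * (x - 1.0) ** 2):
            out.append(x if rng.random() < 0.5 else -x)
    return np.array(out)

samples = sample_normal_rejection(20_000)
```

The acceptance probability comes from the ratio of the half-normal density to the exponential envelope; roughly three quarters of proposals are accepted.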

  14. Inheritance of Properties of Normal and Non-Normal Distributions after Transformation of Scores to Ranks

    Science.gov (United States)

    Zimmerman, Donald W.

    2011-01-01

    This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…

  15. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGM) where the software failure time follows a normal distribution. The proposed model is mathematically tractable and has sufficient ability of fitting to the software failure data. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple for implementation as software application. Numerical experiment is devoted to investigating the fitting ability of the SRGMs with normal distribution through 16 types of failure time data collected in real software projects

  16. A locally adaptive normal distribution

    DEFF Research Database (Denmark)

    Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren

    2016-01-01

The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest to replace this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density ... entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models ...

  17. Ventilation-perfusion distribution in normal subjects.

    Science.gov (United States)

    Beck, Kenneth C; Johnson, Bruce D; Olson, Thomas P; Wilson, Theodore A

    2012-09-01

    Functional values of LogSD of the ventilation distribution (σ(V)) have been reported previously, but functional values of LogSD of the perfusion distribution (σ(q)) and the coefficient of correlation between ventilation and perfusion (ρ) have not been measured in humans. Here, we report values for σ(V), σ(q), and ρ obtained from wash-in data for three gases, helium and two soluble gases, acetylene and dimethyl ether. Normal subjects inspired gas containing the test gases, and the concentrations of the gases at end-expiration during the first 10 breaths were measured with the subjects at rest and at increasing levels of exercise. The regional distribution of ventilation and perfusion was described by a bivariate log-normal distribution with parameters σ(V), σ(q), and ρ, and these parameters were evaluated by matching the values of expired gas concentrations calculated for this distribution to the measured values. Values of cardiac output and LogSD ventilation/perfusion (Va/Q) were obtained. At rest, σ(q) is high (1.08 ± 0.12). With the onset of ventilation, σ(q) decreases to 0.85 ± 0.09 but remains higher than σ(V) (0.43 ± 0.09) at all exercise levels. Rho increases to 0.87 ± 0.07, and the value of LogSD Va/Q for light and moderate exercise is primarily the result of the difference between the magnitudes of σ(q) and σ(V). With known values for the parameters, the bivariate distribution describes the comprehensive distribution of ventilation and perfusion that underlies the distribution of the Va/Q ratio.

  18. Evaluating Transfer Entropy for Normal and y-Order Normal Distributions

    Czech Academy of Sciences Publication Activity Database

    Hlaváčková-Schindler, Kateřina; Toulias, T. L.; Kitsos, C. P.

    2016-01-01

    Roč. 17, č. 5 (2016), s. 1-20 ISSN 2231-0851 Institutional support: RVO:67985556 Keywords : Transfer entropy * time series * Kullback-Leibler divergence * causality * generalized normal distribution Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2016/AS/hlavackova-schindler-0461261.pdf

  19. Modified Normal Demand Distributions in (R,S)-Inventory Models

    NARCIS (Netherlands)

    Strijbosch, L.W.G.; Moors, J.J.A.

    2003-01-01

To model demand, the normal distribution is by far the most popular; the disadvantage that it takes negative values is taken for granted. This paper proposes two modifications of the normal distribution, both taking non-negative values only. Safety factors and order-up-to-levels for the familiar (R, S) ...

  20. Toxic effect of C60 fullerene-doxorubicin complex towards tumor and normal cells in vitro

    Directory of Open Access Journals (Sweden)

    Prylutska S. V.

    2014-09-01

Full Text Available Creation of new nanostructures possessing high antitumor activity is an important problem of modern biotechnology. Aim. To evaluate the cytotoxicity of the created complex of pristine C60 fullerene with the anthracycline antibiotic doxorubicin (Dox), as well as of free C60 fullerene and Dox, towards different cell types: tumor cells, normal immunocompetent cells and hepatocytes. Methods. The size distribution of particles in the C60 + Dox mixture was measured by a dynamic light scattering (DLS) technique. The toxic effect of the C60 + Dox complex in vitro towards tumor and normal cells was studied using the MTT assay. Results. The DLS experiment demonstrated that the main fraction of particles in the C60 + Dox mixture had a diameter of about 132 nm. The toxic effect of the C60 + Dox complex towards normal (lymphocytes, macrophages, hepatocytes) and tumor (Ehrlich ascites carcinoma, leukemia L1210, Lewis lung carcinoma) cells was decreased by ~10–16 % and ~7–9 %, respectively, compared with the same effect of free Dox. Conclusions. The created C60 + Dox composite may be considered as a new pharmacological agent that effectively kills tumor cells in vitro and simultaneously reduces the toxic effect of the free form of Dox on normal cells.

  1. Characteristic functions of scale mixtures of multivariate skew-normal distributions

    KAUST Repository

    Kim, Hyoung-Moon; Genton, Marc G.

    2011-01-01

    We obtain the characteristic function of scale mixtures of skew-normal distributions both in the univariate and multivariate cases. The derivation uses the simple stochastic relationship between skew-normal distributions and scale mixtures of skew

  2. On Complex Random Variables

    Directory of Open Access Journals (Sweden)

    Anwer Khurshid

    2012-07-01

Full Text Available In this paper, it is shown that a complex multivariate random variable  is a complex multivariate normal random variable of dimensionality if and only if all nondegenerate complex linear combinations of  have a complex univariate normal distribution. The characteristic function of  has been derived, and simpler forms of some theorems have been given using this characterization theorem without assuming that the variance-covariance matrix of the vector  is Hermitian positive definite. Marginal distributions of  have been given. In addition, a complex multivariate t-distribution has been defined and the density derived. A characterization of the complex multivariate t-distribution is given. A few possible uses of this distribution have been suggested.

  3. Reliability assessment based on small samples of normal distribution

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    When the pertinent parameter involved in reliability definition complies with normal distribution, the conjugate prior of its distributing parameters (μ, h) is of normal-gamma distribution. With the help of maximum entropy and the moments-equivalence principles, the subjective information of the parameter and the sampling data of its independent variables are transformed to a Bayesian prior of (μ,h). The desired estimates are obtained from either the prior or the posterior which is formed by combining the prior and sampling data. Computing methods are described and examples are presented to give demonstrations

  4. Normal and Student's t distributions and their applications

    CERN Document Server

    Ahsanullah, Mohammad; Shakil, Mohammad

    2014-01-01

The most important properties of the normal and Student t-distributions are presented. A number of applications of these properties are demonstrated. New related results dealing with the distributions of the sum, product and ratio of independent normal and Student distributions are presented. The materials will be useful to advanced undergraduate and graduate students and practitioners in various fields of science and engineering.

  5. Scale and shape mixtures of multivariate skew-normal distributions

    KAUST Repository

Arellano-Valle, Reinaldo B.; Ferreira, Clécio S.; Genton, Marc G.

    2018-01-01

    We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down

  6. LOG-NORMAL DISTRIBUTION OF COSMIC VOIDS IN SIMULATIONS AND MOCKS

    Energy Technology Data Exchange (ETDEWEB)

    Russell, E.; Pycke, J.-R., E-mail: er111@nyu.edu, E-mail: jrp15@nyu.edu [Division of Science and Mathematics, New York University Abu Dhabi, P.O. Box 129188, Abu Dhabi (United Arab Emirates)

    2017-01-20

    Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets has a critical importance. If the number of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.
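For readers wanting to reproduce this kind of fit, scipy's `lognorm` distribution already carries the three parameters (shape, location/shift, scale), so a three-parameter log-normal can be fitted by maximum likelihood in a few lines; the synthetic "void radii" below are illustrative, not catalog data.

```python
import numpy as np
from scipy.stats import lognorm

# Synthetic "void radii" from a shifted (three-parameter) log-normal
true_s, true_loc, true_scale = 0.5, 2.0, 10.0
radii = lognorm.rvs(true_s, loc=true_loc, scale=true_scale,
                    size=5_000, random_state=np.random.default_rng(4))

# Maximum-likelihood fit of all three parameters (shape, loc, scale)
s_hat, loc_hat, scale_hat = lognorm.fit(radii)
```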

  7. Normal distribution of standing balance for healthy Danish children

    DEFF Research Database (Denmark)

    Pedersen, Line Kjeldgaard; Ghasemi, Habib; Rahbek, Ole

    2013-01-01

Title: Normal distribution of standing balance for healthy Danish children – Reproducibility of parameters of balance. Authors: Line Kjeldgaard Pedersen, Habib Ghasemi, Ole Rahbek, Bjarne Møller-Madsen. Background: Pedobarographic measurements are increasingly used in child...

  8. Transformation of an empirical distribution to normal distribution by the use of Johnson system of translation and symmetrical quantile method

    OpenAIRE

    Ludvík Friebel; Jana Friebelová

    2006-01-01

This article deals with the approximation of an empirical distribution to the standard normal distribution using the Johnson transformation. This transformation enables us to approximate a wide spectrum of continuous distributions with a normal distribution. The estimation of the parameters of the transformation formulas is based on percentiles of the empirical distribution. Theoretical probability distribution functions of the random variable obtained by backward transformation of the standard normal ...

  9. Modeling and stress analyses of a normal foot-ankle and a prosthetic foot-ankle complex.

    Science.gov (United States)

    Ozen, Mustafa; Sayman, Onur; Havitcioglu, Hasan

    2013-01-01

Total ankle replacement (TAR) is a relatively new concept and is becoming more popular for the treatment of ankle arthritis and fractures. Because of the high costs and difficulties of experimental studies, the development of TAR prostheses is progressing very slowly. For this reason, medical imaging techniques such as CT and MR have become more and more useful. The finite element method (FEM) is a widely used technique to estimate the mechanical behavior of materials and structures in engineering applications. FEM has also been increasingly applied to biomechanical analyses of human bones, tissues and organs, thanks to the development of both computing capabilities and medical imaging techniques. 3-D finite element models of the human foot and ankle reconstructed from MR and CT images have been investigated by some authors. In this study, the geometries (used in modeling) of a normal and a prosthetic foot and ankle were obtained from a 3D reconstruction of CT images. The segmentation software MIMICS was used to generate the 3D images of the bony structures, soft tissues and prosthesis components of the normal and prosthetic ankle-foot complex. Except for the spaces between the adjacent surfaces of the phalanges, metatarsals, cuneiforms, cuboid, navicular, talus and calcaneus bones, which were fused, the soft tissues and prosthesis components were independently developed to form the foot and ankle complex. The SOLIDWORKS program was used to form the boundary surfaces of all model components, and the solid models were then obtained from these boundary surfaces. The finite element analysis software ABAQUS was used to perform the numerical stress analyses of these models for the balanced standing position. Plantar pressure and von Mises stress distributions of the normal and prosthetic ankles were compared with each other. There was a peak pressure increase at the 4th metatarsal, first metatarsal and talus bones and a decrease at the intermediate cuneiform and calcaneus bones, in ...

  10. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    Science.gov (United States)

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised" and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or Loess smoothing techniques increased that proportion to 82.4 and 82.7 % respectively. This suggests that statistical approaches relying on the
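The battery of checks described above maps directly onto scipy: Shapiro-Wilk (`shapiro`), D'Agostino's skewness test (`skewtest`), the Anscombe-Glynn kurtosis test (`kurtosistest`), and Box-Cox transformation (`boxcox`). A sketch on synthetic, mildly right-skewed "MUAC-like" data (the gamma parameters are invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic mildly right-skewed measurements (illustrative only)
muac = rng.gamma(shape=50.0, scale=3.0, size=800)

stat_sw, p_sw = stats.shapiro(muac)        # Shapiro-Wilk normality test
stat_sk, p_sk = stats.skewtest(muac)       # D'Agostino skewness test
stat_ku, p_ku = stats.kurtosistest(muac)   # Anscombe-Glynn kurtosis test

# Box-Cox power transformation toward normality (data must be positive)
transformed, lmbda = stats.boxcox(muac)
stat_t, p_t = stats.shapiro(transformed)
```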

  11. Mast cell distribution in normal adult skin

    NARCIS (Netherlands)

    A.S. Janssens (Artiena Soe); R. Heide (Rogier); J.C. den Hollander (Jan); P.G.M. Mulder (P. G M); B. Tank (Bhupendra); A.P. Oranje (Arnold)

    2005-01-01

AIMS: To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. METHODS: Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults.

  12. Percentile estimation using the normal and lognormal probability distribution

    International Nuclear Information System (INIS)

    Bement, T.R.

    1980-01-01

    Implicitly or explicitly percentile estimation is an important aspect of the analysis of aerial radiometric survey data. Standard deviation maps are produced for quadrangles which are surveyed as part of the National Uranium Resource Evaluation. These maps show where variables differ from their mean values by more than one, two or three standard deviations. Data may or may not be log-transformed prior to analysis. These maps have specific percentile interpretations only when proper distributional assumptions are met. Monte Carlo results are presented in this paper which show the consequences of estimating percentiles by: (1) assuming normality when the data are really from a lognormal distribution; and (2) assuming lognormality when the data are really from a normal distribution
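The consequence the Monte Carlo study quantifies can be seen in miniature: estimating an upper percentile of log-normal data under a (wrong) normality assumption gives a visibly biased value, while estimating on the logs and back-transforming recovers it. A sketch with an arbitrary scale parameter:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

sigma = 0.6  # arbitrary log-scale parameter
data = rng.lognormal(mean=0.0, sigma=sigma, size=20_000)

# (1) Wrongly assume normality: mean + 1.96 * sd
p_norm = data.mean() + 1.96 * data.std()

# (2) Correctly assume log-normality: estimate on logs, back-transform
logs = np.log(data)
p_lognorm = np.exp(logs.mean() + 1.96 * logs.std())

# Exact 97.5th percentile of the true distribution, for comparison
true_p = np.exp(norm.ppf(0.975) * sigma)
```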

  13. A novel generalized normal distribution for human longevity and other negatively skewed data.

    Science.gov (United States)

    Robertson, Henry T; Allison, David B

    2012-01-01

Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose an alternative approach of a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by (1) an intuitively straightforward genesis; (2) closed forms for the pdf, cdf, mode, quantile, and hazard functions; and (3) accessibility to non-statisticians, based on its close relationship to the normal distribution.

  14. Sketching Curves for Normal Distributions--Geometric Connections

    Science.gov (United States)

    Bosse, Michael J.

    2006-01-01

    Within statistics instruction, students are often requested to sketch the curve representing a normal distribution with a given mean and standard deviation. Unfortunately, these sketches are often notoriously imprecise. Poor sketches are usually the result of missing mathematical knowledge. This paper considers relationships which exist among…

  15. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
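The paper's central observation can be reproduced in a few lines: for a moderate number of positive (here log-normal) summands, the sum is still clearly skewed, while its logarithm is nearly symmetric, i.e. the sum is much closer to log-normal than to Gaussian. A sketch with illustrative parameters:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(7)

# Sum of a moderate number of positive random variables
n_terms, n_samples = 10, 50_000
summands = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, n_terms))
total = summands.sum(axis=1)

skew_sum = skew(total)          # still clearly right-skewed
skew_log = skew(np.log(total))  # near zero: log of the sum looks normal
```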

  16. A general approach to double-moment normalization of drop size distributions

    NARCIS (Netherlands)

    Lee, G.W.; Zawadzki, I.; Szyrmer, W.; Sempere Torres, D.; Uijlenhoet, R.

    2004-01-01

    Normalization of drop size distributions (DSDs) is reexamined here. First, an extension of the scaling normalization that uses one moment of the DSD as a scaling parameter to a more general scaling normalization that uses two moments as scaling parameters of the normalization is presented. In

  17. Complex conjugate poles and parton distributions

    International Nuclear Information System (INIS)

    Tiburzi, B.C.; Detmold, W.; Miller, G.A.

    2003-01-01

    We calculate parton and generalized parton distributions in Minkowski space using a scalar propagator with a pair of complex conjugate poles. Correct spectral and support properties are obtained only after careful analytic continuation from Euclidean space. Alternately the quark distribution function can be calculated from modified cutting rules, which put the intermediate state on its complex mass shells. Distribution functions agree with those resulting from the model's Euclidean space double distribution which we calculate via nondiagonal matrix elements of twist-two operators. Thus one can use a wide class of analytic parametrizations of the quark propagator to connect Euclidean space Green functions to light-cone dominated amplitudes

  18. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    Science.gov (United States)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using a two-component univariate normal mixture distributions model. First, we present the application of the model in empirical finance, where we fit our real data. Second, we present its application in risk analysis, where we evaluate VaR and CVaR with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating VaR and CVaR, capturing the stylized facts of non-normality and leptokurtosis in the returns distribution.
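
    The two-step procedure the abstract describes (fit a two-component normal mixture, then read off VaR and CVaR) can be sketched as follows. This is a minimal EM implementation on synthetic "returns", not the authors' code, and the regime parameters are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    # Synthetic "returns": a calm regime and a volatile regime (invented parameters)
    r = np.concatenate([rng.normal(0.01, 0.02, 800), rng.normal(-0.01, 0.06, 200)])

    # EM for a two-component univariate normal mixture
    w = np.array([0.5, 0.5])
    mu = np.array([r.mean() - 0.01, r.mean() + 0.01])
    s = np.array([r.std(), r.std()])
    for _ in range(200):
        # E-step: component responsibilities
        pdf = np.array([wk / (sk * np.sqrt(2 * np.pi)) * np.exp(-(r - mk) ** 2 / (2 * sk ** 2))
                        for wk, mk, sk in zip(w, mu, s)])
        resp = pdf / pdf.sum(axis=0)
        # M-step: update weights, means, standard deviations
        nk = resp.sum(axis=1)
        w = nk / r.size
        mu = (resp * r).sum(axis=1) / nk
        s = np.sqrt((resp * (r - mu[:, None]) ** 2).sum(axis=1) / nk)

    # VaR and CVaR at level alpha, by Monte Carlo from the fitted mixture
    alpha = 0.05
    comp = rng.choice(2, size=100_000, p=w)
    sim = rng.normal(mu[comp], s[comp])
    var = -np.quantile(sim, alpha)   # loss exceeded with probability alpha
    cvar = -sim[sim <= -var].mean()  # expected loss given the VaR is exceeded
    print(var, cvar)
    ```

    By construction CVaR is at least as large as VaR, since it averages the losses beyond the VaR threshold.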

  19. Optimal transformations leading to normal distributions of positron emission tomography standardized uptake values

    Science.gov (United States)

    Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert

    2018-02-01

    The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions. This limits utilization of powerful parametric statistical models for analyzing SUV measurements. An ad hoc approach, which is frequently used in practice, is to blindly use a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors and to assess the effects of therapy on the optimal transformations. Methods. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions at both pre and post treatment. This study included 57 patients who underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (publicly available dataset). In addition, to test the generality of our transformation methodology, we included analysis of 27 patients who underwent 18F-Fluorothymidine (18F-FLT) PET scans at our institution. Results. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. There were types of distributions encountered for both 18F-FDG and 18F-FLT where a log transformation was not optimal for providing normal SUV distributions. Conclusion. Optimization of the Box-Cox transformation offers a solution for identifying normal SUV transformations for when
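
    The optimization the authors describe (iterate the Box-Cox parameter λ and keep the value that maximizes the Shapiro-Wilk P-value) can be sketched with SciPy; the synthetic SUV sample below stands in for real tumor measurements:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    suv = rng.lognormal(mean=1.0, sigma=0.6, size=80)  # stand-in for skewed SUVmax data

    # Grid search: keep the Box-Cox lambda that maximizes the Shapiro-Wilk P-value
    lambdas = np.linspace(-2.0, 2.0, 81)
    pvals = np.array([stats.shapiro(stats.boxcox(suv, lmbda=lam))[1]
                      for lam in lambdas])
    best_lambda = lambdas[pvals.argmax()]
    print(best_lambda, pvals.max())
    ```

    For log-normally distributed data the optimum lands near λ = 0, which corresponds to the log transformation; for other shapes the grid search selects a different λ, matching the paper's point that a blind log transform is not always optimal.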

  20. An approach to normal forms of Kuramoto model with distributed delays and the effect of minimal delay

    Energy Technology Data Exchange (ETDEWEB)

    Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)

    2015-09-25

    Heterogeneous delays with a positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the dynamical transition from incoherence to coherence is mediated by Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms, stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. With respect to Gamma-distributed delays with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of the excess kurtosis of the Gamma distribution. - Highlights: • Heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • Perturbation technique on complex domain is established for bifurcation analysis. • Hysteresis phenomenon is investigated in a theoretical way. • The effect of excess kurtosis of distributed delays is discussed.

  1. Generating log-normally distributed random numbers by using the Ziggurat algorithm

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2016-01-01

    Uncertainty analyses are usually based on the Monte Carlo method. Using an efficient random number generator (RNG) is a key element in the success of Monte Carlo simulations. Log-normally distributed variates are very typical in NPP PSAs. This paper proposes an approach to generating log-normally distributed variates based on the Ziggurat algorithm and evaluates the efficiency of the proposed Ziggurat RNG. The proposed RNG can be helpful to improve the uncertainty analysis of NPP PSAs. This paper focuses on evaluating the efficiency of the Ziggurat algorithm from an NPP PSA point of view. From this study, we can draw the following conclusions. - The Ziggurat algorithm is an excellent random number generator for producing normally distributed variates. - The Ziggurat algorithm is computationally much faster than the most commonly used method, the Marsaglia polar method
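
    Independently of the paper's PSA context, the standard route to log-normal variates via a ziggurat generator is simply to exponentiate normal variates; NumPy's `Generator` draws its normals with a ziggurat-type algorithm, so a minimal sketch is:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)  # NumPy's Generator draws normals via a ziggurat method
    mu, sigma = 0.0, 0.5
    z = rng.standard_normal(200_000)
    x = np.exp(mu + sigma * z)      # log-normally distributed variates

    # Sanity check against the theoretical log-normal moments
    mean_theory = np.exp(mu + sigma ** 2 / 2)
    var_theory = (np.exp(sigma ** 2) - 1) * np.exp(2 * mu + sigma ** 2)
    print(x.mean(), mean_theory, x.var(), var_theory)
    ```

    The sample mean and variance should agree closely with the closed-form log-normal moments.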

  2. The retest distribution of the visual field summary index mean deviation is close to normal.

    Science.gov (United States)

    Anderson, Andrew J; Cheng, Allan C Y; Lau, Samantha; Le-Pham, Anne; Liu, Victor; Rahman, Farahnaz

    2016-09-01

    When modelling optimum strategies for how best to determine visual field progression in glaucoma, it is commonly assumed that the summary index mean deviation (MD) is normally distributed on repeated testing. Here we tested whether this assumption is correct. We obtained 42 reliable 24-2 Humphrey Field Analyzer SITA standard visual fields from one eye of each of five healthy young observers, with the first two fields excluded from analysis. Previous work has shown that although MD variability is higher in glaucoma, the shape of the MD distribution is similar to that found in normal visual fields. A Shapiro-Wilk test determined any deviation from normality. Kurtosis values for the distributions were also calculated. Data from each observer passed the Shapiro-Wilk normality test. Bootstrapped 95% confidence intervals for kurtosis encompassed the value for a normal distribution in four of five observers. When examined with quantile-quantile plots, distributions were close to normal and showed no consistent deviations across observers. The retest distribution of MD is not significantly different from normal in healthy observers, and so is likely also normally distributed - or nearly so - in those with glaucoma. Our results increase our confidence in the results of influential modelling studies where a normal distribution for MD was assumed. © 2016 The Authors Ophthalmic & Physiological Optics © 2016 The College of Optometrists.
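
    The two checks used in the study, a Shapiro-Wilk test plus a bootstrapped confidence interval for kurtosis, can be sketched as follows; the synthetic MD values stand in for the actual visual field data:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    md = rng.normal(-1.0, 0.5, size=40)  # stand-in for 40 repeated MD values (dB)

    # Shapiro-Wilk test for departure from normality
    p_sw = stats.shapiro(md)[1]

    # Bootstrap percentile CI for excess kurtosis (0 for a normal distribution)
    boot = np.array([stats.kurtosis(rng.choice(md, size=md.size, replace=True))
                     for _ in range(5000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(p_sw, lo, hi)
    ```

    If the interval [lo, hi] contains 0, the data are consistent with normal kurtosis, which is the criterion the abstract reports for four of the five observers.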

  3. Austenite Grain Size Estimation from Chord Lengths of Logarithmic-Normal Distribution

    Directory of Open Access Journals (Sweden)

    Adrian H.

    2017-12-01

    A linear section through the grains of a polyhedral material microstructure is a system of chords. The mean length of the chords is the linear grain size of the microstructure. For the prior austenite grains of low-alloy structural steels, the chord length is a random variable of gamma or logarithmic-normal distribution. Statistical grain size estimation belongs to the quantitative metallography problems. The so-called point estimation is a well-known procedure. The interval estimation (grain size confidence interval) for the gamma distribution was given elsewhere; for the logarithmic-normal distribution it is the subject of the present contribution. The statistical analysis is analogous to the one for the gamma distribution.

  4. Kullback–Leibler Divergence of the γ–ordered Normal over t–distribution

    OpenAIRE

    Toulias, T-L.; Kitsos, C-P.

    2012-01-01

    The aim of this paper is to evaluate and study the Kullback–Leibler divergence of the γ–ordered Normal distribution, a generalization of Normal distribution emerged from the generalized Fisher’s information measure, over the scaled t–distribution. We investigate this evaluation through a series of bounds and approximations while the asymptotic behavior of the divergence is also studied. Moreover, we obtain a generalization of the known Kullback–Leibler information measure betwe...

  5. Scale and shape mixtures of multivariate skew-normal distributions

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2018-02-26

    We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down the theoretical foundations for subsequent inference with this model. In particular, we study linear transformations, marginal distributions, selection representations, stochastic representations and hierarchical representations. We also describe an EM-type algorithm for maximum likelihood estimation of the parameters of the model and demonstrate its implementation on a wind dataset. Our family of multivariate distributions unifies and extends many existing models of the literature that can be seen as submodels of our proposal.

  6. Quantum arrival times and operator normalization

    International Nuclear Information System (INIS)

    Hegerfeldt, Gerhard C.; Seidel, Dirk; Gonzalo Muga, J.

    2003-01-01

    A recent approach to arrival times used the fluorescence of an atom entering a laser illuminated region, and the resulting arrival-time distribution was close to the axiomatic distribution of Kijowski, but not exactly equal, neither in limiting cases nor after compensation of reflection losses by normalization on the level of expectation values. In this paper we employ a normalization on the level of operators, recently proposed in a slightly different context. We show that in this case the axiomatic arrival-time distribution of Kijowski is recovered as a limiting case. In addition, it is shown that Allcock's complex potential model is also a limit of the physically motivated fluorescence approach and connected to Kijowski's distribution through operator normalization

  7. Improved Root Normal Size Distributions for Liquid Atomization

    Science.gov (United States)

    2015-11-01

  8. Application of a truncated normal failure distribution in reliability testing

    Science.gov (United States)

    Groves, C., Jr.

    1968-01-01

    The statistical truncated normal distribution function is applied as a time-to-failure distribution function in equipment reliability estimation. The age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
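
    A truncated normal time-to-failure model of the kind described can be sketched with `scipy.stats.truncnorm`; note that its bounds are specified in standard-deviation units relative to `loc` and `scale`. All numbers below are illustrative:

    ```python
    import numpy as np
    from scipy import stats

    # Time-to-failure modeled as normal(mu, sd) truncated to t >= 0
    mu, sd = 1000.0, 300.0          # hours (illustrative values)
    a, b = (0.0 - mu) / sd, np.inf  # truncnorm bounds are in units of sd
    ttf = stats.truncnorm(a, b, loc=mu, scale=sd)

    reliability = ttf.sf(800.0)              # P(T > 800 h)
    hazard = ttf.pdf(800.0) / ttf.sf(800.0)  # age-dependent failure rate at 800 h
    print(reliability, hazard)
    ```

    The hazard rate of this model increases with age, which is the age-dependent characteristic the abstract exploits for reliability testing.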

  9. Mast cell distribution in normal adult skin.

    Science.gov (United States)

    Janssens, A S; Heide, R; den Hollander, J C; Mulder, P G M; Tank, B; Oranje, A P

    2005-03-01

    To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults. Using the anti-tryptase monoclonal antibody technique, an uneven distribution of MCs was found across body sites. Numbers of MCs on the trunk, upper arm, and upper leg were similar, but were significantly different from those found on the lower leg and forearm, forming two distinct groups: proximal and distal. There were 77.0 MCs/mm² at proximal body sites and 108.2 MCs/mm² at distal sites. Adjusted for the adjacent diagnosis and age, this difference was consistent. The numbers of MCs in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders were not different from those in the control group. A pilot study in patients with mastocytosis underlined the variation in the numbers of MCs in mastocytosis and normal skin, but showed a considerable overlap. The observed numbers of MCs in adults cannot be extrapolated to children. MC numbers varied significantly between proximal and distal body sites, and these differences must be considered when MCs are counted for a reliable diagnosis of mastocytosis.

  10. IntraGolgi distribution of the Conserved Oligomeric Golgi (COG) complex

    International Nuclear Information System (INIS)

    Vasile, Eliza; Oka, Toshihiko; Ericsson, Maria; Nakamura, Nobuhiro; Krieger, Monty

    2006-01-01

    The Conserved Oligomeric Golgi (COG) complex is an eight-subunit (Cog1-8) peripheral Golgi protein involved in membrane trafficking and glycoconjugate synthesis. COG appears to participate in retrograde vesicular transport and is required to maintain normal Golgi structure and function. COG mutations interfere with normal transport, distribution, and/or stability of Golgi proteins associated with glycoconjugate synthesis and trafficking, and lead to failure of spermatogenesis in Drosophila melanogaster, misdirected migration of gonadal distal tip cells in Caenorhabditis elegans, and type II congenital disorders of glycosylation in humans. The mechanism by which COG influences Golgi structure and function is unclear. Immunogold electron microscopy was used to visualize the intraGolgi distribution of a functional, hemagglutinin epitope-labeled COG subunit, Cog1-HA, that complements the Cog1-deficiency in Cog1-null Chinese hamster ovary cells. COG was found to be localized primarily on or in close proximity to the tips and rims of the Golgi's cisternae and their associated vesicles and on vesicles and vesiculo-tubular structures seen on both the cis and trans-Golgi Network faces of the cisternal stacks, in some cases on COPI containing vesicles. These findings support the proposal that COG is directly involved in controlling vesicular retrograde transport of Golgi resident proteins throughout the Golgi apparatus

  11. Understanding the implementation of complex interventions in health care: the normalization process model

    Directory of Open Access Journals (Sweden)

    Rogers Anne

    2007-09-01

    Background The Normalization Process Model is a theoretical model that assists in explaining the processes by which complex interventions become routinely embedded in health care practice. It offers a framework for process evaluation and also for comparative studies of complex interventions. It focuses on the factors that promote or inhibit the routine embedding of complex interventions in health care practice. Methods A formal theory structure is used to define the model, and its internal causal relations and mechanisms. The model is broken down to show that it is consistent and adequate in generating accurate description, systematic explanation, and the production of rational knowledge claims about the workability and integration of complex interventions. Results The model explains the normalization of complex interventions by reference to four factors demonstrated to promote or inhibit the operationalization and embedding of complex interventions (interactional workability, relational integration, skill-set workability, and contextual integration). Conclusion The model is consistent and adequate. Repeated calls for theoretically sound process evaluations in randomized controlled trials of complex interventions, and policy-makers who call for a proper understanding of implementation processes, emphasize the value of conceptual tools like the Normalization Process Model.

  12. The effects of familiarity and complexity on appraisal of complex songs by cochlear implant recipients and normal hearing adults.

    Science.gov (United States)

    Gfeller, Kate; Christ, Aaron; Knutson, John; Witt, Shelley; Mehr, Maureen

    2003-01-01

    The purposes of this study were (a) to develop a test of complex song appraisal that would be suitable for use with adults who use a cochlear implant (assistive hearing device) and (b) to compare the appraisal ratings (liking) of complex songs by adults who use cochlear implants (n = 66) with a comparison group of adults with normal hearing (n = 36). The article describes the development of a computerized test for appraisal, with emphasis on its theoretical basis and the process for item selection of naturalistic stimuli. The appraisal test was administered to the 2 groups to determine the effects of prior song familiarity and subjective complexity on complex song appraisal. Comparison of the 2 groups indicates that the implant users rate 2 of 3 musical genres (country western, pop) as significantly more complex than do normal hearing adults, and give significantly less positive ratings to classical music than do normal hearing adults. Appraisal responses of implant recipients were examined in relation to hearing history, age, performance on speech perception and cognitive tests, and musical background.

  13. Comparison of CSF Distribution between Idiopathic Normal Pressure Hydrocephalus and Alzheimer Disease.

    Science.gov (United States)

    Yamada, S; Ishikawa, M; Yamamoto, K

    2016-07-01

    CSF volumes in the basal cistern and Sylvian fissure are increased in both idiopathic normal pressure hydrocephalus and Alzheimer disease, though the differences in these volumes in idiopathic normal pressure hydrocephalus and Alzheimer disease have not been well-described. Using CSF segmentation and volume quantification, we compared the distribution of CSF in idiopathic normal pressure hydrocephalus and Alzheimer disease. CSF volumes were extracted from T2-weighted 3D spin-echo sequences on 3T MR imaging and quantified semi-automatically. We compared the volumes and ratios of the ventricles and subarachnoid spaces after classification in 30 patients diagnosed with idiopathic normal pressure hydrocephalus, 10 with concurrent idiopathic normal pressure hydrocephalus and Alzheimer disease, 18 with Alzheimer disease, and 26 control subjects 60 years of age or older. Brain to ventricle ratios at the anterior and posterior commissure levels and 3D volumetric convexity cistern to ventricle ratios were useful indices for the differential diagnosis of idiopathic normal pressure hydrocephalus or idiopathic normal pressure hydrocephalus with Alzheimer disease from Alzheimer disease, similar to the z-Evans index and callosal angle. The most distinctive characteristics of the CSF distribution in idiopathic normal pressure hydrocephalus were small convexity subarachnoid spaces and the large volume of the basal cistern and Sylvian fissure. The distribution of the subarachnoid spaces in the idiopathic normal pressure hydrocephalus with Alzheimer disease group was the most deformed among these 3 groups, though the mean ventricular volume of the idiopathic normal pressure hydrocephalus with Alzheimer disease group was intermediate between that of the idiopathic normal pressure hydrocephalus and Alzheimer disease groups. 
The z-axial expansion of the lateral ventricle and compression of the brain just above the ventricle were the common findings in the parameters for differentiating

  14. Multivariate stochastic simulation with subjective multivariate normal distributions

    Science.gov (United States)

    P. J. Ince; J. Buongiorno

    1991-01-01

    In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, in most simulations it has been assumed that random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assessed multivariate normal...
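
    The standard way to draw correlated multivariate normal variates for such a simulation is a Cholesky factorization of the covariance matrix. A sketch with assumed (subjectively chosen) means and covariances:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    # Assumed means and covariance matrix for three correlated input variables
    mean = np.array([10.0, 50.0, 2.0])
    cov = np.array([[4.0, 2.4, 0.4],
                    [2.4, 9.0, 0.9],
                    [0.4, 0.9, 1.0]])

    L = np.linalg.cholesky(cov)            # cov = L @ L.T
    z = rng.standard_normal((100_000, 3))  # independent standard normal draws
    samples = mean + z @ L.T               # correlated multivariate normal draws

    print(samples.mean(axis=0), np.cov(samples, rowvar=False))
    ```

    The sample mean vector and covariance matrix should reproduce the assumed inputs, unlike an independence assumption, which would zero out the off-diagonal terms.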

  15. A general approach to double-moment normalization of drop size distributions

    Science.gov (United States)

    Lee, G. W.; Sempere-Torres, D.; Uijlenhoet, R.; Zawadzki, I.

    2003-04-01

    Normalization of drop size distributions (DSDs) is re-examined here. First, we present an extension of the scaling normalization that uses one moment of the DSD as a parameter (as introduced by Sempere-Torres et al., 1994) to a scaling normalization that uses two moments as parameters of the normalization. It is shown that the normalization of Testud et al. (2001) is a particular case of the two-moment scaling normalization. Thus, a unified vision of the question of DSD normalization and a good model representation of DSDs is given. Data analysis shows that, from the point of view of moment estimation, least-squares regression is slightly more effective than moment estimation from the normalized average DSD.

  16. Linking the Value Assessment of Oil and Gas Firms to Ambidexterity Theory Using a Mixture of Normal Distributions

    Directory of Open Access Journals (Sweden)

    Casault Sébastien

    2016-05-01

    Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory: the variation in their market returns is non-Gaussian. In this paper, the nature and underlying reason for these significant deviations from expected behavior are considered. Understanding these differences in financial market behavior is important for a wide range of reasons, including: assessing investments, investor relations, decisions to raise capital, and assessment of firm and management performance. We show that using a "thicker tailed" mixture of two normal distributions offers a significantly more accurate model than the traditional Gaussian approach in describing the behavior of the value of oil and gas firms. This mixture of normal distributions is also more effective in bridging the gap between management theory and practice without the need to introduce complex time-sensitive GARCH and/or jump diffusion dynamics. The mixture distribution is consistent with ambidexterity theory, which suggests firms operate in two distinct states driven by the primary focus of the firm: an exploration state with high uncertainty and an exploitation (or production) state with lower uncertainty. The findings have direct implications for improving the accuracy of real option pricing techniques and futures analysis of risk management. Traditional options pricing models assume that commercial returns from these assets are described by a normal random walk. However, a normal random walk model discounts the possibility of large changes to the marketplace from events such as the discovery of important reserves or the introduction of new technology. The mixture distribution proves to be well suited to inherently describe the unusually large risks and opportunities associated with oil and gas production and exploration. A significance testing study of 554 oil and gas exploration and production firms empirically supports using a mixture

  17. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management field. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use more of the variance information in the original data than the prevailing representative-type approach in the literature, which only uses centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.

  18. Distributive justice and cognitive enhancement in lower, normal intelligence.

    Science.gov (United States)

    Dunlop, Mikael; Savulescu, Julian

    2014-01-01

    There exists a significant disparity within society between individuals in terms of intelligence. While intelligence varies naturally throughout society, the extent to which this impacts on the life opportunities it affords to each individual is greatly undervalued. Intelligence appears to have a prominent effect over a broad range of social and economic life outcomes. Many key determinants of well-being correlate highly with the results of IQ tests, and other measures of intelligence, and an IQ of 75 is generally accepted as the most important threshold in modern life. The ability to enhance our cognitive capacities offers an exciting opportunity to correct disabling natural variation and inequality in intelligence. Pharmaceutical cognitive enhancers, such as modafinil and methylphenidate, have been shown to have the capacity to enhance cognition in normal, healthy individuals. Perhaps of most relevance is the presence of an 'inverted U effect' for most pharmaceutical cognitive enhancers, whereby the degree of enhancement increases as intelligence levels deviate further below the mean. Although enhancement, including cognitive enhancement, has been much debated recently, we argue that there are egalitarian reasons to enhance individuals with low but normal intelligence. Under egalitarianism, cognitive enhancement has the potential to reduce opportunity inequality and contribute to relative income and welfare equality in the lower, normal intelligence subgroup. Cognitive enhancement use is justifiable under prioritarianism through various means of distribution; selective access to the lower, normal intelligence subgroup, universal access, or paradoxically through access primarily to the average and above average intelligence subgroups. Similarly, an aggregate increase in social well-being is achieved through similar means of distribution under utilitarianism. 
In addition, the use of cognitive enhancement within the lower, normal intelligence subgroup negates, or at

  19. Anal sphincter complex: endoanal MR imaging of normal anatomy

    NARCIS (Netherlands)

    Hussain, S. M.; Stoker, J.; Laméris, J. S.

    1995-01-01

    To determine the normal anatomy of the anal sphincter complex on magnetic resonance (MR) images. Ten healthy volunteers (four men, six women; age range, 21-26 years) underwent MR imaging with an endoanal coil. The lower part of the anal canal contained the internal sphincter, the longitudinal muscle

  20. The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.

    Science.gov (United States)

    Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica

    2014-05-01

    The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increases, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
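
    A common way to exploit the asymmetric distribution of the product in practice is the Monte Carlo confidence interval for the indirect effect: simulate the two coefficients from their estimated sampling distributions and take percentiles of the product. The coefficient values and standard errors below are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    # Hypothetical path estimates and standard errors from a mediation model
    a_hat, se_a = 0.40, 0.15  # X -> M
    b_hat, se_b = 0.30, 0.12  # M -> Y, controlling for X

    # Percentile interval of the simulated product a*b (asymmetric, unlike normal theory)
    draws = rng.normal(a_hat, se_a, 100_000) * rng.normal(b_hat, se_b, 100_000)
    lo, hi = np.percentile(draws, [2.5, 97.5])
    print(lo, hi)
    ```

    Unlike a symmetric normal-theory interval of the form a*b ± 1.96·SE, this interval inherits the skewness and kurtosis of the distribution of the product, which is why the resampling-style approaches discussed in the abstract tend to give more accurate coverage.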

  1. Normal bone and soft tissue distribution of fluorine-18-sodium fluoride and artifacts on 18F-NaF PET/CT bone scan: a pictorial review.

    Science.gov (United States)

    Sarikaya, Ismet; Elgazzar, Abdelhamid H; Sarikaya, Ali; Alfeeli, Mahmoud

    2017-10-01

    Fluorine-18-sodium fluoride (18F-NaF) PET/CT is a relatively new and high-resolution bone imaging modality. Since the use of 18F-NaF PET/CT has been increasing, it is important to accurately assess the images and be aware of normal distribution and major artifacts. In this pictorial review article, we will describe the normal uptake patterns of 18F-NaF in the bone tissues, particularly in complex structures, as well as its physiologic soft tissue distribution and certain artifacts seen on 18F-NaF PET/CT images.

  2. Wealth distribution on complex networks

    Science.gov (United States)

    Ichinomiya, Takashi

    2012-12-01

    We study the wealth distribution of the Bouchaud-Mézard model on complex networks. It is known from numerical simulations that this distribution depends on the topology of the network; however, no one has succeeded in explaining it. Using “adiabatic” and “independent” assumptions along with the central-limit theorem, we derive equations that determine the probability distribution function. The results are compared to those of simulations for various networks. We find good agreement between our theory and the simulations, except for the case of Watts-Strogatz networks with a low rewiring rate, due to the breakdown of the independence assumption.

  3. Radiation distribution sensing with normal optical fiber

    International Nuclear Information System (INIS)

    Kawarabayashi, Jun; Mizuno, Ryoji; Naka, Ryotaro; Uritani, Akira; Watanabe, Ken-ichi; Iguchi, Tetsuo; Tsujimura, Norio

    2002-01-01

    The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10-100 m) and can obtain continuous radiation distributions. The principle of the position sensing is based on a time-of-flight technique. The response characteristics of this monitor to beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (⁹⁰Sr-⁹⁰Y), gamma rays (¹³⁷Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and the D-T neutrons were 0.11%, 1.6x10⁻⁵% and 5.4x10⁻⁴%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new principle of position sensing based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point, which indicates that radiation distributions can be calculated from the spectrum by a mathematical deconvolution technique. (author)

  4. Optimum parameters in a model for tumour control probability, including interpatient heterogeneity: evaluation of the log-normal distribution

    International Nuclear Information System (INIS)

    Keall, P J; Webb, S

    2007-01-01

    The heterogeneity of human tumour radiation response is well known. Researchers have used the normal distribution to describe interpatient tumour radiosensitivity. However, many natural phenomena show a log-normal distribution. Log-normal distributions are common when mean values are low, variances are large and values cannot be negative. These conditions apply to radiosensitivity. The aim of this work was to evaluate the log-normal distribution to predict clinical tumour control probability (TCP) data and to compare the results with the homogeneous (δ-function with a single α-value) and normal distributions. The clinically derived TCP data for four tumour types (melanoma, breast, squamous cell carcinoma and nodes) were used to fit the TCP models. Three forms of interpatient tumour radiosensitivity were considered: the log-normal, normal and δ-function. The free parameters in the models were the radiosensitivity mean, standard deviation and clonogenic cell density. The evaluation metric was the deviance of the maximum likelihood estimation of the fit of the TCP calculated using the predicted parameters to the clinical data. We conclude that (1) the log-normal and normal distributions of interpatient tumour radiosensitivity heterogeneity describe clinical TCP data more closely than a single radiosensitivity value and (2) the log-normal distribution has some theoretical and practical advantages over the normal distribution. Further work is needed to test these models on higher quality clinical outcome datasets.
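The class of models compared above can be sketched as follows. This is a minimal Poisson TCP model with log-normal radiosensitivity heterogeneity; the clonogen number and the log-normal parameters mu and sigma are assumed, illustrative values, not the fitted parameters from the study:

```python
import math

def tcp_single(alpha, dose, clonogens=1e7):
    # Homogeneous (delta-function) model: every patient shares one alpha.
    # Poisson TCP: all clonogens killed, per-cell survival exp(-alpha*dose).
    return math.exp(-clonogens * math.exp(-alpha * dose))

def tcp_lognormal(dose, mu=-1.0, sigma=0.3, clonogens=1e7, n=2000):
    # Population TCP: average tcp_single over a log-normal distribution of
    # alpha, via midpoint integration on the underlying normal variable z.
    total = 0.0
    for i in range(n):
        z = -5.0 + 10.0 * (i + 0.5) / n                      # z in (-5, 5)
        weight = math.exp(-z * z / 2) / math.sqrt(2 * math.pi) * (10.0 / n)
        alpha = math.exp(mu + sigma * z)                     # always positive
        total += weight * tcp_single(alpha, dose, clonogens)
    return total

# Heterogeneity flattens the dose-response curve relative to a single alpha.
print(tcp_single(math.exp(-1.0), 60), tcp_lognormal(60))
```

At the same median radiosensitivity, the heterogeneous population has a lower TCP at high dose, because radioresistant (low-alpha) patients dominate the failures.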

  5. Computer program determines exact two-sided tolerance limits for normal distributions

    Science.gov (United States)

    Friedman, H. A.; Webb, S. R.

    1968-01-01

    Computer program determines, by numerical integration, the exact statistical two-sided tolerance limits for which the proportion of the population between the limits is at least a specified value. The program is limited to situations in which the underlying probability distribution for the sampled population is the normal distribution with unknown mean and variance.

  6. Distribution of Different Sized Ocular Surface Vessels in Diabetics and Normal Individuals.

    Science.gov (United States)

    Banaee, Touka; Pourreza, Hamidreza; Doosti, Hassan; Abrishami, Mojtaba; Ehsaei, Asieh; Basiry, Mohsen; Pourreza, Reza

    2017-01-01

    To compare the distribution of different sized vessels using digital photographs of the ocular surface of diabetic and normal individuals. In this cross-sectional study, red-free conjunctival photographs of diabetic and normal individuals, aged 30-60 years, were taken under defined conditions and analyzed using a Radon transform-based algorithm for vascular segmentation. The image areas occupied by vessels (AOV) of different diameters were calculated. The main outcome measure was the distribution curve of mean AOV of different sized vessels. Secondary outcome measures included total AOV and standard deviation (SD) of AOV of different sized vessels. Two hundred and sixty-eight diabetic patients and 297 normal (control) individuals were included, differing in age (45.50 ± 5.19 vs. 40.38 ± 6.19 years). The distribution curves of mean AOV differed between patients and controls (smaller AOV for larger vessels in patients), and patients showed a shifted distribution curve of vessels compared to controls. Presence of diabetes mellitus is associated with contraction of larger vessels in the conjunctiva. Smaller vessels dilate with diabetic retinopathy. These findings may be useful in the photographic screening of diabetes mellitus and retinopathy.

  7. Sample size determination for logistic regression on a logit-normal distribution.

    Science.gov (United States)

    Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance

    2017-06-01

    Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination (R²) of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for R² for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.

  8. Distribution of normal superficial ocular vessels in digital images.

    Science.gov (United States)

    Banaee, Touka; Ehsaei, Asieh; Pourreza, Hamidreza; Khajedaluee, Mohammad; Abrishami, Mojtaba; Basiri, Mohsen; Daneshvar Kakhki, Ramin; Pourreza, Reza

    2014-02-01

    To investigate the distribution of different-sized vessels in the digital images of the ocular surface, an endeavor which may provide useful information for future studies. This study included 295 healthy individuals. From each participant, four digital photographs of the superior and inferior conjunctivae of both eyes, with a fixed succession of photography (right upper, right lower, left upper, left lower), were taken with a slit lamp mounted camera. Photographs were then analyzed by a previously described algorithm for vessel detection in the digital images. The area (of the image) occupied by vessels (AOV) of different sizes was measured. Height, weight, fasting blood sugar (FBS) and hemoglobin levels were also measured and the relationship between these parameters and the AOV was investigated. These findings indicated a statistically significant difference in the distribution of the AOV among the four conjunctival areas. No significant correlations were noted between the AOV of each conjunctival area and the different demographic and biometric factors. Medium-sized vessels were the most abundant vessels in the photographs of the four investigated conjunctival areas. The AOV of the different sizes of vessels follows a normal distribution curve in the four areas of the conjunctiva. The distribution of the vessels in successive photographs changes in a specific manner, with the mean AOV becoming larger as the photos were taken from the right upper to the left lower area. The AOV of vessel sizes has a normal distribution curve and medium-sized vessels occupy the largest area of the photograph. Copyright © 2013 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  9. Different percentages of false-positive results obtained using five methods for the calculation of reference change values based on simulated normal and ln-normal distributions of data

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2016-01-01

    … a homeostatic set point that follows a normal (Gaussian) distribution. This set point (or baseline in steady state) should be estimated from a set of previous samples but, in practice, decisions based on the reference change value are often based on only two consecutive results. The original reference change value … false-positive results. The aim of this study was to investigate false-positive results using five different published methods for calculation of the reference change value. METHODS: The five reference change value methods were examined using normally and ln-normally distributed simulated data. RESULTS: One method performed best in approaching the theoretical false-positive percentages on normally distributed data, and another method performed best on ln-normally distributed data. The commonly used reference change value method based on two results (without use of an estimated set point) performed worst on both normally and ln-normally distributed data.
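For context, the classical reference change value for two consecutive results combines analytical and within-subject biological variation. A minimal sketch, assuming the conventional two-sided formula with z = 1.96 and illustrative CVs (not values from the study):

```python
import math

def reference_change_value(cv_analytical, cv_within, z=1.96):
    # Classical RCV: the percentage difference between two consecutive
    # results exceeded by chance alone with the stated probability,
    # assuming normally distributed analytical and biological variation.
    # RCV = sqrt(2) * z * sqrt(CV_A^2 + CV_I^2)
    return math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_within**2)

# Illustrative CVs (percent): analytical 3%, within-subject biological 5%.
rcv = reference_change_value(3.0, 5.0)
print(f"RCV = {rcv:.1f}%")  # → RCV = 16.2%
```

A difference between two consecutive results larger than this RCV would be flagged as significant; the article's point is that the false-positive rate of such flags depends strongly on which published variant of the calculation is used.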

  10. Radiation distribution sensing with normal optical fiber

    Energy Technology Data Exchange (ETDEWEB)

    Kawarabayashi, Jun; Mizuno, Ryoji; Naka, Ryotaro; Uritani, Akira; Watanabe, Ken-ichi; Iguchi, Tetsuo [Nagoya Univ., Dept. of Nuclear Engineering, Nagoya, Aichi (Japan); Tsujimura, Norio [Japan Nuclear Cycle Development Inst., Tokai Works, Tokai, Ibaraki (Japan)

    2002-12-01

    The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10-100 m) and can obtain continuous radiation distributions. The principle of the position sensing is based on a time-of-flight technique. The response characteristics of this monitor to beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (⁹⁰Sr-⁹⁰Y), gamma rays (¹³⁷Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and the D-T neutrons were 0.11%, 1.6x10⁻⁵% and 5.4x10⁻⁴%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new principle of position sensing based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point, which indicates that radiation distributions can be calculated from the spectrum by a mathematical deconvolution technique. (author)

  11. Normalization of High Dimensional Genomics Data Where the Distribution of the Altered Variables Is Skewed

    Science.gov (United States)

    Landfors, Mattias; Philip, Philge; Rydén, Patrik; Stenberg, Per

    2011-01-01

    Genome-wide analysis of gene expression or protein binding patterns using different array- or sequencing-based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several types of experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increase. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate if the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed. Otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher…

  12. Adaptive Bayesian inference on the mean of an infinite-dimensional normal distribution

    NARCIS (Netherlands)

    Belitser, E.; Ghosal, S.

    2003-01-01

    We consider the problem of estimating the mean of an infinite-dimensional normal distribution from the Bayesian perspective. Under the assumption that the unknown true mean satisfies a "smoothness condition," we first derive the convergence rate of the posterior distribution for a prior that…

  13. Estimating Non-Normal Latent Trait Distributions within Item Response Theory Using True and Estimated Item Parameters

    Science.gov (United States)

    Sass, D. A.; Schmitt, T. A.; Walker, C. M.

    2008-01-01

    Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…

  14. DNA Double-Strand Break Rejoining in Complex Normal Tissues

    International Nuclear Information System (INIS)

    Ruebe, Claudia E.; Dong, Xiaorong; Kuehne, Martin; Fricke, Andreas; Kaestner, Lars; Lipp, Peter; Ruebe, Christian

    2008-01-01

    Purpose: The clinical radiation responses of different organs vary widely and likely depend on the intrinsic radiosensitivities of their different cell populations. Double-strand breaks (DSBs) are the most deleterious form of DNA damage induced by ionizing radiation, and the cells' capacity to rejoin radiation-induced DSBs is known to affect their intrinsic radiosensitivity. To date, only little is known about the induction and processing of radiation-induced DSBs in complex normal tissues. Using an in vivo model with repair-proficient mice, the highly sensitive γH2AX immunofluorescence was established to investigate whether differences in DSB rejoining could account for the substantial differences in clinical radiosensitivity observed among normal tissues. Methods and Materials: After whole body irradiation of C57BL/6 mice (0.1, 0.5, 1.0, and 2.0 Gy), the formation and rejoining of DSBs was analyzed by enumerating γH2AX foci in various organs representative of both early-responding (small intestine) and late-responding (lung, brain, heart, kidney) tissues. Results: The linear dose correlation observed in all analyzed tissues indicated that γH2AX immunofluorescence allows for the accurate quantification of DSBs in complex organs. Strikingly, the various normal tissues exhibited identical kinetics for γH2AX foci loss, despite their clearly different clinical radiation responses. Conclusion: The identical kinetics of DSB rejoining measured in different organs suggest that tissue-specific differences in radiation responses are independent of DSB rejoining. This finding emphasizes the fundamental role of DSB repair in maintaining genomic integrity, thereby contributing to cellular viability and functionality and, thus, tissue homeostasis

  15. Size distribution of interstellar particles. III. Peculiar extinctions and normal infrared extinction

    International Nuclear Information System (INIS)

    Mathis, J.S.; Wallenhorst, S.G.

    1981-01-01

    The effect of changing the upper and lower size limits of a distribution of bare graphite and silicate particles with n(a) ∝ a^(-q) is investigated. Mathis, Rumpl, and Nordsieck showed that the normal extinction is matched very well by having the small-size cutoff, a₋, ≈ 0.005 or 0.01 μm, the large-size cutoff, a₊, about 0.25 μm, and q = 3.5 for both substances. We consider the progressively peculiar extinctions exhibited by the well-observed stars sigma Sco, rho Oph, and theta 1 Ori C, with values of R_V [≡ A_V/E(B-V)] of 3.4, 4.4, and 5.5, compared to the normal 3.1. Two (sigma Sco, rho Oph) are in a neutral dense cloud; theta 1 Ori C is in the Orion Nebula. We find that sigma Sco has a normal graphite distribution but has had its small silicate particles removed, so that a₋(sil) ≈ 0.04 μm if q = 3.5, or q(sil) = 2.6 if the size limits are fixed. However, the upper size limit on silicates remains normal. In rho Oph, the graphite is still normal, but both a₋(sil) and a₊(sil) are increased, to about 0.04 μm and 0.4 or 0.5 μm, respectively, if q = 3.5, or q(sil) ≈ 1.3 if the size limits are fixed. In theta 1 Ori C, the small limit on graphite has increased to about 0.04 μm, or q(gra) ≈ 3, while the silicates are about like those in rho Oph. The calculated λ2175 bump is broader than the observed, but normal foreground extinction probably contributes appreciably to the observed bump. The absolute amount of extinction per H atom for rho Oph is not explained. The column density of H is so large that systematic effects might be present. Very large graphite particles (a > 3 μm) are required to "hide" the graphite without overly affecting the visual extinction, but a normal (small) graphite size distribution is required by the λ2175 bump. We feel that it is unlikely that such a bimodal distribution exists.

  16. Distributed redundancy and robustness in complex systems

    KAUST Repository

    Randles, Martin

    2011-03-01

    The uptake and increasing prevalence of Web 2.0 applications, promoting new large-scale and complex systems such as Cloud computing and the emerging Internet of Services/Things, requires tools and techniques to analyse and model methods to ensure the robustness of these new systems. This paper reports on assessing and improving complex system resilience using distributed redundancy, termed degeneracy in biological systems, to endow large-scale complicated computer systems with the same robustness that emerges in complex biological and natural systems. However, in order to promote an evolutionary approach, through emergent self-organisation, it is necessary to specify the systems in an 'open-ended' manner where not all states of the system are prescribed at design-time. In particular, an observer system is used to select robust topologies, within system components, based on a measurement of the first non-zero eigenvalue in the Laplacian spectrum of the components' network graphs, also known as the algebraic connectivity. It is shown, through experimentation on a simulation, that increasing the average algebraic connectivity across the components in a network leads to an increase in the variety of individual components, termed distributed redundancy: the capacity for structurally distinct components to perform an identical function in a particular context. The results are applied to a specific application where active clustering of like services is used to aid load balancing in a highly distributed network. Using the described procedure is shown to improve performance and distribute redundancy. © 2010 Elsevier Inc.
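The selection measure used by the observer system, the algebraic connectivity, can be computed directly from a graph's Laplacian. A minimal NumPy sketch (the two example graphs are illustrative, not topologies from the paper):

```python
import numpy as np

def algebraic_connectivity(adjacency):
    # Second-smallest eigenvalue of the graph Laplacian L = D - A
    # (the first non-zero eigenvalue for a connected graph, also known
    # as the Fiedler value); it is zero iff the graph is disconnected.
    a = np.asarray(adjacency, dtype=float)
    laplacian = np.diag(a.sum(axis=1)) - a
    return np.sort(np.linalg.eigvalsh(laplacian))[1]

# Path graph 0-1-2-3 versus the complete graph on four nodes.
path = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
complete = [[0, 1, 1, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 0]]
print(algebraic_connectivity(path))      # ~0.586: weakly connected
print(algebraic_connectivity(complete))  # 4.0: robustly connected
```

Higher values indicate topologies that stay connected under node or link failure, which is why the observer selects for them.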

  17. The PDF of fluid particle acceleration in turbulent flow with underlying normal distribution of velocity fluctuations

    International Nuclear Information System (INIS)

    Aringazin, A.K.; Mazhitov, M.I.

    2003-01-01

    We describe a formal procedure to obtain and specify the general form of a marginal distribution for the Lagrangian acceleration of a fluid particle in developed turbulent flow, using a Langevin-type equation and the assumption that the velocity fluctuation u follows a normal distribution with zero mean, in accord with the Heisenberg-Yaglom picture. For a particular representation, β = exp[u], of the fluctuating parameter β, we reproduce the underlying log-normal distribution and the associated marginal distribution, which was found to be in very good agreement with the new experimental data by Crawford, Mordant, and Bodenschatz on the acceleration statistics. We discuss possibilities for refining the log-normal model.

  18. Confidence Intervals for True Scores Using the Skew-Normal Distribution

    Science.gov (United States)

    Garcia-Perez, Miguel A.

    2010-01-01

    A recent comparative analysis of alternative interval estimation approaches and procedures has shown that confidence intervals (CIs) for true raw scores determined with the Score method--which uses the normal approximation to the binomial distribution--have actual coverage probabilities that are closest to their nominal level. It has also recently…

  19. Confidence bounds for normal and lognormal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill

    2003-01-01

    This paper compares the so-called exact approach for obtaining confidence intervals on normal distribution coefficients of variation to approximate methods. Approximate approaches were found to perform less well than the exact approach for large coefficients of variation and small sample sizes. Web-based computer programs are described for calculating confidence...

  20. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness ≤ 1.5).
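The conventional indices and the percentile idea behind the Clements/Burr approaches can be sketched as follows. This is a simplified stand-in: the percentile surrogate below uses raw empirical percentiles, whereas the published methods fit Pearson or Burr distributions to obtain them; all thresholds are illustrative.

```python
import statistics

def cp_cpk(data, lsl, usl):
    # Conventional indices; meaningful only if the output is ~normal.
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    return (usl - lsl) / (6 * sigma), min(usl - mu, mu - lsl) / (3 * sigma)

def percentile_capability(data, lsl, usl):
    # Percentile-style surrogate: replace the 6-sigma spread with the
    # empirical 0.135th-99.865th percentile range, so no normality
    # assumption is needed.
    ordered = sorted(data)
    n = len(ordered)
    lo = ordered[max(0, int(0.00135 * n))]
    hi = ordered[min(n - 1, int(0.99865 * n))]
    med = statistics.median(ordered)
    return min(usl - med, med - lsl) / ((hi - lo) / 2)
```

For normally distributed data the two approaches agree; for skewed data the percentile surrogate avoids the erroneous interpretation the abstract warns about.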

  1. A study of the up-and-down method for non-normal distribution functions

    DEFF Research Database (Denmark)

    Vibholm, Svend; Thyregod, Poul

    1988-01-01

    The assessment of breakdown probabilities by the up-and-down method is examined. The exact maximum-likelihood estimates for a number of response patterns are calculated for three different distribution functions and are compared with the estimates corresponding to the normal distribution. Estimates…

  2. The approximation of the normal distribution by means of chaotic expression

    International Nuclear Information System (INIS)

    Lawnik, M

    2014-01-01

    The normal distribution is approximated by means of a chaotic expression based on the Weierstrass function, where, for a certain set of parameters, the density of the derived recurrence renders a good approximation of the bell curve.

  3. Dobinski-type relations and the log-normal distribution

    International Nuclear Information System (INIS)

    Blasiak, P; Penson, K A; Solomon, A I

    2003-01-01

    We consider sequences of generalized Bell numbers B(n), n = 1, 2, ..., which can be represented by Dobinski-type summation formulae, i.e. B(n) = (1/C) Σ_{k=0..∞} [P(k)]^n / D(k), with P(k) a polynomial, D(k) a function of k and C = const. They include the standard Bell numbers (P(k) = k, D(k) = k!, C = e), their generalizations B_{r,r}(n), r = 2, 3, ..., appearing in the normal ordering of powers of boson monomials (P(k) = (k+r)!/k!, D(k) = k!, C = e), variants of 'ordered' Bell numbers B_o^(p)(n) (P(k) = k, D(k) = ((p+1)/p)^k, C = 1 + p, p = 1, 2, ...), etc. We demonstrate that for α, β, γ, t positive integers (α, t ≠ 0), [B(αn² + βn + γ)]^t is the nth moment of a positive function on (0, ∞) which is a weighted infinite sum of log-normal distributions. (letter to the editor)
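The standard special case named above (P(k) = k, D(k) = k!, C = e) is Dobinski's formula, which can be evaluated directly by truncating the infinite sum:

```python
from math import exp, factorial

def bell_number(n, terms=100):
    # Dobinski's formula for the standard Bell numbers:
    # B(n) = (1/e) * sum_{k>=0} k**n / k!
    # The series converges rapidly, so a truncated sum plus rounding
    # recovers the exact integer for moderate n.
    return round(sum(k**n / factorial(k) for k in range(terms)) / exp(1))

print([bell_number(n) for n in range(1, 7)])  # → [1, 2, 5, 15, 52, 203]
```

The generalized B_{r,r}(n) and "ordered" variants follow the same pattern with the P(k), D(k) and C listed in the abstract swapped in.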

  4. The law of distribution of light beam direction fluctuations in telescopes. [normal density functions

    Science.gov (United States)

    Divinskiy, M. L.; Kolchinskiy, I. G.

    1974-01-01

    The distribution of deviations from mean star trail directions was studied on the basis of 105 star trails. It was found that about 93% of the trails yield a distribution in agreement with the normal law. About 4% of the star trails agree with the Charlier distribution.

  5. An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis

    Science.gov (United States)

    Diwakar, Rekha

    2017-01-01

    Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data which does not contain negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…

  6. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations.

  7. MR imaging of the bone marrow using short TI IR, 1. Normal and pathological intensity distribution of the bone marrow

    Energy Technology Data Exchange (ETDEWEB)

    Ishizaka, Hiroshi; Kurihara, Mikiko; Tomioka, Kuniaki; Kobayashi, Kanako; Sato, Noriko; Nagai, Teruo; Heshiki, Atsuko; Amanuma, Makoto; Mizuno, Hitomi.

    1989-02-01

    Normal vertebral bone marrow intensity distribution and its alteration in various anemias were evaluated on short TI IR sequences. The material consisted of 73 individuals: 48 normal subjects and 25 anemic patients, excluding neoplastic conditions. All normal and reactive hypercellular bone marrow revealed a characteristic intensity distribution, marginal high intensity and central low intensity, corresponding well to the normal distribution of red and yellow marrow and the physiological or reactive conversion between them. Aplastic anemia did not reveal the normal intensity distribution, presumably due to its autonomous condition.

  8. Annual rainfall statistics for stations in the Top End of Australia: normal and log-normal distribution analysis

    International Nuclear Information System (INIS)

    Vardavas, I.M.

    1992-01-01

    A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia, where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
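The core calculation, fitting a log-normal to annual totals and reading off the value at a specified average exceedance probability, can be sketched with the standard library. The rainfall figures below are hypothetical illustrative values, not the actual Top End station records:

```python
import math
from statistics import NormalDist, mean, stdev

# Illustrative annual rainfall totals (mm).
rainfall = [1100, 1350, 980, 1620, 1240, 1480, 900, 1750, 1300, 1150]

# Fit a log-normal by estimating normal parameters on the log scale.
logs = [math.log(x) for x in rainfall]
fitted = NormalDist(mean(logs), stdev(logs))

# Annual rainfall exceeded on average once in 100 years (1% average
# exceedance probability) under the fitted log-normal model:
value = math.exp(fitted.inv_cdf(0.99))
print(f"1% AEP rainfall ≈ {value:.0f} mm")
```

In the report's procedure, a χ²-test (backed by the ζ-test) would first establish whether the normal or log-normal fit, and the sample size, are adequate before quoting such a value.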

  9. Chinese Writing of Deaf or Hard-of-Hearing Students and Normal-Hearing Peers from Complex Network Approach.

    Science.gov (United States)

    Jin, Huiyuan; Liu, Haitao

    2016-01-01

    Deaf or hard-of-hearing individuals usually face a greater challenge in learning to write than their normal-hearing counterparts. Due to the limitations of traditional research methods focusing on microscopic linguistic features, a holistic characterization of the writing of these language users is lacking. This study attempts to fill this gap by adopting the methodology of linguistic complex networks. Two syntactic dependency networks are built in order to compare the macroscopic linguistic features of deaf or hard-of-hearing students with those of their normal-hearing peers. One is transformed from a treebank of writing produced by Chinese deaf or hard-of-hearing students, and the other from a treebank of writing produced by their Chinese normal-hearing counterparts. Two major findings are obtained through comparison of the statistical features of the two networks. On the one hand, both linguistic networks display small-world and scale-free structures, but the normal-hearing students' network exhibits a more power-law-like degree distribution, and relevant network measures show significant differences between the two networks. On the other hand, deaf or hard-of-hearing students tend to have a lower language proficiency level in both syntactic and lexical aspects. The rigid use of function words and the lower vocabulary richness of the deaf or hard-of-hearing students may partially account for the observed differences.

  10. [Origination of Pareto distribution in complex dynamic systems].

    Science.gov (United States)

    Chernavskiĭ, D S; Nikitin, A P; Chernavskaia, O D

    2008-01-01

    The Pareto distribution, whose probability density function can be approximated for sufficiently large x as ρ(x) ∝ x^(-α), where α ≥ 2, is of crucial importance from both the theoretical and practical point of view. The main reason is its qualitative distinction from the normal (Gaussian) distribution: the probability of high deviations is significantly greater. The conception of the universal applicability of the Gauss law remains widespread despite the lack of objective confirmation of this notion in a variety of application areas. The origin of the Pareto distribution in dynamic systems located in a Gaussian noise field is considered. A simple one-dimensional model is discussed in which the system response, in a rather wide interval of the variable, can be quite precisely approximated by this distribution.
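The qualitative distinction stressed above, a far heavier tail than the Gaussian, is easy to see numerically. A small sketch with illustrative parameters (α = 3 for the density exponent, so the survival function falls off as x^(-(α-1))):

```python
from statistics import NormalDist

def pareto_tail(x, alpha=3.0, xm=1.0):
    # P(X > x) for a Pareto law whose density falls off as x**(-alpha);
    # the survival function then falls off as x**(-(alpha - 1)).
    return (xm / x) ** (alpha - 1) if x > xm else 1.0

def normal_tail(x):
    # P(X > x) for a standard normal variable.
    return 1.0 - NormalDist().cdf(x)

for x in (2.0, 5.0, 10.0):
    print(x, pareto_tail(x), normal_tail(x))
# At x = 10 the Pareto tail is still 1%, while the standard normal tail
# (about 7.6e-24) is smaller by over twenty orders of magnitude: the
# "high deviations" contrast described in the abstract.
```

This gap is why modeling a genuinely Pareto-tailed quantity with a Gaussian drastically understates the probability of extreme events.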

  11. Radiation distribution sensing with normal optical fiber

    CERN Document Server

    Kawarabayashi, J; Naka, R; Uritani, A; Watanabe, K I; Iguchi, T; Tsujimura, N

    2002-01-01

    The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10 m-100 m) and can obtain continuous radiation distributions. The principle of position sensing is based on a time-of-flight technique. The response characteristics of this monitor to beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (⁹⁰Sr-⁹⁰Y), gamma rays (¹³⁷Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and D-T neutrons were 0.11%, 1.6×10⁻⁵% and 5.4×10⁻⁴%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new principle of position sensing based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point. This fact shows that t...
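Time-of-flight position sensing works by comparing light arrival times at the two fiber ends. A minimal sketch of the geometry; the refractive index and lengths below are illustrative assumptions, not the paper's values:

```python
# Time-of-flight position sensing along a fiber of length L (metres):
# a scintillation event at position z sends light toward both ends; the
# arrival-time difference encodes z.
# All numbers are illustrative, not from the paper.

C = 3.0e8           # speed of light in vacuum, m/s
N_CORE = 1.5        # refractive index of the plastic core (assumed)
V = C / N_CORE      # light speed in the fiber

def arrival_times(z, L):
    """Times for light to reach end A (at 0) and end B (at L)."""
    return z / V, (L - z) / V

def position_from_dt(dt, L):
    """Invert dt = t_A - t_B = (2z - L)/V  =>  z = (L + V*dt)/2."""
    return (L + V * dt) / 2.0

L = 100.0
tA, tB = arrival_times(30.0, L)
print(position_from_dt(tA - tB, L))   # recovers z = 30 m (up to rounding)
```

The achievable spatial resolution is then set by the timing resolution: with V = 2×10⁸ m/s, one nanosecond of timing uncertainty corresponds to roughly 10 cm of position uncertainty, consistent in order of magnitude with the resolutions quoted above.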

  12. The distal semimembranosus complex: normal MR anatomy, variants, biomechanics and pathology

    International Nuclear Information System (INIS)

    Beltran, Javier; Jbara, Marlena; Maimon, Ron; Matityahu, Amir; Hwang, Ki; Padron, Mario; Mota, Javier; Beltran, Luis; Sundaram, Murali

    2003-01-01

    To describe the normal MR anatomy and variations of the distal semimembranosus tendinous arms and the posterior oblique ligament as seen in the three orthogonal planes, to review the biomechanics of this complex and to illustrate pathologic examples. The distal semimembranosus tendon divides into five tendinous arms named the anterior, direct, capsular, inferior and the oblique popliteal ligament. These arms intertwine with the branches of the posterior oblique ligament in the posterior medial aspect of the knee, providing stability. This tendon-ligamentous complex also acts synergistically with the popliteus muscle and actively pulls the posterior horn of the medial meniscus during knee flexion. Pathologic conditions involving this complex include complete and partial tears, insertional tendinosis, avulsion fractures and bursitis. (orig.)

  13. The distal semimembranosus complex: normal MR anatomy, variants, biomechanics and pathology

    Energy Technology Data Exchange (ETDEWEB)

    Beltran, Javier; Jbara, Marlena; Maimon, Ron [Department of Radiology, Maimonides Medical Center, 4802 Tenth Avenue, NY 11219, Brooklyn (United States); Matityahu, Amir; Hwang, Ki [Department of Orthopedic Surgery, Maimonides Medical Center, Brooklyn, NY (United States); Padron, Mario [Department of Radiology, Clinica CEMTRO, Madrid (Spain); Mota, Javier [Department of Radiology, Instituto Clinica Corachan, Barcelona (Spain); Beltran, Luis [New York Medical College, Valhalla, NY (United States); Sundaram, Murali [Department of Radiology, Mayo Clinic, Rochester, MN (United States)

    2003-08-01

    To describe the normal MR anatomy and variations of the distal semimembranosus tendinous arms and the posterior oblique ligament as seen in the three orthogonal planes, to review the biomechanics of this complex and to illustrate pathologic examples. The distal semimembranosus tendon divides into five tendinous arms named the anterior, direct, capsular, inferior and the oblique popliteal ligament. These arms intertwine with the branches of the posterior oblique ligament in the posterior medial aspect of the knee, providing stability. This tendon-ligamentous complex also acts synergistically with the popliteus muscle and actively pulls the posterior horn of the medial meniscus during knee flexion. Pathologic conditions involving this complex include complete and partial tears, insertional tendinosis, avulsion fractures and bursitis. (orig.)

  14. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.

    2015-06-08

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.
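As a hedged illustration of why IS helps for rare-event CCDF estimation, the sketch below contrasts crude MC with a simple mean-shifting IS for the underlying Gaussians. This is not the paper's hazard-rate twisting technique; it only illustrates the change-of-measure and reweighting idea, and all parameter values are invented:

```python
import random, math

random.seed(7)

def ccdf_naive(gamma, mus, sigmas, n=50_000):
    """Crude MC estimate of P(sum_i exp(N(mu_i, sigma_i^2)) > gamma)."""
    hits = 0
    for _ in range(n):
        s = sum(math.exp(random.gauss(m, sg)) for m, sg in zip(mus, sigmas))
        hits += s > gamma
    return hits / n

def ccdf_is(gamma, mus, sigmas, shift, n=50_000):
    """Mean-shifted IS: sample N(mu_i + shift, sigma_i^2) and reweight
    by the likelihood ratio of the original vs the shifted Gaussians."""
    total = 0.0
    for _ in range(n):
        zs = [random.gauss(m + shift, sg) for m, sg in zip(mus, sigmas)]
        s = sum(math.exp(z) for z in zs)
        if s > gamma:
            # log of prod_i phi(z; mu, sg) / phi(z; mu+shift, sg)
            logw = sum((-(z - m) ** 2 + (z - m - shift) ** 2) / (2 * sg * sg)
                       for z, m, sg in zip(zs, mus, sigmas))
            total += math.exp(logw)
    return total / n

mus, sigmas = [0.0, 0.5], [1.0, 0.8]
p_naive = ccdf_naive(20.0, mus, sigmas)
p_is = ccdf_is(20.0, mus, sigmas, shift=1.0)
print(p_naive, p_is)
```

Shifting the sampling distribution toward the rare event makes threshold crossings frequent, while the likelihood-ratio weights keep the estimator unbiased; hazard-rate twisting achieves the same goal with a sampling distribution tailored to the heavy log-normal tail.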

  15. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.; Benkhelifa, Fatma; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.

  16. Spatial arrangement and size distribution of normal faults, Buckskin detachment upper plate, Western Arizona

    Science.gov (United States)

    Laubach, S. E.; Hundley, T. H.; Hooker, J. N.; Marrett, R. A.

    2018-03-01

    Fault arrays typically include a wide range of fault sizes, and those faults may be randomly located, clustered together, or regularly or periodically located in a rock volume. Here, we investigate size distribution and spatial arrangement of normal faults using rigorous size-scaling methods and normalized correlation count (NCC). Outcrop data from Miocene sedimentary rocks in the immediate upper plate of the regional Buckskin detachment, a low-angle normal fault, have differing patterns of spatial arrangement as a function of displacement (offset). Using lower size-thresholds of 1, 0.1, 0.01, and 0.001 m, displacements range over 5 orders of magnitude and have power-law frequency distributions spanning approximately four orders of magnitude from less than 0.001 m to more than 100 m, with exponents of -0.6 and -0.9. The largest faults, with >1 m displacement, have a shallower size-distribution slope and regular spacing of about 20 m. In contrast, smaller faults have steep size-distribution slopes and irregular spacing, with NCC plateau patterns indicating imposed clustering. Cluster widths are 15 m for the 0.1-m threshold, 14 m for the 0.01-m threshold, and 1 m for the 0.001-m displacement threshold faults. Results demonstrate that normalized correlation count effectively characterizes the spatial arrangement patterns of these faults. Our example from a high-strain fault pattern above a detachment is compatible with size and spatial organization that was influenced primarily by boundary conditions such as fault shape, mechanical unit thickness and internal stratigraphy on a range of scales rather than purely by interaction among faults during their propagation.
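The power-law frequency distributions described can be characterized by the slope of log cumulative frequency versus log displacement. A sketch of that fit with synthetic displacement data (the values are generated for illustration, not taken from the outcrop):

```python
import math

def powerlaw_exponent(displacements, thresholds):
    """Least-squares slope of log10(cumulative frequency) vs log10(size),
    where frequency = count of faults with displacement >= threshold."""
    xs, ys = [], []
    for t in thresholds:
        freq = sum(1 for d in displacements if d >= t)
        xs.append(math.log10(t))
        ys.append(math.log10(freq))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic displacements with a power-law cumulative distribution
# (inverse-transform sampling on a regular grid; illustrative only).
disp = [0.001 * (i / 1000.0) ** (-1.0 / 0.8) for i in range(1, 1001)]
slope = powerlaw_exponent(disp, [0.01, 0.1, 1.0])
print(slope)
```

The recovered slope is the (negative) power-law exponent; in the study, a change of slope across size thresholds is what separates the regularly spaced large faults from the clustered small ones.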

  17. Spring ligament complex: Illustrated normal anatomy and spectrum of pathologies on 3T MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Omar, Hythem [Musculoskeletal Radiology, UT Southwestern Medical Center, Dallas, TX (United States); Saini, Vikram [Center for Infection and Inflammation Imaging Research, Johns Hopkins University, Baltimore, MD (United States); Wadhwa, Vibhor [Department of Radiology, University of Arkansas for Medical Sciences, Little Rock, AR (United States); Liu, George [Orthopaedic Surgery, UT Southwestern Medical Center, Dallas, TX (United States); Chhabra, Avneesh, E-mail: avneesh.chhabra@utsouthwestern.edu [Musculoskeletal Radiology, UT Southwestern Medical Center, Dallas, TX (United States)

    2016-11-15

    Highlights: • The Spring ligament complex is an important stabilizer of medial arch of foot. • Of all SLC components, the integrity of Supero-Medial band is the most important. • Associated pathologies with SLC instability include PTT injury, pes planovalgus and sinus tarsi syndrome. • Conservative and operative management are viable depending on pes planovalgus progression. - Abstract: The spring (plantar calcaneonavicular) ligament complex connects the calcaneus and navicular bone of the foot and serves as the primary static stabilizer of the medial longitudinal arch of the foot. In this article, we describe the normal anatomy of the spring ligament complex, illustrate 3T magnetic resonance imaging appearances in its normal and abnormal states, and discuss the pathological associations with relevant case examples.

  18. Even-odd charged multiplicity distributions and energy dependence of normalized multiplicity moments in different rapidity windows

    International Nuclear Information System (INIS)

    Wu Yuanfang; Liu Lianshou

    1990-01-01

    The even and odd multiplicity distributions for hadron-hadron collisions in different rapidity windows are calculated, starting from a simple picture of charge correlation with non-zero correlation length. The coincidence and separation of these distributions are explained. The calculated window- and energy-dependence of the normalized moments reproduces the behaviour found in experiments. A new definition of normalized moments is proposed, especially suitable for narrow rapidity windows.
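The normalized moments in question are C_q = ⟨n^q⟩/⟨n⟩^q of the multiplicity distribution. A small sketch computing them from a hypothetical P(n):

```python
def normalized_moments(pn, qmax=4):
    """C_q = <n^q> / <n>^q for a multiplicity distribution P(n),
    given as a dict {n: probability}."""
    mean = sum(n * p for n, p in pn.items())
    return {q: sum(n ** q * p for n, p in pn.items()) / mean ** q
            for q in range(2, qmax + 1)}

# Toy multiplicity distribution (hypothetical probabilities).
pn = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.2, 4: 0.1, 5: 0.1}
print(normalized_moments(pn))

# A delta distribution has C_q = 1 for all q:
print(normalized_moments({3: 1.0}))  # → {2: 1.0, 3: 1.0, 4: 1.0}
```

C_q > 1 measures the broadening of the distribution relative to its mean, which is why the moments' dependence on window width and energy carries correlation information.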

  19. The skin immune system (SIS): distribution and immunophenotype of lymphocyte subpopulations in normal human skin

    NARCIS (Netherlands)

    Bos, J. D.; Zonneveld, I.; Das, P. K.; Krieg, S. R.; van der Loos, C. M.; Kapsenberg, M. L.

    1987-01-01

    The complexity of immune response-associated cells present in normal human skin was recently redefined as the skin immune system (SIS). In the present study, the exact immunophenotypes of lymphocyte subpopulations with their localizations in normal human skin were determined quantitatively. B cells

  20. Joining Distributed Complex Objects: Definition and Performance

    NARCIS (Netherlands)

    Teeuw, W.B.; Teeuw, Wouter B.; Blanken, Henk

    1992-01-01

    The performance of a non-standard distributed database system is strongly influenced by complex objects. The effective exploitation of parallelism in querying them and a suitable structure to store them are required in order to obtain acceptable response times in these database environments where

  1. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the
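For orientation, the bivariate normal distribution function itself can be evaluated numerically from Plackett's identity for its derivative in ρ. This brute-force quadrature is not the paper's approximation, which specifically targets large correlation coefficients:

```python
import math

def phi(x):
    """Standard normal CDF via erf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bvn_cdf(h, k, rho, steps=2000):
    """P(X<=h, Y<=k) for a standard bivariate normal with correlation rho,
    via Phi2 = Phi(h)Phi(k) + (1/2pi) * int_0^rho f(r) dr (Plackett)."""
    def f(r):
        return math.exp(-(h * h - 2 * r * h * k + k * k) /
                        (2 * (1 - r * r))) / math.sqrt(1 - r * r)
    # composite trapezoid rule on [0, rho]
    total, dr = 0.0, rho / steps
    for i in range(steps):
        total += 0.5 * (f(i * dr) + f((i + 1) * dr)) * dr
    return phi(h) * phi(k) + total / (2 * math.pi)

# Closed form at h = k = 0:  1/4 + arcsin(rho)/(2*pi)
print(bvn_cdf(0.0, 0.0, 0.9), 0.25 + math.asin(0.9) / (2 * math.pi))
```

The integrand becomes sharply peaked as ρ → 1, which is exactly the regime where a dedicated approximation such as the paper's pays off over naive quadrature.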

  2. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve P. Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations. To develop these confidence bounds and test, we first establish that estimators based on Newton steps from n-...
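A hedged sketch of the quantity involved: the sample coefficient of variation with a crude large-sample interval. This normal approximation is not the paper's exact bounds, and the data are made up:

```python
import math

def cv_confint(xs, z=1.96):
    """Sample coefficient of variation with an approximate large-sample
    95% confidence interval (normal approximation; illustrative only)."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    cv = sd / mean
    # standard error of the CV under approximate normality
    se = cv * math.sqrt(1.0 / (2.0 * (n - 1)) + cv * cv / n)
    return cv, (cv - z * se, cv + z * se)

data = [9.8, 10.1, 10.4, 9.6, 10.0, 10.3, 9.9, 10.2]   # hypothetical sample
cv, (lo, hi) = cv_confint(data)
print(cv, lo, hi)
```

Exact bounds of the kind developed in the paper matter precisely because this normal approximation degrades for small n or large CV.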

  3. Normal cranial bone marrow MR imaging pattern with age-related ADC value distribution

    International Nuclear Information System (INIS)

    Li Qi; Pan Shinong; Yin Yuming; Li Wei; Chen Zhian; Liu Yunhui; Wu Zhenhua; Guo Qiyong

    2011-01-01

    Objective: To determine the MRI appearances of normal age-related cranial bone marrow and the relationship between MRI patterns and apparent diffusion coefficient (ADC) values. Methods: Five hundred subjects were divided into seven groups based on age. Cranial bone marrow MRI patterns were classified based on different thicknesses of the diploe and signal intensity distribution characteristics. ADC values of the frontal, parietal, occipital and temporal bones on DWI were measured and calculated. Correlations between age and ADC values, between patterns and ADC values, as well as the distribution of ADC values were analyzed. Results: Normal cranial bone marrow was divided into four types and six subtypes, Types I, II, III and IV, which had positive correlation with increasing age (χ² = 266.36, P < 0.05). In addition, there was significant negative correlation between the ADC values and MRI patterns in the normal parietal and occipital bones (r = -0.691 and -0.750, P < 0.01). Conclusion: The combination of MRI features and ADC value changes in different cranial bones showed significant correlation with increasing age. Familiarity with the MRI appearance of the normal bone marrow conversion pattern in different age groups and their ADC values will aid the diagnosis and differential diagnosis of cranial bone pathology.

  4. Breast cancer subtype distribution is different in normal weight, overweight, and obese women.

    Science.gov (United States)

    Gershuni, Victoria; Li, Yun R; Williams, Austin D; So, Alycia; Steel, Laura; Carrigan, Elena; Tchou, Julia

    2017-06-01

    Obesity is associated with tumor promoting pathways related to insulin resistance and chronic low-grade inflammation which have been linked to various disease states, including cancer. Many studies have focused on the relationship between obesity and increased estrogen production, which contributes to the pathogenesis of estrogen receptor-positive breast cancers. The link between obesity and other breast cancer subtypes, such as triple-negative breast cancer (TNBC) and Her2/neu+ (Her2+) breast cancer, is less clear. We hypothesize that obesity may be associated with the pathogenesis of specific breast cancer subtypes, resulting in a different subtype distribution than in normal weight women. A single-institution, retrospective analysis of tumor characteristics of 848 patients diagnosed with primary operable breast cancer between 2000 and 2013 was performed to evaluate the association between BMI and clinical outcome. Patients were grouped by BMI at the time of diagnosis into three subgroups: normal weight (BMI = 18-24.9), overweight (BMI = 25-29.9), and obese (BMI > 30). The distribution of breast cancer subtypes across the three BMI subgroups was compared. Obese and overweight women were more likely to present with TNBC and normal weight women with Her2+ breast cancer (p = 0.008). We demonstrated, for the first time, that breast cancer subtype distribution varied significantly according to BMI status. Our results suggested that obesity might activate molecular pathways other than the well-known obesity/estrogen circuit in the pathogenesis of breast cancer. Future studies are needed to understand the molecular mechanisms that drive the variation in subtype distribution across BMI subgroups.
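The subtype-by-BMI comparison (p = 0.008) is the kind of result a chi-square test of independence on a contingency table produces. A sketch of the statistic, computed on a hypothetical table (the counts are invented, not the study's data):

```python
def chi_square_independence(table):
    """Pearson chi-square statistic and degrees of freedom for an
    r x c contingency table (rows: BMI groups, columns: subtypes here)."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            exp = rows[i] * cols[j] / total   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof

# Hypothetical counts: rows = normal / overweight / obese,
# columns = ER+ / Her2+ / TNBC (NOT the study's data).
table = [[120, 40, 20], [100, 30, 35], [90, 25, 45]]
stat, dof = chi_square_independence(table)
print(stat, dof)
```

The statistic is then referred to a chi-square distribution with (r−1)(c−1) degrees of freedom to obtain the p-value.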

  5. Chinese Writing of Deaf or Hard-of-hearing Students and Normal-hearing Peers from Complex Network Approach

    Directory of Open Access Journals (Sweden)

    Huiyuan Jin

    2016-11-01

    Deaf or hard-of-hearing individuals usually face a greater challenge to learn to write than their normal-hearing counterparts, because sign language is the primary communicative skill for many deaf people. The current body of research only covers the detailed linguistic features of deaf or hard-of-hearing students. Due to the limitations of traditional research methods focusing on microscopic linguistic features, a holistic characterization of the writing linguistic features of these language users is lacking. This study attempts to fill this gap by adopting the methodology of linguistic complex networks. Two syntactic dependency networks are built in order to compare the macroscopic linguistic features of deaf or hard-of-hearing students and those of their normal-hearing peers. One is transformed from a treebank of writing produced by Chinese deaf or hard-of-hearing students, and the other from a treebank of writing produced by their Chinese normal-hearing counterparts. Two major findings are obtained through comparison of the statistical features of the two networks. On the one hand, both linguistic networks display small-world and scale-free network structures, but the normal-hearing students' network exhibits a more power-law-like degree distribution. Relevant network measures show significant differences between the two linguistic networks. On the other hand, deaf or hard-of-hearing students tend to have a lower language proficiency level in both syntactic and lexical aspects. The rigid use of function words and a lower vocabulary richness of the deaf or hard-of-hearing students may partially account for the observed differences.

  6. Distribution of adenosine deaminase complexing protein (ADCP) in human tissues.

    Science.gov (United States)

    Dinjens, W N; ten Kate, J; van der Linden, E P; Wijnen, J T; Khan, P M; Bosman, F T

    1989-12-01

    The normal distribution of adenosine deaminase complexing protein (ADCP) in the human body was investigated quantitatively by ADCP-specific radioimmunoassay (RIA) and qualitatively by immunohistochemistry. In these studies we used a specific rabbit anti-human ADCP antiserum. In all 19 investigated tissues, except erythrocytes, ADCP was found by RIA in the soluble and membrane fractions. From all tissues the membrane fractions contained more ADCP (expressed per mg protein) than the soluble fractions. High membrane ADCP concentrations were found in skin, renal cortex, gastrointestinal tract, and prostate. Immunoperoxidase staining confirmed the predominant membrane-associated localization of the protein. In serous sweat glands, convoluted tubules of renal cortex, bile canaliculi, gastrointestinal tract, lung, pancreas, prostate gland, salivary gland, gallbladder, mammary gland, and uterus, ADCP immunoreactivity was found confined to the luminal membranes of the epithelial cells. These data demonstrate that ADCP is present predominantly in exocrine glands and absorptive epithelia. The localization of ADCP at the secretory or absorptive apex of the cells suggests that the function of ADCP is related to the secretory and/or absorptive process.

  7. Carbon K-shell photoionization of CO: Molecular frame angular distributions of normal and conjugate shakeup satellites

    International Nuclear Information System (INIS)

    Jahnke, T.; Titze, J.; Foucar, L.; Wallauer, R.; Osipov, T.; Benis, E.P.; Jagutzki, O.; Arnold, W.; Czasch, A.; Staudte, A.; Schoeffler, M.; Alnaser, A.; Weber, T.; Prior, M.H.; Schmidt-Boecking, H.; Doerner, R.

    2011-01-01

    We have measured the molecular frame angular distributions of photoelectrons emitted from the Carbon K-shell of fixed-in-space CO molecules for the case of simultaneous excitation of the remaining molecular ion. Normal and conjugate shakeup states are observed. Photoelectrons belonging to normal Σ-satellite lines show an angular distribution resembling that observed for the main photoline at the same electron energy. Surprisingly a similar shape is found for conjugate shakeup states with Π-symmetry. In our data we identify shake rather than electron scattering (PEVE) as the mechanism producing the conjugate lines. The angular distributions clearly show the presence of a Σ shape resonance for all of the satellite lines.

  8. Distribution of the anticancer drugs doxorubicin, mitoxantrone and topotecan in tumors and normal tissues.

    Science.gov (United States)

    Patel, Krupa J; Trédan, Olivier; Tannock, Ian F

    2013-07-01

    Pharmacokinetic analyses estimate the mean concentration of drug within a given tissue as a function of time, but do not give information about the spatial distribution of drugs within that tissue. Here, we compare the time-dependent spatial distribution of three anticancer drugs within tumors, heart, kidney, liver and brain. Mice bearing various xenografts were treated with doxorubicin, mitoxantrone or topotecan. At various times after injection, tumors and samples of heart, kidney, liver and brain were excised. Within solid tumors, the distribution of doxorubicin, mitoxantrone and topotecan was limited to perivascular regions at 10 min after administration and the distance from blood vessels at which drug intensity fell to half was ~25-75 μm. Although drug distribution improved after 3 and 24 h, there remained a significant decrease in drug fluorescence with increasing distance from tumor blood vessels. Drug distribution was relatively uniform in the heart, kidney and liver with substantially greater perivascular drug uptake than in tumors. There was significantly higher total drug fluorescence in the liver than in tumors after 10 min, 3 and 24 h. Little to no drug fluorescence was observed in the brain. There are marked differences in the spatial distributions of three anticancer drugs within tumor tissue and normal tissues over time, with greater exposure to most normal tissues and limited drug distribution to many cells in tumors. Studies of the spatial distribution of drugs are required to complement pharmacokinetic data in order to better understand and predict drug effects and toxicities.

  9. Complexity of Resilient Power Distribution Networks

    International Nuclear Information System (INIS)

    May, Michael

    2008-01-01

    Power systems in general, and specifically the problems of communication, control, and coordination in human supervisory control of electric power transmission and distribution networks, constitute a good case study for resilience engineering. Because of the high cost and high impact on society of transmission disturbances and blackouts, and the vulnerability of power networks to terrorist attacks, Transmission System Operators (TSOs) are already focusing on organizational structures, procedures, and technical innovations that could improve the flexibility and robustness of power systems and achieve the overall goal of providing secure power supply. For a number of reasons, however, the complexity of power systems is increasing, and new problems arise for human supervisory control and the ability of these systems to implement fast recovery from disturbances. Around the world, power systems are currently being restructured to adapt to regional electricity markets and secure the availability, resilience and sustainability of electric power generation, transmission and distribution. This demands a reconsideration of the available decision support, the activity of human supervisory control of the highly automated processes involved and the procedures regulating it, as well as the role of the TSOs and the regional, national and international organizations set up to manage their activity. Unfortunately, we can expect that human supervisory control of power systems will become more complex in the near future for a number of reasons. The Union for the Co-ordination of Transmission of Electricity (UCTE) has remarked that although the interconnected system of power transmission networks has been developed over the years with the main goal of providing secure power supply through common use of reserve capacities and the optimization of the use of energy resources, today's market dynamics imposing a high level of cross-border exchanges is 'out of the scope of the

  10. Modeling complexity in engineered infrastructure system: Water distribution network as an example

    Science.gov (United States)

    Zeng, Fang; Li, Xiang; Li, Ke

    2017-02-01

    The complex topology and adaptive behavior of infrastructure systems are driven by both self-organization of the demand and rigid engineering solutions. Therefore, engineering complex systems requires a method balancing holism and reductionism. To model the growth of water distribution networks, a complex network model was developed following the combination of local optimization rules and engineering considerations. The demand node generation is dynamic and follows the scaling law of urban growth. The proposed model can generate a water distribution network (WDN) similar to reported real-world WDNs on some structural properties. Comparison with different modeling approaches indicates that a realistic demand node distribution and co-evolution of demand nodes and network are important for the simulation of real complex networks. The simulation results indicate that the efficiency of water distribution networks is exponentially affected by the urban growth pattern. On the contrary, the improvement of efficiency by engineering optimization is limited and relatively insignificant. The redundancy and robustness, on another aspect, can be significantly improved through engineering methods.
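A toy version of such demand-driven spatial growth can be sketched as below. This is only an illustration of the nearest-neighbor-attachment-plus-redundant-loops idea, not the authors' model; all parameters are invented:

```python
import math
import random

random.seed(3)

def grow_network(n_nodes, loop_prob=0.1):
    """Toy spatial growth: each new demand node connects to its
    geometrically nearest existing node; with probability loop_prob it
    also adds a second, redundant link (a loop, for robustness)."""
    pos = [(0.5, 0.5)]          # source node at the center
    edges = []
    for i in range(1, n_nodes):
        p = (random.random(), random.random())
        nearest = min(range(len(pos)),
                      key=lambda j: math.dist(p, pos[j]))
        pos.append(p)
        edges.append((i, nearest))
        if random.random() < loop_prob and len(pos) > 2:
            second = random.randrange(len(pos) - 1)
            if second != nearest:
                edges.append((i, second))
    return pos, edges

pos, edges = grow_network(200)
print(len(pos), len(edges))
```

Nearest-neighbor attachment keeps pipe lengths locally minimal (the engineering rule), while the occasional loop trades cost for redundancy, mirroring the holism-reductionism balance the abstract describes.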

  11. Spatially Distributed Social Complex Networks

    Directory of Open Access Journals (Sweden)

    Gerald F. Frasco

    2014-01-01

    We propose a bare-bones stochastic model that takes into account both the geographical distribution of people within a country and their complex network of connections. The model, which is designed to give rise to a scale-free network of social connections and to visually resemble the geographical spread seen in satellite pictures of the Earth at night, gives rise to a power-law distribution for the ranking of cities by population size (but for the largest cities) and reflects the notion that highly connected individuals tend to live in highly populated areas. It also yields some interesting insights regarding Gibrat's law for the rates of city growth (by population size), in partial support of the findings in a recent analysis of real data [Rozenfeld et al., Proc. Natl. Acad. Sci. U.S.A. 105, 18702 (2008)]. The model produces a nontrivial relation between city population and city population density and a superlinear relationship between social connectivity and city population, both of which seem quite in line with real data.

  12. Spatially Distributed Social Complex Networks

    Science.gov (United States)

    Frasco, Gerald F.; Sun, Jie; Rozenfeld, Hernán D.; ben-Avraham, Daniel

    2014-01-01

    We propose a bare-bones stochastic model that takes into account both the geographical distribution of people within a country and their complex network of connections. The model, which is designed to give rise to a scale-free network of social connections and to visually resemble the geographical spread seen in satellite pictures of the Earth at night, gives rise to a power-law distribution for the ranking of cities by population size (but for the largest cities) and reflects the notion that highly connected individuals tend to live in highly populated areas. It also yields some interesting insights regarding Gibrat's law for the rates of city growth (by population size), in partial support of the findings in a recent analysis of real data [Rozenfeld et al., Proc. Natl. Acad. Sci. U.S.A. 105, 18702 (2008).]. The model produces a nontrivial relation between city population and city population density and a superlinear relationship between social connectivity and city population, both of which seem quite in line with real data.

  13. Neutron importance and the generalized Green function for the conventionally critical reactor with normalized neutron distribution

    International Nuclear Information System (INIS)

    Khromov, V.V.

    1978-01-01

    The notion of neutron importance as applied to nuclear reactor statics problems described by time-independent homogeneous equations of neutron transport, with provision for normalization of the neutron distribution, is considered. An equation has been obtained for the neutron importance function in a conditionally critical reactor with respect to an arbitrary nonlinear functional determined for the normalized neutron distribution. The relation between this function and the generalized Green function of the self-conjugate operator of the reactor equation is determined, and the formula of small perturbations for the functionals of a conditionally critical reactor is deduced.

  14. Climatological characteristics of raindrop size distributions within a topographically complex area

    Science.gov (United States)

    Suh, S.-H.; You, C.-H.; Lee, D.-I.

    2015-04-01

    Raindrop size distribution (DSD) characteristics within the complex area of Busan, Korea (35.12° N, 129.10° E) were studied using a Precipitation Occurrence Sensor System (POSS) disdrometer over a four-year period from 24 February 2001 to 24 December 2004. Average DSD parameters in Busan, a mid-latitude site, were compared with corresponding parameters recorded in the high-latitude site of Järvenpää, Finland. Mean values of median drop diameter (D0) and the shape parameter (μ) in Busan are smaller than those in Järvenpää, whereas the mean normalized intercept parameter (Nw) and rainfall rate (R) are higher in Busan. To analyze the climatological DSD characteristics in more detail, the entire period of recorded rainfall was divided into 10 categories with different temporal and spatial scales. When only convective rainfall was considered, mean Dm and Nw values for all these categories converged around a maritime cluster, except for rainfall associated with typhoons. The convective rainfall of a typhoon showed much smaller Dm and larger Nw compared with the other rainfall categories. In terms of diurnal DSD variability, we observe maritime (continental) precipitation during the daytime (DT) (nighttime, NT), which likely results from sea (land) breeze identified through wind direction analysis. These features also appeared in the seasonal diurnal distribution. The DT and NT Probability Density Function (PDF) during the summer was similar to the PDF of the entire study period. However, the DT and NT PDF during the winter season displayed an inverse distribution due to seasonal differences in wind direction.
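The parameters discussed (Nw, D0, μ) define the standard normalized gamma raindrop size distribution. A sketch evaluating it with illustrative values (these are not the Busan climatology values):

```python
import math

def normalized_gamma_dsd(D, Nw, D0, mu):
    """Normalized gamma DSD N(D), with Nw the normalized intercept,
    D0 the median volume diameter and mu the shape parameter:
    N(D) = Nw * f(mu) * (D/D0)^mu * exp(-(3.67+mu) * D/D0)."""
    f_mu = (6.0 / 3.67 ** 4) * (3.67 + mu) ** (mu + 4) / math.gamma(mu + 4)
    return Nw * f_mu * (D / D0) ** mu * math.exp(-(3.67 + mu) * D / D0)

# Illustrative parameters (hypothetical, not from the study).
Nw, D0, mu = 8000.0, 1.2, 3.0
ds = [0.1 * i for i in range(1, 61)]                    # 0.1 .. 6.0 mm
nd = [normalized_gamma_dsd(d, Nw, D0, mu) for d in ds]
# liquid water content is proportional to the 3rd moment of N(D)
lwc_prop = sum(n * d ** 3 * 0.1 for n, d in zip(nd, ds))
print(max(nd), lwc_prop)
```

Because Nw factors out the liquid water content, maritime rainfall (many small drops: large Nw, small D0) and continental rainfall (fewer, larger drops) separate cleanly in the (D0, Nw) plane, which is the basis of the clustering analysis in the abstract.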

  15. Elastin distribution in the normal uterus, uterine leiomyomas, adenomyosis and adenomyomas: a comparison.

    Science.gov (United States)

    Zheng, Wei-Qiang; Ma, Rong; Zheng, Jian-Ming; Gong, Zhi-Jing

    2006-04-01

    To describe the histologic distribution of elastin in the nonpregnant human uterus, uterine leiomyomas, adenomyosis and adenomyomas. Uteri were obtained from women undergoing hysterectomy for benign conditions, including 26 cases of uterine leiomyomas, 24 cases of adenomyosis, 18 adenomyomas and 6 cases of autopsy specimens. Specific histochemical staining techniques were employed in order to demonstrate the distribution of elastin. The distribution of elastin components in the uterus was markedly uneven and showed a decreasing gradient from outer to inner myometrium. No elastin was present within leiomyomas, adenomyomas or adenomyosis. The distribution of elastin may help explain the normal function of the myometrium in labor. It implies that the uneven distribution of elastin components and absence of elastin within leiomyomas, adenomyomas and adenomyosis could be of some clinical significance. The altered elastin distribution in disease states may help explain such symptoms as dysmenorrhea in uterine endometriosis.

  16. The normal distribution of thoracoabdominal aorta small branch artery ostia

    International Nuclear Information System (INIS)

    Cronin, Paul; Williams, David M.; Vellody, Ranjith; Kelly, Aine Marie; Kazerooni, Ella A.; Carlos, Ruth C.

    2011-01-01

The purpose of this study was to determine the normal distribution of aortic branch artery ostia. CT scans of 100 subjects were retrospectively reviewed. The angular distributions of the aorta with respect to the center of the T3 to L4 vertebral bodies, and of branch artery origins with respect to the center of the aorta, were measured. At each vertebral body level the distributions of intercostal/lumbar arteries and other branch arteries were calculated. The proximal descending aorta is posteriorly placed, becoming a midline structure at the thoracolumbar junction, and remains anterior to the vertebral bodies within the abdomen. The intercostal and lumbar artery ostia have a distinct distribution. At each vertebral level from T3 caudally, one intercostal artery originates from the posterior wall of the aorta throughout the thoracic aorta, while the other intercostal artery originates from the medial wall of the descending thoracic aorta high in the chest, posteromedially from the mid-thoracic aorta, and from the posterior wall of the aorta low in the chest. Mediastinal branches of the thoracic aorta originate from the medial and anterior walls. Lumbar branches originate only from the posterior wall of the abdominal aorta. Aortic branch artery origins arise with a bimodal distribution and have a characteristic location. Knowing the location of aortic branch artery ostia may help distinguish branch artery pseudoaneurysms from penetrating ulcers.

  17. Bifurcations of the normal modes of the Ne···Br₂ complex

    Energy Technology Data Exchange (ETDEWEB)

    Blesa, Fernando [Departamento de Fisica Aplicada, Universidad de Zaragoza, Zaragoza (Spain); Mahecha, Jorge [Instituto de Fisica, Universidad de Antioquia, Medellin (Colombia); Salas, J. Pablo [Area de Fisica Aplicada, Universidad de La Rioja, Logrono (Spain); Inarrea, Manuel, E-mail: manuel.inarrea@unirioja.e [Area de Fisica Aplicada, Universidad de La Rioja, Logrono (Spain)

    2009-12-28

    We study the classical dynamics of the rare gas-dihalogen Ne···Br₂ complex in its ground electronic state. By considering the dihalogen bond frozen at its equilibrium distance, the system has two degrees of freedom, and its potential energy surface presents linear and T-shaped isomers. We find the nonlinear normal modes of both isomers that determine the phase space structure of the system. By means of surfaces of section and numerical continuation of families of periodic orbits, we detect and identify the bifurcations undergone by the normal modes as a function of the system energy. Finally, using the Orthogonal Fast Lyapunov Indicator (OFLI), we study the evolution of the fraction of the phase space volume occupied by regular motions.

  18. Probability distribution of atmospheric pollutants: comparison among four methods for the determination of the log-normal distribution parameters

    Energy Technology Data Exchange (ETDEWEB)

    Bellasio, R [Enviroware s.r.l., Agrate Brianza, Milan (Italy). Centro Direzionale Colleoni; Lanzani, G; Ripamonti, M; Valore, M [Amministrazione Provinciale, Como (Italy)

    1998-04-01

    This work illustrates the possibility of interpolating the concentrations of CO, NO, NO₂, O₃ and SO₂ measured during one year (1995) at the 13 stations of the air quality monitoring network of the Provinces of Como and Lecco (Italy) by means of a log-normal distribution. Particular attention was given to choosing the method for determining the log-normal distribution parameters among four possible methods: I natural, II percentiles, III moments, IV maximum likelihood. In order to evaluate the goodness of fit, a ranking procedure was carried out over the values of four indices: absolute deviation, weighted absolute deviation, the Kolmogorov-Smirnov index and the Cramer-von Mises-Smirnov index. The capability of the log-normal distribution to fit the measured data is then discussed as a function of the pollutant and of the monitoring station. Finally an example of application is given: the effect of an emission reduction strategy in the Lombardy Region (the so-called 'bollino blu') is evaluated using a log-normal distribution.
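Two of the four estimation methods compared in this record can be sketched with standard-library Python; the data below are a hypothetical log-normal sample, not the Como/Lecco measurements, and the sketch only contrasts the method of moments with maximum likelihood.

```python
import math
import random

def lognormal_params_mle(xs):
    # Maximum likelihood: mu and sigma are the mean and (population)
    # standard deviation of the log-transformed data.
    logs = [math.log(x) for x in xs]
    mu = sum(logs) / len(logs)
    var = sum((l - mu) ** 2 for l in logs) / len(logs)
    return mu, math.sqrt(var)

def lognormal_params_moments(xs):
    # Method of moments: match the sample mean m and variance v of the raw
    # data, using m = exp(mu + sigma^2/2) and v = (exp(sigma^2) - 1) * m^2.
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    sigma2 = math.log(1.0 + v / m ** 2)
    mu = math.log(m) - sigma2 / 2.0
    return mu, math.sqrt(sigma2)

random.seed(42)
sample = [random.lognormvariate(1.0, 0.5) for _ in range(20000)]
print(lognormal_params_mle(sample))      # both should be close to (1.0, 0.5)
print(lognormal_params_moments(sample))
```

On clean log-normal data the two estimators agree closely; they diverge on contaminated or censored data, which is why the paper ranks them with goodness-of-fit indices.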

  19. Complexation behavior of oppositely charged polyelectrolytes: Effect of charge distribution

    International Nuclear Information System (INIS)

    Zhao, Mingtian; Li, Baohui; Zhou, Jihan; Su, Cuicui; Niu, Lin; Liang, Dehai

    2015-01-01

    Complexation behavior of oppositely charged polyelectrolytes in a solution is investigated using a combination of computer simulations and experiments, focusing on the influence of polyelectrolyte charge distributions along the chains on the structure of the polyelectrolyte complexes. The simulations are performed using Monte Carlo with the replica-exchange algorithm for three model systems where each system is composed of a mixture of two types of oppositely charged model polyelectrolyte chains, (EGEG)₅/(KGKG)₅, (EEGG)₅/(KKGG)₅, and (EEGG)₅/(KGKG)₅, in a solution including explicit solvent molecules. Among the three model systems, only the charge distributions along the chains differ. Thermodynamic quantities are calculated as a function of temperature (or ionic strength), and the microscopic structures of complexes are examined. It is found that the three systems have different transition temperatures, and form complexes with different sizes, structures, and densities at a given temperature. Complex microscopic structures with an alternating arrangement of one monolayer of E/K monomers and one monolayer of G monomers, with one bilayer of E and K monomers and one bilayer of G monomers, and with a mixture of monolayer and bilayer of E/K monomers in a box shape and a trilayer of G monomers inside the box are obtained for the three mixture systems, respectively. The experiments are carried out for three systems where each is composed of a mixture of two types of oppositely charged peptide chains. Each peptide chain is composed of lysine (K) and glycine (G) or glutamate (E) and G, in solution; the chain length and amino acid sequences, and hence the charge distribution, are precisely controlled, and all of them are identical with those of the corresponding model chains. The complexation behavior and complex structures are characterized through laser light scattering and atomic force microscopy measurements. The order of the apparent weight

  20. A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants

    KAUST Repository

    Liang, Faming; Jin, Ick-Hoon

    2013-01-01

    Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or is very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals. © 2013 Massachusetts Institute of Technology.
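The core MCMH move can be sketched on a toy problem; everything below is an illustrative assumption (a one-dimensional exponential-family model on a small finite support, with exact auxiliary draws where a real application would run an inner sampler), not the paper's autologistic or graph models.

```python
import math
import random

random.seed(0)
K = 10       # toy model support {0, ..., K}
y = 7        # one observed data point

def g(x, th):
    # Unnormalized density g(x | theta) = exp(theta * x); its normalizing
    # constant Z(theta) is treated as unknown in the MH acceptance ratio.
    return math.exp(th * x)

def sample_model(th, n):
    # Exact draws from f(. | theta) by enumeration; in realistic MCMH
    # applications this step is itself approximate (e.g. a short inner chain).
    ws = [g(x, th) for x in range(K + 1)]
    return random.choices(range(K + 1), weights=ws, k=n)

th, chain = 0.0, []
for _ in range(5000):
    th_new = th + random.gauss(0.0, 0.5)
    aux = sample_model(th, 30)
    # Monte Carlo estimate of Z(th_new)/Z(th) = E_th[ g(x|th_new)/g(x|th) ]
    z_ratio = sum(g(x, th_new) / g(x, th) for x in aux) / len(aux)
    # MH acceptance with the unknown constant ratio replaced by its estimate
    alpha = (g(y, th_new) / g(y, th)) / z_ratio
    if random.random() < alpha:
        th = th_new
    chain.append(th)

post_mean = sum(chain[1000:]) / len(chain[1000:])
print(round(post_mean, 2))
```

The chain targets the posterior over theta under a flat prior; the key point is that only unnormalized evaluations of g and samples from the current model are ever needed.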

  2. Externally studentized normal midrange distribution

    Directory of Open Access Journals (Sweden)

    Ben Dêivide de Oliveira Batista

    The distribution of the externally studentized midrange was created based on the original studentization procedures of Student and was inspired by the distribution of the externally studentized range. The wide use of the externally studentized range in multiple comparisons was also a motivation for developing this new distribution. This work aimed to derive analytic equations for the distribution of the externally studentized midrange, obtaining the cumulative distribution, probability density and quantile functions and generating random values. This is a new distribution for which the authors could find no report in the literature. A second objective was to build an R package for obtaining numerically the probability density, cumulative distribution and quantile functions and to make it available to the scientific community. The algorithms were proposed and implemented using Gauss-Legendre quadrature and the Newton-Raphson method in the R software, resulting in the SMR package, available for download from the CRAN site. The implemented routines showed high accuracy, verified using Monte Carlo simulations and by comparing results with different numbers of quadrature points. Regarding the precision of the quantiles for cases where the degrees of freedom are close to 1 and the percentiles are close to 100%, it is recommended to use more than 64 quadrature points.
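The numeric machinery named here, Gauss-Legendre quadrature for the CDF plus Newton-Raphson for the quantile, can be sketched on a stand-in density; the code below uses the standard normal rather than the SMR package's midrange distribution, purely to show the two-step scheme.

```python
import math

# 8-point Gauss-Legendre nodes and weights on [-1, 1] (standard table values)
NODES = [0.1834346424956498, 0.5255324099163290,
         0.7966664774136267, 0.9602898564975363]
WEIGHTS = [0.3626837833783620, 0.3137066458778873,
           0.2223810344533745, 0.1012285362903763]

def phi(t):
    # Standard normal density, the stand-in for the midrange density
    return math.exp(-0.5 * t * t) / math.sqrt(2.0 * math.pi)

def cdf(x):
    # F(x) = 1/2 + integral_0^x phi(t) dt, by one Gauss-Legendre panel on [0, x]
    half = x / 2.0
    mid = x / 2.0
    s = 0.0
    for n, w in zip(NODES, WEIGHTS):
        s += w * (phi(mid + half * n) + phi(mid - half * n))
    return 0.5 + half * s

def quantile(p, tol=1e-10):
    # Newton-Raphson on F(q) = p; the density is the exact derivative of F
    q = 0.0
    for _ in range(50):
        step = (cdf(q) - p) / phi(q)
        q -= step
        if abs(step) < tol:
            break
    return q

print(round(quantile(0.975), 4))  # → 1.96
```

The same pattern applies to any distribution whose density is cheap to evaluate: quadrature gives the CDF, and Newton iterations invert it with the density as the exact derivative.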

  3. Generalization of the normal-exponential model: exploration of a more accurate parametrisation for the signal distribution on Illumina BeadArrays.

    Science.gov (United States)

    Plancade, Sandra; Rozenholc, Yves; Lund, Eiliv

    2012-12-11

    Illumina BeadArray technology includes non-specific negative control features that allow a precise estimation of the background noise. As an alternative to the background subtraction proposed in BeadStudio, which leads to an important loss of information by generating negative values, a background correction method modeling the observed intensities as the sum of an exponentially distributed signal and normally distributed noise has been developed. Nevertheless, Wang and Ye (2012) present a kernel-based estimator of the signal distribution on Illumina BeadArrays and suggest that a gamma distribution would better model the signal density. Hence, the normal-exponential modeling may not be appropriate for Illumina data, and background corrections derived from this model may lead to incorrect estimates. We propose a more flexible modeling based on a gamma distributed signal and a normally distributed background noise and develop the associated background correction, implemented in the R package NormalGamma. Our model proves to be markedly more accurate for Illumina BeadArrays: on the one hand, it is shown on two types of Illumina BeadChips that this model offers a better fit of the observed intensities. On the other hand, the comparison of the operating characteristics of several background correction procedures on spike-in and on normal-gamma simulated data shows high similarities, reinforcing the validation of the normal-gamma modeling. The performance of the background corrections based on the normal-gamma and normal-exponential models is compared on two dilution data sets, through testing procedures which represent various experimental designs. Surprisingly, we observe that the implementation of a more accurate parametrisation in the model-based background correction does not increase the sensitivity. These results may be explained by the operating characteristics of the estimators: the normal-gamma background correction offers an improvement

  4. The effects of model and data complexity on predictions from species distributions models

    DEFF Research Database (Denmark)

    García-Callejas, David; Bastos, Miguel

    2016-01-01

    How complex a model needs to be to provide useful predictions is a matter of continuous debate across the environmental sciences. In the species distribution modelling literature, studies have demonstrated that more complex models tend to provide better fits. However, studies have also shown that predictive performance does not always increase with complexity. Testing of species distribution models is challenging because independent data for testing are often lacking, but a more general problem is that model complexity has never been formally described in such studies. Here, we systematically...

  5. Hazard tolerance of spatially distributed complex networks

    International Nuclear Information System (INIS)

    Dunn, Sarah; Wilkinson, Sean

    2017-01-01

    In this paper, we present a new methodology for quantifying the reliability of complex systems, using techniques from network graph theory. In recent years, network theory has been applied to many areas of research and has allowed us to gain insight into the behaviour of real systems that would otherwise be difficult or impossible to analyse, for example increasingly complex infrastructure systems. Although this work has made great advances in understanding complex systems, the vast majority of these studies consider only a system's topological reliability and largely ignore its spatial component. It has been shown that the omission of this spatial component can have potentially devastating consequences. In this paper, we propose a number of algorithms for generating a range of synthetic spatial networks with different topological and spatial characteristics and identify real-world networks that share the same characteristics. We assess the influence of nodal location and the spatial distribution of highly connected nodes on hazard tolerance by comparing our generic networks to benchmark networks. We discuss the relevance of these findings for real-world networks and show that the combination of topological and spatial configurations renders many real-world networks vulnerable to certain spatial hazards. - Highlights: • We develop a method for quantifying the reliability of real-world systems. • We assess the spatial resilience of synthetic spatially distributed networks. • We propose algorithms to generate spatial scale-free and exponential networks. • We show how these "synthetic" networks are proxies for real-world systems. • We conclude that many real-world systems are vulnerable to spatially coherent hazards.
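The random-versus-coherent hazard comparison can be sketched with the standard library; the random geometric graph below is a stand-in for the paper's synthetic spatial networks, and the node count, connection radius and 20% removal fraction are illustrative assumptions.

```python
import random
from collections import deque

random.seed(1)
N, R = 300, 0.12
pts = [(random.random(), random.random()) for _ in range(N)]

# Random geometric graph: connect nodes closer than R, a simple proxy
# for a spatially distributed infrastructure network.
adj = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if (pts[i][0] - pts[j][0]) ** 2 + (pts[i][1] - pts[j][1]) ** 2 < R * R:
            adj[i].add(j)
            adj[j].add(i)

def largest_component(alive):
    # Size of the largest connected component among surviving nodes (BFS)
    seen, best = set(), 0
    for s in alive:
        if s in seen:
            continue
        queue, size = deque([s]), 0
        seen.add(s)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

k = N // 5  # remove 20% of nodes under each scenario
# Scenario 1: spatially random failures
random_dead = set(random.sample(range(N), k))
lcc_random = largest_component(set(range(N)) - random_dead)
# Scenario 2: spatially coherent hazard -- the k nodes nearest a random epicentre
ex, ey = random.random(), random.random()
by_dist = sorted(range(N), key=lambda i: (pts[i][0] - ex) ** 2 + (pts[i][1] - ey) ** 2)
lcc_hazard = largest_component(set(range(N)) - set(by_dist[:k]))
print(lcc_random, lcc_hazard)
```

Comparing the surviving largest component under the two removal patterns is one simple way to expose the difference between topological and spatially coherent vulnerability that the paper studies.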

  6. Energy dependence of angular distributions of sputtered particles by ion-beam bombardment at normal incidence

    International Nuclear Information System (INIS)

    Matsuda, Yoshinobu; Ueda, Yasutoshi; Uchino, Kiichiro; Muraoka, Katsunori; Maeda, Mitsuo; Akazaki, Masanori; Yamamura, Yasunori.

    1986-01-01

    The angular distributions of sputtered Fe atoms were measured using the laser fluorescence technique during Ar-ion bombardment at energies of 0.6, 1, 2 and 3 keV at normal incidence. The cosine distribution measured at 0.6 keV progressively deviated to an over-cosine distribution at higher energies; at 3 keV the angular distribution was over-cosine by about 20%. The experimental results agree qualitatively with calculations by a recent computer simulation code, ACAT. The results are explained by the competition between surface scattering and the effects of primary knock-on atoms, which tend to make the angular distributions over-cosine and under-cosine, respectively. (author)
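The cosine/over-cosine description can be made concrete with a small simulation; assuming an emission flux proportional to cos^n(theta) over the hemisphere (an idealization, not the ACAT model), the exponent n is recoverable from the mean of cos(theta).

```python
import math
import random

random.seed(7)

def sample_theta(n, size):
    # Emission flux ∝ cos^n(theta): pdf(theta) ∝ cos^n(theta)*sin(theta),
    # so the CDF is 1 - cos^(n+1)(theta) and inverse-transform sampling gives:
    return [math.acos(random.random() ** (1.0 / (n + 1.0))) for _ in range(size)]

def fit_exponent(thetas):
    # E[cos(theta)] = (n+1)/(n+2)  =>  n = (2E - 1) / (1 - E)
    e = sum(math.cos(t) for t in thetas) / len(thetas)
    return (2.0 * e - 1.0) / (1.0 - e)

for n_true in (1.0, 1.2):   # pure cosine vs. a mildly over-cosine emission
    print(n_true, round(fit_exponent(sample_theta(n_true, 200000)), 2))
```

An exponent n greater than 1 corresponds to the over-cosine (forward-peaked) distributions reported at higher bombardment energies.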

  7. Normal distribution of ¹¹¹In chloride on scintigram

    Energy Technology Data Exchange (ETDEWEB)

    Oyama, K; Machida, K; Hayashi, S; Watari, T; Akaike, A

    1977-05-01

    Indium-111 chloride (¹¹¹InCl₃) was used as a bone marrow imaging and tumor-localizing agent in 38 patients (46 scintigrams) who were suspected of, or diagnosed as, having malignant disease, and who were irradiated for malignant disease. The regions of suspected malignant disease, regions of abnormal accumulation on scintigrams, and the irradiated targets were excluded in order to estimate the normal distribution of ¹¹¹InCl₃. Scintigrams were taken 48 h after intravenous injection of 1 to 3 mCi of ¹¹¹InCl₃. The percentage and score distributions of ¹¹¹InCl₃ were noted in 23 regions. As the liver showed the highest accumulation of ¹¹¹In on all scintigrams, the liver was designated 2+. Compared with the radioactivity in the liver, other regions had similar (2+), moderately decreased (+), or severely decreased (-) accumulation on the scintigram. A score of 1 was given for 2+, 0.5 for +, and 0 for -. The score and percentage distributions were: liver 100 (100%), lumbar vertebra 58.5 (100%), mediastinum 55 (100%), nasopharynx 50 (100%), testis 47.5 (59%), heart 44.5 (89%), and pelvis 43.5 (78%). Comparing this study with a previous study of ¹¹¹In-BLM, the score distributions in the lumbar vertebra, pelvis, and skull were similar. ¹¹¹In-BLM is excreted rapidly after injection, but little ¹¹¹InCl₃ is excreted. Accumulation of ¹¹¹In in bone marrow depends upon the amount of ¹¹¹In-transferrin in blood. The high accumulation in the lumbar vertebra and pelvis shows that ¹¹¹InCl₃ would be effective as a bone marrow imaging agent.

  8. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  9. Impact of foot progression angle on the distribution of plantar pressure in normal children.

    Science.gov (United States)

    Lai, Yu-Cheng; Lin, Huey-Shyan; Pan, Hui-Fen; Chang, Wei-Ning; Hsu, Chien-Jen; Renn, Jenn-Huei

    2014-02-01

    Plantar pressure distribution during walking is affected by several gait factors, most especially the foot progression angle, which has been studied in children with neuromuscular diseases. However, this relationship in normal children has only been reported in limited studies. The purpose of this study is to clarify the correlation between foot progression angle and plantar pressure distribution in normal children, as well as the impacts of age and sex on this correlation. This study retrospectively reviewed dynamic pedobarographic data included in the gait laboratory database of our institution. In total, 77 normally developed children aged 5-16 years who were treated between 2004 and 2009 were included. Each child's footprint was divided into 5 segments: lateral forefoot, medial forefoot, lateral midfoot, medial midfoot, and heel. The percentages of impulse exerted at the medial foot, forefoot, midfoot, and heel were calculated. The average foot progression angle was 5.03° toe-out. Most of the total impulse was exerted on the forefoot (52.0%). Toe-out gait was positively correlated with high medial plantar pressure (r = 0.274). These findings support considering the foot progression angle and plantar pressure as part of the treatment of various foot pathologies.

  10. An empirical multivariate log-normal distribution representing uncertainty of biokinetic parameters for ¹³⁷Cs

    International Nuclear Information System (INIS)

    Miller, G.; Martz, H.; Bertelli, L.; Melo, D.

    2008-01-01

    A simplified biokinetic model for ¹³⁷Cs has six parameters representing transfer of material to and from various compartments. Using a Bayesian analysis, the joint probability distribution of these six parameters is determined empirically for two cases with quite a lot of bioassay data. The distribution is found to be multivariate log-normal. Correlations between different parameters are obtained. The method utilises a fairly large number of pre-determined forward biokinetic calculations, whose results are stored in interpolation tables. Four different methods to sample the multidimensional parameter space with a limited number of samples are investigated: random, stratified, Latin hypercube sampling with a uniform distribution of parameters, and importance sampling using a log-normal distribution that approximates the posterior distribution. The importance sampling method gives much smaller sampling uncertainty. No sampling-method-dependent differences are perceptible for the uniform distribution methods. (authors)
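The variance-reduction idea behind the sampling comparison can be sketched with a toy stand-in for the forward biokinetic calculation; the model function, dimensions and sample sizes below are illustrative assumptions, not the paper's ¹³⁷Cs model.

```python
import math
import random
import statistics
from statistics import NormalDist

random.seed(3)
ND = NormalDist()

def model(z1, z2):
    # Toy stand-in for a forward calculation whose output is a sum of two
    # log-normally distributed compartment contributions.
    return math.exp(0.5 * z1) + math.exp(0.5 * z2)

def estimate(points):
    # Estimate E[model] from uniform points mapped through the normal inverse CDF
    return sum(model(ND.inv_cdf(u), ND.inv_cdf(v)) for u, v in points) / len(points)

def random_points(n):
    return [(random.random(), random.random()) for _ in range(n)]

def lhs_points(n):
    # Latin hypercube: exactly one point per equal-probability stratum per dimension
    rows, cols = list(range(n)), list(range(n))
    random.shuffle(rows)
    random.shuffle(cols)
    return [((r + random.random()) / n, (c + random.random()) / n)
            for r, c in zip(rows, cols)]

rand_est = [estimate(random_points(100)) for _ in range(200)]
lhs_est = [estimate(lhs_points(100)) for _ in range(200)]
print(statistics.stdev(rand_est), statistics.stdev(lhs_est))
```

Repeating each estimator 200 times shows the Latin hypercube estimates scattering much less than plain random sampling at the same budget, the same qualitative effect the record reports for its uniform-distribution methods.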

  11. On the Use of the Log-Normal Particle Size Distribution to Characterize Global Rain

    Science.gov (United States)

    Meneghini, Robert; Rincon, Rafael; Liao, Liang

    2003-01-01

    Although most parameterizations of the drop size distributions (DSD) use the gamma function, there are several advantages to the log-normal form, particularly if we want to characterize the large scale space-time variability of the DSD and rain rate. The advantages of the distribution are twofold: the logarithm of any moment can be expressed as a linear combination of the individual parameters of the distribution; the parameters of the distribution are approximately normally distributed. Since all radar and rainfall-related parameters can be written approximately as a moment of the DSD, the first property allows us to express the logarithm of any radar/rainfall variable as a linear combination of the individual DSD parameters. Another consequence is that any power law relationship between rain rate, reflectivity factor, specific attenuation or water content can be expressed in terms of the covariance matrix of the DSD parameters. The joint-normal property of the DSD parameters has applications to the description of the space-time variation of rainfall in the sense that any radar-rainfall quantity can be specified by the covariance matrix associated with the DSD parameters at two arbitrary space-time points. As such, the parameterization provides a means by which we can use the spaceborne radar-derived DSD parameters to specify in part the covariance matrices globally. However, since satellite observations have coarse temporal sampling, the specification of the temporal covariance must be derived from ancillary measurements and models. Work is presently underway to determine whether the use of instantaneous rain rate data from the TRMM Precipitation Radar can provide good estimates of the spatial correlation in rain rate from data collected in 5° × 5° × 1 month space-time boxes. 
To characterize the temporal characteristics of the DSD parameters, disdrometer data are being used from the Wallops Flight Facility site where as many as 4 disdrometers have been
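The moment-linearity property claimed above is easy to check numerically for a log-normal DSD; the parameter values below are hypothetical, chosen only to compare a brute-force integral with the closed form.

```python
import math

# Log-normal DSD: N(D) = NT/(sqrt(2*pi)*sigma*D) * exp(-(ln D - mu)^2 / (2 sigma^2))
NT, mu, sigma = 8000.0, math.log(1.2), 0.35   # hypothetical DSD parameters

def dsd(d):
    return NT / (math.sqrt(2 * math.pi) * sigma * d) * math.exp(
        -(math.log(d) - mu) ** 2 / (2 * sigma ** 2))

def moment_numeric(k, dmax=20.0, n=100000):
    # Midpoint-rule integration of D^k N(D) dD over [0, dmax]
    h = dmax / n
    return sum(dsd((i + 0.5) * h) * ((i + 0.5) * h) ** k * h for i in range(n))

def log_moment_closed(k):
    # ln M_k = ln NT + k*mu + (k*sigma)^2 / 2 -- linear in (ln NT, mu, sigma^2)
    return math.log(NT) + k * mu + 0.5 * (k * sigma) ** 2

for k in (3, 6):   # mass- and reflectivity-related moments
    print(k, round(math.log(moment_numeric(k)), 4), round(log_moment_closed(k), 4))
```

Because ln M_k is linear in (ln NT, mu, sigma^2), the logarithm of any moment-based radar or rainfall variable inherits approximate normality from the DSD parameters, which is the property the abstract exploits.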

  12. Problems with using the normal distribution--and ways to improve quality and efficiency of data analysis.

    Directory of Open Access Journals (Sweden)

    Eckhard Limpert

    BACKGROUND: The Gaussian or normal distribution is the most established model to characterize quantitative variation of original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, by mean ± SD, or with the standard error of the mean, mean ± SEM. This, together with corresponding bars in graphical displays, has become the standard to characterize variation. METHODOLOGY/PRINCIPAL FINDINGS: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the "95% range check", their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to regard causes of variation. Multiplicative causes are in general far more important than additive ones, and benefit from a multiplicative (or log-normal) approach. Fortunately, quite similar to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of a new sign, ×/ ("times-divide"), and corresponding notation. Analogous to mean ± SD, it connects the multiplicative (or geometric) mean* and the multiplicative standard deviation s* in the form mean* ×/ s*, which is advantageous and recommended. CONCLUSIONS/SIGNIFICANCE: The corresponding shift from the symmetric to the asymmetric view will substantially increase both recognition of data distributions and interpretation quality. It will allow for savings in sample size that can be considerable. Moreover, this is in line with ethical responsibility. Adequate models will improve concepts and theories, and provide deeper insight into science and life.
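The mean* ×/ s* summary can be sketched directly; the data below are a hypothetical skewed sample, and the 68% interval interpretation follows the log-normal analogy with mean ± SD.

```python
import math

def mult_summary(xs):
    # Geometric mean (mean*) and multiplicative standard deviation (s*):
    # mean* = exp(mean(ln x)),  s* = exp(sd(ln x))
    logs = [math.log(x) for x in xs]
    m = sum(logs) / len(logs)
    sd = math.sqrt(sum((l - m) ** 2 for l in logs) / (len(logs) - 1))
    return math.exp(m), math.exp(sd)

data = [1.2, 2.5, 0.8, 4.1, 1.9, 3.3, 0.6, 2.2]   # hypothetical skewed sample
gm, sstar = mult_summary(data)
print(round(gm, 2), round(sstar, 2))
# mean* x/ s* describes the asymmetric 68% interval [mean*/s*, mean* * s*]
print("68%% range: [%.2f, %.2f]" % (gm / sstar, gm * sstar))
```

Note how the interval is asymmetric on the original scale (divide on the left, multiply on the right), which is exactly what the ×/ notation is meant to convey.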

  13. Drug binding affinities and potencies are best described by a log-normal distribution and use of geometric means

    International Nuclear Information System (INIS)

    Stanisic, D.; Hancock, A.A.; Kyncl, J.J.; Lin, C.T.; Bush, E.N.

    1986-01-01

    (-)-Norepinephrine (NE) is used as an internal standard in their in vitro adrenergic assays, and the concentration of NE which produces half-maximal inhibition of specific radioligand binding (affinity; K_I) or a half-maximal contractile response (potency; ED₅₀) has been measured numerous times. The goodness-of-fit test for normality was performed on both normal (Gaussian) and log₁₀-normal frequency histograms of these data using the SAS Univariate procedure. Specific binding of ³H-prazosin to rat liver (α₁-), ³H-rauwolscine to rat cortex (α₂-) and ³H-dihydroalprenolol to rat ventricle (β₁-) or rat lung (β₂-receptors) was inhibited by NE; the distributions of NE K_I's at all these sites were skewed to the right, with highly significant departures from normality. Similar results were obtained for the ED₅₀'s of NE in isolated rabbit aorta (α₁), phenoxybenzamine-treated dog saphenous vein (α₂) and guinea pig atrium (β₁). The vasorelaxant potency of atrial natriuretic hormone in histamine-contracted rabbit aorta was also better described by a log-normal distribution, indicating that log-normality is probably a general phenomenon of drug-receptor interactions. Because data of this type appear to be log-normally distributed, geometric means should be used in parametric statistical analyses.

  14. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
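For contrast with qsmooth, here is a minimal sketch of classical full quantile normalization, the method it generalizes; ties are assigned by stable sort order rather than averaged, unlike some standard implementations.

```python
def quantile_normalize(samples):
    # Replace each sample's sorted values by the across-sample mean of the
    # corresponding order statistics, forcing identical distributions.
    n = len(samples[0])
    ranks = [sorted(range(n), key=s.__getitem__) for s in samples]
    ref = [sum(s[r[i]] for s, r in zip(samples, ranks)) / len(samples)
           for i in range(n)]
    out = []
    for r in ranks:
        col = [0.0] * n
        for i, idx in enumerate(r):
            col[idx] = ref[i]   # i-th reference value goes to the i-th ranked feature
        out.append(col)
    return out

a = [5.0, 2.0, 3.0, 4.0]
b = [4.0, 1.0, 4.0, 2.0]
print(quantile_normalize([a, b]))  # → [[4.5, 1.5, 2.5, 4.0], [4.0, 1.5, 4.5, 2.5]]
```

qsmooth relaxes the single shared reference distribution used here, shrinking toward group-wise references so that genuine between-group distributional differences are not normalized away.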

  15. Statistical properties of the normalized ice particle size distribution

    Science.gov (United States)

    Delanoë, Julien; Protat, Alain; Testud, Jacques; Bouniol, Dominique; Heymsfield, A. J.; Bansemer, A.; Brown, P. R. A.; Forbes, R. M.

    2005-05-01

    Testud et al. (2001) have recently developed a formalism, known as the "normalized particle size distribution (PSD)", which consists in scaling the diameter and concentration axes in such a way that the normalized PSDs are independent of water content and mean volume-weighted diameter. In this paper we investigate the statistical properties of the normalized PSD for the particular case of ice clouds, which are known to play a crucial role in the Earth's radiation balance. To do so, an extensive database of airborne in situ microphysical measurements has been constructed. A remarkable stability in shape of the normalized PSD is obtained. The impact of using a single analytical shape to represent all PSDs in the database is estimated through an error analysis on the instrumental (radar reflectivity and attenuation) and cloud (ice water content, effective radius, terminal fall velocity of ice crystals, visible extinction) properties. This resulted in a roughly unbiased estimate of the instrumental and cloud parameters, with small standard deviations ranging from 5 to 12%. This error is found to be roughly independent of the temperature range. This stability in shape and its single analytical approximation implies that two parameters are now sufficient to describe any normalized PSD in ice clouds: the intercept parameter N₀* and the mean volume-weighted diameter Dm. Statistical relationships (parameterizations) between N₀* and Dm have then been evaluated in order to reduce again the number of unknowns. It has been shown that a parameterization of N₀* and Dm by temperature could not be envisaged to retrieve the cloud parameters. Nevertheless, Dm-T and mean-maximum-dimension-T parameterizations have been derived and compared to the parameterization of Kristjánsson et al. (2000) currently used to characterize particle size in climate models. The new parameterization generally produces larger particle sizes at any temperature than that of Kristjánsson et al. (2000).
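The two normalization parameters can be computed from moments; assuming the Testud et al. (2001) definitions Dm = M4/M3 and N0* = (4^4/Γ(4))·M3^5/M4^4, a discretized exponential PSD should recover its own intercept N0 and Dm = 4/Λ.

```python
import math

# Hypothetical exponential PSD N(D) = N0 * exp(-lam * D)
# (units suggestive of rain/ice work: mm^-1 m^-3 and mm^-1)
N0, lam = 8000.0, 2.5
h, bins = 0.001, 20000          # diameter grid up to 20 mm
d = [(i + 0.5) * h for i in range(bins)]
nd = [N0 * math.exp(-lam * x) for x in d]

def moment(k):
    # k-th moment of the discretized PSD by the midpoint rule
    return sum(n * x ** k * h for n, x in zip(nd, d))

m3, m4 = moment(3), moment(4)
dm = m4 / m3                                      # mean volume-weighted diameter
n0_star = (4.0 ** 4 / 6.0) * m3 ** 5 / m4 ** 4    # normalized intercept, Γ(4) = 6
print(round(dm, 3), round(n0_star, 1))
```

For the exponential case the identities Dm = 4/lam and N0* = N0 hold exactly, which makes this a convenient self-check before applying the same moment formulas to measured spectra.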

  16. Mechanical properties of the normal human cartilage-bone complex in relation to age

    DEFF Research Database (Denmark)

    Ding, Ming; Dalstra, M; Linde, F

    1998-01-01

    OBJECTIVE: This study investigates the age-related variations in the mechanical properties of the normal human tibial cartilage-bone complex and the relationships between cartilage and bone. DESIGN: A novel technique was applied to assess the mechanical properties of the cartilage and bone by mea...... that are of importance for the understanding of the etiology and pathogenesis of degenerative joint diseases, such as arthrosis....

  17. On generalisations of the log-Normal distribution by means of a new product definition in the Kapteyn process

    Science.gov (United States)

    Duarte Queirós, Sílvio M.

    2012-07-01

    We discuss the modification of the Kapteyn multiplicative process using the q-product of Borges [E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Physica A 340 (2004) 95]. Depending on the value of the index q, a generalisation of the log-Normal distribution is yielded. Namely, the distribution enhances the tail for small (when q<1) or large (when q>1) values of the variable under analysis. The usual log-Normal distribution is retrieved when q=1, which corresponds to the traditional Kapteyn multiplicative process. The main statistical features of this distribution as well as related random number generators and tables of quantiles of the Kolmogorov-Smirnov distance are presented. Finally, we illustrate the validity of this scenario by describing a set of variables of biological and financial origin.
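    As a rough illustration of the deformed Kapteyn process described above, the sketch below implements Borges' q-product and q-multiplies many i.i.d. positive factors; for q=1 it reduces to the ordinary multiplicative process, whose logarithm is Gaussian. The lognormal factor distribution and the step count are arbitrary choices for illustration, not values taken from the paper.

```python
import numpy as np

def q_product(x, y, q):
    # Borges q-product: x (x)_q y = [x^(1-q) + y^(1-q) - 1]^(1/(1-q)),
    # defined where the bracket is positive; reduces to x*y as q -> 1.
    if abs(q - 1.0) < 1e-12:
        return x * y
    base = np.asarray(x) ** (1.0 - q) + np.asarray(y) ** (1.0 - q) - 1.0
    safe = np.where(base > 0, base, 1.0)          # avoid invalid powers
    return np.where(base > 0, safe ** (1.0 / (1.0 - q)), 0.0)

def kapteyn_q(n_samples, n_steps, q, rng):
    # Deformed Kapteyn multiplicative process: q-multiply i.i.d. positive
    # factors; q = 1 recovers the usual process and hence the log-Normal.
    samples = np.ones(n_samples)
    for _ in range(n_steps):
        factors = rng.lognormal(mean=0.0, sigma=0.05, size=n_samples)
        samples = q_product(samples, factors, q)
    return samples
```

For q=1 the log of the result is a sum of n_steps i.i.d. normals, so its standard deviation grows as 0.05·sqrt(n_steps); for q<1 the q-product of factors larger than one is smaller than their ordinary product.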

  18. Comparing of Normal Stress Distribution in Static and Dynamic Soil-Structure Interaction Analyses

    International Nuclear Information System (INIS)

    Kholdebarin, Alireza; Massumi, Ali; Davoodi, Mohammad; Tabatabaiefar, Hamid Reza

    2008-01-01

    It is important to consider the vertical component of earthquake loading and inertia forces in soil-structure interaction analyses. In most circumstances, design engineers are primarily concerned with the behavior of foundations subjected to earthquake-induced forces transmitted from the bedrock. In this research, a single rigid foundation with designated geometrical parameters located on sandy-clay soil has been modeled in FLAC software with the Finite Difference Method and subjected to the vertical components of three different earthquake records. In these cases, it is important to evaluate the effect of the footing on the underlying soil and to consider the normal stress in the soil with and without the footing. The distribution of normal stress under the footing in the static and dynamic states has been studied and compared. This comparison indicated that the increase in normal stress under the footing caused by the vertical component of ground excitation decreases the dynamic vertical settlement in comparison with the static state.

  19. Elastic microfibril distribution in the cornea: Differences between normal and keratoconic stroma.

    Science.gov (United States)

    White, Tomas L; Lewis, Philip N; Young, Robert D; Kitazawa, Koji; Inatomi, Tsutomu; Kinoshita, Shigeru; Meek, Keith M

    2017-06-01

    The optical and biomechanical properties of the cornea are largely governed by the collagen-rich stroma, a layer that represents approximately 90% of the total thickness. Within the stroma, the specific arrangement of superimposed lamellae provides the tissue with tensile strength, whilst the spatial arrangement of individual collagen fibrils within the lamellae confers transparency. In keratoconus, this precise stromal arrangement is lost, resulting in ectasia and visual impairment. In the normal cornea, we previously characterised the three-dimensional arrangement of an elastic fiber network spanning the posterior stroma from limbus-to-limbus. In the peripheral cornea/limbus there are elastin-containing sheets or broad fibers, most of which become microfibril bundles (MBs) with little or no elastin component when reaching the central cornea. The purpose of the current study was to compare this network with the elastic fiber distribution in post-surgical keratoconic corneal buttons, using serial block face scanning electron microscopy and transmission electron microscopy. We have demonstrated that the MB distribution is very different in keratoconus. MBs are absent from a region of stroma anterior to Descemet's membrane, an area that is densely populated in normal cornea, whilst being concentrated below the epithelium, an area in which they are absent in normal cornea. We contend that these latter microfibrils are produced as a biomechanical response to provide additional strength to the anterior stroma in order to prevent tissue rupture at the apex of the cone. A lack of MBs anterior to Descemet's membrane in keratoconus would alter the biomechanical properties of the tissue, potentially contributing to the pathogenesis of the disease. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Thermal modelling of normal distributed nanoparticles through thickness in an inorganic material matrix

    Science.gov (United States)

    Latré, S.; Desplentere, F.; De Pooter, S.; Seveno, D.

    2017-10-01

    Nanoscale materials showing superior thermal properties have raised the interest of the building industry. By adding these materials to conventional construction materials, it is possible to decrease the total thermal conductivity by almost one order of magnitude. This conductivity is mainly influenced by the dispersion quality within the matrix material. At the industrial scale, the main challenge is to control this dispersion to reduce or even eliminate thermal bridges, allowing an industrially relevant process that balances the high material cost against the superior thermal insulation properties. Therefore, a methodology is required to measure and describe these nanoscale distributions within the inorganic matrix material. These distributions are either random or normally distributed through the thickness of the matrix material. We show that the influence of these distributions is meaningful and modifies the thermal conductivity of the building material. Hence, this strategy yields a thermal model that predicts the thermal behavior of the nanoscale particles and their distributions. This thermal model is validated by the hot wire technique. For the moment, a good correlation is found between the numerical results and experimental data for a randomly distributed form of nanoparticles in all directions.
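    A minimal way to see why the through-thickness placement of the particles matters is a one-dimensional series-resistance layer model. The linear rule of mixtures and all material values below are illustrative assumptions, not the paper's model.

```python
import numpy as np

def series_conductivity(k_layers, thicknesses):
    # Through-thickness (series) effective conductivity: total thermal
    # resistance is the sum of the individual layer resistances.
    return np.sum(thicknesses) / np.sum(thicknesses / k_layers)

def layered_model(k_matrix, k_particle, total_fraction, n_layers, profile):
    # Spread a fixed overall particle volume fraction across layers according
    # to 'profile' (unnormalized weights), then mix within each layer with a
    # simple linear rule of mixtures (an assumption for illustration).
    weights = profile / profile.sum()
    phi = total_fraction * weights * n_layers     # local volume fraction
    k_layers = phi * k_particle + (1.0 - phi) * k_matrix
    thicknesses = np.full(n_layers, 1.0 / n_layers)
    return series_conductivity(k_layers, thicknesses)

# Hypothetical values: low-conductivity nanoparticles in a mineral matrix.
K_MATRIX, K_PARTICLE, FRACTION, N = 0.2, 0.02, 0.1, 50
k_uniform = layered_model(K_MATRIX, K_PARTICLE, FRACTION, N, np.ones(N))
x = np.linspace(-1.0, 1.0, N)
k_gauss = layered_model(K_MATRIX, K_PARTICLE, FRACTION, N,
                        np.exp(-0.5 * (x / 0.3) ** 2))
```

Because the layer resistance is convex in the local particle fraction, concentrating the same total particle content in a normal through-thickness profile lowers the effective conductivity relative to a uniform dispersion.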

  1. Distribution and elimination of intravenously administered atrial natriuretic hormone(ANH) to normal and nephrectomized rats

    International Nuclear Information System (INIS)

    Devine, E.; Artman, L.; Budzik, G.; Bush, E.; Holleman, W.

    1986-01-01

    The 24 amino acid peptide, ANH(5-28), was N-terminally labeled with I-125 Bolton-Hunter reagent, iodo-N-succinimidyl 3-(4-hydroxyphenyl)propionate. The I-125 peptide plus 1μg/kg of the I-127 Bolton-Hunter peptide was injected into normal and nephrectomized anesthetized (Nembutal) rats. Blood samples were drawn into a cocktail developed to inhibit plasma induced degradation. Radiolabeled peptides were analyzed by HPLC. A biphasic curve of I-125 ANH(5-28) elimination was obtained, the first phase (t 1/2 = 15 sec) representing in vivo distribution and the second phase (t 1/2 = 7-10 min) a measurement of elimination. This biphasic elimination curve was similar in normal and nephrectomized rats. The apparent volumes of distribution were 15-20 ml for the first phase and > 300 ml for the second phase. In order to examine the tissue distribution of the peptide, animals were sacrificed at 2 minutes and the I-125 tissue contents were quantitated. The majority of the label was located in the liver (50%), kidneys (21%) and the lung (5%). The degradative peptides appearing in the plasma and urine of normal rats were identical. No intact radiolabeled ANH(5-28) was found in the urine. In conclusion, iodinated Bolton-Hunter labeled ANH(5-28) is rapidly removed from the circulation by the liver and to a lesser extent by the kidney, but the rate of elimination is not decreased by nephrectomy.

  2. Effects of adipose tissue distribution on maximum lipid oxidation rate during exercise in normal-weight women.

    Science.gov (United States)

    Isacco, L; Thivel, D; Duclos, M; Aucouturier, J; Boisseau, N

    2014-06-01

    Fat mass localization affects lipid metabolism differently at rest and during exercise in overweight and normal-weight subjects. The aim of this study was to investigate the impact of a low vs high ratio of abdominal to lower-body fat mass (index of adipose tissue distribution) on the exercise intensity (Lipox(max)) that elicits the maximum lipid oxidation rate in normal-weight women. Twenty-one normal-weight women (22.0 ± 0.6 years, 22.3 ± 0.1 kg.m(-2)) were separated into two groups of either a low or high abdominal to lower-body fat mass ratio [L-A/LB (n = 11) or H-A/LB (n = 10), respectively]. Lipox(max) and maximum lipid oxidation rate (MLOR) were determined during a submaximal incremental exercise test. Abdominal and lower-body fat mass were determined from DXA scans. The two groups did not differ in aerobic fitness, total fat mass, or total and localized fat-free mass. Lipox(max) and MLOR were significantly lower in H-A/LB vs L-A/LB women (43 ± 3% VO(2max) vs 54 ± 4% VO(2max), and 4.8 ± 0.6 mg min(-1)kg FFM(-1) vs 8.4 ± 0.9 mg min(-1)kg FFM(-1), respectively). In normal-weight women, a predominantly abdominal fat mass distribution compared with a predominantly peripheral fat mass distribution is associated with a lower capacity to maximize lipid oxidation during exercise, as evidenced by their lower Lipox(max) and MLOR. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  3. Using an APOS Framework to Understand Teachers' Responses to Questions on the Normal Distribution

    Science.gov (United States)

    Bansilal, Sarah

    2014-01-01

    This study is an exploration of teachers' engagement with concepts embedded in the normal distribution. The participants were a group of 290 in-service teachers enrolled in a teacher development program. The research instrument was an assessment task that can be described as an "unknown percentage" problem, which required the application…

  4. On the possible "normalization" of experimental curves of 230Th vertical distribution in abyssal oceanic sediments

    International Nuclear Information System (INIS)

    Kuznetsov, Yu.V.; Al'terman, Eh.I.; Lisitsyn, A.P.; AN SSSR, Moscow. Inst. Okeanologii)

    1981-01-01

    The possibilities of the method of normalization of experimental ionium (230Th) curves with reference to the dating of abyssal sediments and the determination of their accumulation rates are studied. The method is based on the correlation between extrema of the ionium curves and variations of Fe, Mn, C org., and P contents in abyssal oceanic sediments. It has been found that the above method can be successfully applied for the correction of 230 Th vertical distribution data obtained by low-background γ-spectrometry. The method leads to the most reliable results in those cases when the vertical distribution curves of the elements that act as concentrators of 230 Th in the sediments vary concordantly with one another. The normalization of the experimental ionium curves in many cases makes it possible to carry out age stratification of the sediment.

  5. Distributed coding/decoding complexity in video sensor networks.

    Science.gov (United States)

    Cordeiro, Paulo J; Assunção, Pedro

    2012-01-01

    Video Sensor Networks (VSNs) are recent communication infrastructures used to capture and transmit dense visual information from an application context. In such large scale environments which include video coding, transmission and display/storage, there are several open problems to overcome in practical implementations. This paper addresses the most relevant challenges posed by VSNs, namely stringent bandwidth usage and processing time/power constraints. In particular, the paper proposes a novel VSN architecture where large sets of visual sensors with embedded processors are used for compression and transmission of coded streams to gateways, which in turn transrate the incoming streams and adapt them to the variable complexity requirements of both the sensor encoders and end-user decoder terminals. Such gateways provide real-time transcoding functionalities for bandwidth adaptation and coding/decoding complexity distribution by transferring the most complex video encoding/decoding tasks to the transcoding gateway at the expense of a limited increase in bit rate. Then, a method to reduce the decoding complexity, suitable for system-on-chip implementation, is proposed to operate at the transcoding gateway whenever decoders with constrained resources are targeted. The results show that the proposed method achieves good performance and its inclusion into the VSN infrastructure provides an additional level of complexity control functionality.

  6. One-dimensional time-dependent conduction states and temperature distribution along a normal zone during a quench

    International Nuclear Information System (INIS)

    Lopez, G.

    1991-01-01

    Quench simulation of a superconducting (s.c.) magnet requires some assumptions about the evolution of the normal zone and its temperature profile. The axial evolution of the normal zone is considered through the longitudinal quench velocity, while the transversal quench propagation may be considered through the transversal quench velocity or the turn-to-turn time-delay quench propagation. The temperature distribution has been assumed adiabatic-like or cosine-like in two different computer programs. Although the two profiles are different, they bring about more or less the same qualitative quench results, differing by only about 8%. Unfortunately, there are no experimental data for the temperature profile along the conductor in a quench event to allow a realistic comparison. The temperature profile has received little attention, mainly because it is not such a critical parameter in quench analysis. Nonetheless, a confident quench analysis requires that the temperature distribution along the normal zone be taken into account with good approximation. In this paper, an analytical study is made of the temperature profile.
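    For concreteness, a cosine-like normal-zone temperature profile of the kind mentioned above can be written in closed form. The peak and current-sharing temperatures below are arbitrary placeholders, not values from the paper.

```python
import numpy as np

def cosine_profile(x, t_max, t_cs, half_length):
    # Cosine-like temperature profile along the normal zone: the temperature
    # peaks at the quench origin (x = 0) and falls to the current-sharing
    # temperature t_cs at the normal-zone front (x = half_length).
    return t_cs + (t_max - t_cs) * np.cos(0.5 * np.pi * x / half_length)
```

With such a closed form, the stored thermal energy of the normal zone follows from integrating the heat capacity over x, which is where the roughly 8% difference against an adiabatic-like profile would enter.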

  7. Normalization of information parameters of the airfield light-signal equipment for flights in complex meteorological conditions

    Directory of Open Access Journals (Sweden)

    П.В. Попов

    2005-03-01

    Full Text Available. A technique for maintaining a prescribed level of flight safety is developed by normalizing the information parameters of the functional groups of light-signal lights at the technological stages of interaction between the aircraft crew and the airfield light-signal equipment during flights in complex meteorological conditions.

  8. Distribution of CD163-positive cell and MHC class II-positive cell in the normal equine uveal tract.

    Science.gov (United States)

    Sano, Yuto; Matsuda, Kazuya; Okamoto, Minoru; Takehana, Kazushige; Hirayama, Kazuko; Taniyama, Hiroyuki

    2016-02-01

    Antigen-presenting cells (APCs) in the uveal tract participate in ocular immunity, including immune homeostasis and the pathogenesis of uveitis. In horses, although uveitis is the most common ocular disorder, little is known about ocular immunity, such as the distribution of APCs. In this study, we investigated the distribution of CD163-positive and MHC II-positive cells in the normal equine uveal tract using an immunofluorescence technique. Eleven eyes from 10 Thoroughbred horses aged 1 to 24 years old were used. Indirect immunofluorescence was performed using the primary antibodies CD163, MHC class II (MHC II) and CD20. To identify the site of their greatest distribution, positive cells were manually counted in 3 different parts of the uveal tract (ciliary body, iris and choroid), and their average number was assessed by statistical analysis. Pleomorphic CD163- and MHC II-expressing cells were detected throughout the equine uveal tract, but no CD20-expressing cells were detected. The statistical analysis demonstrated that the distribution of CD163- and MHC II-positive cells was concentrated in the ciliary body. These results demonstrate that the ciliary body is the largest site of their distribution in the normal equine uveal tract, and the ciliary body is considered to play important roles in uveal and/or ocular immune homeostasis. The data provided in this study will help further the understanding of equine ocular immunity in the normal state and might be beneficial for understanding the mechanisms of ocular disorders, such as equine uveitis.

  9. Optimization of b-value distribution for biexponential diffusion-weighted MR imaging of normal prostate.

    Science.gov (United States)

    Jambor, Ivan; Merisaari, Harri; Aronen, Hannu J; Järvinen, Jukka; Saunavaara, Jani; Kauko, Tommi; Borra, Ronald; Pesola, Marko

    2014-05-01

    To determine the optimal b-value distribution for biexponential diffusion-weighted imaging (DWI) of normal prostate using both a computer modeling approach and in vivo measurements. Optimal b-value distributions for the fit of three parameters (fast diffusion Df, slow diffusion Ds, and fraction of fast diffusion f) were determined using Monte-Carlo simulations. The optimal b-value distribution was calculated using four individual optimization methods. Eight healthy volunteers underwent four repeated 3 Tesla prostate DWI scans using both 16 equally distributed b-values and an optimized b-value distribution obtained from the simulations. The b-value distributions were compared in terms of measurement reliability and repeatability using Shrout-Fleiss analysis. Using low noise levels, the optimal b-value distribution formed three separate clusters at low (0-400 s/mm2), mid-range (650-1200 s/mm2), and high b-values (1700-2000 s/mm2). Higher noise levels resulted in less pronounced clustering of b-values. The clustered optimized b-value distribution demonstrated better measurement reliability and repeatability in Shrout-Fleiss analysis compared with 16 equally distributed b-values. The optimal b-value distribution was found to be a clustered distribution with b-values concentrated in the low, mid, and high ranges and was shown to improve the estimation quality of biexponential DWI parameters of in vivo experiments. Copyright © 2013 Wiley Periodicals, Inc.
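    The biexponential signal model being fitted here is S(b)/S0 = f·exp(-b·Df) + (1-f)·exp(-b·Ds). The sketch below simulates noisy signals on a clustered b-value set mimicking the low/mid/high grouping reported in the abstract and fits the three parameters; the exact b-values, tissue parameters, and noise level are illustrative assumptions, not the paper's protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(b, f, d_fast, d_slow):
    # Normalized biexponential DWI signal:
    #   S(b)/S0 = f*exp(-b*Df) + (1 - f)*exp(-b*Ds)
    return f * np.exp(-b * d_fast) + (1.0 - f) * np.exp(-b * d_slow)

# Clustered b-values (s/mm^2): low, mid-range, and high groups (illustrative).
b_vals = np.array([0, 100, 200, 400, 650, 900, 1200, 1700, 1850, 2000], float)
truth = (0.7, 2.0e-3, 0.4e-3)            # f, Df, Ds in mm^2/s (plausible values)

rng = np.random.default_rng(1)
signal = biexp(b_vals, *truth) + rng.normal(0.0, 0.005, b_vals.size)

# Bounds keep Df >= 1e-3 >= Ds, which prevents the fast/slow label swap.
popt, _ = curve_fit(biexp, b_vals, signal, p0=(0.5, 1.5e-3, 0.3e-3),
                    bounds=([0.0, 1e-3, 1e-5], [1.0, 5e-3, 1e-3]))
f_fit, df_fit, ds_fit = popt
```

With the clustered sampling the three parameters are well separated: the high-b cluster pins down Ds, the low-b cluster pins down f, and the mid-range constrains Df.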

  10. Basic study on radiation distribution sensing with normal optical fiber

    International Nuclear Information System (INIS)

    Naka, R.; Kawarabayashi, J.; Uritani, A.; Iguchi, T.; Kaneko, J.; Takeuchi, H.; Kakuta, T.

    2000-01-01

    Recently, some methods of radiation distribution sensing with optical fibers have been proposed. These methods employ scintillating fibers or scintillators with wavelength-shifting fibers. The positions of radiation interactions are detected by applying a time-of-flight (TOF) technique to the scintillation photon propagation. In the former method, the attenuation length for the scintillation photons in the scintillating fiber is relatively short, so that the operating length of the sensor is limited to several meters. In the latter method, a radiation distribution cannot be obtained continuously but only discretely. To improve on these shortcomings, a normal optical fiber made of polymethyl methacrylate (PMMA) is used in this study. Although the scintillation efficiency of PMMA is very low, several photons are emitted per interaction with a radiation, and the fiber is transparent to the emitted photons, giving a relatively long operating length. A radiation distribution can be obtained continuously. This paper describes the principle of the position sensing method based on the time-of-flight technique and preliminary results obtained for 90 Sr- 90 Y beta rays, 137 Cs gamma rays, and 14 MeV neutrons. The spatial resolutions for the above three kinds of radiation are 0.30 m, 0.37 m, and 0.13 m, and the detection efficiencies are 1.1 x 10^-3, 1.6 x 10^-7, and 5.4 x 10^-6, respectively, with a 10 m operating length. The results of a spectroscopic study on the optical properties of the fiber are also described. (author)
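    The TOF position reconstruction described above reduces to one linear relation: photons from a single interaction travel to both fiber ends, and the arrival-time difference encodes the interaction position. The sketch below is the textbook relation under a nominal PMMA refractive index, not the authors' exact implementation.

```python
C_VACUUM = 2.998e8            # speed of light in vacuum, m/s
N_PMMA = 1.49                 # nominal refractive index of a PMMA core
V_FIBER = C_VACUUM / N_PMMA   # photon propagation speed in the fiber

def interaction_position(t_left, t_right, length):
    # For an interaction at position x from the left end:
    #   t_left = x / V_FIBER,  t_right = (length - x) / V_FIBER
    # so t_right - t_left = (length - 2x) / V_FIBER, which inverts to:
    return 0.5 * (length - V_FIBER * (t_right - t_left))
```

The achievable spatial resolution then scales as V_FIBER·Δt/2 for a timing resolution Δt; with V_FIBER ≈ 2 × 10^8 m/s, the quoted 0.3 m resolution corresponds to about 3 ns of timing resolution.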

  11. The distribution of YKL-40 in osteoarthritic and normal human articular cartilage

    DEFF Research Database (Denmark)

    Volck, B; Ostergaard, K; Johansen, J S

    1999-01-01

    YKL-40, also called human cartilage glycoprotein-39, is a major secretory protein of human chondrocytes in cell culture. YKL-40 mRNA is expressed by cartilage from patients with rheumatoid arthritis, but is not detectable in normal human cartilage. The aim was to investigate the distribution of YKL-40 in osteoarthritic (n=9) and macroscopically normal (n=5) human articular cartilage, collected from 12 pre-selected areas of the femoral head, to discover a potential role for YKL-40 in cartilage remodelling in osteoarthritis. Immunohistochemical analysis showed that YKL-40 staining was found...... in chondrocytes of osteoarthritic cartilage mainly in the superficial and middle zone of the cartilage rather than the deep zone. There was a tendency for high number of YKL-40 positive chondrocytes in areas of the femoral head with a considerable biomechanical load. The number of chondrocytes with a positive...

  12. Surface complexation of selenite on goethite: MO/DFT geometry and charge distribution

    NARCIS (Netherlands)

    Hiemstra, T.; Rietra, R.P.J.J.; Riemsdijk, van W.H.

    2007-01-01

    The adsorption of selenite on goethite (alpha-FeOOH) has been analyzed with the charge distribution (CD) and the multi-site surface complexation (MUSIC) model being combined with an extended Stern (ES) layer model option. The geometry of a set of different types of hydrated iron-selenite complexes

  13. Validation of MCDS by comparison of predicted with experimental velocity distribution functions in rarefied normal shocks

    Science.gov (United States)

    Pham-Van-diep, Gerald C.; Erwin, Daniel A.

    1989-01-01

    Velocity distribution functions in normal shock waves in argon and helium are calculated using Monte Carlo direct simulation. These are compared with experimental results for argon at M = 7.18 and for helium at M = 1.59 and 20. For both argon and helium, the variable-hard-sphere (VHS) model is used for the elastic scattering cross section, with the velocity dependence derived from a viscosity-temperature power-law relationship in the way normally used by Bird (1976).

  14. Differentiation in boron distribution in adult male and female rats' normal brain: A BNCT approach

    International Nuclear Information System (INIS)

    Goodarzi, Samereh; Pazirandeh, Ali; Jameie, Seyed Behnamedin; Baghban Khojasteh, Nasrin

    2012-01-01

    Boron distribution in adult male and female rats' normal brain after boron carrier injection (0.005 g Boric Acid+0.005 g Borax+10 ml distilled water, pH: 7.4) was studied in this research. Coronal sections of control and trial animal tissue samples were irradiated with thermal neutrons. Using alpha autoradiography, significant differences in boron concentration were seen between forebrain, midbrain and hindbrain sections of the male and female animal groups, with the highest values four hours after boron compound injection. - Highlights: ► Boron distribution in male and female rats' normal brain was studied in this research. ► Coronal sections of animal tissue samples were irradiated with thermal neutrons. ► Alpha and Lithium tracks were counted using alpha autoradiography. ► Different boron concentrations were seen in brain sections of male and female rats. ► The highest boron concentration was seen 4 h after boron compound injection.

  15. Analysis of a hundred-years series of magnetic activity indices. III. Is the frequency distribution logarithmo-normal

    International Nuclear Information System (INIS)

    Mayaud, P.N.

    1976-01-01

    Because of the various components of positive conservation existing in the series of aa indices, their frequency distribution is necessarily distorted with respect to any random distribution. However, when one takes these various components into account, the observed distribution can be considered to be a logarithmo-normal distribution. This implies that the geomagnetic activity satisfies the conditions of the central limit theorem, according to which a phenomenon presenting such a distribution is due to independent causes whose effects are multiplicative. Furthermore, the distortion of the frequency distribution caused by the 11-year and 90-year cycles corresponds to a pure attenuation effect; an interpretation in terms of solar 'coronal holes' is proposed.

  16. Distribution pattern of urine albumin creatinine ratio and the prevalence of high-normal levels in untreated asymptomatic non-diabetic hypertensive patients.

    Science.gov (United States)

    Ohmaru, Natsuki; Nakatsu, Takaaki; Izumi, Reishi; Mashima, Keiichi; Toki, Misako; Kobayashi, Asako; Ogawa, Hiroko; Hirohata, Satoshi; Ikeda, Satoru; Kusachi, Shozo

    2011-01-01

    Even high-normal albuminuria is reportedly associated with cardiovascular events. We determined the urine albumin creatinine ratio (UACR) in spot urine samples and analyzed the UACR distribution and the prevalence of high-normal levels. The UACR was determined using immunoturbidimetry in 332 untreated asymptomatic non-diabetic Japanese patients with hypertension and in 69 control subjects. Microalbuminuria and macroalbuminuria were defined as a UACR ≥30 µg/mg·creatinine and a UACR ≥300 µg/mg·creatinine, respectively. The distribution patterns showed a highly skewed distribution for the lower levels, and a common logarithmic transformation produced a close fit to a Gaussian distribution with median, 25th and 75th percentile values of 22.6, 13.5 and 48.2 µg/mg·creatinine, respectively. When a high-normal UACR was set at >20 to <30 µg/mg·creatinine, 19.9% (66/332) of the hypertensive patients exhibited a high-normal UACR. Microalbuminuria and macroalbuminuria were observed in 36.1% (120/332) and 2.1% (7/332) of the patients, respectively. UACR was significantly correlated with the systolic and diastolic blood pressures and the pulse pressure. A stepwise multivariate analysis revealed that these pressures as well as age were independent factors that increased UACR. The UACR distribution exhibited a highly skewed pattern, with approximately 60% of untreated, non-diabetic hypertensive patients exhibiting a high-normal or larger UACR. Both hypertension and age are independent risk factors that increase the UACR. The present study indicated that a considerable percentage of patients require anti-hypertensive drugs with antiproteinuric effects at the start of treatment.

  17. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Smith, Donald L.; Capote, Roberto

    2013-01-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue
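    The transformation discussed here has a standard closed form: if the underlying normal variables have correlation rho_N and standard deviations s1, s2, the corresponding lognormal correlation is (exp(rho_N·s1·s2) - 1) / sqrt((exp(s1²) - 1)(exp(s2²) - 1)). A short sketch showing the forbidden strong anti-correlations:

```python
import math

def lognormal_corr(rho_n, s1, s2):
    # Correlation between two lognormal variables whose underlying normal
    # variables have correlation rho_n and standard deviations s1, s2.
    num = math.exp(rho_n * s1 * s2) - 1.0
    den = math.sqrt((math.exp(s1 ** 2) - 1.0) * (math.exp(s2 ** 2) - 1.0))
    return num / den

# For 100% relative uncertainty, sqrt(exp(s^2) - 1) = 1, hence s^2 = ln 2.
s = math.sqrt(math.log(2.0))
strongest_anti = lognormal_corr(-1.0, s, s)   # -> -0.5: rho_L = -1 unreachable
```

Even a perfect anti-correlation of the underlying normals yields only -0.5 at 100% relative uncertainty, while in the small-uncertainty limit the two correlation coefficients coincide, which is the linear approximation mentioned in the abstract.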

  18. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, Gašper, E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, Andrej, E-mail: andrej.trkov@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Smith, Donald L., E-mail: donald.l.smith@anl.gov [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States); Capote, Roberto, E-mail: roberto.capotenoy@iaea.org [NAPC–Nuclear Data Section, International Atomic Energy Agency, PO Box 100, Vienna-A-1400 (Austria)

    2013-11-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue.

  19. Accuracy and uncertainty analysis of soil Bbf spatial distribution estimation at a coking plant-contaminated site based on normalization geostatistical technologies.

    Science.gov (United States)

    Liu, Geng; Niu, Junjie; Zhang, Chao; Guo, Guanlin

    2015-12-01

    Data distribution is usually skewed severely by the presence of hot spots in contaminated sites. This causes difficulties for accurate geostatistical data transformation. Three typical normal-distribution transformation methods, termed the normal score, Johnson, and Box-Cox transformations, were applied to compare the effects of spatial interpolation on normally transformed benzo(b)fluoranthene data from a large-scale coking plant-contaminated site in north China. All three normal transformation methods decreased the skewness and kurtosis of the benzo(b)fluoranthene data, and all the transformed data passed the Kolmogorov-Smirnov test threshold. Cross validation showed that Johnson ordinary kriging had a minimum root-mean-square error of 1.17 and a mean error of 0.19, which was more accurate than the other two models. The area with fewer sampling points and that with high levels of contamination showed the largest prediction standard errors based on the Johnson ordinary kriging prediction map. We introduce an ideal normal transformation method prior to geostatistical estimation for severely skewed data, which enhances the reliability of risk estimation and improves the accuracy for determination of remediation boundaries.
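    Of the three transformations compared, Box-Cox is the easiest to demonstrate in a few lines. The sketch below applies it to synthetic lognormal "concentration" data standing in for the benzo(b)fluoranthene measurements, which are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
raw = rng.lognormal(mean=0.0, sigma=1.0, size=500)   # heavily right-skewed

# Box-Cox estimates the power lambda that best normalizes the (positive)
# data; for lognormal input the optimal lambda is near 0, i.e. a log
# transform, and the skewness drops toward zero.
transformed, lam = stats.boxcox(raw)

skew_before, skew_after = stats.skew(raw), stats.skew(transformed)
```

After such a transformation, kriging is performed on the near-Gaussian values and predictions are back-transformed, which is where the choice among normal score, Johnson, and Box-Cox affects the interpolation errors the abstract compares.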

  20. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    Science.gov (United States)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired values of the two means, two standard deviations, and the correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform-distribution random number generator and by inaccuracies in computer function evaluation and arithmetic. A FORTRAN routine was written to check the algorithm, and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
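
The construction underlying such generators can be sketched with the standard Cholesky approach for the 2x2 correlation matrix (this is the textbook method, not necessarily Campbell's exact algorithm):

```python
import math
import random

def bivariate_normal_pair(mu1, mu2, s1, s2, rho, rng=random):
    """Draw one (x, y) pair from a bivariate normal with means mu1, mu2,
    standard deviations s1, s2, and correlation rho, using two
    independent standard normals and the Cholesky factor of the
    2x2 correlation matrix."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rng.gauss(0.0, 1.0)
    x = mu1 + s1 * z1
    y = mu2 + s2 * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
    return x, y
```

Any desired means, standard deviations, and correlation coefficient can be passed in, and the accuracy is limited only by the quality of the underlying Gaussian generator, mirroring the property claimed in the abstract.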

  1. EVALUATING THE NOVEL METHODS ON SPECIES DISTRIBUTION MODELING IN COMPLEX FOREST

    Directory of Open Access Journals (Sweden)

    C. H. Tu

    2012-07-01

    Full Text Available The prediction of species distribution has become a focus in ecology. To obtain more effective and accurate predictions, novel methods such as the support vector machine (SVM) and maximum entropy (MAXENT) have been proposed recently. However, high complexity in the forest, like that in Taiwan, makes the modeling even harder. In this study, we explore which method is more applicable to species distribution modeling in complex forest. Castanopsis carlesii (long-leaf chinkapin, LLC), which grows widely in Taiwan, was chosen as the target species because its seeds are an important food source for animals. We overlaid the tree samples on layers of altitude, slope, aspect, terrain position, and a vegetation index derived from SPOT-5 images, and developed three models, MAXENT, SVM, and decision tree (DT), to predict the potential habitat of LLC. We evaluated these models using two sets of independent samples at different sites and examined the effect of forest complexity by changing the background sample size (BSZ). In the less complex forest (small BSZ), the accuracies of the SVM (kappa = 0.87) and DT (0.86) models were slightly higher than that of MAXENT (0.84). In the more complex situation (large BSZ), MAXENT kept a high kappa value (0.85), whereas the SVM (0.61) and DT (0.57) models dropped significantly because they limited the habitat too closely to the samples. Therefore, the MAXENT model was more applicable for predicting a species' potential habitat in complex forest, whereas the SVM and DT models tended to underestimate the potential habitat of LLC.
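
The kappa statistics quoted above measure agreement beyond chance between observed and predicted presence/absence. A minimal sketch of the computation from a confusion matrix (function name ours):

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix
    (rows: observed classes, columns: predicted classes)."""
    k = len(confusion)
    total = sum(sum(row) for row in confusion)
    # Observed agreement: fraction on the diagonal.
    po = sum(confusion[i][i] for i in range(k)) / total
    # Expected agreement by chance: product of row and column marginals.
    pe = sum(
        sum(confusion[i]) * sum(row[i] for row in confusion)
        for i in range(k)
    ) / total ** 2
    return (po - pe) / (1.0 - pe)
```

Kappa is 1 for perfect agreement and 0 for chance-level agreement, which is why a drop from 0.87 to 0.61 is a substantial loss of predictive skill.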

  2. Distributed Cooperation Solution Method of Complex System Based on MAS

    Science.gov (United States)

    Weijin, Jiang; Yuhui, Xu

    To adapt fault-diagnosis models to dynamic environments and to fully meet the needs of solving the tasks of complex systems, this paper introduces multi-agent technology into complicated fault diagnosis and studies an integrated intelligent control system. Based on the structure of diagnostic decision-making and hierarchical modeling, and on a multi-layer decomposition strategy for the diagnosis task, a multi-agent synchronous diagnosis federation integrating different knowledge-representation modes and inference mechanisms is presented. The functions of the management agent, diagnosis agent, and decision agent are analyzed; the organization and evolution of agents in the system are proposed; the corresponding conflict-resolution algorithm is given; and a layered structure of abstract agents with public attributes is built. The system architecture is realized on a MAS distributed layered blackboard. A real-world application shows that the proposed control structure successfully solves the fault-diagnosis problem of a complex plant and has special advantages in the distributed domain.

  3. A normalization method for combination of laboratory test results from different electronic healthcare databases in a distributed research network.

    Science.gov (United States)

    Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong

    2016-03-01

    Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for heterogeneous clinico-epidemiologic characteristics of the data and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN. For simplicity of comparison, age and gender were used to adjust population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests (p normalization performed better than normalization using other methods. The SAN method is applicable in a DRN environment and should facilitate analysis of data integrated across DRN partners for retrospective observational studies. Copyright © 2015 John Wiley & Sons, Ltd.
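
The core idea, making subgroup means and standard deviations comparable across sites, can be sketched as follows. This is an illustration of the concept only; the published SAN method is more elaborate, and the record layout and function name here are ours:

```python
import statistics
from collections import defaultdict

def subgroup_adjusted_normalize(records):
    """Sketch of subgroup-adjusted normalization: within each
    (site, subgroup) cell, z-score the lab value, then rescale to the
    pooled mean/SD of that subgroup across all sites. `records` is a
    list of dicts with keys 'site', 'subgroup', 'value'."""
    by_cell = defaultdict(list)
    by_subgroup = defaultdict(list)
    for r in records:
        by_cell[(r['site'], r['subgroup'])].append(r['value'])
        by_subgroup[r['subgroup']].append(r['value'])
    cell_stats = {k: (statistics.mean(v), statistics.pstdev(v))
                  for k, v in by_cell.items()}
    sub_stats = {k: (statistics.mean(v), statistics.pstdev(v))
                 for k, v in by_subgroup.items()}
    out = []
    for r in records:
        m_c, s_c = cell_stats[(r['site'], r['subgroup'])]
        m_g, s_g = sub_stats[r['subgroup']]
        z = (r['value'] - m_c) / s_c if s_c > 0 else 0.0
        out.append(m_g + z * s_g)
    return out
```

After this step, two sites whose assays report systematically different levels for the same (age, gender) subgroup land on a common scale, which is the precondition for pooling in a distributed research network.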

  4. Distribution tactics for success in turbulent versus stable environments: A complexity theory approach

    Directory of Open Access Journals (Sweden)

    Roger Bruce Mason

    2013-11-01

    Full Text Available This article proposes that the external environment influences the choice of distribution tactics. Since businesses and markets are complex adaptive systems, using complexity theory to understand such environments is necessary, but it has not been widely researched. A qualitative case method using in-depth interviews investigated four successful, versus less successful, companies in turbulent versus stable environments. The results tentatively confirmed that the more successful company, in a turbulent market, sees distribution activities as less important than other aspects of the marketing mix, but uses them to stabilise customer relationships and to maintain distribution processes. These findings can benefit marketers by emphasising a new way to consider place activities. How marketers can be assisted, and suggestions for further research, are provided.

  5. Use of critical pathway models and log-normal frequency distributions for siting nuclear facilities

    International Nuclear Information System (INIS)

    Waite, D.A.; Denham, D.H.

    1975-01-01

    The advantages and disadvantages of potential sites for nuclear facilities are evaluated through the use of environmental pathway and log-normal distribution analysis. Environmental considerations of nuclear facility siting are necessarily geared to the identification of media believed to be significant in terms of dose to man or to be potential centres for long-term accumulation of contaminants. To aid in meeting the scope and purpose of this identification, an exposure pathway diagram must be developed. This type of diagram helps to locate pertinent environmental media, points of expected long-term contaminant accumulation, and points of population/contaminant interface for both radioactive and non-radioactive contaminants. Confirmation of facility siting conclusions drawn from pathway considerations must usually be derived from an investigatory environmental surveillance programme. Battelle's experience with environmental surveillance data interpretation using log-normal techniques indicates that this distribution has much to offer in the planning, execution and analysis phases of such a programme. How these basic principles apply to the actual siting of a nuclear facility is demonstrated for a centrifuge-type uranium enrichment facility as an example. A model facility is examined to the extent of available data in terms of potential contaminants and facility general environmental needs. A critical exposure pathway diagram is developed to the point of prescribing the characteristics of an optimum site for such a facility. Possible necessary deviations from climatic constraints are reviewed and reconciled with conclusions drawn from the exposure pathway analysis. Details of log-normal distribution analysis techniques are presented, with examples of environmental surveillance data to illustrate data manipulation techniques and interpretation procedures as they affect the investigatory environmental surveillance programme. 
Appropriate consideration is given these
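
Log-normal analysis of surveillance data conventionally reduces to two parameters, the geometric mean (GM) and geometric standard deviation (GSD); a minimal stdlib sketch (function name ours):

```python
import math
import statistics

def lognormal_summary(samples):
    """Geometric mean and geometric standard deviation of positive
    measurements: the two parameters that summarize a log-normally
    distributed surveillance data set."""
    logs = [math.log(x) for x in samples]
    gm = math.exp(statistics.mean(logs))
    gsd = math.exp(statistics.stdev(logs))
    return gm, gsd

# Screening percentiles follow directly; e.g. the 95th percentile of a
# log-normal is approximately GM * GSD ** 1.645.
gm, gsd = lognormal_summary([1.0, math.e, math.e ** 2])
```

Plotting the ordered data on log-probability paper, where a log-normal sample falls on a straight line, is the graphical counterpart of this fit.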

  6. Centralspindlin and Chromosomal Passenger Complex Behavior During Normal and Rappaport Furrow Specification in Echinoderm Embryos

    Science.gov (United States)

    Argiros, Haroula; Henson, Lauren; Holguin, Christiana; Foe, Victoria; Shuster, Charles Bradley

    2014-01-01

    The chromosomal passenger (CPC) and Centralspindlin complexes are essential for organizing the anaphase central spindle and providing cues that position the cytokinetic furrow between daughter nuclei. However, echinoderm zygotes are also capable of forming “Rappaport furrows” between asters positioned back-to-back without intervening chromosomes. To understand how these complexes contribute to normal and Rappaport furrow formation, we studied the localization patterns of Survivin and mitotic-kinesin-like-protein1 (MKLP1), members respectively of the CPC and the Centralspindlin complex, and the effect of CPC inhibition on cleavage in mono- and binucleate echinoderm zygotes. In zygotes, Survivin initially localized to metaphase chromosomes, upon anaphase onset relocalized to the central spindle and then, together with MKLP1 spread towards the equatorial cortex in an Aurora-dependent manner. Inhibition of Aurora kinase activity resulted in disruption of central spindle organization and furrow regression, although astral microtubule elongation and furrow initiation were normal. In binucleate cells containing two parallel spindles MKLP1 and Survivin localized to the plane of the former metaphase plate, but were not observed in the secondary cleavage plane formed between unrelated spindle poles, except when chromosomes were abnormally present there. However, the secondary furrow was sensitive to Aurora inhibition, indicating that Aurora kinase may still contribute to furrow ingression without chromosomes nearby. Our results provide insights that reconcile classic micromanipulation studies with current molecular understanding of furrow specification in animal cells. PMID:22887753

  7. Normal Approximations to the Distributions of the Wilcoxon Statistics: Accurate to What "N"? Graphical Insights

    Science.gov (United States)

    Bellera, Carine A.; Julien, Marilyse; Hanley, James A.

    2010-01-01

    The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…

  8. American Option Pricing using GARCH models and the Normal Inverse Gaussian distribution

    DEFF Research Database (Denmark)

    Stentoft, Lars Peter

    In this paper we propose a feasible way to price American options in a model with time varying volatility and conditional skewness and leptokurtosis using GARCH processes and the Normal Inverse Gaussian distribution. We show how the risk neutral dynamics can be obtained in this model, we interpret...... properties shows that there are important option pricing differences compared to the Gaussian case as well as to the symmetric special case. A large scale empirical examination shows that our model outperforms the Gaussian case for pricing options on three large US stocks as well as a major index...

  9. The distribution of YKL-40 in osteoarthritic and normal human articular cartilage

    DEFF Research Database (Denmark)

    Volck, B; Ostergaard, K; Johansen, J S

    1999-01-01

    YKL-40, also called human cartilage glycoprotein-39, is a major secretory protein of human chondrocytes in cell culture. YKL-40 mRNA is expressed by cartilage from patients with rheumatoid arthritis, but is not detectable in normal human cartilage. The aim was to investigate the distribution of YKL...... in chondrocytes of osteoarthritic cartilage mainly in the superficial and middle zone of the cartilage rather than the deep zone. There was a tendency for high number of YKL-40 positive chondrocytes in areas of the femoral head with a considerable biomechanical load. The number of chondrocytes with a positive...

  10. Change detection in polarimetric SAR images using complex Wishart distributed matrices

    DEFF Research Database (Denmark)

    Conradsen, Knut; Nielsen, Allan Aasbjerg; Skriver, Henning

    In surveillance it is important to be able to detect natural or man-made changes e.g. based on sequences of satellite or airborne images of the same area taken at different times. The mapping capability of synthetic aperture radar (SAR) is independent of e.g. cloud cover, and thus this technology...... scattering matrix, and after suitable preprocessing the outcome at each picture element (pixel) may be represented as a 3 by 3 Hermitian matrix following a complex Wishart distribution. One approach to solving the change detection problem based on SAR images is therefore to apply suitable statistical tests...... in the complex Wishart distribution. We propose a set-up for a systematic solution to the (practical) problems using the likelihood ratio test statistics. We show some examples based on a time series of images with 1024 by 1024 pixels....

  11. Differentiation in boron distribution in adult male and female rats' normal brain: A BNCT approach

    Energy Technology Data Exchange (ETDEWEB)

    Goodarzi, Samereh, E-mail: samere.g@gmail.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of); Pazirandeh, Ali, E-mail: paziran@yahoo.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of); Jameie, Seyed Behnamedin, E-mail: behnamjameie@tums.ac.ir [Basic Science Department, Faculty of Allied Medicine, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Department of Anatomy, Faculty of Medicine, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Baghban Khojasteh, Nasrin, E-mail: khojasteh_n@yahoo.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of)

    2012-06-15

    Boron distribution in adult male and female rats' normal brain after boron carrier injection (0.005 g boric acid + 0.005 g borax + 10 ml distilled water, pH 7.4) was studied in this research. Coronal sections of control and trial animal tissue samples were irradiated with thermal neutrons. Using alpha autoradiography, significant differences in boron concentration were seen in forebrain, midbrain and hindbrain sections of male and female animal groups, with the highest value four hours after boron compound injection. - Highlights: • Boron distribution in male and female rats' normal brain was studied in this research. • Coronal sections of animal tissue samples were irradiated with thermal neutrons. • Alpha and lithium tracks were counted using alpha autoradiography. • Different boron concentrations were seen in brain sections of male and female rats. • The highest boron concentration was seen 4 h after boron compound injection.

  12. What do we gain from simplicity versus complexity in species distribution models?

    Science.gov (United States)

    Merow, Cory; Smith, Matthew J.; Edwards, Thomas C.; Guisan, Antoine; McMahon, Sean M.; Normand, Signe; Thuiller, Wilfried; Wuest, Rafael O.; Zimmermann, Niklaus E.; Elith, Jane

    2014-01-01

    Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence–environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence–environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building ‘under fit’ models, having insufficient flexibility to describe observed occurrence–environment relationships, we risk misunderstanding the factors shaping species distributions. By building ‘over fit’ models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under fitting with over fitting and consequently how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species

  13. Determining Normal-Distribution Tolerance Bounds Graphically

    Science.gov (United States)

    Mezzacappa, M. A.

    1983-01-01

    The graphical method requires only a few calculations with simple equations, plus a table lookup. The distribution is established from only three points: the upper and lower confidence bounds of the mean and the lower confidence bound of the standard deviation. The graphical procedure establishes a best-fit line for the measured data and bounds for the selected confidence level and any distribution percentile.

  14. Optimal Output of Distributed Generation Based On Complex Power Increment

    Science.gov (United States)

    Wu, D.; Bao, H.

    2017-12-01

    To meet the growing demand for electricity and improve the cleanliness of power generation, new energy sources, represented by wind and photovoltaic generation, have been widely adopted. New energy generation connects to the distribution network in the form of distributed generation and is consumed by local load. However, as the scale of distributed generation connected to the network increases, optimizing its power output becomes more and more prominent and needs further study. Classical optimization methods often use the extended sensitivity method to obtain the relationship between different generators, but ignoring the coupling parameters between nodes makes the results inaccurate; heuristic algorithms also have defects such as slow calculation speed and uncertain outcomes. This article proposes a method called complex power increment, whose essence is the analysis of the power grid under steady-state power flow. From this analysis we obtain the complex scaling function equation between the power supplies; the coefficients of the equation are based on the impedance parameters of the network, so the relation of the variables to the coefficients is described more precisely. Thus, the method can accurately describe the power increment relationship and can obtain the power optimization scheme more accurately and quickly than the extended sensitivity method and heuristic methods.

  15. A comparison of LMC and SDL complexity measures on binomial distributions

    Science.gov (United States)

    Piqueira, José Roberto C.

    2016-02-01

    The concept of complexity has been widely discussed over the last forty years, with contributions from all areas of human knowledge, including philosophy, linguistics, history, biology, physics and chemistry, and with mathematicians trying to give it a rigorous formulation. In this vein, thermodynamics meets information theory: using the entropy definition, López-Ruiz, Mancini and Calbet proposed a definition of complexity referred to as the LMC measure. Shiner, Davison and Landsberg, by slightly changing the LMC definition, proposed the SDL measure, and both LMC and SDL are satisfactory measures of complexity for many problems. Here, the SDL and LMC measures are applied to a binomial probability distribution, to clarify how the length of the data set affects complexity and how the success probability of the repeated trials determines how complex the whole set is.
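
Under one common convention (unnormalized entropy for LMC; α = β = 1 for SDL — conventions vary across papers, so treat this as a sketch), the two measures can be computed for a binomial distribution as follows:

```python
import math

def binomial_pmf(n, p):
    """Probability mass function of Binomial(n, p) over k = 0..n."""
    return [math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]

def lmc_complexity(probs):
    """LMC measure: Shannon entropy H times disequilibrium D, where D
    is the squared distance from the uniform distribution."""
    m = len(probs)
    h = -sum(q * math.log(q) for q in probs if q > 0)
    d = sum((q - 1.0 / m) ** 2 for q in probs)
    return h * d

def sdl_complexity(probs, alpha=1.0, beta=1.0):
    """SDL measure: disorder**alpha * order**beta, with disorder
    Delta = H / H_max and order = 1 - Delta."""
    m = len(probs)
    h = -sum(q * math.log(q) for q in probs if q > 0)
    delta = h / math.log(m)
    return delta ** alpha * (1.0 - delta) ** beta
```

Both measures vanish at the extremes, for a fully ordered (degenerate) distribution and for the uniform one, and are positive in between, which is the defining property the abstract alludes to.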

  16. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering

    Science.gov (United States)

    Gamayunov, K. V.; Khazanov, G. V.

    2006-01-01

    calculate the pitch-angle diffusion coefficients using the typical wave normal distributions obtained from our self-consistent ring current-EMIC wave model, and try to quantify the effect of EMIC wave normal angle characteristics on relativistic electron scattering.

  17. Complex Interdependence Regulates Heterotypic Transcription Factor Distribution and Coordinates Cardiogenesis.

    Science.gov (United States)

    Luna-Zurita, Luis; Stirnimann, Christian U; Glatt, Sebastian; Kaynak, Bogac L; Thomas, Sean; Baudin, Florence; Samee, Md Abul Hassan; He, Daniel; Small, Eric M; Mileikovsky, Maria; Nagy, Andras; Holloway, Alisha K; Pollard, Katherine S; Müller, Christoph W; Bruneau, Benoit G

    2016-02-25

    Transcription factors (TFs) are thought to function with partners to achieve specificity and precise quantitative outputs. In the developing heart, heterotypic TF interactions, such as between the T-box TF TBX5 and the homeodomain TF NKX2-5, have been proposed as a mechanism for human congenital heart defects. We report extensive and complex interdependent genomic occupancy of TBX5, NKX2-5, and the zinc finger TF GATA4 coordinately controlling cardiac gene expression, differentiation, and morphogenesis. Interdependent binding serves not only to co-regulate gene expression but also to prevent TFs from distributing to ectopic loci and activate lineage-inappropriate genes. We define preferential motif arrangements for TBX5 and NKX2-5 cooperative binding sites, supported at the atomic level by their co-crystal structure bound to DNA, revealing a direct interaction between the two factors and induced DNA bending. Complex interdependent binding mechanisms reveal tightly regulated TF genomic distribution and define a combinatorial logic for heterotypic TF regulation of differentiation. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error-detection codes are mechanisms that enable robust delivery of data over unreliable, error-prone communication channels and devices; such codes allow these errors to be detected. There are two classes of error-detecting codes: classical codes and security-oriented codes. Classical codes detect a high percentage of errors, but have a high probability of missing an error introduced by algebraic manipulation. Security-oriented codes, in turn, are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes; a detailed study of this parameter allows analyzing the behavior of the error-correcting code when errors are injected into the encoding device. The complexity of the encoding function also plays an important role in security-oriented codes: encoding functions with low computational complexity and a low masking probability best protect the encoding device against malicious acts. This paper investigates the influence of encoding-function complexity on the error-masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error-masking probability, and that increasing the function complexity changes the error-masking probability distribution. In particular, increasing the computational complexity decreases the difference between the maximum and the average value of the error-masking probability. Our results show that functions of greater complexity have smoothed maxima of the error-masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, with a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach to measuring the error masking
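
Error masking itself is easy to demonstrate on a toy code. The brute-force check below (our illustration, not the paper's construction) counts the nonzero error patterns that map every codeword to another codeword; for a linear code this fraction is (|C| - 1) / (2**n - 1):

```python
from itertools import product

def masked_fraction(codewords):
    """Fraction of nonzero error patterns that are undetected for every
    codeword (i.e. masked): brute force over all 2**n - 1 errors."""
    n = len(next(iter(codewords)))
    cw = {tuple(c) for c in codewords}
    masked = 0
    total = 0
    for e in product((0, 1), repeat=n):
        if not any(e):
            continue
        total += 1
        # An error is masked if adding it to any codeword yields a codeword.
        if all(tuple(a ^ b for a, b in zip(c, e)) in cw for c in cw):
            masked += 1
    return masked / total

# Even-parity [3, 2] code: the 3 nonzero codewords are exactly the
# masked errors, so the masked fraction is 3/7.
parity = [(0, 0, 0), (1, 1, 0), (1, 0, 1), (0, 1, 1)]
```

Security-oriented (typically nonlinear) encodings aim to make this masking probability small and flat across error patterns, which is the quantity whose distribution the paper studies.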

  19. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    Science.gov (United States)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications for today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  20. Skewed Normal Distribution Of Return Assets In Call European Option Pricing

    Directory of Open Access Journals (Sweden)

    Evy Sulistianingsih

    2011-12-01

    Full Text Available An option is a security derivative. In a financial market, an option is a contract that gives its owner the right (not the obligation) to buy or sell a particular asset for a certain price at a certain time. An option can provide a guarantee against risks faced in the market. This paper studies the use of the skewed normal distribution (SN) in European call option pricing. The SN provides a flexible framework that captures the skewness of the log return. We obtain a closed-form solution for European call option pricing when the log return follows the SN. We then compare the option prices obtained from the SN and the Black-Scholes model with market option prices. Keywords: skewed normal distribution, log return, options.
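
The standard skew-normal density and a sampler can be sketched with the stdlib alone, using Azzalini's construction (parameter names ours; this is the generic SN family, not the paper's calibrated model):

```python
import math
import random
from statistics import NormalDist

def skew_normal_pdf(x, alpha):
    """Density of the standard skew-normal SN(alpha):
    2 * phi(x) * Phi(alpha * x)."""
    nd = NormalDist()
    return 2.0 * nd.pdf(x) * nd.cdf(alpha * x)

def skew_normal_sample(alpha, rng=random):
    """One draw from SN(alpha) via the classical |U0| construction:
    Z = delta*|U0| + sqrt(1 - delta**2)*U1 with delta = alpha/sqrt(1+alpha**2)."""
    delta = alpha / math.sqrt(1.0 + alpha * alpha)
    u0 = rng.gauss(0.0, 1.0)
    u1 = rng.gauss(0.0, 1.0)
    return delta * abs(u0) + math.sqrt(1.0 - delta * delta) * u1
```

The mean of SN(alpha) is delta * sqrt(2/pi), so positive alpha tilts the log-return distribution to the right, which is exactly the asymmetry the Black-Scholes model cannot represent.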

  1. Reply to: Are There More Gifted People than Would Be Expected on a Normal Distribution?

    Science.gov (United States)

    Gallagher, James J.

    2014-01-01

    The author responds to the article by Warne, Godwin, and Smith (2013) on the question of whether there are more gifted people than would be expected in a Gaussian normal distribution. He asserts that the answer to this question is yes, based on (a) data that he and his colleagues have collected, (b) data that are already available and quoted by…

  2. Log-Normal Distribution in a Growing System with Weighted and Multiplicatively Interacting Particles

    Science.gov (United States)

    Fujihara, Akihiro; Tanimoto, Satoshi; Yamamoto, Hiroshi; Ohtsuki, Toshiya

    2018-03-01

    A growing system with weighted and multiplicatively interacting particles is investigated. Each particle has a quantity that changes multiplicatively after a binary interaction, with its growth rate controlled by a weight parameter in a homogeneous symmetric kernel. We consider the system using moment inequalities and analytically derive the log-normal-type tail in the probability distribution function of quantities when the parameter is negative, which is different from the result for single-body multiplicative processes. We also find that the system approaches a winner-take-all state when the parameter is positive.
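
The single-body baseline the abstract contrasts with is easy to simulate: when a quantity is multiplied by an independent random factor at each step, its logarithm performs a random walk, so the quantity itself tends to a log-normal distribution (a generic illustration, not the paper's interacting-particle model):

```python
import math
import random

def multiplicative_walk(steps, rng):
    """Single-particle multiplicative process: the quantity is multiplied
    by a random factor exp(N(0, 0.1)) at each step, so log(x) performs
    a Gaussian random walk."""
    x = 1.0
    for _ in range(steps):
        x *= math.exp(rng.gauss(0.0, 0.1))
    return x

# After 100 steps, log(x) is approximately N(0, 0.1 * sqrt(100)) = N(0, 1),
# i.e. x itself is approximately log-normal.
rng = random.Random(1)
logs = [math.log(multiplicative_walk(100, rng)) for _ in range(2000)]
```

The paper's point is that binary interactions with a weight-dependent kernel modify this picture, producing a log-normal-type tail rather than the full log-normal of the single-body case.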

  3. Very short-term probabilistic forecasting of wind power with generalized logit-Normal distributions

    DEFF Research Database (Denmark)

    Pinson, Pierre

    2012-01-01

    and probability masses at the bounds. Both auto-regressive and conditional parametric auto-regressive models are considered for the dynamics of their location and scale parameters. Estimation is performed in a recursive least squares framework with exponential forgetting. The superiority of this proposal over......Very-short-term probabilistic forecasts, which are essential for an optimal management of wind generation, ought to account for the non-linear and double-bounded nature of that stochastic process. They take here the form of discrete–continuous mixtures of generalized logit–normal distributions...
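
The generalized logit transform at the heart of such models maps the double-bounded power output x in (0, 1) to the whole real line, where a normal model is plausible. A sketch of the forward and inverse maps (ν is the shape exponent; our notation):

```python
import math

def glogit(x, nu=1.0):
    """Generalized logit: maps x in (0, 1) to the real line,
    y = log(x**nu / (1 - x**nu))."""
    return math.log(x ** nu / (1.0 - x ** nu))

def glogit_inv(y, nu=1.0):
    """Inverse of the generalized logit transform."""
    return (math.exp(y) / (1.0 + math.exp(y))) ** (1.0 / nu)
```

Forecast densities are fitted for y and pushed back through `glogit_inv`, with discrete probability masses added at the bounds 0 and 1 to handle cut-out and full-output regimes.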

  4. Bicervical normal uterus with normal vagina | Okeke | Annals of ...

    African Journals Online (AJOL)

    To the best of our knowledge, only a few cases of bicervical normal uterus with normal vagina exist in the literature; one of the cases had an anterior-posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of mullerian anomalies and suggests that a complex interplay of events ...

  5. Spatio-temporal precipitation climatology over complex terrain using a censored additive regression model.

    Science.gov (United States)

    Stauffer, Reto; Mayr, Georg J; Messner, Jakob W; Umlauf, Nikolaus; Zeileis, Achim

    2017-06-15

    Flexible spatio-temporal models are widely used to create reliable and accurate estimates for precipitation climatologies. Most models are based on square root transformed monthly or annual means, where a normal distribution seems to be appropriate. This assumption becomes invalid on a daily time scale as the observations involve large fractions of zero observations and are limited to non-negative values. We develop a novel spatio-temporal model to estimate the full climatological distribution of precipitation on a daily time scale over complex terrain using a left-censored normal distribution. The results demonstrate that the new method is able to account for the non-normal distribution and the large fraction of zero observations. The new climatology provides the full climatological distribution on a very high spatial and temporal resolution, and is competitive with, or even outperforms existing methods, even for arbitrary locations.
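
The left-censored normal idea can be made concrete with its likelihood: dry days contribute the probability mass below zero, wet days the normal density (a Tobit-style sketch of the distributional assumption, not the paper's full additive regression model):

```python
import math
from statistics import NormalDist

def censored_normal_loglik(mu, sigma, observations):
    """Log-likelihood of daily precipitation under a normal distribution
    left-censored at zero: dry days (x == 0) contribute log P(X <= 0),
    wet days contribute the log of the normal density."""
    nd = NormalDist(mu, sigma)
    ll = 0.0
    for x in observations:
        if x <= 0.0:
            ll += math.log(nd.cdf(0.0))
        else:
            ll += math.log(nd.pdf(x))
    return ll
```

Maximizing this likelihood over mu and sigma (made functions of space and season in the paper) accounts simultaneously for the zero fraction and the wet-day amounts.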

  6. Type I error rates of rare single nucleotide variants are inflated in tests of association with non-normally distributed traits using simple linear regression methods.

    Science.gov (United States)

    Schwantes-An, Tae-Hwi; Sung, Heejong; Sabourin, Jeremy A; Justice, Cristina M; Sorant, Alexa J M; Wilson, Alexander F

    2016-01-01

    In this study, the effects of (a) the minor allele frequency of the single nucleotide variant (SNV), (b) the degree of departure from normality of the trait, and (c) the position of the SNVs on type I error rates were investigated in the Genetic Analysis Workshop (GAW) 19 whole exome sequence data. To test the distribution of the type I error rate, 5 simulated traits were considered: standard normal and gamma distributed traits; 2 transformed versions of the gamma trait (log 10 and rank-based inverse normal transformations); and trait Q1 provided by GAW 19. Each trait was tested with 313,340 SNVs. Tests of association were performed with simple linear regression and average type I error rates were determined for minor allele frequency classes. Rare SNVs (minor allele frequency < 0.05) showed inflated type I error rates for non-normally distributed traits that increased as the minor allele frequency decreased. The inflation of average type I error rates increased as the significance threshold decreased. Normally distributed traits did not show inflated type I error rates with respect to the minor allele frequency for rare SNVs. There was no consistent effect of transformation on the uniformity of the distribution of the location of SNVs with a type I error.
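    One of the transformations tested above, the rank-based inverse normal transformation, is easy to sketch. The example below uses the Blom offset c = 3/8, which is one common convention and an assumption here, not necessarily the GAW 19 choice; it shows a strongly skewed gamma trait becoming essentially symmetric after transformation.

```python
import numpy as np
from scipy import stats

def rank_inverse_normal(y, c=3.0 / 8.0):
    """Rank-based inverse normal transform (Blom offset; offset choice is an assumption)."""
    ranks = stats.rankdata(y)
    return stats.norm.ppf((ranks - c) / (len(y) - 2.0 * c + 1.0))

rng = np.random.default_rng(0)
trait = rng.gamma(shape=0.5, scale=1.0, size=2000)  # strongly right-skewed trait
transformed = rank_inverse_normal(trait)

# Skewness before (large, positive) and after (essentially zero)
print(stats.skew(trait), stats.skew(transformed))
```

    Because the transform maps ranks to normal quantiles, the transformed values are symmetric by construction, which is why such transformations can restore nominal type I error rates for tests that assume normal residuals.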

  7. Diverse complexities, complex diversities: Resisting ‘normal science’ in pedagogical and research methodologies. A perspective from Aotearoa (New Zealand)

    Directory of Open Access Journals (Sweden)

    Ritchie Jenny

    2016-06-01

    This paper offers an overview of the complexities of the contexts for education in Aotearoa, which include the need to recognise and include Māori (Indigenous) perspectives, but also to extend this inclusion to the context of increasing ethnic diversity. These complexities include worsening disparities between rich and poor, which disproportionately position Māori and those from Pacific Island backgrounds in situations of poverty. It then offers a brief critique of government policies before providing some examples of models that resist ‘normal science’ categorisations. These include: the Māori values underpinning the effective teachers’ profile of the Kotahitanga project and of the Māori assessment model for early childhood education; the dispositions identified in a Samoan model for assessing young children’s learning; and the approach developed for assessing Māori children’s literacy and numeracy within schools where Māori language is the medium of instruction. These models all position learning within culturally relevant frames that are grounded in non-Western onto-epistemologies which include spiritual, cultural, and collective aspirations.

  8. Exact, time-independent estimation of clone size distributions in normal and mutated cells.

    Science.gov (United States)

    Roshan, A; Jones, P H; Greenman, C D

    2014-10-06

    Biological tools such as genetic lineage tracing, three-dimensional confocal microscopy and next-generation DNA sequencing are providing new ways to quantify the distribution of clones of normal and mutated cells. Understanding population-wide clone size distributions in vivo is complicated by multiple cell types within observed tissues, and overlapping birth and death processes. This has led to the increased need for mathematically informed models to understand their biological significance. Standard approaches usually require knowledge of clonal age. We show that modelling on clone size independent of time is an alternative method that offers certain analytical advantages; it can help parametrize these models, and obtain distributions for counts of mutated or proliferating cells, for example. When applied to a general birth-death process common in epithelial progenitors, this takes the form of a gambler's ruin problem, the solution of which relates to counting Motzkin lattice paths. Applying this approach to mutational processes, alternative, exact, formulations of classic Luria-Delbrück-type problems emerge. This approach can be extended beyond neutral models of mutant clonal evolution. Applications of these approaches are twofold. First, we resolve the probability of progenitor cells generating proliferating or differentiating progeny in clonal lineage tracing experiments in vivo or cell culture assays where clone age is not known. Second, we model mutation frequency distributions that deep sequencing of subclonal samples produce.
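    The gambler's-ruin picture mentioned above has a classic closed-form check: a critical birth-death process started from a single cell reaches size N before extinction with probability 1/N. A minimal Monte Carlo sketch (parameters illustrative, not fitted to the paper's data) reproduces this.

```python
import numpy as np

rng = np.random.default_rng(1)

def reach_probability(cap, trials=20000):
    """Fraction of clones, each started from one progenitor, that grow to size
    `cap` before dying out, when every event is a birth or a death with equal
    probability -- the gambler's-ruin view of a critical birth-death process."""
    hits = 0
    for _ in range(trials):
        n = 1
        while 0 < n < cap:
            n += 1 if rng.random() < 0.5 else -1
        hits += n == cap
    return hits / trials

p4 = reach_probability(4)  # ruin theory predicts 1/4
p8 = reach_probability(8)  # ruin theory predicts 1/8
print(p4, p8)
```

    This time-independent survival probability is the simplest instance of the clone-size-without-clone-age reasoning the abstract describes.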

  9. Log-Normality and Multifractal Analysis of Flame Surface Statistics

    Science.gov (United States)

    Saha, Abhishek; Chaudhuri, Swetaprovo; Law, Chung K.

    2013-11-01

    The turbulent flame surface is typically highly wrinkled and folded at a multitude of scales controlled by various flame properties. It is useful if the information contained in this complex geometry can be projected onto a simpler regular geometry for the use of spectral, wavelet or multifractal analyses. Here we investigate local flame surface statistics of a turbulent flame expanding under constant pressure. First, the statistics of the local length ratio are experimentally obtained from high-speed Mie scattering images. For a spherically expanding flame, the length ratio on the measurement plane, at predefined equiangular sectors, is defined as the ratio of the actual flame length to the length of a circular arc of radius equal to the average radius of the flame. Assuming isotropic distribution of such flame segments, we convolute suitable forms of the length-ratio probability distribution functions (pdfs) to arrive at corresponding area-ratio pdfs. Both pdfs are found to be near log-normally distributed and show self-similar behavior with increasing radius. Near log-normality and the rather intermittent behavior of the flame-length ratio suggest similarity with dissipation-rate quantities, which motivates multifractal analysis. Currently at Indian Institute of Science, India.

  10. Comparative pharmacokinetic and tissue distribution profiles of four major bioactive components in normal and hepatic fibrosis rats after oral administration of Fuzheng Huayu recipe.

    Science.gov (United States)

    Yang, Tao; Liu, Shan; Wang, Chang-Hong; Tao, Yan-Yan; Zhou, Hua; Liu, Cheng-Hai

    2015-10-10

    Fuzheng Huayu recipe (FZHY) is a herbal product for the treatment of liver fibrosis approved by the Chinese State Food and Drug Administration (SFDA), but its pharmacokinetics and tissue distribution had not been investigated. In this study, the liver fibrotic model was induced with intraperitoneal injection of dimethylnitrosamine (DMN), and FZHY was given orally to the model and normal rats. The plasma pharmacokinetics and tissue distribution profiles of four major bioactive components from FZHY were analyzed in the normal and fibrotic rat groups using an ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method. Results revealed that the bioavailabilities of danshensu (DSS), salvianolic acid B (SAB) and rosmarinic acid (ROS) in liver fibrotic rats increased 1.49-, 3.31- and 2.37-fold, respectively, compared to normal rats. There was no obvious difference in the pharmacokinetics of amygdalin (AMY) between the normal and fibrotic rats. DSS, SAB, and AMY tended to distribute mostly to the kidney and lung. The distribution of DSS, SAB, and AMY in liver tissue of the model rats was significantly decreased compared to the normal rats. Significant differences in the pharmacokinetics and tissue distribution profiles of DSS, ROS, SAB and AMY were observed in rats with hepatic fibrosis after oral administration of FZHY. These results provide a meaningful basis for developing a clinical dosage regimen in the treatment of hepatic fibrosis by FZHY. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. The Italian primary school-size distribution and the city-size: a complex nexus

    Science.gov (United States)

    Belmonte, Alessandro; di Clemente, Riccardo; Buldyrev, Sergey V.

    2014-06-01

    We characterize the statistical law according to which Italian primary school sizes are distributed. We find that the school size can be approximated by a log-normal distribution, with a fat lower tail that collects a large number of very small schools. The upper tail of the school-size distribution decreases exponentially and the growth rates are distributed with a Laplace PDF. These distributions are similar to those observed for firms and are consistent with a Bose-Einstein preferential attachment process. The body of the distribution features a bimodal shape, suggesting some source of heterogeneity in the school organization that we uncover by an in-depth analysis of the relation between school size and city size. We propose a novel cluster methodology and a new spatial interaction approach among schools which outline the variety of policies implemented in Italy. Different regional policies are also discussed, shedding light on the relation between policy and geographical features.
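    The two distributional claims in this record, a log-normal body for sizes and Laplace-distributed growth rates, can be checked with elementary estimators. This sketch uses synthetic data with illustrative parameters (not the Italian school data): log-normal parameters are the mean and standard deviation of log-sizes, and the Laplace MLE is the median (location) and the mean absolute deviation from it (scale).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "school sizes": log-normal body (parameters illustrative only)
sizes = rng.lognormal(mean=5.0, sigma=0.7, size=20000)
mu_hat, sigma_hat = np.log(sizes).mean(), np.log(sizes).std()

# Synthetic "growth rates": Laplace; the MLE is median + mean absolute deviation
growth = rng.laplace(loc=0.0, scale=0.1, size=20000)
loc_hat = np.median(growth)
scale_hat = np.abs(growth - loc_hat).mean()

print(mu_hat, sigma_hat, loc_hat, scale_hat)
```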

  12. Software Application Profile: RVPedigree: a suite of family-based rare variant association tests for normally and non-normally distributed quantitative traits.

    Science.gov (United States)

    Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt

    2016-04-01

    RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features here are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and also to appropriately adjust for related individuals, either from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally-intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  13. Distribution of separated energy and injected charge at normal falling of fast electron beam on target

    CERN Document Server

    Smolyar, V A; Eremin, V V

    2002-01-01

    In terms of a kinetic-equation diffusion model for a beam of electrons falling on a target along the normal, analytical formulae are derived for the distributions of separated energy and injected charge. No empirical adjustable parameters are introduced into the theory. The calculated distributions of separated energy for an electron plane directed source within an infinite medium, for C, Al, Sn and Pb, are in good agreement with the Spencer data derived from an accurate solution of the Bethe equation, which is the source equation in the diffusion-model approximation.

  14. Distribution of separated energy and injected charge at normal falling of fast electron beam on target

    International Nuclear Information System (INIS)

    Smolyar, V.A.; Eremin, A.V.; Eremin, V.V.

    2002-01-01

    In terms of a kinetic-equation diffusion model for a beam of electrons falling on a target along the normal, analytical formulae are derived for the distributions of separated energy and injected charge. No empirical adjustable parameters are introduced into the theory. The calculated distributions of separated energy for an electron plane directed source within an infinite medium, for C, Al, Sn and Pb, are in good agreement with the Spencer data derived from an accurate solution of the Bethe equation, which is the source equation in the diffusion-model approximation.

  15. Tumor and normal tissue responses to fractioned non-uniform dose delivery

    Energy Technology Data Exchange (ETDEWEB)

    Kaellman, P; Aegren, A; Brahme, A [Karolinska Inst., Stockholm (Sweden). Dept. of Radiation Physics

    1996-08-01

    The volume dependence of the radiation response of a tumor is straightforward to quantify because it depends primarily on the eradication of all its clonogenic cells. A tumor therefore has a parallel organization, as any surviving clonogen can in principle repopulate the tumor. The difficulty with the tumor response is instead to know the density and sensitivity distribution of the most resistant clonogenic cells. The increase in the 50% tumor control dose and the decrease in the maximum normalized slope of the dose-response relation, γ, in the presence of small compartments of resistant tumor cells have therefore been quantified to describe their influence on the dose-response relation. Injury to normal tissue is a much more complex and gradual process. It depends on earlier effects induced long before depletion of the differentiated and clonogenic cells, which in addition may have a complex structural and functional organization. The volume dependence of the dose-response relation of normal tissues is therefore described here by the relative seriality, s, of the infrastructure of the organ. The model can also be generalized to describe the response of heterogeneous tissues to non-uniform dose distributions. The new model is compared with clinical and experimental data on normal tissue response, and shows good agreement both with regard to the shape of the dose-response relation and the volume dependence of the isoeffect dose. The responses of tumors and normal tissues are quantified for arbitrary dose fractionations using the linear-quadratic cell survival parameters α and β. The parameters of the dose-response relation are derived both for a constant dose per fraction and for a constant number of dose fractions, in the latter case accounting also for non-uniform dose delivery. (author). 26 refs, 4 figs.

  16. Real-time generation of the Wigner distribution of complex functions using phase conjugation in photorefractive materials.

    Science.gov (United States)

    Sun, P C; Fainman, Y

    1990-09-01

    An optical processor for real-time generation of the Wigner distribution of complex amplitude functions is introduced. The phase conjugation of the input signal is accomplished by a highly efficient self-pumped phase conjugator based on a 45°-cut barium titanate photorefractive crystal. Experimental results on the real-time generation of Wigner distribution slices for complex-amplitude two-dimensional optical functions are presented and discussed.
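    The Wigner distribution that the optical processor generates can also be computed numerically. Below is a minimal discrete sketch for a 1-D complex signal, using the standard auto-Wigner definition (this is a generic numerical illustration, not the paper's optical setup): each time column is the FFT over the lag variable of s[n+τ]·s*[n−τ].

```python
import numpy as np

def wigner(sig):
    """Discrete Wigner distribution W[k, n] of a 1-D complex signal."""
    sig = np.asarray(sig, dtype=complex)
    N = len(sig)
    W = np.empty((N, N))
    for n in range(N):
        tau_max = min(n, N - 1 - n)            # lags that stay inside the signal
        taus = np.arange(-tau_max, tau_max + 1)
        kernel = np.zeros(N, dtype=complex)
        kernel[taus % N] = sig[n + taus] * np.conj(sig[n - taus])
        W[:, n] = np.fft.fft(kernel).real      # Hermitian kernel -> real spectrum
    return W

# Chirped Gaussian test signal
t = np.linspace(-4, 4, 128)
s = np.exp(-t**2) * np.exp(1j * 2.0 * t**2)
W = wigner(s)

# Sanity check: summing over the frequency axis recovers N * |s[n]|^2 exactly
print(np.allclose(W.sum(axis=0), len(s) * np.abs(s) ** 2))
```

    The marginal property checked at the end is one of the defining features that makes the Wigner distribution useful as a joint space-frequency representation.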

  17. Control of complex dynamics and chaos in distributed parameter systems

    Energy Technology Data Exchange (ETDEWEB)

    Chakravarti, S.; Marek, M.; Ray, W.H. [Univ. of Wisconsin, Madison, WI (United States)

    1995-12-31

    This paper discusses a methodology for controlling complex dynamics and chaos in distributed parameter systems. The reaction-diffusion system with Brusselator kinetics, where the torus-doubling or quasi-periodic (two characteristic incommensurate frequencies) route to chaos exists in a defined range of parameter values, is used as an example. Poincare maps are used for characterization of quasi-periodic and chaotic attractors. The dominant modes or topos, which are inherent properties of the system, are identified by means of the singular value decomposition. Tested modal feedback control schemes based on the identified dominant spatial modes confirm the possibility of stabilizing simple quasi-periodic trajectories in the complex quasi-periodic or chaotic spatiotemporal patterns.
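    The mode-identification step named above, extracting dominant spatial modes from space-time data via the singular value decomposition, can be sketched on synthetic data (the reaction-diffusion system itself is not reproduced here; the spatial modes and amplitudes are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic space-time snapshots: two spatial modes with time-varying
# amplitudes, plus a little noise (columns are snapshots in time)
x = np.linspace(0.0, 1.0, 200)   # space
t = np.linspace(0.0, 10.0, 300)  # time
mode1 = np.sin(np.pi * x)
mode2 = np.sin(2.0 * np.pi * x)
data = (np.outer(mode1, np.cos(t))
        + 0.3 * np.outer(mode2, np.sin(3.0 * t))
        + 0.01 * rng.standard_normal((200, 300)))

# SVD: columns of U are the dominant spatial modes ("topos"),
# and the squared singular values rank their energy content.
U, sv, Vt = np.linalg.svd(data, full_matrices=False)
energy = sv**2 / (sv**2).sum()
print(energy[:3])  # the first two modes capture nearly all the variance
```

    In a modal feedback scheme, the state would then be projected onto these leading columns of U and the controller designed in that low-dimensional coordinate system.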

  18. The Weight of Euro Coins: Its Distribution Might Not Be as Normal as You Would Expect

    Science.gov (United States)

    Shkedy, Ziv; Aerts, Marc; Callaert, Herman

    2006-01-01

    Classical regression models, ANOVA models and linear mixed models are just three examples (out of many) in which the normal distribution of the response is an essential assumption of the model. In this paper we use a dataset of 2000 euro coins containing information (up to the milligram) about the weight of each coin, to illustrate that the…

  19. Moments of generalized Husimi distributions and complexity of many-body quantum states

    International Nuclear Information System (INIS)

    Sugita, Ayumu

    2003-01-01

    We consider generalized Husimi distributions for many-body systems, and show that their moments are good measures of complexity of many-body quantum states. Our construction of the Husimi distribution is based on the coherent state of the single-particle transformation group. Then the coherent states are independent-particle states, and, at the same time, the most localized states in the Husimi representation. Therefore delocalization of the Husimi distribution, which can be measured by the moments, is a sign of many-body correlation (entanglement). Since the delocalization of the Husimi distribution is also related to chaoticity of the dynamics, it suggests a relation between entanglement and chaos. Our definition of the Husimi distribution can be applied not only to systems of distinguishable particles, but also to those of identical particles, i.e., fermions and bosons. We derive an algebraic formula to evaluate the moments of the Husimi distribution

  20. The role of activity complexes in the distribution of solar magnetic fields.

    Science.gov (United States)

    García de La Rosa, J. I.; Reyes, R. C.

    Using published data on the large-scale distribution of solar activity, the authors conclude that the long-lived coronal holes are formed and maintained by the unbalanced magnetic flux which develops at both extremes of the complexes of activity.

  1. Soil Temperature Variability in Complex Terrain Measured Using Fiber-Optic Distributed Temperature Sensing

    Science.gov (United States)

    Seyfried, M. S.; Link, T. E.

    2013-12-01

    Soil temperature (Ts) exerts critical environmental controls on hydrologic and biogeochemical processes. Rates of carbon cycling, mineral weathering, infiltration and snow melt are all influenced by Ts. Although broadly reflective of the climate, Ts is sensitive to local variations in cover (vegetative, litter, snow), topography (slope, aspect, position), and soil properties (texture, water content), resulting in a spatially and temporally complex distribution of Ts across the landscape. Understanding and quantifying the processes controlled by Ts requires an understanding of that distribution. Relatively few spatially distributed field Ts data exist, partly because traditional Ts data are point measurements. A relatively new technology, fiber-optic distributed temperature sensing (FO-DTS), has the potential to provide such data but has not been rigorously evaluated in the context of remote, long-term field research. We installed FO-DTS in a small experimental watershed in the Reynolds Creek Experimental Watershed (RCEW) in the Owyhee Mountains of SW Idaho. The watershed is characterized by complex terrain and a seasonal snow cover. Our objectives are to: (i) evaluate the applicability of fiber-optic DTS to remote field environments and (ii) describe the spatial and temporal variability of soil temperature in complex terrain influenced by a variable snow cover. We installed fiber-optic cable at a depth of 10 cm in contrasting snow accumulation and topographic environments and monitored temperature along 750 m with DTS. We found that the DTS can provide accurate Ts data (±0.4°C) that resolve Ts changes of about 0.03°C at a spatial scale of 1 m with occasional calibration, under conditions with an ambient temperature range of 50°C. We note that there are site-specific limitations related to cable installation and destruction by local fauna. The FO-DTS provides unique insight into the spatial and temporal variability of Ts in a landscape. We found strong seasonal

  2. Probabilistic modeling using bivariate normal distributions for identification of flow and displacement intervals in longwall overburden

    Energy Technology Data Exchange (ETDEWEB)

    Karacan, C.O.; Goodman, G.V.R. [NIOSH, Pittsburgh, PA (United States). Off Mine Safety & Health Research

    2011-01-15

    Gob gas ventholes (GGV) are used to control methane emissions in longwall mines by capturing it within the overlying fractured strata before it enters the work environment. In order for GGVs to effectively capture more methane and less mine air, the length of the slotted sections and their proximity to top of the coal bed should be designed based on the potential gas sources and their locations, as well as the displacements in the overburden that will create potential flow paths for the gas. In this paper, an approach to determine the conditional probabilities of depth-displacement, depth-flow percentage, depth-formation and depth-gas content of the formations was developed using bivariate normal distributions. The flow percentage, displacement and formation data as a function of distance from coal bed used in this study were obtained from a series of borehole experiments contracted by the former US Bureau of Mines as part of a research project. Each of these parameters was tested for normality and was modeled using bivariate normal distributions to determine all tail probabilities. In addition, the probability of coal bed gas content as a function of depth was determined using the same techniques. The tail probabilities at various depths were used to calculate conditional probabilities for each of the parameters. The conditional probabilities predicted for various values of the critical parameters can be used with the measurements of flow and methane percentage at gob gas ventholes to optimize their performance.
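    The conditional probabilities described in this record follow from the standard bivariate-normal conditioning formula: X2 | X1 = x1 is normal with mean μ2 + ρ(σ2/σ1)(x1 − μ1) and variance σ2²(1 − ρ²). The sketch below uses hypothetical parameter values for a (depth, displacement) pair; the actual values would come from the borehole data described in the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical bivariate-normal parameters for (depth, displacement)
mu = np.array([100.0, 40.0])   # means
sd = np.array([30.0, 10.0])    # standard deviations
rho = 0.6                      # correlation

def tail_given_depth(depth, threshold):
    """P(displacement > threshold | depth) under the bivariate normal model."""
    cond_mean = mu[1] + rho * sd[1] / sd[0] * (depth - mu[0])
    cond_sd = sd[1] * np.sqrt(1.0 - rho**2)
    return stats.norm.sf(threshold, loc=cond_mean, scale=cond_sd)

print(tail_given_depth(100.0, 40.0))  # at the means this is exactly 0.5
print(tail_given_depth(130.0, 40.0))  # larger depth raises the conditional mean,
                                      # so the tail probability exceeds 0.5
```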

  3. Rough Sets and Stomped Normal Distribution for Simultaneous Segmentation and Bias Field Correction in Brain MR Images.

    Science.gov (United States)

    Banerjee, Abhirup; Maji, Pradipta

    2015-12-01

    The segmentation of brain MR images into different tissue classes is an important task for automatic image analysis techniques, particularly due to the presence of intensity inhomogeneity artifact in MR images. In this regard, this paper presents a novel approach for simultaneous segmentation and bias field correction in brain MR images. It integrates judiciously the concept of rough sets and the merit of a novel probability distribution, called stomped normal (SN) distribution. The intensity distribution of a tissue class is represented by SN distribution, where each tissue class consists of a crisp lower approximation and a probabilistic boundary region. The intensity distribution of brain MR image is modeled as a mixture of finite number of SN distributions and one uniform distribution. The proposed method incorporates both the expectation-maximization and hidden Markov random field frameworks to provide an accurate and robust segmentation. The performance of the proposed approach, along with a comparison with related methods, is demonstrated on a set of synthetic and real brain MR images for different bias fields and noise levels.
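    The SN distribution is the authors' own construct, but the expectation-maximization machinery the method builds on can be sketched with an ordinary two-component Gaussian mixture on synthetic one-dimensional "intensities" (a deliberate simplification: no stomped plateau, no rough-set approximations, no Markov random field).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic 1-D intensities from two "tissue classes" (values illustrative only)
x = np.concatenate([rng.normal(30.0, 4.0, 600), rng.normal(70.0, 6.0, 400)])

def em_gaussian_mixture(x, k=2, iters=200):
    """Plain EM for a k-component 1-D Gaussian mixture."""
    mu = np.quantile(x, np.linspace(0.2, 0.8, k))  # spread-out initial means
    sd = np.full(k, x.std())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = np.stack([w[j] * stats.norm.pdf(x, mu[j], sd[j]) for j in range(k)])
        r = dens / dens.sum(axis=0)
        # M-step: re-estimate weights, means and standard deviations
        nk = r.sum(axis=1)
        w = nk / len(x)
        mu = (r * x).sum(axis=1) / nk
        sd = np.sqrt((r * (x - mu[:, None]) ** 2).sum(axis=1) / nk)
    return w, mu, sd

w_hat, mu_hat, sd_hat = em_gaussian_mixture(x)
print(w_hat, mu_hat, sd_hat)  # means should land near 30 and 70
```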

  4. Universality of many-body two-nucleon momentum distributions: Correlated nucleon spectral function of complex nuclei

    Science.gov (United States)

    Ciofi degli Atti, Claudio; Morita, Hiko

    2017-12-01

    Background: The nuclear spectral function is a fundamental quantity that describes the mean-field and short-range correlation dynamics of nucleons embedded in the nuclear medium; its knowledge is a prerequisite for the interpretation of various electroweak scattering processes off nuclear targets aimed at providing fundamental information on strong and weak interactions. Whereas in the case of the three-nucleon and, partly, the four-nucleon systems, the spectral function can be calculated ab initio within a nonrelativistic many-body Schroedinger approach, in the case of complex nuclei only models of the correlated, high-momentum part of the spectral function are available so far. Purpose: The purpose of this paper is to present a new approach such that the spectral function for a specific nucleus can be obtained from a reliable many-body calculation based upon realistic nucleon-nucleon interactions, thus avoiding approximations leading to adjustable parameters. Methods: The expectation value of the nuclear many-body Hamiltonian, containing realistic nucleon-nucleon interaction of the Argonne family, is evaluated variationally by a normalization-conserving linked-cluster expansion and the resulting many-body correlated wave functions are used to calculate the one-nucleon and the two-nucleon momentum distributions; by analyzing the high-momentum behavior of the latter, the spectral function can be expressed in terms of a transparent convolution formula involving the relative and center-of-mass (c.m.) momentum distributions in specific regions of removal energy E and momentum k . Results: It is found that as a consequence of the factorization of the many-body wave functions at short internucleon separations, the high-momentum behavior of the two-nucleon momentum distributions in A =3 ,4 ,12 ,16 ,40 nuclei factorizes, at proper values of the relative and c.m. momenta, into the c.m. and relative momentum distributions, with the latter exhibiting a universal A

  5. The origin of human complex diversity: Stochastic epistatic modules and the intrinsic compatibility between distributional robustness and phenotypic changeability.

    Science.gov (United States)

    Ijichi, Shinji; Ijichi, Naomi; Ijichi, Yukina; Imamura, Chikako; Sameshima, Hisami; Kawaike, Yoichi; Morioka, Hirofumi

    2018-01-01

    The continuing prevalence of a highly heritable and hypo-reproductive extreme tail of a human neurobehavioral quantitative diversity suggests the possibility that the reproductive majority retains the genetic mechanism for the extremes. From the perspective of stochastic epistasis, the effect of an epistatic modifier variant can randomly vary in both phenotypic value and effect direction among the carriers depending on the genetic individuality, and the modifier carriers are ubiquitous in the population distribution. The neutrality of the mean genetic effect in the carriers warrants the survival of the variant under selection pressures. Functionally or metabolically related modifier variants make an epistatic network module, and dozens of modules may be involved in the phenotype. To assess the significance of stochastic epistasis, a simplified module-based model was employed. The individual repertoire of the modifier variants in a module also participates in the genetic individuality which determines the genetic contribution of each modifier in the carrier. Because the entire contribution of a module to the phenotypic outcome is consequently unpredictable in the model, the module effect represents the total contribution of the related modifiers as a stochastic unit in the simulations. As a result, the intrinsic compatibility between distributional robustness and quantitative changeability could be simulated mathematically using the model. The artificial normal distribution shape in large-sized simulations was preserved in each generation even if the lowest-fitness tail was un-reproductive. The robustness of normality across generations is analogous to the real situations of human complex diversity, including neurodevelopmental conditions. The repeated regeneration of the un-reproductive extreme tail may be inevitable for the reproductive majority's competence to survive and change, suggesting implications of the extremes for others. Further model-simulations to

  6. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation … and load. This fact increases the number of stochastic inputs, and dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  7. Distributed optimization-based control of multi-agent networks in complex environments

    CERN Document Server

    Zhu, Minghui

    2015-01-01

    This book offers a concise and in-depth exposition of specific algorithmic solutions for distributed optimization based control of multi-agent networks and their performance analysis. It synthesizes and analyzes distributed strategies for three collaborative tasks: distributed cooperative optimization, mobile sensor deployment and multi-vehicle formation control. The book integrates miscellaneous ideas and tools from dynamic systems, control theory, graph theory, optimization, game theory and Markov chains to address the particular challenges introduced by such complexities in the environment as topological dynamics, environmental uncertainties, and potential cyber-attack by human adversaries. The book is written for first- or second-year graduate students in a variety of engineering disciplines, including control, robotics, decision-making, optimization and algorithms and with backgrounds in aerospace engineering, computer science, electrical engineering, mechanical engineering and operations research. Resea...

  8. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
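    The standard-normal probabilities this tutorial article covers can be checked with a few lines of code. The sketch below computes the probability below the familiar z = 1.96 cut-off and the 68-95-99.7 rule (the mass within 1, 2 and 3 standard deviations of the mean).

```python
from scipy import stats

# Probability that a standard normal observation falls below z = 1.96
print(stats.norm.cdf(1.96))  # ~0.975, hence the two-sided 95% interval

# The 68-95-99.7 rule
for k in (1, 2, 3):
    mass = stats.norm.cdf(k) - stats.norm.cdf(-k)
    print(k, round(mass, 4))  # 0.6827, 0.9545, 0.9973
```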

  9. Distribution Log Normal of 222 Rn in the state of Zacatecas, Mexico

    International Nuclear Information System (INIS)

    Garcia, M.L.; Mireles, F.; Quirino, L.; Davila, I.; Rios, C.; Pinedo, J.L.

    2006-01-01

    In this work the evaluation of the concentration of 222 Rn in air for Zacatecas is shown. Solid State Nuclear Track Detectors with cellulose nitrate LR-115, type 2, in open 222 Rn chambers were used as the technique for large-scale measurements. The measurements were carried out over three months at different times of the year. The results present the log-normal distribution, arithmetic mean and geometric mean of the concentration indoors and outdoors of residential buildings, indoors of occupational buildings, and in the 57 municipal seats of the state of Zacatecas. The statistics of the concentration values showed variation according to the time of the year, with higher values in the winter season in both cases. The distribution of the 222 Rn concentration is presented on the state map for each of the municipalities, representing the measurement places across the entire state of Zacatecas. Finally, the places where the 222 Rn concentration in air is near the limit established by the EPA of 148 Bq/m 3 are presented. (Author)
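    For log-normally distributed concentrations like these, the natural summary statistics are the geometric mean and geometric standard deviation, computed on the log scale. This sketch uses synthetic values with illustrative parameters (not the Zacatecas measurements) and also estimates the fraction of dwellings above the EPA action level of 148 Bq/m³.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic indoor 222Rn concentrations in Bq/m^3 (log-normal; parameters illustrative)
conc = rng.lognormal(mean=np.log(50.0), sigma=0.6, size=2000)

gm = np.exp(np.log(conc).mean())    # geometric mean
gsd = np.exp(np.log(conc).std())    # geometric standard deviation
frac_above = (conc > 148.0).mean()  # fraction above the EPA action level

print(gm, gsd, frac_above)
```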

  10. Precision time distribution within a deep space communications complex

    Science.gov (United States)

    Curtright, J. B.

    1972-01-01

    The Precision Time Distribution System (PTDS) at the Goldstone Deep Space Communications Complex is a practical application of existing technology to the solution of a local problem. The problem was to synchronize four station timing systems to a master source with a relative accuracy consistently and significantly better than 10 microseconds. The solution involved combining a precision timing source, an automatic error detection assembly, and a microwave distribution network into an operational system. Upon activation of the completed PTDS two years ago, synchronization accuracy at Goldstone (two-station relative) was improved by an order of magnitude. It is felt that the validation of the PTDS mechanization is now completed. Other facilities which have site dispersion and synchronization accuracy requirements similar to Goldstone may find the PTDS mechanization useful in solving their problem. At present, the two-station relative synchronization accuracy at Goldstone is better than one microsecond.

  11. Topographical Distribution of Arsenic, Manganese, and Selenium in the Normal Human Brain

    DEFF Research Database (Denmark)

    Larsen, Niels Agersnap; Pakkenberg, H.; Damsgaard, Else

    1979-01-01

    The concentrations of arsenic, manganese and selenium per gram wet tissue weight were determined in samples from 24 areas of normal human brains from 5 persons aged 15 to 81 years. The concentrations of the 3 elements were determined for each sample by means of neutron activation analysis with radiochemical separation. Distinct patterns of distribution were shown for each of the 3 elements. Variations between individuals were found for some but not all brain areas, resulting in coefficients of variation between individuals of about 30% for arsenic, 10% for manganese and 20% for selenium. The results seem to indicate that arsenic is associated with the lipid phase, manganese with the dry matter and selenium with the aqueous phase of brain tissue.

  12. Uncertainty analysis of the radiological characteristics of radioactive waste using a method based on log-normal distributions

    International Nuclear Information System (INIS)

    Gigase, Yves

    2007-01-01

    Available in abstract form only. Full text of publication follows: The uncertainty on the characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package, one has to combine these various uncertainties. This paper discusses an approach to this problem, based on the use of the log-normal distribution, which is both elegant and easy to use. It can, for example, provide quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other, more complex characteristics such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, in particular in decision processes where the uncertainty on the amount of activity is considered important, such as probabilistic risk assessment or the definition of criteria for acceptance or categorization. (author)
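The abstract gives no formulas, but the convenient algebra that makes log-normal uncertainty combination "elegant and easy to use" can be sketched: if a characteristic is a product of independent log-normal factors, the log of the product is a sum of normals, so geometric means multiply and squared log-standard-deviations add. The numbers below are illustrative only, not from the paper.

```python
import math

def combine_lognormal(factors):
    """Combine independent log-normal factors given as
    (geometric_mean, geometric_sd) pairs; returns the geometric
    mean and geometric SD of the product."""
    log_mu = sum(math.log(gm) for gm, gsd in factors)
    log_var = sum(math.log(gsd) ** 2 for gm, gsd in factors)
    return math.exp(log_mu), math.exp(math.sqrt(log_var))

# Hypothetical example: a key-nuclide measurement times a scaling factor
gm, gsd = combine_lognormal([(120.0, 1.3), (0.05, 2.0)])
# A "2-sigma"-style interval that always stays positive:
interval = (gm / gsd**2, gm * gsd**2)
print(f"geometric mean = {gm:.2f}, geometric SD = {gsd:.2f}")
print(f"~95% interval = ({interval[0]:.2f}, {interval[1]:.2f})")
```

The resulting interval is asymmetric around the geometric mean, which is one reason log-normal intervals for activity estimates "make sense" (they cannot go negative).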

  13. Morphology and distribution of chandelier cell axon terminals in the mouse cerebral cortex and claustroamygdaloid complex.

    Science.gov (United States)

    Inda, M C; DeFelipe, J; Muñoz, A

    2009-01-01

    Chandelier cells represent a unique type of cortical gamma-aminobutyric acidergic interneuron whose axon terminals (Ch-terminals) only form synapses with the axon initial segments of some pyramidal cells. Here, we have used immunocytochemistry for the high-affinity plasma membrane transporter GAT-1 and the calcium-binding protein parvalbumin to analyze the morphology and distribution of Ch-terminals in the mouse cerebral cortex and claustroamygdaloid complex. In general, 2 types of Ch-terminals were distinguished on the basis of their size and the density of the axonal boutons that made up the terminal. Simple Ch-terminals were made up of 1 or 2 rows of labeled boutons, each row consisting of only 3-5 boutons. In contrast, complex Ch-terminals were tight cylinder-like structures made up of multiple rows of boutons. Simple Ch-terminals were detected throughout the cerebral cortex and claustroamygdaloid complex, whereas the complex type was only occasionally found in certain regions but very abundant in others. These results indicate that there are substantial differences in the morphology and distribution of Ch-terminals between different areas and layers of the mouse cerebral cortex. Furthermore, we suggest that the distribution of complex Ch-terminals may be related to the developmental origin of the different brain regions analyzed.

  14. Assessment of Stable Isotope Distribution in Complex Systems

    Science.gov (United States)

    He, Y.; Cao, X.; Wang, J.; Bao, H.

    2017-12-01

    Biomolecules in living organisms have the potential to approach chemical steady state and even apparent isotope equilibrium because enzymatic reactions are intrinsically reversible. If an apparent local equilibrium can be identified, enzymatic reversibility and its controlling factors may be quantified, which helps in understanding complex biochemical processes. Earlier research on isotope fractionation tended to focus on specific processes, mostly comparing two different chemical species. Using linear regression, a "thermodynamic order", referring to correlated δ13C and 13β values, has been proposed by Galimov et al. to be present among many biomolecules. However, the concept of "thermodynamic order" they proposed and the approach they used have been questioned. Here, we propose that the deviation of a complex system from its equilibrium state can be rigorously described as a graph problem, as in discrete mathematics. The deviation of the isotope distribution from the equilibrium state, and apparent local isotope equilibrium among a subset of biomolecules, can be assessed using an apparent fractionation difference matrix (|Δα|). Applying the |Δα| matrix analysis to earlier published data on amino acids, we show the existence of apparent local equilibrium among different amino acids in potato and in a green alga. The existence of apparent local equilibrium is in turn consistent with the notion that enzymatic reactions can be reversible even in living systems. The result also implies that previous emphasis on external carbon source intake may be misplaced when studying isotope distribution in physiology. In addition to the identification of local equilibrium among biomolecules, the difference matrix approach has the potential to explore chemical or isotope equilibrium states in extraterrestrial bodies, to distinguish living from non-living systems, and to classify living species. This approach will benefit from large numbers of systematic data and advanced pattern

  15. Effects of a primordial magnetic field with log-normal distribution on the cosmic microwave background

    International Nuclear Information System (INIS)

    Yamazaki, Dai G.; Ichiki, Kiyotomo; Takahashi, Keitaro

    2011-01-01

    We study the effect of primordial magnetic fields (PMFs) on the anisotropies of the cosmic microwave background (CMB). We assume the spectrum of PMFs is described by a log-normal distribution, which has a characteristic scale, rather than by a power-law spectrum. This scale is expected to reflect the generation mechanism, and our analysis is complementary to previous studies with power-law spectra. We calculate the power spectra of the energy density and Lorentz force of the log-normal PMFs, and then calculate the CMB temperature and polarization angular power spectra from the scalar, vector, and tensor modes of perturbations generated by such PMFs. By comparing these spectra with the WMAP7, QUaD, CBI, Boomerang, and ACBAR data sets, we find that the current CMB data place the strongest constraint at k ≈ 10^-2.5 Mpc^-1, with the upper limit B ≲ 3 nG.
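A log-normal spectrum of the kind described, peaked at a characteristic scale rather than following a power law, can be written down in a few lines. The amplitude, scale, and width below are illustrative placeholders, not the paper's fitted values.

```python
import math

def lognormal_spectrum(k, amplitude=1.0, k0=10**-2.5, sigma=1.0):
    """Log-normal spectrum with characteristic scale k0 (illustrative).
    Unlike a power law, it peaks at k = k0 and falls off on both sides."""
    return amplitude * math.exp(-math.log(k / k0) ** 2 / (2 * sigma ** 2))

peak = lognormal_spectrum(10**-2.5)   # value at the characteristic scale
off = lognormal_spectrum(10**-1.5)    # one decade away in k
print(f"at k0: {peak:.3f}, one decade away: {off:.4f}")
```

The contrast with a power law is the point: a power-law spectrum has no preferred scale, while this form concentrates power around k0, which is why constraints cluster near the characteristic scale.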

  16. Low-Complexity Compression Algorithm for Hyperspectral Images Based on Distributed Source Coding

    Directory of Open Access Journals (Sweden)

    Yongjian Nian

    2013-01-01

    A low-complexity compression algorithm for hyperspectral images based on distributed source coding (DSC) is proposed in this paper. The proposed distributed compression algorithm can realize both lossless and lossy compression, implemented by performing a scalar quantization strategy on the original hyperspectral images followed by distributed lossless compression. A multilinear regression model is introduced for the distributed lossless compression in order to improve the quality of the side information. The optimal quantization step is determined according to the restriction of correct DSC decoding, which allows the proposed algorithm to achieve near-lossless compression. Moreover, an effective rate-distortion algorithm is introduced to achieve a low bit rate. Experimental results show that the compression performance of the proposed algorithm is competitive with that of state-of-the-art compression algorithms for hyperspectral images.

  17. Communication complexity of distributed computing and a parallel algorithm for polynomial roots

    International Nuclear Information System (INIS)

    Tiwari, P.

    1986-01-01

    The first part of this thesis begins with a discussion of the minimum communication requirements in some distributed networks. The main result is a general technique for determining lower bounds on the communication complexity of problems on various distributed computer networks. This general technique is derived by simulating the general network by a linear array and then using a lower bound on the communication complexity of the problem on the linear array. Applications of this technique yield nontrivial optimal or near-optimal lower bounds on the communication complexity of distinctness, ranking, uniqueness, merging, and triangle detection on a ring, a mesh, and a complete binary tree of processors. A technique similar to the one used in proving the above results yields interesting graph-theoretic results concerning the decomposition of a graph into complete bipartite subgraphs. The second part of the thesis is devoted to the design of a fast parallel algorithm for determining all roots of a polynomial. Given a polynomial rho(z) of degree n with m-bit integer coefficients and an integer μ, the author considers the problem of determining all its roots with error less than 2^-μ. It is shown that this problem is in the class NC if rho(z) has all real roots.

  18. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed; Wickenhauser, Patrick; Rautek, Peter; Reina, Guido; Hadwiger, Markus

    2017-08-28

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.

  20. Not Normal: the uncertainties of scientific measurements

    Science.gov (United States)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
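The heavy-tail point can be illustrated numerically: comparing a Gaussian tail with a Student's t tail at 5σ shows how many orders of magnitude more frequent extreme outliers become. The example below uses t with 3 degrees of freedom (which has a closed-form survival function) purely as an illustration; it is not the paper's data or its fitted tail index.

```python
import math
from statistics import NormalDist

def t3_sf(x):
    """Survival function of Student's t with 3 degrees of freedom,
    using the closed form S(x) = 1/2 - (atan(u) + u/(1+u^2))/pi, u = x/sqrt(3)."""
    u = x / math.sqrt(3)
    return 0.5 - (math.atan(u) + u / (1 + u * u)) / math.pi

p_norm = 1 - NormalDist().cdf(5)   # Gaussian tail probability beyond 5 sigma
p_t3 = t3_sf(5)                    # heavy-tailed t(3) tail at the same point
print(f"Gaussian: {p_norm:.2e}, t(3): {p_t3:.2e}, ratio ~ {p_t3 / p_norm:.0f}")
```

Under the Gaussian a 5σ excursion has probability of order 10^-7, while under t(3) it is of order 10^-3: a four-orders-of-magnitude excess, of the kind the survey reports for real measurement disagreements.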

  1. Pharmacokinetics and tissue distribution of five active ingredients of Eucommiae cortex in normal and ovariectomized mice by UHPLC-MS/MS.

    Science.gov (United States)

    An, Jing; Hu, Fangdi; Wang, Changhong; Zhang, Zijia; Yang, Li; Wang, Zhengtao

    2016-09-01

    1. Pinoresinol di-O-β-d-glucopyranoside (PDG), geniposide (GE), geniposidic acid (GA), aucubin (AN) and chlorogenic acid (CA) are the representative active ingredients in Eucommiae cortex (EC), which may be estrogenic. 2. The ultra high-performance liquid chromatography/tandem mass spectrometry (UHPLC-MS/MS) method for simultaneous determination of the five ingredients showed good linearity, low limits of quantification and high extraction recoveries, as well as acceptable precision, accuracy and stability in mice plasma and tissue samples (liver, spleen, kidney and uterus). It was successfully applied to a comparative study of the pharmacokinetics and tissue distribution of PDG, GE, GA, AN and CA between normal and ovariectomized (OVX) mice. 3. The results indicated that, except for CA, the plasma and tissue concentrations of PDG, GE and GA in OVX mice were all greater than those in normal mice. AN could only be detected in the plasma and liver homogenate of normal mice; it was poorly absorbed in OVX mice and low in the other measured tissues. PDG, GE and GA appear to be better absorbed in OVX mice than in normal mice, as shown by the markedly increased values of AUC0-∞ and Cmax. It is beneficial that PDG, GE and GA have better plasma absorption and tissue distribution in the pathological state.

  2. Deformation associated with continental normal faults

    Science.gov (United States)

    Resor, Phillip G.

    Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5), Greece, image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data and illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation similar to those observed by interferometric synthetic aperture radar (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ˜20 km from the fault surface trace, while the folds in the western Grand Canyon only extend 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates advantages of mechanical models in exploring normal faulting processes, including incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master

  3. Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis

    Science.gov (United States)

    Das, Samiran

    2018-04-01

    The use of the three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited due to the unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness-of-fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unknown and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramer-von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely the Modified Anderson-Darling (MAD) test, is also considered, and its performance is assessed against the other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression-equation form to show the dependence on shape parameter and sample size. The performance results obtained from the power study suggest that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study performs case studies involving annual maximum flow data of selected gauged sites from Irish and US catchments to show the application of the derived critical values, and recommends further assessments to be carried out on flow data sets of rivers with various hydrological regimes.
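The Monte Carlo approximation of critical values can be sketched in miniature. The stand-in below uses a plain normal null with moment estimates and the classical Kolmogorov-Smirnov statistic (a Lilliefors-style test) instead of the three-parameter GNO fitted by L-moments, but the simulation logic is the same: repeatedly draw from the null, re-estimate parameters each time, compute the statistic, and read the critical value off the upper quantile.

```python
import random
from statistics import NormalDist, mean, stdev

random.seed(42)

def ks_statistic(sample):
    """KS distance between the empirical CDF and a normal distribution
    fitted to the same sample (parameters re-estimated, Lilliefors-style)."""
    fitted = NormalDist(mean(sample), stdev(sample))
    xs = sorted(sample)
    n = len(xs)
    return max(
        max((i + 1) / n - fitted.cdf(x), fitted.cdf(x) - i / n)
        for i, x in enumerate(xs)
    )

# Monte Carlo critical value: distribution of the statistic under the null,
# with parameters re-estimated in every replicate
n, trials = 30, 2000
stats = sorted(
    ks_statistic([random.gauss(0, 1) for _ in range(n)]) for _ in range(trials)
)
crit_95 = stats[int(0.95 * trials)]
print(f"approximate 5% critical value for n = {n}: {crit_95:.3f}")
```

Because the parameters are estimated from each simulated sample, the resulting critical value is noticeably smaller than the textbook KS value for a fully specified null, which is exactly why the paper tabulates its own critical values for the GNO case.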

  4. Extending the Generalised Pareto Distribution for Novelty Detection in High-Dimensional Spaces.

    Science.gov (United States)

    Clifton, David A; Clifton, Lei; Hugueny, Samuel; Tarassenko, Lionel

    2014-01-01

    Novelty detection involves the construction of a "model of normality", and then classifies test data as being either "normal" or "abnormal" with respect to that model. For this reason, it is often termed one-class classification. The approach is suitable for cases in which examples of "normal" behaviour are commonly available, but cases of "abnormal" data are comparatively rare. When performing novelty detection, we are typically most interested in the tails of the normal model, because it is in these tails that a decision boundary between "normal" and "abnormal" areas of data space usually lies. Extreme value statistics provides an appropriate theoretical framework for modelling the tails of univariate (or low-dimensional) distributions, using the generalised Pareto distribution (GPD), which can be shown to be the limiting distribution for data occurring within the tails of most practically-encountered probability distributions. This paper provides an extension of the GPD, allowing the modelling of probability distributions of arbitrarily high dimension, such as the complex, multimodal, multivariate distributions used when performing novelty detection in most real-life cases. We demonstrate our extension to the GPD using examples from patient physiological monitoring, in which we have acquired data from hospital patients in large clinical studies of high-acuity wards, and in which we wish to detect "abnormal" patient data, such that early warning of patient physiological deterioration may be provided.
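A minimal peaks-over-threshold sketch of the GPD machinery the paper builds on (simulated exceedances and method-of-moments estimates; this is not the authors' high-dimensional extension, nor their clinical data):

```python
import random
from statistics import mean, variance

random.seed(0)

def gpd_sample(xi, sigma, n):
    """Draw exceedances from a generalised Pareto distribution via the
    inverse CDF: y = sigma * ((1 - u)^(-xi) - 1) / xi."""
    return [sigma * ((1 - random.random()) ** -xi - 1) / xi for _ in range(n)]

def gpd_fit_moments(ys):
    """Method-of-moments estimates of (xi, sigma).
    Uses mean = sigma/(1-xi) and var = sigma^2 / ((1-xi)^2 (1-2 xi)),
    valid for xi < 1/2."""
    m, v = mean(ys), variance(ys)
    xi = 0.5 * (1 - m * m / v)
    return xi, m * (1 - xi)

# In a real peaks-over-threshold analysis the exceedances over a high
# threshold are approximately GPD; here we simulate them directly.
ys = gpd_sample(0.1, 1.0, 20000)
xi_hat, sigma_hat = gpd_fit_moments(ys)
print(f"xi estimate: {xi_hat:.2f}, sigma estimate: {sigma_hat:.2f}")
```

The fitted GPD then gives calibrated tail probabilities for threshold exceedances, which is the quantity a novelty-detection decision boundary is placed on.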

  5. Metal distributions in complexes with Chlorella vulgaris in seawater and wastewater

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, P.R.; Kowalak, A.D.

    1999-10-01

    Divalent cadmium (Cd), copper (Cu), iron (Fe), nickel (Ni), lead (Pb), and zinc (Zn) simultaneous complexes with the algal biomass Chlorella vulgaris were studied for bioremediation purposes in various aqueous media: distilled-deionized water (DDIW), seawater, nuclear-reactor pool water, and process wastewater. Reactions were monitored using various dry masses of algae at constant temperature and constant metal concentrations for reaction times ranging from 0 to 150 minutes. Complexes formed within 30 minutes and reached a steady state after 80 to 120 minutes. Distribution constants (K′d) were calculated for the complexes and the relative orders of K′d were reported. The K′d values are used to evaluate the relative efficiency of metal remediation from waters. Lead, Cu, and Ni complexes had the greatest K′d values, and those metals were most efficiently removed from these waters. Zinc and Fe formed the most labile complexes. The order of K′d values for complexes was Pb > Cu > Cd > Zn in DDIW, Cu > Cd > Zn in seawater, Cd > Cu > Zn in reactor pool water, and Ni > Cd > Cu > Zn > Fe in wastewater. C. vulgaris biomass may potentially be used as an alternative to traditional water treatment methods for simultaneous extraction of metals from seawater, process wastewater, or drinking water.

  6. Change detection in full and dual polarization sar data and the complex wishart distribution

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning

    A test statistic for equality of two complex variance-covariance matrices following the complex Wishart distribution, with an associated probability of observing a smaller value of the test statistic, is sketched. We demonstrate the use of the test statistic and the associated probability measure for change detection in both full and dual polarimetry synthetic aperture radar (SAR) data collected by the Danish EMISAR system.

  7. Distribution of erlotinib in rash and normal skin in cancer patients receiving erlotinib visualized by matrix assisted laser desorption/ionization mass spectrometry imaging.

    Science.gov (United States)

    Nishimura, Meiko; Hayashi, Mitsuhiro; Mizutani, Yu; Takenaka, Kei; Imamura, Yoshinori; Chayahara, Naoko; Toyoda, Masanori; Kiyota, Naomi; Mukohara, Toru; Aikawa, Hiroaki; Fujiwara, Yasuhiro; Hamada, Akinobu; Minami, Hironobu

    2018-04-06

    The development of skin rashes is the most common adverse event observed in cancer patients treated with epidermal growth factor receptor-tyrosine kinase inhibitors such as erlotinib. However, the pharmacological evidence has not been fully revealed. We examined patients with advanced pancreatic cancer who developed skin rashes after treatment with erlotinib and gemcitabine. We biopsied both the rash and adjacent normal skin tissues, and visualized and compared the distribution of erlotinib within the skin using matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI-MSI). The tissue concentration of erlotinib was also measured by liquid chromatography-tandem mass spectrometry (LC-MS/MS) with laser microdissection. Erlotinib distribution in the rashes was more heterogeneous than that in the normal skin, and in the superficial skin layer the rashes contained statistically higher concentrations of erlotinib than adjacent normal skin (229 ± 192 vs. 120 ± 103 ions/mm²; P = 0.009, paired t-test). LC-MS/MS confirmed that the concentration of erlotinib in the skin rashes was higher than that in normal skin in the superficial layer (1946 ± 1258 vs. 1174 ± 662 ng/cm³; P = 0.028, paired t-test). The results of MALDI-MSI and LC-MS/MS were well correlated (coefficient of correlation 0.879). The distribution of erlotinib in the skin tissue was thus visualized using non-labeled MALDI-MSI, and the erlotinib concentration in the superficial layer of the skin rashes was higher than that in the adjacent normal skin.

  8. New insights into the transposition mechanisms of IS6110 and its dynamic distribution between Mycobacterium tuberculosis Complex lineages.

    Science.gov (United States)

    Gonzalo-Asensio, Jesús; Pérez, Irene; Aguiló, Nacho; Uranga, Santiago; Picó, Ana; Lampreave, Carlos; Cebollada, Alberto; Otal, Isabel; Samper, Sofía; Martín, Carlos

    2018-04-01

    The insertion sequence IS6110, present only in the pathogens of the Mycobacterium tuberculosis Complex (MTBC), has been the gold-standard epidemiological marker for TB for more than 25 years, but the biological implications of IS6110 transposition during MTBC adaptation to humans remain elusive. By studying 2,236 clinical isolates typed by IS6110-RFLP and covering the MTBC, we observed a lineage-specific IS6110 content, higher in modern, globally distributed strains. Having observed the IS6110 distribution in the MTBC, we selected representative isolates and found a correlation between the normalized expression of IS6110 and its abundance in MTBC chromosomes. We also studied the molecular regulation of IS6110 transposition and found a synergistic action of two post-transcriptional mechanisms: a -1 ribosomal frameshift and an RNA pseudoknot that interferes with translation. The construction of a transcriptionally active transposase resulted in a 20-fold increase in the transposition frequency. Finally, we examined transposition in M. bovis and M. tuberculosis during laboratory starvation and in a mouse infection model of TB. Our results show higher transposition in M. tuberculosis, occurring preferentially during TB infection in mice and after one year of laboratory culture, suggesting that IS6110 transposition is dynamically adapted to the host and to adverse growth conditions.

  9. Visualizing Tensor Normal Distributions at Multiple Levels of Detail.

    Science.gov (United States)

    Abbasloo, Amin; Wiens, Vitalis; Hermann, Max; Schultz, Thomas

    2016-01-01

    Despite the widely recognized importance of symmetric second order tensor fields in medicine and engineering, the visualization of data uncertainty in tensor fields is still in its infancy. A recently proposed tensorial normal distribution, involving a fourth order covariance tensor, provides a mathematical description of how different aspects of the tensor field, such as trace, anisotropy, or orientation, vary and covary at each point. However, this wealth of information is far too rich for a human analyst to take in at a single glance, and no suitable visualization tools are available. We propose a novel approach that facilitates visual analysis of tensor covariance at multiple levels of detail. We start with a visual abstraction that uses slice views and direct volume rendering to indicate large-scale changes in the covariance structure, and locations with high overall variance. We then provide tools for interactive exploration, making it possible to drill down into different types of variability, such as in shape or orientation. Finally, we allow the analyst to focus on specific locations of the field, and provide tensor glyph animations and overlays that intuitively depict confidence intervals at those points. Our system is demonstrated by investigating the effects of measurement noise on diffusion tensor MRI, and by analyzing two ensembles of stress tensor fields from solid mechanics.

  10. Non-linear learning in online tutorial to enhance students’ knowledge on normal distribution application topic

    Science.gov (United States)

    Kartono; Suryadi, D.; Herman, T.

    2018-01-01

    This study aimed to analyze the effect of non-linear learning (NLL) in online tutorial (OT) content on students' knowledge of normal distribution application (KONDA). KONDA is a competence expected to be achieved after students have studied the topic of normal distribution application in the course named Education Statistics. The analysis was performed with a quasi-experimental study design. The subjects were divided into an experimental class, which was given OT content in the NLL model, and a control class, which was given OT content in a conventional learning (CL) model. Data used in this study were the results of online objective tests measuring students' statistical prior knowledge (SPK) and students' pre- and post-test KONDA. Statistical analysis of the KONDA gain scores showed that, among students with low and moderate SPK scores, those who learned the OT content with the NLL model performed better than those who learned it with the CL model. For students with high SPK scores, the gain scores of the two groups were similar. Based on these findings, it can be concluded that the NLL model applied to OT content can enhance the KONDA of students at low and moderate SPK levels. An extra, more challenging didactical situation is needed for students at the high SPK level to achieve a significant gain score.

  11. A Platoon Dispersion Model Based on a Truncated Normal Distribution of Speed

    Directory of Open Access Journals (Sweden)

    Ming Wei

    2012-01-01

    Full Text Available Understanding platoon dispersion is critical for the coordination of traffic signal control in an urban traffic network. Assuming that platoon speed follows a truncated normal distribution, ranging from minimum speed to maximum speed, this paper develops a piecewise density function that describes platoon dispersion characteristics as the platoon moves from an upstream to a downstream intersection. Based on this density function, the expected number of cars in the platoon that pass the downstream intersection, and the expected number of cars in the platoon that do not pass the downstream point are calculated. To facilitate coordination in a traffic signal control system, dispersion models for the front and the rear of the platoon are also derived. Finally, a numeric computation for the coordination of successive signals is presented to illustrate the validity of the proposed model.
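
    The truncated normal speed density underlying such a model is straightforward to write down. The following is a minimal sketch, with illustrative speed bounds and parameters not taken from the paper: a normal density restricted to [v_min, v_max] and renormalized so it integrates to one.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def truncated_normal_pdf(v, mu, sigma, v_min, v_max):
    """Density of a N(mu, sigma^2) speed truncated to [v_min, v_max]."""
    if v < v_min or v > v_max:
        return 0.0
    mass = normal_cdf(v_max, mu, sigma) - normal_cdf(v_min, mu, sigma)
    return normal_pdf(v, mu, sigma) / mass

# Trapezoidal check that the truncated density integrates to one.
v_min, v_max, mu, sigma = 5.0, 20.0, 12.0, 3.0   # illustrative speeds (m/s)
n = 10000
h = (v_max - v_min) / n
vals = [truncated_normal_pdf(v_min + i * h, mu, sigma, v_min, v_max) for i in range(n + 1)]
total = h * (0.5 * vals[0] + 0.5 * vals[-1] + sum(vals[1:-1]))
```

    The renormalizing denominator is what distinguishes this from the plain normal density and is where the minimum and maximum speeds enter the dispersion model.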

  12. New Riemannian Priors on the Univariate Normal Model

    Directory of Open Access Journals (Sweden)

    Salem Said

    2014-07-01

    Full Text Available The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as “Riemannian priors”. Precisely, if {pθ ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d²(θ, θ̄) is the square of Rao’s Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that this distribution gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, one thinks of members of the class represented by G(θ̄, γ) as being centered around θ̄ and lying within a typical distance determined by γ. The paper defines rigorously the Gaussian distributions G(θ̄, γ) and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, it is shown that this leads to an improvement in performance over the use of conjugate priors.
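
    On the univariate normal model, Rao's distance has a well-known closed form via the hyperbolic half-plane. The sketch below computes d(θ, θ̄) for θ = (μ, σ) and the unnormalized density exp(−d²(θ, θ̄)/2γ²) of G(θ̄, γ) under that standard identification; it is an illustration of the definitions above, not the authors' implementation.

```python
import math

def rao_distance(mu1, sigma1, mu2, sigma2):
    """Rao (Fisher-Rao) distance between N(mu1, sigma1^2) and N(mu2, sigma2^2).

    Uses the standard identification of the univariate normal model with the
    hyperbolic half-plane: x = mu / sqrt(2), y = sigma, distance scaled by sqrt(2).
    """
    num = (mu1 - mu2) ** 2 / 2.0 + (sigma1 - sigma2) ** 2
    return math.sqrt(2.0) * math.acosh(1.0 + num / (2.0 * sigma1 * sigma2))

def gaussian_prior_density(theta, theta_bar, gamma):
    """Unnormalized density of G(theta_bar, gamma) w.r.t. Riemannian volume."""
    d = rao_distance(*theta, *theta_bar)
    return math.exp(-d ** 2 / (2.0 * gamma ** 2))
```

    For two normals differing only in scale, the distance reduces to √2·|ln(σ₂/σ₁)|, a convenient sanity check.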

  13. Sarcoglycans in the normal and pathological breast tissue of humans: an immunohistochemical and molecular study.

    Science.gov (United States)

    Arco, Alba; Favaloro, Angelo; Gioffrè, Mara; Santoro, Giuseppe; Speciale, Francesco; Vermiglio, Giovanna; Cutroneo, Giuseppina

    2012-01-01

    The sarcoglycan complex, consisting of α-, β-, γ-, δ- and ε-sarcoglycans, is a multimember transmembrane system providing a mechanosignaling connection from the cytoskeleton to the extracellular matrix. Whereas the expression of α- and γ-sarcoglycan is restricted to striated muscle, the other sarcoglycans are widely expressed. Although many studies have investigated sarcoglycans in all muscle types, insufficient data are available on the distribution of the sarcoglycan complex in nonmuscle tissue. On this basis, we used immunohistochemical and RT-PCR techniques to carry out a preliminary study of the sarcoglycans in normal glandular breast tissue (which had never been examined in the literature on these proteins), to verify the effectively wider distribution of this complex. Moreover, to understand the role of sarcoglycans, we also tested samples obtained from patients affected by fibrocystic mastopathy and breast fibroadenoma. Our data showed, for the first time, that all sarcoglycans are always detectable in all normal samples, both in epithelial and myoepithelial cells; in pathological breast tissue, all sarcoglycans appeared severely reduced. These data demonstrate that all sarcoglycans, not only β-, δ-, and ε-sarcoglycan, have a wider distribution, implying a new, previously unknown role for these proteins. Moreover, in breast diseases, sarcoglycans containing cadherin domain homologs could provoke a loss of strong adhesion between epithelial cells, permitting and facilitating the degeneration of these benign breast tumors into malignant tumors. Consequently, sarcoglycans could play an important and intriguing role in many breast diseases, and in particular in tumor progression from benign to malignant. Copyright © 2011 S. Karger AG, Basel.

  14. Distribution of Basement Membrane Molecules, Laminin and Collagen Type IV, in Normal and Degenerated Cartilage Tissues.

    Science.gov (United States)

    Foldager, Casper Bindzus; Toh, Wei Seong; Gomoll, Andreas H; Olsen, Bjørn Reino; Spector, Myron

    2014-04-01

    The objective of the present study was to investigate the presence and distribution of 2 basement membrane (BM) molecules, laminin and collagen type IV, in healthy and degenerative cartilage tissues. Normal and degenerated tissues were obtained from goats and humans, including articular knee cartilage, the intervertebral disc, and meniscus. Normal tissue was also obtained from patella-tibial enthesis in goats. Immunohistochemical analysis was performed using anti-laminin and anti-collagen type IV antibodies. Human and goat skin were used as positive controls. The percentage of cells displaying the pericellular presence of the protein was graded semiquantitatively. When present, laminin and collagen type IV were exclusively found in the pericellular matrix, and in a discrete layer on the articulating surface of normal articular cartilage. In normal articular (hyaline) cartilage in the human and goat, the proteins were found co-localized pericellularly. In contrast, in human osteoarthritic articular cartilage, collagen type IV but not laminin was found in the pericellular region. Nonpathological fibrocartilaginous tissues from the goat, including the menisci and the enthesis, were also positive for both laminin and collagen type IV pericellularly. In degenerated fibrocartilage, including intervertebral disc, as in degenerated hyaline cartilage only collagen type IV was found pericellularly around chondrocytes but with less intense staining than in non-degenerated tissue. In calcified cartilage, some cells were positive for laminin but not type IV collagen. We report differences in expression of the BM molecules, laminin and collagen type IV, in normal and degenerative cartilaginous tissues from adult humans and goats. In degenerative tissues laminin is depleted from the pericellular matrix before collagen type IV. The findings may inform future studies of the processes underlying cartilage degeneration and the functional roles of these 2 extracellular matrix proteins.

  15. Distribution of Basement Membrane Molecules, Laminin and Collagen Type IV, in Normal and Degenerated Cartilage Tissues

    Science.gov (United States)

    Toh, Wei Seong; Gomoll, Andreas H.; Olsen, Bjørn Reino; Spector, Myron

    2014-01-01

    Objective: The objective of the present study was to investigate the presence and distribution of 2 basement membrane (BM) molecules, laminin and collagen type IV, in healthy and degenerative cartilage tissues. Design: Normal and degenerated tissues were obtained from goats and humans, including articular knee cartilage, the intervertebral disc, and meniscus. Normal tissue was also obtained from patella-tibial enthesis in goats. Immunohistochemical analysis was performed using anti-laminin and anti–collagen type IV antibodies. Human and goat skin were used as positive controls. The percentage of cells displaying the pericellular presence of the protein was graded semiquantitatively. Results: When present, laminin and collagen type IV were exclusively found in the pericellular matrix, and in a discrete layer on the articulating surface of normal articular cartilage. In normal articular (hyaline) cartilage in the human and goat, the proteins were found co-localized pericellularly. In contrast, in human osteoarthritic articular cartilage, collagen type IV but not laminin was found in the pericellular region. Nonpathological fibrocartilaginous tissues from the goat, including the menisci and the enthesis, were also positive for both laminin and collagen type IV pericellularly. In degenerated fibrocartilage, including intervertebral disc, as in degenerated hyaline cartilage only collagen type IV was found pericellularly around chondrocytes but with less intense staining than in non-degenerated tissue. In calcified cartilage, some cells were positive for laminin but not type IV collagen. Conclusions: We report differences in expression of the BM molecules, laminin and collagen type IV, in normal and degenerative cartilaginous tissues from adult humans and goats. In degenerative tissues laminin is depleted from the pericellular matrix before collagen type IV. The findings may inform future studies of the processes underlying cartilage degeneration and the functional roles of these 2 extracellular matrix proteins.

  16. Heuristic Relative Entropy Principles with Complex Measures: Large-Degree Asymptotics of a Family of Multi-variate Normal Random Polynomials

    Science.gov (United States)

    Kiessling, Michael Karl-Heinz

    2017-10-01

    Let z\\in C, let σ ^2>0 be a variance, and for N\\in N define the integrals E_N^{}(z;σ ) := {1/σ } \\int _R\\ (x^2+z^2) e^{-{1/2σ^2 x^2}}{√{2π }}/dx \\quad if N=1, {1/σ } \\int _{R^N} \\prod \\prod \\limits _{1≤ k1. These are expected values of the polynomials P_N^{}(z)=\\prod _{1≤ n≤ N}(X_n^2+z^2) whose 2 N zeros ± i X_k^{}_{k=1,\\ldots ,N} are generated by N identically distributed multi-variate mean-zero normal random variables {X_k}N_{k=1} with co-variance {Cov}_N^{}(X_k,X_l)=(1+σ ^2-1/N)δ _{k,l}+σ ^2-1/N(1-δ _{k,l}). The E_N^{}(z;σ ) are polynomials in z^2, explicitly computable for arbitrary N, yet a list of the first three E_N^{}(z;σ ) shows that the expressions become unwieldy already for moderate N—unless σ = 1, in which case E_N^{}(z;1) = (1+z^2)^N for all z\\in C and N\\in N. (Incidentally, commonly available computer algebra evaluates the integrals E_N^{}(z;σ ) only for N up to a dozen, due to memory constraints). Asymptotic evaluations are needed for the large- N regime. For general complex z these have traditionally been limited to analytic expansion techniques; several rigorous results are proved for complex z near 0. Yet if z\\in R one can also compute this "infinite-degree" limit with the help of the familiar relative entropy principle for probability measures; a rigorous proof of this fact is supplied. Computer algebra-generated evidence is presented in support of a conjecture that a generalization of the relative entropy principle to signed or complex measures governs the N→ ∞ asymptotics of the regime iz\\in R. Potential generalizations, in particular to point vortex ensembles and the prescribed Gauss curvature problem, and to random matrix ensembles, are emphasized.

  17. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, Addendum

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1975-01-01

    New results and insights concerning a previously published iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions were discussed. It was shown that the procedure converges locally to the consistent maximum likelihood estimate as long as a specified parameter is bounded between two limits. Bound values were given to yield optimal local convergence.
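
    The iterative procedure discussed here is, in modern terms, the EM algorithm for normal mixtures. A minimal two-component, one-dimensional sketch on synthetic data (illustrative only; not the authors' original procedure, which treats the general case):

```python
import math
import random

def em_two_normals(xs, iters=200):
    """EM for a two-component univariate normal mixture; returns (w, mu, sd)."""
    xs = sorted(xs)
    half = len(xs) // 2
    # Crude initialization: split the sorted sample in half.
    mu = [sum(xs[:half]) / half, sum(xs[half:]) / (len(xs) - half)]
    sd = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of component 0 for each point
        # (the common 1/sqrt(2*pi) factor cancels in the ratio).
        r0 = []
        for x in xs:
            p = [w[k] / sd[k] * math.exp(-0.5 * ((x - mu[k]) / sd[k]) ** 2)
                 for k in (0, 1)]
            r0.append(p[0] / (p[0] + p[1]))
        r1 = [1.0 - r for r in r0]
        # M-step: re-estimate weights, means, standard deviations.
        for k, rk in ((0, r0), (1, r1)):
            n_k = sum(rk)
            w[k] = n_k / len(xs)
            mu[k] = sum(g * x for g, x in zip(rk, xs)) / n_k
            sd[k] = max(math.sqrt(sum(g * (x - mu[k]) ** 2
                                      for g, x in zip(rk, xs)) / n_k), 1e-6)
    return w, mu, sd

# Synthetic data: equal mixture of N(0, 1) and N(5, 1).
random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(500)]
        + [random.gauss(5.0, 1.0) for _ in range(500)])
weights, means, stds = em_two_normals(data)
```

    With well-separated components such as these, the iteration converges quickly to estimates near the true parameters, consistent with the local-convergence result described above.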

  18. Retention and subcellular distribution of 67Ga in normal organs

    International Nuclear Information System (INIS)

    Ando, A.; Ando, I.; Hiraki, T.

    1986-01-01

    Using normal rats, retention values and subcellular distribution of 67 Ga in each organ were investigated. At 10 min after administration of 67 Ga-citrate the retention value of 67 Ga in blood was 6.77% dose/g, and this value decreased with time. The values for skeletal muscle, lung, pancreas, adrenal, heart muscle, brain, small intestine, large intestine and spinal cord were the highest at 10 min after administration, and they decreased with time. Conversely, the value in bone increased until 10 days after injection. In the liver, kidney, and stomach, the values increased with time after administration and were highest 24 h or 48 h after injection; after that, they decreased with time. The value in spleen reached a plateau 48 h after administration, and hardly varied for 10 days. From the results of subcellular fractionation, it was deduced that the lysosome plays quite an important role in the concentration of 67 Ga in small intestine, stomach, lung, kidney and pancreas; a lesser role in its concentration in heart muscle; and hardly any role in the 67 Ga accumulation in skeletal muscle. In spleen, the contents of the nuclear, mitochondrial, microsomal, and supernatant fractions all contributed to the accumulation of 67 Ga. (orig.)

  19. Are There More Gifted People Than Would Be Expected in a Normal Distribution? An Investigation of the Overabundance Hypothesis

    Science.gov (United States)

    Warne, Russell T.; Godwin, Lindsey R.; Smith, Kyle V.

    2013-01-01

    Among some gifted education researchers, advocates, and practitioners, it is sometimes believed that there is a larger number of gifted people in the general population than would be predicted from a normal distribution (e.g., Gallagher, 2008; N. M. Robinson, Zigler, & Gallagher, 2000; Silverman, 1995, 2009), a belief that we termed the…
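
    The baseline against which the overabundance hypothesis is tested is a simple normal tail probability: with a cutoff two standard deviations above the mean (e.g. IQ 130 on a 15-point scale), a strictly normal distribution predicts about 2.3% of the population above the cutoff. A sketch of that baseline computation (the cutoff is illustrative, not taken from the article):

```python
import math

def normal_tail(cutoff_sd):
    """P(Z > cutoff_sd) for a standard normal variable Z."""
    return 0.5 * math.erfc(cutoff_sd / math.sqrt(2.0))

# Expected fraction above IQ 130 when IQ ~ N(100, 15^2): the cutoff is 2 SDs.
expected_gifted = normal_tail((130 - 100) / 15)
```

    Any empirical proportion reliably exceeding this figure would be evidence for the overabundance hypothesis.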

  20. M-dwarf exoplanet surface density distribution. A log-normal fit from 0.07 to 400 AU

    Science.gov (United States)

    Meyer, Michael R.; Amara, Adam; Reggiani, Maddalena; Quanz, Sascha P.

    2018-04-01

    Aims: We fit a log-normal function to the M-dwarf orbital surface density distribution of gas giant planets, over the mass range 1-10 times that of Jupiter, from 0.07 to 400 AU. Methods: We used a Markov chain Monte Carlo approach to explore the likelihoods of various parameter values consistent with point estimates of the data given our assumed functional form. Results: This fit is consistent with radial velocity, microlensing, and direct-imaging observations, is well-motivated from theoretical and phenomenological points of view, and predicts results of future surveys. We present probability distributions for each parameter and a maximum likelihood estimate solution. Conclusions: We suggest that this function makes more physical sense than other widely used functions, and we explore the implications of our results on the design of future exoplanet surveys.
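
    A log-normal surface density is Gaussian in ln a, so its maximum likelihood parameters for a sample of separations are simply the mean and standard deviation of the logs. A minimal sketch of such a fit on synthetic data (a moments fit, not the paper's Markov chain Monte Carlo analysis; all parameters illustrative):

```python
import math
import random

def fit_lognormal(samples):
    """MLE of (mu, sigma) for a log-normal sample: mean/std of the logs."""
    logs = [math.log(x) for x in samples]
    mu = sum(logs) / len(logs)
    var = sum((l - mu) ** 2 for l in logs) / len(logs)
    return mu, math.sqrt(var)

def lognormal_pdf(a, mu, sigma):
    """Log-normal density in a (e.g. orbital separation in AU)."""
    return (math.exp(-(math.log(a) - mu) ** 2 / (2.0 * sigma ** 2))
            / (a * sigma * math.sqrt(2.0 * math.pi)))

# Synthetic "separations" drawn from a log-normal with mu = 1.0, sigma = 0.5.
random.seed(1)
sample = [math.exp(random.gauss(1.0, 0.5)) for _ in range(100000)]
mu_hat, sigma_hat = fit_lognormal(sample)
```

    An MCMC treatment like the paper's would instead explore the posterior over (mu, sigma), but the point estimate above is the mode it concentrates around for large samples.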

  1. Complex force network in marginally and deeply jammed solids

    International Nuclear Information System (INIS)

    Hu Mao-Bin; Jiang Rui; Wu Qing-Song

    2013-01-01

    This paper studies the force network properties of marginally and deeply jammed packings of frictionless soft particles from the perspective of complex network theory. We generate zero-temperature granular packings at different pressures by minimizing the inter-particle potential energy. The force networks are constructed as nodes representing particles and links representing normal forces between the particles. Deeply jammed solids show remarkably different behavior from marginally jammed solids in their degree distribution, strength distribution, degree correlation, and clustering coefficient. Bimodal and multi-modal distributions emerge when the system enters the deep jamming region. The results also show that small and large particles can show different correlation behavior in this simple system

  2. Normal people working in normal organizations with normal equipment: system safety and cognition in a mid-air collision.

    Science.gov (United States)

    de Carvalho, Paulo Victor Rodrigues; Gomes, José Orlando; Huber, Gilbert Jacob; Vidal, Mario Cesar

    2009-05-01

    A fundamental challenge in improving the safety of complex systems is to understand how accidents emerge in normal working situations, with equipment functioning normally in normally structured organizations. We present a field study of the en route mid-air collision between a commercial carrier and an executive jet, in the clear afternoon Amazon sky, in which 154 people lost their lives, that illustrates one response to this challenge. Our focus was on how and why the several safety barriers of a well-structured air traffic system melted down, enabling the occurrence of this tragedy without any catastrophic component failure, and in a situation where everything was functioning normally. We identify strong consistencies and feedbacks regarding factors of system day-to-day functioning that made monitoring and awareness difficult, and the cognitive strategies that operators have developed to deal with overall system behavior. These findings emphasize the active problem-solving behavior needed in air traffic control work, and highlight how the day-to-day functioning of the system can jeopardize such behavior. An immediate consequence is that safety managers and engineers should review their traditional safety approach and accident models, based on equipment failure probability, linear combinations of failures, rules and procedures, and human errors, to deal with complex patterns of coincidence possibilities, unexpected links, resonance among system functions and activities, and system cognition.

  3. Normal and abnormal distribution of the adrenomedullary imaging agent m-[I-131]iodobenzylguanidine (I-131 MIBG) in man; evaluation by scintigraphy

    International Nuclear Information System (INIS)

    Nakajo, M.; Shapiro, B.; Copp, J.; Kalff, V.; Gross, M.D.; Sisson, J.C.; Beierwaltes, W.H.

    1983-01-01

    The scintigraphic distribution of m-[ 131 I]iodobenzylguanidine (I-131 MIBG), an adrenal medullary imaging agent, was studied to determine the patterns of uptake of this agent in man. The normal distribution of I-131 MIBG includes clear portrayal of the salivary glands, liver, spleen, and urinary bladder. The heart, middle and lower lung zones, and colon were less frequently or less clearly seen. The upper lung zones and kidneys were seldom visualized. The thyroid appeared only in cases of inadequate thyroidal blockade. The normal adrenal glands were seldom seen and faintly imaged in 2% at 24 h after injection and in 16% at 48 h, in patients shown not to have pheochromocytomas, whereas intra-adrenal, extra-adrenal, and malignant pheochromocytomas usually appeared as intense focal areas of I-131 MIBG uptake at 24 through 72 h

  4. The effect of signal variability on the histograms of anthropomorphic channel outputs: factors resulting in non-normally distributed data

    Science.gov (United States)

    Elshahaby, Fatma E. A.; Ghaly, Michael; Jha, Abhinav K.; Frey, Eric C.

    2015-03-01

    Model Observers are widely used in medical imaging for the optimization and evaluation of instrumentation, acquisition parameters and image reconstruction and processing methods. The channelized Hotelling observer (CHO) is a commonly used model observer in nuclear medicine and has seen increasing use in other modalities. An anthropomorphic CHO consists of a set of channels that model some aspects of the human visual system and the Hotelling Observer, which is the optimal linear discriminant. The optimality of the CHO is based on the assumption that the channel outputs for data with and without the signal present have a multivariate normal distribution with equal class covariance matrices. The channel outputs result from the dot product of channel templates with input images and are thus the sum of a large number of random variables. The central limit theorem is thus often used to justify the assumption that the channel outputs are normally distributed. In this work, we aim to examine this assumption for realistically simulated nuclear medicine images when various types of signal variability are present.
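
    A CHO reduces each image to a few channel outputs v = Uᵀx and applies the Hotelling template w = S⁻¹Δv̄ in that low-dimensional space. The numpy sketch below builds toy 1-D "images" and Gaussian channels (all names and parameters are illustrative) and computes the resulting decision variables; whether such channel outputs are truly multivariate normal is exactly the question the paper examines.

```python
import numpy as np

rng = np.random.default_rng(0)
npix, nch, nimg = 64, 3, 2000

# Toy 1-D "images": white Gaussian noise, plus a Gaussian signal bump when present.
x = np.arange(npix)
signal = 2.0 * np.exp(-((x - npix / 2) ** 2) / (2 * 4.0 ** 2))
imgs0 = rng.normal(size=(nimg, npix))            # signal-absent class
imgs1 = rng.normal(size=(nimg, npix)) + signal   # signal-present class

# Toy channels: Gaussian profiles of increasing width, centred on the signal.
U = np.stack([np.exp(-((x - npix / 2) ** 2) / (2 * (2.0 * 2 ** c) ** 2))
              for c in range(nch)], axis=1)      # shape (npix, nch)

v0, v1 = imgs0 @ U, imgs1 @ U                    # channel outputs, (nimg, nch)

# Hotelling template in channel space: w = S^-1 (mean difference),
# with S the average of the two class covariance matrices.
S = 0.5 * (np.cov(v0.T) + np.cov(v1.T))
w = np.linalg.solve(S, v1.mean(axis=0) - v0.mean(axis=0))

t0, t1 = v0 @ w, v1 @ w                          # decision variables per class
snr = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t0.var(ddof=1) + t1.var(ddof=1)))
```

    Histograms of t0 and t1 (or of the individual channel outputs v0, v1) are what one would inspect for departures from normality when signal variability is added.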

  5. Distribution of class II major histocompatibility complex antigen-expressing cells in human dental pulp with carious lesions

    Directory of Open Access Journals (Sweden)

    Tetiana Haniastuti

    2012-09-01

    Full Text Available Background: Dental caries is a bacterial infection which causes destruction of the hard tissues of the tooth. Exposure of the dentin to the oral environment as a result of caries inevitably results in a cellular response in the pulp. The major histocompatibility complex (MHC) is a group of genes that code for cell-surface histocompatibility antigens. Cells expressing class II MHC molecules participate in the initial recognition and the processing of antigenic substances to serve as antigen-presenting cells. Purpose: The aim of the study was to elucidate the alteration in the distribution of class II MHC antigen-expressing cells in human dental pulp as carious lesions progressed toward the pulp. Methods: Fifteen third molars with caries at the occlusal site at various stages of decay and 5 intact third molars were extracted and used in this study. Before decalcifying with 10% EDTA solution (pH 7.4), all the samples were observed by micro-computed tomography to confirm the lesion condition three-dimensionally. The specimens were then processed for cryosection and immunohistochemistry using an anti-MHC class II monoclonal antibody. Results: Class II MHC antigen-expressing cells were found both in normal and carious specimens. In the normal tooth, the class II MHC-immunopositive cells were observed mainly at the periphery of the pulp tissue. In teeth with caries, class II MHC-immunopositive cells were located predominantly subjacent to the carious lesions. As the caries progressed, the number of class II MHC antigen-expressing cells increased. Conclusion: The depth of carious lesions affects the distribution of class II MHC antigen-expressing cells in the dental pulp.

  6. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    International Nuclear Information System (INIS)

    Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow

    2013-01-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
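
    When only summarized data are available, a log-normal distribution is typically characterized by moment matching: σ_ln² = ln(1 + s²/m²) and μ_ln = ln m − σ_ln²/2 for a reported arithmetic mean m and standard deviation s. A sketch of that conversion and its inverse (the summary figures are illustrative, not from the study):

```python
import math

def lognormal_params_from_summary(m, s):
    """Moment matching: log-scale (mu, sigma) from arithmetic mean m and SD s."""
    sigma2 = math.log(1.0 + (s / m) ** 2)
    mu = math.log(m) - sigma2 / 2.0
    return mu, math.sqrt(sigma2)

def summary_from_lognormal_params(mu, sigma):
    """Inverse map: arithmetic mean and SD implied by log-scale (mu, sigma)."""
    m = math.exp(mu + sigma ** 2 / 2.0)
    s = m * math.sqrt(math.exp(sigma ** 2) - 1.0)
    return m, s

# Round trip on, say, summarized body weights with mean 250 g and SD 30 g.
mu, sigma = lognormal_params_from_summary(250.0, 30.0)
m_back, s_back = summary_from_lognormal_params(mu, sigma)
```

    This mapping is exact for a true log-normal; the approximation error discussed in the paper arises because reported means and SDs are themselves sample estimates.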

  7. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  8. The Analysis of Bankruptcy Risk Using the Normal Distribution Gauss-Laplace in Case of a Company is the Most Modern Romanian Sea-River Port on the Danube

    Directory of Open Access Journals (Sweden)

    Rodica Pripoaie

    2015-08-01

    Full Text Available This work presents an application of the Gauss-Laplace normal distribution in the case of a company that is the most modern Romanian sea-river port on the Danube, a specialized service provider with a handling capacity of approx. 20,000,000 tons/year. The Gauss-Laplace normal distribution is the best known and most widely used probability distribution, because it captures well the evolution of economic and financial phenomena. Around the average, which has the greatest frequency, gravitate values more or less distant from the average, but with the same standard deviation. It is noted that break-even analysis, although used in forecasting calculations, ignores the risk of decisional operations (deviations between forecasts and achievements), which may, in certain circumstances, strongly influence the activity of the company. This can be taken into account by carefully studying whether the evolution of turnover follows a law of probability. When no information on the probability law of turnover exists and there is no reason for one case to appear more than another, according to Laplace's law we consider these cases uniformly distributed; therefore they follow a normal distribution.
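
    Under the Gauss-Laplace (normal) assumption, the decisional risk discussed above reduces to the probability that turnover falls below the break-even point, i.e. a normal CDF evaluation. A hedged sketch with purely illustrative figures (the abstract does not publish these parameters):

```python
import math

def prob_below(threshold, mean, sd):
    """P(X < threshold) for X ~ N(mean, sd^2), via the error function."""
    return 0.5 * (1.0 + math.erf((threshold - mean) / (sd * math.sqrt(2.0))))

# Illustrative numbers only: forecast turnover 18 Mt/year with SD 2 Mt,
# break-even point at 15 Mt/year.
risk = prob_below(15.0, 18.0, 2.0)   # probability of operating below break-even
```

    The same one-liner quantifies the deviation risk between forecast and achievement that plain break-even analysis ignores.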

  9. Distribution of intravenously administered acetylcholinesterase inhibitor and acetylcholinesterase activity in the adrenal gland: 11C-donepezil PET study in the normal rat.

    Science.gov (United States)

    Watabe, Tadashi; Naka, Sadahiro; Ikeda, Hayato; Horitsugi, Genki; Kanai, Yasukazu; Isohashi, Kayako; Ishibashi, Mana; Kato, Hiroki; Shimosegawa, Eku; Watabe, Hiroshi; Hatazawa, Jun

    2014-01-01

    Acetylcholinesterase (AChE) inhibitors have been used for patients with Alzheimer's disease. However, their pharmacokinetics in non-target organs other than the brain have not been clarified yet. The purpose of this study was to evaluate the relationship between the whole-body distribution of intravenously administered (11)C-Donepezil (DNP) and the AChE activity in the normal rat, with special focus on the adrenal glands. The distribution of (11)C-DNP was investigated by PET/CT in 6 normal male Wistar rats (8 weeks old, body weight  = 220 ± 8.9 g). A 30-min dynamic scan was started simultaneously with an intravenous bolus injection of (11)C-DNP (45.0 ± 10.7 MBq). The whole-body distribution of the (11)C-DNP PET was evaluated based on the Vt (total distribution volume) by Logan-plot analysis. A fluorometric assay was performed to quantify the AChE activity in homogenized tissue solutions of the major organs. The PET analysis using Vt showed that the adrenal glands had the 2nd highest level of (11)C-DNP in the body (following the liver) (13.33 ± 1.08 and 19.43 ± 1.29 ml/cm(3), respectively), indicating that the distribution of (11)C-DNP was the highest in the adrenal glands, except for that in the excretory organs. The AChE activity was the third highest in the adrenal glands (following the small intestine and the stomach) (24.9 ± 1.6, 83.1 ± 3.0, and 38.5 ± 8.1 mU/mg, respectively), indicating high activity of AChE in the adrenal glands. We demonstrated the whole-body distribution of (11)C-DNP by PET and the AChE activity in the major organs by fluorometric assay in the normal rat. High accumulation of (11)C-DNP was observed in the adrenal glands, which suggested the risk of enhanced cholinergic synaptic transmission by the use of AChE inhibitors.
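
    Logan-plot analysis estimates Vt as the asymptotic slope of ∫₀ᵗ C_T dτ / C_T(t) plotted against ∫₀ᵗ C_p dτ / C_T(t). For a one-tissue compartment model the plot is exactly linear with slope Vt = K1/k2, which the synthetic numpy sketch below reproduces (rate constants and input function are illustrative; this is not the study's pipeline):

```python
import numpy as np

# Synthetic one-tissue compartment model: dCT/dt = K1*Cp - k2*CT, Vt = K1/k2.
K1, k2 = 0.5, 0.1                      # illustrative rate constants (1/min)
dt = 0.001
t = np.arange(0.0, 60.0, dt)
Cp = np.exp(-0.1 * t)                  # illustrative plasma input function

CT = np.zeros_like(t)
for i in range(1, len(t)):             # Euler integration of the tissue curve
    CT[i] = CT[i - 1] + dt * (K1 * Cp[i - 1] - k2 * CT[i - 1])

int_Cp = np.cumsum(Cp) * dt            # running integrals (rectangle rule)
int_CT = np.cumsum(CT) * dt

# Logan transform, restricted to times where CT is safely nonzero.
mask = t > 10.0
xs = int_Cp[mask] / CT[mask]
ys = int_CT[mask] / CT[mask]
slope, intercept = np.polyfit(xs, ys, 1)   # slope estimates Vt = K1/k2 = 5
```

    For reversible tracers with more compartments the plot only becomes linear at late times, which is why Logan fits use the tail of the scan.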

  10. Distribution of intravenously administered acetylcholinesterase inhibitor and acetylcholinesterase activity in the adrenal gland: 11C-donepezil PET study in the normal rat.

    Directory of Open Access Journals (Sweden)

    Tadashi Watabe

    Full Text Available PURPOSE: Acetylcholinesterase (AChE) inhibitors have been used for patients with Alzheimer's disease. However, their pharmacokinetics in non-target organs other than the brain have not been clarified yet. The purpose of this study was to evaluate the relationship between the whole-body distribution of intravenously administered (11)C-Donepezil (DNP) and the AChE activity in the normal rat, with special focus on the adrenal glands. METHODS: The distribution of (11)C-DNP was investigated by PET/CT in 6 normal male Wistar rats (8 weeks old, body weight  = 220 ± 8.9 g). A 30-min dynamic scan was started simultaneously with an intravenous bolus injection of (11)C-DNP (45.0 ± 10.7 MBq). The whole-body distribution of the (11)C-DNP PET was evaluated based on the Vt (total distribution volume) by Logan-plot analysis. A fluorometric assay was performed to quantify the AChE activity in homogenized tissue solutions of the major organs. RESULTS: The PET analysis using Vt showed that the adrenal glands had the 2nd highest level of (11)C-DNP in the body (following the liver) (13.33 ± 1.08 and 19.43 ± 1.29 ml/cm(3), respectively), indicating that the distribution of (11)C-DNP was the highest in the adrenal glands, except for that in the excretory organs. The AChE activity was the third highest in the adrenal glands (following the small intestine and the stomach) (24.9 ± 1.6, 83.1 ± 3.0, and 38.5 ± 8.1 mU/mg, respectively), indicating high activity of AChE in the adrenal glands. CONCLUSIONS: We demonstrated the whole-body distribution of (11)C-DNP by PET and the AChE activity in the major organs by fluorometric assay in the normal rat. High accumulation of (11)C-DNP was observed in the adrenal glands, which suggested the risk of enhanced cholinergic synaptic transmission by the use of AChE inhibitors.

  11. Global Dynamics of Infectious Disease with Arbitrary Distributed Infectious Period on Complex Networks

    Directory of Open Access Journals (Sweden)

    Xiaoguang Zhang

    2014-01-01

    Full Text Available Most current epidemic models assume that the infectious period follows an exponential distribution. However, due to individual heterogeneity and epidemic diversity, these models fail to describe the distribution of infectious periods precisely. We establish an SIS epidemic model with multistaged progression of infectious periods on complex networks, which can be used to characterize arbitrary distributions of infectious periods of the individuals. By mathematical analysis, the basic reproduction number R0 for the model is derived. We verify that R0 depends on the average distributions of infection periods for different types of infective individuals, which extends the general theory obtained from single-infectious-period epidemic models. It is proved that if R0<1, then the disease-free equilibrium is globally asymptotically stable; otherwise a unique endemic equilibrium exists and is globally asymptotically attractive. Finally, numerical simulations are given to verify the validity of our theoretical results.
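
The multistage (Erlang) infectious-period idea can be sketched in a mean-field setting — homogeneous mixing rather than the paper's network formulation, with all rates chosen as illustrative assumptions. Splitting the infectious class into n identical stages makes the total infectious period Erlang-distributed while preserving its mean, and the R0 = beta/gamma threshold behaves as the abstract describes:

```python
# Illustrative sketch (not the authors' network model): SIS dynamics with an
# Erlang (n-stage) infectious period under homogeneous mixing.
# All parameter values below are assumptions for illustration.

def simulate_sis(beta, gamma, n_stages, i0=0.01, dt=0.01, t_max=200.0):
    """Mean-field SIS with n identical infectious stages.

    Each stage is exited at rate n*gamma, so the total infectious period
    is Erlang(n, n*gamma) with mean 1/gamma.
    """
    s = 1.0 - i0
    stages = [i0 / n_stages] * n_stages
    rate = n_stages * gamma              # per-stage exit rate
    steps = int(t_max / dt)
    for _ in range(steps):
        infectious = sum(stages)
        new_inf = beta * s * infectious  # new infections
        recov = rate * stages[-1]        # recoveries from the last stage
        s += dt * (recov - new_inf)
        flows = [rate * x for x in stages]
        stages[0] += dt * (new_inf - flows[0])
        for k in range(1, n_stages):
            stages[k] += dt * (flows[k - 1] - flows[k])
    return sum(stages)                   # approximate endemic prevalence

gamma = 0.5                              # mean infectious period = 2
prev_sub = simulate_sis(beta=0.4, gamma=gamma, n_stages=3)    # R0 = 0.8 < 1
prev_super = simulate_sis(beta=1.0, gamma=gamma, n_stages=3)  # R0 = 2.0 > 1
```

Below threshold the infection dies out; above it the prevalence settles near the endemic value 1 - 1/R0, independent of the number of stages.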

  12. Simulation study of pO2 distribution in induced tumour masses and normal tissues within a microcirculation environment.

    Science.gov (United States)

    Li, Mao; Li, Yan; Wen, Peng Paul

    2014-01-01

    The biological microenvironment is disrupted when tumour masses are introduced because of the strong competition for oxygen. During the avascular growth period of tumours, pre-existing capillaries play a crucial role in supplying oxygen to both tumourous and healthy cells. Because the oxygen supply from capillaries is limited, healthy cells have to compete for oxygen with tumourous cells. In this study, an improved Krogh's cylinder model, which is more realistic than the previously reported assumption that oxygen is homogeneously distributed in a microenvironment, is proposed to describe the diffusion of oxygen from a capillary to its surrounding environment. The capillary wall permeability is also taken into account. Simulation results show that when tumour masses are implanted at the upstream part of a capillary and followed by normal tissues, the whole normal tissue suffers from hypoxia. In contrast, when normal tissues are ahead of tumour masses, their pO2 is sufficient. In both situations, the pO2 in the whole normal tissue drops significantly due to axial diffusion at the interface between normal tissues and tumourous cells. As axial oxygen diffusion cannot supply the whole tumour mass, only those tumourous cells near the interface can be partially supplied and have a small chance to survive.
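
The starting point of such models is the classical Krogh-Erlang solution for the steady-state radial pO2 profile around a single capillary with zero-order oxygen consumption. The sketch below is a minimal illustration of that baseline (it omits the paper's improvements such as wall permeability and axial diffusion); all parameter values are assumptions, not taken from the paper:

```python
import math

# Illustrative sketch of the classical Krogh-Erlang radial pO2 profile
# around a single capillary: steady state, zero-order O2 consumption.
# Parameter values are assumptions for illustration only.

def krogh_po2(r, p_cap=40.0, r_cap=3e-4, r_tissue=30e-4, m=3e-3, k=1.7e-9):
    """pO2 (mmHg) at radius r (cm) from the capillary axis.

    p_cap    : capillary pO2 (mmHg)
    r_cap    : capillary radius (cm)
    r_tissue : Krogh cylinder (tissue) radius (cm)
    m        : O2 consumption rate (ml O2 / ml tissue / s)
    k        : Krogh diffusion coefficient (ml O2 / cm / s / mmHg)
    """
    return p_cap - (m / (2.0 * k)) * (
        r_tissue**2 * math.log(r / r_cap) - (r**2 - r_cap**2) / 2.0
    )
```

The profile equals the capillary pO2 at the vessel wall and falls monotonically toward the edge of the tissue cylinder, which is why cells far from a capillary (or downstream of oxygen-hungry tumour cells) are the first to become hypoxic.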

  13. Spin fluctuations in liquid 3He: a strong-coupling calculation of T/sub c/ and the normal-state distribution function

    International Nuclear Information System (INIS)

    Fay, D.; Layzer, A.

    1975-01-01

    The Berk-Schrieffer method of strong-coupling superconductivity for nearly ferromagnetic systems is generalized to arbitrary L-state pairing and realistic (hard-core) potentials. Application to 3He yields a P-state transition but very low values of T/sub c/ and an unsatisfactory normal-state momentum distribution.

  14. Conformational distributions and proximity relationships in the rigor complex of actin and myosin subfragment-1.

    Science.gov (United States)

    Nyitrai, M; Hild, G; Lukács, A; Bódis, E; Somogyi, B

    2000-01-28

    Cyclic conformational changes in the myosin head are considered essential for muscle contraction. We hereby show that the extension of the fluorescence resonance energy transfer method described originally by Taylor et al. (Taylor, D. L., Reidler, J., Spudich, J. A., and Stryer, L. (1981) J. Cell Biol. 89, 362-367) allows determination of the position of a labeled point outside the actin filament in supramolecular complexes and also characterization of the conformational heterogeneity of an actin-binding protein while considering donor-acceptor distance distributions. Using this method we analyzed proximity relationships between two labeled points of S1 and the actin filament in the acto-S1 rigor complex. The donor (N-[[(iodoacetyl)amino]ethyl]-5-naphthylamine-1-sulfonate) was attached to either the catalytic domain (Cys-707) or the essential light chain (Cys-177) of S1, whereas the acceptor (5-(iodoacetamido)fluorescein) was attached to the actin filament (Cys-374). In contrast to the narrow positional distribution (assumed as being Gaussian) of Cys-707 (5 +/- 3 A), the positional distribution of Cys-177 was found to be broad (102 +/- 4 A). Such a broad positional distribution of the label on the essential light chain of S1 may be important in accommodating the helically arranged acto-myosin binding relative to the filament axis.

  15. New definition of complexity for self-gravitating fluid distributions: The spherically symmetric, static case

    Science.gov (United States)

    Herrera, L.

    2018-02-01

    We put forward a new definition of complexity for static and spherically symmetric self-gravitating systems, based on a quantity, hereafter referred to as the complexity factor, that appears in the orthogonal splitting of the Riemann tensor in the context of general relativity. We start by assuming that the homogeneous (in the energy density) fluid with isotropic pressure is endowed with minimal complexity; for this kind of fluid distribution, the value of the complexity factor is zero. The rationale behind our proposal for the definition of the complexity factor stems from the fact that it measures the departure, in the value of the active gravitational mass (Tolman mass), from its value for a zero complexity system. Such departure is produced by a specific combination of energy density inhomogeneity and pressure anisotropy. Thus, a zero complexity factor may also be found in self-gravitating systems with inhomogeneous energy density and anisotropic pressure, provided the effects of these two factors on the complexity factor cancel each other. Some exact interior solutions to the Einstein equations satisfying the zero complexity criterion are found, and prospective applications of this newly defined concept to the study of the structure and evolution of compact objects are discussed.

  16. Detecting and correcting for publication bias in meta-analysis - A truncated normal distribution approach.

    Science.gov (United States)

    Zhu, Qiaohao; Carriere, K C

    2016-01-01

    Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on detection and correction of publication bias in meta-analysis focuses on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish two major situations in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models, and derive estimators for the overall mean and the truncation proportion. These estimators are obtained using maximum likelihood estimation and the method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology and to compare it with the non-parametric Trim and Fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and in correcting publication bias under various situations.
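
The truncation idea can be sketched in a few lines — a minimal illustration, not the authors' estimator: effects below a known cutoff c never get "published", the naive mean of the published studies is biased upward, and (mu, sigma) of the underlying normal can be recovered by maximum likelihood on the left-truncated normal (here via a deliberately coarse grid search; all numbers are assumptions):

```python
import math, random

# Minimal sketch (not the paper's method): publication bias as left-truncation
# at a known cutoff c, corrected by truncated-normal maximum likelihood.

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def trunc_loglik(sample, mu, sigma, c):
    """Log-likelihood of N(mu, sigma) observations left-truncated at c."""
    tail = 1.0 - norm_cdf((c - mu) / sigma)
    if tail <= 0.0:
        return float("-inf")
    ll = 0.0
    for x in sample:
        z = (x - mu) / sigma
        ll += -0.5 * z * z - math.log(sigma) - math.log(tail)
    return ll

random.seed(1)
c = 0.0                                      # truncation point (assumed known)
full = [random.gauss(0.3, 1.0) for _ in range(2000)]
published = [x for x in full if x > c]       # the biased, published sample

naive_mean = sum(published) / len(published) # biased well above 0.3

# Coarse grid search over (mu, sigma) for the truncated-normal MLE.
grid = [(m * 0.05, 0.5 + s * 0.05) for m in range(-20, 21) for s in range(31)]
mu_hat, sig_hat = max(grid, key=lambda p: trunc_loglik(published, p[0], p[1], c))
```

The corrected estimate mu_hat falls back near the true underlying mean, well below the naive mean of the published sample.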

  17. Mixed normal inference on multicointegration

    NARCIS (Netherlands)

    Boswijk, H.P.

    2009-01-01

    Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ2 null distribution. The asymptotic distribution of the

  18. Evaluation of the Weibull and log normal distribution functions as survival models of Escherichia coli under isothermal and non-isothermal conditions.

    Science.gov (United States)

    Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2007-11-01

    Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semi-logarithmic coordinates. Some also exhibited what appeared to be a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature, at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
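
The Weibullian survival model referred to above is S(t) = exp(-(t/b)^n), i.e. log10 S(t) = -(t/b)^n / ln(10); with shape n > 1 the semi-logarithmic curve is concave downward and the underlying death-time distribution is unimodal. A short numeric sketch with illustrative (assumed) parameters:

```python
import math

# Sketch of the Weibullian survival model S(t) = exp(-(t/b)^n).
# b (scale) and n (shape) below are illustrative assumptions.

b, n = 10.0, 2.0

def log10_survival(t):
    """log10 of the survival ratio at time t."""
    return -((t / b) ** n) / math.log(10.0)

# Characteristics of the corresponding Weibull death-time distribution:
mode = b * ((n - 1.0) / n) ** (1.0 / n)   # b*sqrt(1/2) ~ 7.07 for n = 2
mean = b * math.gamma(1.0 + 1.0 / n)      # b*Gamma(1.5) ~ 8.86 for n = 2

# Downward concavity on semi-log coordinates: successive equal time steps
# remove ever-larger amounts of log10 survival.
d1 = log10_survival(5) - log10_survival(0)
d2 = log10_survival(10) - log10_survival(5)
```

Since |d2| > |d1|, the curve steepens with time, reproducing the characteristic concavity; a long 'shoulder' corresponds to a standard deviation much smaller than the mode, as the abstract notes.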

  19. Sensing of nucleosides, nucleotides and DNA using luminescent Eu complex by normal and time resolved fluorescence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Azab, Hassan A.; Anwar, Zeinab M. [Chemistry Department, Faculty of Science, Suez Canal University, 41522 Ismailia (Egypt); Kamel, Rasha M., E-mail: rashamoka@yahoo.com [Chemistry Department, Faculty of Science, Suez University, 43518 Suez (Egypt); Rashwan, Mai S. [Chemistry Department, Faculty of Science, Suez Canal University, 41522 Ismailia (Egypt)

    2016-01-15

    The interaction of the Eu-1,4,7,10-tetraazacyclododecane (cyclen) complex, using 4,4,4-trifluoro-1-(2-naphthyl)-1,3-butanedione (TNB) as an antenna, with some nucleosides (guanosine, adenosine, cytidine and inosine), nucleotides (AMP, GMP, CMP, ATP and IMP) and DNA is studied using fluorescence techniques. Two detection modes are employed: one is the time-resolved mode, and the other is the normal luminescence mode. The time-resolved mode is more sensitive than the normal luminescence mode in the present study. Using the Benesi–Hildebrand equation, binding constants were determined at various temperatures. Thermodynamic parameters showed that the reaction is spontaneous, as indicated by the negative values of the free energy change ΔG. The enthalpy ΔH and the entropy ΔS of the reactions were also determined. - Highlights: • This is an application for the detection of biologically important ligands. • The detection limits, binding constants and thermodynamic parameters were evaluated. • Effect of some interferents on the detection of DNA has been investigated.

  20. Sensing of nucleosides, nucleotides and DNA using luminescent Eu complex by normal and time resolved fluorescence techniques

    International Nuclear Information System (INIS)

    Azab, Hassan A.; Anwar, Zeinab M.; Kamel, Rasha M.; Rashwan, Mai S.

    2016-01-01

    The interaction of the Eu-1,4,7,10-tetraazacyclododecane (cyclen) complex, using 4,4,4-trifluoro-1-(2-naphthyl)-1,3-butanedione (TNB) as an antenna, with some nucleosides (guanosine, adenosine, cytidine and inosine), nucleotides (AMP, GMP, CMP, ATP and IMP) and DNA is studied using fluorescence techniques. Two detection modes are employed: one is the time-resolved mode, and the other is the normal luminescence mode. The time-resolved mode is more sensitive than the normal luminescence mode in the present study. Using the Benesi–Hildebrand equation, binding constants were determined at various temperatures. Thermodynamic parameters showed that the reaction is spontaneous, as indicated by the negative values of the free energy change ΔG. The enthalpy ΔH and the entropy ΔS of the reactions were also determined. - Highlights: • This is an application for the detection of biologically important ligands. • The detection limits, binding constants and thermodynamic parameters were evaluated. • Effect of some interferents on the detection of DNA has been investigated.

  1. Radio metal (169Yb) uptake in normal and tumour cells in vitro. Influence of metabolic cell activity and complex structure

    International Nuclear Information System (INIS)

    Franke, W.G.; Kampf, G.

    1996-01-01

    Trivalent radio metal tracers have been used for tumour imaging and metastatic pain palliation. For a better understanding of their tumour accumulation, basic model studies of the uptake of different 169Yb complexes into cultured normal and tumour cells were performed. Whereas the uptake of 169Yb citrate is strongly dependent on the metabolic activity and is not tumour-cell specific, the uptake of 169Yb complexed with amino carbonic acids (NTA, EDTA, DTPA) does not correlate with the metabolic activities. These complexes are taken up to a greater extent by the tumour cells (by a factor of about 2). Uptake of both complex types leads to a stable association with cellular compounds; 169Yb is not releasable by the strong complexing agent DTPA. Protein binding of the 169Yb complexes has a great influence on their cellular uptake: the bound proportion is no longer available for cellular uptake. The results indicate that (i) the uptake of 169Yb citrate is an active cellular transport process which is not tumour-specific, (ii) the 169Yb amino carbonic acid complexes are weakly favoured by the tumour cells, and (iii) contrary to earlier assumptions, the Yb complexes studied are not taken up by the cells in protein-bound form. The structure of the Yb complex is decisive for its protein binding and cellular uptake. (author). 13 refs., 6 figs

  2. [Calbindin and parvalbumin distribution in spinal cord of normal and rabies-infected mice].

    Science.gov (United States)

    Monroy-Gómez, Jeison; Torres-Fernández, Orlando

    2013-01-01

    Rabies is a fatal infectious disease of the nervous system; however, knowledge about the pathogenic neural mechanisms of rabies is scarce. In addition, there are few studies of rabies pathology of the spinal cord. The aim of this study was to describe the distribution of the calcium-binding proteins calbindin and parvalbumin and to assess the effect of rabies virus infection on their expression in the spinal cord of mice. MATERIALS AND METHODS: Mice were inoculated with rabies virus by the intracerebral or intramuscular route. The spinal cord was extracted to perform crosscuts, which were treated by immunohistochemistry with monoclonal antibodies to reveal the presence of the two proteins in normal and rabies-infected mice. We performed qualitative and quantitative analyses of the immunoreactivity of the two proteins. Calbindin and parvalbumin showed differential distribution in the Rexed laminae. Rabies infection produced a decrease in the expression of calbindin. In contrast, the infection caused an increased expression of parvalbumin. The effect of rabies infection on the expression of the two proteins was similar when comparing both routes of inoculation. The differential effect of rabies virus infection on the expression of calbindin and parvalbumin in the spinal cord of mice was similar to that previously reported for brain areas. This result suggests uniformity in the response to rabies infection throughout the central nervous system. This is an important contribution to the understanding of the pathogenesis of rabies.

  3. Application of «Sensor signal analysis network» complex for distributed, time synchronized analysis of electromagnetic radiation

    Science.gov (United States)

    Mochalov, Vladimir; Mochalova, Anastasia

    2017-10-01

    The paper considers the software-hardware complex «Sensor signal analysis network», under development for distributed, time-synchronized analysis of electromagnetic radiation. The areas of application and the main features of the complex are described. An example of applying the complex to monitoring natural electromagnetic radiation sources is considered, based on data recorded in the VLF range. A generalized functional scheme for stream analysis of signals by a complex functional node is suggested, and its application to stream detection of atmospherics, whistlers and tweaks is considered.

  4. Distribution and ultrastructure of pigment cells in the skins of normal and albino adult turbot, Scophthalmus Maximus

    Institute of Scientific and Technical Information of China (English)

    GUO Huarong; HUANG Bing; QI Fei; ZHANG Shicui

    2007-01-01

    The distribution and ultrastructure of pigment cells in the skins of normal and albino adult turbots were examined with transmission electron microscopy (TEM). Three types of pigment cells, melanophores, iridophores and xanthophores, were recognized in adult turbot skins. The skin color depends mainly on the amount and distribution of melanophores and iridophores, as xanthophores are quite rare. No pigment cells were found in the epidermis of the skins. In the pigmented ocular skin of the turbot, melanophores and iridophores are usually co-localized in the dermis. This is quite different from their distribution in larval skin. In the albino and white blind-side skins of adult turbots, however, only the iridophore monolayer persists, while the melanophore monolayer disappears. This cytological evidence explains why the albino adult turbot, unlike its larvae, can never regain its body color no matter what environmental and nutritional conditions are provided. Endocytosis is quite active in the cell membrane of the iridophore, which might be related to the formation of reflective platelets and the stability of the iridophore.

  5. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot-based (graphical) tests and of the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
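
The idea of interval-augmented probability plots can be sketched by Monte Carlo — note this toy version builds *pointwise* acceptance bands for each order statistic, not the authors' simultaneous 1-α construction (whose joint coverage is exactly the point of the paper); sample size, repetition count and level are assumptions:

```python
import random

# Monte Carlo sketch (not the authors' exact construction): pointwise
# acceptance bands for the order statistics of a standard normal sample.
# n, reps and alpha are illustrative assumptions.

random.seed(7)
n, reps, alpha = 20, 2000, 0.05

# Simulate many sorted standard-normal samples of size n.
sims = [sorted(random.gauss(0, 1) for _ in range(n)) for _ in range(reps)]

# For each plotting position i, take empirical alpha/2 and 1-alpha/2
# quantiles of the i-th order statistic across simulations.
lower, upper = [], []
for i in range(n):
    col = sorted(s[i] for s in sims)
    lower.append(col[int(reps * (alpha / 2))])
    upper.append(col[int(reps * (1 - alpha / 2)) - 1])

# A standardized sample can then be overlaid on (lower, upper): points
# escaping the bands suggest non-normality.
sample = sorted(random.gauss(0, 1) for _ in range(n))
inside = all(lo <= x <= hi for lo, x, hi in zip(lower, sample, upper))
```

Because the bands here are pointwise, the probability that *all* n points fall inside is below 1-α; widening them until the joint coverage reaches 1-α is what yields the simultaneous intervals the paper studies.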

  6. Best Statistical Distribution of flood variables for Johor River in Malaysia

    Science.gov (United States)

    Salarpour Goodarzi, M.; Yusop, Z.; Yusof, F.

    2012-12-01

    A complex flood event is always characterized by a few characteristics, such as flood peak, flood volume and flood duration, which might be mutually correlated. This study explored the statistical distributions of peakflow, flood duration and flood volume at the Rantau Panjang gauging station on the Johor River in Malaysia. Hourly data were recorded for 45 years and analysed by water year (July-June). Five distributions, namely Log Normal, Generalized Pareto, Log Pearson, Normal and Generalized Extreme Value (GEV), were used to model the distribution of each of the three variables. Anderson-Darling and Kolmogorov-Smirnov goodness-of-fit tests were used to evaluate the best fit. Goodness-of-fit tests at the 5% level of significance indicate that all the models can be used to model the distributions of peakflow, flood duration and flood volume. However, the Generalized Pareto distribution is found to be the most suitable model when tested with the Anderson-Darling test, while the Kolmogorov-Smirnov test suggests that GEV is the best for peakflow. The results of this research can be used to improve flood frequency analysis. (Figure: comparison of the Generalized Extreme Value, Generalized Pareto and Log Pearson cumulative distribution functions of peakflow.)
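
The model-selection step described above — fit several candidate distributions, then rank them by a goodness-of-fit statistic — can be sketched with a one-sample Kolmogorov-Smirnov comparison. Synthetic data stand in for the Johor River series, and only two of the five candidate families are shown; everything here is an illustrative assumption:

```python
import math, random

# Minimal sketch of distribution selection by the one-sample KS statistic
# (smaller = better fit). Synthetic "peakflow" data, illustrative only.

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_stat(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic D = sup |F_n - F|."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs(f - i / n), abs(f - (i + 1) / n))
    return d

random.seed(3)
# Right-skewed synthetic peakflow series (log-normal by construction).
data = [math.exp(random.gauss(5.0, 0.6)) for _ in range(300)]

# Candidate 1: Normal, fitted by moments.
mu = sum(data) / len(data)
sd = (sum((x - mu) ** 2 for x in data) / (len(data) - 1)) ** 0.5
d_norm = ks_stat(data, lambda x: norm_cdf(x, mu, sd))

# Candidate 2: Log Normal, fitted by moments of log(data).
logs = [math.log(x) for x in data]
lmu = sum(logs) / len(logs)
lsd = (sum((v - lmu) ** 2 for v in logs) / (len(logs) - 1)) ** 0.5
d_lnorm = ks_stat(data, lambda x: norm_cdf(math.log(x), lmu, lsd))
```

The log-normal candidate achieves the smaller KS statistic on this skewed sample, mirroring how the study ranks its five candidate distributions per variable.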

  7. Significant effect of topographic normalization of airborne LiDAR data on the retrieval of plant area index profile in mountainous forests

    Science.gov (United States)

    Liu, Jing; Skidmore, Andrew K.; Heurich, Marco; Wang, Tiejun

    2017-10-01

    As an important metric for describing vertical forest structure, the plant area index (PAI) profile is used for many applications including biomass estimation and wildlife habitat assessment. PAI profiles can be estimated with the vertically resolved gap fraction from airborne LiDAR data. Most research utilizes a height normalization algorithm to retrieve local or relative height by assuming the terrain to be flat. However, for many forests this assumption is not valid. In this research, the effect of topographic normalization of airborne LiDAR data on the retrieval of PAI profile was studied in a mountainous forest area in Germany. Results show that, although individual tree height may be retained after topographic normalization, the spatial arrangement of trees is changed. Specifically, topographic normalization vertically condenses and distorts the PAI profile, which consequently alters the distribution pattern of plant area density in space. This effect becomes more evident as the slope increases. Furthermore, topographic normalization may also undermine the complexity (i.e., canopy layer number and entropy) of the PAI profile. The decrease in PAI profile complexity is not solely determined by local topography, but is determined by the interaction between local topography and the spatial distribution of each tree. This research demonstrates that when calculating the PAI profile from airborne LiDAR data, local topography needs to be taken into account. We therefore suggest that for ecological applications, such as vertical forest structure analysis and modeling of biodiversity, topographic normalization should not be applied in non-flat areas when using LiDAR data.

  8. TOTAL NUMBER, DISTRIBUTION, AND PHENOTYPE OF CELLS EXPRESSING CHONDROITIN SULPHATE PROTEOGLYCANS IN THE NORMAL HUMAN AMYGDALA

    Science.gov (United States)

    Pantazopoulos, Harry; Murray, Elisabeth A.; Berretta, Sabina

    2009-01-01

    Chondroitin sulphate proteoglycans (CSPGs) are a key structural component of the brain extracellular matrix. They are involved in critical neurodevelopmental functions and are one of the main components of pericellular aggregates known as perineuronal nets. As a step toward investigating their functional and pathophysiological roles in the human amygdala, we assessed the pattern of CSPG expression in the normal human amygdala using Wisteria floribunda agglutinin (WFA) lectin-histochemistry. Total numbers of WFA-labeled elements were measured in the lateral (LN), basal (BN), accessory basal (ABN) and cortical (CO) nuclei of the amygdala from 15 normal adult human subjects. For interspecies qualitative comparison, we also investigated the pattern of WFA labeling in the amygdala of naïve rats (n=32) and rhesus monkeys (Macaca mulatta; n=6). In the human amygdala, WFA lectin-histochemistry resulted in labeling of perineuronal nets and cells with clear glial morphology, while neurons did not show WFA labeling. Total numbers of WFA-labeled glial cells showed high interindividual variability. These cells aggregated in clusters with a consistent between-subjects spatial distribution. In a subset of human subjects (n=5), dual color fluorescence using WFA and an antibody raised against glial fibrillary acidic protein (GFAP) showed that the majority (93.7%) of WFA-labeled glial cells correspond to astrocytes. In rat and monkey amygdala, WFA histochemistry labeled perineuronal nets, but not glial cells. These results suggest that astrocytes are the main cell type expressing CSPGs in the adult human amygdala. Their highly segregated distribution pattern suggests that these cells serve specialized functions within human amygdalar nuclei. PMID:18374308

  9. Impact of distributions on the archetypes and prototypes in heterogeneous nanoparticle ensembles.

    Science.gov (United States)

    Fernandez, Michael; Wilson, Hugh F; Barnard, Amanda S

    2017-01-05

    The magnitude and complexity of the structural and functional data available on nanomaterials require data analytics, statistical analysis and information technology to drive discovery. We demonstrate that multivariate statistical analysis can recognise the sets of truly significant nanostructures and their most relevant properties in heterogeneous ensembles with different probability distributions. The prototypical and archetypal nanostructures of five virtual ensembles of Si quantum dots (SiQDs) with Boltzmann, frequency, normal, Poisson and random distributions are identified using clustering and archetypal analysis, where we find that their diversity is defined by size and shape, regardless of the type of distribution. At the convex hull of the SiQD ensembles, simple configuration archetypes can efficiently describe a large number of SiQDs, whereas more complex shapes are needed to represent the average ordering of the ensembles. This approach provides a route towards the characterisation of computationally intractable virtual nanomaterial spaces, which can convert big data into smart data, and significantly reduce the workload to simulate experimentally relevant virtual samples.
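
The distinction between prototypes (cluster centres, representing the average ensemble member) and archetypes (extreme members on the convex hull) can be illustrated on a toy two-descriptor ensemble. This is a hypothetical sketch, not the paper's workflow: k-means stands in for the clustering step, and the "archetypes" are crudely taken as the descriptor extremes rather than from true archetypal analysis:

```python
import random

# Hypothetical sketch: prototypes vs archetypes of a synthetic "nanoparticle"
# ensemble described by two assumed descriptors (size, shape anisotropy).

random.seed(11)
ensemble = [(random.gauss(5.0, 1.0), random.gauss(0.5, 0.1))
            for _ in range(400)]

def kmeans(points, k=2, iters=25):
    """Plain Lloyd's algorithm; returns the k cluster centres (prototypes)."""
    centers = points[:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: (p[0] - centers[c][0]) ** 2 +
                                            (p[1] - centers[c][1]) ** 2)
            groups[j].append(p)
        centers = [(sum(x for x, _ in g) / len(g),
                    sum(y for _, y in g) / len(g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

prototypes = kmeans(ensemble)

# Crude archetypes: extreme ensemble members along each descriptor
# (true archetypal analysis instead fits points on the convex hull).
archetypes = [min(ensemble), max(ensemble),
              min(ensemble, key=lambda p: p[1]),
              max(ensemble, key=lambda p: p[1])]
```

Prototypes land near the ensemble average, whereas archetypes sit at the boundary of descriptor space, which is why a handful of simple extreme configurations can span a very large number of intermediate structures.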

  10. The normal and abnormal distribution of the adrenomedullary imaging agent m-[I-131]iodobenzylguanidine (I-131 MIBG) in man: evaluation by scintigraphy

    International Nuclear Information System (INIS)

    Nakajo, M.; Shapiro, B.; Copp, J.; Kalff, V.; Gross, M.D.; Sisson, J.C.; Beierwaltes, W.H.

    1983-01-01

    The scintigraphic distribution of m-[131I]iodobenzylguanidine (131I-MIBG), an adrenal medullary imaging agent, was studied to determine the patterns of uptake of this agent in man. The normal distribution of 131I-MIBG includes clear portrayal of the salivary glands, liver, spleen, and urinary bladder. The heart, middle and lower lung zones, and colon were less frequently or less clearly seen. The upper lung zones and kidneys were seldom visualized. The thyroid appeared only in cases of inadequate thyroidal blockade. The 'normal' adrenal glands were seldom seen, being faintly imaged in 2% of patients at 24 hr after injection and in 16% at 48 hr among patients shown not to have pheochromocytomas, whereas intra-adrenal, extra-adrenal, and malignant pheochromocytomas usually appeared as intense focal areas of 131I-MIBG uptake at 24 through 72 hr

  11. Advection-diffusion model for normal grain growth and the stagnation of normal grain growth in thin films

    International Nuclear Information System (INIS)

    Lou, C.

    2002-01-01

    An advection-diffusion model has been set up to describe normal grain growth. In this model grains are divided into different groups according to their topological classes (number of sides of a grain). Topological transformations are modelled by advective and diffusive flows governed by advective and diffusive coefficients respectively, which are assumed to be proportional to topological classes. The ordinary differential equations governing self-similar time-independent grain size distribution can be derived analytically from continuity equations. It is proved that the time-independent distributions obtained by solving the ordinary differential equations have the same form as the time-dependent distributions obtained by solving the continuity equations. The advection-diffusion model is extended to describe the stagnation of normal grain growth in thin films. Grain boundary grooving prevents grain boundaries from moving, and the correlation between neighbouring grains accelerates the stagnation of normal grain growth. After introducing grain boundary grooving and the correlation between neighbouring grains into the model, the grain size distribution is close to a lognormal distribution, which is usually found in experiments. A vertex computer simulation of normal grain growth has also been carried out to make a cross comparison with the advection-diffusion model. The result from the simulation did not verify the assumption that the advective and diffusive coefficients are proportional to topological classes. Instead, we have observed that topological transformations usually occur on certain topological classes. This suggests that the advection-diffusion model can be improved by making a more realistic assumption on topological transformations. (author)

  12. Simulating the Daylight Performance of Complex Fenestration Systems Using Bidirectional Scattering Distribution Functions within Radiance

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Gregory; Mistrick, Ph.D., Richard; Lee, Eleanor; McNeil, Andrew; Jonsson, Ph.D., Jacob

    2011-01-21

    We describe two methods which rely on bidirectional scattering distribution functions (BSDFs) to model the daylighting performance of complex fenestration systems (CFS), enabling greater flexibility and accuracy in evaluating arbitrary assemblies of glazing, shading, and other optically-complex coplanar window systems. Two tools within Radiance enable a) efficient annual performance evaluations of CFS, and b) accurate renderings of CFS despite the loss of spatial resolution associated with low-resolution BSDF datasets for inhomogeneous systems. Validation, accuracy, and limitations of the methods are discussed.

  13. Cellular complexity in subcortical white matter: a distributed control circuit?

    Science.gov (United States)

    Colombo, Jorge A

    2018-03-01

    The subcortical white matter (SWM) has been traditionally considered a site for passive, neutral information transfer through cerebral cortex association and projection fibers. Yet the presence of subcortical neuronal and glial "interstitial" cells expressing immunolabelled neurotransmitters/neuromodulators and synaptic vesicular proteins, together with recent immunohistochemical and electrophysiological observations on the rat visual cortex and the interactive regulation of myelinating processes, supports the possibility that the SWM nests subcortical, regionally variable, distributed neuronal-glial circuits that could influence information transfer. Their hypothetical involvement in regulating the timing and signal transfer probability of the SWM axonal components ought to be considered and experimentally analysed. Thus, the "interstitial" neuronal cells, associated with local glial cells and traditionally considered vestigial and functionally inert under normal conditions, may well turn out to be critical in regulating information transfer in the SWM.

  14. Distributed Low-Complexity Controller for Wind Power Plant in Derated Operation

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Madjidian, Daria; Spudic, Vedrana

    2013-01-01

    We consider a wind power plant of megawatt wind turbines operating in derated mode. When operating in this mode, the wind power plant controller is free to distribute power set-points to the individual turbines, as long as the total power demand is met. In this work, we design a controller...... that exploits this freedom to reduce the fatigue on the turbines in the wind power plant. We show that the controller can be designed in a decentralized manner, such that each wind turbine is equipped with a local low-complexity controller relying only on few measurements and little communication. As a basis...... for the controller design, a linear wind turbine model is constructed and verified in an operational wind power plant of megawatt turbines. Due to limitations of the wind power plant available for tests, it is not possible to implement the developed controller; instead the final distributed controller is evaluated...

  15. Some case studies of skewed (and other ab-normal) data distributions arising in low-level environmental research

    International Nuclear Information System (INIS)

    Currie, L.A.

    2001-01-01

    Three general classes of skewed data distributions have been encountered in research on background radiation, chemical and radiochemical blanks, and low levels of 85Kr and 14C in the atmosphere and the cryosphere. The first class of skewed data can be considered to be theoretically, or fundamentally skewed. It is typified by the exponential distribution of inter-arrival times for nuclear counting events for a Poisson process. As part of a study of the nature of low-level (anti-coincidence) Geiger-Müller counter background radiation, tests were performed on the Poisson distribution of counts, the uniform distribution of arrival times, and the exponential distribution of inter-arrival times. The real laboratory system, of course, failed the (inter-arrival time) test - for very interesting reasons, linked to the physics of the measurement process. The second, computationally skewed, class relates to skewness induced by non-linear transformations. It is illustrated by non-linear concentration estimates from inverse calibration, and bivariate blank corrections for low-level 14C-12C aerosol data that led to highly asymmetric uncertainty intervals for the biomass carbon contribution to urban "soot". The third, environmentally skewed, data class relates to a universal problem for the detection of excursions above blank or baseline levels: namely, the widespread occurrence of ab-normal distributions of environmental and laboratory blanks. This is illustrated by the search for fundamental factors that lurk behind skewed frequency distributions of sulfur laboratory blanks and 85Kr environmental baselines, and the application of robust statistical procedures for reliable detection decisions in the face of skewed isotopic carbon procedural blanks with few degrees of freedom. (orig.)
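
    The first, fundamentally skewed, class can be illustrated with a short simulation (a sketch under invented parameters, not the paper's analysis): inter-arrival times of an idealised Poisson counting process are exponential, and counts in fixed windows are Poisson. The count rate, sample size and window length below are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rate = 0.5                  # assumed mean count rate (events per second)
n_events = 10_000

# Inter-arrival times of a homogeneous Poisson process are exponential(1/rate).
gaps = rng.exponential(scale=1.0 / rate, size=n_events)
arrivals = np.cumsum(gaps)

# Kolmogorov-Smirnov test of the exponential hypothesis on the gaps.
d_stat, p_value = stats.kstest(gaps, "expon", args=(0, 1.0 / rate))
print(f"KS D = {d_stat:.4f}, p = {p_value:.3f}")

# Counts in fixed full-length windows should be Poisson with mean rate * window.
window = 60.0
counts = np.histogram(arrivals, bins=np.arange(0.0, arrivals[-1], window))[0]
print("mean count per window:", counts.mean(), "expected:", rate * window)
```

    A simulation like this gives the idealised baseline; the real counter in the study failed the inter-arrival test for physical reasons, which is exactly the kind of departure such a comparison is meant to expose.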

  16. Some case studies of skewed (and other ab-normal) data distributions arising in low-level environmental research.

    Science.gov (United States)

    Currie, L A

    2001-07-01

    Three general classes of skewed data distributions have been encountered in research on background radiation, chemical and radiochemical blanks, and low levels of 85Kr and 14C in the atmosphere and the cryosphere. The first class of skewed data can be considered to be theoretically, or fundamentally skewed. It is typified by the exponential distribution of inter-arrival times for nuclear counting events for a Poisson process. As part of a study of the nature of low-level (anti-coincidence) Geiger-Müller counter background radiation, tests were performed on the Poisson distribution of counts, the uniform distribution of arrival times, and the exponential distribution of inter-arrival times. The real laboratory system, of course, failed the (inter-arrival time) test - for very interesting reasons, linked to the physics of the measurement process. The second, computationally skewed, class relates to skewness induced by non-linear transformations. It is illustrated by non-linear concentration estimates from inverse calibration, and bivariate blank corrections for low-level 14C-12C aerosol data that led to highly asymmetric uncertainty intervals for the biomass carbon contribution to urban "soot". The third, environmentally skewed, data class relates to a universal problem for the detection of excursions above blank or baseline levels: namely, the widespread occurrence of ab-normal distributions of environmental and laboratory blanks. This is illustrated by the search for fundamental factors that lurk behind skewed frequency distributions of sulfur laboratory blanks and 85Kr environmental baselines, and the application of robust statistical procedures for reliable detection decisions in the face of skewed isotopic carbon procedural blanks with few degrees of freedom.

  17. Quantification of differences between nailfold capillaroscopy images with a scleroderma pattern and normal pattern using measures of geometric and algorithmic complexity.

    Science.gov (United States)

    Urwin, Samuel George; Griffiths, Bridget; Allen, John

    2017-02-01

    This study aimed to quantify and investigate differences in the geometric and algorithmic complexity of the microvasculature in nailfold capillaroscopy (NFC) images displaying a scleroderma pattern and those displaying a 'normal' pattern. 11 NFC images were qualitatively classified by a capillary specialist as indicative of 'clear microangiopathy' (CM), i.e. a scleroderma pattern, and 11 as 'not clear microangiopathy' (NCM), i.e. a 'normal' pattern. Pre-processing was performed, and fractal dimension (FD) and Kolmogorov complexity (KC) were calculated following image binarisation. FD and KC were compared between groups, and a k-means cluster analysis (n = 2) was performed on all images, without prior knowledge of the group assigned to them (i.e. CM or NCM), using FD and KC as inputs. CM images had significantly reduced FD and KC compared to NCM images, and the cluster analysis suggested that quantitative classification of images into CM and NCM groups is possible using the mathematical measures of FD and KC. The analysis techniques used show promise for quantitative microvascular investigation in patients with systemic sclerosis.

  18. Similar distributions of repaired sites in chromatin of normal and xeroderma pigmentosum variant cells damaged by ultraviolet light

    International Nuclear Information System (INIS)

    Cleaver, J.E.

    1979-01-01

    Excision repair of damage from ultraviolet light in both normal and xeroderma pigmentosum variant fibroblasts at early times after irradiation occurred preferentially in regions of DNA accessible to micrococcal nuclease digestion. These regions are predominantly the linker regions between nucleosomes in chromatin. The alterations reported at polymerization and ligation steps of excision repair in the variant are therefore not associated with changes in the relative distributions of repair sites in linker and core particle regions of DNA. (Auth.)

  19. Real-Time Reactive Power Distribution in Microgrids by Dynamic Programing

    DEFF Research Database (Denmark)

    Levron, Yoash; Beck, Yuval; Katzir, Liran

    2017-01-01

    In this paper a new real-time optimization method for reactive power distribution in microgrids is proposed. The method enables location of a globally optimal distribution of reactive power under normal operating conditions. The method exploits the typical compact structure of microgrids to obtain...... combination of reactive powers, by means of dynamic programming. Since every single step involves a one-dimensional problem, the complexity of the solution is only linear with the number of clusters, and as a result, a globally optimal solution may be obtained in real time. The paper includes the results...

  20. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1978-01-01

    This paper addresses the problem of obtaining numerically maximum-likelihood estimates of the parameters for a mixture of normal distributions. In recent literature, a certain successive-approximations procedure, based on the likelihood equations, was shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, we introduce a general iterative procedure, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. We show that, with probability 1 as the sample size grows large, this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. We also show that the step-size which yields optimal local convergence rates for large samples is determined in a sense by the 'separation' of the component normal densities and is bounded below by a number between 1 and 2.
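
    The iteration described above can be sketched for the simplest case, a one-dimensional two-component mixture. This is an illustrative reconstruction, not the authors' code; the data, starting values and step size are invented. With step = 1 the scheme reduces to the familiar EM successive-approximations procedure.

```python
import numpy as np

def em_step(x, pi, mu, sigma):
    """One step-size-1 (EM) update for a 1-D two-component normal mixture."""
    # E-step: posterior responsibility of component 0 for each observation.
    pdf = lambda m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    w1 = pi * pdf(mu[0], sigma[0])
    w2 = (1 - pi) * pdf(mu[1], sigma[1])
    r = w1 / (w1 + w2)
    # M-step: weighted maximum-likelihood updates.
    pi_new = r.mean()
    mu_new = np.array([np.average(x, weights=r), np.average(x, weights=1 - r)])
    sig_new = np.array([np.sqrt(np.average((x - mu_new[0]) ** 2, weights=r)),
                        np.sqrt(np.average((x - mu_new[1]) ** 2, weights=1 - r))])
    return pi_new, mu_new, sig_new

def relaxed_em(x, pi, mu, sigma, step=1.0, iters=200):
    """Deflected-gradient iteration: new = old + step * (EM(old) - old).
    The paper's result concerns local convergence for 0 < step < 2."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    for _ in range(iters):
        pi1, mu1, sig1 = em_step(x, pi, mu, sigma)
        pi = pi + step * (pi1 - pi)
        mu = mu + step * (mu1 - mu)
        sigma = sigma + step * (sig1 - sigma)
    return pi, mu, sigma

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 600), rng.normal(3, 1, 400)])
pi, mu, sigma = relaxed_em(x, 0.5, [-1.0, 1.0], [1.0, 1.0], step=1.0)
print(pi, mu, sigma)
```

    Leaving step as a parameter lets one try values between 1 and 2, the range in which the abstract's result places the optimal step size when the component densities are not well separated.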

  1. Comparative pharmacokinetics and tissue distribution profiles of lignan components in normal and hepatic fibrosis rats after oral administration of Fuzheng Huayu recipe.

    Science.gov (United States)

    Yang, Tao; Liu, Shan; Zheng, Tian-Hui; Tao, Yan-Yan; Liu, Cheng-Hai

    2015-05-26

    Fuzheng Huayu recipe (FZHY) is formulated on the basis of Chinese medicine theory for treating liver fibrosis. The aim was to illuminate the influence of the pathological state of liver fibrosis on the pharmacokinetics and tissue distribution profiles of lignan components from FZHY. Male Wistar rats were randomly divided into a normal group and a hepatic fibrosis group (induced by dimethylnitrosamine). Six lignan components were detected and quantified by ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) in the plasma and tissues of normal and hepatic fibrosis rats. A rapid, sensitive and convenient UHPLC-MS/MS method was successfully developed for the simultaneous determination of the six lignan components in different rat biological samples. After oral administration of FZHY at a dose of 15 g/kg, the pharmacokinetic behaviors of schizandrin A (SIA), schizandrin B (SIB), schizandrin C (SIC), schisandrol A (SOA), schisandrol B (SOB) and schisantherin A (STA) were significantly changed in hepatic fibrosis rats compared with normal rats, with AUC(0-t) values increased by 235.09%, 388.44%, 223.30%, 669.30%, 295.08% and 267.63%, respectively (P < 0.05). Tissue distribution results showed that the amounts of SIA, SIB, SOA and SOB were significantly increased in the heart, lung, spleen and kidney of hepatic fibrosis rats compared with normal rats at most time points (P < 0.05), reflecting an altered distribution of lignan components between normal and hepatic fibrosis rats. Hepatic fibrosis could thus alter the pharmacokinetics and tissue distribution of lignan components in rats after administration of FZHY. These results may help guide the clinical application of this medicine. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. A complex network based model for detecting isolated communities in water distribution networks

    Science.gov (United States)

    Sheng, Nan; Jia, Youwei; Xu, Zhao; Ho, Siu-Lau; Wai Kan, Chi

    2013-12-01

    Water distribution network (WDN) is a typical real-world complex network of major infrastructure that plays an important role in human's daily life. In this paper, we explore the formation of isolated communities in WDN based on complex network theory. A graph-algebraic model is proposed to effectively detect the potential communities due to pipeline failures. This model can properly illustrate the connectivity and evolution of WDN during different stages of contingency events, and identify the emerging isolated communities through spectral analysis on Laplacian matrix. A case study on a practical urban WDN in China is conducted, and the consistency between the simulation results and the historical data are reported to showcase the feasibility and effectiveness of the proposed model.
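
    The spectral step of the model can be illustrated on a toy graph (an invented six-node network, not the urban WDN from the case study): each near-zero eigenvalue of the graph Laplacian corresponds to one connected component, so an extra zero eigenvalue after a pipeline failure signals an isolated community.

```python
import numpy as np

def laplacian(n_nodes, edges):
    """Graph Laplacian L = D - A for an undirected, unweighted network."""
    A = np.zeros((n_nodes, n_nodes))
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

def n_components(n_nodes, edges, tol=1e-9):
    """Number of connected components = number of near-zero eigenvalues of L."""
    eig = np.linalg.eigvalsh(laplacian(n_nodes, edges))
    return int(np.sum(eig < tol))

edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 5)]
print(n_components(6, edges))                 # intact network: 1 component
failed = [e for e in edges if e != (2, 3)]    # pipe 2-3 fails
print(n_components(6, failed))                # an isolated community appears: 2
```

    On a real network the same check would be re-run at each stage of a contingency event, which is the evolution the graph-algebraic model tracks.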

  3. Bivariate Rayleigh Distribution and its Properties

    Directory of Open Access Journals (Sweden)

    Ahmad Saeed Akhter

    2007-01-01

    Rayleigh (1880) observed that sea waves follow no law because of the complexities of the sea, but it has been seen that the probability distributions of wave heights, wave lengths, wave-induced pitch, and the wave and heave motions of ships follow the Rayleigh distribution. At present, several different quantities are in use for describing the state of the sea: for example, the mean height of the waves, the root mean square height, the height of the "significant waves" (the mean height of the highest one-third of all the waves), the maximum height over a given interval of time, and so on. At present, the shipbuilding industry knows less than any other construction industry about the service conditions under which it must operate. Only small efforts have been made to establish the stresses and motions and to incorporate the results of such studies into design. This is due to the complexity of the problem caused by the extensive variability of the sea and the corresponding response of ships; although the problem appears difficult, it is possible to predict service conditions for ships in an orderly and relatively simple manner. Rayleigh (1880) derived the distribution from the amplitude of sound resulting from many independent sources. This distribution is also connected with one or two dimensions and is sometimes referred to as the "random walk" frequency distribution. The Rayleigh distribution can be derived from the bivariate normal distribution when the variates are independent and random with equal variances. We construct a bivariate Rayleigh distribution with marginal Rayleigh distribution functions and discuss its fundamental properties.
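
    The derivation mentioned in the closing sentences can be checked numerically. This sketch (parameters arbitrary) draws two independent zero-mean normals with equal variance sigma^2 and verifies that the radial amplitude sqrt(X^2 + Y^2) follows a Rayleigh(sigma) distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(0, sigma, 100_000)
y = rng.normal(0, sigma, 100_000)
r = np.hypot(x, y)          # amplitude of the bivariate normal sample

# Kolmogorov-Smirnov comparison against Rayleigh with loc=0, scale=sigma.
d, p = stats.kstest(r, "rayleigh", args=(0, sigma))
print(f"KS D = {d:.4f}, p = {p:.3f}")
print("sample mean:", r.mean(), "theory:", sigma * np.sqrt(np.pi / 2))
```

    The sample mean should agree with the theoretical Rayleigh mean sigma * sqrt(pi/2), which is the one-line check of the construction.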

  4. Synchronization of Markovian jumping stochastic complex networks with distributed time delays and probabilistic interval discrete time-varying delays

    International Nuclear Information System (INIS)

    Li Hongjie; Yue Dong

    2010-01-01

    The paper investigates the synchronization stability problem for a class of complex dynamical networks with Markovian jumping parameters and mixed time delays. The complex networks consist of m modes and the networks switch from one mode to another according to a Markovian chain with known transition probability. The mixed time delays are composed of discrete and distributed delays, the discrete time delay is assumed to be random and its probability distribution is known a priori. In terms of the probability distribution of the delays, the new type of system model with probability-distribution-dependent parameter matrices is proposed. Based on the stochastic analysis techniques and the properties of the Kronecker product, delay-dependent synchronization stability criteria in the mean square are derived in the form of linear matrix inequalities which can be readily solved by using the LMI toolbox in MATLAB, the solvability of derived conditions depends on not only the size of the delay, but also the probability of the delay-taking values in some intervals. Finally, a numerical example is given to illustrate the feasibility and effectiveness of the proposed method.

  5. Lyophilized kits of diamino dithiol compounds for labelling with 99m-technetium. Pharmacokinetics studies and distribution compartmental models of the related complexes

    International Nuclear Information System (INIS)

    Araujo, Elaine Bortoleti de

    1995-01-01

    The present work reflects the clinical interest in labelling diamino dithiol compounds with technetium-99m. Both chosen compounds, L,L-ethylene dicysteine (L,L-EC) and L,L-ethylene dicysteine diethyl ester (L,L-ECD), were obtained in relatively good yield and characterized by IR and NMR. The study of labelling conditions with technetium-99m showed the influence of the type and mass of the reducing agent, as well as the pH, on the formation of complexes with the desired biological characteristics. Radiochemical purity was determined by thin layer chromatography (TLC) and high performance liquid chromatography (HPLC). Lyophilised kits of L,L-EC and L,L-ECD for labelling with 99mTc were obtained, with stability exceeding 120 days when stored under refrigeration, enabling the marketing of the kits. The ideal formulation of the kits, as well as the use of liquid nitrogen in the freezing process, determined the success of lyophilization. Biological distribution studies of the 99mTc complexes were performed on mice by an invasive method and on larger animals by scintigraphic evaluation. Biological distribution studies of the complex 99mTc-L,L-EC showed fast blood clearance, with elimination of about 90% of the administered dose after 60 minutes, almost exclusively through the urinary system. The biological distribution results were fitted to a three-compartment distribution model, as expected for a radiopharmaceutical designed for dynamic renal studies with tubular elimination. The complex's interaction with renal tubular receptors is related to structural characteristics of the compound, more specifically the presence and location of polar groups. In comparison with 99mTc-L,L-EC, biological studies of the complex 99mTc-L,L-ECD showed different distribution aspects, despite some structural similarities. The presence of ethyl groups confers neutrality and lipophilicity on the complex. It crosses the intact blood-brain barrier and is retained in the brain for a sufficient period

  6. Spatial Distribution of Iron Within the Normal Human Liver Using Dual-Source Dual-Energy CT Imaging.

    Science.gov (United States)

    Abadia, Andres F; Grant, Katharine L; Carey, Kathleen E; Bolch, Wesley E; Morin, Richard L

    2017-11-01

    Explore the potential of dual-source dual-energy (DSDE) computed tomography (CT) to retrospectively analyze the uniformity of iron distribution and establish iron concentration ranges and distribution patterns found in healthy livers. Ten mixtures consisting of an iron nitrate solution and deionized water were prepared in test tubes and scanned using a DSDE 128-slice CT system. Iron images were derived from a 3-material decomposition algorithm (optimized for the quantification of iron). A conversion factor (mg Fe/mL per Hounsfield unit) was calculated from this phantom study as the quotient of known tube concentrations and their corresponding CT values. Retrospective analysis was performed of patients who had undergone DSDE imaging for renal stones. Thirty-seven patients with normal liver function were randomly selected (mean age, 52.5 years). The examinations were processed for iron concentration. Multiple regions of interest were analyzed, and iron concentration (mg Fe/mL) and distribution was reported. The mean conversion factor obtained from the phantom study was 0.15 mg Fe/mL per Hounsfield unit. Whole-liver mean iron concentrations yielded a range of 0.0 to 2.91 mg Fe/mL, with 94.6% (35/37) of the patients exhibiting mean concentrations below 1.0 mg Fe/mL. The most important finding was that iron concentration was not uniform and patients exhibited regionally high concentrations (36/37). These regions of higher concentration were observed to be dominant in the middle-to-upper part of the liver (75%), medially (72.2%), and anteriorly (83.3%). Dual-source dual-energy CT can be used to assess the uniformity of iron distribution in healthy subjects. Applying similar techniques to unhealthy livers, future research may focus on the impact of hepatic iron content and distribution for noninvasive assessment in diseased subjects.
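
    The calibration arithmetic above can be sketched as follows. The phantom readings below are invented for illustration; only the reported factor of about 0.15 mg Fe/mL per Hounsfield unit is taken from the study.

```python
import numpy as np

# Hypothetical phantom calibration: known iron concentrations vs. measured
# CT numbers on the iron image (values assumed, not the study's data).
known_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])     # mg Fe/mL
measured_hu = np.array([0.0, 3.3, 6.7, 13.3, 26.7])  # Hounsfield units

# Least-squares slope through the origin: the quotient of known
# concentrations and their corresponding CT values.
factor = known_conc @ measured_hu / (measured_hu @ measured_hu)
print(f"conversion factor: {factor:.3f} mg Fe/mL per HU")

# Applying the factor to a hypothetical liver region-of-interest reading.
roi_hu = 4.2
print(f"estimated iron concentration: {factor * roi_hu:.2f} mg Fe/mL")
```

    With per-region estimates like this, the non-uniformity reported in the study is simply the spread of concentrations across regions of interest in one liver.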

  7. Notes on power of normality tests of error terms in regression models

    International Nuclear Information System (INIS)

    Střelec, Luboš

    2015-01-01

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to detect non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed to allow exact inferences; normally distributed stochastic errors are necessary for inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. We introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models

  8. Notes on power of normality tests of error terms in regression models

    Energy Technology Data Exchange (ETDEWEB)

    Střelec, Luboš [Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemědělská 1, Brno, 61300 (Czech Republic)

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to detect non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed to allow exact inferences; normally distributed stochastic errors are necessary for inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. We introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
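
    A minimal sketch of the setting discussed above (not the RT class itself, which the paper defines): fit a linear regression on simulated data and apply a classical normality test, here Shapiro-Wilk, to the residuals.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 200)
eps = rng.normal(0, 1, 200)       # normal errors: the test should not reject
y = 1.5 + 0.8 * x + eps

# Ordinary least-squares fit and residuals.
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

# Shapiro-Wilk test of normality on the residuals.
w_stat, p_value = stats.shapiro(residuals)
print(f"Shapiro-Wilk W = {w_stat:.4f}, p = {p_value:.3f}")
```

    A small p-value would flag non-normal errors and cast doubt on the exactness of the subsequent t- and F-tests, which is precisely the failure mode the abstract warns about.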

  9. The stochastic distribution of available coefficient of friction for human locomotion of five different floor surfaces.

    Science.gov (United States)

    Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi

    2014-05-01

    The maximum coefficient of friction that can be supported at the shoe and floor interface without a slip is usually called the available coefficient of friction (ACOF) for human locomotion. The probability of a slip could be estimated using a statistical model by comparing the ACOF with the required coefficient of friction (RCOF), assuming that both coefficients have stochastic distributions. An investigation of the stochastic distributions of the ACOF of five different floor surfaces under dry, water and glycerol conditions is presented in this paper. One hundred friction measurements were performed on each floor surface under each surface condition. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF distributions had a slightly better match with the normal and log-normal distributions than with the Weibull in only three out of 15 cases with a statistical significance. The results are far more complex than what had heretofore been published and different scenarios could emerge. Since the ACOF is compared with the RCOF for the estimate of slip probability, the distribution of the ACOF in seven cases could be considered a constant for this purpose when the ACOF is much lower or higher than the RCOF. A few cases could be represented by a normal distribution for practical reasons based on their skewness and kurtosis values without a statistical significance. No representation could be found in three cases out of 15. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
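
    The goodness-of-fit comparison can be sketched as follows, with an invented sample standing in for the 100 friction measurements; note that Kolmogorov-Smirnov p-values computed with parameters fitted from the same data are optimistic, a caveat that applies to any such comparison.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical ACOF sample for one floor/surface condition (values invented).
acof = rng.lognormal(mean=np.log(0.5), sigma=0.15, size=100)

# Fit each candidate distribution, then compare KS statistics.
fits = {
    "normal":      ("norm",        stats.norm.fit(acof)),
    "log-normal":  ("lognorm",     stats.lognorm.fit(acof, floc=0)),
    "weibull":     ("weibull_min", stats.weibull_min.fit(acof, floc=0)),
}
for name, (dist, params) in fits.items():
    d, p = stats.kstest(acof, dist, args=params)
    print(f"{name:10s}  D = {d:.3f}  p = {p:.2f}")
```

    The smallest D statistic indicates the best-fitting candidate; repeating this over the 15 floor/condition combinations reproduces the structure of the comparison in the study.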

  10. Interaction between a normal shock wave and a turbulent boundary layer at high transonic speeds. I - Pressure distribution

    Science.gov (United States)

    Messiter, A. F.

    1980-01-01

    Asymptotic solutions are derived for the pressure distribution in the interaction of a weak normal shock wave with a turbulent boundary layer. The undisturbed boundary layer is characterized by the law of the wall and the law of the wake for compressible flow. In the limiting case considered, for 'high' transonic speeds, the sonic line is very close to the wall. Comparisons with experiment are shown, with corrections included for the effect of longitudinal wall curvature and for the boundary-layer displacement effect in a circular pipe.

  11. Distribution of phosphorylated metabolites and magnesium in the red cells of a patient with hyperactive pyruvate kinase

    International Nuclear Information System (INIS)

    Ouwerkerk, R.; van Echteld, C.J.; Staal, G.E.; Rijksen, G.

    1988-01-01

    The intracellular distribution of adenosine 5'-triphosphate (ATP) and 2,3-diphosphoglycerate (2,3-DPG) was studied in the red cells of a patient with a high-ATP syndrome by using 31P nuclear magnetic resonance. In this patient, red cell ATP was increased 2.5-fold, whereas 2,3-DPG was decreased fourfold due to the presence of a hyperactive pyruvate kinase. In oxygenated red cells, these abnormal concentrations were reflected to the same extent in all complexes in which ATP and 2,3-DPG take part. The diminished amount of 2,3-DPG bound to hemoglobin was almost completely replaced by ATP-hemoglobin complexes. Therefore, free hemoglobin was only slightly increased. In deoxygenated cells, the relative distribution of ATP and 2,3-DPG complexes was significantly disturbed. The main difference was a shift in the ratio of magnesium ATP (MgATP) over the ATP-hemoglobin complex; 74% of total ATP was complexed to hemoglobin (45% in normal cells), whereas the concentration of MgATP was only slightly increased with respect to normal. The shortage in 2,3-DPG bound to hemoglobin could partially be replenished by an increase in hemoglobin (Mg) ATP complexes. Therefore, the amount of uncomplexed hemoglobin rose from 15% in normal cells to 38% in the patient's cells. As a result, the oxygen-dissociation curve was only moderately shifted to the left. It is concluded that the regulatory role of 2,3-DPG in oxygen transport is taken over in part by (Mg) ATP in this patient. In both aerobic and anaerobic cells, the increase in magnesium bound to ATP, either free or bound to hemoglobin, exceeds the decrease in the 2,3-DPG Mg complex. In spite of this, the amount of intracellular free Mg++ was normal or slightly lowered. This suggests the presence of a compensatory mechanism by which the amount of total cellular magnesium could be increased.

  12. Tritium distribution ratios between the 30 % tributyl phosphate(TBP)-normal dodecane(nDD) organic phase and uranyl nitrate-nitric acid aqueous phase

    International Nuclear Information System (INIS)

    Fujine, Sachio; Uchiyama, Gunzou; Sugikawa, Susumu; Maeda, Mitsuru; Tsujino, Takeshi.

    1989-10-01

    Tritium distribution ratios between the organic and aqueous phases were measured for the system 30% tributyl phosphate (TBP)-normal dodecane (nDD)/uranyl nitrate-nitric acid water. It was confirmed that tritium is extracted by TBP into the organic phase in both chemical forms, tritiated water (HTO) and tritiated nitric acid (TNO3). The tritium distribution ratio ranged from 0.002 to 0.005 for conditions of 0-6 mol/L nitric acid, 0.5-800 mCi/L tritium in the aqueous phase, and 0-125 g-U/L uranium in the organic phase. The isotopic distribution coefficient of tritium between the organic and aqueous phases was observed to be about 0.95. (author)

  13. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1976-01-01

    Application of the normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to the distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal fit the least number of data groups considered in this study
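
    The W-test part of the comparison can be illustrated with simulated surveillance data (values invented): lognormally distributed measurements fail the test of normality on the raw scale but pass after a log transform, which is how a lognormal fit is established with this test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
conc = rng.lognormal(mean=0.0, sigma=1.0, size=80)   # simulated measurements

# Shapiro-Wilk W test: raw values test the normal model,
# log-transformed values test the lognormal model.
w_raw, p_raw = stats.shapiro(conc)
w_log, p_log = stats.shapiro(np.log(conc))
print(f"raw:  W = {w_raw:.3f}, p = {p_raw:.4f}")
print(f"log:  W = {w_log:.3f}, p = {p_log:.4f}")
```

    The visual counterpart is the probability plot: `scipy.stats.probplot` produces the same straight-line check that the special graph paper mentioned above provides by hand.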

  14. Priority of a Hesitant Fuzzy Linguistic Preference Relation with a Normal Distribution in Meteorological Disaster Risk Assessment.

    Science.gov (United States)

    Wang, Lihong; Gong, Zaiwu

    2017-10-10

    As meteorological disaster systems are large complex systems, disaster reduction programs must be based on risk analysis. Consequently, judgment by an expert based on his or her experience (also known as qualitative evaluation) is an important link in meteorological disaster risk assessment. In some complex and non-procedural meteorological disaster risk assessments, a hesitant fuzzy linguistic preference relation (HFLPR) is often used to deal with a situation in which experts may be hesitant while providing preference information of a pairwise comparison of alternatives, that is, the degree of preference of one alternative over another. This study explores hesitation from the perspective of statistical distributions, and obtains an optimal ranking of an HFLPR based on chance-restricted programming, which provides a new approach for hesitant fuzzy optimisation of decision-making in meteorological disaster risk assessments.

  15. Heterogeneous distribution of a diffusional tracer in the aortic wall of normal and atherosclerotic rabbits

    International Nuclear Information System (INIS)

    Tsutsui, H.; Tomoike, H.; Nakamura, M.

    1990-01-01

    Tracer distribution as an index of nutritional support across the thoracic and abdominal aortas in rabbits in the presence or absence of atherosclerotic lesions was evaluated using [14C]antipyrine, a metabolically inert, diffusible indicator. Intimal plaques were produced by endothelial balloon denudation of the thoracic aorta and a 1% cholesterol diet. After a steady intravenous infusion of 200 microCi of [14C]antipyrine for 60 seconds, the thoracic and abdominal aortas and the heart were excised, and autoradiograms of 20-μm-thick sections were quantified using microcomputer-aided densitometry. Regional radioactivity and regional diffusional support, as an index of nutritional flow estimated from the timed collections of arterial blood, were 367 and 421 nCi·g⁻¹ (82 and 106 ml·min⁻¹·100 g⁻¹) in the thoracic aortic media of the normal and atherosclerotic rabbits, respectively. Radioactivity at the thickened intima was 179 nCi·g⁻¹ (p less than 0.01 versus media). The gruel was noted at a deeper site within the thickened intima, and diffusional support here was 110 nCi·g⁻¹ (p less than 0.01 versus the average radioactivity at the thickened intima). After ligating the intercostal arteries, regional tracer distribution in the media beneath the fibrofatty lesion, but not the plaque-free intima, was reduced to 46%. Thus, in the presence of advanced intimal thickening, the heterogeneous distribution of diffusional flow is prominent across the vessel wall, and abluminal routes are crucial to meet the increased demands of nutritional requirements.

  16. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions, 2

    Science.gov (United States)

    Peters, B. C., Jr.; Walker, H. F.

    1976-01-01

    The problem of numerically obtaining maximum-likelihood estimates of the parameters of a mixture of normal distributions is addressed. In recent literature, a certain successive-approximations procedure based on the likelihood equations was shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, a general iterative procedure is introduced, of the generalized steepest-ascent (deflected-gradient) type, which reduces to the procedure known in the literature when the step-size is taken to be 1. It is shown that, with probability 1 as the sample size grows large, this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. The step-size which yields optimal local convergence rates for large samples is determined, in a sense, by the separation of the component normal densities and is bounded below by a number between 1 and 2.
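
The successive-approximations update described above can be sketched for a two-component univariate normal mixture: with step-size ω = 1 it reduces to the familiar EM iteration, and 0 < ω < 2 is the locally convergent range discussed in the abstract. A minimal numpy sketch; the data, initial values, and component count are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def em_step(x, w, mu, sigma):
    """One fixed-point (EM) update for a two-component 1-D normal mixture."""
    pdf = lambda m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    r1 = w * pdf(mu[0], sigma[0])
    r2 = (1 - w) * pdf(mu[1], sigma[1])
    g = r1 / (r1 + r2)                      # responsibilities (E-step)
    w_new = g.mean()                        # closed-form M-step updates
    mu_new = np.array([np.sum(g * x) / np.sum(g),
                       np.sum((1 - g) * x) / np.sum(1 - g)])
    var1 = np.sum(g * (x - mu_new[0]) ** 2) / np.sum(g)
    var2 = np.sum((1 - g) * (x - mu_new[1]) ** 2) / np.sum(1 - g)
    return w_new, mu_new, np.sqrt([var1, var2])

def fit(x, omega=1.0, iters=200):
    w, mu, sigma = 0.5, np.array([x.min(), x.max()]), np.array([x.std(), x.std()])
    for _ in range(iters):
        w_t, mu_t, sigma_t = em_step(x, w, mu, sigma)
        # step-size omega: omega = 1 is plain EM; 0 < omega < 2 converges locally
        w = w + omega * (w_t - w)
        mu = mu + omega * (mu_t - mu)
        sigma = sigma + omega * (sigma_t - sigma)
    return w, mu, sigma

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])
w, mu, sigma = fit(x)
```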

  17. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU-based stochastic volatility models for equities. In the special case of the tempered stable OU process...

  18. Measurement of activity-weighted size distributions of radon decay products in a normally occupied home

    International Nuclear Information System (INIS)

    Hopke, P.K.; Wasiolek, P.; Montassier, N.; Cavallo, A.; Gadsby, K.; Socolow, R.

    1992-01-01

    In order to assess the exposure of individuals to indoor radioactivity arising from the decay of radon, an automated, semicontinuous graded screen array system was developed to permit the measurement of the activity-weighted size distributions of the radon progeny in homes. The system has been modified so that the electronics and sampling heads can be separated from the pump by approximately 15 m. The system was placed in the living room of a one-storey house with basement in Princeton, NJ and operated for 2 weeks while the house was occupied by the home owners in their normal manner. One of the house occupants was a cigarette smoker. Radon and potential alpha energy concentration (PAEC) measurements were also made, but condensation nuclei counts were not performed. PAEC values ranged from 23.4 to 461.6 mWL. In the measured activity size distributions, the amount of activity in the 0.5-1.5 nm size range can be considered to be the unattached fraction. The mean value for the ²¹⁸Po unattached fraction is 0.217, with a range of 0.054-0.549. The median value for the unattached fraction of PAEC is 0.077, with a range of 0.022-0.178. (author)

  19. Enhancing the Temporal Complexity of Distributed Brain Networks with Patterned Cerebellar Stimulation

    Science.gov (United States)

    Farzan, Faranak; Pascual-Leone, Alvaro; Schmahmann, Jeremy D.; Halko, Mark

    2016-01-01

    Growing evidence suggests that sensory, motor, cognitive and affective processes map onto specific, distributed neural networks. Cerebellar subregions are part of these networks, but how the cerebellum is involved in this wide range of brain functions remains poorly understood. It is postulated that the cerebellum contributes a basic role in brain functions, helping to shape the complexity of brain temporal dynamics. We therefore hypothesized that stimulating cerebellar nodes integrated in different networks should have the same impact on the temporal complexity of cortical signals. In healthy humans, we applied intermittent theta burst stimulation (iTBS) to the vermis lobule VII or right lateral cerebellar Crus I/II, subregions that prominently couple to the dorsal-attention/fronto-parietal and default-mode networks, respectively. Cerebellar iTBS increased the complexity of brain signals across multiple time scales in a network-specific manner identified through electroencephalography (EEG). We also demonstrated a region-specific shift in power of cortical oscillations towards higher frequencies consistent with the natural frequencies of targeted cortical areas. Our findings provide a novel mechanism and evidence by which the cerebellum contributes to multiple brain functions: specific cerebellar subregions control the temporal dynamics of the networks they are engaged in. PMID:27009405

  20. Finite-size effects in transcript sequencing count distribution: its power-law correction necessarily precedes downstream normalization and comparative analysis.

    Science.gov (United States)

    Wong, Wing-Cheong; Ng, Hong-Kiat; Tantoso, Erwin; Soong, Richie; Eisenhaber, Frank

    2018-02-12

    signal-to-noise ratio by 50% and the statistical/detection sensitivity by as much as 30%, regardless of the downstream mapping and normalization methods. Most importantly, the power-law correction improves concordance in significant calls among different normalization methods of a data series by an average of 22%. When presented with a higher sequencing depth (4-fold difference), the improvement in concordance is asymmetrical (32% for the higher sequencing depth instance versus 13% for the lower instance) and demonstrates that the simple power-law correction can increase significant detection at higher sequencing depths. Finally, the correction dramatically enhances the statistical conclusions and elucidates the metastasis potential of the NUGC3 cell line against AGS in our dilution analysis. The finite-size effect due to undersampling generally plagues transcript count data with reproducibility issues but can be minimized through a simple power-law correction of the count distribution. This distribution correction has direct implications for the biological interpretation of the study and the rigor of the scientific findings. This article was reviewed by Oliviero Carugo, Thomas Dandekar and Sandor Pongor.

  1. nth roots of normal contractions

    International Nuclear Information System (INIS)

    Duggal, B.P.

    1992-07-01

    Given a complex separable Hilbert space H and a contraction A on H such that Aⁿ, for some integer n ≥ 2, is normal, it is shown that if the defect operator D_A = (1 − A*A)^{1/2} is of the Hilbert-Schmidt class, then A is similar to a normal contraction, either A or A² is normal, and if A² is normal (but A is not), then there is a normal contraction N and a positive definite contraction P of trace class such that ‖A − N‖₁ = ½‖P + P‖₁ (where ‖·‖₁ denotes the trace norm). If T is a compact contraction whose characteristic function admits a scalar factor, if T = Aⁿ for some integer n ≥ 2 and a contraction A with simple eigenvalues, and if both T and A satisfy a ''reductive property'', then A is a compact normal contraction. (author). 16 refs

  2. The role of bed-parallel slip in the development of complex normal fault zones

    Science.gov (United States)

    Delogkos, Efstratios; Childs, Conrad; Manzocchi, Tom; Walsh, John J.; Pavlides, Spyros

    2017-04-01

    Normal faults exposed in Kardia lignite mine, Ptolemais Basin, NW Greece formed at the same time as bed-parallel slip-surfaces, so that while the normal faults grew they were intermittently offset by bed-parallel slip. Following offset by a bed-parallel slip-surface, further fault growth is accommodated by reactivation on one or both of the offset fault segments. Where one fault is reactivated the site of bed-parallel slip is a bypassed asperity. Where both faults are reactivated, they propagate past each other to form a volume between overlapping fault segments that displays many of the characteristics of relay zones, including elevated strains and transfer of displacement between segments. Unlike conventional relay zones, however, these structures contain either a repeated or a missing section of stratigraphy which has a thickness equal to the throw of the fault at the time of the bed-parallel slip event, and the displacement profiles along the relay-bounding fault segments have discrete steps at their intersections with bed-parallel slip-surfaces. With further increase in displacement, the overlapping fault segments connect to form a fault-bound lens. Conventional relay zones form during initial fault propagation, but with coeval bed-parallel slip, relay-like structures can form later in the growth of a fault. Geometrical restoration of cross-sections through selected faults shows that repeated bed-parallel slip events during fault growth can lead to complex internal fault zone structure that masks its origin. Bed-parallel slip, in this case, is attributed to flexural-slip arising from hanging-wall rollover associated with a basin-bounding fault outside the study area.

  3. Normal tissue dose-effect models in biological dose optimisation

    International Nuclear Information System (INIS)

    Alber, M.

    2008-01-01

    Sophisticated radiotherapy techniques like intensity modulated radiotherapy with photons and protons rely on numerical dose optimisation. The evaluation of normal tissue dose distributions that deviate significantly from the common clinical routine and also the mathematical expression of desirable properties of a dose distribution is difficult. In essence, a dose evaluation model for normal tissues has to express the tissue specific volume effect. A formalism of local dose effect measures is presented, which can be applied to serial and parallel responding tissues as well as target volumes and physical dose penalties. These models allow a transparent description of the volume effect and an efficient control over the optimum dose distribution. They can be linked to normal tissue complication probability models and the equivalent uniform dose concept. In clinical applications, they provide a means to standardize normal tissue doses in the face of inevitable anatomical differences between patients and a vastly increased freedom to shape the dose, without being overly limiting like sets of dose-volume constraints. (orig.)
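
The "equivalent uniform dose" concept mentioned in this abstract has a widely used generalized form, gEUD(a) = (Σᵢ vᵢ dᵢᵃ)^{1/a}, where the exponent a encodes the tissue volume effect: large a is dominated by hot spots (serial-like tissues), while a = 1 reduces to the mean dose (parallel-like tissues). A minimal sketch of that standard formula, not the author's specific local-dose-effect formalism; the voxel doses are illustrative:

```python
import numpy as np

def gEUD(dose, a):
    """Generalized equivalent uniform dose for equal-volume voxels.

    Large `a` weights hot spots heavily (serial-like tissues);
    a = 1 reduces to the mean dose (parallel-like tissues).
    """
    dose = np.asarray(dose, dtype=float)
    return np.mean(dose ** a) ** (1.0 / a)

dvh = np.array([10.0, 20.0, 30.0, 60.0])   # illustrative voxel doses in Gy
serial_like = gEUD(dvh, 12)                # pulled toward the 60 Gy hot spot
parallel_like = gEUD(dvh, 1)               # the mean dose, 30 Gy
```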

  4. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering in Outer RB

    Science.gov (United States)

    Khazanov, G. V.; Gamayunov, K. V.

    2007-01-01

    We present the equatorial and bounce-averaged pitch angle diffusion coefficients for scattering of relativistic electrons by the H⁺ mode of EMIC waves. Both model (prescribed) and self-consistent distributions over the wave normal angle are considered. The main results of our calculation can be summarized as follows: First, in comparison with field-aligned waves, the intermediate and highly oblique waves reduce the pitch angle range subject to diffusion, and strongly suppress the scattering rate for low-energy electrons (E < 2 MeV). Second, for electron energies greater than 5 MeV, the |n| = 1 resonances operate only in a narrow region at large pitch angles, and despite their greatest contribution in the case of field-aligned waves, cannot cause electron diffusion into the loss cone. For those energies, oblique waves at |n| > 1 resonances are more effective, extending the range of pitch angle diffusion down to the loss cone boundary, and increasing diffusion at small pitch angles by orders of magnitude.

  5. Financing options and economic impact: distributed generation using solar photovoltaic systems in Normal, Illinois

    Directory of Open Access Journals (Sweden)

    Jin H. Jo

    2016-04-01

    Due to increasing price volatility in fossil-fuel-produced energy, the demand for clean, renewable, and abundant energy is more prevalent than in past years. Solar photovoltaic (PV) systems have been well documented for their ability to produce electrical energy while at the same time helping to mitigate the negative externalities associated with fossil fuel combustion. Prices for PV systems have decreased over the past few years; however, residential and commercial owners may still opt out of purchasing a system due to the overall price of a PV system installation. Therefore, determining optimal financing options for residential and small-scale purchasers is a necessity. We report on payment methods currently used for distributed community solar projects throughout the US and suggest appropriate options for purchasers in Normal, Illinois given their economic status. We also examine the jobs and total economic impact of a PV system implementation in the case study area.

  6. Experiment Design for Complex VTOL Aircraft with Distributed Propulsion and Tilt Wing

    Science.gov (United States)

    Murphy, Patrick C.; Landman, Drew

    2015-01-01

    Selected experimental results from a wind tunnel study of a subscale VTOL concept with distributed propulsion and tilt lifting surfaces are presented. The vehicle complexity and automated test facility were ideal for use with a randomized designed experiment. Design of Experiments and Response Surface Methods were invoked to produce run-efficient, statistically rigorous regression models with minimized prediction error. Static tests were conducted at the NASA Langley 12-Foot Low-Speed Tunnel to model all six aerodynamic coefficients over a large flight envelope. This work supports investigations at NASA Langley in developing advanced configurations, simulations, and advanced control systems.

  7. Simulation of drift dynamics of arbitrary carrier distributions in complex semiconductor detectors

    CERN Document Server

    De Castro Manzano, Pablo

    2014-01-01

    An extensible open-source C++ software package for the simulation of electron and hole drift in semiconductor detectors of complex geometries has been developed in order to understand transient currents and charge collection efficiencies of arbitrary charge distributions. The simulation is based on the Ramo theorem formalism to obtain induced currents in the electrodes. Efficient open-source C++ numerical libraries are used to obtain the electric and weighting fields using finite-element methods and to simulate the carrier transport. A graphical user interface is also provided. The tool has already proved useful for modelling laser-induced transient currents.
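
Ramo's theorem, on which the simulation is based, gives the current induced on an electrode by a moving charge as i(t) = q v(t) · E_w(x(t)), where E_w is that electrode's weighting field. A minimal sketch for the simplest case, a 1-D parallel-plate detector whose weighting field is 1/d; all parameter values are illustrative assumptions, not taken from the tool described above:

```python
# 1-D parallel-plate detector of thickness d: the weighting field is E_w = 1/d,
# so a charge q drifting at velocity v induces a current i = q*v/d on the electrode.
q = 1.602e-19      # C, electron charge magnitude
d = 300e-6         # m, detector thickness (assumption)
mu = 0.135         # m^2/(V*s), approximate electron mobility in silicon
V = 100.0          # V, bias; uniform field E = V/d (space charge neglected)

v = mu * V / d     # constant drift velocity
t_drift = d / v    # time to cross the full thickness
i = q * v / d      # Ramo current while the charge drifts
Q = i * t_drift    # total induced charge on the electrode
```

Integrating the Ramo current over the full drift recovers exactly one elementary charge (Q = q), the expected sanity check for a charge traversing the whole detector.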

  8. Determination and correlation of spatial distribution of trace elements in normal and neoplastic breast tissues evaluated by μ-XRF

    International Nuclear Information System (INIS)

    Silva, M.P.; Oliveira, M.A.; Poletti, M.E.

    2012-01-01

    Some trace elements, naturally present in breast tissues, participate in a large number of biological processes, which include, among others, activation or inhibition of enzymatic reactions and changes in cell membrane permeability, suggesting that these elements may influence carcinogenic processes. Thus, knowledge of the amounts of these elements and their spatial distribution in normal and neoplastic tissues may help in understanding the role of these elements in the carcinogenic process and tumor progression of breast cancers. Concentrations of trace elements like Ca, Fe, Cu and Zn, previously studied at LNLS using TXRF and conventional XRF, were elevated in neoplastic breast tissues compared to normal tissues. In this study we determined the spatial distribution of these elements in normal and neoplastic breast tissues using the μ-XRF technique. We analyzed 22 samples of normal and neoplastic breast tissues (malignant and benign) obtained from paraffin blocks available for study at the Department of Pathology, HC-FMRP/USP. From each block, a small fraction of material was removed and cut into 60-μm-thick histological sections with a microtome. The slices were placed in sample holders and covered with Ultralene film. Tissue samples were irradiated with a white beam of synchrotron radiation. The samples were positioned at 45° with respect to the incident beam on a table with 3 degrees of freedom (x, y and z), allowing independent positioning of the sample in these directions. The white beam was collimated by a 20 μm microcapillary and the samples were fully scanned. At each step, a spectrum was detected for 10 s. The fluorescence emitted by elements present in the sample was detected by a Si(Li) detector with 165 eV energy resolution at 5.9 keV, placed at 90° with respect to the incident beam. Results reveal that the trace elements Ca-Zn and Fe-Cu could be correlated in malignant breast tissues. Quantitative results, achieved by Spearman

  9. Complex Network Theory Applied to the Growth of Kuala Lumpur's Public Urban Rail Transit Network.

    Science.gov (United States)

    Ding, Rui; Ujang, Norsidah; Hamid, Hussain Bin; Wu, Jianjun

    2015-01-01

    Recently, the number of studies involving complex network applications in transportation has increased steadily as scholars from various fields analyze traffic networks. Nonetheless, research on rail network growth is relatively rare. This research examines the evolution of the Public Urban Rail Transit Networks of Kuala Lumpur (PURTNoKL) based on complex network theory and covers both the topological structure of the rail system and future trends in network growth. In addition, network performance when facing different attack strategies is also assessed. Three topological network characteristics are considered: connections, clustering and centrality. In PURTNoKL, we found that the total number of nodes and edges exhibit a linear relationship and that the average degree stays within the interval [2.0488, 2.6774] with heavy-tailed distributions. The evolutionary process shows that the cumulative probability distribution (CPD) of degree and the average shortest path length show good fit with exponential distribution and normal distribution, respectively. Moreover, PURTNoKL exhibits clear cluster characteristics; most of the nodes have a 2-core value, and the CPDs of the centrality's closeness and betweenness follow a normal distribution function and an exponential distribution, respectively. Finally, we discuss four different types of network growth styles and the line extension process, which reveal that the rail network's growth is likely based on the nodes with the biggest lengths of the shortest path and that network protection should emphasize those nodes with the largest degrees and the highest betweenness values. This research may enhance the networkability of the rail system and better shape the future growth of public rail networks.
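
The topological quantities reported above (average degree, cumulative probability distribution of degree) are straightforward to compute from an edge list. A minimal pure-Python sketch on a hypothetical toy network; the edge list is an assumption for illustration, not PURTNoKL data:

```python
from collections import defaultdict

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]  # toy network (assumption)

deg = defaultdict(int)
for u, v in edges:
    deg[u] += 1
    deg[v] += 1

n, m = len(deg), len(edges)
avg_degree = 2 * m / n     # each edge contributes two degree-endpoints

# cumulative probability distribution of degree: P(K >= k)
ks = sorted(set(deg.values()))
cpd = {k: sum(d >= k for d in deg.values()) / n for k in ks}
```

On this toy graph the average degree is 2.4, and the CPD drops from 1.0 at k = 2 to 0.4 at k = 3; fitting such a CPD against exponential or power-law forms is the kind of analysis the study performs on the real network.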

  10. Group normalization for genomic data.

    Science.gov (United States)

    Ghandi, Mahmoud; Beer, Michael A

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
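
The abstract contrasts Group Normalization with conventional quantile normalization; the latter forces every sample (column) to share the same empirical distribution, namely the mean of the column-wise sorted profiles. A minimal numpy sketch of that conventional baseline (ties broken arbitrarily), not of the GN algorithm itself; the data matrix is illustrative:

```python
import numpy as np

def quantile_normalize(X):
    """Quantile-normalize the columns of X (one sample per column):
    each entry is replaced by the mean sorted value at its within-column rank,
    so all columns end up with an identical empirical distribution."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # rank of each entry in its column
    mean_sorted = np.sort(X, axis=0).mean(axis=1)      # shared reference distribution
    return mean_sorted[ranks]

X = np.array([[5., 4., 3.],
              [2., 1., 4.],
              [3., 4., 6.],
              [4., 2., 8.]])
Xn = quantile_normalize(X)
```

After normalization, sorting any column of `Xn` yields the same reference profile, which is exactly the property GN relaxes by using locally selected reference probes instead of one global distribution.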

  11. Intracellular distribution and stability of a luminescent rhenium(I) tricarbonyl tetrazolato complex using epifluorescence microscopy in conjunction with X-ray fluorescence imaging

    International Nuclear Information System (INIS)

    Wedding, Jason L.; Harris, Hugh H.; Bader, Christie A.; Plush, Sally E.; Mak, Rachel

    2016-01-01

    Optical fluorescence microscopy was used in conjunction with X-ray fluorescence microscopy to monitor the stability and intracellular distribution of the luminescent rhenium(I) complex fac-[Re(CO)₃(phen)L], where phen = 1,10-phenanthroline and L = 5-(4-iodophenyl)tetrazolato, in 22Rv1 cells. The rhenium complex showed no signs of ancillary ligand dissociation, a conclusion based on data obtained via X-ray fluorescence imaging aligning the iodine and rhenium distributions. A diffuse reticular localisation was detected for the complex in the nuclear/perinuclear region of cells by either optical or X-ray fluorescence techniques. Furthermore, X-ray fluorescence also showed that the Re-I complex disrupted the homeostasis of some biologically relevant elements, such as chlorine, potassium and zinc.

  12. Optimal distribution of incentives for public cooperation in heterogeneous interaction environments

    Directory of Open Access Journals (Sweden)

    Xiaojie eChen

    2014-07-01

    In the framework of evolutionary games with institutional reciprocity, limited incentives are at our disposal for rewarding cooperators and punishing defectors. In the simplest case, it can be assumed that, depending on their strategies, all players receive equal incentives from the common pool. The question arises, however: what is the optimal distribution of institutional incentives? How should we best reward and punish individuals for cooperation to thrive? We study this problem for the public goods game on a scale-free network. We show that if the synergetic effects of group interactions are weak, the level of cooperation in the population can be maximized simply by adopting the simplest 'equal distribution' scheme. If synergetic effects are strong, however, it is best to reward high-degree nodes more than low-degree nodes. These distribution schemes for institutional rewards are independent of payoff normalization. For institutional punishment, however, the same optimization problem is more complex, and its solution depends on whether absolute or degree-normalized payoffs are used. We find that with degree-normalized payoffs, high-degree nodes should be punished more leniently than low-degree nodes. Conversely, if absolute payoffs count, then high-degree nodes should be punished more strongly than low-degree nodes.

  13. Global Bi-ventricular endocardial distribution of activation rate during long duration ventricular fibrillation in normal and heart failure canines.

    Science.gov (United States)

    Luo, Qingzhi; Jin, Qi; Zhang, Ning; Han, Yanxin; Wang, Yilong; Huang, Shangwei; Lin, Changjian; Ling, Tianyou; Chen, Kang; Pan, Wenqi; Wu, Liqun

    2017-04-13

    The objective of this study was to detect differences in the distribution of the left and right ventricle (LV & RV) activation rate (AR) during short-duration ventricular fibrillation (SDVF, 1 min) in normal and heart failure (HF) canine hearts. Ventricular fibrillation (VF) was electrically induced in six healthy dogs (control group) and six dogs with right ventricular pacing-induced congestive HF (HF group). Two 64-electrode basket catheters deployed in the LV and RV were used for global endocardium electrical mapping. The AR of VF was estimated by fast Fourier transform analysis from each electrode. In the control group, the LV was activated faster than the RV in the first 20 s, after which there was no detectable difference in the AR between them. When analyzing the distribution of the AR within the bi-ventricles at 3 min of LDVF, the posterior LV was activated fastest, while the anterior was slowest. In the HF group, a detectable AR gradient existed between the two ventricles within 3 min of VF, with the LV activating more quickly than the RV. When analyzing the distribution of the AR within the bi-ventricles at 3 min of LDVF, the septum of the LV was activated fastest, while the anterior was activated slowest. A global bi-ventricular endocardial AR gradient existed within the first 20 s of VF but disappeared in the LDVF in healthy hearts. However, the AR gradient was always observed in both SDVF and LDVF in HF hearts. The findings of this study suggest that LDVF in HF hearts can be maintained differently from normal hearts, which accordingly should lead to the development of different management strategies for LDVF resuscitation.
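
Estimating the activation rate of an electrode signal by fast Fourier transform analysis, as done in this study, amounts to locating the dominant spectral peak. A minimal numpy sketch on a synthetic signal; the sampling rate and the 8 Hz dominant component are illustrative assumptions, not the study's recordings:

```python
import numpy as np

fs = 1000.0                              # sampling rate in Hz (assumption)
t = np.arange(0, 2.0, 1 / fs)
# synthetic "fibrillation-like" electrogram: 8 Hz dominant component plus noise
rng = np.random.default_rng(1)
sig = np.sin(2 * np.pi * 8 * t) + 0.3 * rng.standard_normal(t.size)

spec = np.abs(np.fft.rfft(sig - sig.mean()))   # remove DC, take magnitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spec)]              # activation-rate estimate in Hz
```

With a 2 s window the frequency resolution is 0.5 Hz; longer windows sharpen the estimate at the cost of temporal resolution, the usual trade-off in such AR mapping.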

  14. Energy Consumption in the Process of Excavator-Automobile Complexes Distribution at Kuzbass Open Pit Mines

    Directory of Open Access Journals (Sweden)

    Panachev Ivan

    2017-01-01

    Every year, coal mining companies worldwide seek to keep renewing their fleets of mining machines, and various activities are implemented to maintain the service life of mining equipment already in operation. In this regard, the urgent issue is the efficient distribution of available machines across different geological conditions. The problem of effectively distributing "excavator-automobile" complexes arises when heavy dump trucks are used in mining, because excavation and transportation of the blasted rock mass are the most labor-intensive and costly processes, considering the volume of transported overburden and coal as well as the costs of diesel fuel, electricity, fuels and lubricants, consumables for repair work, downtime, etc. Currently, it is recommended to take the number of loading buckets in the range of 3 to 5, according to which the dump trucks are distributed to faces.

  15. Predicting the required number of training samples. [for remotely sensed image data based on covariance matrix estimate quality criterion of normal distribution

    Science.gov (United States)

    Kalayeh, H. M.; Landgrebe, D. A.

    1983-01-01

    A criterion which measures the quality of the estimate of the covariance matrix of a multivariate normal distribution is developed. Based on this criterion, the necessary number of training samples is predicted. Experimental results which are used as a guide for determining the number of training samples are included. Previously announced in STAR as N82-28109
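
The premise above, that the quality of a covariance matrix estimate improves predictably with the number of training samples, can be illustrated by simulation. A minimal numpy sketch; the dimension, covariance matrix, and sample sizes are arbitrary assumptions, not the paper's criterion:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4
A = rng.standard_normal((p, p))
true_cov = A @ A.T + p * np.eye(p)     # hypothetical positive-definite covariance

def cov_error(n, reps=200):
    """Mean Frobenius-norm error of the sample covariance over `reps` trials."""
    errs = []
    for _ in range(reps):
        x = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
        errs.append(np.linalg.norm(np.cov(x, rowvar=False) - true_cov))
    return float(np.mean(errs))

errors = {n: cov_error(n) for n in (10, 40, 160)}
```

The error shrinks roughly like 1/√n, which is why a quality criterion on the covariance estimate translates into a required number of training samples.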

  16. Precise on-machine extraction of the surface normal vector using an eddy current sensor array

    International Nuclear Information System (INIS)

    Wang, Yongqing; Lian, Meng; Liu, Haibo; Ying, Yangwei; Sheng, Xianjun

    2016-01-01

    To satisfy the requirements of on-machine measurement of the surface normal during complex surface manufacturing, a highly robust normal vector extraction method using an Eddy current (EC) displacement sensor array is developed, the output of which is almost unaffected by surface brightness, machining coolant and environmental noise. A precise normal vector extraction model based on a triangular-distributed EC sensor array is first established. Calibration of the effects of object surface inclination and coupling interference on measurement results, and the relative position of EC sensors, is involved. A novel apparatus employing three EC sensors and a force transducer was designed, which can be easily integrated into the computer numerical control (CNC) machine tool spindle and/or robot terminal execution. Finally, to test the validity and practicability of the proposed method, typical experiments were conducted with specified testing pieces using the developed approach and system, such as an inclined plane and cylindrical and spherical surfaces. (paper)
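
The core geometry of a triangular-distributed displacement-sensor array can be sketched simply: three standoff readings at known (x, y) sensor positions define three surface points, and the plane through them gives the local surface normal. This ignores the inclination and coupling-interference calibration described above; the sensor positions and readings below are hypothetical:

```python
import numpy as np

# sensor (x, y) positions at the vertices of an equilateral triangle, in mm (assumption)
xy = np.array([[0.0, 0.0], [30.0, 0.0], [15.0, 25.98]])
d = np.array([10.0, 12.0, 11.0])          # standoff readings along -z, in mm (assumption)

# surface points hit by each sensor axis
pts = np.column_stack([xy, -d])

# normal of the plane through the three points, via the cross product
n = np.cross(pts[1] - pts[0], pts[2] - pts[0])
n /= np.linalg.norm(n)                    # unit surface normal
if n[2] < 0:
    n = -n                                # orient the normal toward the sensor array
```

Equal readings give the normal (0, 0, 1); the unequal readings here tilt it slightly about the y-axis, exactly the quantity a normal-tracking machining system would servo on.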

  17. Precise on-machine extraction of the surface normal vector using an eddy current sensor array

    Science.gov (United States)

    Wang, Yongqing; Lian, Meng; Liu, Haibo; Ying, Yangwei; Sheng, Xianjun

    2016-11-01

    To satisfy the requirements of on-machine measurement of the surface normal during complex surface manufacturing, a highly robust normal vector extraction method using an Eddy current (EC) displacement sensor array is developed, the output of which is almost unaffected by surface brightness, machining coolant and environmental noise. A precise normal vector extraction model based on a triangular-distributed EC sensor array is first established. Calibration of the effects of object surface inclination and coupling interference on measurement results, and the relative position of EC sensors, is involved. A novel apparatus employing three EC sensors and a force transducer was designed, which can be easily integrated into the computer numerical control (CNC) machine tool spindle and/or robot terminal execution. Finally, to test the validity and practicability of the proposed method, typical experiments were conducted with specified testing pieces using the developed approach and system, such as an inclined plane and cylindrical and spherical surfaces.

  18. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...
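
Sampling from the left- and right-truncated normal distributions described in this volume can be done by simple rejection whenever the truncation interval retains reasonable probability mass. A minimal numpy sketch (parameters are illustrative; rejection becomes inefficient for extreme truncation):

```python
import numpy as np

def truncnorm_rejection(mu, sigma, lo, hi, size, seed=0):
    """Sample N(mu, sigma^2) conditioned on lo <= X <= hi by rejection."""
    rng = np.random.default_rng(seed)
    out = np.empty(0)
    while out.size < size:
        x = rng.normal(mu, sigma, size)
        out = np.concatenate([out, x[(x >= lo) & (x <= hi)]])  # keep accepted draws
    return out[:size]

s = truncnorm_rejection(0.0, 1.0, -1.0, 2.0, 10000)
```

Asymmetric truncation shifts the mean away from mu (here above 0), one of the shape effects, relative to the symmetric normal, that the book's spread-ratio diagnostic is designed to detect.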

  19. A preliminary evaluation of myoelectrical energy distribution of the front neck muscles in pharyngeal phase during normal swallowing.

    Science.gov (United States)

    Mingxing Zhu; Wanzhang Yang; Samuel, Oluwarotimi Williams; Yun Xiang; Jianping Huang; Haiqing Zou; Guanglin Li

    2016-08-01

    The pharyngeal phase is the central stage of swallowing, in which the food bolus passes from the oral cavity to the esophagus. A proper understanding of muscular activity in the pharyngeal phase is useful for assessing swallowing function and the occurrence of dysphagia in humans. In this study, high-density (HD) surface electromyography (sEMG) was used to study muscular activity in the pharyngeal phase during swallowing tasks in three healthy male subjects. The root mean square (RMS) of the HD sEMG data was computed over a series of segmented windows as myoelectrical energy, and the RMS values of each window across all channels (16×5) formed a matrix. During the pharyngeal phase of swallowing, three of these matrices were chosen and normalized to obtain HD energy maps and statistical parameters. The maps across different viscosity levels show the energy distribution of muscular activity on the left and right sides of the front neck muscles. In addition, the normalized average RMS (NARE) across different viscosity levels revealed a significant left-right correlation (r = 0.868 ± 0.629), with a stronger correlation when swallowing water. This pilot study suggests that HD sEMG could be a potential tool to evaluate muscular activity in the pharyngeal phase during normal swallowing, and it might provide useful information for dysphagia diagnosis.
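
The windowed-RMS "myoelectrical energy" computation described above is a standard sEMG feature. A minimal numpy sketch for one channel; the signal, window length, and step are illustrative assumptions, not the study's recording parameters:

```python
import numpy as np

def windowed_rms(emg, win, step):
    """RMS 'myoelectrical energy' of a 1-D signal over sliding windows."""
    starts = range(0, emg.size - win + 1, step)
    return np.array([np.sqrt(np.mean(emg[s:s + win] ** 2)) for s in starts])

rng = np.random.default_rng(0)
emg = rng.standard_normal(2000)            # stand-in for one sEMG channel
rms = windowed_rms(emg, win=200, step=100)
```

Stacking such RMS vectors across all 16×5 channels for a given window yields the energy matrix that, after normalization, becomes one HD energy map.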

  20. Aerosol lung inhalation scintigraphy in normal subjects

    Energy Technology Data Exchange (ETDEWEB)

    Sui, Osamu; Shimazu, Hideki

    1985-03-01

    We previously reported a basic and clinical evaluation of aerosol lung inhalation scintigraphy with 99mTc-millimicrosphere albumin (milli MISA) and concluded that aerosol inhalation scintigraphy with 99mTc-milli MISA is useful for routine examination. However, central airway deposition of aerosol particles was found not only in patients with chronic obstructive pulmonary disease (COPD) but also in normal subjects. We therefore performed aerosol inhalation scintigraphy in normal subjects and evaluated their scintigrams. The subjects had normal values of FEV1.0% (more than 70%) in lung function tests, no abnormal findings on chest X-ray films, and no symptoms or signs. Their aerosol inhalation scintigrams were classified into 3 patterns; type I: homogeneous distribution without central airway deposition; type II: homogeneous distribution with central airway deposition; type III: inhomogeneous distribution. These patterns were compared with lung function tests. There was no significant difference between type I and type II in lung function tests. Type III differed from types I and II in its inhomogeneous distribution. This finding showed no correlation with %VC, FEV1.0%, MMF, V̇50 or V̇50/V̇25, but correlated well with V̇25 in the maximum forced expiratory flow-volume curve. Because the flow-volume curve is one of the sensitive methods for early detection of COPD, the inhomogeneous distribution of type III is considered to be due to small airway dysfunction.

  1. Stellar Distributions and NIR Colours of Normal Galaxies

    NARCIS (Netherlands)

    Peletier, R. F.; Grijs, R. de

    1997-01-01

    Abstract: We discuss some results of a morphological study of edge-on galaxies, based on optical and especially near-infrared surface photometry. We find that the vertical surface brightness distributions of galaxies are fitted very well by exponential profiles, much better than by isothermal

  2. The Semiparametric Normal Variance-Mean Mixture Model

    DEFF Research Database (Denmark)

    Korsholm, Lars

    1997-01-01

    We discuss the normal variance-mean mixture model from a semi-parametric point of view, i.e. we let the mixing distribution belong to a nonparametric family. The main results are consistency of the nonparametric maximum likelihood estimator in this case, and construction of an asymptotically normal and efficient estimator.

  3. Random-Number Generator Validity in Simulation Studies: An Investigation of Normality.

    Science.gov (United States)

    Bang, Jung W.; Schumacker, Randall E.; Schlieve, Paul L.

    1998-01-01

    The normality of the number distributions generated by various random-number generators was studied, focusing on when each random-number generator reached a normal distribution and at what sample size. Findings suggest the steps that should be followed when using a random-number generator in a Monte Carlo simulation. (SLD)
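    A minimal version of such a normality check: draw samples of increasing size from a candidate generator and watch the sample skewness and excess kurtosis (both 0 for a true normal) shrink toward zero. The sample sizes and seed are arbitrary choices, not those of the study:

```python
import random, statistics

def skew_kurtosis(xs):
    """Sample skewness and excess kurtosis; both are 0 for a true normal."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    z = [(x - m) / s for x in xs]
    n = len(z)
    return (sum(v ** 3 for v in z) / n,
            sum(v ** 4 for v in z) / n - 3.0)

rng = random.Random(42)
results = {}
for n in (100, 1000, 10000):
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    results[n] = skew_kurtosis(xs)
    print(n, results[n])
```

For a sharper verdict one would use a formal test (e.g. Shapiro-Wilk) rather than raw moments, but the shrinking-moment pattern already shows the sample-size dependence the abstract refers to.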

  4. Group normalization for genomic data.

    Directory of Open Access Journals (Sweden)

    Mahmoud Ghandi

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
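    Quantile normalization, the conventional baseline the paper compares against, can be sketched in a few lines (toy two-sample data; this is not the authors' GN algorithm):

```python
def quantile_normalize(columns):
    """Quantile normalization: force every column (sample) to share the same
    empirical distribution, namely the mean of the sorted columns."""
    n = len(columns[0])
    sorted_cols = [sorted(c) for c in columns]
    # Reference distribution: mean of the i-th smallest value across columns.
    ref = [sum(col[i] for col in sorted_cols) / len(columns) for i in range(n)]
    out = []
    for c in columns:
        order = sorted(range(n), key=lambda j: c[j])  # ranks of this column
        col = [0.0] * n
        for rank, j in enumerate(order):
            col[j] = ref[rank]  # replace each value by the reference quantile
        out.append(col)
    return out

a = [2.0, 4.0, 6.0]
b = [1.0, 3.0, 5.0]
na, nb = quantile_normalize([a, b])
print(na, nb)  # both columns now share the distribution [1.5, 3.5, 5.5]
```

Note the key limitation the abstract points out: this forces identical signal distributions across samples, which GN is designed to avoid.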

  5. Experimental microangiographic study in normal rabbit liver

    International Nuclear Information System (INIS)

    Kim, Yoon Gyoo; Park, Jong Yeon; Han, Kook Sang; Moon, Ki Ho; Choi, Chang Ho; Han, Koon Taek; Lee, Suck Hong; Kim, Byung Soo

    1994-01-01

    Microangiography is an experimental radiologic technique for evaluating the morphology and function of small vessels. The purpose of this study is to introduce a good microangiographic technique and to present the microangiographic appearance of the normal hepatic vascular pattern. Five white rabbits weighing 2.5-2.9 kg were used. Polyethylene catheters were inserted into the portal vein and then into the IVC. Heparinized normal saline (2 cc/1000 cc) was infused through the portal vein and blood was drained to the IVC. Barium suspension was infused via the catheter placed in the portal vein until the liver surface showed satisfactory barium filling. The liver was removed and the preparation was fixed in 10% formalin for 7 days. After fixation, the liver was sectioned at 1-2 mm thickness. The slices were radiographed on high-resolution plates using a Faxitron, and H-E staining of the liver tissue was also done. The microbarium was well distributed in all small vessels without filling defects, and we could identify the hexagonal classic liver lobule, with the central vein located centrally and the portal veins at the periphery. Magnified views showed numerous sinusoids, with less contrast in the central portion of the lobule, although the central vein itself was well filled by microbarium. The peripheral portion of the lobule was also well filled, so we could identify the diamond-shaped liver acinus, in which the central veins lie at the periphery and the center of the acinus is a terminal portal venule growing out from a small portal space. Three acini made up a complex acinus, and an acinar agglomerate was composed of three or four complex acini. We consider that Rappaport's liver acinus pattern fits the microangiographic findings better than the classic concept of the hepatic lobule.

  6. Physics of collisionless scrape-off-layer plasma during normal and off-normal Tokamak operating conditions

    International Nuclear Information System (INIS)

    Hassanein, A.; Konkashbaev, I.

    1999-01-01

    The structure of a collisionless scrape-off-layer (SOL) plasma in tokamak reactors is being studied to define the electron distribution function and the corresponding sheath potential between the divertor plate and the edge plasma. The collisionless model is shown to be valid during the thermal phase of a plasma disruption, as well as during the newly desired low-recycling normal phase of operation with low-density, high-temperature edge plasma conditions. An analytical solution is developed by solving the Fokker-Planck equation for electron distribution and balance in the SOL. The solution is in good agreement with numerical studies using Monte Carlo methods. The analytical solutions provide insight into the role of different physical and geometrical processes in a collisionless SOL during disruptions and during the enhanced phase of normal operation over a wide range of parameters.

  7. Partial LVAD restores ventricular outputs and normalizes LV but not RV stress distributions in the acutely failing heart in silico

    OpenAIRE

    Sack, Kevin L.; Baillargeon, Brian; Acevedo-Bolton, Gabriel; Genet, Martin; Rebelo, Nuno; Kuhl, Ellen; Klein, Liviu; Weiselthaler, Georg M.; Burkhoff, Daniel; Franz, Thomas; Guccione, Julius M.

    2016-01-01

    Purpose: Heart failure is a worldwide epidemic that is unlikely to change as the population ages and life expectancy increases. We sought to detail significant recent improvements to the Dassault Systèmes Living Heart Model (LHM) and use the LHM to compute left ventricular (LV) and right ventricular (RV) myofiber stress distributions under the following 4 conditions: (1) normal cardiac function; (2) acute left heart failure (ALHF); (3) ALHF treated using an LV assist device (LVAD) flow rate o...

  8. Computer modeling the boron compound factor in normal brain tissue

    International Nuclear Information System (INIS)

    Gavin, P.R.; Huiskamp, R.; Wheeler, F.J.; Griebenow, M.L.

    1993-01-01

    The macroscopic distribution of borocaptate sodium (Na2B12H11SH, or BSH) in normal tissues has been determined and can be accurately predicted from the blood concentration. The compound para-borono-phenylalanine (p-BPA) has also been studied in dogs and its normal tissue distribution has been determined. The total physical dose required to reach a biological isoeffect appears to increase directly as the proportion of the boron capture dose increases. This effect, together with knowledge of the macrodistribution, led to estimates of the influence of the microdistribution of the BSH compound. This paper reports a computer model that was used to predict the compound factor for BSH and p-BPA and, hence, the equivalent radiation in normal tissues. The compound factor would need to be calculated for other compounds with different distributions. This information is needed to design appropriate normal tissue tolerance studies for different organ systems and/or different boron compounds.

  9. Complex Network Theory Applied to the Growth of Kuala Lumpur's Public Urban Rail Transit Network.

    Directory of Open Access Journals (Sweden)

    Rui Ding

    Recently, the number of studies involving complex network applications in transportation has increased steadily as scholars from various fields analyze traffic networks. Nonetheless, research on rail network growth is relatively rare. This research examines the evolution of the Public Urban Rail Transit Networks of Kuala Lumpur (PURTNoKL) based on complex network theory and covers both the topological structure of the rail system and future trends in network growth. In addition, network performance when facing different attack strategies is also assessed. Three topological network characteristics are considered: connections, clustering and centrality. In PURTNoKL, we found that the total numbers of nodes and edges exhibit a linear relationship and that the average degree stays within the interval [2.0488, 2.6774] with heavy-tailed distributions. The evolutionary process shows that the cumulative probability distribution (CPD) of degree and the average shortest path length show good fit with an exponential distribution and a normal distribution, respectively. Moreover, PURTNoKL exhibits clear cluster characteristics; most of the nodes have a 2-core value, and the CPDs of closeness centrality and betweenness centrality follow a normal distribution function and an exponential distribution, respectively. Finally, we discuss four different types of network growth styles and the line extension process, which reveal that the rail network's growth is likely based on the nodes with the largest shortest-path lengths and that network protection should emphasize those nodes with the largest degrees and the highest betweenness values. This research may enhance the networkability of the rail system and better shape the future growth of public rail networks.
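    Two of the topological quantities used in this record, average degree and average shortest path length, are straightforward to compute with a breadth-first search; the toy 10-station rail-like network below is purely illustrative:

```python
from collections import deque

def average_shortest_path(adj):
    """Mean shortest-path length over all ordered node pairs (BFS per node)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for node, d in dist.items() if node != s)
        pairs += len(dist) - 1
    return total / pairs

# Toy rail-like network: a line of 10 stations plus one transfer link.
adj = {i: set() for i in range(10)}
for i in range(9):
    adj[i].add(i + 1); adj[i + 1].add(i)
adj[0].add(5); adj[5].add(0)

degrees = [len(adj[n]) for n in adj]
avg_degree = sum(degrees) / len(degrees)
asp = average_shortest_path(adj)
print(avg_degree, asp)  # avg degree 2.0, close to the paper's interval
```

For the full study one would also need clustering, k-cores, and centrality CPDs, which a library such as networkx provides directly.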

  10. Power distribution monitoring and control in the RBMK type reactors

    International Nuclear Information System (INIS)

    Emel'yanov, I.Ya.; Postnikov, V.V.; Volod'ko, Yu.I.

    1980-01-01

    The structures of the monitoring and control systems for the RBMK-1000 reactor are considered, including three largely independent systems: the control and safety system (CSS); the system for physical control of energy distribution (SPCED); and the Scala system for centralized control (SCC). The main functions and peculiarities of each system are discussed, with particular attention to new structural solutions and new equipment components used in these systems. The RBMK operating software and the routine of energy-distribution control are described. The set of reactor control and monitoring systems has a hierarchical structure. The first level comprises analog systems (CSS and SPCED) that normalize and transmit detector signals to the second-level, computer-based systems, which perform data processing, data presentation to the operator, automatic control of energy distribution (through the CSS), diagnostics of equipment condition, and local safety, with allowance for the existing margins with respect to burnout and thermal loading of the fuel assemblies. The third level includes a powerful computer carrying out complex physical and optimization calculations and providing the interconnection with the external computer of the power system. A typical feature of the complex is the provision of local automatic protection of the reactor against erroneous withdrawal of any control rod. The complex is designed for complete automation of energy-distribution control in steady-state and transient operating conditions.

  11. Absolute quantification of pharmacokinetic distribution of RES colloids in individuals with normal liver function

    International Nuclear Information System (INIS)

    Herzog, H.; Spohr, G.; Notohamiprodjo, G.; Feinendegen, L.E.

    1987-01-01

    Estimates of the radiation dose resulting from liver-spleen scintigraphy with 99mTc-labelled colloids are based on pharmacokinetic data mainly determined in animals. The aim of this study was to check these pharmacokinetic data by direct, absolute in vivo quantification in man. Liver and spleen activities were directly measured using a double-energy-window technique. Activities in other organs were quantified by conjugate whole-body scans. All measurement procedures were checked using the whole-body Alderson phantom. Pharmacokinetic data for sulphur colloid, tin colloid, human serum albumin (HSA) millimicrospheres, and phytate were obtained in 13 to 20 normal subjects for each type of colloid. Depending on the colloid type, liver uptake was between 54 and 75% of the total administered dose (TAD) and spleen uptake was 3.5 to 21% TAD. Activity measured in blood, urine, lung and thyroid proved to be far from negligible. The results of this work suggest a correction of the animal-based data on colloid distribution and radiation dose on the basis of the direct measurement of absolute uptake in man. (author)

  12. On the maximum entropy distributions of inherently positive nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Taavitsainen, A., E-mail: aapo.taavitsainen@gmail.com; Vanhanen, R.

    2017-05-11

    The multivariate log-normal distribution is used by many authors and statistical uncertainty propagation programs for inherently positive quantities. Sometimes it is claimed that the log-normal distribution results from the maximum entropy principle, if only means, covariances and inherent positiveness of quantities are known or assumed to be known. In this article we show that this is not true. Assuming a constant prior distribution, the maximum entropy distribution is in fact a truncated multivariate normal distribution – whenever it exists. However, its practical application to multidimensional cases is hindered by lack of a method to compute its location and scale parameters from means and covariances. Therefore, regardless of its theoretical disadvantage, use of other distributions seems to be a practical necessity. - Highlights: • Statistical uncertainty propagation requires a sampling distribution. • The objective distribution of inherently positive quantities is determined. • The objectivity is based on the maximum entropy principle. • The maximum entropy distribution is the truncated normal distribution. • Applicability of log-normal or normal distribution approximation is limited.
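    The positivity-truncated normal identified above as the maximum entropy distribution can be sampled by simple rejection; note, as the highlights caution, that the location and scale parameters of the truncated distribution are not its mean and variance. The numbers below are illustrative:

```python
import random, statistics

def truncated_normal_positive(mu, sigma, n, seed=0):
    """Rejection-sample N(mu, sigma^2) conditioned on x > 0, i.e. a normal
    distribution truncated to the inherently positive domain."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.gauss(mu, sigma)
        if x > 0:
            out.append(x)
    return out

xs = truncated_normal_positive(1.0, 1.0, 50000)
# Truncation discards the negative tail, so the sample mean exceeds the
# location parameter mu = 1: this is exactly the location/mean mismatch
# that makes fitting the truncated normal from given moments non-trivial.
print(min(xs) > 0, statistics.mean(xs) > 1.0)
```

Recovering (mu, sigma) from a prescribed mean and covariance would require solving the moment equations numerically, which is the practical obstacle the abstract describes.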

  13. powerbox: Arbitrarily structured, arbitrary-dimension boxes and log-normal mocks

    Science.gov (United States)

    Murray, Steven G.

    2018-05-01

    powerbox creates density grids (or boxes) with an arbitrary two-point distribution (i.e. power spectrum). The software works in any number of dimensions, creates Gaussian or Log-Normal fields, and measures power spectra of output fields to ensure consistency. The primary motivation for creating the code was the simple creation of log-normal mock galaxy distributions, but the methodology can be used for other applications.
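    A toy 1-D analogue of the log-normal mock procedure (draw a Gaussian random field with a chosen power spectrum, then exponentiate); this is a sketch of the general method, not powerbox's API, and the red P(k) ∝ 1/k² spectrum is an arbitrary choice:

```python
import numpy as np

def lognormal_field(n, amplitude=0.5, seed=0):
    """1-D toy log-normal mock: Gaussian field with P(k) ~ 1/k^2, exponentiated
    into a mean-zero overdensity field delta >= -1."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n)
    power = np.zeros_like(k)
    power[1:] = 1.0 / k[1:] ** 2  # skip the k = 0 (mean) mode
    modes = (rng.normal(size=k.size) + 1j * rng.normal(size=k.size)) * np.sqrt(power)
    gauss = np.fft.irfft(modes, n=n)
    gauss *= amplitude / gauss.std()          # fix the field's std
    return np.exp(gauss - gauss.var() / 2.0) - 1.0  # log-normal overdensity

d = lognormal_field(1024)
print(d.min() > -1.0)  # True: a log-normal overdensity is bounded below by -1
```

The bound delta ≥ −1 (density cannot be negative) is precisely why log-normal rather than Gaussian fields are used for mock galaxy densities.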

  14. S-curve networks and an approximate method for estimating degree distributions of complex networks

    OpenAIRE

    Guo, Jin-Li

    2010-01-01

    In the study of complex networks almost all theoretical models have the property of infinite growth, but the size of actual networks is finite. Based on statistics from China Internet IPv4 (Internet Protocol version 4) addresses, this paper proposes a forecasting model using an S-curve (logistic curve). The growing trend of IPv4 addresses in China is forecast. The results have some reference value for optimizing the distribution of IPv4 address resources and for the development of IPv6. Based o...

  15. Developing TOPSIS method using statistical normalization for selecting knowledge management strategies

    Directory of Open Access Journals (Sweden)

    Amin Zadeh Sarraf

    2013-09-01

    Purpose: Numerous companies expect their knowledge management (KM) to perform effectively in order to leverage and transform knowledge into competitive advantages. However, this raises the critical issue of how companies can better evaluate and select a favorable KM strategy prior to a successful KM implementation. Design/methodology/approach: An extension of TOPSIS, a multi-attribute decision making (MADM) technique, to a group decision environment is investigated. TOPSIS is a practical and useful technique for ranking and selecting among a number of externally determined alternatives through distance measures. The entropy method is often used for assessing weights in the TOPSIS method; entropy in information theory is a criterion for measuring the amount of disorder represented by a discrete probability distribution. To reduce employees' resistance to implementing a new strategy, it seems necessary to take the opinions of all managers into account. The normal distribution, the most prominent probability distribution in statistics, is used to normalize the gathered data. Findings: The results of this study show that, considering 6 criteria for evaluating the alternatives, the most appropriate KM strategy to implement in our company was ''Personalization''. Research limitations/implications: In this research, there are some assumptions that might affect the accuracy of the approach, such as the normal distribution of the sample and community. These assumptions can be changed in future work. Originality/value: This paper proposes an effective solution based on a combined entropy and TOPSIS approach to help companies that need to evaluate and select KM strategies. In the presented solution, the opinions of all managers are gathered and normalized by using the standard normal distribution and the central limit theorem. Keywords: Knowledge management; strategy; TOPSIS; normal distribution; entropy
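    The distance-to-ideal ranking at the core of TOPSIS can be sketched as follows; for brevity this uses equal criterion weights rather than the entropy weights of the paper, and the alternative scores are invented:

```python
import math

def topsis(matrix, benefit):
    """TOPSIS ranking. matrix[i][j]: score of alternative i on criterion j;
    benefit[j]: True if larger is better. Equal criterion weights are assumed
    here (the paper derives weights from entropy instead)."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    r = [[matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal and anti-ideal points per criterion.
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*r))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*r))]
    scores = []
    for row in r:
        dp = math.sqrt(sum((x - ideal[j]) ** 2 for j, x in enumerate(row)))
        dm = math.sqrt(sum((x - anti[j]) ** 2 for j, x in enumerate(row)))
        scores.append(dm / (dp + dm))  # closeness coefficient in [0, 1]
    return scores

# Two hypothetical KM strategies scored on three benefit criteria.
scores = topsis([[7, 9, 8], [6, 5, 7]], [True, True, True])
print(scores)  # [1.0, 0.0]: the first alternative dominates on every criterion
```

Replacing the equal weights with entropy-derived weights only changes the distance terms to weighted sums; the ranking logic is unchanged.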

  16. NEAR-INFRARED POLARIMETRY OF A NORMAL SPIRAL GALAXY VIEWED THROUGH THE TAURUS MOLECULAR CLOUD COMPLEX

    International Nuclear Information System (INIS)

    Clemens, Dan P.; Cashman, L. R.; Pavel, M. D.

    2013-01-01

    Few normal galaxies have been probed using near-infrared polarimetry, even though it reveals magnetic fields in the cool interstellar medium better than either optical or radio polarimetry. Deep H-band (1.6 μm) linear imaging polarimetry toward Taurus serendipitously included the galaxy 2MASX J04412715+2433110 with adequate sensitivity and resolution to map polarization across nearly its full extent. The observations revealed the galaxy to be a steeply inclined (∼75°) disk type with a diameter, encompassing 90% of the Petrosian flux, of 4.2 kpc at a distance of 53 Mpc. Because the sight line passes through the Taurus Molecular Cloud complex, the foreground polarization needed to be measured and removed. The foreground extinction A_V of 2.00 ± 0.10 mag and reddening E(H – K) of 0.125 ± 0.009 mag were also assessed and removed, based on analysis of Two Micron All Sky Survey, UKIRT Infrared Deep Sky Survey, Spitzer, and Wide-field Infrared Survey Explorer photometry using the Near-Infrared Color Excess, NICE-Revisited, and Rayleigh-Jeans Color Excess methods. Corrected for the polarized foreground, the galaxy polarization values range from 0% to 3%. The polarizations are dominated by a disk-parallel magnetic field geometry, especially to the northeast, while either a vertical field or single scattering of bulge light produces disk-normal polarizations to the southwest. The multi-kiloparsec coherence of the magnetic field revealed by the infrared polarimetry is in close agreement with short-wavelength radio synchrotron observations of edge-on galaxies, indicating that both cool and warm interstellar media of disk galaxies may be threaded by common magnetic fields.

  17. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies.

    Directory of Open Access Journals (Sweden)

    Wesley K Thompson

    2015-12-01

    Characterizing the distribution of effects from genome-wide genotyping data is crucial for understanding important aspects of the genetic architecture of complex traits, such as number or proportion of non-null loci, average proportion of phenotypic variance explained per non-null effect, power for discovery, and polygenic risk prediction. To this end, previous work has used effect-size models based on various distributions, including the normal and normal mixture distributions, among others. In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local false discovery rate, and power for discovery of a specified proportion of phenotypic variance explained from additive effects of loci surpassing a given significance threshold. We also examine the crucial issue of the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While capturing the general behavior of the data, this mixture model underestimates the tails of the CD effect size distribution. We discuss the

  18. An Empirical Bayes Mixture Model for Effect Size Distributions in Genome-Wide Association Studies.

    Science.gov (United States)

    Thompson, Wesley K; Wang, Yunpeng; Schork, Andrew J; Witoelar, Aree; Zuber, Verena; Xu, Shujing; Werge, Thomas; Holland, Dominic; Andreassen, Ole A; Dale, Anders M

    2015-12-01

    Characterizing the distribution of effects from genome-wide genotyping data is crucial for understanding important aspects of the genetic architecture of complex traits, such as number or proportion of non-null loci, average proportion of phenotypic variance explained per non-null effect, power for discovery, and polygenic risk prediction. To this end, previous work has used effect-size models based on various distributions, including the normal and normal mixture distributions, among others. In this paper we propose a scale mixture of two normals model for effect size distributions of genome-wide association study (GWAS) test statistics. Test statistics corresponding to null associations are modeled as random draws from a normal distribution with zero mean; test statistics corresponding to non-null associations are also modeled as normal with zero mean, but with larger variance. The model is fit via minimizing discrepancies between the parametric mixture model and resampling-based nonparametric estimates of replication effect sizes and variances. We describe in detail the implications of this model for estimation of the non-null proportion, the probability of replication in de novo samples, the local false discovery rate, and power for discovery of a specified proportion of phenotypic variance explained from additive effects of loci surpassing a given significance threshold. We also examine the crucial issue of the impact of linkage disequilibrium (LD) on effect sizes and parameter estimates, both analytically and in simulations. We apply this approach to meta-analysis test statistics from two large GWAS, one for Crohn's disease (CD) and the other for schizophrenia (SZ). A scale mixture of two normals distribution provides an excellent fit to the SZ nonparametric replication effect size estimates. While capturing the general behavior of the data, this mixture model underestimates the tails of the CD effect size distribution. We discuss the implications of
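    The two-groups logic of the scale mixture model can be illustrated through the local false discovery rate it implies; the mixture parameters below are invented for illustration, not the fitted GWAS values:

```python
import math

def local_fdr(z, pi0, sigma0, sigma1):
    """Local false discovery rate under a two-groups scale mixture of normals:
    null ~ N(0, sigma0^2) with weight pi0; non-null ~ N(0, sigma1^2)."""
    def norm_pdf(x, s):
        return math.exp(-x * x / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
    f0 = pi0 * norm_pdf(z, sigma0)          # null component density
    f1 = (1 - pi0) * norm_pdf(z, sigma1)    # non-null (wider) component density
    return f0 / (f0 + f1)                   # posterior probability of null

# Large |z| scores are far less likely under the narrow null component:
print(local_fdr(0.5, pi0=0.95, sigma0=1.0, sigma1=3.0))  # close to 1
print(local_fdr(4.0, pi0=0.95, sigma0=1.0, sigma1=3.0))  # close to 0
```

In the paper the parameters (pi0 and the variances) are estimated from resampling-based replication statistics; here they are fixed by hand purely to show the mechanics.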

  19. A method for the calculation of the cumulative failure probability distribution of complex repairable systems

    International Nuclear Information System (INIS)

    Caldarola, L.

    1976-01-01

    A method is proposed for the analytical evaluation of the cumulative failure probability distribution of complex repairable systems. The method is based on a set of integral equations, each referring to a specific minimal cut set of the system. Each integral equation links the unavailability of a minimal cut set to its failure probability density distribution and to the probability that the minimal cut set is down at time t under the condition that it was down at time t' (t' <= t). The limitations on the applicability of the method are also discussed. It is concluded that the method is applicable if the process describing the failure of a minimal cut set is a 'delayed semi-regenerative process'. (Auth.)

  20. Pharmacokinetics of tritiated water in normal and dietary-induced obese rats

    International Nuclear Information System (INIS)

    Shum, L.Y.; Jusko, W.J.

    1986-01-01

    Tritiated water disposition was characterized in normal and dietary-induced obese rats to assess pharmacokinetic concerns in calculating water space and estimating body fat. A monoexponential decline in serum tritium activity was observed in both groups of rats, thus facilitating use of various computational methods. The volume of distribution and the total clearance of tritium in obese rats were larger than in normal rats because of the increased body weight. The values of water space (volume of distribution) estimated from moment analysis or dose divided by serum tritium activity at time zero (extrapolated) or at 2 hr were all similar. Thus, obesity does not alter the distribution equilibrium time and distribution pattern of tritium, and the conventional 2-hr single blood sampling after intravenous injection is adequate to estimate the water space of normal and obese rats
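    The computations described, water space as dose over the extrapolated C(0) and clearance from the monoexponential decline, can be sketched directly; the numbers are illustrative, not the study's data:

```python
import math

def pk_monoexponential(dose, c0, k):
    """One-compartment parameters from a monoexponential decline
    C(t) = C0 * exp(-k t): volume of distribution, clearance, half-life."""
    vd = dose / c0               # water space = dose / extrapolated C(0)
    cl = k * vd                  # total clearance = elimination constant x Vd
    half_life = math.log(2) / k
    return vd, cl, half_life

# Illustrative numbers only: 1000 kBq tritium dose, C0 = 5 kBq/mL, k = 0.01 /h.
vd, cl, t_half = pk_monoexponential(dose=1000.0, c0=5.0, k=0.01)
print(vd, cl, t_half)  # 200.0 mL, 2.0 mL/h, ~69.3 h
```

Because the decline is monoexponential, dose divided by the concentration at any early time point (e.g. the 2-hour sample used in the study) back-extrapolates to nearly the same Vd, which is why the single-sample protocol works.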

  1. Effect of vanadium treatment on tissue distribution of biotrace elements in normal and streptozotocin-induced diabetic rats. Simultaneous analysis of V and Zn using radioactive multitracer

    International Nuclear Information System (INIS)

    Yasui, Hiroyuki; Takino, Toshikazu; Fugono, Jun; Sakurai, Hiromu; Hirunuma, Rieko; Enomoto, Shuichi

    2001-01-01

    Because vanadium ions such as vanadyl (VO2+) and vanadate (VO3-) ions were demonstrated to normalize the blood glucose levels of diabetic animals and patients, the action mechanism of vanadium treatment has been of interest. In this study, we focused on understanding interactions among trace elements in diabetic rats, using a multitracer technique. The effects of vanadyl sulfate (VS) treatment on the tissue distribution of trace vanadium (48V) and zinc (65Zn) in normal and streptozotocin (STZ)-induced diabetic rats were examined and evaluated in terms of the uptake ratio. The uptake ratio of both elements in tissues changed significantly between STZ rats and those treated with VS. These results indicate that vanadium treatment in STZ rats alters the tissue distribution of endogenous elements, suggesting the importance of the relationship between biotrace elements and pathophysiology. (author)

  2. Immunolocalization of transforming growth factor alpha in normal human tissues

    DEFF Research Database (Denmark)

    Christensen, M E; Poulsen, Steen Seier

    1996-01-01

    anchorage-independent growth of normal cells and was, therefore, considered an "oncogenic" growth factor. Later, its immunohistochemical presence in normal human cells, as well as its biological effects in normal human tissues, was demonstrated. The aim of the present investigation was to elucidate the distribution of the growth factor in a broad spectrum of normal human tissues. Indirect immunoenzymatic staining methods were used, and the polypeptide was detected with both a polyclonal and a monoclonal antibody; the two antibodies demonstrated almost identical immunoreactivity. TGF-alpha was found to be widely distributed in cells of normal human tissues derived from all three germ layers, most often in differentiated cells. In epithelial cells, three different staining patterns were observed: diffuse cytoplasmic, cytoplasmic in the basal parts of the cells, or distinctly...

  3. Radiochemical synthesis and tissue distribution of Tc-99m-labeled 7α-substituted estradiol complexes

    International Nuclear Information System (INIS)

    Skaddan, Marc B.; Wuest, Frank R.; Jonson, Stephanie; Syhre, Rosemarie; Welch, Michael J.; Spies, Hartmut; Katzenellenbogen, John A.

    2000-01-01

    The diagnosis and staging of breast cancer could be improved by the development of radiopharmaceutical imaging agents that provide a noninvasive determination of the estrogen receptor (ER) status of tumor cells. Agents labeled with 99mTc would be especially valuable in this regard. In attempting to achieve this goal, we synthesized four 99mTc-labeled 7α-substituted estradiol complexes. One complex utilizes the 3+1 mixed-ligand design to introduce the Tc metal, whereas the other three take advantage of the cyclopentadienyltricarbonylmetal (CpTM) design. The Tc moieties were attached to the 7α position of estradiol with a hexyl tether, a monoether tether, or a polyether tether. The corresponding rhenium compounds have binding affinities for the ER of 20-45% compared with estradiol. Radiochemical yields of the 99mTc-labeled compounds ranged from approximately 15% for the CpTM-Tc complexes to 95% for the 3+1 inorganic complex. Tissue distribution studies in immature female rats showed low nonreceptor-mediated uptake in the target organs and high uptake in nontarget organs such as the liver and fat. These complexes represent the first time that estradiol has been labeled at the 7α position with 99mTc, and they provide a further refinement of our understanding of ligand structure-binding affinity correlations for the ER.

  4. Characterization of subgraph relationships and distribution in complex networks

    International Nuclear Information System (INIS)

    Antiqueira, Lucas; Fontoura Costa, Luciano da

    2009-01-01

    A network can be analyzed at different topological scales, ranging from single nodes to motifs, communities, up to the complete structure. We propose a novel approach which extends from single nodes to the whole network level by considering non-overlapping subgraphs (i.e. connected components) and their interrelationships and distribution through the network. Though such subgraphs can be completely general, our methodology focuses on the cases in which the nodes of these subgraphs share some special feature, such as being critical for the proper operation of the network. The methodology of subgraph characterization involves two main aspects: (i) the generation of histograms of subgraph sizes and distances between subgraphs and (ii) a merging algorithm, developed to assess the relevance of nodes outside subgraphs by progressively merging subgraphs until the whole network is covered. The latter procedure complements the histograms by taking into account the nodes lying between subgraphs, as well as the relevance of these nodes to the overall subgraph interconnectivity. Experiments were carried out using four types of network models and five instances of real-world networks, in order to illustrate how subgraph characterization can help complementing complex network-based studies.
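    The histogram step of the methodology, subgraph sizes of a flagged node set and shortest distances between the resulting subgraphs, can be sketched with a stdlib-only toy example. The graph, the "critical" node set, and all names below are hypothetical; a graph library such as networkx would normally be used instead:

```python
from collections import deque, Counter

def components(nodes, adj):
    """Connected components of the subgraph induced by `nodes`."""
    nodes, seen, comps = set(nodes), set(), []
    for start in sorted(nodes):          # sorted for deterministic output
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v in nodes and v not in comp:
                    comp.add(v); seen.add(v); queue.append(v)
        comps.append(comp)
    return comps

def distance(src, dst, adj):
    """Shortest path length in the full network between two node sets."""
    queue, seen = deque((u, 0) for u in src), set(src)
    while queue:
        u, d = queue.popleft()
        if u in dst:
            return d
        for v in adj[u]:
            if v not in seen:
                seen.add(v); queue.append((v, d + 1))
    return None

# Toy network: a path 0-1-2-3-4-5 with flagged ("critical") nodes {0, 1, 4}.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
critical = [0, 1, 4]
comps = components(critical, adj)        # two subgraphs: {0, 1} and {4}
size_hist = Counter(len(c) for c in comps)
d = distance(comps[0], comps[1], adj)    # nodes 2 and 3 lie between them
```

    The nodes returned on the shortest path between subgraphs are exactly the "nodes lying between subgraphs" that the merging algorithm would absorb first.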

  5. Some features of light propagation through layers with a complex refractive index

    International Nuclear Information System (INIS)

    Efimov, V.V.; Sementsov, D.I.

    1994-01-01

    By solving Maxwell's equations, expressions are obtained for the energy fluxes both inside and outside a layer with a complex refractive index at normal incidence of light. It is shown that inside the layer, along with fluxes of forward and backward waves, an interference flux can be distinguished whose magnitude is proportional to the imaginary part of the refractive index. A detailed numerical analysis of the energy transmission (T) and reflection (R) coefficients versus the thickness of the layer with negative absorption is performed for normal incidence of light onto the layer surface. Total distribution of the energy flux over the layer thickness is considered both for absorbing and amplifying layers. 13 refs., 4 figs

  6. DFT calculations of the structures and vibrational spectra of the [Fe(bpy)3]2+ and [Ru(bpy)3]2+ complexes

    International Nuclear Information System (INIS)

    Alexander, Bruce D.; Dines, Trevor J.; Longhurst, Rayne W.

    2008-01-01

    Structures of the [M(bpy) 3 ] 2+ complexes (M = Fe and Ru) have been calculated at the B3-LYP/DZVP level. IR and Raman spectra were calculated using the optimised geometries, employing a scaled quantum chemical force field, and compared with an earlier normal coordinate analysis of [Ru(bpy) 3 ] 2+ which was based upon experimental data alone, and the use of a simplified model. The results of the calculations provide a highly satisfactory fit to the experimental data and the normal coordinate analyses, in terms of potential energy distributions, allow a detailed understanding of the vibrational spectra of both complexes. Evidence is presented for Jahn-Teller distortion in the 1 E MLCT excited state

  8. Molecular orbital calculations of the unpaired electron distribution and electric field gradients in divalent paramagnetic Ir complexes

    International Nuclear Information System (INIS)

    Nogueira, S.R.; Vugman, N.V.; Guenzburger, D.

    1988-01-01

    Semi-empirical Molecular Orbital calculations were performed for the paramagnetic complex ions [Ir(CN) 5 ] 3- , [Ir(CN) 5 Cl] 4- and [Ir(CN) 4 Cl 2 ] 4- . Energy level schemes and Mulliken-type populations were obtained. The distribution of the unpaired spin over the atoms in the complexes was derived, and compared to data obtained from Electron Paramagnetic Resonance spectra with the aid of a Ligand Field model. The electric field gradients at the Ir nucleus were calculated and compared to experiment. The results are discussed in terms of the chemical bonds formed by Ir and the ligands. (author) [pt

  9. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    Science.gov (United States)

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN from test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption of the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating the equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., reference product against itself) and reject an inequivalent product (e.g., reference product against negative control), thus suggesting its usefulness in supporting bioequivalence determination of a test product to the reference product when both possess multimodal PSDs.
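    In one dimension the EMD between two profiles on a shared grid reduces to the area between their cumulative distribution functions, which makes a small stdlib-only sketch possible. The PSD profiles below are hypothetical, and the subsequent PBE test on the distances is not shown:

```python
from itertools import accumulate

def emd_1d(p, q, bin_width=1.0):
    """Earth mover's distance between two 1-D histograms on the same grid.

    In 1-D, EMD equals the integrated absolute difference of the two CDFs.
    """
    if len(p) != len(q):
        raise ValueError("profiles must share the same size grid")
    sp, sq = sum(p), sum(q)
    cp = accumulate(x / sp for x in p)    # normalised cumulative profiles
    cq = accumulate(x / sq for x in q)
    return bin_width * sum(abs(a - b) for a, b in zip(cp, cq))

# Hypothetical PSD profiles (counts per size bin): the test profile is the
# reference shifted one bin to the right, so the EMD is one bin width.
ref_psd  = [0, 5, 5, 0, 0]
test_psd = [0, 0, 5, 5, 0]
d_shift = emd_1d(ref_psd, test_psd)
d_self  = emd_1d(ref_psd, ref_psd)
```

    Unlike D50 alone, this distance keeps growing as secondary peaks move apart, which is why it suits multimodal profiles.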

  10. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.

  11. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    Full Text Available The study analyses data in order to improve the quality of statistical tools used in the assembly processes of automobile seats. Normal distribution of variables is one of the essential conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are increasingly more approaches to handling non‐normal data. The appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution to the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of 1st‐row automobile seats for each quality characteristic (Safety Regulation ‐ S/R individually. The study closely processes the measured data of an airbag's assembly, aiming to obtain normally distributed data and apply statistical process control. The results of the contribution conclude in a rejection of the null hypothesis (the measured variables do not follow the normal distribution), so it is necessary to transform the data, supported by Minitab 15. Even this approach does not yield normally distributed data, and so a procedure should be proposed that leads to a quality output of the whole statistical control of the manufacturing processes.
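    As an illustration of the kind of normality check and transformation discussed above (not the StatGraphics/Minitab workflow itself), one can compute a Jarque-Bera-type statistic before and after a log transform. The measurements below are simulated and hypothetical:

```python
import math
import random

def skew_kurt(xs):
    """Sample skewness and excess kurtosis."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0

def jarque_bera(xs):
    """JB statistic: close to 0 for normal data, large for skewed/heavy tails."""
    n = len(xs)
    s, k = skew_kurt(xs)
    return n / 6.0 * (s ** 2 + k ** 2 / 4.0)

random.seed(42)
# Hypothetical right-skewed quality characteristic (e.g. a pull-force reading).
raw = [random.lognormvariate(0.0, 0.6) for _ in range(2000)]
jb_raw = jarque_bera(raw)                              # normality rejected
jb_log = jarque_bera([math.log(x) for x in raw])       # after log transform
```

    A statistic far above the chi-square(2) critical value (about 5.99 at the 5% level) signals non-normal data; here the log transform brings it back into the acceptance region.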

  12. Effect of vanadium treatment on tissue distribution of biotrace elements in normal and streptozotocin-induced diabetic rats. Simultaneous analysis of V and Zn using radioactive multitracer

    Energy Technology Data Exchange (ETDEWEB)

    Yasui, Hiroyuki; Takino, Toshikazu; Fugono, Jun; Sakurai, Hiromu [Department of Analytical and Bioinorganic Chemistry, Kyoto Pharmaceutical University, Kyoto (Japan); Hirunuma, Rieko; Enomoto, Shuichi [Radioisotope Technology Division, Cyclotron Center, Institute of Physical and Chemical Research (RIKEN), Wako, Saitama (Japan)

    2001-05-01

    Because vanadium ions such as vanadyl (VO{sup 2+}) and vanadate (VO{sup 3-}) ions were demonstrated to normalize blood glucose levels of diabetic animals and patients, the action mechanism of vanadium treatment has been of interest. In this study, we focused on understanding interactions among trace elements in diabetic rats, in which a multitracer technique was used. The effects of vanadyl sulfate (VS)-treatment on the tissue distribution of trace vanadium ({sup 48}V) and zinc ({sup 65}Zn) in normal and streptozotocin (STZ)-induced diabetic rats were examined, and were evaluated in terms of the uptake ratio. The uptake ratio of both elements in tissues significantly changed between STZ-rats and those treated with VS. These results indicated that vanadium treatment in STZ-rats alters the tissue distribution of endogenous elements, suggesting the importance of the relationship between biotrace elements and pathophysiology. (author)

  13. Complex saddle points and the sign problem in complex Langevin simulation

    International Nuclear Information System (INIS)

    Hayata, Tomoya; Hidaka, Yoshimasa; Tanizaki, Yuya

    2016-01-01

    We show that complex Langevin simulation converges to a wrong result within the semiclassical analysis, by relating it to the Lefschetz-thimble path integral, when the path-integral weight has different phases among dominant complex saddle points. Equilibrium solution of the complex Langevin equation forms local distributions around complex saddle points. Its ensemble average approximately becomes a direct sum of the average in each local distribution, where relative phases among them are dropped. We propose that by taking these phases into account through reweighting, we can solve the wrong convergence problem. However, this prescription may lead to a recurrence of the sign problem in the complex Langevin method for quantum many-body systems.
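    A minimal toy sketch of the complex Langevin update itself, for a one-variable Gaussian action S(z) = z**2/2 + i*beta*z where the method is known to converge. This illustrates only the update rule (complexified variable, real noise), not the multi-saddle situation analysed in the paper; beta, the step size, and the step counts are arbitrary choices:

```python
import math
import random

random.seed(1)

beta = 0.5

def S_prime(z):
    # S(z) = z**2/2 + 1j*beta*z, so the drift term is -S'(z) = -(z + 1j*beta)
    return z + 1j * beta

dt, n_steps, burn_in = 0.01, 200_000, 20_000
z = 0.0 + 0.0j
acc, count = 0j, 0
for step in range(n_steps):
    # Complex Langevin step: deterministic drift plus real Gaussian noise.
    z += -S_prime(z) * dt + math.sqrt(2.0 * dt) * random.gauss(0.0, 1.0)
    if step >= burn_in:
        acc += z
        count += 1
mean_z = acc / count
# Analytic moment of the weight exp(-S): <x> = -1j*beta,
# so Re<z> should be near 0 and Im<z> near -beta.
```

    The equilibrium distribution here is a single local distribution around the complex saddle z = -i*beta; the reweighting issue raised in the abstract only appears once several saddles with different phases contribute.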

  14. Log-Normal Turbulence Dissipation in Global Ocean Models

    Science.gov (United States)

    Pearson, Brodie; Fox-Kemper, Baylor

    2018-03-01

    Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O (10 km ) . As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet, energy dissipation obeys approximate log-normality—robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.
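    The two headline points, recovering the parameters of the log of the dissipation and the dominance of a few high-dissipation locations in the integrated budget, can be illustrated with synthetic log-normal data; the field and its parameters below are hypothetical stand-ins, not model output:

```python
import math
import random

random.seed(7)
# Hypothetical "dissipation" field: log-normal with a wide spread in the log.
eps = [random.lognormvariate(0.0, 2.0) for _ in range(100_000)]

# Fit the log-normal parameters: mean and std of log(eps).
log_eps = [math.log(e) for e in eps]
n = len(log_eps)
mu = sum(log_eps) / n
sigma = math.sqrt(sum((x - mu) ** 2 for x in log_eps) / n)

# Intermittency: the share of the integrated budget carried by the top 1%
# of locations (for sigma = 2 the analytic value is about 37%).
eps_sorted = sorted(eps, reverse=True)
top1_share = sum(eps_sorted[: n // 100]) / sum(eps)
```

    This dominance of rare large values is exactly why sparse observations can badly misestimate an integrated energy budget.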

  15. Application of multicomponent diffusion theory for description of impurities distribution in complex diffusive doping of semiconductors

    International Nuclear Information System (INIS)

    Uskov, V.A.; Kondrachenko, O.E.; Kondrachenko, L.A.

    1977-01-01

    A phenomenological theory of multicomponent diffusion involving interaction between the components is employed to analyze how the interaction between two admixtures affects their simultaneous or sequential diffusion into a semiconductor. The theory uses the equations of multicomponent diffusion under common conditions (constant diffusion coefficients and equilibrium distribution of vacancies). Experiments on the simultaneous diffusion of In and Sb into Ge are described. The diffusion is performed by routine gas-phase technology with the use of the radioactive isotopes In 114 and Sb 124. It is shown that the introduction of an additional diffusion coefficient D 12 makes it possible to describe simply and precisely the distribution of interacting admixtures in complex diffusion alloying of semiconductors
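    A minimal sketch of coupled diffusion equations with an off-diagonal coefficient D12, solved by the textbook explicit finite-difference scheme (not anything from the paper); the coefficients, grid, and initial step profiles are all hypothetical:

```python
def lap(c, dx):
    """Discrete second derivative with reflecting (zero-flux) boundaries."""
    ext = [c[0]] + list(c) + [c[-1]]
    return [(ext[i - 1] - 2.0 * ext[i] + ext[i + 1]) / dx ** 2
            for i in range(1, len(c) + 1)]

def diffuse(c1, c2, D11, D12, D21, D22, dx, dt, steps):
    """Explicit scheme for  dc1/dt = D11*c1'' + D12*c2''
                            dc2/dt = D21*c1'' + D22*c2''."""
    for _ in range(steps):
        l1, l2 = lap(c1, dx), lap(c2, dx)
        c1 = [a + dt * (D11 * u + D12 * v) for a, u, v in zip(c1, l1, l2)]
        c2 = [a + dt * (D21 * u + D22 * v) for a, u, v in zip(c2, l1, l2)]
    return c1, c2

# Hypothetical coefficients: D12 couples the flux of impurity 1 to the
# concentration gradient of impurity 2 (D21 = 0 here for simplicity).
nx, dx, dt = 50, 1.0, 0.2       # explicit stability needs dt <= dx**2/(2*Dmax)
c1 = [1.0 if i < 5 else 0.0 for i in range(nx)]   # impurity 1 near the surface
c2 = [1.0 if i < 5 else 0.0 for i in range(nx)]
c1, c2 = diffuse(c1, c2, 1.0, 0.4, 0.0, 0.5, dx, dt, 500)
```

    With D12 = 0 the two profiles would evolve independently; the cross term makes impurity 1 spread differently than its own D11 alone would predict, which is the effect the paper uses to fit the measured In/Sb profiles.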

  16. Determining the theoretical reliability function of thermal power system using simple and complex Weibull distribution

    Directory of Open Access Journals (Sweden)

    Kalaba Dragan V.

    2014-01-01

    Full Text Available The main subject of this paper is the presentation of a probabilistic technique for thermal power system reliability assessment. Reliability research on an operating fossil-fuel power plant system has defined the function, or probabilistic law, according to which the random variable (the occurrence of a complete unplanned standstill) behaves. Based on these data, and by applying reliability theory to this particular system using the simple and complex Weibull distributions, the hypothesis has been confirmed that the distribution of the observed random variable fully describes the behaviour of such a system in terms of reliability. Establishing comprehensive insight into the probabilistic power system reliability assessment technique could serve as an input for further research and development in the area of power system planning and operation.
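    The two-parameter Weibull reliability function and its derived quantities can be written down directly; the shape and scale values below are hypothetical placeholders, not fitted plant data:

```python
import math

def weibull_reliability(t, beta, eta):
    """Two-parameter Weibull reliability R(t) = exp(-(t/eta)**beta)."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """Hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def weibull_mttf(beta, eta):
    """Mean time to failure: MTTF = eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1.0 + 1.0 / beta)

# Hypothetical parameters for a thermal power unit: beta > 1 means wear-out
# (increasing hazard); beta = 1 would reduce to the exponential law.
beta, eta = 1.8, 4000.0                      # eta in hours of operation
r_eta = weibull_reliability(eta, beta, eta)  # always exp(-1) at t = eta
mttf = weibull_mttf(beta, eta)
```

    Whether the fitted shape parameter lies below, at, or above 1 is what distinguishes infant-mortality, random, and wear-out failure regimes in such an assessment.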

  17. Climate and the complexity of migratory phenology: sexes, migratory distance, and arrival distributions

    Science.gov (United States)

    Macmynowski, Dena P.; Root, Terry L.

    2007-05-01

    The intra- and inter-season complexity of bird migration has received limited attention in climatic change research. Our phenological analysis of 22 species collected in Chicago, USA (1979-2002) evaluates the relationship between multi-scalar climate variables and differences (1) in arrival timing between sexes, (2) in arrival distributions among species, and (3) between spring and fall migration. The early migratory period for earliest arriving species (i.e., short-distance migrants) and earliest arriving individuals of a species (i.e., males) most frequently correlate with climate variables. Compared to long-distance migrant species, four times as many short-distance migrants correlate with spring temperature, while 8 of 11 (73%) of long-distance migrant species’ arrival is correlated with the North Atlantic Oscillation (NAO). While migratory phenology has been correlated with NAO in Europe, we believe that this is the first documentation of a significant association in North America. Geographically proximate conditions apparently influence migratory timing for short-distance migrants while continental-scale climate (e.g., NAO) seemingly influences the phenology of Neotropical migrants. The preponderance of climate correlations is with the early migratory period, not the median of arrival, suggesting that early spring conditions constrain the onset or rate of migration for some species. The seasonal arrival distribution provides considerable information about migratory passage beyond what is apparent from statistical analyses of phenology. A relationship between climate and fall phenology is not detected at this location. Analysis of the within-season complexity of migration, including multiple metrics of arrival, is essential to detect species’ responses to changing climate as well as evaluate the underlying biological mechanisms.

  18. Distribution of crushing strength of tablets

    DEFF Research Database (Denmark)

    Sonnergaard, Jørn

    2002-01-01

    The distribution of a given set of data is important since most parametric statistical tests are based on the assumption that the studied data are normal distributed. In analysis of fracture mechanics the Weibull distribution is widely used and the derived Weibull modulus is interpreted as a mate...... data from nine model tablet formulations and four commercial tablets are shown to follow the normal distribution. The importance of proper cleaning of the crushing strength apparatus is demonstrated....

  19. UPLC-MS method for quantification of pterostilbene and its application to comparative study of bioavailability and tissue distribution in normal and Lewis lung carcinoma bearing mice.

    Science.gov (United States)

    Deng, Li; Li, Yongzhi; Zhang, Xinshi; Chen, Bo; Deng, Yulin; Li, Yujuan

    2015-10-10

    A UPLC-MS method was developed for the determination of pterostilbene (PTS) in plasma and tissues of mice. PTS was separated on an Agilent Zorbax XDB-C18 column (50 × 2.1 mm, 1.8 μm) with a gradient mobile phase at a flow rate of 0.2 ml/min. Detection was performed by negative ion electrospray ionization in multiple reaction monitoring mode. The linear calibration curves of PTS in mouse plasma and tissues ranged from 1.0 to 5000 and 0.50 to 500 ng/ml (r(2)>0.9979), respectively, and the lower limits of quantification (LLOQ) were between 0.5 and 2.0 ng/ml. The accuracy and precision of the assay were satisfactory. The validated method was applied to the study of the bioavailability and tissue distribution of PTS in normal and Lewis lung carcinoma (LLC) bearing mice. The bioavailability of PTS (doses 14, 28 and 56 mg/kg) in normal mice was 11.9%, 13.9% and 26.4%, respectively, and the maximum level (82.1 ± 14.2 μg/g) was found in the stomach (dose 28 mg/kg). The bioavailability, peak concentration (Cmax) and time to peak concentration (Tmax) of PTS in LLC mice were increased compared with normal mice. The results indicated that the UPLC-MS method is reliable and that the bioavailability and tissue distribution of PTS in normal and LLC mice are dramatically different. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Spatial distribution of cannabinoid receptor type 1 (CB1) in normal canine central and peripheral nervous system.

    Directory of Open Access Journals (Sweden)

    Jessica Freundt-Revilla

    Full Text Available The endocannabinoid system is a regulatory pathway consisting of two main types of cannabinoid receptors (CB1 and CB2) and their endogenous ligands, the endocannabinoids. The CB1 receptor is highly expressed in the central and peripheral nervous systems (PNS) in mammals and is involved in neuromodulatory functions. Since endocannabinoids were shown to be elevated in cerebrospinal fluid of epileptic dogs, knowledge about the species-specific CB receptor expression in the nervous system is required. Therefore, we assessed the spatial distribution of CB1 receptors in the normal canine CNS and PNS. Immunohistochemistry of several regions of the brain, spinal cord and peripheral nerves from a healthy four-week-old puppy, three six-month-old dogs, and one ten-year-old dog revealed strong dot-like immunoreactivity in the neuropil of the cerebral cortex, Cornu Ammonis (CA) and dentate gyrus of the hippocampus, midbrain, cerebellum, medulla oblongata and grey matter of the spinal cord. Dense CB1 expression was found in fibres of the globus pallidus and substantia nigra surrounding immunonegative neurons. Astrocytes were constantly positive in all examined regions. CB1 labelled neurons and satellite cells of the dorsal root ganglia, and myelinating Schwann cells in the PNS. These results demonstrate for the first time the spatial distribution of CB1 receptors in the healthy canine CNS and PNS. These results can be used as a basis for further studies aiming to elucidate the physiological consequences of this particular anatomical and cellular distribution.

  1. Uncertainty and dissent in climate risk assessment : a post-normal perspective

    NARCIS (Netherlands)

    Sluijs, J.P. van der

    2012-01-01

    Uncertainty, complexity and dissent make climate change hard to tackle with normal scientific procedures. In a post-normal perspective, the normal science task of “getting the facts right” is still regarded as necessary but no longer as fully feasible nor as sufficient to interface science and

  2. Value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations

    Directory of Open Access Journals (Sweden)

    Luo Li-Qin

    2016-01-01

    Full Text Available In this paper, we investigate the value distribution of meromorphic solutions of homogeneous and non-homogeneous complex linear differential-difference equations, and obtain results on the relations between the order of the solutions and the convergence exponents of the zeros, poles, a-points and small function value points of the solutions, which show that the relations in the case of non-homogeneous equations are sharper than those in the case of homogeneous equations.

  3. Probability distribution relationships

    Directory of Open Access Journals (Sweden)

    Yousry Abdelkader

    2013-05-01

    Full Text Available In this paper, we are interested in showing the most famous distributions and their relations to other distributions in collected diagrams. Four diagrams are sketched as networks. The first one concerns the continuous distributions and their relations. The second one presents the discrete distributions. The third diagram depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.
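    One of the classic limiting relationships such diagrams encode, the de Moivre-Laplace normal approximation to the binomial, is easy to check numerically; the n, p, and k below are arbitrary illustrative values:

```python
import math

def binom_pmf(n, k, p):
    """Exact binomial probability mass."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def normal_approx(n, k, p):
    """De Moivre-Laplace: N(np, np(1-p)) density as a pmf approximation."""
    mu, var = n * p, n * p * (1 - p)
    return math.exp(-((k - mu) ** 2) / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

n, p, k = 1000, 0.3, 310
exact = binom_pmf(n, k, p)
approx = normal_approx(n, k, p)
rel_err = abs(approx - exact) / exact   # shrinks as n grows
```

    For values of k near the mean the relative error decays roughly like 1/sqrt(n), which is the sense in which the normal is the limiting node of the binomial in these diagrams.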

  4. Capture of impurity atoms by defects and the distribution of the complexes under ion bombardment of growing films

    International Nuclear Information System (INIS)

    Radzhabov, T.D.; Iskanderova, Z.A.; Arutyunova, E.O.; Samigulin, K.R.

    1982-01-01

    A theoretical study of the capture of impurity gas atoms by defects during ion introduction of the impurity in the process of film growth with simultaneous diffusion has been carried out. Concentration profiles of the formed impurity-defect complexes have been calculated analytically and numerically by computer, over the film depth and in the substrate; the basic features of the impurity component captured by defects have been revealed over a wide range of the basic experimental parameters. The effect of impurity capture by defects on the amount and distribution of the total concentration of impurity atoms, and on the intensity of complete absorption of bombarding ions in films, has been analyzed. A possibility is shown for producing films with a high concentration level and almost uniform distribution of the impurity-defect complexes for realistic, experimentally achievable values of the process parameters, as well as for increasing the complete absorption of the gaseous impurity with a growing concentration of capturing defect traps

  5. Normal accidents

    International Nuclear Information System (INIS)

    Perrow, C.

    1989-01-01

    The author has chosen numerous concrete examples to illustrate the hazardousness inherent in high-risk technologies. Starting with the TMI reactor accident in 1979, he shows that it is not only the nuclear energy sector that bears the risk of 'normal accidents', but also quite a number of other technologies and industrial sectors, or research fields. The author refers to the petrochemical industry, shipping, air traffic, large dams, mining activities, and genetic engineering, showing that due to the complexity of the systems and their manifold, rapidly interacting processes, accidents happen that cannot be thoroughly calculated, and hence are unavoidable. (orig./HP) [de

  6. The rank of a normally distributed matrix and positive definiteness of a noncentral Wishart distributed matrix

    NARCIS (Netherlands)

    Steerneman, A. G. M.; van Perlo-ten Kleij, Frederieke

    2008-01-01

    If X ~ N_{n×k}(M, I_n ⊗ Σ), then S = X'X has the noncentral Wishart distribution W'_k(n, Σ; Λ), where Λ = M'M. Here Σ is allowed to be singular. It is well known that if Λ = 0, then S has a (central) Wishart distribution and S is positive definite with
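    The well-known fact alluded to at the end, that S = X'X is positive definite (with probability one) when n ≥ k and the covariance is nonsingular, can be checked numerically with a Cholesky factorization; the dimensions below are arbitrary and the covariance is taken to be the identity for simplicity:

```python
import math
import random

def cholesky(S):
    """Cholesky factorization; raises ValueError if S is not positive definite."""
    k = len(S)
    L = [[0.0] * k for _ in range(k)]
    for i in range(k):
        for j in range(i + 1):
            s = sum(L[i][m] * L[j][m] for m in range(j))
            if i == j:
                d = S[i][i] - s
                if d <= 0.0:
                    raise ValueError("not positive definite")
                L[i][i] = math.sqrt(d)
            else:
                L[i][j] = (S[i][j] - s) / L[j][j]
    return L

random.seed(0)
n, k = 20, 4                       # n >= k rows of a k-variate normal
X = [[random.gauss(0.0, 1.0) for _ in range(k)] for _ in range(n)]
# S = X'X is Wishart distributed; with n >= k and a nonsingular covariance
# it is positive definite with probability one, so Cholesky succeeds.
S = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
     for i in range(k)]
L = cholesky(S)
```

    With n < k the factorization would fail, since S then has rank at most n and is only positive semidefinite.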

  7. Probing the structure of complex solids using a distributed computing approach-Applications in zeolite science

    International Nuclear Information System (INIS)

    French, Samuel A.; Coates, Rosie; Lewis, Dewi W.; Catlow, C. Richard A.

    2011-01-01

    We demonstrate the viability of distributed computing techniques employing idle desktop computers in investigating complex structural problems in solids. Through the use of a combined Monte Carlo and energy minimisation method, we show how a large parameter space can be effectively scanned. By controlling the generation and running of different configurations through a database engine, we are able to not only analyse the data 'on the fly' but also direct the running of jobs and the algorithms for generating further structures. As an exemplar case, we probe the distribution of Al and extra-framework cations in the structure of the zeolite Mordenite. We compare our computed unit cells with experiment and find that whilst there is excellent correlation between computed and experimentally derived unit cell volumes, cation positioning and short-range Al ordering (i.e. near neighbour environment), there remains some discrepancy in the distribution of Al throughout the framework. We also show that stability-structure correlations only become apparent once a sufficiently large sample is used. - Graphical Abstract: Aluminium distributions in zeolites are determined using e-science methods. Highlights: → Use of e-science methods to search configurationally space. → Automated control of space searching. → Identify key structural features conveying stability. → Improved correlation of computed structures with experimental data.

  8. Distributions of extreme bursts above thresholds in a fractional Lévy toy model of natural complexity.

    Science.gov (United States)

    Watkins, Nicholas; Chapman, Sandra; Rosenberg, Sam; Credgington, Dan; Sanchez, Raul

    2010-05-01

    In two far-sighted contributions in the 1960s, Mandelbrot showed the ubiquity of both non-Gaussian fluctuations and long-ranged temporal memory (the "Noah" and "Joseph" effects, respectively) in the natural and man-made worlds. Much subsequent work in complexity science has contributed to the physical underpinning of these effects, particularly in cases where complex interactions in a system cause a driven or random perturbation to be nonlinearly amplified in amplitude and/or spread out over a wide range of frequencies. In addition, the modelling of catastrophes has begun to incorporate the insights which these approaches have offered into the likelihood of extreme and long-lived fluctuations. I will briefly survey how the application of the above ideas in the earth system has been a key focus and motivation of research into natural complexity at BAS [e.g. Watkins & Freeman, Science, 2008; Edwards et al, Nature, 2007]. I will then discuss in detail a standard toy model (linear fractional stable motion, LFSM) which combines the Noah and Joseph effects in a controllable way and explain how it differs from the widely used continuous time random walk. I will describe how LFSM is being used to explore the interplay of the above two effects in the distribution of bursts above thresholds. I will describe ongoing work to improve the accuracy of maximum likelihood-based estimation of burst size and waiting time distributions for LFSM first reported in [Watkins et al, PRE, 2009]; and will also touch on similar work for multifractal models [Watkins et al, PRL comment, 2009].
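    The "bursts above thresholds" quantity itself is straightforward to extract from a series. The sketch below uses an i.i.d. Pareto signal as a stand-in for the Noah effect only (LFSM, with its long-range memory, is not simulated here), and the threshold and tail index are arbitrary:

```python
import random

def bursts(series, threshold):
    """Contiguous excursions above a threshold.

    Returns the burst sizes (integrated excess over the threshold) and
    burst durations (number of consecutive samples above it).
    """
    sizes, durations = [], []
    size, dur = 0.0, 0
    for x in series:
        if x > threshold:
            size += x - threshold
            dur += 1
        elif dur:
            sizes.append(size); durations.append(dur)
            size, dur = 0.0, 0
    if dur:
        sizes.append(size); durations.append(dur)
    return sizes, durations

random.seed(3)
# Hypothetical heavy-tailed ("Noah effect") signal: Pareto amplitudes, alpha = 1.5.
series = [random.paretovariate(1.5) for _ in range(50_000)]
sizes, durations = bursts(series, threshold=5.0)
largest = max(sizes)
```

    Adding Joseph-effect memory (as LFSM does) clusters the exceedances, lengthening the bursts and fattening the waiting-time distribution relative to this memoryless case.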

  9. Influence of Transformation Plasticity on the Distribution of Internal Stress in Three Water-Quenched Cylinders

    Science.gov (United States)

    Liu, Yu; Qin, Shengwei; Zhang, Jiazhi; Wang, Ying; Rong, Yonghua; Zuo, Xunwei; Chen, Nailu

    2017-10-01

    Based on the hardenability of three medium carbon steels, cylinders with the same 60-mm diameter and 240-mm length were designed for quenching in water to obtain microstructures, including a pearlite matrix (Chinese steel mark: 45), a bainite matrix (42CrMo), and a martensite matrix (40CrNiMo). Through the combination of normalized functions describing transformation plasticity (TP), the thermo-elasto-plastic constitutive equation was deduced. The results indicate that the finite element simulation (FES) of the internal stress distribution in the three kinds of hardenable steel cylinders based on the proposed exponent-modified (Ex-Modified) normalized function is more consistent with the X-ray diffraction (XRD) measurements than those based on the normalized functions proposed by Abrassart, Desalos, and Leblond, which is attributed to the fact that the Ex-Modified normalized function better describes the TP kinetics. In addition, there was no significant difference between the calculated and measured stress distributions, even though TP was taken into account for the 45 carbon steel; that is, TP can be ignored in FES. In contrast, in the 42CrMo and 40CrNiMo alloyed steels, the significant effect of TP on the residual stress distributions was demonstrated, meaning that TP must be included in the FES. The rationality of the preceding conclusions was analyzed. The complex quenching stress is a consequence of interactions between the thermal and phase transformation stresses. The separated calculations indicate that the three steels exhibit similar thermal stress distributions for the same water-quenching condition, but different phase transformation stresses between 45 carbon steel and alloyed steels, leading to different distributions of their axial and tangential stresses.

  10. Penalized Maximum Likelihood Estimation for univariate normal mixture distributions

    International Nuclear Information System (INIS)

    Ridolfi, A.; Idier, J.

    2001-01-01

    Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill posed optimization problem. Ill posedness is solved by penalizing the likelihood function. In the Bayesian framework, it amounts to incorporating an inverted gamma prior in the likelihood function. A penalized version of the EM algorithm is derived, which is still explicit and which intrinsically assures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test
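    The singularity-avoidance mechanism described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: an inverted-gamma(alpha, beta) prior on each component variance (the values of alpha and beta here are illustrative) adds a penalty term to the M-step so that no variance can shrink to zero.

```python
import math
import random

def penalized_em(xs, k=2, alpha=2.0, beta=0.5, iters=100, seed=0):
    """Penalized EM for a univariate k-component normal mixture.

    An inverted-gamma(alpha, beta) prior on each variance changes the
    M-step variance update to
        var_j = (2*beta + sum_i r_ij (x_i - mu_j)^2) / (2*(alpha + 1) + sum_i r_ij),
    which is bounded away from zero, so no component can collapse onto a
    single data point (the likelihood singularity).
    """
    rng = random.Random(seed)
    mus = rng.sample(xs, k)          # initialize means at random data points
    sigmas = [1.0] * k
    pis = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibilities r_ij (the 1/sqrt(2*pi) factor cancels)
        resp = []
        for x in xs:
            ws = [pis[j] / sigmas[j] *
                  math.exp(-0.5 * ((x - mus[j]) / sigmas[j]) ** 2)
                  for j in range(k)]
            s = sum(ws)
            resp.append([w / s for w in ws])
        # M-step: closed-form updates, variance penalized by the prior
        for j in range(k):
            nj = sum(r[j] for r in resp)
            mus[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            ss = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, xs))
            sigmas[j] = math.sqrt((2 * beta + ss) / (2 * (alpha + 1) + nj))
            pis[j] = nj / len(xs)
    return mus, sigmas, pis
```

    Because the penalized variance update has the positive constant 2*beta in its numerator, every sigma stays above sqrt(2*beta / (2*(alpha + 1) + N)), which is the "intrinsically non-singular" property the abstract refers to.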

  11. X-ray emission from normal galaxies

    International Nuclear Information System (INIS)

    Speybroeck, L. van; Bechtold, J.

    1981-01-01

    A summary of results obtained with the Einstein Observatory is presented. There are two general categories of normal galaxy investigation being pursued - detailed studies of nearby galaxies where individual sources can be detected and possibly correlated with galactic morphology, and shorter observations of many more distant objects to determine the total luminosity distribution of normal galaxies. The principal examples of the first type are the CFA study of M31 and the Columbia study of the Large Magellanic Cloud. The Columbia normal galaxy survey is the principal example of the second type, although there also are smaller CFA programs concentrating on early galaxies and peculiar galaxies, and MIT has observed some members of the local group. (Auth.)

  12. Deformation around basin scale normal faults

    International Nuclear Information System (INIS)

    Spahic, D.

    2010-01-01

    Faults in the earth crust occur over a large range of scales, from the microscale through the mesoscale to large basin-scale faults. Frequently, deformation associated with faulting is not limited to the fault plane alone, but rather combines with continuous near-field deformation in the wall rock, a phenomenon generally called fault drag. The correct interpretation and recognition of fault drag is fundamental for the reconstruction of the fault history and the determination of fault kinematics, as well as for prediction in areas of limited exposure or beyond comprehensive seismic resolution. Based on fault analyses derived from 3D visualization of natural examples of fault drag, the importance of fault geometry for the deformation of marker horizons around faults is investigated. The complex 3D structural models presented here are based on a combination of geophysical datasets and geological fieldwork. On an outcrop-scale example of fault drag in the hanging wall of a normal fault, located at St. Margarethen, Burgenland, Austria, data from Ground Penetrating Radar (GPR) measurements, detailed mapping and terrestrial laser scanning were used to construct a high-resolution structural model of the fault plane, the deformed marker horizons and associated secondary faults. In order to obtain geometrical information about the largely unexposed master fault surface, a standard listric balancing dip domain technique was employed. The results indicate that for this normal fault a listric shape can be excluded, as the constructed fault has a geologically meaningless shape cutting upsection into the sedimentary strata. This kinematic modeling result is additionally supported by the observation of deformed horizons in the footwall of the structure. Alternatively, a planar fault model with reverse drag of markers in the hanging wall and footwall is proposed.
A second part of this thesis investigates a large scale normal fault

  13. Mathematical analysis of the normal anatomy of the aging fovea.

    Science.gov (United States)

    Nesmith, Brooke; Gupta, Akash; Strange, Taylor; Schaal, Yuval; Schaal, Shlomit

    2014-08-28

    To mathematically analyze anatomical changes that occur in the normal fovea during aging. A total of 2912 spectral-domain optical coherence tomography (SD-OCT) normal foveal scans were analyzed. Subjects were healthy individuals, aged 13 to 97 years, with visual acuity ≥20/40 and without evidence of foveal pathology. Using the automated symbolic regression software Eureqa (version 0.98), foveal thickness maps of 390 eyes were analyzed using several measurements: parafoveal retinal thickness at 50 μm consecutive intervals, parafoveal maximum retinal thickness at two points lateral to the central foveal depression, distance between the two points of maximum retinal thickness, maximal foveal slope at two intervals lateral to the central foveal depression, and central length of the foveal depression. A unique mathematical equation representing the mathematical analog of foveal anatomy was derived for every decade between 10 and 100 years. The mathematical regression function for the normal fovea followed a first-order sine curve of level-10 complexity for the second decade of life. The mathematical regression function became more complex with normal aging, up to level-43 complexity (0.085 fit; P < 0.05). Young foveas had higher symmetry (0.92 ± 0.10) along the midline, whereas aged foveas had significantly less symmetry (0.76 ± 0.27, P < 0.01) along the midline and steeper maximal slopes (29 ± 32°, P < 0.01). Normal foveal anatomical configuration changes with age. Normal aged foveas are less symmetric along the midline, with steeper slopes. Differentiating between normal aging and pathologic changes using SD-OCT scans may allow early diagnosis, follow-up, and better management of the aging population. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.

  14. The clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast

    International Nuclear Information System (INIS)

    Lee, Jin Hwa; Yoon, Seong Kuk; Choi, Sun Seob; Nam, Kyung Jin; Cho, Se Heon; Kim, Dae Cheol; Kim, Jung Il; Kim, Eun Kyung

    2006-01-01

    We wanted to evaluate the clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast. From April 2003 to February 2005, 107 patients with 113 palpable abnormalities who had combined normal sonographic and normal mammographic findings were retrospectively studied. The evaluated parameters included the age of the patients, the clinical referrals, the distribution of the locations of the palpable abnormalities, whether there was a past surgical history, the mammographic densities and the sonographic echo patterns (purely hyperechoic fibrous tissue, mixed fibroglandular breast tissue, predominantly isoechoic glandular tissue and isoechoic subcutaneous fat tissue) at the sites of clinical concern, whether there was a change in imaging and/or the physical examination results at follow-up, and whether there were biopsy results. This study period was chosen to allow a follow-up period of at least 12 months. The patients' ages ranged from 22 to 66 years (mean age: 48.8 years), and 62 (58%) of the 107 patients were between 41 and 50 years old. The most common location of the palpable abnormalities was the upper outer portion of the breast (45%), and most of the mammographic densities were dense patterns (BI-RADS Type 3 or 4: 91%). Our cases showed a similar distribution across all types of sonographic echo patterns. Twenty-three patients underwent biopsy; all the biopsy specimens were benign. For the 84 patients with 90 palpable abnormalities who were followed, there was no interval development of breast cancer in the areas of clinical concern. Our results suggest that unnecessary biopsies can be avoided, in favour of follow-up, in women with palpable abnormalities when both mammography and ultrasonography show normal tissue; however, this study was limited by its small sample size, and a larger study will be needed to better define the negative predictive value of combined normal sonographic and mammographic findings

  15. Data Normalization to Accelerate Training for Linear Neural Net to Predict Tropical Cyclone Tracks

    Directory of Open Access Journals (Sweden)

    Jian Jin

    2015-01-01

    When a pure linear neural network (PLNN) is used to predict tropical cyclone tracks (TCTs) in the South China Sea, whether the data are normalized greatly affects the training process. In this paper, the min-max method and the normal distribution method, instead of the standard normal distribution, are applied to TCT data before modeling. We propose experimental schemes in which, with the min-max method, the min-max value pair of each variable is mapped to (−1, 1) and (0, 1); with the normal distribution method, each variable's mean and standard deviation pair is set to (0, 1) and (100, 1). We present the following results: (1) data scaled to similar intervals have similar effects, whether the min-max or the normal distribution method is used; (2) mapping data to around 0 gains much faster training speed than mapping them to intervals far away from 0 or using unnormalized raw data, although all of them can approach the same lower error level after a certain number of steps, as seen from their training error curves. This could be useful for deciding on a data normalization method when PLNN is used individually.
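    The two normalization schemes compared in the paper can be sketched directly. This is a generic sketch (the function names are ours, not the paper's): the first maps each variable's observed range onto a target interval, the second standardizes and then shifts to a target mean and standard deviation.

```python
def min_max_scale(xs, lo=-1.0, hi=1.0):
    """Map the sample's [min, max] linearly onto [lo, hi]."""
    mn, mx = min(xs), max(xs)
    return [lo + (hi - lo) * (x - mn) / (mx - mn) for x in xs]

def rescale_to(xs, target_mean=0.0, target_std=1.0):
    """Standardize the sample, then shift/scale it to the requested
    mean and standard deviation (e.g. (0, 1) or (100, 1))."""
    n = len(xs)
    mean = sum(xs) / n
    std = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return [target_mean + target_std * (x - mean) / std for x in xs]
```

    With these helpers, the paper's four schemes correspond to `min_max_scale(xs, -1, 1)`, `min_max_scale(xs, 0, 1)`, `rescale_to(xs, 0, 1)` and `rescale_to(xs, 100, 1)`.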

  16. Unusual distribution of Burkholderia cepacia complex species in Danish cystic fibrosis clinics may stem from restricted transmission between patients

    DEFF Research Database (Denmark)

    Nørskov-Lauritsen, Niels; Johansen, Helle Krogh; Fenger, Mette G

    2010-01-01

    Forty-four of 48 Burkholderia cepacia complex strains cultured from Danish cystic fibrosis patients were Burkholderia multivorans, a distribution of species that has not been reported before. Although cases of cross infections were demonstrated, no major epidemic clone was found. The species...

  17. S-curve networks and an approximate method for estimating degree distributions of complex networks

    Science.gov (United States)

    Guo, Jin-Li

    2010-12-01

    In the study of complex networks almost all theoretical models have the property of infinite growth, but the size of actual networks is finite. Using statistics on China Internet IPv4 (Internet Protocol version 4) addresses, this paper proposes a forecasting model based on the S-curve (logistic curve) and forecasts the growing trend of IPv4 addresses in China, providing reference values for optimizing the allocation of IPv4 address resources and for the development of IPv6. Based on the laws of IPv4 growth, namely bulk growth and a finite growth limit, it proposes a finite network model with bulk growth, called an S-curve network. Analysis demonstrates that the analytic method based on uniform distributions (i.e., the Barabási-Albert method) is not suitable for this network. The paper develops an approximate method to predict the growth dynamics of individual nodes, and uses it to calculate analytically the degree distribution and the scaling exponents. The analytical result agrees well with simulations, obeying an approximately power-law form. This method overcomes a shortcoming of the Barabási-Albert method commonly used in current network research.
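    As a simple illustration of the S-curve idea (a generic sketch, not the paper's actual model or data), a logistic curve whose saturation level K is assumed known can be fitted by linearization: ln(K/y − 1) is linear in t, so ordinary least squares recovers the growth rate r and the midpoint t0.

```python
import math

def logistic(t, K, r, t0):
    """S-curve: grows from ~0 toward the saturation level K."""
    return K / (1.0 + math.exp(-r * (t - t0)))

def fit_logistic_known_K(ts, ys, K):
    """Least-squares fit of r and t0, assuming the limit K is known.

    Linearization: ln(K/y - 1) = -r*t + r*t0 is linear in t,
    so a one-variable regression of z = ln(K/y - 1) on t suffices.
    """
    zs = [math.log(K / y - 1.0) for y in ys]
    n = len(ts)
    tbar = sum(ts) / n
    zbar = sum(zs) / n
    slope = (sum((t - tbar) * (z - zbar) for t, z in zip(ts, zs))
             / sum((t - tbar) ** 2 for t in ts))
    r = -slope
    t0 = (zbar + r * tbar) / r   # intercept r*t0, divided by r
    return r, t0
```

    In practice K itself would also have to be estimated (e.g. by a grid search over candidate limits), which is the step that distinguishes a finite S-curve model from the unbounded growth of standard network models.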

  18. S-curve networks and an approximate method for estimating degree distributions of complex networks

    International Nuclear Information System (INIS)

    Guo Jin-Li

    2010-01-01

    In the study of complex networks almost all theoretical models have the property of infinite growth, but the size of actual networks is finite. Using statistics on China Internet IPv4 (Internet Protocol version 4) addresses, this paper proposes a forecasting model based on the S-curve (logistic curve) and forecasts the growing trend of IPv4 addresses in China, providing reference values for optimizing the allocation of IPv4 address resources and for the development of IPv6. Based on the laws of IPv4 growth, namely bulk growth and a finite growth limit, it proposes a finite network model with bulk growth, called an S-curve network. Analysis demonstrates that the analytic method based on uniform distributions (i.e., the Barabási-Albert method) is not suitable for this network. The paper develops an approximate method to predict the growth dynamics of individual nodes, and uses it to calculate analytically the degree distribution and the scaling exponents. The analytical result agrees well with simulations, obeying an approximately power-law form. This method overcomes a shortcoming of the Barabási-Albert method commonly used in current network research. (general)

  19. Distribution of age at menopause in two Danish samples

    DEFF Research Database (Denmark)

    Boldsen, J L; Jeune, B

    1990-01-01

    We analyzed the distribution of reported age at natural menopause in two random samples of Danish women (n = 176 and n = 150) to determine the shape of the distribution and to disclose any possible trends in the distribution parameters. It was necessary to correct the frequencies of the reported ages for the effect of differing ages at reporting. The corrected distribution of age at menopause differs from the normal distribution in the same way in both samples. Both distributions could be described by a mixture of two normal distributions. It appears that most of the parameters of the normal distribution mixtures remain unchanged over a 50-year time lag. The position of the distribution, that is, the mean age at menopause, however, increases slightly but significantly.
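    A two-component normal mixture density of the kind used to describe the corrected distribution can be evaluated as follows. The weights, means, and standard deviations in the usage example are illustrative only, not the paper's fitted values.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return (math.exp(-0.5 * ((x - mu) / sigma) ** 2)
            / (sigma * math.sqrt(2.0 * math.pi)))

def mixture_pdf(x, components):
    """components: list of (weight, mu, sigma) tuples; weights sum to 1."""
    return sum(w * normal_pdf(x, mu, s) for w, mu, s in components)
```

    For example, with illustrative components [(0.3, 45.0, 2.0), (0.7, 51.0, 3.0)] the mixture integrates to 1 and has mean 0.3·45 + 0.7·51 = 49.2, while its shape deviates from any single normal curve, which is the kind of departure the abstract describes.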

  20. Subchondral bone density distribution of the talus in clinically normal Labrador Retrievers.

    Science.gov (United States)

    Dingemanse, W; Müller-Gerbl, M; Jonkers, I; Vander Sloten, J; van Bree, H; Gielen, I

    2016-03-15

    Bones continually adapt their morphology to their load-bearing function. At the level of the subchondral bone, the density distribution is highly correlated with the loading distribution of the joint. Therefore, subchondral bone density distribution can be used to study joint biomechanics non-invasively. In addition, physiological and pathological joint loading is an important aspect of orthopaedic disease, and research focusing on joint biomechanics will benefit veterinary orthopaedics. This study was conducted to evaluate the density distribution in the subchondral bone of the canine talus, as a parameter reflecting long-term joint loading in the tarsocrural joint. Two main density maxima were found, one proximally on the medial trochlear ridge and one distally on the lateral trochlear ridge. All joints showed very similar density distribution patterns, and no significant differences were found in the localisation of the density maxima between left and right limbs or between dogs. Based on the density distribution, the lateral trochlear ridge is most likely subjected to the highest loads within the tarsocrural joint. The joint loading distribution is very similar between dogs of the same breed. In addition, the joint loading distribution supports previous suggestions of the important role of biomechanics in the development of OC lesions in the tarsus. Important benefits of computed tomographic osteoabsorptiometry (CTOAM), i.e., the possibility of in vivo imaging and temporal evaluation, make this technique a valuable addition to the field of veterinary orthopaedic research.

  1. Vibrational Spectra And Potential Energy Distributions of Normal Modes of N,N'-Etilenbis(P-Toluen sulfonamide)

    International Nuclear Information System (INIS)

    Alyar, S.

    2008-01-01

    N-substituted sulfonamides are well known for their diuretic, antidiabetic, antibacterial, antifungal and anticancer activities, and are widely used in the therapy of patients. These important bioactive properties are strongly affected by the special features of the -CH2-SO2-NR- linker and by intramolecular motion. Thus, studies of the energetic and spatial properties of N-substituted sulfonamides are of great importance for improving our understanding of their biological activities and enhancing our ability to predict new drugs. Density functional theory at the B3LYP/6-31G(d,p) level has been applied to obtain the vibrational force field for the most stable conformation of N,N'-ethylenebis(p-toluenesulfonamide) (ptsen), which contains the sulfonamide moiety. The results of these calculations have been compared with spectroscopic data to verify the accuracy of the calculation and the applicability of the DFT approach to ptsen. Additionally, complete normal coordinate analyses with scaled quantum mechanical (SQM) force fields were performed to derive the potential energy distributions (PED)

  2. Measurement of stress distributions in truck tyre contact patch in real rolling conditions

    Science.gov (United States)

    Anghelache, Gabriel; Moisescu, Raluca

    2012-12-01

    Stress distributions on three orthogonal directions have been measured across the contact patch of truck tyres using the complex measuring system that contains a transducer assembly with 30 sensing elements placed in the road surface. The measurements have been performed in straight line, in real rolling conditions. Software applications for calibration, data acquisition, and data processing were developed. The influence of changes in inflation pressure and rolling speed on the shapes and sizes of truck tyre contact patch has been shown. The shapes and magnitudes of normal, longitudinal, and lateral stress distributions, measured at low speed, have been presented and commented. The effect of wheel toe-in and camber on the stress distribution results was observed. The paper highlights the impact of the longitudinal tread ribs on the shear stress distributions. The ratios of stress distributions in the truck tyre contact patch have been computed and discussed.

  3. Multivariate phase type distributions - Applications and parameter estimation

    DEFF Research Database (Denmark)

    Meisch, David

    The best known univariate probability distribution is the normal distribution. It is used throughout the literature in a broad field of applications. In cases where it is not sensible to use the normal distribution, alternative distributions are at hand and well understood, many of these belonging ... and statistical inference, is the multivariate normal distribution. Unfortunately, only little is known about the general class of multivariate phase type distributions. Considering the results concerning parameter estimation and inference theory of univariate phase type distributions, the class of multivariate ... projects and depend on reliable cost estimates. The Successive Principle is a group analysis method primarily used for analyzing medium to large projects in relation to cost or duration. We believe that the mathematical modeling used in the Successive Principle can be improved. We suggested a novel...

  4. Normal Anti-Invariant Submanifolds of Paraquaternionic Kähler Manifolds

    Directory of Open Access Journals (Sweden)

    Novac-Claudiu Chiriac

    2006-12-01

    We introduce normal anti-invariant submanifolds of paraquaternionic Kähler manifolds and study the geometric structures induced on them. We obtain necessary and sufficient conditions for the integrability of the distributions defined on a normal anti-invariant submanifold. Also, we present characterizations of local (global) anti-invariant products.

  5. Ultrasound-mediated delivery and distribution of polymeric nanoparticles in the normal brain parenchyma of a metastatic brain tumour model.

    Directory of Open Access Journals (Sweden)

    Habib Baghirov

    The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma.

  6. Ultrasound-mediated delivery and distribution of polymeric nanoparticles in the normal brain parenchyma of a metastatic brain tumour model

    Science.gov (United States)

    Baghirov, Habib; Snipstad, Sofie; Sulheim, Einar; Berg, Sigrid; Hansen, Rune; Thorsen, Frits; Mørch, Yrr; Åslund, Andreas K. O.

    2018-01-01

    The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma. PMID:29338016

  7. The R-Shell approach - Using scheduling agents in complex distributed real-time systems

    Science.gov (United States)

    Natarajan, Swaminathan; Zhao, Wei; Goforth, Andre

    1993-01-01

    Large, complex real-time systems such as space and avionics systems are extremely demanding in their scheduling requirements. Current OS design approaches are quite limited in the capabilities they provide for task scheduling. Typically, they simply implement a particular uniprocessor scheduling strategy and do not provide any special support for network scheduling, overload handling, fault tolerance, distributed processing, etc. Our design of the R-Shell real-time environment facilitates the implementation of a variety of sophisticated but efficient scheduling strategies, including incorporation of all these capabilities. This is accomplished by the use of scheduling agents which reside in the application run-time environment and are responsible for coordinating the scheduling of the application.

  8. Comparison of the procedures of Fleishman and Ramberg et al. for generating non-normal data in simulation studies

    Directory of Open Access Journals (Sweden)

    Rebecca Bendayan

    2014-01-01

    Simulation techniques must be able to generate the types of distributions most commonly encountered in real data, for example, non-normal distributions. Two recognized procedures for generating non-normal data are Fleishman's linear transformation method and the method proposed by Ramberg et al., which is based on a generalization of the Tukey lambda distribution. This study compares these procedures in terms of the extent to which the distributions they generate fit their respective theoretical models, and it also examines the number of simulations needed to achieve this fit. To this end, the paper considers, in addition to the normal distribution, a series of non-normal distributions that are commonly found in real data, and then analyses fit according to the extent to which normality is violated and the number of simulations performed. The results show that the two data generation procedures behave similarly. As the degree of contamination of the theoretical distribution increases, so does the number of simulations required to ensure a good fit to the generated data. The two procedures generate more accurate normal and non-normal distributions when at least 7000 simulations are performed, although when the degree of contamination is severe (with values of skewness and kurtosis of 2 and 6, respectively) it is advisable to perform 15000 simulations.
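    Fleishman's transformation draws Y = a + bZ + cZ² + dZ³ with Z standard normal and a = −c, so that E[Y] = 0. The sketch below illustrates only the transformation step; the coefficients b, c, d are illustrative placeholders, not values solved for a specific target skewness and kurtosis (that root-finding step is the substance of the method and is omitted here).

```python
import math
import random

def fleishman_sample(n, b=0.9, c=0.4, d=0.03, seed=7):
    """Draw n values Y = a + b*Z + c*Z^2 + d*Z^3 with a = -c, Z ~ N(0, 1).

    Setting a = -c makes E[Y] = 0 regardless of b and d; with c > 0 the
    quadratic term skews the output to the right. The coefficients here
    are illustrative, not solved for particular target moments.
    """
    rng = random.Random(seed)
    a = -c
    return [a + b * z + c * z * z + d * z ** 3
            for z in (rng.gauss(0.0, 1.0) for _ in range(n))]

def sample_skewness(ys):
    """Moment-based sample skewness m3 / m2^(3/2)."""
    n = len(ys)
    m = sum(ys) / n
    m2 = sum((y - m) ** 2 for y in ys) / n
    m3 = sum((y - m) ** 3 for y in ys) / n
    return m3 / m2 ** 1.5
```

    A full implementation would first solve Fleishman's moment equations for (b, c, d) given the target skewness and kurtosis, then apply the polynomial exactly as above.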

  9. Three-Dimensional Numerical Analysis of Compound Lining in Complex Underground Surge-Shaft Structure

    Directory of Open Access Journals (Sweden)

    Juntao Chen

    2015-01-01

    The mechanical behavior of the lining structure of a deep-embedded cylindrical surge shaft with a multifork tunnel is analyzed using three-dimensional nonlinear FEM. With elastic-plastic constitutive relations of the rock mass imported, and an implicit bolt element and a distributed concrete cracking model adopted, a computing method for a complex surge shaft is presented for the simulation of underground excavations and concrete lining cracks. In order to reflect the interaction and the initial gap between the rock mass and the concrete lining, a three-dimensional nonlinear interface element is adopted, which can take into account both normal and tangential characteristics. Through the computation of an actual engineering case, the distortion characteristics and stress distribution rules of the three-dimensional multifork surge-shaft lining structure under different conditions are revealed. The results verify the rationality and feasibility of this computation model and method and provide a new idea and reference for complex surge-shaft design and construction.

  10. Fracture transmissivity as a function of normal and shear stress: first results in Opalinus Clay

    International Nuclear Information System (INIS)

    Cuss, R.J.; Milodowski, A.; Noy, D.J.; Harrington, J.F.

    2010-01-01

    Document available in extended abstract form only. Rock-mass failure around openings is usually observed in the form of a highly complex fracture network (EDZ), which is heterogeneous in distribution around a circular tunnel opening because of the heterogeneous stress distribution. The orientation of stress with respect to the fracture network is known to be important. The complex heterogeneous stress trajectory and heterogeneous fracture network results in a broad range of stresses and stress directions acting on the open fracture network. During the open stage of a repository, stress will slowly alter as shear movements occur along the fractures, as well as other time-dependent phenomena. As the repository is back filled, the stress field is further altered as the backfill settles and changes volume because of re-saturation. Therefore, a complex and wide ranging stress regime and stress history will result. In a purely mechanical sense, fracture transmissivity is a function of normal stress, shear stress, and fracture aperture. The Selfrac test from Mont Terri showed the change in transmissivity with effective normal stress. This work showed that fracture transmissivity decreased with increasing normal load and that an effective normal stress of 2.5 MPa is sufficient to yield a transmissivity similar to that seen in intact Opalinus clay (OPA). Therefore fracture closure because of normal stresses has been proven to be a quite efficient mechanism in OPA. A new shear rig was designed to investigate the detail of fracture transmissivity in OPA. The experimental configuration uses two prepared blocks that are 60 x 60 mm in size and approximately 20 mm thick. The first test sample had machine ground surfaces in contact with each other, with pore fluid being delivered through the centre of the top block directly to the fracture surface. The experimental programme included two distinct stages. 
In the first, the normal load was altered to investigate fracture transmissivity

  11. Technical Note: Modeling a complex micro-multileaf collimator using the standard BEAMnrc distribution

    International Nuclear Information System (INIS)

    Kairn, T.; Kenny, J.; Crowe, S. B.; Fielding, A. L.; Franich, R. D.; Johnston, P. N.; Knight, R. T.; Langton, C. M.; Schlect, D.; Trapp, J. V.

    2010-01-01

    Purpose: The component modules in the standard BEAMnrc distribution may appear to be insufficient to model micro-multileaf collimators that have trifaceted leaf ends and complex leaf profiles. This note indicates, however, that accurate Monte Carlo simulations of radiotherapy beams defined by a complex collimation device can be completed using BEAMnrc's standard VARMLC component module. Methods: That this simple collimator model can produce spatially and dosimetrically accurate microcollimated fields is illustrated using comparisons with ion chamber and film measurements of the dose deposited by square and irregular fields incident on planar, homogeneous water phantoms. Results: Monte Carlo dose calculations for on-axis and off-axis fields are shown to produce good agreement with experimental values, even on close examination of the penumbrae. Conclusions: The use of a VARMLC model of the micro-multileaf collimator, along with a commissioned model of the associated linear accelerator, is therefore recommended as an alternative to the development or use of in-house or third-party component modules for simulating stereotactic radiotherapy and radiosurgery treatments. Simulation parameters for the VARMLC model are provided which should allow other researchers to adapt and use this model to study clinical stereotactic radiotherapy treatments.

  12. On the generation of log-Levy distributions and extreme randomness

    International Nuclear Information System (INIS)

    Eliazar, Iddo; Klafter, Joseph

    2011-01-01

    The log-normal distribution is prevalent across the sciences, as it emerges from the combination of multiplicative processes and the central limit theorem (CLT). The CLT, beyond yielding the normal distribution, also yields the class of Levy distributions. The log-Levy distributions are the Levy counterparts of the log-normal distribution; they appear in the context of ultraslow diffusion processes, and they are categorized by Mandelbrot as belonging to the class of extreme randomness. In this paper, we present a natural stochastic growth model from which both the log-normal distribution and the log-Levy distributions emerge universally: the former in the case of a deterministic underlying setting, and the latter in the case of a stochastic underlying setting. In particular, we establish a stochastic growth model which universally generates Mandelbrot's extreme randomness. (paper)
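    The multiplicative-process route to the log-normal mentioned above can be sketched with a generic simulation (not the paper's growth model): multiplying many i.i.d. positive factors and taking logs turns the product into a sum, which the CLT drives toward a normal distribution, making the product itself approximately log-normal.

```python
import math
import random

def multiplicative_growth(n_steps=200, n_paths=5000, seed=3):
    """Return log(X) for n_paths products X of n_steps i.i.d. factors.

    Each factor is Uniform(0.5, 1.5); log X is then a sum of i.i.d.
    terms, so by the CLT it is approximately normal, i.e. X is
    approximately log-normally distributed.
    """
    rng = random.Random(seed)
    logs = []
    for _ in range(n_paths):
        s = 0.0
        for _ in range(n_steps):
            s += math.log(rng.uniform(0.5, 1.5))
        logs.append(s)  # log of the product
    return logs
```

    Replacing the i.i.d. factors with a heavy-tailed driving noise is, loosely, the "stochastic underlying setting" in which Levy (rather than normal) limits, and hence log-Levy products, appear.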

  13. Log-normality of indoor radon data in the Walloon region of Belgium

    International Nuclear Information System (INIS)

    Cinelli, Giorgia; Tondeur, François

    2015-01-01

The deviations of the distribution of Belgian indoor radon data from the log-normal trend are examined. Simulated data are generated to provide a theoretical frame for understanding these deviations. It is shown that the 3-component structure of indoor radon (radon from subsoil, outdoor air and building materials) generates deviations in the low- and high-concentration tails, but this low-C trend can be almost completely compensated by the effect of measurement uncertainties and by possible small errors in background subtraction. The predicted low-C and high-C deviations are well observed in the Belgian data, when considering the global distribution of all data. The agreement with the log-normal model is improved when considering data organised in homogeneous geological groups. As the deviation from log-normality is often due to the low-C tail, which is of no practical interest, it is proposed to use the log-normal fit limited to the high-C half of the distribution. With this prescription, the vast majority of the geological groups of data are compatible with the log-normal model, the remaining deviations being mostly due to a few outliers, and rarely to a “fat tail”. With very few exceptions, the log-normal modelling of the high-concentration part of indoor radon data is expected to give reasonable results, provided that the data are organised in homogeneous geological groups. - Highlights: • Deviations of the distribution of Belgian indoor Rn data from the log-normal trend. • 3-component structure of indoor Rn: subsoil, outdoor air and building materials. • Simulated data generated to provide a theoretical frame for understanding deviations. • Data organised in homogeneous geological groups; better agreement with the log-normal model.
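The high-C-half prescription can be sketched numerically (the sample below is a hypothetical stand-in, not the Belgian data): for a log-normal, mu is the log of the median and sigma follows from the spacing of the 75th percentile, so both parameters can be estimated from the upper half of the sample alone, ignoring a distorted low-concentration tail.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic radon-like sample: a log-normal "subsoil" component plus a
# low-concentration contamination standing in for the outdoor-air and
# building-material components. All parameters are assumed.
radon = np.concatenate([
    rng.lognormal(mean=4.0, sigma=0.6, size=900),  # subsoil component
    rng.uniform(5.0, 20.0, size=100),              # low-C tail distortion
])

# Fit restricted to the high-C half: only the median and the 75th
# percentile are used, both insensitive to the low-C tail.
log_radon = np.log(radon)
mu_hat = np.median(log_radon)                      # log of the median
sigma_hat = (np.quantile(log_radon, 0.75) - mu_hat) / stats.norm.ppf(0.75)
print(f"mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")
```

The estimates land near the true subsoil parameters (4.0, 0.6) despite 10% contamination in the low tail; a full-sample moment fit would not.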

  14. Vesicular glutamate transporter-immunoreactivities in the vestibular nuclear complex of rat.

    Science.gov (United States)

    Deng, Jiao; Zhang, Fu-Xing; Pang, You-Wang; Li, Jin-Lian; Li, Yun-Qing

    2006-07-01

Objective To delineate the distribution profile of three isoforms of vesicular glutamate transporter (VGluT), viz. VGluT1-3, and their cellular localization within the vestibular nuclear complex (VNC). Methods Brain sections from normal Sprague-Dawley rats were processed immunohistochemically for VGluT detection, employing the avidin-biotinylated peroxidase complex method with 3-3'-diaminobenzidine (DAB) as chromogen. Results The whole VNC expressed all three transporters, which were observed to be localized to fiber endings. Compared with VGluT1 and VGluT3, VGluT2 demonstrated a relatively homogeneous distribution, with much higher density in the VNC. VGluT3 displayed the highest density in the lateral vestibular nucleus and group X, contrasting with the sparse immunostained puncta within the medial and inferior vestibular nuclei. Conclusion Glutamatergic pathways participate in the processing of vestibular signals within the VNC mainly through the re-uptake of glutamate into synaptic vesicles by VGluT1 and 2, whereas VGluT3 may play a similar role mainly in areas other than the medial and inferior nuclei of the VNC.

  15. Vesicular glutamate transporter-immunoreactivities in the vestibular nuclear complex of rat

    Institute of Scientific and Technical Information of China (English)

    Jiao DENG; Fu-Xing ZHANG; You-Wang PANG; Jin-Lian LI; Yun-Qing LI

    2006-01-01

Objective To delineate the distribution profile of three isoforms of vesicular glutamate transporter (VGluT), viz. VGluT1-3, and their cellular localization within the vestibular nuclear complex (VNC). Methods Brain sections from normal Sprague-Dawley rats were processed immunohistochemically for VGluT detection, employing the avidin-biotinylated peroxidase complex method with 3-3'-diaminobenzidine (DAB) as chromogen. Results The whole VNC expressed all three transporters, which were observed to be localized to fiber endings. Compared with VGluT1 and VGluT3, VGluT2 demonstrated a relatively homogeneous distribution, with much higher density in the VNC. VGluT3 displayed the highest density in the lateral vestibular nucleus and group X, contrasting with the sparse immunostained puncta within the medial and inferior vestibular nuclei. Conclusion Glutamatergic pathways participate in the processing of vestibular signals within the VNC mainly through the re-uptake of glutamate into synaptic vesicles by VGluT1 and 2, whereas VGluT3 may play a similar role mainly in areas other than the medial and inferior nuclei of the VNC.

  16. The distribution of interlaboratory comparison data

    DEFF Research Database (Denmark)

    Heydorn, Kaj

    2008-01-01

The distribution of mutually consistent results from interlaboratory comparisons is expected to be leptokurtic, and readers are warned against accepting conclusions based on simulations assuming normality.
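A leptokurtic sample of this kind is easy to reproduce: mixing results from laboratories with unit and (hypothetically) threefold standard uncertainties already gives a clearly positive excess kurtosis, even though every result is unbiased.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Mutually consistent results with mixed uncertainties: 90% of values
# drawn with unit standard deviation, 10% with a threefold one
# (the mixture weights are assumed for illustration).
n = 100_000
wide = rng.random(n) < 0.1
z = np.where(wide, rng.normal(0.0, 3.0, n), rng.normal(0.0, 1.0, n))

# Excess kurtosis is 0 for a normal law, positive for a leptokurtic one.
excess = stats.kurtosis(z)
print(f"excess kurtosis = {excess:.2f}")
```
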

  17. A Complex Network Approach to Distributional Semantic Models.

    Directory of Open Access Journals (Sweden)

    Akira Utsumi

A number of studies on network analysis have focused on language networks based on free word association, which reflects human lexical knowledge, and have demonstrated the small-world and scale-free properties in the word association network. Nevertheless, there have been very few attempts at applying network analysis to distributional semantic models, despite the fact that these models have been studied extensively as computational or cognitive models of human lexical knowledge. In this paper, we analyze three network properties, namely, small-world, scale-free, and hierarchical properties, of semantic networks created by distributional semantic models. We demonstrate that the created networks generally exhibit the same properties as word association networks. In particular, we show that the distribution of the number of connections in these networks follows the truncated power law, which is also observed in an association network. This indicates that distributional semantic models can provide a plausible model of lexical knowledge. Additionally, the observed differences in the network properties of various implementations of distributional semantic models are consistently explained or predicted by considering the intrinsic semantic features of a word-context matrix and the functions of matrix weighting and smoothing. Furthermore, to simulate a semantic network with the observed network properties, we propose a new growing network model based on the model of Steyvers and Tenenbaum. The idea underlying the proposed model is that both preferential and random attachments are required to reflect different types of semantic relations in the network growth process. We demonstrate that this model provides a better explanation of network behaviors generated by distributional semantic models.
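The "preferential plus random attachment" idea can be sketched with a toy generator written in the spirit of (but not identical to) the Steyvers-Tenenbaum-style model described above; the mixing probability, edge count, and seed graph below are all assumptions. The resulting degree sequence is heavy-tailed, with hubs far above the mean degree.

```python
import numpy as np

def grow_network(n=3000, m=3, p=0.8, seed=3):
    """Toy growth: each new node adds m edges, choosing each target
    preferentially by degree with probability p, uniformly otherwise."""
    rng = np.random.default_rng(seed)
    degree = np.zeros(n, dtype=np.int64)
    stubs = []                             # node listed once per edge end
    for i in range(m + 1):                 # small complete seed graph
        for j in range(i):
            stubs += [i, j]
            degree[i] += 1
            degree[j] += 1
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            if rng.random() < p:           # preferential attachment
                targets.add(stubs[rng.integers(len(stubs))])
            else:                          # random attachment
                targets.add(int(rng.integers(new)))
        for t in targets:
            stubs += [new, t]
            degree[new] += 1
            degree[t] += 1
    return degree

deg = grow_network()
print(f"mean degree = {deg.mean():.1f}, max degree = {deg.max()}")
```

Pure preferential attachment (p = 1) gives a clean power law; lowering p truncates the tail, which is one way to produce the truncated power law reported above.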

  18. A framework for analysis of abortive colony size distributions using a model of branching processes in irradiated normal human fibroblasts.

    Science.gov (United States)

    Sakashita, Tetsuya; Hamada, Nobuyuki; Kawaguchi, Isao; Ouchi, Noriyuki B; Hara, Takamitsu; Kobayashi, Yasuhiko; Saito, Kimiaki

    2013-01-01

Clonogenicity gives important information about the cellular reproductive potential following ionizing irradiation, but an abortive colony that fails to continue to grow remains poorly characterized. It was recently reported that the fraction of abortive colonies increases with increasing dose. Thus, we set out to investigate the production kinetics of abortive colonies using a model of branching processes. We first plotted the experimentally determined colony size distribution of abortive colonies in irradiated normal human fibroblasts, and found a linear relationship on log-linear or log-log plots. By applying the simple model of branching processes to the linear relationship, we found persistent reproductive cell death (RCD) over several generations following irradiation. To verify the estimated probability of RCD, the abortive colony size distribution (≤ 15 cells) and the surviving fraction were simulated by a Monte Carlo computational approach for colony expansion. Parameters estimated from the log-log fit performed better in both simulations than those from the log-linear fit. Radiation-induced RCD, i.e. excess probability, lasted over 16 generations and mainly consisted of two components in the early (probability over 5 generations, whereas abortive colony size distribution was robust against it. These results suggest that, whereas short-term RCD is critical to the abortive colony size distribution, long-lasting RCD is important for the dose response of the surviving fraction. Our present model provides a single framework for understanding the behavior of primary cell colonies in culture following irradiation.
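The branching-process picture can be illustrated with a minimal Galton-Watson simulation (the per-generation death probability q and the generation horizon are assumed illustrations, not the authors' fitted values): each cell either divides or undergoes reproductive cell death, and a colony counts as abortive while it never exceeds the 15-cell limit.

```python
import numpy as np

rng = np.random.default_rng(11)

def grow_colony(q=0.3, generations=16, abortive_limit=15):
    """Galton-Watson sketch: each cell divides with prob 1-q or suffers
    reproductive cell death with prob q. A colony is treated as
    clonogenic as soon as it exceeds the abortive size limit."""
    cells = 1
    for _ in range(generations):
        if cells == 0:
            break
        cells = 2 * rng.binomial(cells, 1.0 - q)   # survivors divide
        if cells > abortive_limit:
            return None                            # clonogenic colony
    return cells                                   # abortive colony size

outcomes = [grow_colony() for _ in range(20_000)]
abortive_fraction = sum(o is not None for o in outcomes) / len(outcomes)
print(f"abortive fraction = {abortive_fraction:.3f}")
```

With q = 0.3 the process is still supercritical (mean offspring 1.4), yet roughly the extinction probability's worth of colonies abort, mirroring the coexistence of surviving and abortive colonies after irradiation.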

  19. The 2016-2017 Central Italy Seismic Sequence: Source Complexity Inferred from Rupture Models.

    Science.gov (United States)

    Scognamiglio, L.; Tinti, E.; Casarotti, E.; Pucci, S.; Villani, F.; Cocco, M.; Magnoni, F.; Michelini, A.

    2017-12-01

The Apennines have been struck by several seismic sequences in recent years, showing evidence of the activation of multiple segments of normal fault systems in a variable and relatively short time span, as in the case of the 1980 Irpinia earthquake (three shocks in 40 s), the 1997 Umbria-Marche sequence (four main shocks in 18 days) and the 2009 L'Aquila earthquake, with three segments activated within a few weeks. The 2016-2017 central Apennines seismic sequence begins on August 24th with a MW 6.0 earthquake, which strikes the region between Amatrice and Accumoli, causing 299 fatalities. This earthquake ruptures a nearly 20 km long normal fault and shows a quite heterogeneous slip distribution. On October 26th, another main shock (MW 5.9) occurs near Visso, extending the activated seismogenic area toward the NW. It is a double event rupturing contiguous patches on the fault segment of the normal fault system. Four days after the second main shock, on October 30th, a third earthquake (MW 6.5) occurs near Norcia, roughly midway between Accumoli and Visso. In this work we have inverted strong motion waveforms and GPS data to retrieve the source model of the MW 6.5 event with the aim of interpreting the rupture process in the framework of this complex sequence of moderate magnitude earthquakes. We noted that some preliminary attempts to model the slip distribution of the October 30th main shock using a single fault plane oriented along the Apennines did not provide convincing fits to the observed waveforms. In addition, the deformation pattern inferred from satellite observations suggested the activation of a multi-fault structure, coherent with the complexity and extent of the geological surface deformation. We investigated the role of multi-fault ruptures and found that this event revealed an extraordinary complexity of the rupture geometry and evolution: the coseismic rupture propagated almost simultaneously on a normal fault and on a blind fault.

  20. Flexible mixture modeling via the multivariate t distribution with the Box-Cox transformation: an alternative to the skew-t distribution.

    Science.gov (United States)

    Lo, Kenneth; Gottardo, Raphael

    2012-01-01

Cluster analysis is the automated search for groups of homogeneous observations in a data set. A popular modeling approach for clustering is based on finite normal mixture models, which assume that each cluster is modeled as a multivariate normal distribution. However, the normality assumption that each component is symmetric is often unrealistic. Furthermore, normal mixture models are not robust against outliers; they often require extra components for modeling outliers and/or give a poor representation of the data. To address these issues, we propose a new class of distributions, multivariate t distributions with the Box-Cox transformation, for mixture modeling. This class of distributions generalizes the normal distribution with the more heavy-tailed t distribution, and introduces skewness via the Box-Cox transformation. As a result, it provides a unified framework to simultaneously handle outlier identification and data transformation, two interrelated issues. We describe an Expectation-Maximization algorithm for parameter estimation along with transformation selection. We demonstrate the proposed methodology with three real data sets and simulation studies. Compared with a wealth of approaches including the skew-t mixture model, the proposed t mixture model with the Box-Cox transformation performs favorably in terms of accuracy in the assignment of observations, robustness against model misspecification, and selection of the number of components.
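The skewness mechanism can be shown in isolation: scipy's Box-Cox MLE selects a transformation parameter that symmetrizes a right-skewed sample (for log-normal data the true parameter is 0, i.e. the log transform). This sketches only the transform itself, not the full mixture fit described above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Box-Cox transform y = (x**lam - 1)/lam (lam != 0; log(x) at lam = 0).
# A right-skewed log-normal sample is symmetrized by the MLE choice of
# lam; the mixture model applies the same idea per component.
x = rng.lognormal(mean=0.0, sigma=0.8, size=5000)
xt, lam_hat = stats.boxcox(x)      # MLE of the transformation parameter

print(f"lam_hat = {lam_hat:.3f}")
print(f"skew before = {stats.skew(x):.2f}, skew after = {stats.skew(xt):.3f}")
```
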

  1. Modeling error distributions of growth curve models through Bayesian methods.

    Science.gov (United States)

    Zhang, Zhiyong

    2016-06-01

Growth curve models are widely used in social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed, although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems from blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the distribution of the error is correctly specified, the loss in efficiency of standard error estimates can be avoided. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided.

  2. A distributed dynamic model of a monolith hydrogen membrane reactor

    International Nuclear Information System (INIS)

    Michelsen, Finn Are; Wilhelmsen, Øivind; Zhao, Lei; Aasen, Knut Ingvar

    2013-01-01

    Highlights: ► We model a rigorous distributed dynamic model for a HMR unit. ► The model includes enough complexity for steady-state and dynamic analysis. ► Simulations show that the model is non-linear within the normal operating range. ► The model is useful for studying and handling disturbances such as inlet changes and membrane leakage. - Abstract: This paper describes a distributed mechanistic dynamic model of a hydrogen membrane reformer unit (HMR) used for methane steam reforming. The model is based on a square channel monolith structure concept, where air flows adjacent to a mix of natural gas and water distributed in a chess pattern of channels. Combustion of hydrogen gives energy to the endothermic steam reforming reactions. The model is used for both steady state and dynamic analyses. It therefore needs to be computationally attractive, but still include enough complexity to study the important steady state and dynamic features of the process. Steady-state analysis of the model gives optimum for the steam to carbon and steam to oxygen ratios, where the conversion of methane is 92% and the hydrogen used as energy for the endothermic reactions is 28% at the nominal optimum. The dynamic analysis shows that non-linear control schemes may be necessary for satisfactory control performance

  3. Diverse Complexities, Complex Diversities: Resisting "Normal Science" in Pedagogical and Research Methodologies. A Perspective from Aotearoa (New Zealand)

    Science.gov (United States)

    Ritchie, Jenny

    2016-01-01

    This paper offers an overview of complexities of the contexts for education in Aotearoa, which include the need to recognise and include Maori (Indigenous) perspectives, but also to extend this inclusion to the context of increasing ethnic diversity. These complexities include the situation of worsening disparities between rich and poor which…

  4. Evaluation of directional normalization methods for Landsat TM/ETM+ over primary Amazonian lowland forests

    Science.gov (United States)

    Van doninck, Jasper; Tuomisto, Hanna

    2017-06-01

    Biodiversity mapping in extensive tropical forest areas poses a major challenge for the interpretation of Landsat images, because floristically clearly distinct forest types may show little difference in reflectance. In such cases, the effects of the bidirectional reflection distribution function (BRDF) can be sufficiently strong to cause erroneous image interpretation and classification. Since the opening of the Landsat archive in 2008, several BRDF normalization methods for Landsat have been developed. The simplest of these consist of an empirical view angle normalization, whereas more complex approaches apply the semi-empirical Ross-Li BRDF model and the MODIS MCD43-series of products to normalize directional Landsat reflectance to standard view and solar angles. Here we quantify the effect of surface anisotropy on Landsat TM/ETM+ images over old-growth Amazonian forests, and evaluate five angular normalization approaches. Even for the narrow swath of the Landsat sensors, we observed directional effects in all spectral bands. Those normalization methods that are based on removing the surface reflectance gradient as observed in each image were adequate to normalize TM/ETM+ imagery to nadir viewing, but were less suitable for multitemporal analysis when the solar vector varied strongly among images. Approaches based on the MODIS BRDF model parameters successfully reduced directional effects in the visible bands, but removed only half of the systematic errors in the infrared bands. The best results were obtained when the semi-empirical BRDF model was calibrated using pairs of Landsat observation. This method produces a single set of BRDF parameters, which can then be used to operationally normalize Landsat TM/ETM+ imagery over Amazonian forests to nadir viewing and a standard solar configuration.
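The simplest method evaluated above, empirical view-angle normalization, amounts to regressing reflectance on view zenith angle across the swath and removing the fitted gradient. A minimal sketch on synthetic data (the gradient, noise level, and nadir reflectance are assumed values, not Landsat statistics):

```python
import numpy as np

rng = np.random.default_rng(13)

# Synthetic per-pixel samples across a narrow Landsat-like swath:
# reflectance with an assumed linear view-angle gradient plus noise.
n = 10_000
view_angle = rng.uniform(-7.5, 7.5, n)             # view zenith, degrees
true_nadir = 0.30                                  # assumed reflectance
reflectance = true_nadir + 0.002 * view_angle + rng.normal(0.0, 0.01, n)

# Empirical normalization: fit the gradient, subtract it, i.e.
# normalize every pixel to the nadir viewing geometry.
slope, intercept = np.polyfit(view_angle, reflectance, 1)
normalized = reflectance - slope * view_angle
print(f"gradient = {slope:.4f} per degree, nadir estimate = {intercept:.3f}")
```

As the abstract notes, this per-image correction handles nadir normalization well but cannot, by itself, reconcile images acquired under different solar geometries.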

  5. Normal accidents living with high-risk technologies

    CERN Document Server

    Perrow, Charles

    1984-01-01

    Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them.

  6. A NEW STATISTICAL PERSPECTIVE TO THE COSMIC VOID DISTRIBUTION

    International Nuclear Information System (INIS)

    Pycke, J-R; Russell, E.

    2016-01-01

In this study, we obtain the size distribution of voids as a three-parameter redshift-independent log-normal void probability function (VPF) directly from the Cosmic Void Catalog (CVC). Although many statistical models of void distributions are based on the counts in randomly placed cells, the log-normal VPF that we obtain here is independent of the shape of the voids due to the parameter-free void finder of the CVC. We use three void populations drawn from the CVC generated by the Halo Occupation Distribution (HOD) Mocks, which are tuned to three mock SDSS samples to investigate the void distribution statistically and to investigate the effects of the environments on the size distribution. As a result, it is shown that void size distributions obtained from the HOD Mock samples are well described by the three-parameter log-normal distribution. In addition, we find that there may be a relation between the hierarchical formation, skewness, and kurtosis of the log-normal distribution for each catalog. We also show that the shape of the three-parameter distribution from the samples is strikingly similar to the galaxy log-normal mass distribution obtained from numerical studies. This similarity between void size and galaxy mass distributions may possibly indicate evidence of nonlinear mechanisms affecting both voids and galaxies, such as large-scale accretion and tidal effects. Considering the fact that in this study, all voids are generated by galaxy mocks and show hierarchical structures in different levels, it may be possible that the same nonlinear mechanisms of mass distribution affect the void size distribution.
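A log-normal fit of this kind can be sketched with scipy on synthetic "void radii" (units and parameters are illustrative assumptions, not CVC values). The location is fixed at zero here for numerical stability, since the full three-parameter MLE is notoriously delicate; the snippet also evaluates the skewness implied by sigma alone, the quantity the skewness-kurtosis relation above is built from.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Synthetic void radii: log-normal with sigma = 0.5, median exp(mu) = 10.
sigma, scale = 0.5, 10.0
radii = stats.lognorm.rvs(s=sigma, scale=scale, size=50_000,
                          random_state=rng)

# Fit with the location fixed at zero (assumed known for this sketch).
s_hat, loc_hat, scale_hat = stats.lognorm.fit(radii, floc=0.0)

# For a log-normal, skewness depends on sigma only:
# skew = (w + 2) * sqrt(w - 1) with w = exp(sigma**2).
w = np.exp(s_hat**2)
skew_implied = (w + 2.0) * np.sqrt(w - 1.0)
print(f"sigma_hat = {s_hat:.3f}, scale_hat = {scale_hat:.2f}, "
      f"implied skewness = {skew_implied:.2f}")
```
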

  7. STOCHASTIC PRICING MODEL FOR THE REAL ESTATE MARKET: FORMATION OF LOG-NORMAL GENERAL POPULATION

    Directory of Open Access Journals (Sweden)

    Oleg V. Rusakov

    2015-01-01

We construct a stochastic model of real estate pricing. The method of the pricing construction is based on a sequential comparison of the supply prices. We prove that, under standard assumptions imposed on the comparison coefficients, there exists a unique non-degenerate limit in distribution, and this limit follows the log-normal law. We verify the agreement of empirical price distributions with the theoretically obtained log-normal distribution using extensive statistical data on real estate prices from Saint Petersburg (Russia). To establish this agreement we apply the efficient and sensitive Kolmogorov-Smirnov goodness-of-fit test. Based on “The Russian Federal Estimation Standard N2”, we conclude that the most probable price, i.e. the mode of the distribution, is correctly and uniquely defined under the log-normal approximation. Since the mean of a log-normal distribution exceeds its mode (the most probable value), it follows that prices estimated by the mathematical expectation are systematically overstated.
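The verification step can be sketched end-to-end: build prices by sequential multiplicative comparisons (i.i.d. coefficients are a simplifying assumption; the paper's comparison scheme is more general), fit a log-normal, and apply the Kolmogorov-Smirnov test. Since the parameters are fitted on the same sample, the p-value is only indicative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)

# Prices formed by a chain of random comparison coefficients applied to
# a base supply price (all numbers are illustrative assumptions).
coeffs = rng.uniform(0.95, 1.06, size=(300, 5000))
prices = 100.0 * coeffs.prod(axis=0)

# Fit a (two-parameter) log-normal and test the fit with KS.
shape, loc, scale = stats.lognorm.fit(prices, floc=0.0)
ks = stats.kstest(prices, "lognorm", args=(shape, loc, scale))
print(f"KS statistic = {ks.statistic:.4f}, p-value = {ks.pvalue:.3f}")

# The mean of a log-normal exceeds its mode, as the abstract notes.
mode = scale * np.exp(-shape**2)
print(f"mean = {prices.mean():.1f}, mode = {mode:.1f}")
```
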

  8. Effect of fatty acids on functional properties of normal wheat and waxy wheat starches: A structural basis.

    Science.gov (United States)

    Wang, Shujun; Wang, Jinrong; Yu, Jinglin; Wang, Shuo

    2016-01-01

The effects of three saturated fatty acids on functional properties of normal wheat and waxy wheat starches were investigated. The complexing index (CI) of normal wheat starch-fatty acid complexes decreased with increasing carbon chain length. In contrast, waxy wheat starch-fatty acid complexes presented much lower CI. V-type crystalline polymorphs were formed between normal wheat starch and the three fatty acids, with shorter chain fatty acids producing more crystalline structure. FTIR and Raman spectroscopy gave results consistent with XRD. The formation of the amylose-fatty acid complex inhibited granule swelling, gelatinization progression, retrogradation and pasting development of normal wheat starch, with longer chain fatty acids showing greater inhibition. Amylopectin can also form complexes with fatty acids, but the amount of complex was too small to be detected by XRD, FTIR, Raman and DSC. As a consequence, only small changes were observed in the functional properties of waxy wheat starch with the addition of fatty acids. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Werner complex deficiency in cells disrupts the Nuclear Pore Complex and the distribution of lamin B1.

    Science.gov (United States)

    Li, Zhi; Zhu, Yizhou; Zhai, Yujia; R Castroagudin, Michelle; Bao, Yifei; White, Tommy E; Glavy, Joseph S

    2013-12-01

From the surrounding shell to the inner machinery, nuclear proteins provide the functional plasticity of the nucleus. This study highlights the nuclear association of Pore membrane (POM) protein NDC1 and Werner protein (WRN), a RecQ helicase responsible for the DNA instability progeria disorder, Werner Syndrome. In our previous publication, we connected the DNA damage sensor Werner's Helicase Interacting Protein (WHIP), a binding partner of WRN, to the NPC. Here, we confirm the association of the WRN/WHIP complex and NDC1. In established WRN/WHIP knockout cell lines, we further demonstrate the interdependence of WRN/WHIP and Nucleoporins (Nups). These changes do not completely abrogate the barrier of the Nuclear Envelope (NE) but do affect the distribution of FG Nups and the RAN gradient, which are necessary for nuclear transport. Evidence from WRN/WHIP knockout cell lines demonstrates changes in the processing and nucleolar localization of lamin B1. The appearance of "RAN holes" void of RAN corresponds to regions within the nucleolus filled with condensed pools of lamin B1. From WRN/WHIP knockout cell line extracts, we found three forms of lamin B1 that correspond to the mature holoprotein and two potential post-translationally modified forms of the protein. Upon treatment with topoisomerase inhibitors, lamin B1 cleavage occurs only in WRN/WHIP knockout cells. Our data suggest that the link between NDC1 and WRN is one facet of the network connecting the nuclear periphery and genome stability. Loss of the WRN complex leads to multiple alterations at the NPC and the nucleolus. © 2013. Published by Elsevier B.V. All rights reserved.

  10. Topological resilience in non-normal networked systems

    Science.gov (United States)

    Asllani, Malbor; Carletti, Timoteo

    2018-04-01

    The network of interactions in complex systems strongly influences their resilience and the system capability to resist external perturbations or structural damages and to promptly recover thereafter. The phenomenon manifests itself in different domains, e.g., parasitic species invasion in ecosystems or cascade failures in human-made networks. Understanding the topological features of the networks that affect the resilience phenomenon remains a challenging goal for the design of robust complex systems. We hereby introduce the concept of non-normal networks, namely networks whose adjacency matrices are non-normal, propose a generating model, and show that such a feature can drastically change the global dynamics through an amplification of the system response to exogenous disturbances and eventually impact the system resilience. This early stage transient period can induce the formation of inhomogeneous patterns, even in systems involving a single diffusing agent, providing thus a new kind of dynamical instability complementary to the Turing one. We provide, first, an illustrative application of this result to ecology by proposing a mechanism to mute the Allee effect and, second, we propose a model of virus spreading in a population of commuters moving using a non-normal transport network, the London Tube.
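The amplification mechanism at the heart of this result can be seen with a two-dimensional toy system (an illustrative matrix, not the paper's transport network): a linearly stable but non-normal matrix transiently amplifies a perturbation before it decays, exactly the early-stage transient the abstract describes.

```python
import numpy as np

# A linearly stable but non-normal matrix (A @ A.T != A.T @ A): all
# eigenvalues have negative real part, yet a perturbation grows
# transiently before decaying. Values are illustrative.
A = np.array([[-1.0, 50.0],
              [ 0.0, -2.0]])
assert np.all(np.linalg.eigvals(A).real < 0)    # asymptotically stable
assert not np.allclose(A @ A.T, A.T @ A)        # non-normal

x = np.array([0.0, 1.0])                        # unit perturbation
dt, norms = 0.001, []
for _ in range(3000):                           # Euler steps of dx/dt = A x
    x = x + dt * (A @ x)
    norms.append(float(np.linalg.norm(x)))

peak = max(norms)
print(f"peak amplification = {peak:.1f}, final norm = {norms[-1]:.2f}")
```

A normal matrix with the same eigenvalues would decay monotonically; the off-diagonal asymmetry alone produces the order-of-magnitude transient growth.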

  11. Land-use history affects understorey plant species distributions in a large temperate-forest complex, Denmark

    DEFF Research Database (Denmark)

    Svenning, J.-C.; Baktoft, Karen H.; Balslev, Henrik

    2009-01-01

    In Europe, forests have been strongly influenced by human land-use for millennia. Here, we studied the importance of anthropogenic historical factors as determinants of understorey species distributions in a 967 ha Danish forest complex using 156 randomly placed 100-m2 plots, 15 environmental, 9...... dispersal and a strong literature record as ancient-forest species, were still concentrated in areas that were high forest in 1805. Among the younger forests, there were clear floristic differences between those on reclaimed bogs and those not. Apparently remnant populations of wet-soil plants were still...

  12. Fluid Distribution Pattern in Adult-Onset Congenital, Idiopathic, and Secondary Normal-Pressure Hydrocephalus: Implications for Clinical Care.

    Science.gov (United States)

    Yamada, Shigeki; Ishikawa, Masatsune; Yamamoto, Kazuo

    2017-01-01

In spite of growing evidence on idiopathic normal-pressure hydrocephalus (NPH), views on clinical care for idiopathic NPH remain controversial. This continuing divergence of viewpoints might be due to confusing classifications of idiopathic and adult-onset congenital NPH. To clarify the classification of NPH, we propose that adult-onset congenital NPH should be explicitly distinguished from idiopathic and secondary NPH. On the basis of conventional CT scan or MRI, idiopathic NPH was defined as narrow sulci at the high convexity concurrent with enlargement of the ventricles, basal cistern and Sylvian fissure, whereas adult-onset congenital NPH was defined as huge ventricles without high-convexity tightness. We compared clinical characteristics and cerebrospinal fluid distribution among 85 patients diagnosed with idiopathic NPH, 17 patients with secondary NPH, and 7 patients with adult-onset congenital NPH. All patients underwent 3-T MRI examinations and tap-tests. The volumes of ventricles and subarachnoid spaces were measured using a 3D workstation based on T2-weighted 3D sequences. The mean intracranial volume for the patients with adult-onset congenital NPH was almost 100 mL larger than the volumes for patients with idiopathic and secondary NPH. Compared with the patients with idiopathic or secondary NPH, patients with adult-onset congenital NPH exhibited larger ventricles but normal sized subarachnoid spaces. The mean volume ratio of the high-convexity subarachnoid space was significantly less in idiopathic NPH than in adult-onset congenital NPH, whereas the mean volume ratio of the basal cistern and Sylvian fissure in idiopathic NPH was >2 times larger than that in adult-onset congenital NPH. The symptoms of gait disturbance, cognitive impairment, and urinary incontinence in patients with adult-onset congenital NPH tended to progress more slowly compared to their progress in patients with idiopathic NPH. Cerebrospinal fluid distributions and

  13. Modeling Complex Systems

    International Nuclear Information System (INIS)

    Schreckenberg, M

    2004-01-01

This book by Nino Boccara presents a compilation of model systems commonly termed 'complex'. It starts with a definition of the systems under consideration and how to build up a model to describe the complex dynamics. The subsequent chapters are devoted to various categories of mean-field type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as serving as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself but this is much better than I could ever have written it!). Only section 6, on cellular automata, is a little too limited to the author's own point of view and one would have expected more about the famous Domany-Kinzel model (and more accurate citations!). In my opinion this is one of the best textbooks published during the last decade and even experts can learn a lot from it. Hopefully there will be an updated edition in, say, five years, since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success! (book review)

  14. Clinically practical intensity modulation for complex head and neck lesions using multiple, static MLC fields

    International Nuclear Information System (INIS)

    Verhey, L.J.; Xia, P.; Akazawa, P.

    1997-01-01

    Purpose: A number of different beam delivery methods have been proposed for implementing intensity modulated radiotherapy (IMRT), including fixed gantry with multiple static MLC fields (MSMLC - often referred to as 'stop and shoot'), fixed gantry with dynamic MLC (DMLC), intensity modulated arc therapy (IMAT), Tomotherapy and Peacock MIMiC. Using two complex head and neck cases as examples, we have compared dose distributions achievable with 3-D conformal radiotherapy (3DCRT) to those which can be achieved using IMRT delivered with MSMLC, DMLC and Peacock MIMiC. The goal is to demonstrate the potential value of IMRT in the treatment of complex lesions in the head and neck and to determine whether MSMLC, the simplest of the proposed IMRT methods, can produce dose distributions which are competitive with dynamic IMRT methods and which can be implemented in clinically acceptable times. Materials and Methods: Two patients with nasopharyngeal carcinoma were selected from the archives of the Department of Radiation Oncology at the University of California, San Francisco (UCSF). These patients were previously planned and treated with CT-based 3-D treatment planning methods which are routinely used at UCSF, including non-axial beam directions and partial transmission blocks when indicated. The CT data tapes were then read into a test version of CORVUS, an inverse treatment planning program being developed by NOMOS Corporation; target volumes and critical normal structures were outlined on axial CT slices, and dose goals and limits were defined for the targets and normal tissues of interest. Optimized dose plans were then obtained for each delivery method including MSMLC (4 or 5 hand-selected beams with 3 levels of intensity), DMLC (9 evenly spaced axial beams with 10 levels of intensity) and Peacock MIMiC (55 axial beams spanning 270 degrees with 10 levels of intensity). Dose-volume histograms (DVHs) for all IMRT plans were then compared with the 3DCRT plans. Treatment

  15. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1975-09-01

    Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of any data group to the various distributions. The significance of fitting statistical distributions to the data is discussed
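
    The fitting-and-testing workflow described above can be sketched in a few lines. This is a minimal illustration on synthetic data (an assumption; the study's surveillance measurements are not reproduced here), using maximum-likelihood fits for the three candidate families and the Shapiro-Wilk W test on raw and log-transformed values.

```python
# Sketch: fit normal, log-normal, and Weibull families to surveillance-style
# data and compute W statistics. Data are synthetic, not the study's.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.lognormal(mean=0.5, sigma=0.8, size=300)  # synthetic activity data

# Maximum-likelihood fits for the three candidate families
norm_params = stats.norm.fit(data)
lognorm_params = stats.lognorm.fit(data, floc=0)
weibull_params = stats.weibull_min.fit(data, floc=0)

# W test on the raw data (normality) and on log-transformed data (log-normality)
w_norm, p_norm = stats.shapiro(data)
w_lognorm, p_lognorm = stats.shapiro(np.log(data))

print(f"W (normal):     {w_norm:.3f}, p = {p_norm:.3g}")
print(f"W (log-normal): {w_lognorm:.3f}, p = {p_lognorm:.3g}")
```

For data that are truly log-normal, the W test on the log-transformed values should give a much larger p-value than the test on the raw values.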

  16. Loss of Dendritic Complexity Precedes Neurodegeneration in a Mouse Model with Disrupted Mitochondrial Distribution in Mature Dendrites

    Directory of Open Access Journals (Sweden)

    Guillermo López-Doménech

    2016-10-01

    Correct mitochondrial distribution is critical for satisfying local energy demands and calcium buffering requirements and supporting key cellular processes. The mitochondrially targeted proteins Miro1 and Miro2 are important components of the mitochondrial transport machinery, but their specific roles in neuronal development, maintenance, and survival remain poorly understood. Using mouse knockout strategies, we demonstrate that Miro1, as opposed to Miro2, is the primary regulator of mitochondrial transport in both axons and dendrites. Miro1 deletion leads to depletion of mitochondria from distal dendrites but not axons, accompanied by a marked reduction in dendritic complexity. Disrupting postnatal mitochondrial distribution in vivo by deleting Miro1 in mature neurons causes a progressive loss of distal dendrites and compromises neuronal survival. Thus, the local availability of mitochondrial mass is critical for generating and sustaining dendritic arbors, and disruption of mitochondrial distribution in mature neurons is associated with neurodegeneration.

  17. Multivariate extended skew-t distributions and related families

    KAUST Repository

    Arellano-Valle, Reinaldo B.

    2010-12-01

    A class of multivariate extended skew-t (EST) distributions is introduced and studied in detail, along with closely related families such as the subclass of extended skew-normal distributions. Besides mathematical tractability and modeling flexibility in terms of both skewness and heavier tails than the normal distribution, the most relevant properties of the EST distribution include closure under conditioning and ability to model lighter tails as well. The first part of the present paper examines probabilistic properties of the EST distribution, such as various stochastic representations, marginal and conditional distributions, linear transformations, moments and in particular Mardia’s measures of multivariate skewness and kurtosis. The second part of the paper studies statistical properties of the EST distribution, such as likelihood inference, behavior of the profile log-likelihood, the score vector and the Fisher information matrix. Especially, unlike the extended skew-normal distribution, the Fisher information matrix of the univariate EST distribution is shown to be non-singular when the skewness is set to zero. Finally, a numerical application of the conditional EST distribution is presented in the context of confidential data perturbation.
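
    The simplest member of the family discussed above is the skew-normal, which admits the standard stochastic representation Z = δ|U₀| + √(1−δ²)U₁ with δ = α/√(1+α²). The sketch below (an illustration of that representation only, not of the full extended skew-t) checks the sample mean against the known theoretical mean δ√(2/π).

```python
# Sketch: sampling a skew-normal SN(alpha) via its stochastic representation.
import numpy as np

alpha = 3.0                                # skewness (shape) parameter
delta = alpha / np.sqrt(1 + alpha**2)

rng = np.random.default_rng(1)
u0 = rng.standard_normal(200_000)
u1 = rng.standard_normal(200_000)
z = delta * np.abs(u0) + np.sqrt(1 - delta**2) * u1  # skew-normal draws

# Theoretical mean of SN(alpha) is delta * sqrt(2/pi)
theory_mean = delta * np.sqrt(2 / np.pi)
print(f"sample mean {z.mean():.4f} vs theory {theory_mean:.4f}")
```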

  18. Multivariate extended skew-t distributions and related families

    KAUST Repository

    Arellano-Valle, Reinaldo B.; Genton, Marc G.

    2010-01-01

    A class of multivariate extended skew-t (EST) distributions is introduced and studied in detail, along with closely related families such as the subclass of extended skew-normal distributions. Besides mathematical tractability and modeling flexibility in terms of both skewness and heavier tails than the normal distribution, the most relevant properties of the EST distribution include closure under conditioning and ability to model lighter tails as well. The first part of the present paper examines probabilistic properties of the EST distribution, such as various stochastic representations, marginal and conditional distributions, linear transformations, moments and in particular Mardia’s measures of multivariate skewness and kurtosis. The second part of the paper studies statistical properties of the EST distribution, such as likelihood inference, behavior of the profile log-likelihood, the score vector and the Fisher information matrix. Especially, unlike the extended skew-normal distribution, the Fisher information matrix of the univariate EST distribution is shown to be non-singular when the skewness is set to zero. Finally, a numerical application of the conditional EST distribution is presented in the context of confidential data perturbation.

  19. The effects of lower crustal strength and preexisting midcrustal shear zones on the formation of continental core complexes and low-angle normal faults

    KAUST Repository

    Wu, Guangliang

    2016-08-22

    To investigate the formation of core complexes and low-angle normal faults, we devise thermomechanical simulations on a simplified wedge-like orogenic hinterland that has initial topography, Moho relief, and a preexisting midcrustal shear zone that can accommodate shear at very low angles (<20°). We mainly vary the strength of the lower crust and the frictional strength of the preexisting midcrustal shear zone. We find that the strength of the lower crust and the existence and strength of a preexisting shear zone significantly affect the formation and evolution of core complexes. With increasing lower crustal strength, we recognize varying extensional features with decreasing exhumation rate: these are characterized by bivergent metamorphic massifs, classic Cordilleran metamorphic core complexes, multiple consecutive core complexes (or boudinage structures), and a flexural core complex underlined by a large subsurface low-angle detachment fault with a small convex curvature. Topographic loading and mantle buoyancy forces, together with divergent boundaries, drive a regional lower crustal flow that leads to the exhumation of the lower crust where intensive upper crustal faulting induces strong unloading. The detachment fault is a decoupling zone that accommodates large displacement and accumulates sustained shear strain at very low angle between upper and lower crust. Though the regional stress is largely Andersonian, we find non-Andersonian stress in regions adjacent to the preexisting shear zone and those with high topographic gradient. Our new models provide a view that is generally consistent with geological and geophysical observations on how core complexes form and evolve.

  20. The effects of lower crustal strength and preexisting midcrustal shear zones on the formation of continental core complexes and low-angle normal faults

    KAUST Repository

    Wu, Guangliang; Lavier, Luc L.

    2016-01-01

    To investigate the formation of core complexes and low-angle normal faults, we devise thermomechanical simulations on a simplified wedge-like orogenic hinterland that has initial topography, Moho relief, and a preexisting midcrustal shear zone that can accommodate shear at very low angles (<20°). We mainly vary the strength of the lower crust and the frictional strength of the preexisting midcrustal shear zone. We find that the strength of the lower crust and the existence and strength of a preexisting shear zone significantly affect the formation and evolution of core complexes. With increasing lower crustal strength, we recognize varying extensional features with decreasing exhumation rate: these are characterized by bivergent metamorphic massifs, classic Cordilleran metamorphic core complexes, multiple consecutive core complexes (or boudinage structures), and a flexural core complex underlined by a large subsurface low-angle detachment fault with a small convex curvature. Topographic loading and mantle buoyancy forces, together with divergent boundaries, drive a regional lower crustal flow that leads to the exhumation of the lower crust where intensive upper crustal faulting induces strong unloading. The detachment fault is a decoupling zone that accommodates large displacement and accumulates sustained shear strain at very low angle between upper and lower crust. Though the regional stress is largely Andersonian, we find non-Andersonian stress in regions adjacent to the preexisting shear zone and those with high topographic gradient. Our new models provide a view that is generally consistent with geological and geophysical observations on how core complexes form and evolve.

  1. The analysis of annual dose distributions for radiation workers

    International Nuclear Information System (INIS)

    Mill, A.J.

    1984-05-01

    The system of dose limitation recommended by the ICRP includes the requirement that no worker shall exceed the current dose limit of 50 mSv/a. If all workers were continuously exposed at this limit, the corresponding annual death rate would be comparable with that of 'high-risk' industries. In practice, there is a distribution of doses with an arithmetic mean lower than the dose limit. In its 1977 report, UNSCEAR defined a reference dose distribution for the purposes of comparison. However, this two-parameter distribution does not show the departure from log-normality normally observed in actual distributions at doses which are a significant proportion of the annual limit. In this report an alternative model is suggested, based on a three-parameter log-normal distribution. The third parameter is an 'effective dose limit', and such a model fits very well the departure from log-normality observed in actual dose distributions. (author)
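
    One way to realize a bounded three-parameter log-normal of the kind described is to assume ln(d / (L_eff − d)) is normal, so the density bends away from log-normality as doses approach the effective limit L_eff. The functional form and parameter values below are illustrative assumptions, not the report's exact parameterization.

```python
# Sketch: a three-parameter "bounded" log-normal (Johnson-SB-style transform).
# Assumption: ln(d / (L_eff - d)) ~ Normal(mu, sigma), so d lies in (0, L_eff).
import numpy as np

def sample_bounded_lognormal(mu, sigma, l_eff, size, rng):
    """Draw doses bounded above by the effective dose limit l_eff."""
    y = rng.normal(mu, sigma, size)
    return l_eff * np.exp(y) / (1.0 + np.exp(y))  # inverse transform

rng = np.random.default_rng(7)
doses = sample_bounded_lognormal(mu=-2.5, sigma=1.2, l_eff=50.0,
                                 size=10_000, rng=rng)
print(f"max dose {doses.max():.1f} mSv (limit 50), mean {doses.mean():.2f} mSv")
```

For small doses the transform is approximately log-normal; only the upper tail departs, which is the qualitative behaviour the report attributes to actual dose distributions.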

  2. Combining counts and incidence data: an efficient approach for estimating the log-normal species abundance distribution and diversity indices.

    Science.gov (United States)

    Bellier, Edwige; Grøtan, Vidar; Engen, Steinar; Schartau, Ann Kristin; Diserud, Ola H; Finstad, Anders G

    2012-10-01

    Obtaining accurate estimates of diversity indices is difficult because the number of species encountered in a sample increases with sampling intensity. We introduce a novel method that requires only that the presence of species in a sample be assessed, while counts of the number of individuals per species are required for only a small part of the sample. To account for species included as incidence data in the species abundance distribution, we modify the likelihood function of the classical Poisson log-normal distribution. Using simulated community assemblages, we contrast diversity estimates based on a community sample, a subsample randomly extracted from the community sample, and a mixture sample where incidence data are added to a subsample. We show that the mixture sampling approach provides more accurate estimates than the subsample, at little extra cost. Diversity indices estimated from a freshwater zooplankton community sampled using the mixture approach show the same pattern of results as the simulation study. Our method efficiently increases the accuracy of diversity estimates and improves comprehension of the left tail of the species abundance distribution. We show how to choose the sample size needed for a compromise between information gained, accuracy of the estimates, and cost expended when assessing biological diversity. The sample size estimates are obtained from key community characteristics, such as the expected number of species in the community, the expected number of individuals in a sample, and the evenness of the community.
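
    The building block of the likelihood modified above is the Poisson log-normal pmf, P(K = k) = ∫ Poisson(k; λ) f(λ) dλ with ln λ normal. It has no closed form, but Gauss-Hermite quadrature evaluates it accurately. The sketch below shows only the standard pmf (an assumption: the paper's incidence-data extension is not reproduced).

```python
# Sketch: Poisson log-normal pmf via 40-node Gauss-Hermite quadrature.
import numpy as np
from scipy import stats

nodes, weights = np.polynomial.hermite.hermgauss(40)

def pln_pmf(k, mu, sigma):
    """P(K = k) when K | lambda ~ Poisson(lambda) and ln(lambda) ~ N(mu, sigma^2)."""
    lam = np.exp(mu + sigma * np.sqrt(2.0) * nodes)   # substitution z = sqrt(2) x
    return (weights * stats.poisson.pmf(k, lam)).sum() / np.sqrt(np.pi)

probs = np.array([pln_pmf(k, mu=1.0, sigma=0.5) for k in range(200)])
print(f"total probability over k < 200: {probs.sum():.6f}")
```

A quick sanity check: the mean of the Poisson log-normal equals the log-normal mean exp(μ + σ²/2).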

  3. Distribution Functions of Sizes and Fluxes Determined from Supra-Arcade Downflows

    Science.gov (United States)

    McKenzie, D.; Savage, S.

    2011-01-01

    The frequency distributions of sizes and fluxes of supra-arcade downflows (SADs) provide information about the process of their creation. For example, a fractal creation process may be expected to yield a power-law distribution of sizes and/or fluxes. We examine 120 cross-sectional areas and magnetic flux estimates found by Savage & McKenzie for SADs, and find that (1) the areas are consistent with a log-normal distribution and (2) the fluxes are consistent with both a log-normal and an exponential distribution. Neither set of measurements is compatible with either a power-law or a normal distribution. As a demonstration of the applicability of these findings to improved understanding of reconnection, we consider a simple SAD growth scenario with minimal assumptions, capable of producing a log-normal distribution.
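
    A distribution-compatibility check of the kind described can be sketched with Kolmogorov-Smirnov tests. The 120 "areas" below are synthetic (an assumption; the Savage & McKenzie measurements are not reproduced), and note that KS p-values computed with fitted parameters are optimistic, so they serve only as a rough screen.

```python
# Sketch: compare log-normal and normal fits to 120 synthetic SAD areas.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
areas = rng.lognormal(mean=1.0, sigma=0.5, size=120)  # 120 synthetic areas

# Fit and test the log-normal hypothesis
shape, loc, scale = stats.lognorm.fit(areas, floc=0)
ks_logn = stats.kstest(areas, stats.lognorm(shape, loc, scale).cdf)

# Fit and test a normal for contrast
ks_norm = stats.kstest(areas, stats.norm(*stats.norm.fit(areas)).cdf)

print(f"log-normal: D = {ks_logn.statistic:.3f}, p = {ks_logn.pvalue:.3f}")
print(f"normal:     D = {ks_norm.statistic:.3f}, p = {ks_norm.pvalue:.3f}")
```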

  4. Nonuniform Changes in the Distribution of Visual Attention from Visual Complexity and Action: A Driving Simulation Study.

    Science.gov (United States)

    Park, George D; Reed, Catherine L

    2015-02-01

    Researchers acknowledge the interplay between action and attention, but typically consider action as a response to successful attentional selection or the correlation of performance on separate action and attention tasks. We investigated how concurrent action with spatial monitoring affects the distribution of attention across the visual field. We embedded a functional field of view (FFOV) paradigm with concurrent central object recognition and peripheral target localization tasks in a simulated driving environment. Peripheral targets varied across 20-60 deg eccentricity at 11 radial spokes. Three conditions assessed the effects of visual complexity and concurrent action on the size and shape of the FFOV: (1) with no background, (2) with driving background, and (3) with driving background and vehicle steering. The addition of visual complexity slowed task performance and reduced the FFOV size but did not change the baseline shape. In contrast, the addition of steering produced not only shrinkage of the FFOV, but also changes in the FFOV shape. Nonuniform performance decrements occurred in proximal regions used for the central task and for steering, independent of interference from context elements. Multifocal attention models should consider the role of action and account for nonhomogeneities in the distribution of attention. © 2015 SAGE Publications.

  5. Rapid flattening of butterfly pitch angle distributions of radiation belt electrons by whistler-mode chorus

    Science.gov (United States)

    Yang, Chang; Su, Zhenpeng; Xiao, Fuliang; Zheng, Huinan; Wang, Yuming; Wang, Shui; Spence, H. E.; Reeves, G. D.; Baker, D. N.; Blake, J. B.; Funsten, H. O.

    2016-08-01

    Van Allen radiation belt electrons exhibit complex dynamics during geomagnetically active periods. Investigation of electron pitch angle distributions (PADs) can provide important information on the dominant physical mechanisms controlling radiation belt behaviors. Here we report a storm time radiation belt event where energetic electron PADs changed from butterfly distributions to normal or flattop distributions within several hours. Van Allen Probes observations showed that the flattening of butterfly PADs was closely related to the occurrence of whistler-mode chorus waves. Two-dimensional quasi-linear STEERB simulations demonstrate that the observed chorus can resonantly accelerate the near-equatorially trapped electrons and rapidly flatten the corresponding electron butterfly PADs. These results provide a new insight on how chorus waves affect the dynamic evolution of radiation belt electrons.

  6. Rapid flattening of butterfly pitch angle distributions of radiation belt electrons by whistler-mode chorus

    International Nuclear Information System (INIS)

    Yang, Chang; Changsha University of Science and Technology, Changsha; Su, Zhenpeng; Xiao, Fuliang; Zheng, Huinan

    2016-01-01

    Van Allen radiation belt electrons exhibit complex dynamics during geomagnetically active periods. Investigation of electron pitch angle distributions (PADs) can provide important information on the dominant physical mechanisms controlling radiation belt behaviors. In this paper, we report a storm time radiation belt event where energetic electron PADs changed from butterfly distributions to normal or flattop distributions within several hours. Van Allen Probes observations showed that the flattening of butterfly PADs was closely related to the occurrence of whistler-mode chorus waves. Two-dimensional quasi-linear STEERB simulations demonstrate that the observed chorus can resonantly accelerate the near-equatorially trapped electrons and rapidly flatten the corresponding electron butterfly PADs. Finally, these results provide a new insight on how chorus waves affect the dynamic evolution of radiation belt electrons.

  7. Wealth of the world's richest publicly traded companies per industry and per employee: Gamma, Log-normal and Pareto power-law as universal distributions?

    Science.gov (United States)

    Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.

    2017-04-01

    Forbes Magazine published its list of the world's two thousand leading or strongest publicly traded companies (G-2000) based on four independent metrics: sales or revenues, profits, assets and market value. Each of these wealth metrics yields particular information on the corporate size or wealth size of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part, and a Pareto power-law in the higher part. These two-class per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto is about 49% in sales per employee, and 33% after averaging over the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be well fitted by Gamma or Log-normal distributions. On the other hand, Forbes classifies the G-2000 firms in 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of firms in each industry for the four metrics is divided by the total number of employees in that industry, then the 82 points of the aggregate wealth distribution by industry per employee can be well fitted by quasi-exponential curves for the four metrics.
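
    For a two-class structure like the one described (quasi-exponential body plus Pareto tail), the tail exponent is commonly estimated with the Hill estimator. The sketch below applies it to synthetic data (an assumption; the Forbes figures are not reproduced).

```python
# Sketch: Hill estimator for the Pareto tail of a two-class wealth sample.
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the Pareto tail index from the k largest values."""
    x = np.sort(x)[::-1]
    logs = np.log(x[:k]) - np.log(x[k])
    return k / logs.sum()

rng = np.random.default_rng(3)
body = rng.exponential(scale=1.0, size=1500)     # quasi-exponential low class
tail = 3.0 * (1 + rng.pareto(a=1.5, size=500))   # Pareto high class, alpha = 1.5
wealth = np.concatenate([body, tail])

alpha_hat = hill_estimator(wealth, k=200)
print(f"estimated tail index: {alpha_hat:.2f} (true value 1.5)")
```

The choice of k (how far into the tail to look) trades bias against variance; in practice one plots the Hill estimate as a function of k and reads off a stable plateau.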

  8. Dynamics of analyst forecasts and emergence of complexity: Role of information disparity.

    Directory of Open Access Journals (Sweden)

    Chansoo Kim

    We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important in better understanding the financial market. Carrying out big-data analysis of the analyst forecast data from I/B/E/S for nearly thirty years, we find skew distributions as evidence for emergence of complexity, and show how information asymmetry or disparity affects financial analysts' forming their forecasts. Here regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as the proxy for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts while the majority of analysts issue more accurate forecasts and flock to each other. Main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts, incorporating interactions between analysts. Explaining nicely empirical data on analyst forecasts, this provides an appealing instance of understanding social phenomena in the perspective of complex systems.

  9. Effect of Weakly Nonthermal Ion Velocity Distribution on Jeans Instability in a Complex Plasma in Presence of Secondary Electrons

    International Nuclear Information System (INIS)

    Sarkar, S.; Maity, S.

    2013-01-01

    In this paper we have investigated the effect of weak nonthermality of the ion velocity distribution on the Jeans instability in a complex plasma in the presence of secondary electrons and negatively charged dust grains. The primary and secondary electron temperatures are assumed equal. Thus the plasma under consideration consists of three components: Boltzmann-distributed electrons, non-thermal ions and negatively charged inertial dust grains. From the linear dispersion relation we have calculated the real frequency and growth rate of the Jeans mode. Numerically we have found that secondary electron emission destabilizes the Jeans mode when the ion nonthermality is weak. (author)

  10. A new normalization method based on electrical field lines for electrical capacitance tomography

    International Nuclear Information System (INIS)

    Zhang, L F; Wang, H X

    2009-01-01

    Electrical capacitance tomography (ECT) is considered to be one of the most promising process tomography techniques. The image reconstruction for ECT is an inverse problem to find the spatially distributed permittivities in a pipe. Usually, the capacitance measurements obtained from the ECT system are normalized at the high and low permittivity for image reconstruction. The parallel normalization model is commonly used during the normalization process, which assumes the materials are distributed in parallel. Thus, the normalized capacitance is a linear function of the measured capacitance. A more recently used model is the series normalization model, which makes the normalized capacitance a nonlinear function of the measured capacitance. The most recently presented model is based on electrical field centre lines (EFCL) and is a mixture of the two normalization models. The multi-threshold method of this model is presented in this paper. The sensitivity matrices based on different normalization models were obtained, and image reconstruction was carried out accordingly. Simulation results indicate that reconstructed images with higher quality can be obtained based on the presented model
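
    The two classical normalization models named above can be written in a few lines. With cm the measured capacitance and cl, ch the low- and high-permittivity calibration capacitances, the parallel model is linear in cm while the series model is linear in 1/cm; the calibration values below are hypothetical.

```python
# Sketch of the two classical ECT normalization models.

def normalize_parallel(cm, cl, ch):
    """Parallel model: materials assumed distributed in parallel (linear in cm)."""
    return (cm - cl) / (ch - cl)

def normalize_series(cm, cl, ch):
    """Series model: materials assumed distributed in series (linear in 1/cm)."""
    return (1.0 / cm - 1.0 / cl) / (1.0 / ch - 1.0 / cl)

cl, ch, cm = 2.0, 10.0, 4.0  # hypothetical calibration and measurement values
print(f"parallel: {normalize_parallel(cm, cl, ch):.3f}")
print(f"series:   {normalize_series(cm, cl, ch):.3f}")
```

Both models agree at the calibration endpoints (0 at cm = cl, 1 at cm = ch) but differ in between, which is why the choice of model changes the sensitivity matrix and hence the reconstruction.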

  11. Maximum Likelihood Estimates of Parameters in Various Types of Distribution Fitted to Important Data Cases.

    OpenAIRE

    HIROSE,Hideo

    1998-01-01

    TYPES OF THE DISTRIBUTION:
    - Normal distribution (2-parameter)
    - Uniform distribution (2-parameter)
    - Exponential distribution (2-parameter)
    - Weibull distribution (2-parameter)
    - Gumbel distribution (2-parameter)
    - Weibull/Frechet distribution (3-parameter)
    - Generalized extreme-value distribution (3-parameter)
    - Gamma distribution (3-parameter)
    - Extended gamma distribution (3-parameter)
    - Log-normal distribution (3-parameter)
    - Extended log-normal distribution (3-parameter)
    - Generalized ...
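
    Maximum-likelihood fitting for several of the listed families is available off the shelf; the sketch below fits a Weibull and shows the common interface on synthetic data (an assumption; the paper's data cases are not reproduced).

```python
# Sketch: maximum-likelihood fits via scipy.stats on synthetic Weibull data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = rng.weibull(a=2.0, size=1000) * 3.0   # Weibull, shape 2, scale 3

# Each .fit() call maximizes the likelihood numerically
c, loc, scale = stats.weibull_min.fit(data, floc=0)
print(f"Weibull MLE: shape = {c:.2f} (true 2.0), scale = {scale:.2f} (true 3.0)")

# The same interface covers other listed families, e.g.:
mu, sigma = stats.norm.fit(data)              # normal (2-parameter)
s, loc3, sc = stats.lognorm.fit(data, floc=0) # log-normal (shape, loc, scale)
```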

  12. Normal-Gamma-Bernoulli Peak Detection for Analysis of Comprehensive Two-Dimensional Gas Chromatography Mass Spectrometry Data.

    Science.gov (United States)

    Kim, Seongho; Jang, Hyejeong; Koo, Imhoi; Lee, Joohyoung; Zhang, Xiang

    2017-01-01

    Compared to other analytical platforms, comprehensive two-dimensional gas chromatography coupled with mass spectrometry (GC×GC-MS) has much increased separation power for analysis of complex samples and thus is increasingly used in metabolomics for biomarker discovery. However, accurate peak detection remains a bottleneck for wide applications of GC×GC-MS. Therefore, the normal-exponential-Bernoulli (NEB) model is generalized using the gamma distribution, and a new peak detection algorithm based on the resulting normal-gamma-Bernoulli (NGB) model is developed. Unlike the NEB model, the NGB model has no closed-form analytical solution, hampering its practical use in peak detection. To circumvent this difficulty, three numerical approaches are introduced: fast Fourier transform (FFT) and the first-order and second-order delta methods (D1 and D2). The applications to simulated data and two real GC×GC-MS data sets show that the NGB-D1 method performs the best in terms of both computational expense and peak detection performance.

  13. Short-term Music Training Enhances Complex, Distributed Neural Communication during Music and Linguistic Tasks.

    Science.gov (United States)

    Carpentier, Sarah M; Moreno, Sylvain; McIntosh, Anthony R

    2016-10-01

    Musical training is frequently associated with benefits to linguistic abilities, and recent focus has been placed on possible benefits of bilingualism to lifelong executive functions; however, the neural mechanisms for such effects are unclear. The aim of this study was to gain better understanding of the whole-brain functional effects of music and second-language training that could support such previously observed cognitive transfer effects. We conducted a 28-day longitudinal study of monolingual English-speaking 4- to 6-year-old children randomly selected to receive daily music or French language training, excluding weekends. Children completed passive EEG music note and French vowel auditory oddball detection tasks before and after training. Brain signal complexity was measured on source waveforms at multiple temporal scales as an index of neural information processing and network communication load. Comparing pretraining with posttraining, musical training was associated with increased EEG complexity at coarse temporal scales during the music and French vowel tasks in widely distributed cortical regions. Conversely, very minimal decreases in complexity at fine scales and trends toward coarse-scale increases were displayed after French training during the tasks. Spectral analysis failed to distinguish between training types and found overall theta (3.5-7.5 Hz) power increases after all training forms, with spatially fewer decreases in power at higher frequencies (>10 Hz). These findings demonstrate that musical training increased diversity of brain network states to support domain-specific music skill acquisition and music-to-language transfer effects.

  14. Bimodal distribution of the magnetic dipole moment in nanoparticles with a monomodal distribution of the physical size

    International Nuclear Information System (INIS)

    Rijssel, Jos van; Kuipers, Bonny W.M.; Erné, Ben H.

    2015-01-01

    High-frequency applications of magnetic nanoparticles, such as therapeutic hyperthermia and magnetic particle imaging, are sensitive to nanoparticle size and dipole moment. Usually, it is assumed that magnetic nanoparticles with a log-normal distribution of the physical size also have a log-normal distribution of the magnetic dipole moment. Here, we test this assumption for different types of superparamagnetic iron oxide nanoparticles in the 5–20 nm range, by multimodal fitting of magnetization curves using the MINORIM inversion method. The particles are studied while in dilute colloidal dispersion in a liquid, thereby preventing hysteresis and diminishing the effects of magnetic anisotropy on the interpretation of the magnetization curves. For two different types of well crystallized particles, the magnetic distribution is indeed log-normal, as expected from the physical size distribution. However, two other types of particles, with twinning defects or inhomogeneous oxide phases, are found to have a bimodal magnetic distribution. Our qualitative explanation is that relatively low fields are sufficient to begin aligning the particles in the liquid on the basis of their net dipole moment, whereas higher fields are required to align the smaller domains or less magnetic phases inside the particles. - Highlights: • Multimodal fits of dilute ferrofluids reveal when the particles are multidomain. • No a priori shape of the distribution is assumed by the MINORIM inversion method. • Well crystallized particles have log-normal TEM and magnetic size distributions. • Defective particles can combine a monomodal size and a bimodal dipole moment
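
    The magnetization curve underlying such moment-distribution fits is a moment-weighted sum of Langevin functions; a bimodal moment distribution (as found for the defective particles) leaves a signature in the curve shape, since the large-moment population saturates at much lower field. The parameters below are illustrative assumptions, and this is a generic Langevin-superposition sketch, not the MINORIM inversion method itself.

```python
# Sketch: normalized M(B) of a dilute ensemble with a bimodal dipole moment.
import numpy as np

kT = 4.1e-21  # thermal energy at room temperature (J)

def langevin(x):
    """L(x) = coth(x) - 1/x, with the small-x limit L(x) ~ x/3."""
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    small = np.abs(x) < 1e-6
    out[small] = x[small] / 3.0
    xs = x[~small]
    out[~small] = 1.0 / np.tanh(xs) - 1.0 / xs
    return out

def magnetization(b_field, moments, weights):
    """Normalized M(B) for a weighted mixture of dipole moments (A m^2)."""
    m = moments[:, None]
    w = weights[:, None]
    mean_m = (weights * moments).sum()
    return (w * m * langevin(m * b_field[None, :] / kT)).sum(axis=0) / mean_m

moments = np.array([1e-19, 5e-19])   # bimodal: two moment populations (A m^2)
weights = np.array([0.5, 0.5])
b = np.linspace(0.0, 0.5, 11)        # applied induction in tesla
m_curve = magnetization(b, moments, weights)
print(f"M/Ms at 0.5 T: {m_curve[-1]:.3f}")
```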

  15. Neutron and PIMC determination of the longitudinal momentum distribution of HCP, BCC and normal liquid 4He

    International Nuclear Information System (INIS)

    Blasdell, R.C.; Ceperley, D.M.; Simmons, R.O.

    1993-07-01

    Deep inelastic neutron scattering has been used to measure the neutron Compton profile (NCP) of a series of condensed ⁴He samples at densities from 28.8 atoms/nm³ (essentially the minimum possible density in the solid phase) up to 39.8 atoms/nm³ using a chopper spectrometer at the Argonne National Laboratory Intense Pulsed Neutron Source. At the lowest density, the NCP was measured along an isochore through the hcp, bcc, and normal liquid phases. Average atomic kinetic energies are extracted from each of the data sets and are compared to both published and new path integral Monte-Carlo (PIMC) calculations as well as other theoretical predictions. In this preliminary analysis of the data, account is taken of the effects of instrumental resolution, multiple scattering, and final-state interactions. Both our measurements and the PIMC theory show that there are only small differences in the kinetic energy and longitudinal momentum distribution of isochoric helium samples, regardless of their phase or crystal structure

  16. AFP Algorithm and a Canonical Normal Form for Horn Formulas

    OpenAIRE

    Majdoddin, Ruhollah

    2014-01-01

    The AFP algorithm is a learning algorithm for Horn formulas. We show that performing more than one refinement after each negative counterexample does not improve the complexity of the AFP algorithm. Moreover, a canonical normal form for Horn formulas is presented, and it is proved that the output formula of the AFP algorithm is in this normal form.

  17. Quantitative analysis of spinal curvature in 3D: application to CT images of normal spine

    Energy Technology Data Exchange (ETDEWEB)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo [University of Ljubljana, Faculty of Electrical Engineering, Trzaska 25, SI-1000 Ljubljana (Slovenia)

    2008-04-07

    The purpose of this study is to present a framework for quantitative analysis of spinal curvature in 3D. In order to study the properties of such complex 3D structures, we propose two descriptors that capture the characteristics of spinal curvature in 3D. The descriptors are the geometric curvature (GC) and curvature angle (CA), which are independent of the orientation and size of spine anatomy. We demonstrate the two descriptors that characterize the spinal curvature in 3D on 30 computed tomography (CT) images of normal spine and on a scoliotic spine. The descriptors are determined from 3D vertebral body lines, which are obtained by two different methods. The first method is based on the least-squares technique that approximates the manually identified vertebra centroids, while the second method searches for vertebra centroids in an automated optimization scheme, based on computer-assisted image analysis. Polynomial functions of the fourth and fifth degree were used for the description of normal and scoliotic spinal curvature in 3D, respectively. The mean distance to vertebra centroids was 1.1 mm (±0.6 mm) for the first and 2.1 mm (±1.4 mm) for the second method. The distributions of GC and CA values were obtained along the 30 images of normal spine at each vertebral level and show that maximal thoracic kyphosis (TK), thoracolumbar junction (TJ) and maximal lumbar lordosis (LL) on average occur at T3/T4, T12/L1 and L4/L5, respectively. The main advantage of GC and CA is that the measurements are independent of the orientation and size of the spine, thus allowing objective intra- and inter-subject comparisons. The positions of maximal TK, TJ and maximal LL can be easily identified by observing the GC and CA distributions at different vertebral levels. The obtained courses of the GC and CA for the scoliotic spine were compared to the distributions of GC and CA for the normal spines. The significant difference in values indicates that the descriptors of GC and CA
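
The geometric-curvature descriptor described above can be sketched numerically: fit a polynomial body line through vertebra centroids and evaluate the curvature of the resulting 3D curve. Everything below (centroid positions, bow amplitude) is illustrative, not data from the record:

```python
import numpy as np

# Toy vertebra centroids (mm) along a spine-like sagittal bow; in the study
# these come from manual identification or automated CT image analysis.
z = np.linspace(0.0, 450.0, 17)          # cranio-caudal position
x = 20.0 * np.sin(np.pi * z / 450.0)     # sagittal deviation (illustrative)
y = np.zeros_like(z)                     # no coronal deviation in this toy

# Least-squares polynomial fit of the 3D vertebral body line
# (degree 4, as used for normal spines in the record above).
cx, cy = np.polyfit(z, x, 4), np.polyfit(z, y, 4)

zz = np.linspace(0.0, 450.0, 200)
dx, ddx = np.polyval(np.polyder(cx, 1), zz), np.polyval(np.polyder(cx, 2), zz)
dy, ddy = np.polyval(np.polyder(cy, 1), zz), np.polyval(np.polyder(cy, 2), zz)

# Geometric curvature of the parametric curve r(z) = (x(z), y(z), z):
# kappa = |r' x r''| / |r'|^3, independent of the spine's orientation.
r1 = np.stack([dx, dy, np.ones_like(zz)], axis=-1)
r2 = np.stack([ddx, ddy, np.zeros_like(zz)], axis=-1)
gc = np.linalg.norm(np.cross(r1, r2), axis=-1) / np.linalg.norm(r1, axis=-1) ** 3

print(gc.max())  # maximal curvature (1/mm), here near the apex of the bow
```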

  18. Quantitative analysis of spinal curvature in 3D: application to CT images of normal spine

    International Nuclear Information System (INIS)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo

    2008-01-01

    The purpose of this study is to present a framework for quantitative analysis of spinal curvature in 3D. In order to study the properties of such complex 3D structures, we propose two descriptors that capture the characteristics of spinal curvature in 3D. The descriptors are the geometric curvature (GC) and curvature angle (CA), which are independent of the orientation and size of spine anatomy. We demonstrate the two descriptors that characterize the spinal curvature in 3D on 30 computed tomography (CT) images of normal spine and on a scoliotic spine. The descriptors are determined from 3D vertebral body lines, which are obtained by two different methods. The first method is based on the least-squares technique that approximates the manually identified vertebra centroids, while the second method searches for vertebra centroids in an automated optimization scheme, based on computer-assisted image analysis. Polynomial functions of the fourth and fifth degree were used for the description of normal and scoliotic spinal curvature in 3D, respectively. The mean distance to vertebra centroids was 1.1 mm (±0.6 mm) for the first and 2.1 mm (±1.4 mm) for the second method. The distributions of GC and CA values were obtained along the 30 images of normal spine at each vertebral level and show that maximal thoracic kyphosis (TK), thoracolumbar junction (TJ) and maximal lumbar lordosis (LL) on average occur at T3/T4, T12/L1 and L4/L5, respectively. The main advantage of GC and CA is that the measurements are independent of the orientation and size of the spine, thus allowing objective intra- and inter-subject comparisons. The positions of maximal TK, TJ and maximal LL can be easily identified by observing the GC and CA distributions at different vertebral levels. The obtained courses of the GC and CA for the scoliotic spine were compared to the distributions of GC and CA for the normal spines. The significant difference in values indicates that the descriptors of GC and CA

  19. Isotopes, Inventories and Seasonality: Unraveling Methane Source Distribution in the Complex Landscapes of the United Kingdom.

    Science.gov (United States)

    Lowry, D.; Fisher, R. E.; Zazzeri, G.; Lanoisellé, M.; France, J.; Allen, G.; Nisbet, E. G.

    2017-12-01

    Unlike the big open landscapes of many continents, with large area sources dominated by one particular methane emission type that can be isotopically characterized by flight measurements and sampling, the complex patchwork of urban, fossil and agricultural methane sources across NW Europe requires detailed ground surveys for characterization (Zazzeri et al., 2017). Here we outline the findings from multiple seasonal urban and rural measurement campaigns in the United Kingdom. These surveys aim to: 1) assess source distribution and baseline in regions of planned fracking, and relate these to on-site continuous baseline climatology; 2) characterize spatial and seasonal differences in the isotopic signatures of the UNFCCC source categories; and 3) assess the spatial validity of the 1 x 1 km UK inventory for large continuous emitters, proposed point sources, and seasonal / ephemeral emissions. The UK inventory suggests that 90% of methane emissions are from 3 source categories: ruminants, landfill and gas distribution. Bag sampling and GC-IRMS δ13C analysis show that landfill gives a constant signature of -57 ±3 ‰ throughout the year. Fugitive gas emissions are consistent regionally, depending on the North Sea supply regions feeding the network (-41 ± 2 ‰ in N England, -37 ± 2 ‰ in SE England). Ruminant, mostly cattle, emissions are far more complex, as the animals spend winters in barns and summers in fields, but are essentially a mix of 2 end members, breath at -68 ±3 ‰ and manure at -51 ±3 ‰, resulting in broad summer field emission plumes of -64 ‰ and point winter barn emission plumes of -58 ‰. The inventory correctly locates emission hotspots from landfill, larger sewage treatment plants and gas compressor stations, giving a broad overview of emission distribution for regional model validation. Mobile surveys are adding an extra layer of detail to this which, combined with isotopic characterization, has identified the spatial distribution of gas pipe leaks

  20. Dose distribution following selective internal radiation therapy

    International Nuclear Information System (INIS)

    Fox, R.A.; Klemp, P.F.; Egan, G.; Mina, L.L.; Burton, M.A.; Gray, B.N.

    1991-01-01

    Selective Internal Radiation Therapy is the intrahepatic arterial injection of microspheres labelled with 90Y. The microspheres lodge in the precapillary circulation of the tumor, resulting in internal radiation therapy. The activity of the 90Y injected is managed by successive administrations of labelled microspheres, with the liver probed after each injection using a calibrated beta probe to assess the dose to the superficial layers of normal tissue. Predicted doses of 75 Gy have been delivered without subsequent evidence of radiation damage to normal cells. This contrasts with the complications resulting from doses in excess of 30 Gy delivered by external beam radiotherapy. Detailed analysis of microsphere distribution in a cubic centimeter of normal liver, and the calculation of dose to a 3-dimensional fine grid, has shown that the radiation distribution created by the finite size and distribution of the microspheres results in a highly heterogeneous dose pattern. It has been shown that a third of normal liver will receive less than 33.7% of the dose predicted by assuming a homogeneous distribution of 90Y
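
The heterogeneity described above has a simple statistical analogue: if microspheres lodge independently, the count per small tissue voxel is Poisson-distributed, so a sizeable fraction of voxels receives far less than the mean dose. A toy sketch (the mean count per voxel is an assumption; the ~mm beta range, which smooths the real dose pattern, is ignored):

```python
import numpy as np

rng = np.random.default_rng(6)

# Microspheres landing independently -> Poisson counts per voxel; take the
# voxel dose as proportional to the local count (illustrative model only).
mean_per_voxel = 3.0
counts = rng.poisson(mean_per_voxel, size=1_000_000)

# Fraction of voxels receiving less than 33.7% of the mean ("homogeneous")
# dose: counts of 0 or 1 when the mean is 3, i.e. about 4 * exp(-3).
frac_low = float(np.mean(counts < 0.337 * mean_per_voxel))
print(round(frac_low, 3))  # close to 4 * exp(-3), roughly 0.2
```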

  1. Distribution of nuclease attack sites and complexity of DNA in the products of post-irradiation degradation of rat thymus chromatin

    International Nuclear Information System (INIS)

    Zvonareva, N.B.; Zhivotovsky, B.D.; Hanson, K.P.

    1983-01-01

    The distribution of nuclease attack sites in chromatin has been studied on the basis of the quantitative relationship of the single- and double-stranded fragments of various lengths in the products of post-irradiation degradation of chromatin (PDN). It has been shown that in irradiated thymocytes internucleosomal degradation of chromatin occurs, and that products of the enzymic digestion of chromatin, deriving from randomly distributed genome areas, accumulate. Analysis of the reassociation curves has not shown any differences in the complexity of the PDN fractions and total DNA. (author)

  2. Organ distribution of 111In-oxine labeled lymphocytes in normal subjects and in patients with chronic lymphocytic leukemia and malignant lymphoma

    International Nuclear Information System (INIS)

    Matsuda, Shin; Uchida, Tatsumi; Yui, Tokuo; Kariyone, Shigeo

    1982-01-01

    T and B lymphocyte survival and organ distribution were studied by using 111In-oxine labeled autologous lymphocytes in 3 normal subjects, 3 patients with chronic lymphocytic leukemia (CLL) and 9 with malignant lymphoma (ML). Disappearance curves of the labeled lymphocytes showed two exponential components in all cases. The half time of the first component was within 1 hour in all cases. That of the second one was 50.7 ± 6.4 hours for all lymphocytes, 52.0 ± 5.5 hours for T lymphocytes and 31.6 ± 4.9 hours for B lymphocytes in normal subjects; 192.6 hours for T-CLL and 57.7 ± 46.9 hours for B-CLL; and 60.2 ± 30.7 hours for T cell type of malignant lymphoma (T-ML) and 63.7 ± 24.5 hours for B cell type of malignant lymphoma (B-ML). These data suggest that the all-lymphocyte disappearance curve chiefly reflected the T lymphocyte disappearance curve, and that the half time of B lymphocytes was shorter than that of T lymphocytes. In T-CLL, the half time of the second component was extremely prolonged in comparison with that of normal T lymphocytes. The labeled cells accumulated in the lungs, spleen and liver immediately after the infusion, and then most remarkably in the spleen 1 hour after the infusion in all cases. Radioactivity over the bone marrow was observed from 1 hour in all cases; that over the lymph nodes was first noticed 18 hours after the infusion in T-CLL and T-ML and 68 hours in B-CLL, but was not noticed in normal subjects and B-ML. The recovery of labeled cells in the blood was 28.5 ± 7.9% for all lymphocytes, 19.7 ± 1.9% for T lymphocytes and 11.0 ± 5.1% for B lymphocytes in normal subjects; 25.8 ± 1.6% for CLL; and 17.6 ± 11.0% for T-ML and 7.7 ± 5.2% for B-ML, respectively. (J.P.N.)
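
Extracting the two half-times from a biexponential disappearance curve like the one above is classically done by "curve peeling": fit the slow component on late timepoints where the fast one has decayed away, subtract it, then fit the fast component. A sketch on synthetic data (amplitudes and sampling times are made up; the half-times are chosen to resemble the normal T lymphocyte values in the record):

```python
import numpy as np

# Synthetic two-exponential disappearance curve: 0.5 h fast and 52 h slow
# half-times; A and B are illustrative amplitudes (% of injected activity).
t = np.array([0.25, 0.5, 1.0, 2.0, 6.0, 18.0, 24.0, 48.0, 72.0, 96.0])  # h
A, B = 60.0, 40.0
alpha, beta = np.log(2) / 0.5, np.log(2) / 52.0
y = A * np.exp(-alpha * t) + B * np.exp(-beta * t)

# Curve peeling, step 1: the slow phase dominates at late times, so a
# log-linear fit there recovers beta and B.
late = t >= 18.0
slope_s, intercept_s = np.polyfit(t[late], np.log(y[late]), 1)
beta_hat, B_hat = -slope_s, np.exp(intercept_s)

# Step 2: subtract the slow component; the residual is the fast phase.
resid = y - B_hat * np.exp(-beta_hat * t)
early = t <= 2.0
slope_f, _ = np.polyfit(t[early], np.log(resid[early]), 1)
alpha_hat = -slope_f

# Recovered half-times (h): ln(2) / rate for each component.
print(round(np.log(2) / beta_hat, 1), round(np.log(2) / alpha_hat, 2))
```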

  3. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

    Automatic detection of normal mammograms, as a ''first look'' for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable ''crossed-distributions'' in which the correct classification depends upon the value of each feature. The second problem is overlap of the feature distributions that are extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero-hyperplane of the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled and the local probability difference functions are estimated to enhance the features. From 1,000 ground-truth-known mammograms, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses or microcalcifications, are used for training a support vector machine. The classification results tested with another 250 normal and 250 abnormal sets show improved testing performances with 90% sensitivity and 89% specificity. (author)
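
The "crossed-distribution" problem above is essentially an XOR-style layout: the class depends on the feature pair jointly, so no single linear cut works. A toy illustration of the uncrossing idea (an analogue of the paper's mapping, not its exact construction): a product feature turns the crossed layout into one separable by a zero hyperplane.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy crossed distributions: class A occupies quadrants I and III of a
# feature pair, class B occupies II and IV, so neither feature alone --
# nor any single linear cut on (f1, f2) -- separates the classes.
n = 2000
f1, f2 = rng.normal(size=n), rng.normal(size=n)
class_a = (f1 > 0) == (f2 > 0)

# Illustrative uncrossing mapping: the product f1*f2 is a single new
# feature separated by the zero hyperplane z = 0.
z = f1 * f2
accuracy = float(np.mean((z > 0) == class_a))
print(accuracy)  # 1.0 by construction in this noiseless toy
```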

  4. Distributed redundancy and robustness in complex systems

    KAUST Repository

    Randles, Martin; Lamb, David J.; Odat, Enas M.; Taleb-Bendiab, Azzelarabe

    2011-01-01

    that emerges in complex biological and natural systems. However, in order to promote an evolutionary approach, through emergent self-organisation, it is necessary to specify the systems in an 'open-ended' manner where not all states of the system are prescribed

  5. Research on Normal Human Plantar Pressure Test

    Directory of Open Access Journals (Sweden)

    Liu Xi Yang

    2016-01-01

    Full Text Available An FSR400 pressure sensor, an nRF905 wireless transceiver and an MSP40 SCM are used to design an insole pressure collection system, and LabVIEW is used to build the HMI for data acquisition. A quantity of normal human foot pressure data was collected, and the pressure distribution over the five stages of the swing phase during walking was analyzed statistically. The grid closeness degree is used for plantar pressure distribution pattern recognition, and the algorithm simulation and experimental results demonstrate that the method is feasible.

  6. Normalization and experimental design for ChIP-chip data

    Directory of Open Access Journals (Sweden)

    Alekseyenko Artyom A

    2007-06-01

    Full Text Available Abstract. Background: Chromatin immunoprecipitation on tiling arrays (ChIP-chip) has been widely used to investigate the DNA binding sites for a variety of proteins on a genome-wide scale. However, several issues in the processing and analysis of ChIP-chip data have not been resolved fully, including the effect of background (mock control) subtraction and normalization within and across arrays. Results: The binding profiles of the Drosophila male-specific lethal (MSL) complex on a tiling array provide a unique opportunity for investigating these topics, as the complex is known to bind on the X chromosome but not on the autosomes. These large bound and control regions on the same array allow clear evaluation of analytical methods. We introduce a novel normalization scheme specifically designed for ChIP-chip data from dual-channel arrays and demonstrate that this step is critical for correcting systematic dye-bias that may exist in the data. Subtraction of the mock (non-specific antibody or no antibody) control data is generally needed to eliminate the bias, but appropriate normalization obviates the need for mock experiments and increases the correlation among replicates. The idea underlying the normalization can be used subsequently to estimate the background noise level in each array for normalization across arrays. We demonstrate the effectiveness of the methods with the MSL complex binding data and other publicly available data. Conclusion: Proper normalization is essential for ChIP-chip experiments. The proposed normalization technique can correct systematic errors and compensate for the lack of mock control data, thus reducing the experimental cost and producing more accurate results.
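
The core idea above, that known unbound regions can anchor the normalization instead of a mock-control experiment, can be sketched with a toy dual-channel array. All numbers (bias, noise, enrichment, probe counts) are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy dual-channel array: log2 ratios carry a systematic dye bias. Probes in
# known unbound regions (the autosomes, for the MSL complex) should center
# at zero after normalization.
n = 10_000
on_x = rng.random(n) < 0.2                 # probes in the bound region
true_log2 = np.where(on_x, 2.0, 0.0)       # real enrichment signal
dye_bias = 0.7                             # systematic channel offset (assumed)
m = true_log2 + dye_bias + rng.normal(0.0, 0.3, n)

# Normalize so the unbound control probes center at zero, removing the
# dye bias without any mock-control data.
m_norm = m - np.median(m[~on_x])
print(round(float(np.median(m_norm[on_x])), 1))  # recovered enrichment, ~2.0
```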

  7. Empirical analysis on the runners' velocity distribution in city marathons

    Science.gov (United States)

    Lin, Zhenquan; Meng, Fan

    2018-01-01

    In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have examined the features of the velocity distributions of human mobility. In this paper, we investigated empirically the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. By statistical analyses of the finish time records, we captured some statistical features of human behavior in marathons: (1) the velocity distributions of all finishers, and of partial finishers in the fastest age group, both follow a log-normal distribution; (2) in the New York City marathon, the velocity distribution of all male runners in eight 5-kilometer internal timing courses undergoes two transitions: from a log-normal distribution at the initial stage (several initial courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (several last courses); (3) the intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition of the velocity distribution from log-normal to Gaussian, and when the competition strengthens in the last course of the middle stage, a transition from Gaussian back to log-normal occurs at the last stage. This study may enrich research on human mobility patterns and draw attention to the velocity features of human mobility.
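
A simple way to distinguish the two regimes described above is the sample skewness: a log-normal sample is right-skewed while its logarithm is symmetric. A sketch on synthetic velocities (the median and sigma are illustrative, not fitted to the race datasets):

```python
import numpy as np

rng = np.random.default_rng(2)

def skewness(a):
    """Third standardized central moment of a sample."""
    a = np.asarray(a, dtype=float)
    return float(np.mean((a - a.mean()) ** 3) / a.std() ** 3)

# Toy finisher velocities (km/h): log-normal, median 10.5 km/h, sigma 0.15.
v = rng.lognormal(mean=np.log(10.5), sigma=0.15, size=50_000)

# Right-skewed raw velocities vs. a symmetric logarithm: positive skewness
# flags the log-normal regime, near-zero skewness the Gaussian one.
print(round(skewness(v), 2), round(skewness(np.log(v)), 2))
```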

  8. Site-dependent distribution of macrophages in normal human extraocular muscles

    NARCIS (Netherlands)

    Schmidt, E. D.; van der Gaag, R.; Mourits, M. P.; Koornneef, L.

    1993-01-01

    PURPOSE: Clinical data indicate that extraocular muscles have different susceptibilities for some orbital immune disorders depending on their anatomic location. The resident immunocompetent cells may be important mediators in the local pathogenesis of such disorders so the distribution of these

  9. Analysis of the normal optical, Michel and molecular potentials on ...

    Indian Academy of Sciences (India)

    6. — journal of. June 2016 physics pp. 1275–1286. Analysis of the normal ... the levels are obtained for the three optical potentials to estimate the quality ... The experimental angular distribution data for the 40Ca(6Li, d)44Ti reaction .... analysed using the normal optical, Michel and molecular potentials within the framework.

  10. The surface chemistry of divalent metal carbonate minerals; a critical assessment of surface charge and potential data using the charge distribution multi-site ion complexation model

    NARCIS (Netherlands)

    Wolthers, M.; Charlet, L.; Van Cappellen, P.

    2008-01-01

    The Charge Distribution MUltiSite Ion Complexation or CD–MUSIC modeling approach is used to describe the chemical structure of carbonate mineralaqueous solution interfaces. The new model extends existing surface complexation models of carbonate minerals, by including atomic scale information on

  11. DFT calculations of the structures and vibrational spectra of the [Fe(bpy){sub 3}]{sup 2+} and [Ru(bpy){sub 3}]{sup 2+} complexes

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, Bruce D. [School of Science, University of Greenwich at Medway, Central Avenue, Chatham Maritime, Kent ME4 4TB (United Kingdom); Dines, Trevor J. [Division of Electronic Engineering and Physics, University of Dundee, Dundee DD1 4HN (United Kingdom)], E-mail: t.j.dines@dundee.ac.uk; Longhurst, Rayne W. [Division of Electronic Engineering and Physics, University of Dundee, Dundee DD1 4HN (United Kingdom)

    2008-09-03

    Structures of the [M(bpy){sub 3}]{sup 2+} complexes (M = Fe and Ru) have been calculated at the B3-LYP/DZVP level. IR and Raman spectra were calculated using the optimised geometries, employing a scaled quantum chemical force field, and compared with an earlier normal coordinate analysis of [Ru(bpy){sub 3}]{sup 2+} which was based upon experimental data alone, and the use of a simplified model. The results of the calculations provide a highly satisfactory fit to the experimental data and the normal coordinate analyses, in terms of potential energy distributions, allow a detailed understanding of the vibrational spectra of both complexes. Evidence is presented for Jahn-Teller distortion in the {sup 1}E MLCT excited state.

  12. On the distribution of DDO galaxies

    International Nuclear Information System (INIS)

    Sharp, N.A.; Jones, B.J.T.; Jones, J.E.

    1978-01-01

    The distribution of DDO galaxies on the sky and their relationship to normal galaxies have been examined. The results appear to contradict the universality of the luminosity function for galaxies. They also indicate that DDO galaxies are preferentially found in the vicinity of normal galaxies, but not uniformly in that they tend to avoid clusters. This may be due to the dependence of distribution upon morphological type. (author)

  13. PHARMACOKINETIC VARIATIONS OF OFLOXACIN IN NORMAL AND FEBRILE RABBITS

    Directory of Open Access Journals (Sweden)

    M. AHMAD, H. RAZA, G. MURTAZA AND N. AKHTAR

    2008-12-01

    Full Text Available The influence of experimentally Escherichia coli-induced fever (EEIF) on the pharmacokinetics of ofloxacin was evaluated. Ofloxacin was administered @ 20 mg.kg-1 body weight intravenously to a group of eight healthy rabbits, and these results were compared to values in the same eight rabbits with EEIF. Pharmacokinetic parameters of ofloxacin in normal and febrile rabbits were determined by using a two-compartment open kinetic model. Peak plasma level (Cmax) and area under the plasma concentration-time curve (AUC0-α) in normal and febrile rabbits did not differ (P>0.05). However, area under the first moment of the plasma concentration-time curve (AUMC0-α) in febrile rabbits was significantly (P<0.05) higher than that in normal rabbits. Mean values for elimination rate constant (Ke), elimination half life (t1/2β) and apparent volume of distribution (Vd) were significantly (P<0.05) lower in febrile rabbits compared to normal rabbits, while mean residence time (MRT) and total body clearance (Cl) of ofloxacin did not show any significant difference between the normal and febrile rabbits. The clinical significance of the above results can be related to the changes in the volume of distribution and elimination half life that illustrate an altered steady state in the febrile condition; hence, an adjustment of the dosage regimen in EEIF is required.
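
The parameters reported above are linked by standard identities. The record uses a two-compartment model; as a simpler worked illustration with made-up numbers (dose matches the study, AUC and AUMC are assumptions), the non-compartmental relations read:

```python
import numpy as np

# Made-up i.v. bolus data; only the 20 mg/kg dose comes from the record.
dose = 20.0   # mg/kg
auc = 50.0    # area under the concentration-time curve, mg*h/L (assumed)
aumc = 250.0  # area under the first-moment curve, mg*h^2/L (assumed)

cl = dose / auc          # total body clearance Cl, L/h/kg
mrt = aumc / auc         # mean residence time MRT, h
ke = 1.0 / mrt           # elimination rate constant Ke (mono-exponential tail)
t_half = np.log(2) / ke  # elimination half-life, h
vd = cl / ke             # apparent volume of distribution Vd, L/kg

print(cl, mrt, round(t_half, 2), vd)  # -> 0.4 5.0 3.47 2.0
```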

  14. Determining prescription durations based on the parametric waiting time distribution

    DEFF Research Database (Denmark)

    Støvring, Henrik; Pottegård, Anton; Hallas, Jesper

    2016-01-01

    two-component mixture model for the waiting time distribution (WTD). The distribution component for prevalent users estimates the forward recurrence density (FRD), which is related to the distribution of time between subsequent prescription redemptions, the inter-arrival density (IAD), for users...... in continued treatment. We exploited this to estimate percentiles of the IAD by inversion of the estimated FRD and defined the duration of a prescription as the time within which 80% of current users will have presented themselves again. Statistical properties were examined in simulation studies......-Normal). When the IAD consisted of a mixture of two Log-Normal distributions, but was analyzed with a single Log-Normal distribution, relative bias did not exceed 9%. Using a Log-Normal FRD, we estimated prescription durations of 117, 91, 137, and 118 days for NSAIDs, warfarin, bendroflumethiazide...
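
The duration definition above, the time within which 80% of current users will have redeemed again, is just the 80th percentile of the inter-arrival density. A sketch for a log-normal IAD, checked by simulation (the median and sigma are made-up parameters, not the paper's estimates):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical log-normal inter-arrival density (IAD) for redemption gaps:
# median 90 days, sigma = 0.4 (illustrative values only).
mu, sigma = np.log(90.0), 0.4

# Prescription duration = 80th percentile of the IAD; for a log-normal
# this is exp(mu + z_0.80 * sigma).
z80 = 0.8416  # 80th percentile of the standard normal distribution
duration_analytic = float(np.exp(mu + z80 * sigma))

# Monte-Carlo check against the empirical 80th percentile of simulated gaps.
gaps = rng.lognormal(mu, sigma, size=500_000)
duration_mc = float(np.quantile(gaps, 0.8))

print(round(duration_analytic, 1), round(duration_mc, 1))  # both ~126 days
```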

  15. Apparent Transition in the Human Height Distribution Caused by Age-Dependent Variation during Puberty Period

    Science.gov (United States)

    Iwata, Takaki; Yamazaki, Yoshihiro; Kuninaka, Hiroto

    2013-08-01

    In this study, we examine the validity of the transition of the human height distribution from the log-normal distribution to the normal distribution during puberty, as suggested in an earlier study [Kuninaka et al.: J. Phys. Soc. Jpn. 78 (2009) 125001]. Our data analysis reveals that, in late puberty, the variation in height decreases as children grow. Thus, the classification of a height dataset by age at this stage leads us to analyze a mixture of distributions with larger means and smaller variations. This mixture distribution has a negative skewness and is consequently closer to the normal distribution than to the log-normal distribution. The opposite case occurs in early puberty and the mixture distribution is positively skewed, which resembles the log-normal distribution rather than the normal distribution. Thus, this scenario mimics the transition during puberty. Additionally, our scenario is realized through a numerical simulation based on a statistical model. The present study does not support the transition suggested by the earlier study.
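
The sign of the mixture's skewness, negative when taller subgroups vary less (late puberty), positive when they vary more (early puberty), can be reproduced with a two-component toy model. All heights, spreads and weights below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def skewness(a):
    """Third standardized central moment of a sample."""
    a = np.asarray(a, dtype=float)
    return float(np.mean((a - a.mean()) ** 3) / a.std() ** 3)

n = 200_000  # per age group
# Late puberty: the taller group varies less -> negatively skewed mixture,
# closer to a normal than to a log-normal distribution.
late_mix = np.concatenate([rng.normal(160, 8, n), rng.normal(172, 4, n)])
# Early puberty: the taller group varies more -> positively skewed mixture,
# resembling a log-normal distribution.
early_mix = np.concatenate([rng.normal(140, 4, n), rng.normal(152, 8, n)])

print(round(skewness(late_mix), 2), round(skewness(early_mix), 2))
```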

  16. Elk Distributions Relative to Spring Normalized Difference Vegetation Index Values

    International Nuclear Information System (INIS)

    Smallidge, S.T.; Baker, T.T.; VanLeeuwen, D.; Gould, W.R.; Thompson, B.C.

    2010-01-01

    Rocky Mountain elk (Cervus elaphus) that winter near San Antonio Mountain in northern New Mexico provide important recreational and economic benefits while creating management challenges related to temporospatial variation in their spring movements. Our objective was to examine spring distributions of elk in relation to vegetative emergence as it progresses across the landscape as measured by remote sensing. Spring distributions of elk were closely associated with greater photosynthetic activity of spring vegetation in 2 of 3 years as determined using NDVI values derived from AVHRR datasets. Observed elk locations were up to 271% greater than expected in the category representing the most photosynthetic activity. This association was not observed when analyses at a finer geographic scale were conducted. Managers facing challenges involving human-wildlife interactions and land-use issues should consider environmental conditions that may influence variation in elk association with greener portions of the landscape.

  17. Statistical Tests for Frequency Distribution of Mean Gravity Anomalies

    African Journals Online (AJOL)

    The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level based on the χ2 and the unit normal deviate tests. However, the 5° equal area mean anomalies derived from the 1° x 1° data have been found to be normally distributed at the same ...
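
A chi-square goodness-of-fit test of normality like the one above bins the data, compares observed counts with those expected under a fitted normal, and checks the statistic against a critical value. A self-contained sketch on stand-in data (the sample sizes, spreads and the skewed alternative are illustrative, not gravity data):

```python
import math

import numpy as np

rng = np.random.default_rng(5)

def chi2_normality(sample, n_bins=10):
    """Chi-square goodness-of-fit statistic of a sample against a normal
    distribution with the sample's own mean and standard deviation."""
    mean, std = sample.mean(), sample.std(ddof=1)
    edges = np.quantile(sample, np.linspace(0.0, 1.0, n_bins + 1))
    observed, _ = np.histogram(sample, bins=edges)
    cdf = np.array([0.5 * (1.0 + math.erf((e - mean) / (std * math.sqrt(2.0))))
                    for e in edges])
    expected = len(sample) * np.diff(cdf)
    return float(np.sum((observed - expected) ** 2 / expected))

# Stand-ins for mean gravity anomalies (mGal): a normal sample vs. a skewed
# one. With 10 bins and 2 fitted parameters, df = 10 - 1 - 2 = 7, and the
# 5% critical value of chi-square(7) is 14.07.
stat_normal = chi2_normality(rng.normal(0.0, 20.0, 5000))
stat_skewed = chi2_normality(rng.chisquare(4, 5000) * 10.0)
print(round(stat_normal, 1), round(stat_skewed, 1))
```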

  18. Microscopic prediction of speech intelligibility in spatially distributed speech-shaped noise for normal-hearing listeners.

    Science.gov (United States)

    Geravanchizadeh, Masoud; Fallah, Ali

    2015-12-01

    A binaural and psychoacoustically motivated intelligibility model, based on a well-known monaural microscopic model, is proposed. This model simulates a phoneme recognition task in the presence of spatially distributed speech-shaped noise in anechoic scenarios. In the proposed model, binaural advantage effects are considered by generating a feature vector for a dynamic-time-warping speech recognizer. This vector consists of three subvectors incorporating two monaural subvectors to model the better-ear hearing, and a binaural subvector to simulate the binaural unmasking effect. The binaural unit of the model is based on equalization-cancellation theory. This model operates blindly, which means separate recordings of speech and noise are not required for the predictions. Speech intelligibility tests were conducted with 12 normal hearing listeners by collecting speech reception thresholds (SRTs) in the presence of single and multiple sources of speech-shaped noise. The comparison of the model predictions with the measured binaural SRTs, and with the predictions of a macroscopic binaural model called extended equalization-cancellation, shows that this approach predicts the intelligibility in anechoic scenarios with good precision. The square of the correlation coefficient (r²) and the mean absolute error between the model predictions and the measurements are 0.98 and 0.62 dB, respectively.

  19. KERNEL MAD ALGORITHM FOR RELATIVE RADIOMETRIC NORMALIZATION

    Directory of Open Access Journals (Sweden)

    Y. Bai

    2016-06-01

    Full Text Available The multivariate alteration detection (MAD) algorithm is commonly used in relative radiometric normalization. This algorithm is based on linear canonical correlation analysis (CCA), which can analyze only linear relationships among bands. Therefore, we first introduce a new version of MAD in this study based on the established method known as kernel canonical correlation analysis (KCCA). The proposed method effectively extracts the non-linear and complex relationships among variables. We then conduct relative radiometric normalization experiments on both the linear CCA and KCCA versions of the MAD algorithm with the use of Landsat-8 data of Beijing, China, and Gaofen-1 (GF-1) data derived from South China. Finally, we analyze the difference between the two methods. Results show that the KCCA-based MAD can be satisfactorily applied to relative radiometric normalization and that this algorithm can well describe the nonlinear relationship between multi-temporal images. This work is the first attempt to apply a KCCA-based MAD algorithm to relative radiometric normalization.

  20. Coordinating complex decision support activities across distributed applications

    Science.gov (United States)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (API's), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level API's to implement the desired interactions between distributed applications.

  1. Nonlinear saturation of wave packets excited by low-energy electron horseshoe distributions.

    Science.gov (United States)

    Krafft, C; Volokitin, A

    2013-05-01

    Horseshoe distributions are shell-like particle distributions that can arise in space and laboratory plasmas when particle beams propagate into increasing magnetic fields. The present paper studies the stability and the dynamics of wave packets interacting resonantly with electrons presenting low-energy horseshoe or shell-type velocity distributions in a magnetized plasma. The linear instability growth rates are determined as a function of the ratio of the plasma to the cyclotron frequencies, of the velocity and the opening angle of the horseshoe, and of the relative thickness of the shell. The nonlinear stage of the instability is investigated numerically using a symplectic code based on a three-dimensional Hamiltonian model. Simulation results show that the dynamics of the system is mainly governed by wave-particle interactions at Landau and normal cyclotron resonances and that the high-order normal cyclotron resonances play an essential role. Specific features of the dynamics of particles interacting simultaneously with two or more waves at resonances of different natures and orders are discussed, showing that such complex processes determine the main characteristics of the wave spectrum's evolution. Simulations with wave packets presenting quasicontinuous spectra provide a full picture of the relaxation of the horseshoe distribution, revealing two main phases of the evolution: an initial stage of wave energy growth, characterized by a fast filling of the shell, and a second phase of slow damping of the wave energy, accompanied by final adjustments of the electron distribution. The influence of the density inhomogeneity along the horseshoe on the wave-particle dynamics is also discussed.

  2. Diversity and Distribution of Cryptic Species of the Bemisia tabaci (Hemiptera: Aleyrodidae) complex in Pakistan.

    Science.gov (United States)

    Masood, Mariyam; Amin, Imran; Hassan, Ishtiaq; Mansoor, Shahid; Brown, Judith K; Briddon, Rob W

    2017-12-05

    Bemisia tabaci (Gennadius; Hemiptera: Aleyrodidae) is considered to be a cryptic (sibling) species complex, the members of which exhibit morphological invariability while being genetically and behaviorally distinct. Members of the complex are agricultural pests that cause direct damage by feeding on plants, and indirect damage by transmitting viruses that cause diseases leading to reduced crop yield and quality. In Pakistan, cotton leaf curl disease, caused by multiple begomovirus species, is the most economically important viral disease of cotton. In the study outlined here, the diversity and geographic distribution of B. tabaci cryptic species was investigated by analyzing a taxonomically informative fragment of the mitochondrial cytochrome c oxidase 1 gene (mtCOI-3'). The mtCOI-3' sequence was determined for 285 adult whiteflies and found to represent six cryptic species, the most numerous being Asia II-1 and Middle East Asia Minor 1 (MEAM-1), the latter also referred to as the B-biotype, which was previously thought to be confined to Sindh province but was herein also found to be present in the Punjab province. The endemic Asia I was restricted to Sindh province, while an individual belonging to Asia II-8 was identified in Pakistan for the first time. Also for the first time, samples were collected from northwestern Pakistan and Asia II-1 was identified there. Results indicate that in Pakistan the overall diversity of B. tabaci cryptic species is high and, based on comparisons with findings from previous studies, the distribution is dynamic.

  3. Fluorescent zinc–terpyridine complex containing coordinated ...

    Indian Academy of Sciences (India)

    Unknown

    Keywords. Zinc peroxo complex; terpyridine complexes; fluorescence ... structure determination 3. Zinc is an essential element for normal function of most .... 63 179; (d) De Silva A P, Gunaratna H Q N, Gunnlaugsson T, Huxley A J M, Mcloy C.

  4. Modulation of in vivo distribution through chelator: Synthesis and evaluation of a 2-nitroimidazole-dipicolylamine-(99m)Tc(CO)3 complex for detecting tumor hypoxia.

    Science.gov (United States)

    Mallia, Madhava B; Mittal, Sweety; Sarma, Haladhar D; Banerjee, Sharmila

    2016-01-01

    Previous studies have clearly demonstrated a strong correlation between the in vivo distribution and blood clearance of radiopharmaceuticals for the detection of hypoxia. The present study describes an attempt to improve the in vivo distribution of a previously reported 2-nitroimidazole-(99m)Tc(CO)3 complex by tuning its blood clearance pattern through structural modification of the ligand. Herein, a 2-nitroimidazole-dipicolylamine ligand (2-nitroimidazole-DPA) was synthesized in a two-step procedure and radiolabeled with the (99m)Tc(CO)3 core. Subsequently, the complex was evaluated in Swiss mice bearing fibrosarcoma tumors. As intended by its design, the 2-nitroimidazole-DPA-(99m)Tc(CO)3 complex was more lipophilic than the previously reported 2-nitroimidazole-DETA-(99m)Tc(CO)3 complex (DETA: diethylenetriamine) and showed slower blood clearance. Consequently, it showed higher tumor uptake than the 2-nitroimidazole-DETA-(99m)Tc(CO)3 complex. Significantly, despite the structural modifications, other parameters such as the tumor to blood ratio and tumor to muscle ratio of the 2-nitroimidazole-DPA-(99m)Tc(CO)3 complex remained comparable to those of the 2-nitroimidazole-DETA-(99m)Tc(CO)3 complex. The present study demonstrates the feasibility of structural modification for improving the in vivo tumor uptake of hypoxia-detecting radiopharmaceuticals. This might encourage researchers to improve suboptimal properties of a potential radiopharmaceutical rather than ignoring it altogether.

  5. Local charge nonequilibrium and anomalous energy dependence of normalized moments in narrow rapidity windows

    International Nuclear Information System (INIS)

    Wu Yuanfang; Liu Lianshou

    1990-01-01

    From the study of even and odd multiplicity distributions for hadron-hadron collisions in different rapidity windows, we propose a simple picture for charge correlation with nonzero correlation length and calculate the multiplicity distributions and the normalized moments in different rapidity windows at different energies. The results explain the experimentally observed coincidence and separation of even and odd distributions and also the anomalous energy dependence of normalized moments in narrow rapidity windows. The reason for the separation of even-odd distributions, appearing first at large multiplicities, is shown to be energy conservation. The special role of no-particle events in narrow rapidity windows is also pointed out.
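
    The normalized moments referred to above are, in standard notation, C_q = ⟨n^q⟩/⟨n⟩^q computed from a multiplicity distribution P(n). As a minimal illustration of the computation only (the Poisson toy distribution below is an assumption, not the paper's charge-correlation model):

```python
import math

def normalized_moments(pn, qmax=4):
    """C_q = <n^q> / <n>^q for a multiplicity distribution P(n), q = 2..qmax."""
    ns = range(len(pn))
    mean = sum(n * p for n, p in zip(ns, pn))
    return [sum((n ** q) * p for n, p in zip(ns, pn)) / mean ** q
            for q in range(2, qmax + 1)]

# Toy example: Poisson multiplicity distribution with <n> = 5
lam = 5.0
pn = [math.exp(-lam) * lam ** n / math.factorial(n) for n in range(60)]
C = normalized_moments(pn)  # [C_2, C_3, C_4]
```

    For a Poisson distribution the moments are known in closed form (⟨n²⟩ = λ² + λ), which makes the sketch easy to check by hand.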

  6. Unit Root Testing and Estimation in Nonlinear ESTAR Models with Normal and Non-Normal Errors.

    Directory of Open Access Journals (Sweden)

    Umair Khalil

    Full Text Available Exponential Smooth Transition Autoregressive (ESTAR) models can capture nonlinear adjustment of deviations from equilibrium conditions, which may explain the economic behavior of many variables that appear nonstationary from a linear viewpoint. Many researchers employ the Kapetanios test, which has a unit root as the null and a stationary nonlinear model as the alternative. However, this test statistic is based on the assumption of normally distributed errors in the DGP. Cook analyzed the size of this nonlinear unit root test in the presence of a heavy-tailed innovation process and obtained critical values for both the finite variance and infinite variance cases; however, Cook's test statistics are oversized. Researchers have found that using conventional tests is dangerous, though the best performance among these is achieved by a heteroscedasticity-consistent covariance matrix estimator (HCCME). The oversizing of LM tests can be reduced by employing fixed-design wild bootstrap remedies, which provide a valuable alternative to the conventional tests. In this paper, the size of the Kapetanios test statistic employing heteroscedasticity-consistent covariance matrices is derived and results are reported for various sample sizes in which the size distortion is reduced. The properties of estimates of ESTAR models are investigated when errors are assumed non-normal. We compare the results obtained through nonlinear least squares fitting with those of quantile regression fitting in the presence of outliers, with the error distribution taken to be a t-distribution, for various sample sizes.
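
    A fixed-design wild bootstrap of the kind mentioned above can be sketched as follows. This is a minimal illustration, not the paper's code: the auxiliary regression dy_t = delta * y_{t-1}^3 + e_t is the simplest Kapetanios-type form (no deterministic terms), and Rademacher weights are one common choice of wild-bootstrap multiplier.

```python
import numpy as np

rng = np.random.default_rng(0)

def kss_tstat(y):
    """t-statistic of delta in dy_t = delta * y_{t-1}^3 + e_t."""
    dy, x = np.diff(y), y[:-1] ** 3
    delta = (x @ dy) / (x @ x)
    resid = dy - delta * x
    s2 = resid @ resid / (len(dy) - 1)
    return delta / np.sqrt(s2 / (x @ x)), resid

def wild_bootstrap_pvalue(y, n_boot=499):
    """Fixed-design wild bootstrap: keep the regressor fixed, flip residual
    signs with Rademacher weights, and rebuild dy* under the unit-root null."""
    t_obs, resid = kss_tstat(y)
    x = y[:-1] ** 3
    count = 0
    for _ in range(n_boot):
        dy_star = resid * rng.choice([-1.0, 1.0], size=resid.size)
        delta_s = (x @ dy_star) / (x @ x)
        r_s = dy_star - delta_s * x
        s2_s = r_s @ r_s / (len(dy_star) - 1)
        count += delta_s / np.sqrt(s2_s / (x @ x)) <= t_obs  # left-tailed
    return t_obs, (1 + count) / (1 + n_boot)

y = np.cumsum(rng.standard_normal(200))  # random walk: the null is true
t_obs, pval = wild_bootstrap_pvalue(y)
```

    Because the bootstrap multipliers preserve each residual's magnitude, the resampled statistic inherits the sample's heteroscedasticity pattern, which is the point of the wild bootstrap.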

  7. Competition between clonal plasma cells and normal cells for potentially overlapping bone marrow niches is associated with a progressively altered cellular distribution in MGUS vs myeloma.

    Science.gov (United States)

    Paiva, B; Pérez-Andrés, M; Vídriales, M-B; Almeida, J; de las Heras, N; Mateos, M-V; López-Corral, L; Gutiérrez, N C; Blanco, J; Oriol, A; Hernández, M T; de Arriba, F; de Coca, A G; Terol, M-J; de la Rubia, J; González, Y; Martín, A; Sureda, A; Schmidt-Hieber, M; Schmitz, A; Johnsen, H E; Lahuerta, J-J; Bladé, J; San-Miguel, J F; Orfao, A

    2011-04-01

    Disappearance of normal bone marrow (BM) plasma cells (PC) predicts malignant transformation of monoclonal gammopathy of undetermined significance (MGUS) and smoldering myeloma (SMM) into symptomatic multiple myeloma (MM). The homing, behavior and survival of normal PC, but also CD34(+) hematopoietic stem cells (HSC), B-cell precursors, and clonal PC largely depends on their interaction with stromal cell-derived factor-1 (SDF-1) expressing, potentially overlapping BM stromal cell niches. Here, we investigate the distribution, phenotypic characteristics and competitive migration capacity of these cell populations in patients with MGUS, SMM and MM vs healthy adults (HA) aged >60 years. Our results show that BM and peripheral blood (PB) clonal PC progressively increase from MGUS to MM, the latter showing a slightly more immature immunophenotype. Of note, such increased number of clonal PC is associated with progressive depletion of normal PC, B-cell precursors and CD34(+) HSC in the BM, also with a parallel increase in PB. In an ex vivo model, normal PC, B-cell precursors and CD34(+) HSC from MGUS and SMM, but not MM patients, were able to abrogate the migration of clonal PC into serial concentrations of SDF-1. Overall, our results show that progressive competition and replacement of normal BM cells by clonal PC is associated with more advanced disease in patients with MGUS, SMM and MM.

  8. A closer look at the effect of preliminary goodness-of-fit testing for normality for the one-sample t-test.

    Science.gov (United States)

    Rochon, Justine; Kieser, Meinhard

    2011-11-01

    Student's one-sample t-test is a commonly used method when inference about the population mean is made. As advocated in textbooks and articles, the assumption of normality is often checked by a preliminary goodness-of-fit (GOF) test. In a paper recently published by Schucany and Ng it was shown that, for the uniform distribution, screening of samples by a pretest for normality leads to a more conservative conditional Type I error rate than application of the one-sample t-test without a preliminary GOF test. In contrast, for the exponential distribution, the conditional level is even more elevated than the Type I error rate of the t-test without a pretest. We examine the reasons behind these characteristics. In a simulation study, samples drawn from the exponential, lognormal, uniform, Student's t-distribution with 2 degrees of freedom (t(2)) and the standard normal distribution that had passed normality screening, as well as the ingredients of the test statistics calculated from these samples, are investigated. For non-normal distributions, we found that preliminary testing for normality may change the distribution of the means and standard deviations of the selected samples as well as the correlation between them (if the underlying distribution is non-symmetric), thus leading to altered distributions of the resulting test statistics. It is shown that for skewed distributions the excess in Type I error rate may be even more pronounced when testing one-sided hypotheses.
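
    The conditional Type I error rate discussed above can be estimated with a small simulation: draw samples under a true null from a non-normal distribution, keep only those that pass a Shapiro-Wilk pretest, and record how often the t-test then rejects. A minimal sketch, not the authors' study design (the sample size, alpha level and exponential choice are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, alpha, n_sim = 10, 0.05, 2000
passed = rejected = 0
for _ in range(n_sim):
    x = rng.exponential(scale=1.0, size=n)   # true mean = 1, so H0 is true
    if stats.shapiro(x).pvalue > alpha:      # sample "looks normal": keep it
        passed += 1
        if stats.ttest_1samp(x, popmean=1.0).pvalue < alpha:
            rejected += 1
conditional_type1 = rejected / passed        # Type I rate among screened samples
```

    Comparing `conditional_type1` with the unconditional rejection rate (obtained by dropping the pretest) reproduces the effect the abstract analyses.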

  9. Normal venous anatomy and physiology of the lower extremity.

    Science.gov (United States)

    Notowitz, L B

    1993-06-01

    Venous disease of the lower extremities is common but often misunderstood. The focus tends to be on the exciting world of arterial anatomy and pathology, while venous anatomy and pathology come in second place. However, venous diseases such as chronic venous insufficiency, leg ulcers, and varicose veins affect much of the population and may lead to disability and death. Nurses are often required to answer complex questions from patients and their families about the patient's disease. Patients depend on nurses to provide accurate information in terms they can understand. It is therefore important to understand the normal venous system of the legs before one can understand the complexities of venous diseases and treatments. This article presents an overview of normal venous anatomy and physiology.

  10. A Post-Truncation Parameterization of Truncated Normal Technical Inefficiency

    OpenAIRE

    Christine Amsler; Peter Schmidt; Wen-Jen Tsay

    2013-01-01

    In this paper we consider a stochastic frontier model in which the distribution of technical inefficiency is truncated normal. In standard notation, technical inefficiency u is distributed as N^+ (μ,σ^2). This distribution is affected by some environmental variables z that may or may not affect the level of the frontier but that do affect the shortfall of output from the frontier. We will distinguish the pre-truncation mean (μ) and variance (σ^2) from the post-truncation mean μ_*=E(u) and var...
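
    For a lower-truncated normal N^+(μ, σ²), the post-truncation mean and variance referred to above have closed forms via the inverse Mills ratio: with a = μ/σ and λ(a) = φ(a)/Φ(a), E(u) = μ + σλ(a) and Var(u) = σ²[1 − λ(a)(λ(a) + a)]. A small sketch of these standard formulas (checked against the half-normal special case μ = 0):

```python
from math import sqrt, pi, exp, erf

def phi(z):  # standard normal pdf
    return exp(-z * z / 2) / sqrt(2 * pi)

def Phi(z):  # standard normal cdf
    return 0.5 * (1 + erf(z / sqrt(2)))

def post_truncation_moments(mu, sigma):
    """Mean and variance of u ~ N+(mu, sigma^2), i.e. N(mu, sigma^2)
    truncated to u > 0."""
    a = mu / sigma
    lam = phi(a) / Phi(a)                    # inverse Mills ratio
    mean = mu + sigma * lam
    var = sigma ** 2 * (1 - lam * (lam + a))
    return mean, var

# mu = 0 reduces to the half-normal: mean = sigma*sqrt(2/pi), var = sigma^2*(1-2/pi)
m, v = post_truncation_moments(mu=0.0, sigma=1.0)
```

    This is exactly the pre- vs post-truncation distinction the abstract draws: (μ, σ²) parameterize the underlying normal, while (m, v) are the moments of the truncated inefficiency term.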

  11. Are your covariates under control? How normalization can re-introduce covariate effects.

    Science.gov (United States)

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

    Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
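
    The two orders of operations compared above can be reproduced in a few lines. This is a minimal sketch, not the authors' simulation code: `rank_int` uses the common Blom offset, and the skewed, heteroscedastic toy data-generating process is an assumption chosen so that residuals retain higher-order dependence on the covariate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def rank_int(x, c=3.0 / 8):
    """Rank-based inverse normal transformation (Blom offset c = 3/8)."""
    r = stats.rankdata(x)
    return stats.norm.ppf((r - c) / (len(x) - 2 * c + 1))

n = 500
covariate = rng.standard_normal(n)
# Skewed, heteroscedastic noise keeps higher-order dependence on the covariate
noise = (1.0 + 0.5 * (covariate > 0)) * (rng.exponential(size=n) - 1.0)
y = 0.5 * covariate + noise

# Order 1 (problematic): regress out the covariate, then INT the residuals
slope, intercept = np.polyfit(covariate, y, 1)
resid = y - (slope * covariate + intercept)
r_raw = np.corrcoef(resid, covariate)[0, 1]               # ~0 by construction
r_reintroduced = np.corrcoef(rank_int(resid), covariate)[0, 1]

# Order 2 (recommended): INT the dependent variable first, then adjust
y_int = rank_int(y)
```

    OLS residuals are exactly linearly uncorrelated with the covariate, but the nonlinear INT of those residuals need not be; comparing `r_raw` with `r_reintroduced` makes the abstract's point concrete.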

  12. Quasi-normal modes from non-commutative matrix dynamics

    Science.gov (United States)

    Aprile, Francesco; Sanfilippo, Francesco

    2017-09-01

    We explore similarities between the process of relaxation in the BMN matrix model and the physics of black holes in AdS/CFT. Focusing on Dyson-fluid solutions of the matrix model, we perform numerical simulations of the real time dynamics of the system. By quenching the equilibrium distribution we study quasi-normal oscillations of scalar single trace observables, we isolate the lowest quasi-normal mode, and we determine its frequencies as a function of the energy. Considering the BMN matrix model as a truncation of N=4 SYM, we also compute the frequencies of the quasi-normal modes of the dual scalar fields in the AdS5-Schwarzschild background. We compare the results, and we find a surprising similarity.

  13. Persistent homology of complex networks

    International Nuclear Information System (INIS)

    Horak, Danijela; Maletić, Slobodan; Rajković, Milan

    2009-01-01

    Long-lived topological features are distinguished from short-lived ones (considered as topological noise) in simplicial complexes constructed from complex networks. A new topological invariant, persistent homology, is determined and presented as a parameterized version of a Betti number. Complex networks with distinct degree distributions exhibit distinct persistent topological features. Persistent topological attributes, shown to be related to the robust quality of networks, also reflect the deficiency in certain connectivity properties of networks. Random networks, networks with exponential connectivity distribution and scale-free networks were considered for homological persistency analysis

  14. Configuration Entropy Calculations for Complex Compounds Technetium

    International Nuclear Information System (INIS)

    Muhayatun; Susanto Imam Rahayu; Surdia, N.M.; Abdul Mutalib

    2002-01-01

    Recently, the study of technetium complexes has been rapidly increasing, due to the benefit of 99m Tc complexes (one of the Tc nuclear isomers), which are widely used for diagnostics. Study of the structure-stability relationship of Tc complexes based on solid angles has been done by Kung using the Solid Angle Factor Sum (SAS). The SAS is hypothesized to be related to stability, and has been used by several researchers for synthesis, for designing the reaction route of Tc complex formation, and for predicting the geometry of complex structures. Although the advantages of the SAS are gratifying, the model lacks a theoretical basis able to explain the correlation of steric parameters with the physicochemical properties of complexes, especially those connected to a complex's stability. To improve the SAS model, in this research the model was modified by providing a theoretical basis for the SAS. The results obtained from correlating the SAS value with the thermodynamic stability parameters of simple complexes show the values to have a similar trend as the standard entropy (S0). The entropy approximation model was created by involving some factors which are not used in Kung's model. Entropy optimization with respect to the bond length (M-L) has also been done for several complexes. The calculations of the SAS value using the calculated R for more than 100 Tc complexes provide a normalized mean value of 0.8545 ± 0.0851 and curve profiles similar to those of Kung's model. The entropy value can be obtained by multiplying the natural logarithm of the a priori degeneracy of a certain distribution (Ω) by the Boltzmann constant. The values of Ω and ln Ω for the Tc complexes span a narrow range. The results of this research provide a basic concept for the SAS to explain the structure-stability relationship and to improve Kung's model. (author)
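
    The entropy relation used above, S = k_B ln Ω, can be illustrated with the multinomial degeneracy Ω = N!/∏ n_i! of an occupation distribution. This is a generic statistical-mechanics sketch, not the paper's Tc-specific calculation:

```python
from math import lgamma, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def ln_degeneracy(occupations):
    """ln(Omega) for Omega = N! / prod(n_i!), via log-gamma for stability."""
    n_total = sum(occupations)
    return lgamma(n_total + 1) - sum(lgamma(n + 1) for n in occupations)

def boltzmann_entropy(occupations):
    """S = k_B * ln(Omega)."""
    return K_B * ln_degeneracy(occupations)

# Toy example: 4 particles over two states, occupations (2, 2) -> Omega = 6
lnO = ln_degeneracy([2, 2])
S = boltzmann_entropy([2, 2])
```

    Using `lgamma` instead of explicit factorials keeps the computation stable for large occupation numbers.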

  15. Feasibility of quantification of the distribution of blood flow in the normal human fetal circulation using CMR: a cross-sectional study.

    Science.gov (United States)

    Seed, Mike; van Amerom, Joshua F P; Yoo, Shi-Joon; Al Nafisi, Bahiyah; Grosse-Wortmann, Lars; Jaeggi, Edgar; Jansz, Michael S; Macgowan, Christopher K

    2012-11-26

    We present the first phase contrast (PC) cardiovascular magnetic resonance (CMR) measurements of the distribution of blood flow in twelve late gestation human fetuses. These were obtained using a retrospective gating technique known as metric optimised gating (MOG). A validation experiment was performed in five adult volunteers where conventional cardiac gating was compared with MOG. Linear regression and Bland Altman plots were used to compare MOG with the gold standard of conventional gating. Measurements using MOG were then made in twelve normal fetuses at a median gestational age of 37 weeks (range 30-39 weeks). Flow was measured in the major fetal vessels and indexed to the fetal weight. There was good correlation between the conventional gated and MOG measurements in the adult validation experiment (R=0.96). Mean flows in ml/min/kg with standard deviations in the major fetal vessels were as follows: combined ventricular output (CVO) 540 ± 101, main pulmonary artery (MPA) 327 ± 68, ascending aorta (AAo) 198 ± 38, superior vena cava (SVC) 147 ± 46, ductus arteriosus (DA) 220 ± 39, pulmonary blood flow (PBF) 106 ± 59, descending aorta (DAo) 273 ± 85, umbilical vein (UV) 160 ± 62, foramen ovale (FO) 107 ± 54. Results expressed as mean percentages of the CVO with standard deviations were as follows: MPA 60 ± 4, AAo 37 ± 4, SVC 28 ± 7, DA 41 ± 8, PBF 19 ± 10, DAo 50 ± 12, UV 30 ± 9, FO 21 ± 12. This study demonstrates how PC CMR with MOG is a feasible technique for measuring the distribution of the normal human fetal circulation in late pregnancy. Our preliminary results are in keeping with findings from previous experimental work in fetal lambs.
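
    The percentage figures can be cross-checked arithmetically from the mean flows. Note that the reported percentages are means of per-fetus percentages, so ratios of the mean flows only approximate them:

```python
# Mean flows in ml/min/kg from the Results above
flows = {"MPA": 327, "AAo": 198, "SVC": 147, "DA": 220,
         "PBF": 106, "DAo": 273, "UV": 160, "FO": 107}
cvo = 540  # combined ventricular output, ml/min/kg

# Each vessel's flow as a rounded percentage of the CVO
percent_of_cvo = {k: round(100 * v / cvo) for k, v in flows.items()}
```

    For example, 198/540 gives 37 % for the ascending aorta, matching the reported AAo figure, while other vessels land within a point or two of the reported means of per-fetus percentages.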

  16. Feasibility of quantification of the distribution of blood flow in the normal human fetal circulation using CMR: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Seed Mike

    2012-11-01

    Full Text Available Abstract Background We present the first phase contrast (PC) cardiovascular magnetic resonance (CMR) measurements of the distribution of blood flow in twelve late gestation human fetuses. These were obtained using a retrospective gating technique known as metric optimised gating (MOG). Methods A validation experiment was performed in five adult volunteers where conventional cardiac gating was compared with MOG. Linear regression and Bland Altman plots were used to compare MOG with the gold standard of conventional gating. Measurements using MOG were then made in twelve normal fetuses at a median gestational age of 37 weeks (range 30–39 weeks). Flow was measured in the major fetal vessels and indexed to the fetal weight. Results There was good correlation between the conventional gated and MOG measurements in the adult validation experiment (R=0.96). Mean flows in ml/min/kg with standard deviations in the major fetal vessels were as follows: combined ventricular output (CVO) 540±101, main pulmonary artery (MPA) 327±68, ascending aorta (AAo) 198±38, superior vena cava (SVC) 147±46, ductus arteriosus (DA) 220±39, pulmonary blood flow (PBF) 106±59, descending aorta (DAo) 273±85, umbilical vein (UV) 160±62, foramen ovale (FO) 107±54. Results expressed as mean percentages of the CVO with standard deviations were as follows: MPA 60±4, AAo 37±4, SVC 28±7, DA 41±8, PBF 19±10, DAo 50±12, UV 30±9, FO 21±12. Conclusion This study demonstrates how PC CMR with MOG is a feasible technique for measuring the distribution of the normal human fetal circulation in late pregnancy. Our preliminary results are in keeping with findings from previous experimental work in fetal lambs.

  17. Evaluation of distribution patterns and decision of distribution coefficients of trace elements in high-purity aluminium by INAA

    International Nuclear Information System (INIS)

    Hayakawa, Yasuhiro; Suzuki, Shogo; Hirai, Shoji

    1986-01-01

    Recently, high-purity aluminium has been used in semiconductor devices and elsewhere, so it is required that trace impurities be reduced and that their content be quantitatively evaluated. In this study, the distribution patterns of many trace impurities in 99.999 % aluminium ingots, purified using a normal freezing method, were evaluated by INAA. The effective distribution coefficient k for each detected element was calculated using the theoretical distribution equation for the normal freezing method. As a result, the elements with k < 1 included La, Sm, U and Th, while the element with k > 1 was Hf. La, Sm, U and Th could be effectively purified, but Sc and Hf could be scarcely purified. Furthermore, it was found that slower freezing gave an effective distribution coefficient closer to the equilibrium distribution coefficient, and that the effective distribution coefficient became smaller with larger atomic radius. (author)
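
    The theoretical distribution equation for normal freezing referred to above is commonly written (Pfann) as C_s(g) = k C0 (1 − g)^(k−1), the impurity concentration in the solid at solidified fraction g. A small sketch showing why an impurity with k well below 1 segregates strongly into the last-frozen fraction while k near 1 barely purifies (the k values are illustrative, not the measured ones):

```python
def normal_freezing_profile(k, c0, fractions):
    """Impurity concentration in the solid, C_s(g) = k*C0*(1-g)**(k-1),
    for the normal freezing (Pfann) model with effective coefficient k."""
    return [k * c0 * (1 - g) ** (k - 1) for g in fractions]

gs = [0.0, 0.25, 0.5, 0.75]                              # solidified fraction
purified = normal_freezing_profile(k=0.05, c0=1.0, fractions=gs)  # k << 1
stubborn = normal_freezing_profile(k=0.9, c0=1.0, fractions=gs)   # k near 1
```

    At g = 0 the solid starts at k·C0, so a small k immediately yields a much purer first-frozen region; as k approaches 1 the profile flattens toward the original concentration, which is why Sc and Hf were scarcely purified.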

  18. Normal modes of vibration in nickel

    Energy Technology Data Exchange (ETDEWEB)

    Birgeneau, R J [Yale Univ., New Haven, Connecticut (United States); Cordes, J [Cambridge Univ., Cambridge (United Kingdom); Dolling, G; Woods, A D B

    1964-07-01

    The frequency-wave-vector dispersion relation, ν(q), for the normal vibrations of a nickel single crystal at 296 °K has been measured for the [ζ00], [ζζ0], [ζζζ], and [0ζ1] symmetric directions using inelastic neutron scattering. The results can be described in terms of the Born-von Karman theory of lattice dynamics with interactions out to fourth-nearest neighbors. The shapes of the dispersion curves are very similar to those of copper, the normal mode frequencies in nickel being about 1.24 times the corresponding frequencies in copper. The fourth-neighbor model was used to calculate the frequency distribution function g(ν) and related thermodynamic properties. (author)

  19. Is Middle-Upper Arm Circumference “normally” distributed? Secondary data analysis of 852 nutrition surveys

    Directory of Open Access Journals (Sweden)

    Severine Frison

    2016-05-01

    Full Text Available Abstract Background Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. Methods This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise “non-normal” distributions. Results The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro–Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D’Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe–Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were “normalised” and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality, with 57 % of distributions approximating “normal” after transformation. Applying Box-Cox transformation after Spline or LOESS smoothing techniques increased that proportion to 82.4 and 82.7 %, respectively.

  20. CASTOR: Normal-mode analysis of resistive MHD plasmas

    NARCIS (Netherlands)

    Kerner, W.; Goedbloed, J. P.; Huysmans, G. T. A.; Poedts, S.; Schwarz, E.

    1998-01-01

    The CASTOR (complex Alfven spectrum of toroidal plasmas) code computes the entire spectrum of normal-modes in resistive MHD for general tokamak configurations. The applied Galerkin method, in conjunction with a Fourier finite-element discretisation, leads to a large scale generalized eigenvalue problem A x = λ B x.

  1. Unifying distribution functions: some lesser known distributions.

    Science.gov (United States)

    Moya-Cessa, J R; Moya-Cessa, H; Berriel-Valdos, L R; Aguilar-Loreto, O; Barberis-Blostein, P

    2008-08-01

    We show that there is a way to unify distribution functions that describe simultaneously a classical signal in space and (spatial) frequency and position and momentum for a quantum system. Probably the most well known of them is the Wigner distribution function. We show how to unify functions of the Cohen class, Rihaczek's complex energy function, and Husimi and Glauber-Sudarshan distribution functions. We do this by showing how they may be obtained from ordered forms of creation and annihilation operators and by obtaining them in terms of expectation values in different eigenbases.

  2. On Introducing Asymmetry into Circular Distributions

    Directory of Open Access Journals (Sweden)

    Dale Umbach

    2012-07-01

    Full Text Available We give a brief history of the results which led to the introduction of asymmetry into symmetric circular distributions. This is followed by the presentation of another method of introducing asymmetry. Some properties of the induced distributions are studied. Finally, this new distribution is shown to be a reasonable fit to the Jander ant data as presented in Fisher (1993).

  3. Exponential model normalization for electrical capacitance tomography with external electrodes under gap permittivity conditions

    International Nuclear Information System (INIS)

    Baidillah, Marlin R; Takei, Masahiro

    2017-01-01

    A nonlinear normalization model, called the exponential model, for electrical capacitance tomography (ECT) with external electrodes under gap permittivity conditions has been developed. The exponential normalization model is proposed based on the inherently nonlinear relationship between the mixture permittivity and the measured capacitance due to the gap permittivity of the inner wall. The parameters of the exponential equation are derived by fitting an exponential curve to simulation results, and a scaling function is added to adjust for the experimental system conditions. The exponential model normalization was applied to two-dimensional low and high contrast dielectric distribution phantoms in both simulation and experimental studies. The proposed normalization model has been compared with other normalization models, i.e., the Parallel, Series, Maxwell and Böttcher models. Based on the comparison of image reconstruction results, the exponential model reliably predicts the nonlinear normalization of the measured capacitance for both low and high contrast dielectric distributions. (paper)
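
    For contrast, the conventional parallel model normalizes the measured capacitance linearly between the empty and full calibration values, while an exponential model does so nonlinearly. The exponential form below is an illustrative assumption only (the paper's fitted parameters and scaling function are not reproduced):

```python
import math

def parallel_normalization(cm, cl, ch):
    """Conventional linear (parallel) model: lambda = (Cm-Cl)/(Ch-Cl)."""
    return (cm - cl) / (ch - cl)

def exponential_normalization(cm, cl, ch):
    """Illustrative exponential model, assuming C(lambda) = Cl*(Ch/Cl)**lambda,
    which inverts to lambda = ln(Cm/Cl) / ln(Ch/Cl)."""
    return math.log(cm / cl) / math.log(ch / cl)

cl, ch = 1.0, 4.0   # calibration capacitances for empty / full pipe (pF)
cm = 2.0            # a measured capacitance between the two
lin = parallel_normalization(cm, cl, ch)
expo = exponential_normalization(cm, cl, ch)
```

    Both models map Cl to 0 and Ch to 1, but they disagree in between, which is precisely where the choice of normalization model affects the reconstructed permittivity.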

  4. Approach of the value of an annuity when non-central moments of the capitalization factor are known: an R application with interest rates following normal and beta distributions

    Directory of Open Access Journals (Sweden)

    Salvador Cruz Rambaud

    2015-07-01

    Full Text Available This paper proposes an expression of the value of an annuity with payments of 1 unit each when the interest rate is random. In order to attain this objective, we proceed on the assumption that the non-central moments of the capitalization factor are known. Specifically, to calculate the value of these annuities, we propose two different expressions. First, we suppose that the random interest rate is normally distributed; then, we assume that it follows the beta distribution. A practical application of these two methodologies is also implemented using the R statistical software.
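
    Alongside the moment-based expressions, the expected value of such an annuity can be checked by simulation: discount each unit payment by the running product of random capitalization factors. A minimal sketch with normally distributed interest rates (the rate parameters are illustrative; the paper's R implementation is not reproduced):

```python
import random

random.seed(0)

def annuity_value_mc(n_payments, mean_rate, sd_rate, n_sims=20000):
    """Monte Carlo estimate of E[a(n)] = E[sum_t prod_{s<=t} (1+i_s)^-1]
    for unit payments, with i.i.d. normally distributed interest rates."""
    total = 0.0
    for _ in range(n_sims):
        value, discount = 0.0, 1.0
        for _ in range(n_payments):
            i = random.gauss(mean_rate, sd_rate)
            discount /= 1.0 + i       # running capitalization-factor product
            value += discount
        total += value
    return total / n_sims

# 10 unit payments, i ~ N(3 %, 0.5 %): close to the deterministic a(10, 3 %) = 8.53
a10 = annuity_value_mc(10, 0.03, 0.005)
```

    With a small rate variance the estimate sits just above the deterministic annuity value, reflecting the convexity of the discount factor in the interest rate.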

  5. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    OpenAIRE

    Yang, Shan; Tong, Xiangqian

    2016-01-01

    Power flow calculation and short circuit calculation are the basis of theoretical research for distribution networks with inverter based distributed generation. The similarity of the equivalent model for inverter based distributed generation during normal and fault conditions of the distribution network, and the differences between power flow and short circuit calculation, are analyzed in this paper. Then an integrated power flow and short circuit calculation method for distribution networks with inverter based distributed generation is proposed.

  6. Normal human bone marrow and its variations in MRI

    International Nuclear Information System (INIS)

    Vahlensieck, M.; Schmidt, H.M.

    2000-01-01

    The physiology and age-dependent changes of human bone marrow are described. The resulting normal distribution patterns of active and inactive bone marrow, including the various contrasts on different MR sequences, are discussed. (orig.) [de

  7. Effects of the Distributions of Energy or Charge Transfer Rates on Spectral Hole Burning in Pigment-Protein Complexes at Low Temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Herascu, N.; Ahmouda, S.; Picorel, R.; Seibert, M.; Jankowiak, R.; Zazubovich, V.

    2011-12-22

    Effects of the distributions of excitation energy transfer (EET) rates (homogeneous line widths) on the nonphotochemical (resonant) spectral hole burning (SHB) processes in photosynthetic chlorophyll-protein complexes (reaction center [RC] and CP43 antenna of Photosystem II from spinach) are considered. It is demonstrated that inclusion of such a distribution results in somewhat more dispersive hole burning kinetics. More importantly, however, inclusion of the EET rate distributions strongly affects the dependence of the hole width on the fractional hole depth. Different types of line width distributions have been explored, including those resulting from Foerster type EET between weakly interacting pigments as well as Gaussian ones, which may be a reasonable approximation for those resulting, for instance, from so-called extended Foerster models. For Gaussian line width distributions, it is possible to determine the parameters of both line width and tunneling parameter distributions from SHB data without a priori knowledge of any of them. Concerning more realistic asymmetric distributions, we demonstrate, using the simple example of CP43 antenna, that one can use SHB modeling to estimate electrostatic couplings between pigments and support or exclude assignment of certain pigment(s) to a particular state.

  8. Back to Normal! Gaussianizing posterior distributions for cosmological probes

    Science.gov (United States)

    Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.

    2014-05-01

    We present a method to map multivariate non-Gaussian posterior probability densities into Gaussian ones via nonlinear Box-Cox transformations, and generalizations thereof. This is analogous to the search for normal parameters in the CMB, but can in principle be applied to any probability density that is continuous and unimodal. The search for the optimally Gaussianizing transformation amongst the Box-Cox family is performed via a maximum likelihood formalism. We can judge the quality of the found transformation a posteriori: qualitatively via statistical tests of Gaussianity, and more illustratively by how well it reproduces the credible regions. The method permits an analytical reconstruction of the posterior from a sample, e.g. a Markov chain, and simplifies the subsequent joint analysis with other experiments. Furthermore, it permits the characterization of a non-Gaussian posterior in a compact and efficient way. The expression for the non-Gaussian posterior can be employed to find analytic formulae for the Bayesian evidence, and consequently be used for model comparison.
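The Box-Cox search described above can be sketched in a few lines: `scipy.stats.boxcox` selects the transformation exponent by exactly the maximum-likelihood criterion the abstract mentions. The log-normal "posterior sample" below is an illustrative stand-in for a real Markov chain, not data from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Skewed, unimodal stand-in for a posterior sample: 5000 log-normal draws.
sample = rng.lognormal(mean=0.0, sigma=0.5, size=5000)

# boxcox with lmbda=None finds the exponent maximizing the Gaussian
# log-likelihood of the transformed data -- the MLE formalism above.
transformed, lam = stats.boxcox(sample)

# For log-normal input the optimal exponent is close to 0 (the pure log
# transform), and the transformed sample should look Gaussian.
print(f"lambda = {lam:.3f}")
stat, pvalue = stats.normaltest(transformed)
print(f"normality-test p-value = {pvalue:.3f}")
```

A statistical test of Gaussianity on the transformed sample, as suggested in the abstract, then serves as the a posteriori quality check.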

  9. A Novel Low-Overhead Recovery Approach for Distributed Systems

    Directory of Open Access Journals (Sweden)

    B. Gupta

    2009-01-01

We have addressed the complex problem of recovery from concurrent failures in a distributed computing environment. We have proposed a new approach that deals effectively with both orphan and lost messages. The proposed checkpointing and recovery approaches enable each process to restart from its most recent checkpoint and hence guarantee the least amount of recomputation after recovery. It also means that a process needs to save only its most recent local checkpoint. In this regard, we have introduced two new ideas. First, the proposed value of the common checkpointing interval enables an initiator process to log the minimum number of messages sent by each application process. Second, the determination of the lost messages is always done a priori by an initiator process; moreover, it is done while the normal distributed application is running. This is quite meaningful because it does not delay the recovery approach in any way.
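The a priori determination of lost messages can be illustrated with a toy sketch: an initiator compares each process's send log against the message identifiers every receiver had delivered before its latest checkpoint. All names and data structures here are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of initiator-side lost-message detection.
def lost_messages(send_logs, recv_logs):
    """Return messages that were sent but never delivered.

    send_logs: {sender: [(receiver, msg_id), ...]}
    recv_logs: {receiver: {msg_id, ...}} -- ids delivered before the
               receiver's most recent checkpoint.
    """
    lost = []
    for sender, sends in send_logs.items():
        for receiver, msg_id in sends:
            if msg_id not in recv_logs.get(receiver, set()):
                lost.append((sender, receiver, msg_id))
    return lost

send_logs = {"P1": [("P2", 1), ("P2", 2)], "P2": [("P1", 7)]}
recv_logs = {"P2": {1}, "P1": {7}}
print(lost_messages(send_logs, recv_logs))  # -> [('P1', 'P2', 2)]
```

Because the comparison uses only logs gathered during normal operation, it can run before any failure occurs, which is the point the abstract makes about not delaying recovery.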

  10. Distribution of the population for configurations nn, nnn and nnnn of a complex formed by a divalent impurity (M2+) and a vacancy in LiCl and KCl

    International Nuclear Information System (INIS)

    Cardenas-Garcia, D.; Lopez-Tellez, E.R.

    1992-01-01

The distributions of the populations for different configurations of complexes in alkali halides are calculated. It is found that for LiCl the main configuration up to room temperature is nn. For KCl, on the other hand, the nn, nnn and nnnn configurations are equally important at room temperature. Consequently, this should be taken into account when making polarization-energy calculations for the complexes. Graphs showing the distribution of the population and expressions for some thermodynamic relations are included. (Author)
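Configuration populations of this kind are typically Boltzmann-weighted over the available sites. A minimal sketch, using hypothetical binding energies and site multiplicities for the nn, nnn and nnnn impurity-vacancy configurations (the actual values are material-specific and are not given in the abstract):

```python
import math

def populations(energies_eV, degeneracies, T_kelvin):
    """Boltzmann populations p_i proportional to g_i * exp(-E_i / kT)."""
    k_eV = 8.617333262e-5  # Boltzmann constant in eV/K
    w = [g * math.exp(-E / (k_eV * T_kelvin))
         for E, g in zip(energies_eV, degeneracies)]
    Z = sum(w)
    return [x / Z for x in w]

# Illustrative energies (eV, relative to nn) and multiplicities only.
E = [0.00, 0.05, 0.08]   # nn, nnn, nnnn
g = [12, 6, 24]
for T in (100, 300, 600):
    print(T, [round(p, 3) for p in populations(E, g, T)])
```

With these toy numbers the nn configuration dominates at low temperature while the higher-energy configurations gain weight as T rises, which is the qualitative LiCl-versus-KCl contrast the abstract describes.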

  11. Prenatal ultrasonographic findings of multicystic dysplastic kidney: Emphasis on cyst distribution

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Min Hoan; Cho, Jeong Yeon [Samsung Cheil Hospital, Sungkunkwan University school of Medicine, Seoul (Korea, Republic of)

    2003-09-15

To characterize the ultrasonographic findings of multicystic dysplastic kidney on prenatal ultrasonography (US), with special emphasis on the distribution of cysts. From January 1998 to March 2003, the medical records of sixty-two subjects with multicystic dysplastic kidney diagnosed on prenatal US were retrospectively reviewed, and forty-three patients confirmed either by pathology or by postnatal follow-up US were selected for this study. US assessment included the time of diagnosis, laterality, sizes of the multicystic dysplastic and contralateral normal kidneys, distribution of cysts, and associated anomalies. The distribution of cysts was categorized as subcapsular or random, and interobserver agreement was determined using cross-table analysis. The largest longitudinal diameters of the multicystic dysplastic and contralateral normal kidneys were measured, and the data were plotted on a normal reference chart. The multicystic dysplastic kidney was left-sided in 55.8%, right-sided in 34.8%, and bilateral in 9.3%. Subcapsular distribution of cysts was observed in 68.2% (n=15) of cases by radiologist 1 and in 59.1% (n=13) by radiologist 2, showing excellent interobserver agreement (κ=0.697). The longitudinal diameter of the multicystic dysplastic kidney was above the 95th percentile in 68% of cases, whereas the diameter of the contralateral normal kidney was more commonly normal (70%). Fetal karyotyping was performed in 18 cases, including 2 cases with associated major anomalies, and all karyotypes were normal. On prenatal US, subcapsular distribution of cysts in multicystic dysplastic kidney is more common than random distribution. This characteristic distribution of cysts may be helpful in the prenatal diagnosis of multicystic dysplastic kidney.
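The quoted interobserver agreement is a Cohen's kappa computed from a cross table of the two readers' calls. A small self-contained sketch, with made-up counts chosen only so that the result lands near the reported κ=0.697:

```python
def cohens_kappa(table):
    """Cohen's kappa from a square cross table (list of rows):
    rows = rater 1 categories, columns = rater 2 categories."""
    n = sum(sum(row) for row in table)
    # Observed agreement: the diagonal.
    po = sum(table[i][i] for i in range(len(table))) / n
    # Chance agreement: product of marginal proportions.
    pe = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (po - pe) / (1 - pe)

# Illustrative 2x2 table (subcapsular vs. random calls by two readers);
# these counts are hypothetical, not the study's raw data.
table = [[13, 2],
         [1, 6]]
print(round(cohens_kappa(table), 3))  # -> 0.697
```

Kappa discounts the agreement expected by chance from the marginals, which is why it is preferred over raw percent agreement for reader studies like this one.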

  12. Prenatal ultrasonographic findings of multicystic dysplastic kidney: Emphasis on cyst distribution

    International Nuclear Information System (INIS)

    Moon, Min Hoan; Cho, Jeong Yeon

    2003-01-01

To characterize the ultrasonographic findings of multicystic dysplastic kidney on prenatal ultrasonography (US), with special emphasis on the distribution of cysts. From January 1998 to March 2003, the medical records of sixty-two subjects with multicystic dysplastic kidney diagnosed on prenatal US were retrospectively reviewed, and forty-three patients confirmed either by pathology or by postnatal follow-up US were selected for this study. US assessment included the time of diagnosis, laterality, sizes of the multicystic dysplastic and contralateral normal kidneys, distribution of cysts, and associated anomalies. The distribution of cysts was categorized as subcapsular or random, and interobserver agreement was determined using cross-table analysis. The largest longitudinal diameters of the multicystic dysplastic and contralateral normal kidneys were measured, and the data were plotted on a normal reference chart. The multicystic dysplastic kidney was left-sided in 55.8%, right-sided in 34.8%, and bilateral in 9.3%. Subcapsular distribution of cysts was observed in 68.2% (n=15) of cases by radiologist 1 and in 59.1% (n=13) by radiologist 2, showing excellent interobserver agreement (κ=0.697). The longitudinal diameter of the multicystic dysplastic kidney was above the 95th percentile in 68% of cases, whereas the diameter of the contralateral normal kidney was more commonly normal (70%). Fetal karyotyping was performed in 18 cases, including 2 cases with associated major anomalies, and all karyotypes were normal. On prenatal US, subcapsular distribution of cysts in multicystic dysplastic kidney is more common than random distribution. This characteristic distribution of cysts may be helpful in the prenatal diagnosis of multicystic dysplastic kidney.

  13. Three-dimensional finite analysis of acetabular contact pressure and contact area during normal walking.

    Science.gov (United States)

    Wang, Guangye; Huang, Wenjun; Song, Qi; Liang, Jinfeng

    2017-11-01

This study analyzes the contact areas and pressure distributions between the femoral head and the acetabulum during normal walking using a three-dimensional finite element model (3D-FEM). Computed tomography (CT) scanning and a computer image-processing system were used to establish the 3D-FEM. The acetabular model was used to simulate the pressures during 32 consecutive phases of normal walking, and the contact areas at the different phases were calculated. The distribution of the pressure peak values over the 32 consecutive walking phases was bimodal, reaching its maximum (4.2 MPa) at the initial phase, where the contact area was significantly larger than at the stepping phase. The sites that remained in contact throughout were concentrated on the acetabular roof and leaned inwards, while the anterior and posterior acetabular horns showed no pressure concentration. The pressure distributions on the acetabular cartilage differed significantly between phases: the zone of increased pressure at the support phase was distributed over the acetabular roof, while that at the stepping phase was distributed on the inside of the acetabular cartilage. The zones of increased contact pressure and the distributions of acetabular contact areas are of clinical significance and may indicate inductive factors of acetabular osteoarthritis. Copyright © 2016. Published by Elsevier Taiwan.

  14. Reliability assessment of complex mechatronic systems using a modified nonparametric belief propagation algorithm

    International Nuclear Information System (INIS)

    Zhong, X.; Ichchou, M.; Saidi, A.

    2010-01-01

Various parametric skewed distributions are widely used to model time-to-failure (TTF) in the reliability analysis of mechatronic systems, where many items are unobservable due to the high cost of testing. Estimating the parameters of those distributions therefore becomes a challenge. Previous research has failed to consider this problem because of the difficulty of dependency modeling. Recently, the methodology of Bayesian networks (BNs) has contributed greatly to the reliability analysis of complex systems. In this paper, the problem of system reliability assessment (SRA) is formulated as a BN that accounts for parameter uncertainty. As the quantitative specification of the BN, a normal distribution representing the stochastic nature of the TTF distribution is learned to capture the interactions between the basic items and their output items. Approximate inference in our continuous BN model is performed by a modified version of nonparametric belief propagation (NBP), which avoids using a junction tree, inefficient in the mechatronic case because of the large treewidth. After reasoning, we obtain the marginal posterior density of each TTF model parameter. Information from diverse sources and expert priors can easily be incorporated into this SRA model to achieve more accurate results. Simulations in simple and complex cases of mechatronic systems demonstrate that the posterior of the parameter network fits the data well and that uncertainty propagates effectively through our BN-based SRA model using the modified NBP.
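The full BN/NBP machinery is beyond a short example, but the core inference step, obtaining a posterior density over a TTF model parameter from a handful of observed failures, can be sketched with a simple grid-based Bayesian update. The Weibull model, flat prior, and all numbers below are illustrative assumptions, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)
true_scale = 100.0
# A few observed failure times from a Weibull TTF model (shape k=2 known).
ttf = true_scale * rng.weibull(2.0, size=8)

# Grid posterior over the Weibull scale parameter, flat prior -- a toy
# stand-in for the belief-propagation inference described above.
scales = np.linspace(20, 300, 1000)
k = 2.0
# Weibull log-density: log(k/s) + (k-1)*log(t/s) - (t/s)^k, summed over data.
loglik = sum(
    np.log(k / scales) + (k - 1) * np.log(t / scales) - (t / scales) ** k
    for t in ttf
)
post = np.exp(loglik - loglik.max())
post /= post.sum()

mean_scale = float(np.sum(scales * post))
print(f"posterior mean scale = {mean_scale:.1f} (true {true_scale})")
```

A marginal posterior like `post` is exactly the kind of quantity the paper's NBP procedure produces per parameter, and an informative expert prior would simply multiply the likelihood before normalization.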

  15. Estrogen receptor binding radiopharmaceuticals: II. Tissue distribution of 17 α-methylestradiol in normal and tumor-bearing rats

    International Nuclear Information System (INIS)

    Feenstra, A.; Vaalburg, W.; Nolten, G.M.J.; Reiffers, S.; Talma, A.G.; Wiegman, T.; van der Molen, H.D.; Woldring, M.G.

    1983-01-01

Tritiated 17α-methylestradiol was synthesized to investigate the potential of the carbon-11-labeled analog as an estrogen-receptor-binding radiopharmaceutical. In vitro, 17α-methylestradiol is bound with high affinity to the cytoplasmic estrogen receptor from rabbit uterus (Kd = 1.96 × 10⁻¹⁰ M), and it sediments as an 8S hormone-receptor complex in sucrose gradients. The compound shows specific uptake in the uterus of the adult rat within 1 h after injection. In female rats bearing DMBA-induced tumors, specific uterine and tumor uptakes were observed, although at 30 min the tumor uptake was only 23 to 30% of the uptake in the uterus. Tritiated 17α-methylestradiol with a specific activity of 6 Ci/mmol showed a similar tissue distribution. Our results indicate that 17α-methylestradiol is promising as an estrogen-receptor-binding radiopharmaceutical.
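Given the reported dissociation constant, the equilibrium fraction of receptor occupied at a free ligand concentration [L] follows the standard single-site relation [L]/(Kd + [L]). A quick check with the abstract's Kd:

```python
def fraction_bound(ligand_M, kd_M):
    """Equilibrium fractional receptor occupancy: [L] / (Kd + [L])."""
    return ligand_M / (kd_M + ligand_M)

kd = 1.96e-10  # reported Kd for 17alpha-methylestradiol, mol/L
for L in (1e-11, 1.96e-10, 1e-9):
    print(f"[L] = {L:.2e} M -> {fraction_bound(L, kd):.1%} bound")
```

At [L] = Kd the receptor is exactly half-occupied, which is why a sub-nanomolar Kd is what makes tracer-level imaging concentrations feasible.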

  16. Predicting distribution of Aedes aegypti and Culex pipiens complex, potential vectors of Rift Valley fever virus in relation to disease epidemics in East Africa

    Directory of Open Access Journals (Sweden)

    Clement Nyamunura Mweya

    2013-10-01

Background: The East African region has experienced several Rift Valley fever (RVF) outbreaks since the 1930s. The objective of this study was to identify the distributions of potential disease vectors in relation to disease epidemics. Understanding the potential distributions of disease vectors is a major concern for disease transmission dynamics. Methods: Diverse ecological niche modelling techniques have been developed for this purpose; we present a maximum entropy (Maxent) approach for estimating distributions of potential RVF vectors in unsampled areas of East Africa. We modelled the distributions of two mosquito taxa (Aedes aegypti and the Culex pipiens complex) responsible for potential maintenance and amplification of the virus, respectively. Predicted distributions of environmentally suitable areas in East Africa were based on presence-only occurrence data derived from our entomological study in Ngorongoro District in northern Tanzania. Results: Our model predicted potentially suitable areas with high success rates of 90.9% for A. aegypti and 91.6% for the C. pipiens complex. Model performance was statistically significantly better than random for both species. The most suitable sites for the two vectors were predicted in central and northwestern Tanzania, areas with previous disease epidemics. Other important risk areas include western Lake Victoria, the northern parts of Lake Malawi, and the Rift Valley region of Kenya. Conclusion: The findings of this study show that the distributions of the vectors have biological and epidemiological significance in relation to disease outbreak hotspots, and hence provide guidance for the selection of sampling areas for RVF vectors during inter-epidemic periods.

  17. Predicting distribution of Aedes aegypti and Culex pipiens complex, potential vectors of Rift Valley fever virus in relation to disease epidemics in East Africa.

    Science.gov (United States)

    Mweya, Clement Nyamunura; Kimera, Sharadhuli Iddi; Kija, John Bukombe; Mboera, Leonard E G

    2013-01-01

The East African region has experienced several Rift Valley fever (RVF) outbreaks since the 1930s. The objective of this study was to identify the distributions of potential disease vectors in relation to disease epidemics. Understanding the potential distributions of disease vectors is a major concern for disease transmission dynamics. Diverse ecological niche modelling techniques have been developed for this purpose; we present a maximum entropy (Maxent) approach for estimating distributions of potential RVF vectors in unsampled areas of East Africa. We modelled the distributions of two mosquito taxa (Aedes aegypti and the Culex pipiens complex) responsible for potential maintenance and amplification of the virus, respectively. Predicted distributions of environmentally suitable areas in East Africa were based on presence-only occurrence data derived from our entomological study in Ngorongoro District in northern Tanzania. Our model predicted potentially suitable areas with high success rates of 90.9% for A. aegypti and 91.6% for the C. pipiens complex. Model performance was statistically significantly better than random for both species. The most suitable sites for the two vectors were predicted in central and northwestern Tanzania, areas with previous disease epidemics. Other important risk areas include western Lake Victoria, the northern parts of Lake Malawi, and the Rift Valley region of Kenya. The findings of this study show that the distributions of the vectors have biological and epidemiological significance in relation to disease outbreak hotspots, and hence provide guidance for the selection of sampling areas for RVF vectors during inter-epidemic periods.
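Maxent presence-background modelling over simple features is closely related to logistic regression of presence points against background points. The toy sketch below fits such a model by plain gradient descent on one made-up environmental covariate; it is a conceptual illustration only, not the Maxent software or this study's covariates:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy presence-only setup with one hypothetical covariate (temperature):
# presence points cluster near 25, background points are drawn uniformly.
presence = rng.normal(25.0, 2.0, size=200)
background = rng.uniform(0.0, 40.0, size=2000)

# Fit logistic regression of presence vs. background on features
# [1, x, x^2] by full-batch gradient descent on the log-loss.
x = np.concatenate([presence, background]) / 40.0   # scale to ~[0, 1]
y = np.concatenate([np.ones(200), np.zeros(2000)])
F = np.column_stack([np.ones_like(x), x, x ** 2])

w = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-F @ w))
    w -= 0.5 * F.T @ (p - y) / len(y)   # gradient step

def suitability(temp):
    z = temp / 40.0
    s = w[0] + w[1] * z + w[2] * z ** 2
    return 1.0 / (1.0 + np.exp(-s))

print(round(float(suitability(25.0)), 3), round(float(suitability(5.0)), 3))
```

The fitted relative suitability is higher where presence points concentrate, which is the same presence-versus-background logic that underlies the study's Maxent predictions over real environmental layers.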

  18. Uptake, distribution, and velocity of organically complexed plutonium in corn (Zea mays).

    Science.gov (United States)

    Thompson, Shannon W; Molz, Fred J; Fjeld, Robert A; Kaplan, Daniel I

    2012-10-01

Lysimeter experiments and associated simulations suggested that Pu moved into and through plants that invaded field lysimeters during an 11-year study at the Savannah River Site. However, the probable plant uptake and transport mechanisms were not well defined, so more detailed study was needed. Therefore, experiments were performed to examine the movement, distribution, and velocity of soluble, complexed Pu in corn. Corn was grown and exposed to Pu using a "long root" system in which the primary root extended through a soil pot and into a hydroponic container. To maintain solubility, Pu was complexed with the bacterial siderophore DFOB (desferrioxamine B) or the chelating agent DTPA (diethylenetriaminepentaacetic acid). Corn plants were exposed to nutrient solutions containing Pu for periods of 10 min to 10 d. Analysis of root and shoot tissues permitted concentration measurements and calculation of uptake velocity and Pu retardation in corn. Results showed that, depending on exposure time, 98.3-95.9% of the Pu entering the plant was retained in the roots external to the xylem, and 1.7-4.1% entered the shoots (the shoot fraction increased with exposure time). Corn Pu uptake was 2-4 times greater as Pu(DFOB) than as Pu₂(DTPA)₃. The Pu(DFOB) solution entered the root xylem and moved upward at 1.74 m h⁻¹ or greater, more than a million times faster than the downward movement of Pu(III/IV) through soil during the lysimeter study. The Pu(DFOB) xylem retardation factor was estimated to be 3.7-11, allowing rapid upward Pu transport and potential environmental release. Copyright © 2012 Elsevier Ltd. All rights reserved.
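A retardation factor of this kind is the ratio of the carrier (sap) velocity to the solute velocity. With the reported Pu(DFOB) xylem velocity, a back-of-envelope check shows which sap velocities would bracket the reported range of 3.7-11; the sap velocities below are hypothetical, chosen only for illustration:

```python
# Standard definition: R = v_carrier / v_solute.
def retardation(v_sap, v_solute):
    return v_sap / v_solute

v_pu = 1.74  # observed Pu(DFOB) xylem velocity, m/h (from the abstract)
for v_sap in (6.4, 19.1):  # hypothetical sap velocities, m/h
    print(round(retardation(v_sap, v_pu), 1))
```

A small R means the complexed Pu moves nearly as fast as the sap itself, which is why the authors flag rapid upward transport as an environmental-release pathway.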

  19. Normal zone soliton in large composite superconductors

    International Nuclear Information System (INIS)

    Kupferman, R.; Mints, R.G.; Ben-Jacob, E.

    1992-01-01

The study of normal zones of finite size (normal domains) in superconductors has continuously been a subject of interest in the field of applied superconductivity. It was shown that in homogeneous superconductors normal domains are always unstable: if a normal domain nucleates, it will either expand or shrink. While testing the stability of large cryostable composite superconductors, a new phenomenon was found, the existence of stable propagating normal solitons. The formation of these propagating domains was shown to result from the high Joule power generated in the superconductor during the relatively long process of current redistribution between the superconductor and the stabilizer. Theoretical studies were performed to investigate the propagation of normal domains in large composite superconductors in the cryostable regime. Huang and Eyssa performed numerical calculations simulating the diffusion of heat and the current redistribution in the conductor, and showed the existence of stable propagating normal domains. They compared the velocity of normal domain propagation with the experimental data, obtaining reasonable agreement. Dresner presented an analytical method to solve this problem when the time dependence of the Joule power is given, and performed explicit calculations of the normal domain velocity assuming that the Joule power decays exponentially during the process of current redistribution. In this paper, the authors propose a system of two one-dimensional diffusion equations describing the dynamics of the temperature and current density distributions along the conductor. Numerical simulations of the equations reconfirm the existence of propagating domains in the cryostable regime, while an analytical investigation supplies an explicit formula for the velocity of the normal domain.
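The paper's two coupled equations are not reproduced in the abstract, but the qualitative behavior, a normal front travelling at constant velocity, already appears in a single reaction-diffusion equation with a bistable source term standing in for Joule heating versus cooling. The sketch below is that generic toy model (explicit finite differences), not the authors' two-equation system:

```python
import numpy as np

# Toy 1-D model: u_t = u_xx + f(u), u in [0, 1], where u = 1 is the normal
# state and u = 0 the superconducting state. The bistable source f mimics
# net Joule heating above a critical temperature (parameter a is made up).
n, dx, dt = 400, 0.5, 0.05        # dt/dx^2 = 0.2 keeps the scheme stable
u = np.zeros(n)
u[:40] = 1.0                      # a normal zone seeded at one end

def f(u, a=0.3):
    return u * (1.0 - u) * (u - a)

for _ in range(4000):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    lap[0] = lap[-1] = 0.0        # crude insulated ends
    u = u + dt * (lap + f(u))

# First grid point still superconducting marks the front position.
front = int(np.argmax(u < 0.5))
print("front position:", front)
```

With a < 1/2 the normal state invades the superconducting one at a constant speed, the same travelling-front picture behind the propagating normal domains discussed above; the paper's analytical velocity formula is the analogue of the exact front speed known for this bistable equation.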

  20. The self-normalized Donsker theorem revisited

    OpenAIRE

    Parczewski, Peter

    2016-01-01

We extend the Poincaré-Borel lemma to a weak approximation of a Brownian motion via simple functionals of uniform distributions on n-spheres in the Skorokhod space D([0,1]). This approach is used to simplify the proof of the self-normalized Donsker theorem in Csörgő et al. (2003). Some notes on spheres with respect to ℓp-norms are given.