Secure bindings of SAML assertions to TLS sessions
DEFF Research Database (Denmark)
Kohlar, Florian; Schwenk, Jörg; Jensen, Meiko
2010-01-01
In recent research work, two approaches to protect SAML-based Federated Identity Management (FIM) against man-in-the-middle attacks have been proposed. One approach is to bind the SAML assertion and the SAML artifact to the public key contained in a TLS client certificate. Another approach is to strengthen the Same Origin Policy of the browser by taking into account the security guarantees TLS gives. In this paper, we present a third approach which is of further interest beyond IDM protocols: we bind the SAML assertion to the TLS session that has been agreed upon between client and the service...
Indian Academy of Sciences (India)
An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and the DeMoivre-Laplace theorem ...
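The binomial-to-normal approximation mentioned above can be sketched in a few lines. The function name and the example values below are illustrative, not from the article; the continuity correction (+0.5) is the standard refinement of the DeMoivre-Laplace approximation.

```python
import math

def binom_cdf_normal_approx(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), via the DeMoivre-Laplace
    normal approximation with a continuity correction."""
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    z = (k + 0.5 - mu) / sigma
    # Standard normal CDF expressed through the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Example: P(X <= 55) for n = 100 tosses of a fair coin
approx = binom_cdf_normal_approx(55, 100, 0.5)
```

For n = 100, p = 0.5 the approximation is within a few tenths of a percent of the exact binomial CDF.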
P. W. Lund's collection - surprising and overlooked finds
DEFF Research Database (Denmark)
Hansen, Kasper Lykke
2010-01-01
It came as a great surprise when, during my recent review of P. W. Lund's subfossil collection at the Zoological Museum, I discovered five old and dusty drawers well hidden at the bottom of a cabinet. The drawers were filled with bones and teeth from crocodiles, amphibians and snakes - animals that normally do not...
A locally adaptive normal distribution
DEFF Research Database (Denmark)
Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren
2016-01-01
The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest to replace this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density. The resulting locally adaptive normal distribution (LAND) is the maximum entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models...
Understanding a Normal Distribution of Data.
Maltenfort, Mitchell G
2015-12-01
Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?
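The question posed above, "when can we assume that a data set follows this distribution?", is commonly answered with a formal normality test. As a minimal sketch (the Shapiro-Wilk test via SciPy; the simulated data sets are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
normal_data = rng.normal(loc=10.0, scale=2.0, size=200)   # truly normal
skewed_data = rng.lognormal(mean=0.0, sigma=1.0, size=200)  # clearly non-normal

# Shapiro-Wilk: the null hypothesis is that the sample came from a normal
# distribution, so a small p-value is evidence of non-normality.
_, p_normal = stats.shapiro(normal_data)
_, p_skewed = stats.shapiro(skewed_data)
```

For the log-normal sample the p-value is essentially zero, flagging the data as non-normal; for genuinely normal data the test usually does not reject.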
Quantiles for Finite Mixtures of Normal Distributions
Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.
2006-01-01
Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
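Unlike a single normal, a finite mixture of normals has no closed-form quantile function, so quantiles are found by numerically inverting the mixture CDF. A minimal sketch (bisection; not necessarily the authors' method):

```python
import math

def mixture_cdf(x, weights, means, sds):
    """CDF of a finite normal mixture: sum_i w_i * Phi((x - mu_i) / sigma_i)."""
    return sum(w * 0.5 * (1 + math.erf((x - m) / (s * math.sqrt(2))))
               for w, m, s in zip(weights, means, sds))

def mixture_quantile(q, weights, means, sds, lo=-50.0, hi=50.0, tol=1e-10):
    """Invert the mixture CDF by bisection; the CDF is monotone, so the
    bracketing interval shrinks to the unique quantile."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mixture_cdf(mid, weights, means, sds) < q:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Median of an equal-weight mixture of N(-1, 1) and N(1, 1) is 0 by symmetry
med = mixture_quantile(0.5, [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0])
```

This also makes the distinction in the abstract concrete: the mixture density above is a linear combination of normal densities, which is not the (normal) density of a linear combination of normal random variables.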
Externally studentized normal midrange distribution
Directory of Open Access Journals (Sweden)
Ben Dêivide de Oliveira Batista
ABSTRACT: The distribution of the externally studentized midrange was created based on the original studentization procedures of Student and was inspired by the distribution of the externally studentized range. The wide use of the externally studentized range in multiple comparisons was also a motivation for developing this new distribution. This work aimed to derive analytic equations for the distribution of the externally studentized midrange, obtaining the cumulative distribution, probability density and quantile functions, and generating random values. This is a new distribution for which the authors could not find any report in the literature. A second objective was to build an R package that obtains the probability density, cumulative distribution and quantile functions numerically, and to make it available to the scientific community. The algorithms were proposed and implemented using Gauss-Legendre quadrature and the Newton-Raphson method in the R software, resulting in the SMR package, available for download from the CRAN site. The implemented routines showed high accuracy, verified by Monte Carlo simulations and by comparing results for different numbers of quadrature points. Regarding the precision of the quantiles for cases where the degrees of freedom are close to 1 and the percentiles are close to 100%, it is recommended to use more than 64 quadrature points.
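The Monte Carlo check mentioned in the abstract can be sketched directly from the definition assumed here: the externally studentized midrange is the midrange of n standard normals divided by an independent scale estimate sqrt(chi²_nu / nu). This is a simulation sketch, not the SMR package's quadrature method:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_studentized_midrange(n, nu, size):
    """Monte Carlo draws of the externally studentized midrange:
    midrange of n standard normals over an independent
    sqrt(chi^2_nu / nu) scale estimate."""
    x = rng.standard_normal((size, n))
    midrange = 0.5 * (x.max(axis=1) + x.min(axis=1))
    s = np.sqrt(rng.chisquare(nu, size) / nu)
    return midrange / s

draws = sample_studentized_midrange(n=5, nu=10, size=100_000)
```

By symmetry of the normal sample, the distribution is symmetric about zero, which such draws can be used to verify against the analytic CDF.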
Using SAML for Attribution, Delegation and Least Privilege
Directory of Open Access Journals (Sweden)
Coimbatore S. Chandersekaran
2011-02-01
Delegation, attribution and least privilege are an implicit part of information sharing. In operating systems like Windows there is no security enforcement for code running in kernel mode, and therefore such code always runs with maximum privileges. The principle of least privilege therefore demands the use of a user mode solution when given the choice between a kernel mode and a user mode solution that provide the same results. Discussions in this paper are restricted to OSI model levels five and above. This paper describes the SAML delegation framework in the context of a large enclave-based architecture currently being implemented by the US Air Force. Benefits of the framework include increased flexibility to handle a number of different delegation business scenarios, decreased complexity of the solution, and greater accountability with only a modest amount of additional infrastructure required.
A New Distribution: Random Limit Normal Distribution
Gong, Xiaolin; Yang, Shuzhen
2013-01-01
This paper introduces a new distribution to improve tail risk modeling. Based on the classical normal distribution, we define a new distribution by a series of heat equations. Then, we use market data to verify our model.
Sampling from the normal and exponential distributions
International Nuclear Information System (INIS)
Chaplin, K.R.; Wills, C.A.
1982-01-01
Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
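The acceptance-rejection idea described above can be illustrated with the classic textbook scheme for the normal distribution (an Exp(1) envelope for |Z| plus a random sign). This is a sketch of the general technique, not the report's subregion decomposition or its FORTRAN procedures:

```python
import math
import random

random.seed(42)

def normal_via_rejection():
    """Sample a standard normal by acceptance-rejection:
    draw |Z| candidates from an Exp(1) envelope, accept with
    probability exp(-(y - 1)^2 / 2), then attach a random sign."""
    while True:
        y = random.expovariate(1.0)      # candidate from the envelope
        u = random.random()
        if u <= math.exp(-0.5 * (y - 1.0) ** 2):
            return y if random.random() < 0.5 else -y

samples = [normal_via_rejection() for _ in range(50_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The accepted values have density proportional to exp(-y) * exp(-(y-1)²/2) = const * exp(-y²/2), i.e. exactly half-normal, so the signed output is standard normal.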
Mast cell distribution in normal adult skin
A.S. Janssens (Artiena Soe); R. Heide (Rogier); J.C. den Hollander (Jan); P.G.M. Mulder (P. G M); B. Tank (Bhupendra); A.P. Oranje (Arnold)
2005-01-01
AIMS: To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. METHODS: Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults.
Ventilation-perfusion distribution in normal subjects.
Beck, Kenneth C; Johnson, Bruce D; Olson, Thomas P; Wilson, Theodore A
2012-09-01
Functional values of LogSD of the ventilation distribution (σ(V)) have been reported previously, but functional values of LogSD of the perfusion distribution (σ(q)) and the coefficient of correlation between ventilation and perfusion (ρ) have not been measured in humans. Here, we report values for σ(V), σ(q), and ρ obtained from wash-in data for three gases, helium and two soluble gases, acetylene and dimethyl ether. Normal subjects inspired gas containing the test gases, and the concentrations of the gases at end-expiration during the first 10 breaths were measured with the subjects at rest and at increasing levels of exercise. The regional distribution of ventilation and perfusion was described by a bivariate log-normal distribution with parameters σ(V), σ(q), and ρ, and these parameters were evaluated by matching the values of expired gas concentrations calculated for this distribution to the measured values. Values of cardiac output and LogSD ventilation/perfusion (Va/Q) were obtained. At rest, σ(q) is high (1.08 ± 0.12). With the onset of exercise, σ(q) decreases to 0.85 ± 0.09 but remains higher than σ(V) (0.43 ± 0.09) at all exercise levels. ρ increases to 0.87 ± 0.07, and the value of LogSD Va/Q for light and moderate exercise is primarily the result of the difference between the magnitudes of σ(q) and σ(V). With known values for the parameters, the bivariate distribution describes the comprehensive distribution of ventilation and perfusion that underlies the distribution of the Va/Q ratio.
Mast cell distribution in normal adult skin.
Janssens, A S; Heide, R; den Hollander, J C; Mulder, P G M; Tank, B; Oranje, A P
2005-03-01
To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults. There was an uneven distribution of MCs in different body sites using the anti-tryptase monoclonal antibody technique. Numbers of MCs on the trunk, upper arm, and upper leg were similar, but were significantly different from those found on the lower leg and forearm. Two distinct groups were formed--proximal and distal. There were 77.0 MCs/mm2 at proximal body sites and 108.2 MCs/mm2 at distal sites. Adjusted for the adjacent diagnosis and age, this difference was consistent. The numbers of MCs in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders were not different from those in the control group. Differences in the numbers of MCs between the distal and the proximal body sites must be considered when MCs are counted for a reliable diagnosis of mastocytosis. A pilot study in patients with mastocytosis underlined the variation in the numbers of MCs in mastocytosis and normal skin, but showed a considerable overlap. The observed numbers of MCs in adults cannot be extrapolated to children. MC numbers varied significantly between proximal and distal body sites and these differences must be considered when MCs are counted for a reliable diagnosis of mastocytosis. There was a considerable overlap between the numbers of MCs in mastocytosis and normal skin.
Radiation distribution sensing with normal optical fiber
Energy Technology Data Exchange (ETDEWEB)
Kawarabayashi, Jun; Mizuno, Ryoji; Naka, Ryotaro; Uritani, Akira; Watanabe, Ken-ichi; Iguchi, Tetsuo [Nagoya Univ., Dept. of Nuclear Engineering, Nagoya, Aichi (Japan); Tsujimura, Norio [Japan Nuclear Cycle Development Inst., Tokai Works, Tokai, Ibaraki (Japan)
2002-12-01
The purpose of this study is to develop a radiation distribution monitor using a normal plastic optical fiber. The monitor has a long operating length (10 m-100 m) and can obtain continuous radiation distributions. The principle of position sensing is based on a time-of-flight technique. The characteristics of this monitor for beta particles, gamma rays and fast neutrons were obtained. The spatial resolutions for beta particles (⁹⁰Sr-⁹⁰Y), gamma rays (¹³⁷Cs) and D-T neutrons were 30 cm, 37 cm and 13 cm, respectively. The detection efficiencies for the beta rays, the gamma rays and the D-T neutrons were 0.11%, 1.6×10⁻⁵% and 5.4×10⁻⁴%, respectively. The effective attenuation length of the detection efficiency was 18 m. A new principle of position sensing based on spectroscopic analysis was also proposed. A preliminary test showed that the spectrum observed at the end of the fiber depended on the position of the irradiated point, which indicates that radiation distributions can be calculated from the spectrum by a mathematical deconvolution technique. (author)
Correlated random sampling for multivariate normal and log-normal distributions
International Nuclear Information System (INIS)
Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.
2012-01-01
A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
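One standard way to produce such correlated samples is the Cholesky construction: factor the covariance matrix, transform independent standard normals, and exponentiate for the log-normal case. This is a sketch of that common technique under illustrative parameters; the paper's own scheme may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(7)

mean = np.array([1.0, 2.0])
cov = np.array([[1.0, 0.8],
                [0.8, 2.0]])

# Correlated multivariate normal samples: if z has identity covariance,
# z @ chol.T has covariance chol @ chol.T = cov.
chol = np.linalg.cholesky(cov)
z = rng.standard_normal((100_000, 2))
normal_samples = mean + z @ chol.T

# Correlated log-normal samples: exponentiate the correlated normal vector
lognormal_samples = np.exp(normal_samples)

sample_cov = np.cov(normal_samples, rowvar=False)
```

The sample covariance of the normal draws reproduces the requested matrix to within Monte Carlo error; all log-normal draws are positive by construction.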
About normal distribution on SO(3) group in texture analysis
Savyolova, T. I.; Filatov, S. V.
2017-12-01
This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. Those NDs are: the Fisher normal distribution (FND), the Bunge normal distribution (BND), the central normal distribution (CND) and the wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. The CND is a subcase of the normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). The WND is motivated by the CLT in R³ and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.
Scale and shape mixtures of multivariate skew-normal distributions
Arellano-Valle, Reinaldo B.; Ferreira, Clécio S.; Genton, Marc G.
2018-01-01
We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down the theoretical foundations for subsequent inference with this model...
Determining Normal-Distribution Tolerance Bounds Graphically
Mezzacappa, M. A.
1983-01-01
The graphical method requires only a few calculations with simple equations, plus a table lookup. The distribution is established from only three points: the upper and lower confidence bounds of the mean and the lower confidence bound of the standard deviation. The graphical procedure establishes a best-fit line for the measured data and bounds for the selected confidence level and any distribution percentile.
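For comparison with the graphical procedure, the one-sided normal tolerance bound has an exact analytic counterpart via the noncentral t distribution. This sketch computes the tolerance factor k such that xbar + k*s covers at least a given fraction of the population with stated confidence (illustrative values, not from the tech brief):

```python
import math
from scipy import stats

def one_sided_tolerance_factor(n, coverage, confidence):
    """Exact one-sided normal tolerance factor k: with probability
    `confidence`, the bound xbar + k*s exceeds at least `coverage`
    of a normal population, for a sample of size n."""
    z_p = stats.norm.ppf(coverage)          # population quantile in SD units
    nc = z_p * math.sqrt(n)                 # noncentrality parameter
    return stats.nct.ppf(confidence, df=n - 1, nc=nc) / math.sqrt(n)

# Classic 95%-coverage / 95%-confidence factor for n = 20
k = one_sided_tolerance_factor(n=20, coverage=0.95, confidence=0.95)
```

Larger samples need smaller factors, since the mean and standard deviation are then better estimated.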
The exp-normal distribution is infinitely divisible
Pinelis, Iosif
2018-01-01
Let $Z$ be a standard normal random variable (r.v.). It is shown that the distribution of the r.v. $\ln|Z|$ is infinitely divisible; equivalently, the standard normal distribution considered as the distribution on the multiplicative group over $\mathbb{R}\setminus\{0\}$ is infinitely divisible.
DEFF Research Database (Denmark)
Sørensen, Lasse
2012-01-01
Jacobsen's unknown collection of stone artefacts from Greek antiquity: obsidian, axes, matrices, face stones and other rare antiquities...
Modified Normal Demand Distributions in (R,S)-Inventory Models
Strijbosch, L.W.G.; Moors, J.J.A.
2003-01-01
To model demand, the normal distribution is by far the most popular; the disadvantage that it takes negative values is taken for granted. This paper proposes two modifications of the normal distribution, both taking non-negative values only. Safety factors and order-up-to-levels for the familiar (R, S)...
Zimmerman, Donald W.
2011-01-01
This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…
Software reliability growth models with normal failure time distributions
International Nuclear Information System (INIS)
Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji
2013-01-01
This paper proposes software reliability growth models (SRGMs) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and has sufficient ability to fit software failure data. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. A numerical experiment investigates the fitting ability of the SRGMs with normal distribution using 16 types of failure time data collected in real software projects.
Normal distribution of standing balance for healthy Danish children
DEFF Research Database (Denmark)
Pedersen, Line Kjeldgaard; Ghasemi, Habib; Rahbek, Ole
2013-01-01
Title: Normal distribution of standing balance for healthy Danish children - reproducibility of parameters of balance. Authors: Line Kjeldgaard Pedersen, Habib Ghasemi, Ole Rahbek, Bjarne Møller-Madsen. Background: Pedobarographic measurements are increasingly used in child...
Application of a truncated normal failure distribution in reliability testing
Groves, C., Jr.
1968-01-01
Statistical truncated normal distribution function is applied as a time-to-failure distribution function in equipment reliability estimations. Age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
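A truncated normal time-to-failure model of the kind described can be sketched with SciPy's `truncnorm`, restricting an underlying normal to non-negative failure times. The parameter values below are illustrative, not the report's:

```python
from scipy import stats

# Underlying N(mu, sigma) restricted to t >= 0 (failure times cannot
# be negative); scipy expresses the bounds in standard units.
mu, sigma = 1000.0, 400.0
a = (0.0 - mu) / sigma
b = float("inf")
ttf = stats.truncnorm(a, b, loc=mu, scale=sigma)

# Truncation removes the negative tail and shifts the mean above mu
mean_ttf = ttf.mean()
p_negative = ttf.cdf(0.0)
```

The age-dependent behavior mentioned in the abstract is visible in the hazard of this distribution, which (unlike the exponential model) increases with time.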
Reliability assessment based on small samples of normal distribution
International Nuclear Information System (INIS)
Ma Zhibo; Zhu Jianshi; Xu Naixin
2003-01-01
When the pertinent parameter involved in the reliability definition follows a normal distribution, the conjugate prior of its distribution parameters (μ, h) is a normal-gamma distribution. With the help of the maximum entropy and moments-equivalence principles, the subjective information about the parameter and the sampling data of its independent variables are transformed into a Bayesian prior for (μ, h). The desired estimates are obtained from either the prior or the posterior, which is formed by combining the prior and the sampling data. Computing methods are described and examples are presented as demonstrations.
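The conjugate update for the normal-gamma prior has a standard closed form, sketched below with illustrative prior values and simulated data (the paper's maximum-entropy construction of the prior is not reproduced here):

```python
import numpy as np

def normal_gamma_update(mu0, kappa0, alpha0, beta0, data):
    """Conjugate Bayesian update for (mu, h), h the normal precision:
    NormalGamma(mu0, kappa0, alpha0, beta0) prior -> NormalGamma posterior,
    using the standard textbook formulas."""
    data = np.asarray(data, dtype=float)
    n = data.size
    xbar = data.mean()
    ss = ((data - xbar) ** 2).sum()
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    alpha_n = alpha0 + n / 2.0
    beta_n = beta0 + 0.5 * ss + kappa0 * n * (xbar - mu0) ** 2 / (2.0 * kappa_n)
    return mu_n, kappa_n, alpha_n, beta_n

rng = np.random.default_rng(3)
sample = rng.normal(5.0, 1.0, size=50)   # simulated observations
mu_n, kappa_n, alpha_n, beta_n = normal_gamma_update(0.0, 1.0, 1.0, 1.0, sample)
```

With 50 observations the posterior mean `mu_n` is pulled almost entirely toward the sample mean, as expected when the prior pseudo-count kappa0 is small.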
Scale and shape mixtures of multivariate skew-normal distributions
Arellano-Valle, Reinaldo B.
2018-02-26
We introduce a broad and flexible class of multivariate distributions obtained by both scale and shape mixtures of multivariate skew-normal distributions. We present the probabilistic properties of this family of distributions in detail and lay down the theoretical foundations for subsequent inference with this model. In particular, we study linear transformations, marginal distributions, selection representations, stochastic representations and hierarchical representations. We also describe an EM-type algorithm for maximum likelihood estimation of the parameters of the model and demonstrate its implementation on a wind dataset. Our family of multivariate distributions unifies and extends many existing models of the literature that can be seen as submodels of our proposal.
Multivariate stochastic simulation with subjective multivariate normal distributions
P. J. Ince; J. Buongiorno
1991-01-01
In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, most simulations have assumed that random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assessed multivariate normal...
Sketching Curves for Normal Distributions--Geometric Connections
Bosse, Michael J.
2006-01-01
Within statistics instruction, students are often requested to sketch the curve representing a normal distribution with a given mean and standard deviation. Unfortunately, these sketches are often notoriously imprecise. Poor sketches are usually the result of missing mathematical knowledge. This paper considers relationships which exist among…
Improved Root Normal Size Distributions for Liquid Atomization
2015-11-01
Confidence bounds for normal and lognormal distribution coefficients of variation
Steve Verrill
2003-01-01
This paper compares the so-called exact approach for obtaining confidence intervals on normal distribution coefficients of variation to approximate methods. Approximate approaches were found to perform less well than the exact approach for large coefficients of variation and small sample sizes. Web-based computer programs are described for calculating confidence...
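As a simple baseline against which such interval methods can be judged, a percentile-bootstrap confidence interval for the coefficient of variation takes only a few lines. This is a generic alternative sketch, not the paper's exact noncentral-t approach; the data are simulated with true CV = 0.10:

```python
import numpy as np

rng = np.random.default_rng(11)
x = rng.normal(50.0, 5.0, size=100)   # true coefficient of variation = 0.10

def bootstrap_cv_ci(data, n_boot=5000, level=0.95):
    """Percentile-bootstrap CI for the coefficient of variation s / xbar:
    resample with replacement, recompute the CV, take empirical quantiles."""
    idx = rng.integers(0, len(data), size=(n_boot, len(data)))
    boot = data[idx]
    cvs = boot.std(axis=1, ddof=1) / boot.mean(axis=1)
    lo, hi = np.quantile(cvs, [(1 - level) / 2, (1 + level) / 2])
    return float(lo), float(hi)

lo, hi = bootstrap_cv_ci(x)
```

Consistent with the paper's finding, simple methods like this degrade for large CVs and small samples, where the sampling distribution of s / xbar becomes strongly skewed.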
Evaluating Transfer Entropy for Normal and y-Order Normal Distributions
Czech Academy of Sciences Publication Activity Database
Hlaváčková-Schindler, Kateřina; Toulias, T. L.; Kitsos, C. P.
2016-01-01
Vol. 17, No. 5 (2016), pp. 1-20. ISSN 2231-0851. Institutional support: RVO:67985556. Keywords: transfer entropy; time series; Kullback-Leibler divergence; causality; generalized normal distribution. Subject RIV: BC - Control Systems Theory. http://library.utia.cas.cz/separaty/2016/AS/hlavackova-schindler-0461261.pdf
Percentile estimation using the normal and lognormal probability distribution
International Nuclear Information System (INIS)
Bement, T.R.
1980-01-01
Implicitly or explicitly percentile estimation is an important aspect of the analysis of aerial radiometric survey data. Standard deviation maps are produced for quadrangles which are surveyed as part of the National Uranium Resource Evaluation. These maps show where variables differ from their mean values by more than one, two or three standard deviations. Data may or may not be log-transformed prior to analysis. These maps have specific percentile interpretations only when proper distributional assumptions are met. Monte Carlo results are presented in this paper which show the consequences of estimating percentiles by: (1) assuming normality when the data are really from a lognormal distribution; and (2) assuming lognormality when the data are really from a normal distribution
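Consequence (1) above, estimating percentiles under a normal assumption when the data are really lognormal, is easy to demonstrate by simulation. A minimal sketch (illustrative parameters; the survey-specific Monte Carlo design of the paper is not reproduced):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# True model: lognormal with log-mean 0 and log-SD 1.
mu_log, sigma_log = 0.0, 1.0
true_p977 = float(stats.lognorm(s=sigma_log, scale=np.exp(mu_log)).ppf(0.9772))

x = rng.lognormal(mu_log, sigma_log, size=10_000)

# "Mean + 2 SD" percentile estimate under the (wrong) normal assumption
normal_estimate = float(x.mean() + 2.0 * x.std(ddof=1))
# Correct approach: work on the log scale, then transform back
logn_estimate = float(np.exp(np.log(x).mean() + 2.0 * np.log(x).std(ddof=1)))
```

The normal-assumption estimate falls well short of the true 97.7th percentile, while the log-scale estimate lands close to it, which is exactly the kind of misinterpretation the standard deviation maps are exposed to when the distributional assumption is wrong.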
Ludvík Friebel; Jana Friebelová
2006-01-01
This article deals with the approximation of an empirical distribution to the standard normal distribution using the Johnson transformation. This transformation enables us to approximate a wide spectrum of continuous distributions with a normal distribution. The estimation of the parameters of the transformation formulas is based on percentiles of the empirical distribution. Theoretical probability distribution functions are derived for the random variable obtained on the basis of the backward transformation of the standard normal ...
Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.
Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan
2016-02-01
This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approach in the literature, which only uses centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.
Distributive justice and cognitive enhancement in lower, normal intelligence.
Dunlop, Mikael; Savulescu, Julian
2014-01-01
There exists a significant disparity within society between individuals in terms of intelligence. While intelligence varies naturally throughout society, the extent to which this impacts on the life opportunities it affords to each individual is greatly undervalued. Intelligence appears to have a prominent effect over a broad range of social and economic life outcomes. Many key determinants of well-being correlate highly with the results of IQ tests, and other measures of intelligence, and an IQ of 75 is generally accepted as the most important threshold in modern life. The ability to enhance our cognitive capacities offers an exciting opportunity to correct disabling natural variation and inequality in intelligence. Pharmaceutical cognitive enhancers, such as modafinil and methylphenidate, have been shown to have the capacity to enhance cognition in normal, healthy individuals. Perhaps of most relevance is the presence of an 'inverted U effect' for most pharmaceutical cognitive enhancers, whereby the degree of enhancement increases as intelligence levels deviate further below the mean. Although enhancement, including cognitive enhancement, has been much debated recently, we argue that there are egalitarian reasons to enhance individuals with low but normal intelligence. Under egalitarianism, cognitive enhancement has the potential to reduce opportunity inequality and contribute to relative income and welfare equality in the lower, normal intelligence subgroup. Cognitive enhancement use is justifiable under prioritarianism through various means of distribution; selective access to the lower, normal intelligence subgroup, universal access, or paradoxically through access primarily to the average and above average intelligence subgroups. Similarly, an aggregate increase in social well-being is achieved through similar means of distribution under utilitarianism. In addition, the use of cognitive enhancement within the lower, normal intelligence subgroup negates, or at
Distribution of normal superficial ocular vessels in digital images.
Banaee, Touka; Ehsaei, Asieh; Pourreza, Hamidreza; Khajedaluee, Mohammad; Abrishami, Mojtaba; Basiri, Mohsen; Daneshvar Kakhki, Ramin; Pourreza, Reza
2014-02-01
To investigate the distribution of different-sized vessels in the digital images of the ocular surface, an endeavor which may provide useful information for future studies. This study included 295 healthy individuals. From each participant, four digital photographs of the superior and inferior conjunctivae of both eyes, with a fixed succession of photography (right upper, right lower, left upper, left lower), were taken with a slit lamp mounted camera. Photographs were then analyzed by a previously described algorithm for vessel detection in the digital images. The area (of the image) occupied by vessels (AOV) of different sizes was measured. Height, weight, fasting blood sugar (FBS) and hemoglobin levels were also measured and the relationship between these parameters and the AOV was investigated. These findings indicated a statistically significant difference in the distribution of the AOV among the four conjunctival areas. No significant correlations were noted between the AOV of each conjunctival area and the different demographic and biometric factors. Medium-sized vessels were the most abundant vessels in the photographs of the four investigated conjunctival areas. The AOV of the different sizes of vessels follows a normal distribution curve in the four areas of the conjunctiva. The distribution of the vessels in successive photographs changes in a specific manner, with the mean AOV becoming larger as the photos were taken from the right upper to the left lower area. The AOV of vessel sizes has a normal distribution curve and medium-sized vessels occupy the largest area of the photograph.
The normal distribution of thoracoabdominal aorta small branch artery ostia
International Nuclear Information System (INIS)
Cronin, Paul; Williams, David M.; Vellody, Ranjith; Kelly, Aine Marie; Kazerooni, Ella A.; Carlos, Ruth C.
2011-01-01
The purpose of this study was to determine the normal distribution of aortic branch artery ostia. CT scans of 100 subjects were retrospectively reviewed. The angular distributions of the aorta with respect to the center of the T3 to L4 vertebral bodies, and of branch artery origins with respect to the center of the aorta were measured. At each vertebral body level the distribution of intercostal/lumbar arteries and other branch arteries were calculated. The proximal descending aorta is posteriorly placed becoming a midline structure, at the thoracolumbar junction, and remains anterior to the vertebral bodies within the abdomen. The intercostal and lumbar artery ostia have a distinct distribution. At each vertebral level from T3 caudally, one intercostal artery originates from the posterior wall of the aorta throughout the thoracic aorta, while the other intercostal artery originates from the medial wall of the descending thoracic aorta high in the chest, posteromedially from the mid-thoracic aorta, and from the posterior wall of the aorta low in the chest. Mediastinal branches of the thoracic aorta originate from the medial and anterior wall. Lumbar branches originate only from the posterior wall of the abdominal aorta. Aortic branch artery origins arise with a bimodal distribution and have a characteristic location. Mediastinal branches of the thoracic aorta originate from the medial and anterior wall. Knowing the location of aortic branch artery ostia may help distinguish branch artery pseudoaneurysms from penetrating ulcers.
Davis, Joe M
2011-10-28
General equations are derived for the distribution of minimum resolution between two chromatographic peaks, when peak heights in a multi-component chromatogram follow a continuous statistical distribution. The derivation draws on published theory by relating the area under the distribution of minimum resolution to the area under the distribution of the ratio of peak heights, which in turn is derived from the peak-height distribution. Two procedures are proposed for the equations' numerical solution. The procedures are applied to the log-normal distribution, which recently was reported to describe the distribution of component concentrations in three complex natural mixtures. For published statistical parameters of these mixtures, the distribution of minimum resolution is similar to that for the commonly assumed exponential distribution of peak heights used in statistical-overlap theory. However, these two distributions of minimum resolution can differ markedly, depending on the scale parameter of the log-normal distribution. Theory for the computation of the distribution of minimum resolution is extended to other cases of interest. With the log-normal distribution of peak heights as an example, the distribution of minimum resolution is computed when small peaks are lost due to noise or detection limits, and when the height of at least one peak is less than an upper limit. The distribution of minimum resolution shifts slightly to lower resolution values in the first case and to markedly larger resolution values in the second one. The theory and numerical procedure are confirmed by Monte Carlo simulation. Copyright © 2011 Elsevier B.V. All rights reserved.
Concentration distribution of trace elements: from normal distribution to Levy flights
International Nuclear Information System (INIS)
Kubala-Kukus, A.; Banas, D.; Braziewicz, J.; Majewska, U.; Pajek, M.
2003-01-01
The paper discusses the nature of concentration distributions of trace elements in biomedical samples, which were measured using X-ray fluorescence techniques (XRF, TXRF). Our earlier observation that the log-normal distribution describes the measured concentration distributions well is explained here on more general grounds. In particular, the role of the random multiplicative process, which models the concentration distributions of trace elements in biomedical samples, is discussed in detail. It is demonstrated that the log-normal distribution, which appears when the multiplicative process is driven by a normal distribution, can be generalized to the so-called log-stable distribution. Such a distribution describes a random multiplicative process driven not by a normal distribution but by the more general stable distributions, known as Levy flights. The presented ideas are exemplified by the results of the study of trace element concentration distributions in selected biomedical samples, obtained using the conventional (XRF) and total-reflection (TXRF) X-ray fluorescence methods. In particular, the first observation of a log-stable concentration distribution of trace elements is reported and discussed in detail.
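The random multiplicative process invoked above is easy to illustrate numerically: if each concentration is a product of many independent positive factors, the logarithm of the product is a sum to which the central limit theorem applies, so the product is approximately log-normal. A minimal sketch (the factor distribution and sizes are illustrative, not fitted to the biomedical data):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_skewness(x):
    """Plain moment-based sample skewness."""
    x = np.asarray(x, dtype=float)
    return float(((x - x.mean()) ** 3).mean() / x.std() ** 3)

# Each simulated "concentration" is a product of 50 independent
# positive random factors.
n_samples, n_factors = 5000, 50
factors = rng.uniform(0.5, 1.5, size=(n_samples, n_factors))
concentrations = factors.prod(axis=1)

# By the CLT applied to the log-factors, log(concentrations) should be
# nearly symmetric, i.e. the concentrations are close to log-normal.
log_skew = sample_skewness(np.log(concentrations))
```

Replacing the uniform factors with heavy-tailed (stable-law) factors is the step that leads to the log-stable generalization the abstract describes.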
Characteristic functions of scale mixtures of multivariate skew-normal distributions
Kim, Hyoung-Moon
2011-08-01
We obtain the characteristic function of scale mixtures of skew-normal distributions both in the univariate and multivariate cases. The derivation uses the simple stochastic relationship between skew-normal distributions and scale mixtures of skew-normal distributions. In particular, we describe the characteristic function of skew-normal, skew-t, and other related distributions. © 2011 Elsevier Inc.
Statistical properties of the normalized ice particle size distribution
Delanoë, Julien; Protat, Alain; Testud, Jacques; Bouniol, Dominique; Heymsfield, A. J.; Bansemer, A.; Brown, P. R. A.; Forbes, R. M.
2005-05-01
Testud et al. (2001) have recently developed a formalism, known as the "normalized particle size distribution (PSD)", which consists in scaling the diameter and concentration axes in such a way that the normalized PSDs are independent of water content and mean volume-weighted diameter. In this paper we investigate the statistical properties of the normalized PSD for the particular case of ice clouds, which are known to play a crucial role in the Earth's radiation balance. To do so, an extensive database of airborne in situ microphysical measurements has been constructed. A remarkable stability in shape of the normalized PSD is obtained. The impact of using a single analytical shape to represent all PSDs in the database is estimated through an error analysis on the instrumental (radar reflectivity and attenuation) and cloud (ice water content, effective radius, terminal fall velocity of ice crystals, visible extinction) properties. This resulted in a roughly unbiased estimate of the instrumental and cloud parameters, with small standard deviations ranging from 5 to 12%. This error is found to be roughly independent of the temperature range. This stability in shape and its single analytical approximation implies that two parameters are now sufficient to describe any normalized PSD in ice clouds: the intercept parameter N*0 and the mean volume-weighted diameter Dm. Statistical relationships (parameterizations) between N*0 and Dm have then been evaluated in order to reduce again the number of unknowns. It has been shown that a parameterization of N*0 and Dm by temperature could not be envisaged to retrieve the cloud parameters. Nevertheless, Dm-T and mean maximum dimension diameter -T parameterizations have been derived and compared to the parameterization of Kristjánsson et al. (2000) currently used to characterize particle size in climate models. The new parameterization generally produces larger particle sizes at any temperature than the Kristjánsson et al. (2000
Basic study on radiation distribution sensing with normal optical fiber
International Nuclear Information System (INIS)
Naka, R.; Kawarabayashi, J.; Uritani, A.; Iguchi, T.; Kaneko, J.; Takeuchi, H.; Kakuta, T.
2000-01-01
Recently, some methods of radiation distribution sensing with optical fibers have been proposed. These methods employ scintillating fibers or scintillators with wavelength-shifting fibers. The positions of radiation interactions are detected by applying a time-of-flight (TOF) technique to the scintillation photon propagation. In the former method, the attenuation length for scintillation photons in the scintillating fiber is relatively short, so the operating length of the sensor is limited to several meters. In the latter method, a radiation distribution cannot be obtained continuously, only discretely. To improve on these shortcomings, a normal optical fiber made of polymethyl methacrylate (PMMA) is used in this study. Although the scintillation efficiency of PMMA is very low, several photons are emitted per interaction with radiation. The fiber is transparent to the emitted photons, giving a relatively long operating length, and a radiation distribution can be obtained continuously. This paper describes the principle of the position sensing method based on the time-of-flight technique and preliminary results obtained for ⁹⁰Sr-⁹⁰Y beta rays, ¹³⁷Cs gamma rays, and 14 MeV neutrons. The spatial resolutions for these three kinds of radiation are 0.30 m, 0.37 m, and 0.13 m, and the detection efficiencies are 1.1 × 10⁻³, 1.6 × 10⁻⁷, and 5.4 × 10⁻⁶, respectively, with a 10 m operating length. The results of a spectroscopic study of the optical properties of the fiber are also described. (author)
Normal distribution of ¹¹¹In chloride on scintigram
Energy Technology Data Exchange (ETDEWEB)
Oyama, K; Machida, K; Hayashi, S; Watari, T; Akaike, A
1977-05-01
Indium-111 chloride (¹¹¹InCl₃) was used as a bone marrow imaging and tumor-localizing agent in 38 patients (46 scintigrams) who were suspected of having, or diagnosed as having, malignant disease, and who were irradiated for malignant disease. Regions of suspected malignant disease, of abnormal accumulation on scintigrams, and of the irradiated target were excluded in estimating the normal distribution of ¹¹¹InCl₃. Scintigrams were taken 48 h after intravenous injection of 1 to 3 mCi of ¹¹¹InCl₃. The percent and score distributions of ¹¹¹InCl₃ were noted in 23 regions. As the liver showed the highest accumulation of ¹¹¹In on all scintigrams, it was designated 2+. Compared with the radioactivity in the liver, other regions had similar (2+), moderately decreased (+), or severely decreased (-) accumulation on the scintigram. A score of 1 was assigned for 2+, 0.5 for +, and 0 for -. The score and percentage distributions were: liver 100 (100%), lumbar vertebra 58.5 (100%), mediastinum 55 (100%), nasopharynx 50 (100%), testis 47.5 (59%), heart 44.5 (89%), and pelvis 43.5 (78%). Comparing this study with a previous study of ¹¹¹In-BLM, the score distributions in the lumbar vertebra, pelvis, and skull were similar. ¹¹¹In-BLM is excreted rapidly after injection, but little ¹¹¹InCl₃ is excreted. Accumulation of ¹¹¹In in bone marrow depends upon the amount of ¹¹¹In-transferrin in blood. The high accumulation in the lumbar vertebra and pelvis shows that ¹¹¹InCl₃ would be effective as a bone marrow imaging agent.
Dobinski-type relations and the log-normal distribution
International Nuclear Information System (INIS)
Blasiak, P; Penson, K A; Solomon, A I
2003-01-01
We consider sequences of generalized Bell numbers B(n), n = 1, 2, ..., which can be represented by Dobinski-type summation formulae, i.e. B(n) = (1/C) Σ_{k=0}^{∞} [P(k)]ⁿ/D(k), with P(k) a polynomial, D(k) a function of k and C = const. They include the standard Bell numbers (P(k) = k, D(k) = k!, C = e), their generalizations B_{r,r}(n), r = 2, 3, ..., appearing in the normal ordering of powers of boson monomials (P(k) = (k+r)!/k!, D(k) = k!, C = e), variants of 'ordered' Bell numbers B_o^{(p)}(n) (P(k) = k, D(k) = ((p+1)/p)^k, C = 1 + p, p = 1, 2, ...), etc. We demonstrate that for α, β, γ, t positive integers (α, t ≠ 0), [B(αn² + βn + γ)]^t is the nth moment of a positive function on (0, ∞) which is a weighted infinite sum of log-normal distributions. (letter to the editor)
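The standard-Bell-number case of the Dobinski formula quoted above (P(k) = k, D(k) = k!, C = e) can be checked numerically by truncating the infinite sum; a minimal sketch:

```python
import math

def bell_dobinski(n, terms=100):
    # Dobinski-type summation for the standard Bell numbers:
    #   B(n) = (1/e) * sum_{k=0}^{inf} k**n / k!
    # The series converges very fast, so a modest truncation suffices.
    total = sum(k ** n / math.factorial(k) for k in range(terms))
    return round(total / math.e)

# The first Bell numbers are 1, 1, 2, 5, 15, 52, 203.
```

The generalized cases differ only in the choice of P(k), D(k), and C in the summand.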
Characteristic functions of scale mixtures of multivariate skew-normal distributions
Kim, Hyoung-Moon; Genton, Marc G.
2011-01-01
We obtain the characteristic function of scale mixtures of skew-normal distributions both in the univariate and multivariate cases. The derivation uses the simple stochastic relationship between skew-normal distributions and scale mixtures of skew-normal distributions.
Visualizing Tensor Normal Distributions at Multiple Levels of Detail.
Abbasloo, Amin; Wiens, Vitalis; Hermann, Max; Schultz, Thomas
2016-01-01
Despite the widely recognized importance of symmetric second order tensor fields in medicine and engineering, the visualization of data uncertainty in tensor fields is still in its infancy. A recently proposed tensorial normal distribution, involving a fourth order covariance tensor, provides a mathematical description of how different aspects of the tensor field, such as trace, anisotropy, or orientation, vary and covary at each point. However, this wealth of information is far too rich for a human analyst to take in at a single glance, and no suitable visualization tools are available. We propose a novel approach that facilitates visual analysis of tensor covariance at multiple levels of detail. We start with a visual abstraction that uses slice views and direct volume rendering to indicate large-scale changes in the covariance structure, and locations with high overall variance. We then provide tools for interactive exploration, making it possible to drill down into different types of variability, such as in shape or orientation. Finally, we allow the analyst to focus on specific locations of the field, and provide tensor glyph animations and overlays that intuitively depict confidence intervals at those points. Our system is demonstrated by investigating the effects of measurement noise on diffusion tensor MRI, and by analyzing two ensembles of stress tensor fields from solid mechanics.
Retention and subcellular distribution of 67Ga in normal organs
International Nuclear Information System (INIS)
Ando, A.; Ando, I.; Hiraki, T.
1986-01-01
Using normal rats, the retention values and subcellular distribution of ⁶⁷Ga in each organ were investigated. At 10 min after administration of ⁶⁷Ga-citrate, the retention value of ⁶⁷Ga in blood was 6.77% dose/g, and this value decreased with time. The values for skeletal muscle, lung, pancreas, adrenal, heart muscle, brain, small intestine, large intestine and spinal cord were highest at 10 min after administration and decreased with time. Conversely, the value in bone increased until 10 days after injection. In the liver, kidney, and stomach, the values increased with time after administration and were highest 24 h or 48 h after injection; after that, they decreased with time. The value in spleen reached a plateau 48 h after administration and hardly varied for 10 days. From the results of subcellular fractionation, it was deduced that the lysosome plays quite an important role in the concentration of ⁶⁷Ga in the small intestine, stomach, lung, kidney and pancreas; a lesser role in its concentration in heart muscle; and hardly any role in ⁶⁷Ga accumulation in skeletal muscle. In spleen, the contents of the nuclear, mitochondrial, microsomal, and supernatant fractions all contributed to the accumulation of ⁶⁷Ga. (orig.)
A general approach to double-moment normalization of drop size distributions
Lee, G.W.; Zawadzki, I.; Szyrmer, W.; Sempere Torres, D.; Uijlenhoet, R.
2004-01-01
Normalization of drop size distributions (DSDs) is reexamined here. First, an extension of the scaling normalization that uses one moment of the DSD as a scaling parameter to a more general scaling normalization that uses two moments as scaling parameters of the normalization is presented. In
Stellar Distributions and NIR Colours of Normal Galaxies
Peletier, R. F.; Grijs, R. de
1997-01-01
Abstract: We discuss some results of a morphological study of edge-on galaxies, based on optical and especially near-infrared surface photometry. We find that the vertical surface brightness distributions of galaxies are fitted very well by exponential profiles, much better than by isothermal
Log-normal distribution from a process that is not multiplicative but is additive.
Mouri, Hideaki
2013-10-01
The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
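The phenomenon described, a sum of positive random variables that stays close to log-normal, can be probed with a quick simulation (the log-normal summands and the sizes below are illustrative choices, not the paper's analytical example):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_skewness(x):
    """Plain moment-based sample skewness."""
    x = np.asarray(x, dtype=float)
    return float(((x - x.mean()) ** 3).mean() / x.std() ** 3)

# 20000 independent sums, each of 100 positive summands.
n_sums, n_terms = 20000, 100
summands = rng.lognormal(mean=0.0, sigma=1.0, size=(n_sums, n_terms))
sums = summands.sum(axis=1)

# The sums are still clearly right-skewed at this n, while their
# logarithms are much closer to symmetric: a log-normal describes the
# sum better than any Gaussian, even though the process is additive.
skew_sum = sample_skewness(sums)
skew_log = sample_skewness(np.log(sums))
```

Letting `n_terms` grow shows the slow eventual drift toward a Gaussian that the abstract mentions.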
Penalized Maximum Likelihood Estimation for univariate normal mixture distributions
International Nuclear Information System (INIS)
Ridolfi, A.; Idier, J.
2001-01-01
Due to singularities of the likelihood function, the maximum likelihood approach to estimating the parameters of normal mixture models is an acknowledged ill-posed optimization problem. Ill-posedness is resolved by penalizing the likelihood function. In the Bayesian framework, this amounts to incorporating an inverted gamma prior into the likelihood function. A penalized version of the EM algorithm is derived, which is still explicit and which intrinsically ensures that the estimates are not singular. Numerical evidence of the latter property is put forward with a test.
The retest distribution of the visual field summary index mean deviation is close to normal.
Anderson, Andrew J; Cheng, Allan C Y; Lau, Samantha; Le-Pham, Anne; Liu, Victor; Rahman, Farahnaz
2016-09-01
When modelling optimum strategies for how best to determine visual field progression in glaucoma, it is commonly assumed that the summary index mean deviation (MD) is normally distributed on repeated testing. Here we tested whether this assumption is correct. We obtained 42 reliable 24-2 Humphrey Field Analyzer SITA standard visual fields from one eye of each of five healthy young observers, with the first two fields excluded from analysis. Previous work has shown that although MD variability is higher in glaucoma, the shape of the MD distribution is similar to that found in normal visual fields. A Shapiro-Wilk test determined any deviation from normality. Kurtosis values for the distributions were also calculated. Data from each observer passed the Shapiro-Wilk normality test. Bootstrapped 95% confidence intervals for kurtosis encompassed the value for a normal distribution in four of five observers. When examined with quantile-quantile plots, distributions were close to normal and showed no consistent deviations across observers. The retest distribution of MD is not significantly different from normal in healthy observers, and so is likely also normally distributed - or nearly so - in those with glaucoma. Our results increase our confidence in the results of influential modelling studies where a normal distribution for MD was assumed. © 2016 The Authors Ophthalmic & Physiological Optics © 2016 The College of Optometrists.
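The normality check used in this study is straightforward to reproduce; a sketch with simulated retest data (the MD mean and SD below are assumed values for a healthy eye, not the study's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical retest series of the mean deviation (MD) summary index:
# 40 repeated reliable fields from one healthy observer's eye.
md_retest = rng.normal(loc=-0.5, scale=0.8, size=40)

# Shapiro-Wilk tests the null hypothesis that the sample was drawn
# from a normal distribution; a large p-value means no evidence of a
# departure from normality.
stat, p = stats.shapiro(md_retest)
```

A quantile-quantile plot of `md_retest` against `stats.norm.ppf` gives the graphical check the abstract describes.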
Elk Distributions Relative to Spring Normalized Difference Vegetation Index Values
International Nuclear Information System (INIS)
Smallidge, S.T.; Baker, T.T.; VanLeeuwen, D.; Gould, W.R.; Thompson, B.C.
2010-01-01
Rocky Mountain elk (Cervus elaphus) that winter near San Antonio Mountain in northern New Mexico provide important recreational and economic benefits while creating management challenges related to temporospatial variation in their spring movements. Our objective was to examine spring distributions of elk in relation to vegetative emergence as it progresses across the landscape as measured by remote sensing. Spring distributions of elk were closely associated with greater photosynthetic activity of spring vegetation in 2 of 3 years as determined using NDVI values derived from AVHRR datasets. Observed elk locations were up to 271% greater than expected in the category representing the most photosynthetic activity. This association was not observed when analyses at a finer geographic scale were conducted. Managers facing challenges involving human-wildlife interactions and land-use issues should consider environmental conditions that may influence variation in elk association with greener portions of the landscape.
A novel generalized normal distribution for human longevity and other negatively skewed data.
Robertson, Henry T; Allison, David B
2012-01-01
Negatively skewed data arise occasionally in statistical practice; perhaps the most familiar example is the distribution of human longevity. Although other generalizations of the normal distribution exist, we demonstrate a new alternative that apparently fits human longevity data better. We propose an alternative approach of a normal distribution whose scale parameter is conditioned on attained age. This approach is consistent with previous findings that longevity conditioned on survival to the modal age behaves like a normal distribution. We derive such a distribution and demonstrate its accuracy in modeling human longevity data from life tables. The new distribution is characterized by (1) an intuitively straightforward genesis; (2) closed forms for the pdf, cdf, mode, quantile, and hazard functions; and (3) accessibility to non-statisticians, based on its close relationship to the normal distribution.
Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer
2016-01-01
Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different to the one observed in the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques work well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised" and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or Loess smoothing techniques increased that proportion to 82.4 and 82.7 % respectively. This suggests that statistical approaches relying on the
Back to Normal! Gaussianizing posterior distributions for cosmological probes
Schuhmann, Robert L.; Joachimi, Benjamin; Peiris, Hiranya V.
2014-05-01
We present a method to map multivariate non-Gaussian posterior probability densities into Gaussian ones via nonlinear Box-Cox transformations, and generalizations thereof. This is analogous to the search for normal parameters in the CMB, but can in principle be applied to any probability density that is continuous and unimodal. The search for the optimally Gaussianizing transformation amongst the Box-Cox family is performed via a maximum likelihood formalism. We can judge the quality of the found transformation a posteriori: qualitatively via statistical tests of Gaussianity, and more illustratively by how well it reproduces the credible regions. The method permits an analytical reconstruction of the posterior from a sample, e.g. a Markov chain, and simplifies the subsequent joint analysis with other experiments. Furthermore, it permits the characterization of a non-Gaussian posterior in a compact and efficient way. The expression for the non-Gaussian posterior can be employed to find analytic formulae for the Bayesian evidence, and consequently be used for model comparison.
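The maximum-likelihood search over the Box-Cox family that the abstract describes is available off the shelf; a sketch on a skewed stand-in for a one-parameter posterior sample (the chi-square draw is illustrative, not a cosmological posterior):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# A positively skewed, unimodal sample standing in for a Markov-chain
# sample of a single parameter's posterior.
sample = rng.chisquare(df=4, size=10000)

# boxcox() picks the transformation exponent lmbda by maximum
# likelihood; the transformed sample should be much closer to Gaussian
# than the original one.
transformed, lmbda = stats.boxcox(sample)
```

The quality of the found transformation can then be judged a posteriori, e.g. with a normality test or by comparing credible regions before and after.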
International Nuclear Information System (INIS)
Vardavas, I.M.
1992-01-01
A simple procedure is presented for the statistical analysis of measurement data where the primary concern is the determination of the value corresponding to a specified average exceedance probability. The analysis employs the normal and log-normal frequency distributions together with a χ²-test and an error analysis. The error analysis introduces the concept of a counting error criterion, or ζ-test, to test whether the data are sufficient to make the χ²-test reliable. The procedure is applied to the analysis of annual rainfall data recorded at stations in the tropical Top End of Australia, where the Ranger uranium deposit is situated. 9 refs., 12 tabs., 9 figs
Estimation of value at risk and conditional value at risk using normal mixture distributions model
Kamaruzzaman, Zetty Ain; Isa, Zaidi
2013-04-01
Normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of returns for FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using the two-component univariate normal mixture distributions model. First, we present the application of normal mixture distributions model in empirical finance where we fit our real data. Second, we present the application of normal mixture distributions model in risk analysis where we apply the normal mixture distributions model to evaluate the value at risk (VaR) and conditional value at risk (CVaR) with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model can fit the data well and can perform better in estimating value at risk (VaR) and conditional value at risk (CVaR), where it can capture the stylized facts of non-normality and leptokurtosis in returns distribution.
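A minimal sketch of both risk measures under a two-component normal mixture (the weights, means, and standard deviations below are illustrative, not the fitted FBMKLCI parameters):

```python
import numpy as np
from scipy import optimize, stats

# Two-component univariate normal mixture for returns.
w = np.array([0.8, 0.2])      # component weights
mu = np.array([0.01, -0.02])  # component means
sd = np.array([0.03, 0.08])   # component standard deviations

def mixture_cdf(x):
    """CDF of the mixture: weighted sum of component normal CDFs."""
    return float(np.sum(w * stats.norm.cdf(x, loc=mu, scale=sd)))

def value_at_risk(alpha):
    # VaR at level alpha: minus the alpha-quantile of the return
    # distribution, found by root-finding on the mixture CDF.
    q = optimize.brentq(lambda x: mixture_cdf(x) - alpha, -1.0, 1.0)
    return -q

def conditional_var(alpha, n=200_000, seed=4):
    # CVaR: the expected loss given that the loss exceeds VaR,
    # estimated here by simple Monte Carlo from the mixture.
    rng = np.random.default_rng(seed)
    comp = rng.choice(2, size=n, p=w)
    losses = -rng.normal(mu[comp], sd[comp])
    v = value_at_risk(alpha)
    return float(losses[losses >= v].mean())
```

By construction CVaR is at least as large as VaR at the same level, which is a useful sanity check on any implementation.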
Normal and Student's t distributions and their applications
Ahsanullah, Mohammad; Shakil, Mohammad
2014-01-01
The most important properties of normal and Student t-distributions are presented. A number of applications of these properties are demonstrated. New related results dealing with the distributions of the sum, product and ratio of the independent normal and Student distributions are presented. The materials will be useful to the advanced undergraduate and graduate students and practitioners in the various fields of science and engineering.
Kullback–Leibler Divergence of the γ–ordered Normal over t–distribution
Toulias, T-L.; Kitsos, C-P.
2012-01-01
The aim of this paper is to evaluate and study the Kullback–Leibler divergence of the γ–ordered Normal distribution, a generalization of Normal distribution emerged from the generalized Fisher’s information measure, over the scaled t–distribution. We investigate this evaluation through a series of bounds and approximations while the asymptotic behavior of the divergence is also studied. Moreover, we obtain a generalization of the known Kullback–Leibler information measure betwe...
Divinskiy, M. L.; Kolchinskiy, I. G.
1974-01-01
The distribution of deviations from mean star trail directions was studied on the basis of 105 star trails. It was found that about 93% of the trails yield a distribution in agreement with the normal law. About 4% of the star trails agree with the Charlier distribution.
A general approach to double-moment normalization of drop size distributions
Lee, G. W.; Sempere-Torres, D.; Uijlenhoet, R.; Zawadzki, I.
2003-04-01
Normalization of drop size distributions (DSDs) is re-examined here. First, we present an extension of scaling normalization using one moment of the DSD as a parameter (as introduced by Sempere-Torres et al., 1994) to a scaling normalization using two moments as parameters of the normalization. It is shown that the normalization of Testud et al. (2001) is a particular case of the two-moment scaling normalization. Thus, a unified vision of the question of DSD normalization and a good model representation of DSDs is given. Data analysis shows that from the point of view of moment estimation, least square regression is slightly more effective than moment estimation from the normalized average DSD.
The approximation of the normal distribution by means of chaotic expression
International Nuclear Information System (INIS)
Lawnik, M
2014-01-01
The approximation of the normal distribution by a chaotic expression is achieved by means of the Weierstrass function: for a certain set of parameters, the density of the derived recurrence renders a good approximation of the bell curve
Confidence bounds and hypothesis tests for normal distribution coefficients of variation
Steve Verrill; Richard A. Johnson
2007-01-01
For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations.
An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis
Diwakar, Rekha
2017-01-01
Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data which does not contain negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…
Landfors, Mattias; Philip, Philge; Rydén, Patrik; Stenberg, Per
2011-01-01
Genome-wide analysis of gene expression or protein binding patterns using different array or sequencing based technologies is now routinely performed to compare different populations, such as treatment and reference groups. It is often necessary to normalize the data obtained to remove technical variation introduced in the course of conducting experimental work, but standard normalization techniques are not capable of eliminating technical bias in cases where the distribution of the truly altered variables is skewed, i.e. when a large fraction of the variables are either positively or negatively affected by the treatment. However, several experiments are likely to generate such skewed distributions, including ChIP-chip experiments for the study of chromatin, gene expression experiments for the study of apoptosis, and SNP-studies of copy number variation in normal and tumour tissues. A preliminary study using spike-in array data established that the capacity of an experiment to identify altered variables and generate unbiased estimates of the fold change decreases as the fraction of altered variables and the skewness increases. We propose the following work-flow for analyzing high-dimensional experiments with regions of altered variables: (1) Pre-process raw data using one of the standard normalization techniques. (2) Investigate if the distribution of the altered variables is skewed. (3) If the distribution is not believed to be skewed, no additional normalization is needed. Otherwise, re-normalize the data using a novel HMM-assisted normalization procedure. (4) Perform downstream analysis. Here, ChIP-chip data and simulated data were used to evaluate the performance of the work-flow. It was found that skewed distributions can be detected by using the novel DSE-test (Detection of Skewed Experiments). Furthermore, applying the HMM-assisted normalization to experiments where the distribution of the truly altered variables is skewed results in considerably higher
Computer program determines exact two-sided tolerance limits for normal distributions
Friedman, H. A.; Webb, S. R.
1968-01-01
Computer program determines by numerical integration the exact statistical two-sided tolerance limits, such that the proportion of the population between the limits is at least a specified value. The program is limited to situations in which the underlying probability distribution for the population sampled is the normal distribution with unknown mean and variance.
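The exact limits require the numerical integration the program performs, but Howe's widely used closed-form approximation comes very close and shows the structure of the computation; a sketch (the approximation is a stand-in, not the program's exact method):

```python
import math
from scipy import stats

def howe_tolerance_factor(n, coverage=0.90, confidence=0.95):
    # Howe's approximation to the two-sided normal tolerance factor k:
    # the interval xbar +/- k*s contains at least `coverage` of the
    # population with probability `confidence`, for a sample of size n
    # from a normal population with unknown mean and variance.
    z = stats.norm.ppf((1.0 + coverage) / 2.0)
    chi2 = stats.chi2.ppf(1.0 - confidence, df=n - 1)
    return z * math.sqrt((n - 1) * (1.0 + 1.0 / n) / chi2)

# For n = 20, 90% coverage, 95% confidence, tables give k ≈ 2.31.
```

The factor shrinks toward the plain normal quantile as n grows, since the uncertainty in the sample mean and variance vanishes.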
Adaptive Bayesian inference on the mean of an infinite-dimensional normal distribution
Belitser, E.; Ghosal, S.
2003-01-01
We consider the problem of estimating the mean of an infinite-dimensional normal distribution from the Bayesian perspective. Under the assumption that the unknown true mean satisfies a "smoothness condition," we first derive the convergence rate of the posterior distribution for a prior that
A study of the up-and-down method for non-normal distribution functions
DEFF Research Database (Denmark)
Vibholm, Svend; Thyregod, Poul
1988-01-01
The assessment of breakdown probabilities is examined by the up-and-down method. The exact maximum-likelihood estimates for a number of response patterns are calculated for three different distribution functions and are compared with the estimates corresponding to the normal distribution. Estimates...
Yamada, S; Ishikawa, M; Yamamoto, K
2016-07-01
CSF volumes in the basal cistern and Sylvian fissure are increased in both idiopathic normal pressure hydrocephalus and Alzheimer disease, though the differences in these volumes in idiopathic normal pressure hydrocephalus and Alzheimer disease have not been well-described. Using CSF segmentation and volume quantification, we compared the distribution of CSF in idiopathic normal pressure hydrocephalus and Alzheimer disease. CSF volumes were extracted from T2-weighted 3D spin-echo sequences on 3T MR imaging and quantified semi-automatically. We compared the volumes and ratios of the ventricles and subarachnoid spaces after classification in 30 patients diagnosed with idiopathic normal pressure hydrocephalus, 10 with concurrent idiopathic normal pressure hydrocephalus and Alzheimer disease, 18 with Alzheimer disease, and 26 control subjects 60 years of age or older. Brain to ventricle ratios at the anterior and posterior commissure levels and 3D volumetric convexity cistern to ventricle ratios were useful indices for the differential diagnosis of idiopathic normal pressure hydrocephalus or idiopathic normal pressure hydrocephalus with Alzheimer disease from Alzheimer disease, similar to the z-Evans index and callosal angle. The most distinctive characteristics of the CSF distribution in idiopathic normal pressure hydrocephalus were small convexity subarachnoid spaces and the large volume of the basal cistern and Sylvian fissure. The distribution of the subarachnoid spaces in the idiopathic normal pressure hydrocephalus with Alzheimer disease group was the most deformed among these 3 groups, though the mean ventricular volume of the idiopathic normal pressure hydrocephalus with Alzheimer disease group was intermediate between that of the idiopathic normal pressure hydrocephalus and Alzheimer disease groups. The z-axial expansion of the lateral ventricle and compression of the brain just above the ventricle were the common findings in the parameters for differentiating
International Nuclear Information System (INIS)
Aringazin, A.K.; Mazhitov, M.I.
2003-01-01
We describe a formal procedure to obtain and specify the general form of a marginal distribution for the Lagrangian acceleration of a fluid particle in developed turbulent flow, using a Langevin-type equation and the assumption that the velocity fluctuation u follows a normal distribution with zero mean, in accord with the Heisenberg-Yaglom picture. For a particular representation, β=exp[u], of the fluctuating parameter β, we reproduce the underlying log-normal distribution and the associated marginal distribution, which was found to be in very good agreement with the new experimental data by Crawford, Mordant, and Bodenschatz on the acceleration statistics. We discuss possibilities for refining the log-normal model
Steerneman, A. G. M.; van Perlo-ten Kleij, Frederieke
2008-01-01
If X ~ N_{n×k}(M, I_n ⊗ Σ), then S = X'X has the noncentral Wishart distribution W'_k(n, Σ; Λ), where Λ = M'M. Here Σ is allowed to be singular. It is well known that if Λ = 0, then S has a (central) Wishart distribution and S is positive definite with
Directory of Open Access Journals (Sweden)
Yerriswamy Wooluru
2016-06-01
Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled, and indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various surrogate process capability indices have been proposed for non-normality, but few sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness ≤ 1.5).
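As a reminder of what the conventional indices compute, and why the normality assumption matters, here is a minimal sketch using the standard Cp/Cpk formulas; the specification limits and measurements are hypothetical, not the silicon-wafer data of the paper:

```python
from statistics import mean, stdev

def cp_cpk(samples, lsl, usl):
    """Conventional capability indices; meaningful only if the process
    is in control and the data are approximately normal."""
    mu, sigma = mean(samples), stdev(samples)
    cp = (usl - lsl) / (6 * sigma)                  # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # accounts for centering
    return cp, cpk

# hypothetical resistivity-like measurements with spec limits 9.0-11.0
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.05, 9.95, 10.1, 9.85, 10.0]
cp, cpk = cp_cpk(data, lsl=9.0, usl=11.0)
```

When the data are skewed, the 6σ spread no longer brackets 99.73% of output, which is exactly why the surrogate indices reviewed in the paper are needed.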
Probabilistic analysis in normal operation of distribution system with distributed generation
DEFF Research Database (Denmark)
Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.
2011-01-01
Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally operated' power system, where the distribution networks contain both stochastic generation...... and load. This fact increases the number of stochastic inputs, and dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...
Scarpelli, Matthew; Eickhoff, Jens; Cuna, Enrique; Perlman, Scott; Jeraj, Robert
2018-02-01
The statistical analysis of positron emission tomography (PET) standardized uptake value (SUV) measurements is challenging due to the skewed nature of SUV distributions. This limits utilization of powerful parametric statistical models for analyzing SUV measurements. An ad hoc approach, which is frequently used in practice, is to blindly use a log transformation, which may or may not result in normal SUV distributions. This study sought to identify optimal transformations leading to normally distributed PET SUVs extracted from tumors and to assess the effects of therapy on the optimal transformations. Methods. The optimal transformation for producing normal distributions of tumor SUVs was identified by iterating the Box-Cox transformation parameter (λ) and selecting the parameter that maximized the Shapiro-Wilk P-value. Optimal transformations were identified for tumor SUVmax distributions both pre and post treatment. This study included 57 patients who underwent 18F-fluorodeoxyglucose (18F-FDG) PET scans (publicly available dataset). In addition, to test the generality of our transformation methodology, we included analysis of 27 patients who underwent 18F-fluorothymidine (18F-FLT) PET scans at our institution. Results. After applying the optimal Box-Cox transformations, neither the pre nor the post treatment 18F-FDG SUV distributions deviated significantly from normality (P > 0.10). Similar results were found for 18F-FLT PET SUV distributions (P > 0.10). For both 18F-FDG and 18F-FLT SUV distributions, the skewness and kurtosis increased from pre to post treatment, leading to a decrease in the optimal Box-Cox transformation parameter from pre to post treatment. There were types of distributions encountered for both 18F-FDG and 18F-FLT where a log transformation was not optimal for providing normal SUV distributions. Conclusion. Optimization of the Box-Cox transformation offers a solution for identifying normal SUV transformations for when
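The λ-scan can be sketched as follows. Since no Shapiro-Wilk test is available in the standard library, sample skewness is used here as a stand-in normality criterion (an assumption; the paper maximizes the Shapiro-Wilk P-value), and the "SUVs" are simulated log-normal values rather than real measurements:

```python
import math, random

def boxcox(x, lam):
    """Box-Cox power transform; lam = 0 is the log transform."""
    return [math.log(v) if lam == 0 else (v ** lam - 1) / lam for v in x]

def skewness(x):
    n = len(x)
    m = sum(x) / n
    s = math.sqrt(sum((v - m) ** 2 for v in x) / n)
    return sum((v - m) ** 3 for v in x) / (n * s ** 3)

def best_lambda(x, grid=None):
    """Pick the Box-Cox parameter that makes the transformed sample most
    symmetric (|skewness| nearest 0) -- a simple proxy for the
    Shapiro-Wilk criterion used in the paper."""
    grid = grid if grid is not None else [i / 10 for i in range(-20, 21)]
    return min(grid, key=lambda lam: abs(skewness(boxcox(x, lam))))

random.seed(1)
suvs = [random.lognormvariate(1.0, 0.5) for _ in range(500)]  # skewed sample
lam = best_lambda(suvs)  # should land near 0: log is optimal for log-normal data
```

For genuinely log-normal data the scan recovers λ ≈ 0; for the post-treatment distributions described above, the optimum drifts away from 0, which is the paper's point that a blind log transform is not always right.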
LOG-NORMAL DISTRIBUTION OF COSMIC VOIDS IN SIMULATIONS AND MOCKS
Energy Technology Data Exchange (ETDEWEB)
Russell, E.; Pycke, J.-R., E-mail: er111@nyu.edu, E-mail: jrp15@nyu.edu [Division of Science and Mathematics, New York University Abu Dhabi, P.O. Box 129188, Abu Dhabi (United Arab Emirates)
2017-01-20
Following up on previous studies, we complete here a full analysis of the void size distributions of the Cosmic Void Catalog based on three different simulation and mock catalogs: dark matter (DM), haloes, and galaxies. Based on this analysis, we attempt to answer two questions: Is a three-parameter log-normal distribution a good candidate to satisfy the void size distributions obtained from different types of environments? Is there a direct relation between the shape parameters of the void size distribution and the environmental effects? In an attempt to answer these questions, we find here that all void size distributions of these data samples satisfy the three-parameter log-normal distribution whether the environment is dominated by DM, haloes, or galaxies. In addition, the shape parameters of the three-parameter log-normal void size distribution seem highly affected by environment, particularly existing substructures. Therefore, we show two quantitative relations given by linear equations between the skewness and the maximum tree depth, and between the variance of the void size distribution and the maximum tree depth, directly from the simulated data. In addition to this, we find that the percentage of voids with nonzero central density in the data sets has a critical importance. If the number of voids with nonzero central density reaches ≥3.84% in a simulation/mock sample, then a second population is observed in the void size distributions. This second population emerges as a second peak in the log-normal void size distribution at larger radius.
Generating log-normally distributed random numbers by using the Ziggurat algorithm
International Nuclear Information System (INIS)
Choi, Jong Soo
2016-01-01
Uncertainty analyses are usually based on the Monte Carlo method, and using an efficient random number generator (RNG) is a key element in the success of Monte Carlo simulations. Log-normally distributed variates are very typical in NPP PSAs. This paper proposes an approach to generating log-normally distributed variates based on the Ziggurat algorithm and evaluates the efficiency of the proposed Ziggurat RNG. The proposed RNG can be helpful for improving the uncertainty analysis of NPP PSAs. This paper focuses on evaluating the efficiency of the Ziggurat algorithm from an NPP PSA point of view. From this study, we can draw the following conclusions. - The Ziggurat algorithm is an excellent random number generator for producing normally distributed variates. - The Ziggurat algorithm is computationally much faster than the most commonly used method, the Marsaglia polar method
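The reduction underlying the approach is that a log-normal variate is the exponential of a normal one, so any fast normal sampler, such as a ziggurat implementation, can be reused unchanged. In this sketch `random.gauss` stands in for the ziggurat core (the ziggurat tables themselves are omitted):

```python
import math, random

def lognormal_from_normal(mu, sigma, n, normal_rng=random.gauss):
    """exp(N(mu, sigma)) is log-normal; `normal_rng` is any normal
    sampler -- a ziggurat generator would be plugged in here."""
    return [math.exp(normal_rng(mu, sigma)) for _ in range(n)]

random.seed(42)
xs = lognormal_from_normal(0.0, 0.25, 20000)
sample_median = sorted(xs)[len(xs) // 2]
# the median of a log-normal(mu, sigma) is exp(mu) = 1.0 here
```

The efficiency question studied in the paper then reduces entirely to the speed of the underlying normal generator, which is where the ziggurat beats the polar method.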
Pham-Van-diep, Gerald C.; Erwin, Daniel A.
1989-01-01
Velocity distribution functions in normal shock waves in argon and helium are calculated using Monte Carlo direct simulation. These are compared with experimental results for argon at M = 7.18 and for helium at M = 1.59 and 20. For both argon and helium, the variable-hard-sphere (VHS) model is used for the elastic scattering cross section, with the velocity dependence derived from a viscosity-temperature power-law relationship in the way normally used by Bird (1976).
International Nuclear Information System (INIS)
Khromov, V.V.
1978-01-01
The notion of neutron importance as applied to nuclear reactor statics problems, described by time-independent homogeneous equations of neutron transport with provision for normalization of the neutron distribution, is considered. An equation has been obtained for the neutron importance function in a conditionally critical reactor with respect to an arbitrary nonlinear functional determined for the normalized neutron distribution. The relation between this function and the generalized Green function of the self-conjugate operator of the reactor equation is determined, and the small-perturbation formula for the functionals of a conditionally critical reactor is deduced
Austenite Grain Size Estimation from Chord Lengths of Logarithmic-Normal Distribution
Directory of Open Access Journals (Sweden)
Adrian H.
2017-12-01
A linear section of grains in a polyhedral material microstructure is a system of chords. The mean length of the chords is the linear grain size of the microstructure. For the prior austenite grains of low-alloy structural steels, the chord length is a random variable of gamma or logarithmic-normal distribution. Statistical grain size estimation belongs to the quantitative metallography problems. The so-called point estimation is a well-known procedure. The interval estimation (grain size confidence interval) for the gamma distribution was given elsewhere; for the logarithmic-normal distribution it is the subject of the present contribution. The statistical analysis is analogous to the one for the gamma distribution.
Zheng, Wei-Qiang; Ma, Rong; Zheng, Jian-Ming; Gong, Zhi-Jing
2006-04-01
To describe the histologic distribution of elastin in the nonpregnant human uterus, uterine leiomyomas, adenomyosis and adenomyomas. Uteri were obtained from women undergoing hysterectomy for benign conditions, including 26 cases of uterine leiomyomas, 24 cases of adenomyosis, 18 adenomyomas and 6 cases of autopsy specimens. Specific histochemical staining techniques were employed in order to demonstrate the distribution of elastin. The distribution of elastin components in the uterus was markedly uneven and showed a decreasing gradient from outer to inner myometrium. No elastin was present within leiomyomas, adenomyomas or adenomyosis. The distribution of elastin may help explain the normal function of the myometrium in labor. It implies that the uneven distribution of elastin components and absence of elastin within leiomyomas, adenomyomas and adenomyosis could be of some clinical significance. The altered elastin distribution in disease states may help explain such symptoms as dysmenorrhea in uterine endometriosis.
International Nuclear Information System (INIS)
Matsuda, Yoshinobu; Ueda, Yasutoshi; Uchino, Kiichiro; Muraoka, Katsunori; Maeda, Mitsuo; Akazaki, Masanori; Yamamura, Yasunori.
1986-01-01
The angular distributions of sputtered Fe-atoms were measured using the laser fluorescence technique during Ar-ion bombardment for energies of 0.6, 1, 2 and 3 keV at normal incidence. The measured cosine distribution at 0.6 keV progressively deviated to an over-cosine distribution at higher energies, and at 3 keV the angular distribution was an overcosine distribution of about 20 %. The experimental results agree qualitatively with calculations by a recent computer simulation code, ACAT. The results are explained by the competition between surface scattering and the effects of primary knock-on atoms, which tend to make the angular distributions over-cosine and under-cosine, respectively. (author)
International Nuclear Information System (INIS)
Miller, G.; Martz, H.; Bertelli, L.; Melo, D.
2008-01-01
A simplified biokinetic model for 137Cs has six parameters representing transfer of material to and from various compartments. Using a Bayesian analysis, the joint probability distribution of these six parameters is determined empirically for two cases with quite a lot of bioassay data. The distribution is found to be a multivariate log-normal. Correlations between different parameters are obtained. The method utilises a fairly large number of pre-determined forward biokinetic calculations, whose results are stored in interpolation tables. Four different methods to sample the multidimensional parameter space with a limited number of samples are investigated: random, stratified, Latin hypercube sampling with a uniform distribution of parameters, and importance sampling using a log-normal distribution that approximates the posterior distribution. The importance sampling method gives much smaller sampling uncertainty. No sampling-method-dependent differences are perceptible for the uniform distribution methods. (authors)
The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.
Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica
2014-05-01
The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increases, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
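The skewness that drives these coverage results is easy to reproduce with a short simulation. The critical ratios (mean/SD set to 1 for both coefficients) are illustrative values, not taken from the article:

```python
import math, random

def moments(xs):
    """Return sample mean, SD and skewness."""
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    skew = sum((x - m) ** 3 for x in xs) / (n * s2 ** 1.5)
    return m, math.sqrt(s2), skew

random.seed(7)
# indirect effect a*b: both coefficients drawn with critical ratio 1
prods = [random.gauss(1, 1) * random.gauss(1, 1) for _ in range(100000)]
m, s, skew = moments(prods)
# the product is clearly right-skewed even though a and b are normal,
# so a symmetric normal-theory interval centered on m is imbalanced
```

A percentile interval on `prods` would be asymmetric about the mean, which is the practical advantage of distribution-of-the-product and resampling intervals noted in the abstract.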
Patel, Krupa J; Trédan, Olivier; Tannock, Ian F
2013-07-01
Pharmacokinetic analyses estimate the mean concentration of drug within a given tissue as a function of time, but do not give information about the spatial distribution of drugs within that tissue. Here, we compare the time-dependent spatial distribution of three anticancer drugs within tumors, heart, kidney, liver and brain. Mice bearing various xenografts were treated with doxorubicin, mitoxantrone or topotecan. At various times after injection, tumors and samples of heart, kidney, liver and brain were excised. Within solid tumors, the distribution of doxorubicin, mitoxantrone and topotecan was limited to perivascular regions at 10 min after administration and the distance from blood vessels at which drug intensity fell to half was ~25-75 μm. Although drug distribution improved after 3 and 24 h, there remained a significant decrease in drug fluorescence with increasing distance from tumor blood vessels. Drug distribution was relatively uniform in the heart, kidney and liver with substantially greater perivascular drug uptake than in tumors. There was significantly higher total drug fluorescence in the liver than in tumors after 10 min, 3 and 24 h. Little to no drug fluorescence was observed in the brain. There are marked differences in the spatial distributions of three anticancer drugs within tumor tissue and normal tissues over time, with greater exposure to most normal tissues and limited drug distribution to many cells in tumors. Studies of the spatial distribution of drugs are required to complement pharmacokinetic data in order to better understand and predict drug effects and toxicities.
Jambor, Ivan; Merisaari, Harri; Aronen, Hannu J; Järvinen, Jukka; Saunavaara, Jani; Kauko, Tommi; Borra, Ronald; Pesola, Marko
2014-05-01
To determine the optimal b-value distribution for biexponential diffusion-weighted imaging (DWI) of normal prostate using both a computer modeling approach and in vivo measurements. Optimal b-value distributions for the fit of three parameters (fast diffusion Df, slow diffusion Ds, and fraction of fast diffusion f) were determined using Monte Carlo simulations. The optimal b-value distribution was calculated using four individual optimization methods. Eight healthy volunteers underwent four repeated 3 Tesla prostate DWI scans using both 16 equally distributed b-values and an optimized b-value distribution obtained from the simulations. The b-value distributions were compared in terms of measurement reliability and repeatability using Shrout-Fleiss analysis. Using low noise levels, the optimal b-value distribution formed three separate clusters at low (0-400 s/mm2), mid-range (650-1200 s/mm2), and high b-values (1700-2000 s/mm2). Higher noise levels resulted in less pronounced clustering of b-values. The clustered optimized b-value distribution demonstrated better measurement reliability and repeatability in Shrout-Fleiss analysis compared with 16 equally distributed b-values. The optimal b-value distribution was found to be a clustered distribution with b-values concentrated in the low, mid, and high ranges and was shown to improve the estimation quality of biexponential DWI parameters of in vivo experiments. Copyright © 2013 Wiley Periodicals, Inc.
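The biexponential model being fitted can be written down directly; the parameter values below are illustrative placeholders, not the study's estimates for normal prostate:

```python
import math

def biexp_signal(b, s0, f, d_fast, d_slow):
    """Biexponential DWI signal model:
    S(b) = S0 * (f*exp(-b*Df) + (1-f)*exp(-b*Ds)).
    b in s/mm^2, diffusivities in mm^2/s."""
    return s0 * (f * math.exp(-b * d_fast) + (1 - f) * math.exp(-b * d_slow))

# hypothetical parameter values for illustration only
s0, f, d_fast, d_slow = 1.0, 0.7, 2.0e-3, 0.4e-3
# sample the signal at one b-value from each optimized cluster
signals = {b: biexp_signal(b, s0, f, d_fast, d_slow) for b in (0, 400, 1200, 2000)}
```

Sampling one b-value from each cluster (low, mid, high) pins down S0, the fast component and the slow tail respectively, which is intuitively why the clustered design estimates the three parameters better than equally spaced b-values.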
Confidence Intervals for True Scores Using the Skew-Normal Distribution
Garcia-Perez, Miguel A.
2010-01-01
A recent comparative analysis of alternative interval estimation approaches and procedures has shown that confidence intervals (CIs) for true raw scores determined with the Score method--which uses the normal approximation to the binomial distribution--have actual coverage probabilities that are closest to their nominal level. It has also recently…
Reply to: Are There More Gifted People than Would Be Expected on a Normal Distribution?
Gallagher, James J.
2014-01-01
The author responds to the article by Warne, Godwin, and Smith (2013) on the question of whether there are more gifted people than would be expected in a Gaussian normal distribution. He asserts that the answer to this question is yes, based on (a) data that he and his colleagues have collected, (b) data that are already available and quoted by…
The Weight of Euro Coins: Its Distribution Might Not Be as Normal as You Would Expect
Shkedy, Ziv; Aerts, Marc; Callaert, Herman
2006-01-01
Classical regression models, ANOVA models and linear mixed models are just three examples (out of many) in which the normal distribution of the response is an essential assumption of the model. In this paper we use a dataset of 2000 euro coins containing information (up to the milligram) about the weight of each coin, to illustrate that the…
Using an APOS Framework to Understand Teachers' Responses to Questions on the Normal Distribution
Bansilal, Sarah
2014-01-01
This study is an exploration of teachers' engagement with concepts embedded in the normal distribution. The participants were a group of 290 in-service teachers enrolled in a teacher development program. The research instrument was an assessment task that can be described as an "unknown percentage" problem, which required the application…
Bellera, Carine A.; Julien, Marilyse; Hanley, James A.
2010-01-01
The Wilcoxon statistics are usually taught as nonparametric alternatives for the 1- and 2-sample Student-"t" statistics in situations where the data appear to arise from non-normal distributions, or where sample sizes are so small that we cannot check whether they do. In the past, critical values, based on exact tail areas, were…
Confidence bounds and hypothesis tests for normal distribution coefficients of variation
Steve P. Verrill; Richard A. Johnson
2007-01-01
For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations. To develop these confidence bounds and test, we first establish that estimators based on Newton steps from n-...
A simple approximation to the bivariate normal distribution with large correlation coefficient
Albers, Willem/Wim; Kallenberg, W.C.M.
1994-01-01
The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the
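For orientation, the quantity being approximated can be checked by brute force: a Monte Carlo estimate of the standard bivariate normal CDF built from the usual Cholesky construction, compared against the closed form P(X ≤ 0, Y ≤ 0) = 1/4 + arcsin(ρ)/(2π), which holds exactly at the origin:

```python
import math, random

def bvn_cdf_mc(h, k, rho, n=200000, seed=0):
    """Monte Carlo estimate of P(X <= h, Y <= k) for standard bivariate
    normals with correlation rho, via Y = rho*Z1 + sqrt(1-rho^2)*Z2."""
    rng = random.Random(seed)
    c = math.sqrt(1 - rho * rho)
    hits = 0
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rng.gauss(0, 1)
        if z1 <= h and rho * z1 + c * z2 <= k:
            hits += 1
    return hits / n

rho = 0.95  # the large-correlation regime the paper targets
p = bvn_cdf_mc(0.0, 0.0, rho)
exact = 0.25 + math.asin(rho) / (2 * math.pi)  # orthant probability
```

As ρ → 1 the orthant probability tends to 1/2, which is the regime where generic approximations lose accuracy and the specialized approximation of the paper is useful.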
Sample size determination for logistic regression on a logit-normal distribution.
Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance
2017-06-01
Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination (R²) of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for R² for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.
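The logit-normal property the authors rely on is easy to state in code: pushing a normal variate through the logistic function yields a logit-normally distributed response on (0, 1). A minimal sketch with illustrative parameter values:

```python
import math, random

def logit_normal_sample(mu, sigma, n, seed=3):
    """If Z ~ N(mu, sigma^2), then p = 1/(1+exp(-Z)) is logit-normal
    on (0, 1); mu and sigma are on the logit scale."""
    rng = random.Random(seed)
    return [1 / (1 + math.exp(-rng.gauss(mu, sigma))) for _ in range(n)]

ps = logit_normal_sample(0.0, 1.0, 50000)
# probabilities lie strictly inside (0, 1); with mu = 0 the median is 0.5
```

Inverting the same transform (the logit) carries the outcome back to a normal scale, which is what lets standard normal-theory sample size formulas be applied.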
Energy Technology Data Exchange (ETDEWEB)
Ishizaka, Hiroshi; Kurihara, Mikiko; Tomioka, Kuniaki; Kobayashi, Kanako; Sato, Noriko; Nagai, Teruo; Heshiki, Atsuko; Amanuma, Makoto; Mizuno, Hitomi.
1989-02-01
Normal vertebral bone marrow intensity distribution and its alteration in various anemias were evaluated on short-TI inversion recovery (STIR) sequences. The material consists of 73 individuals: 48 normals and 25 anemic patients, excluding neoplastic conditions. All normal and reactive hypercellular bone marrow revealed a characteristic intensity distribution, marginal high intensity and central low intensity, corresponding well to the normal distribution of red and yellow marrow and the physiological or reactive conversion between them. Aplastic anemia did not reveal the normal intensity distribution, presumably due to its autonomous condition.
Sass, D. A.; Schmitt, T. A.; Walker, C. M.
2008-01-01
Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…
Distribution of Different Sized Ocular Surface Vessels in Diabetics and Normal Individuals.
Banaee, Touka; Pourreza, Hamidreza; Doosti, Hassan; Abrishami, Mojtaba; Ehsaei, Asieh; Basiry, Mohsen; Pourreza, Reza
2017-01-01
To compare the distribution of different sized vessels using digital photographs of the ocular surface of diabetic and normal individuals. In this cross-sectional study, red-free conjunctival photographs of diabetic and normal individuals, aged 30-60 years, were taken under defined conditions and analyzed using a Radon transform-based algorithm for vascular segmentation. The image areas occupied by vessels (AOV) of different diameters were calculated. The main outcome measure was the distribution curve of mean AOV of different sized vessels. Secondary outcome measures included total AOV and standard deviation (SD) of AOV of different sized vessels. Two hundred and sixty-eight diabetic patients and 297 normal (control) individuals were included, differing in age (45.50 ± 5.19 vs. 40.38 ± 6.19 years, P < …). The distribution curves of mean AOV differed between patients and controls (smaller AOV for larger vessels in patients; P < …) distribution curve of vessels compared to controls. Presence of diabetes mellitus is associated with contraction of larger vessels in the conjunctiva. Smaller vessels dilate with diabetic retinopathy. These findings may be useful in the photographic screening of diabetes mellitus and retinopathy.
International Nuclear Information System (INIS)
Keall, P J; Webb, S
2007-01-01
The heterogeneity of human tumour radiation response is well known. Researchers have used the normal distribution to describe interpatient tumour radiosensitivity. However, many natural phenomena show a log-normal distribution. Log-normal distributions are common when mean values are low, variances are large and values cannot be negative. These conditions apply to radiosensitivity. The aim of this work was to evaluate the log-normal distribution to predict clinical tumour control probability (TCP) data and to compare the results with the homogeneous (δ-function with a single α-value) and normal distributions. The clinically derived TCP data for four tumour types (melanoma, breast, squamous cell carcinoma and nodes) were used to fit the TCP models. Three forms of interpatient tumour radiosensitivity were considered: the log-normal, normal and δ-function. The free parameters in the models were the radiosensitivity mean, standard deviation and clonogenic cell density. The evaluation metric was the deviance of the maximum likelihood estimation of the fit of the TCP calculated using the predicted parameters to the clinical data. We conclude that (1) the log-normal and normal distributions of interpatient tumour radiosensitivity heterogeneity more closely describe clinical TCP data than a single radiosensitivity value and (2) the log-normal distribution has some theoretical and practical advantages over the normal distribution. Further work is needed to test these models on higher quality clinical outcome datasets
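The comparison can be sketched with the standard Poisson TCP model averaged over a log-normal radiosensitivity population. All numerical values below (dose, clonogen number, α mean and SD) are illustrative assumptions, not the fitted parameters of the paper:

```python
import math, random

def tcp_lognormal(dose, n_clonogens, alpha_mean, alpha_sd,
                  n_patients=20000, seed=11):
    """Population TCP under the Poisson model TCP_i = exp(-N*exp(-alpha_i*D)),
    averaging over patients whose radiosensitivity alpha is log-normally
    distributed with the given arithmetic mean and SD."""
    rng = random.Random(seed)
    # convert arithmetic mean/SD to the log-normal mu, sigma parameters
    sigma2 = math.log(1 + (alpha_sd / alpha_mean) ** 2)
    mu = math.log(alpha_mean) - sigma2 / 2
    sigma = math.sqrt(sigma2)
    total = 0.0
    for _ in range(n_patients):
        alpha = rng.lognormvariate(mu, sigma)
        total += math.exp(-n_clonogens * math.exp(-alpha * dose))
    return total / n_patients

# illustrative values: 60 Gy, 1e7 clonogens, alpha ~ log-normal(0.3, 0.1 SD)
tcp = tcp_lognormal(dose=60.0, n_clonogens=1e7, alpha_mean=0.3, alpha_sd=0.1)
```

With any heterogeneity in α the population TCP falls below the single-α prediction and the dose-response curve flattens, which is why a δ-function fits clinical data poorly.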
Smolyar, V A; Eremin, V V
2002-01-01
Within a diffusion model of the kinetic equation for a beam of electrons incident on a target along the normal, analytical formulae are derived for the distributions of deposited energy and injected charge. No empirical adjustable parameters are introduced into the theory. The calculated deposited-energy distributions for a plane directed electron source within an infinite medium of C, Al, Sn and Pb are in good agreement with the Spencer data, which were derived from an accurate solution of the Bethe equation, itself the starting equation assumed by the diffusion model
An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution
Campbell, C. W.
1983-01-01
An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform-distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
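A sketch of such a generator, using the standard construction from two independent standard normals (the details of the original FORTRAN routine are not reproduced here):

```python
import math, random

def bivariate_normal_pair(mx, my, sx, sy, rho, rng=random):
    """One (x, y) pair with the desired means, standard deviations and
    correlation: x drives y through rho (Cholesky construction)."""
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    x = mx + sx * z1
    y = my + sy * (rho * z1 + math.sqrt(1 - rho * rho) * z2)
    return x, y

random.seed(5)
pairs = [bivariate_normal_pair(2.0, -1.0, 1.5, 0.5, 0.8) for _ in range(50000)]
xs, ys = zip(*pairs)
```

The sample means, standard deviations and correlation converge to the requested values; residual error comes only from the underlying uniform generator and floating-point arithmetic, matching the abstract's accuracy claim.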
International Nuclear Information System (INIS)
Mathis, J.S.; Wallenhorst, S.G.
1981-01-01
The effect of changing the upper and lower size limits of a distribution of bare graphite and silicate particles with n(a) ∝ a^(-q) is investigated. Mathis, Rumpl, and Nordsieck showed that the normal extinction is matched very well by having the small-size cutoff, a_-, ≈ 0.005 or 0.01 μm, the large-size cutoff, a_+, about 0.25 μm, and q = 3.5 for both substances. We consider the progressively peculiar extinctions exhibited by the well-observed stars sigma Sco, rho Oph, and theta 1 Ori C, with values of R_V [≡ A_V/E(B-V)] of 3.4, 4.4, and 5.5 compared to the normal 3.1. Two (sigma Sco, rho Oph) are in a neutral dense cloud; theta 1 Ori C is in the Orion Nebula. We find that sigma Sco has a normal graphite distribution but has had its small silicate particles removed, so that a_-(sil) ≈ 0.04 μm if q = 3.5, or q(sil) = 2.6 if the size limits are fixed. However, the upper size limit on silicates remains normal. In rho Oph, the graphite is still normal, but both a_-(sil) and a_+(sil) are increased, to about 0.04 μm and 0.4 or 0.5 μm, respectively, if q = 3.5, or q(sil) ≈ 1.3 if the size limits are fixed. In theta 1 Ori, the small limit on graphite has increased to about 0.04 μm, or q(gra) ≈ 3, while the silicates are about like those in rho Oph. The calculated λ2175 bump is broader than the observed, but normal foreground extinction probably contributes appreciably to the observed bump. The absolute amount of extinction per H atom for rho Oph is not explained. The column density of H is so large that systematic effects might be present. Very large graphite particles (a > 3 μm) are required to "hide" the graphite without overly affecting the visual extinction, but a normal (small) graphite size distribution is required by the λ2175 bump. We feel that it is unlikely that such a bimodal distribution exists
Differentiation in boron distribution in adult male and female rats' normal brain: A BNCT approach
International Nuclear Information System (INIS)
Goodarzi, Samereh; Pazirandeh, Ali; Jameie, Seyed Behnamedin; Baghban Khojasteh, Nasrin
2012-01-01
Boron distribution in adult male and female rats' normal brain after boron carrier injection (0.005 g boric acid + 0.005 g borax + 10 ml distilled water, pH 7.4) was studied in this research. Coronal sections of control and trial animal tissue samples were irradiated with thermal neutrons. Using alpha autoradiography, significant differences in boron concentration were seen in the forebrain, midbrain and hindbrain sections of the male and female animal groups, with the highest value four hours after boron compound injection. - Highlights: ► Boron distribution in male and female rats' normal brain was studied in this research. ► Coronal sections of animal tissue samples were irradiated with thermal neutrons. ► Alpha and lithium tracks were counted using alpha autoradiography. ► Different boron concentrations were seen in brain sections of male and female rats. ► The highest boron concentration was seen 4 h after boron compound injection.
International Nuclear Information System (INIS)
Kuznetsov, Yu.V.; Al'terman, Eh.I.; Lisitsyn, A.P. (AN SSSR, Moscow. Inst. Okeanologii)
1981-01-01
The possibilities of the method of normalization of experimental ionic curves with reference to the dating of abyssal sediments and the establishment of their accumulation rates are studied. The method is based on the correlation between ionic-curve extrema and variations of Fe, Mn, C(org), and P contents in abyssal oceanic sediments. It has been found that the method can be successfully applied to correct ²³⁰Th vertical distribution data obtained by low-background γ-spectrometry. The method gives the most reliable results when the vertical distribution curves of the elements that concentrate ²³⁰Th vary concordantly with one another in the sediments. In many cases, normalization of the experimental ionic curves makes it possible to establish the age stratification of the sediment.
Comparing of Normal Stress Distribution in Static and Dynamic Soil-Structure Interaction Analyses
International Nuclear Information System (INIS)
Kholdebarin, Alireza; Massumi, Ali; Davoodi, Mohammad; Tabatabaiefar, Hamid Reza
2008-01-01
It is important to consider the vertical component of earthquake loading and inertia force in soil-structure interaction analyses. In most circumstances, design engineers are primarily concerned with analyzing the behavior of foundations subjected to earthquake-induced forces transmitted from the bedrock. In this research, a single rigid foundation with designated geometrical parameters located on sandy-clay soil has been modeled in FLAC software with the Finite Difference Method and subjected to the vertical components of three different earthquake records. In these cases, it is important to evaluate the effect of the footing on the underlying soil and to consider the normal stress in the soil with and without the footing. The distribution of normal stress under the footing in static and dynamic states has been studied and compared. This comparison indicated that the increase in normal stress under the footing caused by the vertical component of ground excitation decreases the dynamic vertical settlement in comparison with the static state.
Laubach, S. E.; Hundley, T. H.; Hooker, J. N.; Marrett, R. A.
2018-03-01
Fault arrays typically include a wide range of fault sizes; the faults may be randomly located, clustered together, or regularly or periodically located in a rock volume. Here, we investigate the size distribution and spatial arrangement of normal faults using rigorous size-scaling methods and normalized correlation count (NCC). Outcrop data from Miocene sedimentary rocks in the immediate upper plate of the regional Buckskin detachment, a low-angle normal fault, show differing patterns of spatial arrangement as a function of displacement (offset). Using lower size thresholds of 1, 0.1, 0.01, and 0.001 m, displacements range over five orders of magnitude, from less than 0.001 m to more than 100 m, and have power-law frequency distributions spanning about four orders of magnitude, with exponents of -0.6 and -0.9. The largest faults, with >1 m displacement, have a shallower size-distribution slope and regular spacing of about 20 m. In contrast, smaller faults have steep size-distribution slopes and irregular spacing, with NCC plateau patterns indicating imposed clustering. Cluster widths are 15 m for the 0.1-m threshold, 14 m for the 0.01-m threshold, and 1 m for the 0.001-m displacement threshold faults. Results demonstrate that normalized correlation count effectively characterizes the spatial arrangement patterns of these faults. Our example from a high-strain fault pattern above a detachment is compatible with size and spatial organization influenced primarily by boundary conditions such as fault shape, mechanical unit thickness, and internal stratigraphy on a range of scales, rather than purely by interaction among faults during their propagation.
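Power-law frequency distributions of displacement are typically estimated from a log-log plot of the complementary cumulative count against displacement. A minimal, hedged sketch on idealized synthetic data (not the Buckskin outcrop data; names and values invented):

```python
import math

def ccdf_slope(displacements):
    """Least-squares slope of log10(rank/n) vs log10(displacement):
    an estimate of the (negative) power-law exponent."""
    xs = sorted(displacements, reverse=True)
    n = len(xs)
    pts = [(math.log10(x), math.log10((i + 1) / n)) for i, x in enumerate(xs)]
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    num = sum((px - mx) * (py - my) for px, py in pts)
    den = sum((px - mx) ** 2 for px, _ in pts)
    return num / den

# Idealized fault population constructed so that CCDF(x) = (x / x_min)^(-0.8)
alpha, x_min, n = 0.8, 0.001, 2000
data = [x_min * ((k + 1) / n) ** (-1.0 / alpha) for k in range(n)]
print(f"fitted exponent: {ccdf_slope(data):.3f}")   # ≈ -0.8
```

On real outcrop data the fit is only taken above a lower size threshold, exactly as the abstract's 1, 0.1, 0.01, and 0.001 m thresholds do.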
Normal cranial bone marrow MR imaging pattern with age-related ADC value distribution
International Nuclear Information System (INIS)
Li Qi; Pan Shinong; Yin Yuming; Li Wei; Chen Zhian; Liu Yunhui; Wu Zhenhua; Guo Qiyong
2011-01-01
Objective: To determine the MRI appearances of normal age-related cranial bone marrow and the relationship between MRI patterns and apparent diffusion coefficient (ADC) values. Methods: Five hundred subjects were divided into seven groups based on age. Cranial bone marrow MRI patterns were classified based on the thickness of the diploe and the signal intensity distribution characteristics. ADC values of the frontal, parietal, occipital and temporal bones on DWI were measured and calculated. Correlations between age and ADC values, between patterns and ADC values, as well as the distribution of ADC values, were analyzed. Results: Normal cranial bone marrow was divided into four types (I-IV) and six subtypes; type was positively correlated with increasing age (χ² = 266.36, P < 0.05). In addition, there was a significant negative correlation between the ADC values and MRI patterns in the normal parietal and occipital bones (r = -0.691 and -0.750, P < 0.01). Conclusion: The combination of MRI features and changes in ADC values in different cranial bones showed significant correlation with increasing age. Familiarity with the MRI appearance of the normal bone marrow conversion pattern in each age group and the corresponding ADC values will aid in the diagnosis and differential diagnosis of cranial bone pathology.
International Nuclear Information System (INIS)
Devine, E.; Artman, L.; Budzik, G.; Bush, E.; Holleman, W.
1986-01-01
The 24-amino-acid peptide ANH(5-28) was N-terminally labeled with I-125 Bolton-Hunter reagent, iodo-N-succinimidyl 3-(4-hydroxyphenyl)propionate. The I-125 peptide plus 1 μg/kg of the I-127 Bolton-Hunter peptide was injected into normal and nephrectomized anesthetized (Nembutal) rats. Blood samples were drawn into a cocktail developed to inhibit plasma-induced degradation. Radiolabeled peptides were analyzed by HPLC. A biphasic curve of I-125 ANH(5-28) elimination was obtained, the first phase (t1/2 = 15 s) representing in vivo distribution and the second phase (t1/2 = 7-10 min) a measurement of elimination. This biphasic elimination curve was similar in normal and nephrectomized rats. The apparent volumes of distribution were 15-20 ml for the first phase and >300 ml for the second phase. In order to examine the tissue distribution of the peptide, animals were sacrificed at 2 minutes and the I-125 tissue contents were quantitated. The majority of the label was located in the liver (50%), kidneys (21%) and lungs (5%). The degradative peptides appearing in the plasma and urine of normal rats were identical. No intact radiolabeled ANH(5-28) was found in the urine. In conclusion, iodinated Bolton-Hunter-labeled ANH(5-28) is rapidly removed from the circulation by the liver and, to a lesser extent, by the kidney, but the rate of elimination is not decreased by nephrectomy.
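The biphasic elimination described above corresponds to a two-compartment, bi-exponential model. A hedged sketch with half-lives taken from the abstract (the amplitudes 0.9/0.1 and sampling times are invented), showing how the terminal half-life is recovered from the late log-linear phase:

```python
import math

# Two-compartment elimination: distribution t1/2 = 15 s (0.25 min),
# terminal elimination t1/2 = 8 min (within the 7-10 min range reported).
k_dist = math.log(2) / 0.25
k_elim = math.log(2) / 8.0

def conc(t, a=0.9, b=0.1):
    """Fraction of injected tracer remaining in plasma at time t (minutes)."""
    return a * math.exp(-k_dist * t) + b * math.exp(-k_elim * t)

# Terminal half-life from two late samples, where the fast phase has decayed away
t1, t2 = 30.0, 40.0
t_half_term = math.log(2) * (t2 - t1) / math.log(conc(t1) / conc(t2))
print(f"terminal half-life ≈ {t_half_term:.1f} min")
```

By 30 minutes the distribution phase has decayed to a negligible level, so the log-linear slope returns the elimination half-life alone.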
Latré, S.; Desplentere, F.; De Pooter, S.; Seveno, D.
2017-10-01
Nanoscale materials showing superior thermal properties have raised the interest of the building industry. By adding these materials to conventional construction materials, it is possible to decrease the total thermal conductivity by almost one order of magnitude. This conductivity is mainly influenced by the dispersion quality within the matrix material. At the industrial scale, the main challenge is to control this dispersion so as to reduce or even eliminate thermal bridges. This allows one to reach an industrially relevant process that balances the high material cost against the superior thermal insulation properties. Therefore, a methodology is required to measure and describe these nanoscale distributions within the inorganic matrix material. These distributions are either random or normally distributed through the thickness of the matrix material. We show that the influence of these distributions is meaningful and modifies the thermal conductivity of the building material. Hence, this strategy will generate a thermal model allowing prediction of the thermal behavior of the nanoscale particles and their distributions. This thermal model will be validated by the hot wire technique. At present, good agreement is found between the numerical results and experimental data for nanoparticles randomly distributed in all directions.
Breast cancer subtype distribution is different in normal weight, overweight, and obese women.
Gershuni, Victoria; Li, Yun R; Williams, Austin D; So, Alycia; Steel, Laura; Carrigan, Elena; Tchou, Julia
2017-06-01
Obesity is associated with tumor-promoting pathways related to insulin resistance and chronic low-grade inflammation, which have been linked to various disease states, including cancer. Many studies have focused on the relationship between obesity and increased estrogen production, which contributes to the pathogenesis of estrogen receptor-positive breast cancers. The link between obesity and other breast cancer subtypes, such as triple-negative breast cancer (TNBC) and Her2/neu+ (Her2+) breast cancer, is less clear. We hypothesize that obesity may be associated with the pathogenesis of specific breast cancer subtypes, resulting in a different subtype distribution than in normal weight women. A single-institution, retrospective analysis of the tumor characteristics of 848 patients diagnosed with primary operable breast cancer between 2000 and 2013 was performed to evaluate the association between BMI and clinical outcome. Patients were grouped based on their BMI at the time of diagnosis, stratified into three subgroups: normal weight (BMI = 18-24.9), overweight (BMI = 25-29.9), and obese (BMI > 30). The distribution of breast cancer subtypes across the three BMI subgroups was compared. Obese and overweight women were more likely to present with TNBC, and normal weight women with Her2+ breast cancer (p = 0.008). We demonstrated, for the first time, that breast cancer subtype distribution varied significantly according to BMI status. Our results suggest that obesity might activate molecular pathways other than the well-known obesity/estrogen circuit in the pathogenesis of breast cancer. Future studies are needed to understand the molecular mechanisms that drive the variation in subtype distribution across BMI subgroups.
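The subtype-by-BMI comparison above is a chi-square test of independence on a contingency table. A hedged sketch with invented counts (not the study's data) of how such a statistic is computed:

```python
# Hypothetical 3x3 contingency table: rows = BMI group, cols = subtype
# (ER+, Her2+, TNBC).  All counts are invented for illustration only.
table = [
    [60, 25, 15],   # normal weight
    [65, 15, 25],   # overweight
    [70, 10, 30],   # obese
]

rows = [sum(r) for r in table]
cols = [sum(c) for c in zip(*table)]
total = sum(rows)

# chi-square statistic: sum of (observed - expected)^2 / expected
chi2 = sum(
    (table[i][j] - rows[i] * cols[j] / total) ** 2 / (rows[i] * cols[j] / total)
    for i in range(3) for j in range(3)
)
dof = (3 - 1) * (3 - 1)
print(f"chi2 = {chi2:.2f} on {dof} df (5% critical value for 4 df is 9.49)")
```

A statistic above the critical value, as here, would indicate that subtype distribution depends on BMI group, which is the form of the study's p = 0.008 finding.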
CERN. Geneva
2018-01-01
Global science calls for global infrastructure. A typical large-scale research group will use a suite of international services and involve hundreds of collaborating institutes and users from around the world. How can these users access those services securely? How can their digital identities be established, verified and maintained? We will explore the motivation for distributed authentication and the ways in which research communities are addressing the challenges. We will discuss security incident response in distributed environments - a particular challenge for the operators of these infrastructures. Through this course you should gain an overview of federated identity technologies and protocols, including X.509 certificates, SAML and OIDC.
Energy Technology Data Exchange (ETDEWEB)
Karacan, C.O.; Goodman, G.V.R. [NIOSH, Pittsburgh, PA (United States). Office of Mine Safety and Health Research
2011-01-15
Gob gas ventholes (GGV) are used to control methane emissions in longwall mines by capturing it within the overlying fractured strata before it enters the work environment. In order for GGVs to effectively capture more methane and less mine air, the length of the slotted sections and their proximity to top of the coal bed should be designed based on the potential gas sources and their locations, as well as the displacements in the overburden that will create potential flow paths for the gas. In this paper, an approach to determine the conditional probabilities of depth-displacement, depth-flow percentage, depth-formation and depth-gas content of the formations was developed using bivariate normal distributions. The flow percentage, displacement and formation data as a function of distance from coal bed used in this study were obtained from a series of borehole experiments contracted by the former US Bureau of Mines as part of a research project. Each of these parameters was tested for normality and was modeled using bivariate normal distributions to determine all tail probabilities. In addition, the probability of coal bed gas content as a function of depth was determined using the same techniques. The tail probabilities at various depths were used to calculate conditional probabilities for each of the parameters. The conditional probabilities predicted for various values of the critical parameters can be used with the measurements of flow and methane percentage at gob gas ventholes to optimize their performance.
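The conditional probabilities described above follow from a standard property of the bivariate normal: given X = x, Y is normal with mean shifted by ρ(σy/σx)(x − μx) and variance reduced by the factor (1 − ρ²). A hedged sketch with invented parameter values (not the borehole data):

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def conditional_tail_prob(x, threshold, mx, my, sx, sy, rho):
    """P(Y > threshold | X = x) for a bivariate normal (mx, my, sx, sy, rho)."""
    mean_y = my + rho * (sy / sx) * (x - mx)
    sd_y = sy * math.sqrt(1.0 - rho ** 2)
    return 1.0 - normal_cdf((threshold - mean_y) / sd_y)

# Illustrative, invented numbers: X = distance above coal bed (m),
# Y = flow fraction captured at the venthole.
p = conditional_tail_prob(x=20.0, threshold=0.5,
                          mx=30.0, my=0.4, sx=10.0, sy=0.2, rho=-0.6)
print(f"P(flow fraction > 0.5 | depth = 20 m) = {p:.3f}")
```

The paper's tail probabilities at various depths are this same computation applied to fitted bivariate normal models of the measured parameters.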
Impact of foot progression angle on the distribution of plantar pressure in normal children.
Lai, Yu-Cheng; Lin, Huey-Shyan; Pan, Hui-Fen; Chang, Wei-Ning; Hsu, Chien-Jen; Renn, Jenn-Huei
2014-02-01
Plantar pressure distribution during walking is affected by several gait factors, most especially the foot progression angle, which has been studied in children with neuromuscular diseases. However, this relationship in normal children has been reported in only a few studies. The purpose of this study is to clarify the correlation between foot progression angle and plantar pressure distribution in normal children, as well as the impacts of age and sex on this correlation. This study retrospectively reviewed dynamic pedobarographic data included in the gait laboratory database of our institution. In total, 77 normally developed children aged 5-16 years who were treated between 2004 and 2009 were included. Each child's footprint was divided into 5 segments: lateral forefoot, medial forefoot, lateral midfoot, medial midfoot, and heel. The percentages of impulse exerted at the medial foot, forefoot, midfoot, and heel were calculated. The average foot progression angle was 5.03° toe-out. Most of the total impulse was exerted on the forefoot (52.0%). Toe-out gait was positively correlated with high medial foot pressure (r = 0.274; P …) … plantar pressure as part of the treatment of various foot pathologies.
Foldager, Casper Bindzus; Toh, Wei Seong; Gomoll, Andreas H; Olsen, Bjørn Reino; Spector, Myron
2014-04-01
The objective of the present study was to investigate the presence and distribution of 2 basement membrane (BM) molecules, laminin and collagen type IV, in healthy and degenerative cartilage tissues. Normal and degenerated tissues were obtained from goats and humans, including articular knee cartilage, the intervertebral disc, and meniscus. Normal tissue was also obtained from patella-tibial enthesis in goats. Immunohistochemical analysis was performed using anti-laminin and anti-collagen type IV antibodies. Human and goat skin were used as positive controls. The percentage of cells displaying the pericellular presence of the protein was graded semiquantitatively. When present, laminin and collagen type IV were exclusively found in the pericellular matrix, and in a discrete layer on the articulating surface of normal articular cartilage. In normal articular (hyaline) cartilage in the human and goat, the proteins were found co-localized pericellularly. In contrast, in human osteoarthritic articular cartilage, collagen type IV but not laminin was found in the pericellular region. Nonpathological fibrocartilaginous tissues from the goat, including the menisci and the enthesis, were also positive for both laminin and collagen type IV pericellularly. In degenerated fibrocartilage, including intervertebral disc, as in degenerated hyaline cartilage only collagen type IV was found pericellularly around chondrocytes but with less intense staining than in non-degenerated tissue. In calcified cartilage, some cells were positive for laminin but not type IV collagen. We report differences in expression of the BM molecules, laminin and collagen type IV, in normal and degenerative cartilaginous tissues from adult humans and goats. In degenerative tissues laminin is depleted from the pericellular matrix before collagen type IV. The findings may inform future studies of the processes underlying cartilage degeneration and the functional roles of these 2 extracellular matrix proteins
Liang, Faming; Jin, Ick-Hoon
2013-01-01
Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or is very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals.
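The key step of MCMH is replacing the intractable normalizing-constant ratio in the Metropolis-Hastings acceptance probability with a Monte Carlo estimate from auxiliary draws. A hedged sketch of that ratio estimate on a toy exponential family where the truth is known (the model and all numbers are invented for illustration, not taken from the letter):

```python
import math, random

random.seed(1)

# For an unnormalized density f(x, theta) = exp(-theta * x^2), the ratio
#   Z(theta') / Z(theta) = E_{x ~ pi(.|theta)}[ f(x, theta') / f(x, theta) ]
# can be estimated by Monte Carlo.  Here pi(.|theta) is N(0, 1/(2*theta)),
# so the exact answer is sqrt(theta / theta') and we can check the estimate.

def ratio_estimate(theta, theta_new, n=20000):
    sd = math.sqrt(1.0 / (2.0 * theta))
    total = 0.0
    for _ in range(n):
        x = random.gauss(0.0, sd)
        total += math.exp(-(theta_new - theta) * x * x)
    return total / n

est = ratio_estimate(1.0, 1.2)
exact = math.sqrt(1.0 / 1.2)
print(f"MC estimate {est:.4f} vs exact {exact:.4f}")
```

In MCMH this estimate is plugged into the acceptance ratio in place of the unknown Z(θ)/Z(θ′), which is what removes the need for perfect sampling.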
Energy Technology Data Exchange (ETDEWEB)
Žerovnik, Gašper, E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, Andrej, E-mail: andrej.trkov@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Smith, Donald L., E-mail: donald.l.smith@anl.gov [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States); Capote, Roberto, E-mail: roberto.capotenoy@iaea.org [NAPC–Nuclear Data Section, International Atomic Energy Agency, PO Box 100, Vienna-A-1400 (Austria)
2013-11-01
Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue.
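The transformation discussed above has a closed form: if (X, Y) is bivariate normal with correlation ρ and standard deviations σx, σy, the correlation of (e^X, e^Y) is (exp(ρσxσy) − 1) / sqrt((exp(σx²) − 1)(exp(σy²) − 1)). A short sketch showing why strong anti-correlations become unattainable at large relative uncertainty:

```python
import math

def lognormal_corr(rho, sx, sy):
    """Correlation of (exp(X), exp(Y)) when (X, Y) is bivariate normal
    with correlation rho and standard deviations sx, sy."""
    num = math.exp(rho * sx * sy) - 1.0
    den = math.sqrt((math.exp(sx ** 2) - 1.0) * (math.exp(sy ** 2) - 1.0))
    return num / den

# With sigma = 1 in log space (about 131% relative uncertainty),
# even rho = -1 cannot produce a lognormal correlation of -1:
min_corr = lognormal_corr(-1.0, 1.0, 1.0)
print(f"most negative attainable lognormal correlation: {min_corr:.3f}")
```

For σx = σy = 1 the bound works out to −1/e ≈ −0.368, illustrating the paper's point that the distribution's asymmetry mathematically forbids strong anti-correlations.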
International Nuclear Information System (INIS)
Yamazaki, Dai G.; Ichiki, Kiyotomo; Takahashi, Keitaro
2011-01-01
We study the effect of primordial magnetic fields (PMFs) on the anisotropies of the cosmic microwave background (CMB). We assume the spectrum of PMFs is described by a log-normal distribution, which has a characteristic scale, rather than by a power-law spectrum. This scale is expected to reflect the generation mechanisms, and our analysis is complementary to previous studies with power-law spectra. We calculate the power spectra of the energy density and Lorentz force of the log-normal PMFs, and then calculate the CMB temperature and polarization angular power spectra from the scalar, vector, and tensor modes of perturbations generated from such PMFs. By comparing these spectra with the WMAP7, QUaD, CBI, Boomerang, and ACBAR data sets, we find that the current CMB data places the strongest constraint at k ≅ 10^(-2.5) Mpc^(-1), with the upper limit B ≲ 3 nG.
The distribution of YKL-40 in osteoarthritic and normal human articular cartilage
DEFF Research Database (Denmark)
Volck, B; Ostergaard, K; Johansen, J S
1999-01-01
YKL-40, also called human cartilage glycoprotein-39, is a major secretory protein of human chondrocytes in cell culture. YKL-40 mRNA is expressed by cartilage from patients with rheumatoid arthritis, but is not detectable in normal human cartilage. The aim was to investigate the distribution of YKL-40 in osteoarthritic (n=9) and macroscopically normal (n=5) human articular cartilage, collected from 12 pre-selected areas of the femoral head, to discover a potential role for YKL-40 in cartilage remodelling in osteoarthritis. Immunohistochemical analysis showed that YKL-40 staining was found in chondrocytes of osteoarthritic cartilage, mainly in the superficial and middle zones of the cartilage rather than the deep zone. There was a tendency towards a high number of YKL-40-positive chondrocytes in areas of the femoral head with a considerable biomechanical load. The number of chondrocytes with a positive …
Use of critical pathway models and log-normal frequency distributions for siting nuclear facilities
International Nuclear Information System (INIS)
Waite, D.A.; Denham, D.H.
1975-01-01
The advantages and disadvantages of potential sites for nuclear facilities are evaluated through the use of environmental pathway and log-normal distribution analysis. Environmental considerations in nuclear facility siting are necessarily geared to the identification of media believed to be significant in terms of dose to man or to be potential centres for long-term accumulation of contaminants. To aid in meeting the scope and purpose of this identification, an exposure pathway diagram must be developed. This type of diagram helps to locate pertinent environmental media, points of expected long-term contaminant accumulation, and points of population/contaminant interface for both radioactive and non-radioactive contaminants. Confirmation of facility siting conclusions drawn from pathway considerations must usually be derived from an investigatory environmental surveillance programme. Battelle's experience with the interpretation of environmental surveillance data using log-normal techniques indicates that this distribution has much to offer in the planning, execution and analysis phases of such a programme. How these basic principles apply to the actual siting of a nuclear facility is demonstrated for a centrifuge-type uranium enrichment facility as an example. A model facility is examined, to the extent of available data, in terms of potential contaminants and general environmental needs. A critical exposure pathway diagram is developed to the point of prescribing the characteristics of an optimum site for such a facility. Possible necessary deviations from climatic constraints are reviewed and reconciled with conclusions drawn from the exposure pathway analysis. Details of log-normal distribution analysis techniques are presented, with examples of environmental surveillance data to illustrate data manipulation techniques and interpretation procedures as they affect the investigatory environmental surveillance programme. Appropriate consideration is given to these …
Very short-term probabilistic forecasting of wind power with generalized logit-Normal distributions
DEFF Research Database (Denmark)
Pinson, Pierre
2012-01-01
Very-short-term probabilistic forecasts, which are essential for an optimal management of wind generation, ought to account for the non-linear and double-bounded nature of that stochastic process. They take here the form of discrete-continuous mixtures of generalized logit-normal distributions and probability masses at the bounds. Both auto-regressive and conditional parametric auto-regressive models are considered for the dynamics of their location and scale parameters. Estimation is performed in a recursive least squares framework with exponential forgetting. The superiority of this proposal over …
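A common form of the generalized logit transform underlying such distributions maps bounded wind power x ∈ (0, 1) to the real line via y = log(x^ν / (1 − x^ν)); this particular parameterization is an assumption here, not quoted from the paper. A minimal sketch of the transform and its inverse:

```python
import math

def glogit(x, nu):
    """Generalized logit: maps x in (0, 1) to the real line.
    nu = 1 recovers the ordinary logit transform."""
    return math.log(x ** nu / (1.0 - x ** nu))

def glogit_inv(y, nu):
    """Inverse transform: maps the real line back to (0, 1)."""
    return (math.exp(y) / (1.0 + math.exp(y))) ** (1.0 / nu)

x = 0.35                      # normalized wind power
y = glogit(x, nu=2.0)         # modeled as (truncated) normal on this scale
assert abs(glogit_inv(y, nu=2.0) - x) < 1e-12   # exact round trip
print(f"x = {x} -> y = {y:.4f}")
```

Modeling y as normal on the transformed scale, plus discrete probability masses at 0 and 1, yields the discrete-continuous mixture the abstract describes.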
Log-Normal Distribution in a Growing System with Weighted and Multiplicatively Interacting Particles
Fujihara, Akihiro; Tanimoto, Satoshi; Yamamoto, Hiroshi; Ohtsuki, Toshiya
2018-03-01
A growing system with weighted and multiplicatively interacting particles is investigated. Each particle has a quantity that changes multiplicatively after a binary interaction, with its growth rate controlled by a weight parameter in a homogeneous symmetric kernel. We consider the system using moment inequalities and analytically derive the log-normal-type tail in the probability distribution function of quantities when the parameter is negative, which is different from the result for single-body multiplicative processes. We also find that the system approaches a winner-take-all state when the parameter is positive.
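For contrast with the interacting case above, the classical single-body multiplicative process is easy to simulate: the log of the quantity is a sum of i.i.d. increments, so by the central limit theorem the quantity itself is asymptotically log-normal. A hedged sketch (step size and counts are invented):

```python
import math, random

random.seed(7)

def multiplicative_walk(steps):
    """Single-body multiplicative process: q is multiplied by a random
    factor exp(N(0, 0.1)) at every step."""
    q = 1.0
    for _ in range(steps):
        q *= math.exp(random.gauss(0.0, 0.1))
    return q

# log(q) after 200 steps should be ~ N(0, 200 * 0.1^2) = N(0, 2)
logs = [math.log(multiplicative_walk(200)) for _ in range(2000)]
mean = sum(logs) / len(logs)
var = sum((v - mean) ** 2 for v in logs) / len(logs)
print(f"log-quantity: mean {mean:.3f}, variance {var:.3f}")  # ≈ 0, ≈ 2.0
```

The paper's point is that with pairwise multiplicative interactions and a weight parameter, the tail is only log-normal-type for negative parameter values, a result this baseline does not capture.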
American Option Pricing using GARCH models and the Normal Inverse Gaussian distribution
DEFF Research Database (Denmark)
Stentoft, Lars Peter
In this paper we propose a feasible way to price American options in a model with time-varying volatility and conditional skewness and leptokurtosis, using GARCH processes and the Normal Inverse Gaussian distribution. We show how the risk neutral dynamics can be obtained in this model, we interpret … properties shows that there are important option pricing differences compared to the Gaussian case as well as to the symmetric special case. A large-scale empirical examination shows that our model outperforms the Gaussian case for pricing options on three large US stocks as well as a major index …
The distribution of YKL-40 in osteoarthritic and normal human articular cartilage
DEFF Research Database (Denmark)
Volck, B; Ostergaard, K; Johansen, J S
1999-01-01
YKL-40, also called human cartilage glycoprotein-39, is a major secretory protein of human chondrocytes in cell culture. YKL-40 mRNA is expressed by cartilage from patients with rheumatoid arthritis, but is not detectable in normal human cartilage. The aim was to investigate the distribution of YKL...... in chondrocytes of osteoarthritic cartilage mainly in the superficial and middle zone of the cartilage rather than the deep zone. There was a tendency toward a high number of YKL-40-positive chondrocytes in areas of the femoral head with a considerable biomechanical load. The number of chondrocytes with a positive...
On the Use of the Log-Normal Particle Size Distribution to Characterize Global Rain
Meneghini, Robert; Rincon, Rafael; Liao, Liang
2003-01-01
Although most parameterizations of the drop size distributions (DSD) use the gamma function, there are several advantages to the log-normal form, particularly if we want to characterize the large scale space-time variability of the DSD and rain rate. The advantages of the distribution are twofold: the logarithm of any moment can be expressed as a linear combination of the individual parameters of the distribution; the parameters of the distribution are approximately normally distributed. Since all radar and rainfall-related parameters can be written approximately as a moment of the DSD, the first property allows us to express the logarithm of any radar/rainfall variable as a linear combination of the individual DSD parameters. Another consequence is that any power law relationship between rain rate, reflectivity factor, specific attenuation or water content can be expressed in terms of the covariance matrix of the DSD parameters. The joint-normal property of the DSD parameters has applications to the description of the space-time variation of rainfall in the sense that any radar-rainfall quantity can be specified by the covariance matrix associated with the DSD parameters at two arbitrary space-time points. As such, the parameterization provides a means by which we can use the spaceborne radar-derived DSD parameters to specify in part the covariance matrices globally. However, since satellite observations have coarse temporal sampling, the specification of the temporal covariance must be derived from ancillary measurements and models. Work is presently underway to determine whether the use of instantaneous rain rate data from the TRMM Precipitation Radar can provide good estimates of the spatial correlation in rain rate from data collected in 5° × 5° × 1 month space-time boxes. To characterize the temporal characteristics of the DSD parameters, disdrometer data are being used from the Wallops Flight Facility site where as many as 4 disdrometers have been
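The "linear combination" property above can be checked numerically. The sketch below (illustrative parameter values, not fitted to any rain data) verifies that ln M_n = ln N_T + nμ + n²σ²/2 for a log-normal DSD, which is linear in the parameters (ln N_T, μ, σ²):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical log-normal DSD parameters (illustrative only): total number
# concentration N_T, log-scale mu, and shape sigma of the drop diameter D
N_T, mu, sigma = 1000.0, 0.0, 0.5

def log_moment(n):
    # ln M_n = ln N_T + n*mu + n**2 * sigma**2 / 2
    # i.e. linear in the parameters (ln N_T, mu, sigma**2)
    return np.log(N_T) + n * mu + 0.5 * n**2 * sigma**2

# Monte Carlo check: M_n = N_T * E[D**n] with D ~ lognormal(mu, sigma)
D = rng.lognormal(mu, sigma, size=1_000_000)
mc_log_moments = {n: np.log(N_T * np.mean(D**n)) for n in (1, 2, 3)}
```

Because rain rate, reflectivity and water content are all approximately moments of the DSD, the same linearity then carries over to the logarithm of each of those variables.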
Exact, time-independent estimation of clone size distributions in normal and mutated cells.
Roshan, A; Jones, P H; Greenman, C D
2014-10-06
Biological tools such as genetic lineage tracing, three-dimensional confocal microscopy and next-generation DNA sequencing are providing new ways to quantify the distribution of clones of normal and mutated cells. Understanding population-wide clone size distributions in vivo is complicated by multiple cell types within observed tissues, and overlapping birth and death processes. This has led to the increased need for mathematically informed models to understand their biological significance. Standard approaches usually require knowledge of clonal age. We show that modelling on clone size independent of time is an alternative method that offers certain analytical advantages; it can help parametrize these models, and obtain distributions for counts of mutated or proliferating cells, for example. When applied to a general birth-death process common in epithelial progenitors, this takes the form of a gambler's ruin problem, the solution of which relates to counting Motzkin lattice paths. Applying this approach to mutational processes, alternative, exact, formulations of classic Luria-Delbrück-type problems emerge. This approach can be extended beyond neutral models of mutant clonal evolution. Applications of these approaches are twofold. First, we resolve the probability of progenitor cells generating proliferating or differentiating progeny in clonal lineage tracing experiments in vivo or cell culture assays where clone age is not known. Second, we model mutation frequency distributions that deep sequencing of subclonal samples produce.
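As a toy illustration of the clone-size behaviour described above — a critical birth-death sketch, not the authors' exact progenitor model — most clones go extinct while the surviving ones grow large, even though the mean clone size stays near one:

```python
import random

random.seed(42)

def clone_size(p_divide=0.5, steps=50):
    """Toy critical birth-death process: each progenitor independently either
    divides into two progenitors or differentiates (leaves the pool)."""
    n = 1
    for _ in range(steps):
        if n == 0:
            break
        n = sum(2 if random.random() < p_divide else 0 for _ in range(n))
    return n

sizes = [clone_size() for _ in range(2000)]
survivors = [s for s in sizes if s > 0]
extinct_frac = 1 - len(survivors) / len(sizes)
mean_size = sum(sizes) / len(sizes)   # stays near 1 at criticality
```

This is exactly the regime in which clone age matters: an observed clone-size distribution mixes many extinct lineages with a few large survivors, which is why the time-independent formulation described in the abstract is useful.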
Kartono; Suryadi, D.; Herman, T.
2018-01-01
This study aimed to analyze the enhancement of non-linear learning (NLL) in online tutorial (OT) content with respect to students' knowledge of normal distribution application (KONDA). KONDA is a competence expected to be achieved after students have studied the topic of normal distribution application in the course named Education Statistics. The analysis was performed with a quasi-experimental study design. The subjects were divided into an experimental class, which was given OT content in the NLL model, and a control class, which was given OT content in the conventional learning (CL) model. The data used in this study were the results of online objective tests measuring students' statistical prior knowledge (SPK) and students' pre- and post-test KONDA. Statistical analysis of the KONDA gain scores showed that, for students with low and moderate SPK scores, the KONDA of students who learned OT content with the NLL model was better than that of students who learned OT content with the CL model. Meanwhile, for students with high SPK scores, the gain score under the NLL model was similar to the gain score under the CL model. Based on these findings, it can be concluded that the NLL model applied to OT content can enhance the KONDA of students at low and moderate SPK levels. An extra and more challenging didactical situation is needed for students at the high SPK level to achieve a significant gain score.
Elastic microfibril distribution in the cornea: Differences between normal and keratoconic stroma.
White, Tomas L; Lewis, Philip N; Young, Robert D; Kitazawa, Koji; Inatomi, Tsutomu; Kinoshita, Shigeru; Meek, Keith M
2017-06-01
The optical and biomechanical properties of the cornea are largely governed by the collagen-rich stroma, a layer that represents approximately 90% of the total thickness. Within the stroma, the specific arrangement of superimposed lamellae provides the tissue with tensile strength, whilst the spatial arrangement of individual collagen fibrils within the lamellae confers transparency. In keratoconus, this precise stromal arrangement is lost, resulting in ectasia and visual impairment. In the normal cornea, we previously characterised the three-dimensional arrangement of an elastic fiber network spanning the posterior stroma from limbus-to-limbus. In the peripheral cornea/limbus there are elastin-containing sheets or broad fibers, most of which become microfibril bundles (MBs) with little or no elastin component when reaching the central cornea. The purpose of the current study was to compare this network with the elastic fiber distribution in post-surgical keratoconic corneal buttons, using serial block face scanning electron microscopy and transmission electron microscopy. We have demonstrated that the MB distribution is very different in keratoconus. MBs are absent from a region of stroma anterior to Descemet's membrane, an area that is densely populated in normal cornea, whilst being concentrated below the epithelium, an area in which they are absent in normal cornea. We contend that these latter microfibrils are produced as a biomechanical response to provide additional strength to the anterior stroma in order to prevent tissue rupture at the apex of the cone. A lack of MBs anterior to Descemet's membrane in keratoconus would alter the biomechanical properties of the tissue, potentially contributing to the pathogenesis of the disease. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Goodness-of-Fit Tests for Generalized Normal Distribution for Use in Hydrological Frequency Analysis
Das, Samiran
2018-04-01
The use of the three-parameter generalized normal (GNO) as a hydrological frequency distribution is well recognized, but its application is limited due to the unavailability of popular goodness-of-fit (GOF) test statistics. This study develops popular empirical distribution function (EDF)-based test statistics to investigate the goodness-of-fit of the GNO distribution. The focus is on the case most relevant to the hydrologist, namely, that in which the parameter values are unknown and estimated from a sample using the method of L-moments. The widely used EDF tests such as Kolmogorov-Smirnov, Cramer von Mises, and Anderson-Darling (AD) are considered in this study. A modified version of AD, namely, the Modified Anderson-Darling (MAD) test, is also considered and its performance is assessed against other EDF tests using a power study that incorporates six specific Wakeby distributions (WA-1, WA-2, WA-3, WA-4, WA-5, and WA-6) as the alternative distributions. The critical values of the proposed test statistics are approximated using Monte Carlo techniques and are summarized in chart and regression equation form to show the dependence on shape parameter and sample size. The performance results obtained from the power study suggest that the AD and a variant of the MAD (MAD-L) are the most powerful tests. Finally, the study performs case studies involving annual maximum flow data of selected gauged sites from Irish and US catchments to show the application of the derived critical values and recommends further assessments to be carried out on flow data sets of rivers with various hydrological regimes.
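The Monte Carlo approximation of critical values can be sketched as follows. For brevity, the sketch uses a plain normal null with moment-based parameter estimation instead of the GNO/L-moment machinery of the study; the essential step — re-estimating the parameters inside every simulated sample before computing the EDF statistic — is the same:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

def norm_cdf(z):
    # standard normal CDF, vectorized over an array
    return 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))

def ks_stat(x, cdf):
    """Kolmogorov-Smirnov statistic of sample x against a fitted CDF."""
    x = np.sort(x)
    n = len(x)
    F = cdf(x)
    return max(np.max(np.arange(1, n + 1) / n - F), np.max(F - np.arange(n) / n))

def mc_critical_value(n=30, nsim=2000, alpha=0.05):
    """Simulate the null with parameters re-estimated each time, then read
    off the (1 - alpha) quantile of the simulated test statistics."""
    stats = []
    for _ in range(nsim):
        x = rng.normal(size=n)
        mu, sd = x.mean(), x.std(ddof=1)   # estimated, as in the study
        stats.append(ks_stat(x, lambda v: norm_cdf((v - mu) / sd)))
    return float(np.quantile(stats, 1 - alpha))

crit_95 = mc_critical_value()
```

Because the parameters are estimated, the resulting critical value is well below the classical all-parameters-known KS value (1.358/√n ≈ 0.248 at n = 30), which is precisely why distribution-specific Monte Carlo tables are needed.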
A Platoon Dispersion Model Based on a Truncated Normal Distribution of Speed
Directory of Open Access Journals (Sweden)
Ming Wei
2012-01-01
Understanding platoon dispersion is critical for the coordination of traffic signal control in an urban traffic network. Assuming that platoon speed follows a truncated normal distribution, ranging from minimum speed to maximum speed, this paper develops a piecewise density function that describes platoon dispersion characteristics as the platoon moves from an upstream to a downstream intersection. Based on this density function, the expected number of cars in the platoon that pass the downstream intersection, and the expected number of cars in the platoon that do not pass the downstream point are calculated. To facilitate coordination in a traffic signal control system, dispersion models for the front and the rear of the platoon are also derived. Finally, a numeric computation for the coordination of successive signals is presented to illustrate the validity of the proposed model.
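The expected fraction of the platoon past a downstream point follows directly from the truncated normal survival function: a car at distance d passes by time t exactly when its speed exceeds d/t. A minimal sketch, with illustrative speeds rather than the paper's calibration:

```python
from math import erf, sqrt

def phi(z):
    # standard normal CDF
    return 0.5 * (1 + erf(z / sqrt(2)))

def trunc_norm_sf(v, mu, sigma, v_min, v_max):
    """P(V >= v) for V ~ Normal(mu, sigma) truncated to [v_min, v_max]."""
    v = min(max(v, v_min), v_max)
    Z = phi((v_max - mu) / sigma) - phi((v_min - mu) / sigma)  # truncation mass
    return (phi((v_max - mu) / sigma) - phi((v - mu) / sigma)) / Z

# Fraction of a platoon expected past a detector d = 300 m downstream after
# t = 30 s, i.e. P(V >= 10 m/s); all speed parameters are hypothetical
frac = trunc_norm_sf(300 / 30, mu=12.0, sigma=3.0, v_min=5.0, v_max=20.0)
```

With these numbers roughly three quarters of the platoon is expected past the detector; the complementary probability gives the expected number of cars that have not yet passed.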
Directory of Open Access Journals (Sweden)
Jin H. Jo
2016-04-01
Due to increasing price volatility in fossil-fuel-produced energy, the demand for clean, renewable, and abundant energy is more prevalent than in past years. Solar photovoltaic (PV) systems have been well documented for their ability to produce electrical energy while at the same time offering support to mitigate the negative externalities associated with fossil fuel combustion. Prices for PV systems have decreased over the past few years; however, residential and commercial owners may still opt out of purchasing a system due to the overall price required for a PV system installation. Therefore, determining optimal financing options for residential and small-scale purchasers is a necessity. We report on payment methods currently used for distributed community solar projects throughout the US and suggest appropriate options for purchasers in Normal, Illinois given their economic status. We also examine the jobs and total economic impact of a PV system implementation in the case study area.
Topographical Distribution of Arsenic, Manganese, and Selenium in the Normal Human Brain
DEFF Research Database (Denmark)
Larsen, Niels Agersnap; Pakkenberg, H.; Damsgaard, Else
1979-01-01
The concentrations of arsenic, manganese and selenium per gram wet tissue weight were determined in samples from 24 areas of normal human brains from 5 persons aged 15 to 81 years. The concentrations of the 3 elements were determined for each sample by means of neutron...... activation analysis with radiochemical separation. Distinct patterns of distribution were shown for each of the 3 elements. Variations between individuals were found for some but not all brain areas, resulting in coefficients of variation between individuals of about 30% for arsenic, 10% for manganese and 20% for selenium. The results seem to indicate that arsenic is associated with the lipid phase, manganese with the dry matter and selenium with the aqueous phase of brain tissue....
Skewed Normal Distribution Of Return Assets In Call European Option Pricing
Directory of Open Access Journals (Sweden)
Evy Sulistianingsih
2011-12-01
An option is a derivative security. In financial markets, an option is a contract that gives its owner the right (not the obligation) to buy or sell a particular asset for a certain price at a certain time. Options can provide a guarantee against risks faced in a market. This paper studies the use of the Skewed Normal Distribution (SN) in European call option pricing. The SN provides a flexible framework that captures the skewness of log returns. We obtain a closed-form solution for European call option pricing when log returns follow the SN. We then compare the option prices obtained by the SN and the Black-Scholes model with market option prices. Keywords: skewed normal distribution, log return, options.
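A rough Monte Carlo illustration of the idea (not the paper's closed-form solution): sample skew-normal log returns via the Azzalini construction and discount the average payoff. All parameters here are hypothetical and uncalibrated, and no risk-neutral adjustment is attempted:

```python
import numpy as np

rng = np.random.default_rng(7)

def skew_normal(alpha, size, rng):
    """Sample SN(alpha) via the Azzalini construction:
    X = delta*|U0| + sqrt(1 - delta^2)*U1 with U0, U1 ~ N(0, 1)."""
    delta = alpha / np.sqrt(1 + alpha**2)
    u0, u1 = rng.standard_normal(size), rng.standard_normal(size)
    return delta * np.abs(u0) + np.sqrt(1 - delta**2) * u1

def mc_call(S0, K, r, T, loc, scale, alpha, n=200_000):
    """Discounted mean payoff of a European call under SN log returns."""
    z = skew_normal(alpha, n, rng)
    ST = S0 * np.exp(loc + scale * z)          # terminal price
    payoff = np.maximum(ST - K, 0.0)
    return np.exp(-r * T) * payoff.mean()

# Negative alpha skews log returns to the left, cheapening the call
price = mc_call(S0=100, K=100, r=0.01, T=1.0, loc=0.0, scale=0.2, alpha=-3.0)
```

Setting alpha = 0 collapses the sampler to a standard normal, so the same code reproduces the symmetric (Black-Scholes-style lognormal) case for comparison.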
Pantazopoulos, Harry; Murray, Elisabeth A.; Berretta, Sabina
2009-01-01
Chondroitin sulphate proteoglycans (CSPGs) are a key structural component of the brain extracellular matrix. They are involved in critical neurodevelopmental functions and are one of the main components of pericellular aggregates known as perineuronal nets. As a step toward investigating their functional and pathophysiological roles in the human amygdala, we assessed the pattern of CSPG expression in the normal human amygdala using wisteria floribunda agglutinin (WFA) lectin-histochemistry. Total numbers of WFA-labeled elements were measured in the lateral (LN), basal (BN), accessory basal (ABN) and cortical (CO) nuclei of the amygdala from 15 normal adult human subjects. For interspecies qualitative comparison, we also investigated the pattern of WFA labeling in the amygdala of naïve rats (n=32) and rhesus monkeys (Macaca mulatta; n=6). In human amygdala, WFA lectin-histochemistry resulted in labeling of perineuronal nets and cells with clear glial morphology, while neurons did not show WFA-labeling. Total numbers of WFA-labeled glial cells showed high interindividual variability. These cells aggregated in clusters with a consistent between-subjects spatial distribution. In a subset of human subjects (n=5), dual color fluorescence using an antibody raised against glial fibrillary acidic protein (GFAP) and WFA showed that the majority (93.7%) of WFA-labeled glial cells correspond to astrocytes. In rat and monkey amygdala, WFA histochemistry labeled perineuronal nets, but not glial cells. These results suggest that astrocytes are the main cell type expressing CSPGs in the adult human amygdala. Their highly segregated distribution pattern suggests that these cells serve specialized functions within human amygdalar nuclei. PMID:18374308
International Nuclear Information System (INIS)
Wu Yuanfang; Liu Lianshou
1990-01-01
The even and odd multiplicity distributions for hadron-hadron collisions in different rapidity windows are calculated, starting from a simple picture of charge correlation with non-zero correlation length. The coincidence and separation of these distributions are explained. The calculated window- and energy-dependence of the normalized moments recovers the behaviour found in experiments. A new definition of normalized moments is proposed, especially suitable for narrow rapidity windows.
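Normalized moments C_q = ⟨n^q⟩/⟨n⟩^q can be computed directly from an event sample. The sketch below uses negative-binomial multiplicities as a stand-in (illustrative parameters only), since that is a common parametrization of hadronic multiplicity distributions:

```python
import numpy as np

rng = np.random.default_rng(3)

def normalized_moments(n, qmax=4):
    """C_q = <n^q> / <n>^q for an event-multiplicity sample."""
    n = np.asarray(n, dtype=float)
    return {q: np.mean(n**q) / np.mean(n)**q for q in range(2, qmax + 1)}

# Toy stand-in for window multiplicities: negative binomial (hypothetical
# parameters, not fitted to any collision data)
mult = rng.negative_binomial(5, 0.3, size=100_000)
C = normalized_moments(mult)
```

C_2 exceeds 1 by Var(n)/⟨n⟩², so broader-than-Poisson multiplicity distributions (as in narrow rapidity windows) show up directly as larger normalized moments.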
International Nuclear Information System (INIS)
Hopke, P.K.; Wasiolek, P.; Montassier, N.; Cavallo, A.; Gadsby, K.; Socolow, R.
1992-01-01
In order to assess the exposure of individuals to the presence of indoor radioactivity arising from the decay of radon, an automated, semicontinuous graded screen array system was developed to permit the measurement of the activity-weighted size distributions of the radon progeny in homes. The system has been modified so that the electronics and sampling heads can be separated from the pump by approximately 15 m. The system was placed in the living room of a one-storey house with basement in Princeton, NJ and operated for 2 weeks while the house was occupied by the home owners in their normal manner. One of the house occupants was a cigarette smoker. Radon and potential alpha energy concentration (PAEC) measurements were also made, but condensation nuclei counts were not performed. PAEC values ranged from 23.4 to 461.6 mWL. In the measured activity size distributions, the amount of activity in the 0.5-1.5 nm size range can be considered to be the unattached fraction. The mean value for the ²¹⁸Po unattached fraction is 0.217 with a range of 0.054-0.549. The median value for the unattached fraction of PAEC is 0.077 with a range of 0.022-0.178. (author)
Zhu, Qiaohao; Carriere, K C
2016-01-01
Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on detection and correction of publication bias in meta-analysis focuses mainly on funnel-plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem, and propose new parametric solutions. We develop methodologies of estimating the underlying overall effect size and the severity of publication bias. We distinguish the two major situations, in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models, and derive estimators for the overall mean and the truncation proportion. These estimators will be obtained using maximum likelihood estimation and method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology, and to compare with the non-parametric Trim and Fill method based on funnel plot. We find that our methods based on truncated normal distribution perform consistently well, both in detecting and correcting publication bias under various situations.
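The truncated-distribution formulation can be sketched as follows: simulate publication bias by censoring effect estimates below a cutoff, then recover the underlying mean by maximizing a truncated normal likelihood. For brevity the sketch uses a grid search with known variance, a simplification of the paper's estimators:

```python
import numpy as np
from math import erf, sqrt, log, pi

rng = np.random.default_rng(5)

def phi_cdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

def truncated_nll(mu, x, cutoff, sigma=1.0):
    """Negative log-likelihood of x ~ Normal(mu, sigma) truncated below at cutoff."""
    z = (x - mu) / sigma
    tail = 1 - phi_cdf((cutoff - mu) / sigma)   # renormalizing mass
    return np.sum(0.5 * z**2 + log(sigma) + 0.5 * log(2 * pi) + log(tail))

# Simulate publication bias: true effect 0.2, but only estimates > 0 are
# "published" (illustrative numbers)
effects = rng.normal(0.2, 1.0, size=20_000)
published = effects[effects > 0.0]

grid = np.linspace(-1, 1, 401)
mu_hat = grid[np.argmin([truncated_nll(m, published, 0.0) for m in grid])]
naive = published.mean()   # ignores truncation, biased upward
```

The naive mean of the published studies lands far above the true effect of 0.2, while the truncated-likelihood estimate recovers it, which is the core of the detection-and-correction argument.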
[Calbindin and parvalbumin distribution in spinal cord of normal and rabies-infected mice].
Monroy-Gómez, Jeison; Torres-Fernández, Orlando
2013-01-01
Rabies is a fatal infectious disease of the nervous system; however, knowledge about the pathogenic neural mechanisms of rabies is scarce. In addition, there are few studies of rabies pathology of the spinal cord. Objective: To study the distribution of the calcium-binding proteins calbindin and parvalbumin and to assess the effect of rabies virus infection on their expression in the spinal cord of mice. Materials and methods: Mice were inoculated with rabies virus, by intracerebral or intramuscular route. The spinal cord was extracted to perform crosscuts, which were treated by immunohistochemistry with monoclonal antibodies to reveal the presence of the two proteins in normal and rabies-infected mice. We performed qualitative and quantitative analyses of the immunoreactivity of the two proteins. Calbindin and parvalbumin showed differential distribution across the Rexed laminae. Rabies infection produced a decrease in the expression of calbindin. On the contrary, the infection caused an increased expression of parvalbumin. The effect of rabies infection on the expression of the two proteins was similar for both routes of inoculation. The differential effect of rabies virus infection on the expression of calbindin and parvalbumin in the spinal cord of mice was similar to that previously reported for brain areas. This result suggests uniformity in the response to rabies infection throughout the central nervous system. This is an important contribution to the understanding of the pathogenesis of rabies.
Log-Normal Distribution of ²²²Rn in the State of Zacatecas, Mexico
International Nuclear Information System (INIS)
Garcia, M.L.; Mireles, F.; Quirino, L.; Davila, I.; Rios, C.; Pinedo, J.L.
2006-01-01
In this work the evaluation of the concentration of ²²²Rn in air for Zacatecas is shown. Solid State Nuclear Track Detectors (cellulose nitrate LR-115, type 2, in open ²²²Rn chambers) were used as the technique for large-scale measurements. The measurements were carried out during three months at different times of the year. The results present the log-normal distribution, arithmetic mean and geometric mean of the concentration indoors and outdoors of residential buildings, indoors of occupational buildings, and in the 57 municipal seats of the state of Zacatecas. The statistics of the concentration values showed variation according to the time of year, with higher values in the winter season in both cases. The distribution of the ²²²Rn concentration is presented on the state map for each municipality, representing the measurement places across the entire state of Zacatecas. Finally, the places where the ²²²Rn concentration in air approaches the limit established by the EPA, 148 Bq/m³, are presented. (Author)
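The log-normal summary statistics reported here (arithmetic mean, geometric mean, fraction above the EPA reference level) can be sketched on synthetic data; the geometric mean and geometric standard deviation below are hypothetical values, not the Zacatecas results:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical indoor-radon-like sample (Bq/m^3), log-normally distributed
gm_true, gsd_true = 40.0, 2.0          # geometric mean and geometric SD
conc = rng.lognormal(np.log(gm_true), np.log(gsd_true), size=50_000)

gm = np.exp(np.mean(np.log(conc)))      # geometric mean
gsd = np.exp(np.std(np.log(conc)))      # geometric standard deviation
am = conc.mean()                        # arithmetic mean = exp(mu + sigma^2/2)
frac_above = np.mean(conc > 148.0)      # fraction above the 148 Bq/m^3 level
```

For a log-normal sample the arithmetic mean always exceeds the geometric mean (by the factor exp(σ²/2)), which is why both are reported separately in radon surveys.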
Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering
Gamayunov, K. V.; Khazanov, G. V.
2006-01-01
We calculate the pitch-angle diffusion coefficients using the typical wave normal distributions obtained from our self-consistent ring current-EMIC wave model, and quantify the effect of EMIC wave normal angle characteristics on relativistic electron scattering.
International Nuclear Information System (INIS)
Tsutsui, H.; Tomoike, H.; Nakamura, M.
1990-01-01
Tracer distribution as an index of nutritional support across the thoracic and abdominal aortas in rabbits in the presence or absence of atherosclerotic lesions was evaluated using [¹⁴C]antipyrine, a metabolically inert, diffusible indicator. Intimal plaques were produced by endothelial balloon denudation of the thoracic aorta and a 1% cholesterol diet. After a steady intravenous infusion of 200 microCi of [¹⁴C]antipyrine for 60 seconds, thoracic and abdominal aortas and the heart were excised, and autoradiograms of 20-µm-thick sections were quantified, using microcomputer-aided densitometry. Regional radioactivity and regional diffusional support, as an index of nutritional flow estimated from the timed collections of arterial blood, was 367 and 421 nCi·g⁻¹ (82 and 106 ml·min⁻¹·100 g⁻¹) in thoracic aortic media of the normal and atherosclerotic rabbits, respectively. Radioactivity at the thickened intima was 179 nCi·g⁻¹ (p less than 0.01 versus media). The gruel was noted at a deeper site within the thickened intima, and diffusional support here was 110 nCi·g⁻¹ (p less than 0.01 versus an average radioactivity at the thickened intima). After ligating the intercostal arteries, regional tracer distribution in the media beneath the fibrofatty lesion, but not the plaque-free intima, was reduced to 46%. Thus, in the presence of advanced intimal thickening, the heterogeneous distribution of diffusional flow is prominent across the vessel wall, and abluminal routes are crucial to meet the increased demands of nutritional requirements
Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering in Outer RB
Khazanov, G. V.; Gamayunov, K. V.
2007-01-01
We present the equatorial and bounce-averaged pitch angle diffusion coefficients for scattering of relativistic electrons by the H+ mode of EMIC waves. Both model (prescribed) and self-consistent distributions over the wave normal angle are considered. The main results of our calculation can be summarized as follows: First, in comparison with field-aligned waves, the intermediate and highly oblique waves reduce the pitch angle range subject to diffusion, and strongly suppress the scattering rate for low energy electrons (E less than 2 MeV). Second, for electron energies greater than 5 MeV, the |n| = 1 resonances operate only in a narrow region at large pitch angles, and despite providing the greatest contribution in the case of field-aligned waves, cannot cause electron diffusion into the loss cone. For those energies, oblique waves at |n| greater than 1 resonances are more effective, extending the range of pitch angle diffusion down to the loss cone boundary and increasing diffusion at small pitch angles by orders of magnitude.
International Nuclear Information System (INIS)
Herzog, H.; Spohr, G.; Notohamiprodjo, G.; Feinendegen, L.E.
1987-01-01
Estimates of the radiation dose resulting from liver-spleen scintigraphy with ⁹⁹ᵐTc-labelled colloids are based on pharmacokinetic data mainly determined in animals. The aim of this study was to check the pharmacokinetic data by direct, absolute in vivo quantification in man. Liver and spleen activities were directly measured using a double-energy window technique. Activities in other organs were quantified by conjugate whole-body scans. All measurement procedures were checked using the whole-body Alderson phantom. Pharmacokinetic data for sulphur colloid, tin colloid, human serum albumin (HSA) millimicrospheres, and phytate were obtained in 13 to 20 normal subjects for each type of colloid. Depending on the colloid type, liver uptake was between 54 and 75% of the total administered dose (TAD) and spleen uptake was 3.5 to 21% TAD. Activity measured in blood, urine, lung and thyroid proved to be far from negligible. The results of this work suggest a correction of the animal-based data of colloid distribution and radiation dose on the basis of the direct measurement of absolute uptake in man. (author)
Duarte Queirós, Sílvio M.
2012-07-01
We discuss the modification of the Kapteyn multiplicative process using the q-product of Borges [E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Physica A 340 (2004) 95]. Depending on the value of the index q, a generalisation of the log-normal distribution is yielded. Namely, the distribution increases the tail for small (when q < 1) or large (when q > 1) values of the variable under analysis. The usual log-normal distribution is retrieved when q = 1, which corresponds to the traditional Kapteyn multiplicative process. The main statistical features of this distribution, as well as related random number generators and tables of quantiles of the Kolmogorov-Smirnov distance, are presented. Finally, we illustrate the validity of this scenario by describing a set of variables of biological and financial origin.
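Borges' q-product, on which the deformed Kapteyn process rests, can be written down directly. The sketch below checks that the q → 1 limit recovers the ordinary product, and hence the usual log-normal scenario:

```python
import numpy as np

def q_product(x, y, q):
    """Borges q-product: x (x)_q y = [x^(1-q) + y^(1-q) - 1]^(1/(1-q)), q != 1,
    with the usual cutoff to zero when the bracket is non-positive."""
    if abs(q - 1.0) < 1e-12:
        return x * y                      # ordinary product in the q -> 1 limit
    base = x**(1 - q) + y**(1 - q) - 1.0
    return np.where(base > 0, base, 0.0) ** (1.0 / (1 - q))

# q -> 1 recovers the usual product, so the usual Kapteyn process is retrieved
exact = q_product(2.0, 3.0, 1.0)
approx = q_product(2.0, 3.0, 0.999)       # should be close to 6
deformed = q_product(2.0, 3.0, 0.5)       # genuinely different from 6
```

Iterating this deformed product over random factors, instead of the ordinary product, is what turns the Kapteyn construction into the q-generalised log-normal described in the abstract.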
Enabling Secure XMPP Communications in Federated IoT Clouds Through XEP 0027 and SAML/SASL SSO.
Celesti, Antonio; Fazio, Maria; Villari, Massimo
2017-02-07
Nowadays, in the panorama of the Internet of Things (IoT), finding the right compromise between interactivity and security is not trivial at all. Currently, most pervasive communication technologies are designed to work locally. As a consequence, the development of large-scale Internet services and applications is not easy for IoT Cloud providers. The main issue is that both IoT architectures and services started out simple but are becoming more and more complex. Consequently, web service technology is often inappropriate. Recently, many operators in both academia and industry have been considering the possibility of adopting the eXtensible Messaging and Presence Protocol (XMPP) for the implementation of IoT Cloud communication systems. In fact, XMPP offers many advantages in terms of real-time capabilities, efficient data distribution, service discovery and inter-domain communication compared to other technologies. Nevertheless, the protocol lacks native security, data confidentiality and trustworthy federation features. In this paper, considering an XMPP-based IoT Cloud architectural model, we discuss how message signing/encryption and Single Sign-On (SSO) authentication can be enforced for secure inter-module and inter-domain communications, respectively, in a federated environment. Experiments prove that the security mechanisms introduce an acceptable overhead, considering the obvious advantages achieved in terms of data trustworthiness and privacy.
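As a language-agnostic illustration of the sign/verify round trip behind secure inter-module messaging, here is a toy sketch. Note the hedge: XEP-0027 specifies OpenPGP public-key signatures, whereas this sketch substitutes a shared-key HMAC purely to keep the example self-contained; the stanza layout is likewise invented for illustration:

```python
import hashlib
import hmac
import secrets

# Shared key stands in for a federation trust relationship (illustrative only;
# XEP-0027 itself uses OpenPGP public-key signatures, not a symmetric MAC)
key = secrets.token_bytes(32)

def sign_stanza(body: str, key: bytes) -> dict:
    """Attach a detached authentication tag to an XMPP-like message body."""
    tag = hmac.new(key, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "x-signed": tag}

def verify_stanza(stanza: dict, key: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, stanza["body"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, stanza["x-signed"])

msg = sign_stanza("sensor-42: temp=21.3C", key)
ok = verify_stanza(msg, key)                      # True: untampered
tampered = dict(msg, body="sensor-42: temp=99.9C")
bad = verify_stanza(tampered, key)                # False: body was altered
```

The point being illustrated is the property the paper needs from XEP-0027: any modification of a signed body between federated modules is detectable by the receiver.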
International Nuclear Information System (INIS)
Mayaud, P.N.
1976-01-01
Because of the various components of positive conservation existing in the series of aa indices, their frequency distribution is necessarily distorted with respect to any random distribution. However, when one takes these various components into account, the observed distribution can be considered to be a log-normal distribution. This implies that geomagnetic activity satisfies the conditions of the central limit theorem, according to which a phenomenon presenting such a distribution is due to independent causes whose effects are multiplicative. Furthermore, the distortion of the frequency distribution caused by the 11-year and 90-year cycles corresponds to a pure attenuation effect; an interpretation in terms of solar 'coronal holes' is proposed [fr
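The multiplicative central-limit argument invoked here can be illustrated with a small simulation (the factor distribution below is arbitrary): the product of many independent positive effects has approximately normally distributed logarithms, i.e. the product itself is approximately log-normal.

```python
import math
import random

random.seed(1)

def multiplicative_sample(n_factors=50):
    """Product of independent positive factors; by the CLT applied to the
    log-factors, the product is approximately log-normally distributed."""
    x = 1.0
    for _ in range(n_factors):
        x *= random.uniform(0.5, 2.0)   # any positive factor distribution works
    return x

data = [multiplicative_sample() for _ in range(2000)]
logs = [math.log(v) for v in data]
mu = sum(logs) / len(logs)   # the logs, not the raw values, look Gaussian
```

Plotting a histogram of `logs` (roughly symmetric) against one of `data` (strongly right-skewed) makes the point visually.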
Distribution of CD163-positive cell and MHC class II-positive cell in the normal equine uveal tract.
Sano, Yuto; Matsuda, Kazuya; Okamoto, Minoru; Takehana, Kazushige; Hirayama, Kazuko; Taniyama, Hiroyuki
2016-02-01
Antigen-presenting cells (APCs) in the uveal tract participate in ocular immunity, including immune homeostasis and the pathogenesis of uveitis. In horses, although uveitis is the most common ocular disorder, little is known about ocular immunity, such as the distribution of APCs. In this study, we investigated the distribution of CD163-positive and MHC II-positive cells in the normal equine uveal tract using an immunofluorescence technique. Eleven eyes from 10 Thoroughbred horses aged 1 to 24 years were used. Indirect immunofluorescence was performed using primary antibodies against CD163, MHC class II (MHC II) and CD20. To identify the site of greatest distribution, positive cells were counted manually in 3 different parts of the uveal tract (ciliary body, iris and choroid), and their average numbers were compared statistically. Pleomorphic CD163- and MHC II-expressing cells were detected throughout the equine uveal tract, but no CD20-expressing cells were found. The statistical analysis showed that CD163- and MHC II-positive cells were concentrated in the ciliary body. These results demonstrate that the ciliary body is the largest site of their distribution in the normal equine uveal tract, and suggest that the ciliary body plays important roles in uveal and/or ocular immune homeostasis. The data provided in this study will further the understanding of equine ocular immunity in the normal state and might be beneficial for understanding the mechanisms of ocular disorders such as equine uveitis.
Bos, J. D.; Zonneveld, I.; Das, P. K.; Krieg, S. R.; van der Loos, C. M.; Kapsenberg, M. L.
1987-01-01
The complexity of immune response-associated cells present in normal human skin was recently redefined as the skin immune system (SIS). In the present study, the exact immunophenotypes of lymphocyte subpopulations with their localizations in normal human skin were determined quantitatively. B cells
International Nuclear Information System (INIS)
Stanisic, D.; Hancock, A.A.; Kyncl, J.J.; Lin, C.T.; Bush, E.N.
1986-01-01
(-)-Norepinephrine (NE) is used as an internal standard in their in vitro adrenergic assays, and the concentration of NE which produces half-maximal inhibition of specific radioligand binding (affinity; K_I), or a half-maximal contractile response (potency; ED50), has been measured numerous times. The goodness-of-fit test for normality was performed on both normal (Gaussian) and log10-normal frequency histograms of these data using the SAS Univariate procedure. Specific binding of 3H-prazosin to rat liver (α1-), 3H-rauwolscine to rat cortex (α2-) and 3H-dihydroalprenolol to rat ventricle (β1-) or rat lung (β2-receptors) was inhibited by NE; the distributions of NE K_I's at all these sites were skewed to the right, with highly significant departures from normality. Similar log-normal behaviour was observed for the ED50's of NE in isolated rabbit aorta (α1), phenoxybenzamine-treated dog saphenous vein (α2) and guinea pig atrium (β1). The vasorelaxant potency of atrial natriuretic hormone in histamine-contracted rabbit aorta also was better described by a log-normal distribution, indicating that log-normalcy is probably a general phenomenon of drug-receptor interactions. Because data of this type appear to be log-normally distributed, geometric means should be used in parametric statistical analyses
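The closing recommendation can be made concrete. The sketch below, with hypothetical K_I values, computes the geometric mean as the back-transformed mean of the logs and contrasts it with the arithmetic mean, which is pulled upward by the right tail:

```python
import math

def geometric_mean(values):
    """Back-transform of the arithmetic mean of the logs; the natural
    location summary for log-normally distributed potencies/affinities."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical K_I values (nM) spanning more than an order of magnitude:
ki = [12.0, 30.0, 45.0, 110.0, 300.0]
g = geometric_mean(ki)   # influenced far less by the largest value
a = sum(ki) / len(ki)    # arithmetic mean, dominated by the tail
```

For skewed data of this kind the geometric mean sits near the median, which is why it is the recommended summary.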
Peters, B. C., Jr.; Walker, H. F.
1975-01-01
New results and insights concerning a previously published iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions were discussed. It was shown that the procedure converges locally to the consistent maximum likelihood estimate as long as a specified parameter is bounded between two limits. Bound values were given to yield optimal local convergence.
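In modern terms, the iterative procedure studied here is the EM recursion for normal mixtures; for a K-component univariate mixture, one iteration reads (a standard formulation, not quoted from the paper):

```latex
% E-step: responsibilities
\gamma_{ik} = \frac{\pi_k\,\mathcal{N}(x_i \mid \mu_k,\sigma_k^2)}
                   {\sum_{j=1}^{K} \pi_j\,\mathcal{N}(x_i \mid \mu_j,\sigma_j^2)}
% M-step: re-estimated weights, means and variances
\pi_k^{\text{new}} = \frac{1}{n}\sum_{i=1}^{n}\gamma_{ik},\qquad
\mu_k^{\text{new}} = \frac{\sum_{i}\gamma_{ik}\,x_i}{\sum_{i}\gamma_{ik}},\qquad
(\sigma_k^2)^{\text{new}} = \frac{\sum_{i}\gamma_{ik}\,(x_i-\mu_k^{\text{new}})^2}{\sum_{i}\gamma_{ik}}
```

The local-convergence result summarized in the abstract concerns exactly this recursion: it converges to the consistent maximum-likelihood estimate provided the parameters stay within the stated bounds.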
Warne, Russell T.; Godwin, Lindsey R.; Smith, Kyle V.
2013-01-01
Among some gifted education researchers, advocates, and practitioners, it is sometimes believed that there is a larger number of gifted people in the general population than would be predicted from a normal distribution (e.g., Gallagher, 2008; N. M. Robinson, Zigler, & Gallagher, 2000; Silverman, 1995, 2009), a belief that we termed the…
Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong
2016-03-01
Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for heterogeneous clinico-epidemiologic characteristics of the data and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN. For simplicity of comparison, age and gender were used to adjust population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests; on the real datasets, SAN performed better than normalization using the other methods. The SAN method is applicable in a DRN environment and should facilitate analysis of data integrated across DRN partners for retrospective observational studies. Copyright © 2015 John Wiley & Sons, Ltd.
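The core idea, adjusting means and standard deviations within clinico-epidemiologic subgroups, can be sketched minimally as follows. The field names, the age/sex subgrouping, and the plain within-subgroup z-scoring are assumptions for illustration, not the published SAN algorithm:

```python
import statistics
from collections import defaultdict

def subgroup_adjusted_normalize(records, key=lambda r: (r["age_band"], r["sex"])):
    """Sketch of subgroup-wise normalization: z-score each lab value
    within its (age band, sex) subgroup, so subgroup means and SDs
    agree across data sources after adjustment."""
    groups = defaultdict(list)
    for r in records:
        groups[key(r)].append(r["value"])
    out = []
    for r in records:
        vals = groups[key(r)]
        mu, sd = statistics.mean(vals), statistics.pstdev(vals)
        out.append((r["value"] - mu) / sd if sd > 0 else 0.0)
    return out

records = [
    {"age_band": "60+", "sex": "F", "value": 1.2},
    {"age_band": "60+", "sex": "F", "value": 1.0},
    {"age_band": "<60", "sex": "M", "value": 0.8},
    {"age_band": "<60", "sex": "M", "value": 1.0},
]
z = subgroup_adjusted_normalize(records)
```

In a DRN each partner would apply the adjustment locally before results are pooled.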
A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique
Rached, Nadhir B.; Benkhelifa, Fatma; Alouini, Mohamed-Slim; Tempone, Raul
2015-06-08
The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.
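The crude Monte Carlo baseline that the paper improves on can be sketched directly. The parameters below are arbitrary, and the threshold is kept moderate so the naive estimator still works; the hazard-rate-twisting importance-sampling scheme is what makes genuinely rare thresholds tractable:

```python
import random

random.seed(7)

def lognormal_sum_ccdf_mc(mus, sigmas, threshold, n=20000):
    """Crude Monte Carlo estimate of P(sum_i X_i > threshold) for
    independent, not identically distributed log-normal X_i.
    For rare events this estimator needs a huge n; the paper's IS
    scheme twists the hazard rate to make exceedances frequent."""
    hits = 0
    for _ in range(n):
        s = sum(random.lognormvariate(m, sg) for m, sg in zip(mus, sigmas))
        if s > threshold:
            hits += 1
    return hits / n

p = lognormal_sum_ccdf_mc([0.0, 0.5], [0.5, 1.0], threshold=5.0)
```

The relative error of this estimator blows up as `threshold` grows, which is precisely the regime the asymptotically optimal IS estimator targets.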
International Nuclear Information System (INIS)
Jahnke, T.; Titze, J.; Foucar, L.; Wallauer, R.; Osipov, T.; Benis, E.P.; Jagutzki, O.; Arnold, W.; Czasch, A.; Staudte, A.; Schoeffler, M.; Alnaser, A.; Weber, T.; Prior, M.H.; Schmidt-Boecking, H.; Doerner, R.
2011-01-01
We have measured the molecular frame angular distributions of photoelectrons emitted from the carbon K-shell of fixed-in-space CO molecules for the case of simultaneous excitation of the remaining molecular ion. Normal and conjugate shakeup states are observed. Photoelectrons belonging to normal Σ-satellite lines show an angular distribution resembling that observed for the main photoline at the same electron energy. Surprisingly, a similar shape is found for conjugate shakeup states with Π-symmetry. In our data we identify shake, rather than electron scattering (PEVE), as the mechanism producing the conjugate lines. The angular distributions clearly show the presence of a Σ shape resonance for all of the satellite lines.
Energy Technology Data Exchange (ETDEWEB)
Bellasio, R [Enviroware s.r.l., Agrate Brianza, Milan (Italy). Centro Direzionale Colleoni; Lanzani, G; Ripamonti, M; Valore, M [Amministrazione Provinciale, Como (Italy)
1998-04-01
This work illustrates the possibility of interpolating the concentrations of CO, NO, NO2, O3 and SO2 measured during one year (1995) at the 13 stations of the air quality monitoring network of the Provinces of Como and Lecco (Italy) by means of a log-normal distribution. Particular attention was given to choosing the method for determining the log-normal distribution parameters among four possible methods: (I) natural, (II) percentiles, (III) moments, (IV) maximum likelihood. In order to evaluate the goodness of fit, a ranking procedure was carried out over the values of four indices: absolute deviation, weighted absolute deviation, the Kolmogorov-Smirnov index and the Cramer-von Mises-Smirnov index. The capability of the log-normal distribution to fit the measured data is then discussed as a function of the pollutant and of the monitoring station. Finally, an example of application is given: the effect of an emission reduction strategy in the Lombardy Region (the so-called 'bollino blu') is evaluated using a log-normal distribution.
Subchondral bone density distribution of the talus in clinically normal Labrador Retrievers.
Dingemanse, W; Müller-Gerbl, M; Jonkers, I; Vander Sloten, J; van Bree, H; Gielen, I
2016-03-15
Bones continually adapt their morphology to their load-bearing function. At the level of the subchondral bone, the density distribution is highly correlated with the loading distribution of the joint. Therefore, subchondral bone density distribution can be used to study joint biomechanics non-invasively. In addition, physiological and pathological joint loading is an important aspect of orthopaedic disease, and research focusing on joint biomechanics will benefit veterinary orthopaedics. This study was conducted to evaluate the density distribution in the subchondral bone of the canine talus, as a parameter reflecting long-term joint loading in the tarsocrural joint. Two main density maxima were found, one proximally on the medial trochlear ridge and one distally on the lateral trochlear ridge. All joints showed very similar density distribution patterns, and no significant differences were found in the localisation of the density maxima between left and right limbs or between dogs. Based on the density distribution, the lateral trochlear ridge is most likely subjected to the highest loads within the tarsocrural joint. The joint loading distribution is very similar between dogs of the same breed, and it supports previous suggestions of the important role of biomechanics in the development of OC lesions in the tarsus. Important benefits of computed tomographic osteoabsorptiometry (CTOAM), i.e. the possibility of in vivo imaging and temporal evaluation, make this technique a valuable addition to the field of veterinary orthopaedic research.
Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong
2018-05-01
In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution. Possible deviations in solutions caused by irrational parameter assumptions were avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.
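The key modeling step, replacing a chance constraint on a log-normal quantity by its deterministic quantile equivalent, can be sketched as follows. Parameter values are hypothetical, not the Erhai Lake case-study values:

```python
import math
from statistics import NormalDist

def lognormal_quantile(mu, sigma, q):
    """q-quantile of a log-normal variable: exp(mu + sigma * z_q),
    where z_q is the standard normal quantile."""
    return math.exp(mu + sigma * NormalDist().inv_cdf(q))

# Chance constraint  P(runoff <= x) >= alpha  becomes the deterministic
# requirement  x >= F^{-1}(alpha), the alpha-quantile of the runoff.
alpha = 0.95
x_min = lognormal_quantile(mu=2.0, sigma=0.4, q=alpha)
```

Using the log-normal quantile directly, rather than a normal approximation, is what avoids the "irrational parameter assumption" deviations mentioned above.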
DEFF Research Database (Denmark)
Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G
2016-01-01
a homeostatic set point that follows a normal (Gaussian) distribution. This set point (or baseline in steady-state) should be estimated from a set of previous samples, but, in practice, decisions based on reference change value are often based on only two consecutive results. The original reference change value......-positive results. The aim of this study was to investigate false-positive results using five different published methods for calculation of reference change value. METHODS: The five reference change value methods were examined using normally and ln-normally distributed simulated data. RESULTS: One method performed...... best in approaching the theoretical false-positive percentages on normally distributed data and another method performed best on ln-normally distributed data. The commonly used reference change value method based on two results (without use of estimated set point) performed worst both on normally...
Directory of Open Access Journals (Sweden)
Casault Sébastien
2016-05-01
Full Text Available Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory: the variation in their market returns is non-Gaussian. In this paper, the nature of and underlying reason for these significant deviations from expected behavior are considered. Understanding these differences in financial market behavior is important for a wide range of reasons, including: assessing investments, investor relations, decisions to raise capital, and assessment of firm and management performance. We show that using a "thicker tailed" mixture of two normal distributions offers a significantly more accurate model than the traditional Gaussian approach in describing the behavior of the value of oil and gas firms. This mixture of normal distributions is also more effective in bridging the gap between management theory and practice, without the need to introduce complex time-sensitive GARCH and/or jump diffusion dynamics. The mixture distribution is consistent with ambidexterity theory, which suggests firms operate in two distinct states driven by the primary focus of the firm: an exploration state with high uncertainty, and an exploitation (or production) state with lower uncertainty. The findings have direct implications for improving the accuracy of real option pricing techniques and future analyses of risk management. Traditional option pricing models assume that commercial returns from these assets are described by a normal random walk. However, a normal random walk model discounts the possibility of large changes to the marketplace from events such as the discovery of important reserves or the introduction of new technology. The mixture distribution proves to be well suited to inherently describe the unusually large risks and opportunities associated with oil and gas production and exploration. A significance testing study of 554 oil and gas exploration and production firms empirically supports using a mixture
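A two-state mixture of the kind described can be fitted with a plain EM iteration. The sketch below uses synthetic "returns" rather than the study's firm data, and the deliberately crude initialization is an assumption:

```python
import math
import random

random.seed(3)

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def em_two_normals(data, iters=150):
    """EM for a mixture of two normal distributions, the 'thicker tailed'
    return model discussed above."""
    w, mu1, mu2, sd1, sd2 = 0.5, min(data), max(data), 1.0, 1.0
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = [w * norm_pdf(x, mu1, sd1) /
             (w * norm_pdf(x, mu1, sd1) + (1 - w) * norm_pdf(x, mu2, sd2))
             for x in data]
        # M-step: weighted means, variances and mixing weight
        n1 = sum(r); n2 = len(data) - n1
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        sd1 = math.sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1) or 1e-6
        sd2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2) or 1e-6
        w = n1 / len(data)
    return w, (mu1, sd1), (mu2, sd2)

# Synthetic two-state "returns": a calm state near 0, a volatile state near 4
data = ([random.gauss(0.0, 0.5) for _ in range(300)] +
        [random.gauss(4.0, 1.5) for _ in range(100)])
w, c1, c2 = em_two_normals(data)
```

The recovered mixing weight and component parameters correspond to the exploitation and exploration states, respectively.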
Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt
2016-04-01
RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features here are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and also to appropriately adjust for related individuals, either from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally-intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
Jolani, Shahab
2014-01-01
For a vector of multivariate normal when some elements, but not necessarily all, are truncated, we derive the moment generating function and obtain expressions for the first two moments involving the multivariate hazard gradient. To show one of many applications of these moments, we then extend the
Site-dependent distribution of macrophages in normal human extraocular muscles
Schmidt, E. D.; van der Gaag, R.; Mourits, M. P.; Koornneef, L.
1993-01-01
PURPOSE: Clinical data indicate that extraocular muscles have different susceptibilities for some orbital immune disorders depending on their anatomic location. The resident immunocompetent cells may be important mediators in the local pathogenesis of such disorders so the distribution of these
International Nuclear Information System (INIS)
Jalilian, A.R.; Fateh, B.; Ghergherehchi, M.; Karimian, A.; Matloobi, M.; Moradkhani, S.; Kamalidehghan, M.; Tabeie, F.
2003-01-01
Backgrounds: Bleomycin (BLM) has been labeled with radioisotopes and widely used in therapy and diagnosis. In this study, BLM was labeled with [62Zn]zinc chloride for oncologic PET studies. Materials and methods: The complex was obtained at pH 2 in normal saline at 90 deg C in 60 min. Radio-TLC showed an overall radiochemical yield of 95-97% (radiochemical purity >97%). Stability of the complex was checked in vitro in mice and in human plasma/urine. Results: Preliminary in vitro studies were performed to determine the stability of the complex and the distribution of [62Zn]BLM in normal and fibrosarcoma tumor-bearing mice according to bio-distribution/imaging studies. Conclusion: [62Zn]BLM can be used in PET oncology studies due to its suitable physico-chemical properties and its behavior as a diagnostic complex in higher animals
Sack, Kevin L.; Baillargeon, Brian; Acevedo-Bolton, Gabriel; Genet, Martin; Rebelo, Nuno; Kuhl, Ellen; Klein, Liviu; Weiselthaler, Georg M.; Burkhoff, Daniel; Franz, Thomas; Guccione, Julius M.
2016-01-01
Purpose: Heart failure is a worldwide epidemic that is unlikely to change as the population ages and life expectancy increases. We sought to detail significant recent improvements to the Dassault Systèmes Living Heart Model (LHM) and use the LHM to compute left ventricular (LV) and right ventricular (RV) myofiber stress distributions under the following 4 conditions: (1) normal cardiac function; (2) acute left heart failure (ALHF); (3) ALHF treated using an LV assist device (LVAD) flow rate o...
Directory of Open Access Journals (Sweden)
Eckhard Limpert
Full Text Available BACKGROUND: The Gaussian or normal distribution is the most established model to characterize quantitative variation of original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, by mean ± SD, or with the standard error of the mean, mean ± SEM. This, together with corresponding bars in graphical displays, has become the standard to characterize variation. METHODOLOGY/PRINCIPAL FINDINGS: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the "95% range check", their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to regard causes of variation. Multiplicative causes are in general far more important than additive ones, and benefit from a multiplicative (or log-) normal approach. Fortunately, quite similarly to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of a new sign, ×/ ("times-divide"), and corresponding notation. Analogous to mean ± SD, it connects the multiplicative (or geometric) mean, mean*, and the multiplicative standard deviation, s*, in the form mean* ×/ s*, which is advantageous and recommended. CONCLUSIONS/SIGNIFICANCE: The corresponding shift from the symmetric to the asymmetric view will substantially increase both recognition of data distributions and interpretation quality. It will allow for savings in sample size that can be considerable. Moreover, this is in line with ethical responsibility. Adequate models will improve concepts and theories, and provide deeper insight into science and life.
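The mean* ×/ s* summary can be computed directly; a sketch with made-up, right-skewed data:

```python
import math
import statistics

def multiplicative_summary(values):
    """Geometric mean (mean*) and multiplicative standard deviation (s*):
    the log-domain mean and SD back-transformed with exp. The interval
    [mean*/s*, mean* x s*] is the log-normal analogue of mean +/- SD."""
    logs = [math.log(v) for v in values]
    mean_star = math.exp(statistics.mean(logs))
    s_star = math.exp(statistics.stdev(logs))
    return mean_star, s_star

values = [1.1, 2.0, 3.5, 6.2, 11.0]   # roughly a geometric progression
m, s = multiplicative_summary(values)
low, high = m / s, m * s              # mean* x/ s* interval
```

Because s* is a dimensionless factor, the interval is asymmetric on the original scale but symmetric on the log scale, exactly as the abstract advocates.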
International Nuclear Information System (INIS)
Cleaver, J.E.
1979-01-01
Excision repair of damage from ultraviolet light in both normal and xeroderma pigmentosum variant fibroblasts at early times after irradiation occurred preferentially in regions of DNA accessible to micrococcal nuclease digestion. These regions are predominantly the linker regions between nucleosomes in chromatin. The alterations reported at polymerization and ligation steps of excision repair in the variant are therefore not associated with changes in the relative distributions of repair sites in linker and core particle regions of DNA. (Auth.)
DEFF Research Database (Denmark)
Wogensen, L D; Welinder, B; Hejnaes, K R
1991-01-01
The half-lives of the distribution (T1/2 alpha) and elimination (T1/2 beta) phases of human recombinant interleukin 1 beta (rIL-1 beta), and its tissue distribution and cellular localization, were determined by means of mono-labelled, biologically active 125I-rIL-1 beta. After intravenous (i.v.) injection, 125I-rIL-1 beta was eliminated from the circulation with a T1/2 alpha of 2.9 min and a T1/2 beta of 41.1 min. The central and peripheral volumes of distribution were 20.7 and 19.1 ml/rat, respectively, and the metabolic clearance rate was 16.9 ml/min/kg. The kidney and liver showed the highest accumulation of tracer, and autoradiography demonstrated...
Banerjee, Abhirup; Maji, Pradipta
2015-12-01
The segmentation of brain MR images into different tissue classes is an important task for automatic image analysis technique, particularly due to the presence of intensity inhomogeneity artifact in MR images. In this regard, this paper presents a novel approach for simultaneous segmentation and bias field correction in brain MR images. It integrates judiciously the concept of rough sets and the merit of a novel probability distribution, called stomped normal (SN) distribution. The intensity distribution of a tissue class is represented by SN distribution, where each tissue class consists of a crisp lower approximation and a probabilistic boundary region. The intensity distribution of brain MR image is modeled as a mixture of finite number of SN distributions and one uniform distribution. The proposed method incorporates both the expectation-maximization and hidden Markov random field frameworks to provide an accurate and robust segmentation. The performance of the proposed approach, along with a comparison with related methods, is demonstrated on a set of synthetic and real brain MR images for different bias fields and noise levels.
Kalsi, Karan; Fuller, Jason C.; Somani, Abhishek; Pratt, Robert G.; Chassin, David P.; Hammerstrom, Donald J.
2017-09-12
Disclosed herein are representative embodiments of methods, apparatus, and systems for facilitating operation and control of a resource distribution system (such as a power grid). Among the disclosed embodiments is a distributed hierarchical control architecture (DHCA) that enables smart grid assets to effectively contribute to grid operations in a controllable manner, while helping to ensure system stability and equitably rewarding their contribution. Embodiments of the disclosed architecture can help unify the dispatch of these resources to provide both market-based and balancing services.
Log-normal spray drop distribution...analyzed by two new computer programs
Gerald S. Walton
1968-01-01
Results of U.S. Forest Service research on chemical insecticides suggest that large drops are not as effective as small drops in carrying insecticides to target insects. Two new computer programs have been written to analyze size distribution properties of drops from spray nozzles. Coded in Fortran IV, the programs have been tested on both the CDC 6400 and the IBM 7094...
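The core computation such drop-size programs perform can be sketched: fit the log-normal parameters in the log domain, then use the Hatch-Choate relation to locate the volume (mass) median, which shows why a few large drops dominate the sprayed volume. The diameters below are hypothetical:

```python
import math

def fit_lognormal(diameters):
    """Method-of-moments fit in the log domain: mu and sigma are the mean
    and SD of log(diameter); the count-median diameter is exp(mu)."""
    logs = [math.log(d) for d in diameters]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / n)
    return mu, sigma

# Hypothetical drop diameters in micrometres:
drops = [40, 55, 70, 90, 120, 160, 220]
mu, sigma = fit_lognormal(drops)
count_median = math.exp(mu)
# Hatch-Choate: weighting by d**3 shifts the log-mean by 3*sigma**2, so the
# volume (mass) median sits at exp(mu + 3*sigma**2), well above the count median.
mass_median = math.exp(mu + 3 * sigma ** 2)
```

The gap between `count_median` and `mass_median` is the quantitative version of the observation that large drops carry most of the insecticide.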
Energy Technology Data Exchange (ETDEWEB)
Petithuguenin, T.D.P.; Sherman, M.H.
2009-05-01
The purpose of ventilation is to dilute the indoor contaminants that an occupant is exposed to. Even when providing the same nominal rate of outdoor air, different ventilation systems may distribute air in different ways, affecting occupants' exposure to household contaminants. Exposure ultimately depends on the home being considered, on source disposition and strength, on occupants' behavior, on the ventilation strategy, and on operation of forced-air heating and cooling systems. In any multi-zone environment, dilution rates and source strengths may be different in every zone and change in time, resulting in exposure being tied to occupancy patterns. This paper will report on simulations that compare ventilation systems by assessing their impact on exposure, examining common house geometries, contaminant generation profiles, and occupancy scenarios. These simulations take into account the unsteady, occupancy-tied aspects of ventilation such as bathroom and kitchen exhaust fans. As most US homes have central HVAC systems, the simulation results will be used to make appropriate recommendations and adjustments for distribution and mixing to residential ventilation standards such as ASHRAE Standard 62.2. This paper will also report on work being done to model multizone airflow systems that are unsteady, and will elaborate the concept of a distribution matrix. It will examine several metrics for evaluating the effect of air distribution on exposure to pollutants, based on previous work by Sherman et al. (2006).
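The zone-level bookkeeping behind such exposure simulations is the well-mixed mass balance dC/dt = S/V - (Q/V)C. A one-zone sketch with occupancy-tied source and fan schedules (all numbers are illustrative, not from the paper):

```python
def zone_concentration(steps, dt, volume, source, flow):
    """Forward-Euler integration of the well-mixed zone mass balance
    dC/dt = S/V - (Q/V) * C.  source(t) and flow(t) are callables so that
    occupancy-tied events (cooking, exhaust fans) can vary in time."""
    c, out = 0.0, []
    for k in range(steps):
        t = k * dt
        c += dt * (source(t) / volume - flow(t) / volume * c)
        out.append(c)
    return out

# Cooking source during the first hour; the kitchen exhaust fan doubles the
# airflow while cooking.  Time in hours, volume in m3, source in mg/h.
source = lambda t: 10.0 if t < 1.0 else 0.0
flow = lambda t: 100.0 if t < 1.0 else 50.0
conc = zone_concentration(steps=240, dt=1 / 60, volume=250.0, source=source, flow=flow)
```

A multi-zone model chains such balances with inter-zone flows, which is where the distribution matrix mentioned above comes in.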
Li, Mao; Li, Yan; Wen, Peng Paul
2014-01-01
The biological microenvironment is interrupted when tumour masses are introduced because of the strong competition for oxygen. During the period of avascular growth of tumours, capillaries that existed play a crucial role in supplying oxygen to both tumourous and healthy cells. Due to limitations of oxygen supply from capillaries, healthy cells have to compete for oxygen with tumourous cells. In this study, an improved Krogh's cylinder model which is more realistic than the previously reported assumption that oxygen is homogeneously distributed in a microenvironment, is proposed to describe the process of the oxygen diffusion from a capillary to its surrounding environment. The capillary wall permeability is also taken into account. The simulation study is conducted and the results show that when tumour masses are implanted at the upstream part of a capillary and followed by normal tissues, the whole normal tissues suffer from hypoxia. In contrast, when normal tissues are ahead of tumour masses, their pO2 is sufficient. In both situations, the pO2 in the whole normal tissues drops significantly due to the axial diffusion at the interface of normal tissues and tumourous cells. As the existence of the axial oxygen diffusion cannot supply the whole tumour masses, only these tumourous cells that are near the interface can be partially supplied, and have a small chance to survive.
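The capillary-to-tissue geometry described admits the classical steady-state Krogh-Erlang radial solution, which the improved model generalizes. A sketch with purely illustrative parameter values:

```python
import math

def krogh_po2(r, p_cap, M, D, r_cap, R):
    """Steady-state Krogh-Erlang oxygen profile in the tissue annulus
    r_cap <= r <= R around a capillary:
        P(r) = P_cap + (M/4D)(r^2 - r_cap^2) - (M R^2 / 2D) ln(r / r_cap)
    with uniform consumption rate M, diffusivity D, and zero flux at the
    outer radius R (the boundary with the neighbouring cylinder)."""
    return (p_cap + M / (4 * D) * (r ** 2 - r_cap ** 2)
            - M * R ** 2 / (2 * D) * math.log(r / r_cap))

# Hypothetical parameters in consistent arbitrary units:
p_cap, M, D, r_cap, R = 40.0, 1.0, 1.0, 1.0, 5.0
profile = [krogh_po2(1.0 + 0.1 * i, p_cap, M, D, r_cap, R) for i in range(41)]
```

The monotone radial drop in pO2 is what puts tissue at the outer radius, and downstream of tumour masses, at risk of hypoxia.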
International Nuclear Information System (INIS)
Tsuchimochi, Shinsaku; Tamaki, Nagara; Shirakawa, Seishi; Fujita, Toru; Yonekura, Yoshiharu; Konishi, Junji; Nohara, Ryuji; Sasayama, Shigetake; Nishioka, Kenya
1994-01-01
The normal pattern of myocardial sympathetic innervation was studied in 15 subjects using gamma camera scintigraphy with iodine-123 labeled metaiodobenzylguanidine (123I-MIBG). Seven younger subjects (mean age 24.6±3.6 years) and eight older patients (mean age 60.9±8.4 years) with normal cardiac function were studied. Planar imaging was obtained at 15 minutes and 3 hours, and SPECT was also performed 3 hours after injection of 111 MBq (3 mCi) of MIBG. The younger subjects showed a higher heart-to-mediastinum count ratio (2.91±0.25 vs. 2.67±0.34; p<0.05) and a higher inferior-to-anterior count ratio (1.19±0.15 vs. 0.97±0.13; p<0.05) on the late scan. The bull's-eye polar map also showed differences in counts in the mid-inferior (p<0.005), basal-inferior (p<0.005) and mid-lateral sectors (p<0.01). However, there was no significant difference in the MIBG washout rate from the myocardium between the two groups. These data suggest that there is a difference in cardiac sympathetic innervation, with older subjects having fewer sympathetic nerve terminals than younger subjects, especially in the inferior wall. We conclude that the age difference in sympathetic nerve function should be considered in the interpretation of MIBG scans. (author)
Isacco, L; Thivel, D; Duclos, M; Aucouturier, J; Boisseau, N
2014-06-01
Fat mass localization affects lipid metabolism differently at rest and during exercise in overweight and normal-weight subjects. The aim of this study was to investigate the impact of a low vs. high ratio of abdominal to lower-body fat mass (an index of adipose tissue distribution) on the exercise intensity (Lipox(max)) that elicits the maximum lipid oxidation rate in normal-weight women. Twenty-one normal-weight women (22.0 ± 0.6 years, 22.3 ± 0.1 kg.m(-2)) were separated into two groups with either a low or a high abdominal to lower-body fat mass ratio [L-A/LB (n = 11) or H-A/LB (n = 10), respectively]. Lipox(max) and maximum lipid oxidation rate (MLOR) were determined during a submaximum incremental exercise test. Abdominal and lower-body fat mass were determined from DXA scans. The two groups did not differ in aerobic fitness, total fat mass, or total and localized fat-free mass. Lipox(max) and MLOR were significantly lower in H-A/LB than in L-A/LB women (43 ± 3% VO(2max) vs 54 ± 4% VO(2max), and 4.8 ± 0.6 mg min(-1) kg FFM(-1) vs 8.4 ± 0.9 mg min(-1) kg FFM(-1), respectively; P < …). In normal-weight women, a predominantly abdominal fat mass distribution, compared with a predominantly peripheral one, is thus associated with a lower capacity to maximize lipid oxidation during exercise, as evidenced by lower Lipox(max) and MLOR. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Medan, R. T.; Ray, K. S.
1974-01-01
A description of and user's manual for a U.S.A. FORTRAN IV computer program are presented; the program evaluates spanwise and chordwise loading distributions, lift coefficient, pitching-moment coefficient, and other stability derivatives for thin wings in linearized, steady, subsonic flow. The program is based on a kernel function method lifting surface theory and is applicable to a large class of planforms, including asymmetrical ones and ones with mixed straight and curved edges.
DEFF Research Database (Denmark)
Reimers, J; Wogensen, L D; Welinder, B
1991-01-01
Based upon in vivo rat experiments it was recently suggested that interleukin 1 in the circulation may be implicated in the initial events of beta-cell destruction leading to insulin-dependent diabetes mellitus (IDDM) in humans. The aim of the present study was to estimate the half-lives of the distribution (T1/2 alpha) and elimination phases (T1/2 beta) of human recombinant interleukin 1 beta (rIL-1 beta), and its tissue distribution and cellular localization, by means of mono-labelled, biologically active 125I-rIL-1 beta. After intravenous (i.v.) injection, 125I-rIL-1 beta was eliminated from … i.v., intraperitoneal (i.p.) and subcutaneous (s.c.) injections, as demonstrated by high-performance size-exclusion chromatography, trichloroacetic acid precipitation and SDS-PAGE until 5 h after tracer injection. Pre-treatment with 'cold' rIL-1 beta enhanced degradation of a subsequent injection of tracer. The route …
Comparison of plantar pressure distribution in subjects with normal and flat feet during gait
Directory of Open Access Journals (Sweden)
Aluisio Otavio Vargas Avila
2010-06-01
The aim of this study was to determine the possible relationship between loss of the normal medial longitudinal arch, measured by the height of the navicular bone in a static situation, and variables related to plantar pressure distribution measured in a dynamic situation. Eleven men (21 ± 3 years, 74 ± 10 kg and 175 ± 4 cm) participated in the study. The Novel Emed-AT System was used for the acquisition of plantar pressure distribution data (peak pressure, mean pressure, contact area, and relative load) at a sampling rate of 50 Hz. The navicular drop test proposed by Brody (1982) was used to assess the height of the navicular bone for classification of the subjects. The results were compared by the Mann-Whitney U test, with the level of significance set at p ≤ 0.05. Differences were observed between the two groups in the mid-foot region for all variables studied, with higher mean values in subjects with flat feet. There were also significant differences in contact area, relative load, peak pressure, and mean pressure between groups. The present study demonstrates the importance of paying attention to subjects with flat feet, because changes in plantar pressure distribution are associated with discomfort and injuries.
International Nuclear Information System (INIS)
Lopez, G.
1991-01-01
Quench simulation of a superconducting (s.c.) magnet requires some assumptions about the evolution of the normal zone and its temperature profile. The axial evolution of the normal zone is described by the longitudinal quench velocity, while the transversal quench propagation may be described either by a transversal quench velocity or by a turn-to-turn quench delay time. The temperature distribution has been assumed adiabatic-like or cosine-like in two different computer programs; although the two profiles differ, they yield more or less the same qualitative quench results, differing only by about 8%. Unfortunately, there are no experimental data on the temperature profile along the conductor during a quench event that would allow a realistic comparison. The temperature profile has received little attention, mainly because it is not so critical a parameter in quench analysis. Nonetheless, a confident quench analysis requires that the temperature distribution along the normal zone be taken into account with good approximation. In this paper, an analytical study of the temperature profile is made.
Directory of Open Access Journals (Sweden)
Jinhong Noh
2016-04-01
Obstacle avoidance methods require knowledge of the distance between a mobile robot and the obstacles in its environment. In stochastic environments, however, distance determination is difficult because object positions are uncertain. The purpose of this paper is to determine the distance between a robot and obstacles represented by probability distributions. Distance determination for obstacle avoidance should account for position uncertainty, computational cost and collision probability; the proposed method considers all of these conditions, unlike conventional methods. It determines the obstacle region using a collision probability density threshold, defines a minimum distance function to the boundary of that region with a Lagrange multiplier method, and computes the distance numerically. Simulations were executed to compare the performance of distance determination methods; our method was faster and more accurate than conventional ones. It may help overcome position uncertainty issues in obstacle avoidance, such as low-accuracy sensors, environments with poor visibility, or unpredictable obstacle motion.
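The core idea can be sketched numerically: model the obstacle's position as a 2-D Gaussian, define the obstacle region as the set where the density exceeds a threshold, and minimise the distance to that region's boundary subject to the density constraint (SLSQP enforces the Lagrange-multiplier stationarity condition internally). This is our illustrative reconstruction, not the authors' implementation; the function names and parameter values are ours.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

# Sketch (not the paper's code): minimum distance from a robot position to
# the boundary of the obstacle region {x : pdf(x) >= density_threshold}.
def distance_to_obstacle_region(robot, mean, cov, density_threshold):
    robot = np.asarray(robot, dtype=float)
    rv = multivariate_normal(mean=mean, cov=cov)
    if rv.pdf(robot) >= density_threshold:
        return 0.0                          # robot already inside the region
    x0 = 0.5 * (np.asarray(mean, dtype=float) + robot)  # start between them
    res = minimize(
        lambda x: np.sum((x - robot) ** 2),             # squared distance
        x0=x0,
        constraints=[{'type': 'eq',
                      'fun': lambda x: rv.pdf(x) - density_threshold}],
        method='SLSQP')
    return float(np.sqrt(res.fun))

robot = np.array([4.0, 0.0])
d = distance_to_obstacle_region(robot, mean=[0.0, 0.0],
                                cov=np.eye(2), density_threshold=0.05)
print(round(d, 2))
```

For a standard 2-D Gaussian the 0.05 density contour is a circle of radius about 1.52, so the distance from (4, 0) should come out near 2.48.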
International Nuclear Information System (INIS)
Currie, L.A.
2001-01-01
Three general classes of skewed data distributions have been encountered in research on background radiation, chemical and radiochemical blanks, and low levels of 85Kr and 14C in the atmosphere and the cryosphere. The first class of skewed data can be considered to be theoretically, or fundamentally, skewed. It is typified by the exponential distribution of inter-arrival times for nuclear counting events for a Poisson process. As part of a study of the nature of low-level (anti-coincidence) Geiger-Mueller counter background radiation, tests were performed on the Poisson distribution of counts, the uniform distribution of arrival times, and the exponential distribution of inter-arrival times. The real laboratory system, of course, failed the (inter-arrival time) test - for very interesting reasons, linked to the physics of the measurement process. The second, computationally skewed, class relates to skewness induced by non-linear transformations. It is illustrated by non-linear concentration estimates from inverse calibration, and bivariate blank corrections for low-level 14C-12C aerosol data that led to highly asymmetric uncertainty intervals for the biomass carbon contribution to urban ''soot''. The third, environmentally skewed, data class relates to a universal problem for the detection of excursions above blank or baseline levels: namely, the widespread occurrence of ab-normal distributions of environmental and laboratory blanks. This is illustrated by the search for fundamental factors that lurk behind skewed frequency distributions of sulfur laboratory blanks and 85Kr environmental baselines, and the application of robust statistical procedures for reliable detection decisions in the face of skewed isotopic carbon procedural blanks with few degrees of freedom. (orig.)
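The three checks described for an ideal Poisson counting process (Poisson counts, uniform arrival times, exponential inter-arrival times) are easy to reproduce on simulated data. The sketch below uses arbitrary parameter values and Kolmogorov-Smirnov tests; it is an illustration of the tests named in the abstract, not the author's analysis.

```python
import numpy as np
from scipy import stats

# Simulate an ideal Poisson process and apply the three distributional
# checks mentioned above (parameter values are arbitrary).
rng = np.random.default_rng(1)
rate, t_total = 5.0, 1000.0
n_events = rng.poisson(rate * t_total)              # Poisson-distributed count
arrivals = np.sort(rng.uniform(0.0, t_total, n_events))  # uniform arrivals
inter = np.diff(arrivals)                           # exponential inter-arrivals

# A real detector would fail the inter-arrival test if, e.g., dead time
# suppresses the shortest intervals; the ideal simulation should pass.
p_uniform = stats.kstest(arrivals / t_total, 'uniform').pvalue
p_expon = stats.kstest(inter, 'expon', args=(0, 1.0 / rate)).pvalue
print(n_events, p_uniform, p_expon)
```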
Messiter, A. F.
1980-01-01
Asymptotic solutions are derived for the pressure distribution in the interaction of a weak normal shock wave with a turbulent boundary layer. The undisturbed boundary layer is characterized by the law of the wall and the law of the wake for compressible flow. In the limiting case considered, for 'high' transonic speeds, the sonic line is very close to the wall. Comparisons with experiment are shown, with corrections included for the effect of longitudinal wall curvature and for the boundary-layer displacement effect in a circular pipe.
M-dwarf exoplanet surface density distribution. A log-normal fit from 0.07 to 400 AU
Meyer, Michael R.; Amara, Adam; Reggiani, Maddalena; Quanz, Sascha P.
2018-04-01
Aims: We fit a log-normal function to the M-dwarf orbital surface density distribution of gas giant planets, over the mass range 1-10 times that of Jupiter, from 0.07 to 400 AU. Methods: We used a Markov chain Monte Carlo approach to explore the likelihoods of various parameter values consistent with point estimates of the data given our assumed functional form. Results: This fit is consistent with radial velocity, microlensing, and direct-imaging observations, is well-motivated from theoretical and phenomenological points of view, and predicts results of future surveys. We present probability distributions for each parameter and a maximum likelihood estimate solution. Conclusions: We suggest that this function makes more physical sense than other widely used functions, and we explore the implications of our results on the design of future exoplanet surveys.
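The fitting strategy can be sketched with a minimal random-walk Metropolis sampler on synthetic point estimates (not the paper's data): a log-normal profile in separation, a Gaussian likelihood around the point estimates, and a chain over the amplitude and log-normal parameters. The functional form, data values and step sizes below are all our illustrative assumptions.

```python
import numpy as np

# Schematic MCMC fit of a log-normal surface-density profile
#   f(a) = A * exp(-(ln a - mu)^2 / (2 sigma^2))
# to synthetic point estimates (illustrative only).
rng = np.random.default_rng(0)

def lognormal_profile(a, A, mu, sigma):
    return A * np.exp(-(np.log(a) - mu) ** 2 / (2.0 * sigma ** 2))

a_obs = np.array([0.1, 0.5, 2.0, 10.0, 50.0, 200.0])   # separations in AU
truth = (0.06, np.log(3.0), 1.3)                       # synthetic "truth"
y_err = 0.2 * lognormal_profile(a_obs, *truth) + 1e-3
y_obs = lognormal_profile(a_obs, *truth) + rng.normal(0.0, y_err)

def log_like(theta):
    A, mu, sigma = theta
    if A <= 0 or sigma <= 0:
        return -np.inf
    resid = (y_obs - lognormal_profile(a_obs, A, mu, sigma)) / y_err
    return -0.5 * np.sum(resid ** 2)

theta = np.array([0.05, 1.0, 1.0])
lp = log_like(theta)
chain = []
for _ in range(20000):                    # random-walk Metropolis
    prop = theta + rng.normal(0.0, [0.005, 0.1, 0.1])
    lp_prop = log_like(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain)[5000:]            # discard burn-in
print(chain.mean(axis=0))                 # posterior means for (A, mu, sigma)
```

The paper explores the likelihood with a Markov chain Monte Carlo approach in the same spirit; any production analysis would add convergence diagnostics and priors.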
Sakashita, Tetsuya; Hamada, Nobuyuki; Kawaguchi, Isao; Ouchi, Noriyuki B; Hara, Takamitsu; Kobayashi, Yasuhiko; Saito, Kimiaki
2013-01-01
Clonogenicity gives important information about the cellular reproductive potential following ionizing irradiation, but abortive colonies that fail to continue growing remain poorly characterized. It was recently reported that the fraction of abortive colonies increases with increasing dose. Thus, we set out to investigate the production kinetics of abortive colonies using a model of branching processes. We first plotted the experimentally determined colony size distribution of abortive colonies in irradiated normal human fibroblasts, and found a linear relationship on the log-linear or log-log plot. By applying a simple branching-process model to this linear relationship, we found persistent reproductive cell death (RCD) over several generations following irradiation. To verify the estimated probability of RCD, the abortive colony size distribution (≤ 15 cells) and the surviving fraction were simulated by a Monte Carlo computational approach for colony expansion. Parameters estimated from the log-log fit performed better in both simulations than those from the log-linear fit. Radiation-induced RCD, i.e. excess probability, lasted over 16 generations and mainly consisted of two components in the early ( … ) probability over 5 generations, whereas the abortive colony size distribution was robust against it. These results suggest that, whereas short-term RCD is critical to the abortive colony size distribution, long-lasting RCD is important for the dose response of the surviving fraction. Our present model provides a single framework for understanding the behavior of primary cell colonies in culture following irradiation.
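A toy Galton-Watson version of this picture is straightforward to simulate: each cell either divides or suffers reproductive cell death with some per-generation probability, and a colony is scored as abortive if it dies out before reaching the clonogenic size threshold. The death probability, threshold and generation cap below are illustrative choices, not the paper's fitted parameters.

```python
import random
from collections import Counter

# Toy branching-process sketch of radiation-induced reproductive cell
# death (RCD); parameters are illustrative.  Each cell divides with
# probability 1 - p_rcd, otherwise it dies without progeny.
def grow_colony(p_rcd, rng, threshold=50, max_generations=20):
    cells, peak = 1, 1
    for _ in range(max_generations):
        cells = sum(2 for _ in range(cells) if rng.random() > p_rcd)
        peak = max(peak, cells)
        if cells == 0 or cells >= threshold:
            break
    return cells, peak           # final size, and largest size reached

rng = random.Random(7)
results = [grow_colony(0.3, rng) for _ in range(5000)]
surviving_fraction = sum(final >= 50 for final, _ in results) / len(results)
# "abortive" colonies went extinct; record the peak size each one reached
abortive_peaks = Counter(peak for final, peak in results if final == 0)
print(surviving_fraction)
```

For this offspring distribution the extinction probability is p_rcd / (1 - p_rcd) ≈ 0.43, so roughly half the simulated colonies survive; the distribution of abortive peak sizes plays the role of the abortive colony size distribution in the study.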
Ibrahim, Mohamed; Wickenhauser, Patrick; Rautek, Peter; Reina, Guido; Hadwiger, Markus
2017-01-01
Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.
Elshahaby, Fatma E. A.; Ghaly, Michael; Jha, Abhinav K.; Frey, Eric C.
2015-03-01
Model observers are widely used in medical imaging for the optimization and evaluation of instrumentation, acquisition parameters, and image reconstruction and processing methods. The channelized Hotelling observer (CHO) is a commonly used model observer in nuclear medicine and has seen increasing use in other modalities. An anthropomorphic CHO consists of a set of channels that model some aspects of the human visual system, followed by the Hotelling observer, which is the optimal linear discriminant. The optimality of the CHO is based on the assumption that the channel outputs for data with and without the signal present have a multivariate normal distribution with equal class covariance matrices. The channel outputs result from the dot product of channel templates with input images and are thus sums of a large number of random variables; the central limit theorem is therefore often used to justify the assumption that the channel outputs are normally distributed. In this work, we examine this assumption for realistically simulated nuclear medicine images when various types of signal variability are present.
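The quantity under test is easy to reproduce in miniature: a channel output is the dot product of a channel template with an image, i.e. a sum of thousands of pixel terms, which is what makes the central-limit-theorem normality argument plausible in the first place. The templates and Poisson "images" below are stand-ins of our own, not the paper's simulated data or channel model.

```python
import numpy as np
from scipy import stats

# Minimal sketch of CHO channel outputs (hypothetical channels, random
# Poisson-noise "images"): each output is a template-image dot product.
rng = np.random.default_rng(3)
npix, nchan, nimg = 64 * 64, 5, 2000
channels = rng.normal(size=(nchan, npix))        # stand-in channel templates
images = rng.poisson(10.0, size=(nimg, npix))    # Poisson-noise images
outputs = images @ channels.T                    # (nimg, nchan) channel outputs

# With no signal variability, the CLT makes each output very nearly normal;
# a Shapiro-Wilk test on one channel illustrates the check.
stat, p = stats.shapiro(outputs[:500, 0])
print(p)
```

The paper's point is precisely that this idealized normality can break down once realistic signal variability enters the simulation.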
Differentiation in boron distribution in adult male and female rats' normal brain: A BNCT approach
Energy Technology Data Exchange (ETDEWEB)
Goodarzi, Samereh, E-mail: samere.g@gmail.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of); Pazirandeh, Ali, E-mail: paziran@yahoo.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of); Jameie, Seyed Behnamedin, E-mail: behnamjameie@tums.ac.ir [Basic Science Department, Faculty of Allied Medicine, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Department of Anatomy, Faculty of Medicine, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Baghban Khojasteh, Nasrin, E-mail: khojasteh_n@yahoo.com [Department of Nuclear Engineering, Science and Research Branch, Islamic Azad University, PO Box 19395-1943, Tehran (Iran, Islamic Republic of)
2012-06-15
Boron distribution in adult male and female rats' normal brain after boron carrier injection (0.005 g boric acid + 0.005 g borax + 10 ml distilled water, pH 7.4) was studied in this research. Coronal sections of control and trial animal tissue samples were irradiated with thermal neutrons. Using alpha autoradiography, significant differences in boron concentration were seen in forebrain, midbrain and hindbrain sections of the male and female animal groups, with the highest value four hours after boron compound injection. - Highlights: • Boron distribution in male and female rats' normal brain was studied in this research. • Coronal sections of animal tissue samples were irradiated with thermal neutrons. • Alpha and lithium tracks were counted using alpha autoradiography. • Different boron concentrations were seen in brain sections of male and female rats. • The highest boron concentration was seen 4 h after boron compound injection.
Mingxing Zhu; Wanzhang Yang; Samuel, Oluwarotimi Williams; Yun Xiang; Jianping Huang; Haiqing Zou; Guanglin Li
2016-08-01
The pharyngeal phase is the central hub of swallowing, in which the food bolus passes from the oral cavity to the esophagus. Proper understanding of the muscular activities in the pharyngeal phase is useful for assessing swallowing function and the occurrence of dysphagia in humans. In this study, high-density (HD) surface electromyography (sEMG) was used to study the muscular activities of the pharyngeal phase during swallowing tasks in three healthy male subjects. The root mean square (RMS) of the HD sEMG data was computed over a series of segmented windows as myoelectrical energy, the RMS of each window covering all channels (16×5) forming a matrix. During the pharyngeal phase of swallowing, three of these matrixes were chosen and normalized to obtain HD energy maps and statistical parameters. The maps across different viscosity levels showed the energy distribution of the muscular activities on the left and right sides of the front neck muscles. In addition, the normalized average RMS (NARE) across different viscosity levels revealed a significant left-right correlation (r = 0.868 ± 0.629, p < …), with a stronger correlation when swallowing water. This pilot study suggests that HD sEMG could be a potential tool to evaluate muscular activities in the pharyngeal phase during normal swallowing. It might also provide useful information for dysphagia diagnosis.
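The processing chain described (windowed RMS per channel, reshaped to the electrode grid and normalized into an energy map) can be sketched on synthetic data. The sampling rate, window length and random signal below are our assumptions; only the 16×5 grid comes from the study.

```python
import numpy as np

# Sketch of windowed-RMS energy maps for an HD sEMG grid (synthetic data;
# the study used a 16x5 electrode arrangement).
rng = np.random.default_rng(0)
fs, seconds, n_channels = 1000, 2, 80          # 80 channels = 16 x 5 grid
semg = rng.normal(0.0, 1.0, size=(n_channels, fs * seconds))

def rms_map(signal, window=200):
    """Split each channel into windows and take the RMS of each window."""
    n_win = signal.shape[1] // window
    seg = signal[:, :n_win * window].reshape(signal.shape[0], n_win, window)
    return np.sqrt((seg ** 2).mean(axis=2))    # (channels, windows)

r = rms_map(semg)                  # (80, 10)
frame = r[:, 4].reshape(16, 5)     # one window as a 16x5 energy map
norm_map = (frame - frame.min()) / (frame.max() - frame.min())  # scale to [0,1]
print(norm_map.shape, float(norm_map.max()))
```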
Luo, Qingzhi; Jin, Qi; Zhang, Ning; Han, Yanxin; Wang, Yilong; Huang, Shangwei; Lin, Changjian; Ling, Tianyou; Chen, Kang; Pan, Wenqi; Wu, Liqun
2017-04-13
The objective of this study was to detect differences in the distribution of the left and right ventricular (LV and RV) activation rate (AR) during short-duration ventricular fibrillation (SDVF, < 1 min) and long-duration ventricular fibrillation (LDVF) in normal and heart failure (HF) canine hearts. Ventricular fibrillation (VF) was electrically induced in six healthy dogs (control group) and six dogs with right ventricular pacing-induced congestive HF (HF group). Two 64-electrode basket catheters deployed in the LV and RV were used for global endocardial electrical mapping, and the AR of VF was estimated at each electrode by fast Fourier transform analysis. In the control group, the LV was activated faster than the RV in the first 20 s, after which there was no detectable difference in AR between them; analyzing the distribution of the AR within the two ventricles at 3 min of LDVF, the posterior LV was activated fastest and the anterior slowest. In the HF group, a detectable AR gradient existed between the two ventricles within 3 min of VF, with the LV activating faster than the RV; at 3 min of LDVF, the septum of the LV was activated fastest and the anterior wall slowest. A global bi-ventricular endocardial AR gradient thus existed within the first 20 s of VF but disappeared during LDVF in healthy hearts, whereas in HF hearts the AR gradient was observed in both SDVF and LDVF. These findings suggest that LDVF in HF hearts is maintained differently than in normal hearts, which should accordingly lead to different management strategies for LDVF resuscitation.
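The activation-rate estimate itself is a standard dominant-frequency computation: take the magnitude spectrum of each electrode's signal and locate its peak. The sketch below applies it to a synthetic 7 Hz "electrogram"; the sampling rate, noise level and frequency band are our assumptions, with only the use of fast Fourier transform analysis taken from the study.

```python
import numpy as np

# Dominant-frequency (activation rate) estimate from a synthetic electrogram.
fs = 1000.0                                   # Hz, assumed sampling rate
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(5)
electrogram = np.sin(2 * np.pi * 7.0 * t) + 0.5 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(electrogram))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
band = (freqs >= 2.0) & (freqs <= 30.0)       # assumed physiological VF band
dominant = freqs[band][np.argmax(spectrum[band])]
print(dominant)  # -> 7.0 for this synthetic signal
```

Mapping this per-electrode estimate over two 64-electrode baskets is what yields the AR distributions compared between the LV and RV.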
Spatial Distribution of Iron Within the Normal Human Liver Using Dual-Source Dual-Energy CT Imaging.
Abadia, Andres F; Grant, Katharine L; Carey, Kathleen E; Bolch, Wesley E; Morin, Richard L
2017-11-01
We explored the potential of dual-source dual-energy (DSDE) computed tomography (CT) to retrospectively analyze the uniformity of iron distribution and to establish iron concentration ranges and distribution patterns found in healthy livers. Ten mixtures consisting of an iron nitrate solution and deionized water were prepared in test tubes and scanned using a DSDE 128-slice CT system. Iron images were derived from a 3-material decomposition algorithm optimized for the quantification of iron. A conversion factor (mg Fe/mL per Hounsfield unit) was calculated from this phantom study as the quotient of the known tube concentrations and their corresponding CT values. A retrospective analysis was then performed of patients who had undergone DSDE imaging for renal stones: 37 patients with normal liver function were randomly selected (mean age, 52.5 years) and their examinations were processed for iron concentration. Multiple regions of interest were analyzed, and iron concentration (mg Fe/mL) and distribution were reported. The mean conversion factor obtained from the phantom study was 0.15 mg Fe/mL per Hounsfield unit. Whole-liver mean iron concentrations ranged from 0.0 to 2.91 mg Fe/mL, with 94.6% (35/37) of the patients exhibiting mean concentrations below 1.0 mg Fe/mL. The most important finding was that iron concentration was not uniform, with 36 of 37 patients exhibiting regionally high concentrations. These regions of higher concentration were predominantly in the middle-to-upper part of the liver (75%), medial (72.2%), and anterior (83.3%). Dual-source dual-energy CT can thus be used to assess the uniformity of iron distribution in healthy subjects. Applying similar techniques to unhealthy livers, future research may focus on hepatic iron content and distribution for noninvasive assessment of diseased subjects.
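The reported calibration reduces to a single multiplication: iron concentration in an ROI is the mean CT number on the iron image times the phantom-derived factor of 0.15 mg Fe/mL per Hounsfield unit. A minimal sketch (the function name and example ROI value are ours):

```python
# Phantom-derived calibration from the study: 0.15 mg Fe/mL per HU.
CONVERSION = 0.15  # mg Fe/mL per Hounsfield unit

def iron_concentration(mean_hu):
    """Mean HU in an ROI of the iron image -> iron concentration, mg Fe/mL."""
    return CONVERSION * mean_hu

# e.g. a hypothetical ROI averaging 6 HU maps to 0.9 mg Fe/mL, below the
# 1.0 mg/mL level that 94.6% of the healthy livers stayed under.
print(round(iron_concentration(6.0), 3))  # -> 0.9
```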
Institute of Scientific and Technical Information of China (English)
GUO Huarong; HUANG Bing; QI Fei; ZHANG Shicui
2007-01-01
The distribution and ultrastructure of pigment cells in the skin of normal and albino adult turbots were examined with transmission electron microscopy (TEM). Three types of pigment cells, melanophores, iridophores and xanthophores, were recognized in adult turbot skin. The skin color depends mainly on the amount and distribution of melanophores and iridophores, as xanthophores are quite rare. No pigment cells were found in the epidermis. In the pigmented ocular skin of the turbot, melanophores and iridophores are usually co-localized in the dermis; this is quite different from their distribution in larval skin. In the albino and white blind-side skins of adult turbots, however, only the iridophore monolayer remains, while the melanophore monolayer has disappeared. This cytological evidence explains why the albino adult turbot, unlike its larvae, can never resume its body color regardless of the environmental and nutritional conditions provided. Endocytosis is quite active in the cell membrane of the iridophore, which might be related to the formation of reflective platelets and the stability of the iridophore.
Bellier, Edwige; Grøtan, Vidar; Engen, Steinar; Schartau, Ann Kristin; Diserud, Ola H; Finstad, Anders G
2012-10-01
Obtaining accurate estimates of diversity indices is difficult because the number of species encountered in a sample increases with sampling intensity. We introduce a novel method that requires only that the presence of species in a sample be assessed, while counts of the number of individuals per species are needed for just a small part of the sample. To account for species included as incidence data in the species abundance distribution, we modify the likelihood function of the classical Poisson log-normal distribution. Using simulated community assemblages, we contrast diversity estimates based on a community sample, a subsample randomly extracted from the community sample, and a mixture sample where incidence data are added to a subsample. We show that the mixture sampling approach provides more accurate estimates than the subsample, at little extra cost. Diversity indices estimated from a freshwater zooplankton community sampled using the mixture approach show the same pattern of results as the simulation study. Our method efficiently increases the accuracy of diversity estimates and the comprehension of the left tail of the species abundance distribution. We show how to choose the sample size needed for a compromise between information gained, accuracy of the estimates, and cost expended when assessing biological diversity. The sample size estimates are obtained from key community characteristics, such as the expected number of species in the community, the expected number of individuals in a sample, and the evenness of the community.
Directory of Open Access Journals (Sweden)
Habib Baghirov
The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma.
Directory of Open Access Journals (Sweden)
Jessica Freundt-Revilla
The endocannabinoid system is a regulatory pathway consisting of two main types of cannabinoid receptors (CB1 and CB2) and their endogenous ligands, the endocannabinoids. The CB1 receptor is highly expressed in the central nervous system (CNS) and peripheral nervous system (PNS) of mammals and is involved in neuromodulatory functions. Since endocannabinoids were shown to be elevated in the cerebrospinal fluid of epileptic dogs, knowledge of the species-specific CB receptor expression in the nervous system is required. We therefore assessed the spatial distribution of CB1 receptors in the normal canine CNS and PNS. Immunohistochemistry of several regions of the brain, spinal cord and peripheral nerves from a healthy four-week-old puppy, three six-month-old dogs, and one ten-year-old dog revealed strong dot-like immunoreactivity in the neuropil of the cerebral cortex, Cornu Ammonis (CA) and dentate gyrus of the hippocampus, midbrain, cerebellum, medulla oblongata and grey matter of the spinal cord. Dense CB1 expression was found in fibres of the globus pallidus and substantia nigra surrounding immunonegative neurons. Astrocytes were consistently positive in all examined regions. In the PNS, CB1 labelled neurons and satellite cells of the dorsal root ganglia, as well as myelinating Schwann cells. These results demonstrate for the first time the spatial distribution of CB1 receptors in the healthy canine CNS and PNS, and can serve as a basis for further studies aiming to elucidate the physiological consequences of this particular anatomical and cellular distribution.
Energy Technology Data Exchange (ETDEWEB)
Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China); Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)
2015-09-25
Heterogeneous delays with a positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the dynamical transition from incoherence to coherence is mediated by Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms, stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. With respect to Gamma distributed delay with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of excess kurtosis of the Gamma distribution. - Highlights: • Heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • Perturbation technique on complex domain is established for bifurcation analysis. • Hysteresis phenomenon is investigated in a theoretical way. • The effect of excess kurtosis of distributed delays is discussed.
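One way to read the excess-kurtosis interpretation in the highlights: if the delay is modeled as a fixed gap plus a Gamma variate with the overall mean and variance held fixed (this shifted-Gamma parametrization is our assumption, not spelled out in the abstract), enlarging the gap shrinks the Gamma shape parameter k and so raises the excess kurtosis 6/k:

```python
def excess_kurtosis(mean, var, gap):
    """Excess kurtosis of delay = gap + Gamma(k, theta), where the total
    mean and variance are fixed: k*theta = mean - gap, k*theta**2 = var.
    A shifted Gamma keeps the excess kurtosis of the Gamma part, 6/k."""
    assert 0 <= gap < mean
    k = (mean - gap) ** 2 / var  # shape parameter of the Gamma part
    return 6.0 / k

# Illustrative numbers: mean delay 2.0, variance 0.25, growing gap
m, v = 2.0, 0.25
for gap in (0.0, 0.5, 1.0):
    print(gap, excess_kurtosis(m, v, gap))
```

With fixed mean and variance, the gap thus acts as a kurtosis knob, consistent with the abstract's interpretation.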
Baghirov, Habib; Snipstad, Sofie; Sulheim, Einar; Berg, Sigrid; Hansen, Rune; Thorsen, Frits; Mørch, Yrr; Åslund, Andreas K. O.
2018-01-01
The treatment of brain diseases is hindered by the blood-brain barrier (BBB) preventing most drugs from entering the brain. Focused ultrasound (FUS) with microbubbles can open the BBB safely and reversibly. Systemic drug injection might induce toxicity, but encapsulation into nanoparticles reduces accumulation in normal tissue. Here we used a novel platform based on poly(2-ethyl-butyl cyanoacrylate) nanoparticle-stabilized microbubbles to permeabilize the BBB in a melanoma brain metastasis model. With a dual-frequency ultrasound transducer generating FUS at 1.1 MHz and 7.8 MHz, we opened the BBB using nanoparticle-microbubbles and low-frequency FUS, and applied high-frequency FUS to generate acoustic radiation force and push nanoparticles through the extracellular matrix. Using confocal microscopy and image analysis, we quantified nanoparticle extravasation and distribution in the brain parenchyma. We also evaluated haemorrhage, as well as the expression of P-glycoprotein, a key BBB component. FUS and microbubbles distributed nanoparticles in the brain parenchyma, and the distribution depended on the extent of BBB opening. The results from acoustic radiation force were not conclusive, but in a few animals some effect could be detected. P-glycoprotein was not significantly altered immediately after sonication. In summary, FUS with our nanoparticle-stabilized microbubbles can achieve accumulation and displacement of nanoparticles in the brain parenchyma. PMID:29338016
International Nuclear Information System (INIS)
Silva, M.P.; Oliveira, M.A.; Poletti, M.E.
2012-01-01
Full text: Some trace elements, naturally present in breast tissues, participate in a large number of biological processes, which include, among others, activation or inhibition of enzymatic reactions and changes in cell membrane permeability, suggesting that these elements may influence carcinogenic processes. Thus, knowledge of the amounts of these elements and their spatial distribution in normal and neoplastic tissues may help in understanding the role of these elements in the carcinogenic process and tumor progression of breast cancers. Concentrations of trace elements like Ca, Fe, Cu and Zn, previously studied at LNLS using TXRF and conventional XRF, were elevated in neoplastic breast tissues compared to normal tissues. In this study we determined the spatial distribution of these elements in normal and neoplastic breast tissues using the μ-XRF technique. We analyzed 22 samples of normal and neoplastic breast tissues (malignant and benign) obtained from paraffin blocks available for study at the Department of Pathology HC-FMRP/USP. From each block, a small fraction of material was removed and cut into 60 μm thick histological sections with a microtome. The slices were placed in sample holders and covered with Ultralene film. Tissue samples were irradiated with a white beam of synchrotron radiation. The samples were positioned at 45 degrees with respect to the incident beam on a table with 3 degrees of freedom (x, y and z), allowing independent positioning of the sample in these directions. The white beam was collimated by a 20 μm microcapillary and the samples were fully scanned. At each step, a spectrum was detected for 10 s. The fluorescence emitted by elements present in the sample was detected by a Si(Li) detector with 165 eV energy resolution at 5.9 keV, placed at 90 deg with respect to the incident beam. Results reveal that the trace elements Ca-Zn and Fe-Cu could be correlated in malignant breast tissues. Quantitative results, achieved by Spearman
Fuchs, Karl Josef; Simonovits, Reinhard; Thaller, Bernd
2008-01-01
This paper describes a high school project where the mathematics teaching and learning software M@th Desktop (MD) based on the Computer Algebra System Mathematica was used for symbolical and numerical calculations and for visualisation. The mathematics teaching and learning software M@th Desktop 2.0 (MD) contains the modules Basics including tools…
International Nuclear Information System (INIS)
Alyar, S.
2008-01-01
N-substituted sulfonamides are well known for their diuretic, antidiabetic, antibacterial, antifungal and anticancer properties, and are widely used in the therapy of patients. These important bioactive properties are strongly affected by the special features of the -CH2-SO2-NR- linker and by intramolecular motion. Thus, studies of the energetic and spatial properties of N-substituted sulfonamides are of great importance to improve our understanding of their biological activities and enhance our ability to predict new drugs. Density functional theory at the B3LYP/6-31G(d,p) level has been applied to obtain the vibrational force field for the most stable conformation of N,N'-ethylenebis(p-toluenesulfonamide) (ptsen), which has a sulfonamide moiety. The results of these calculations have been compared with spectroscopic data to verify the accuracy of the calculation and the applicability of the DFT approach to ptsen. Additionally, complete normal coordinate analyses with quantum mechanical scaling (SQM) were performed to derive the potential energy distributions (PED)
Peters, B. C., Jr.; Walker, H. F.
1976-01-01
The problem of obtaining numerically maximum likelihood estimates of the parameters for a mixture of normal distributions is addressed. In recent literature, a certain successive approximations procedure, based on the likelihood equations, is shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, a general iterative procedure is introduced, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. With probability 1 as the sample size grows large, it is shown that this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. The step-size which yields optimal local convergence rates for large samples is determined in a sense by the separation of the component normal densities and is bounded below by a number between 1 and 2.
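The successive-approximations procedure analyzed above is, at step-size 1, the familiar EM-type fixed-point iteration for a normal mixture; the paper's generalized procedure relaxes the update with a step-size in (0, 2). A minimal sketch on synthetic data (two well-separated components; all numbers are illustrative, not from the paper):

```python
import math
import random

def em_step(data, params):
    """One fixed-point (EM-type) update for a two-component normal mixture.
    params = (w, mu1, s1, mu2, s2), w = weight of component 1."""
    w, mu1, s1, mu2, s2 = params

    def pdf(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    # E-step: posterior responsibility of component 1 for each point
    r = []
    for x in data:
        p1 = w * pdf(x, mu1, s1)
        p2 = (1 - w) * pdf(x, mu2, s2)
        r.append(p1 / (p1 + p2))
    n1 = sum(r)
    n2 = len(data) - n1
    # M-step: responsibility-weighted means and standard deviations
    mu1n = sum(ri * x for ri, x in zip(r, data)) / n1
    mu2n = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
    s1n = math.sqrt(sum(ri * (x - mu1n) ** 2 for ri, x in zip(r, data)) / n1)
    s2n = math.sqrt(sum((1 - ri) * (x - mu2n) ** 2 for ri, x in zip(r, data)) / n2)
    return (n1 / len(data), mu1n, s1n, mu2n, s2n)

random.seed(0)
data = ([random.gauss(-2, 1) for _ in range(300)]
        + [random.gauss(3, 1) for _ in range(300)])

params = (0.5, -1.0, 1.5, 1.0, 1.5)  # deliberately rough starting values
step = 1.0  # the paper studies step-sizes between 0 and 2; 1 is plain EM
for _ in range(50):
    update = em_step(data, params)
    params = tuple(p + step * (u - p) for p, u in zip(params, update))
print(params)
```

The estimated means settle near the true -2 and 3; choosing `step` other than 1 deflects the same fixed-point map, which is the paper's generalized steepest-ascent view.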
Peters, B. C., Jr.; Walker, H. F.
1978-01-01
This paper addresses the problem of obtaining numerically maximum-likelihood estimates of the parameters for a mixture of normal distributions. In recent literature, a certain successive-approximations procedure, based on the likelihood equations, was shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, we introduce a general iterative procedure, of the generalized steepest-ascent (deflected-gradient) type, which is just the procedure known in the literature when the step-size is taken to be 1. We show that, with probability 1 as the sample size grows large, this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. We also show that the step-size which yields optimal local convergence rates for large samples is determined in a sense by the 'separation' of the component normal densities and is bounded below by a number between 1 and 2.
International Nuclear Information System (INIS)
Sakashita, Tetsuya; Kobayashi, Yasuhiko; Hamada, Nobuyuki; Kawaguchi, Isao; Hara, Takamitsu; Saito, Kimiaki
2014-01-01
A single cell can form a colony, and ionizing irradiation has long been known to reduce such a cellular clonogenic potential. Analysis of abortive colonies unable to continue to grow should provide important information on the reproductive cell death (RCD) following irradiation. Our previous analysis with a branching process model showed that the RCD in normal human fibroblasts can persist over 16 generations following irradiation with low linear energy transfer (LET) γ-rays. Here we further set out to evaluate the RCD persistency in abortive colonies arising from normal human fibroblasts exposed to high-LET carbon ions (18.3 MeV/u, 108 keV/μm). We found that the abortive colony size distribution determined by biological experiments follows a linear relationship on the log–log plot, and that the Monte Carlo simulation using the RCD probability estimated from such a linear relationship well simulates the experimentally determined surviving fraction and the relative biological effectiveness (RBE). We identified the short-term phase and long-term phase for the persistent RCD following carbon-ion irradiation, which were similar to those previously identified following γ-irradiation. Taken together, our results suggest that subsequent secondary or tertiary colony formation would be invaluable for understanding the long-lasting RCD. All together, our framework for analysis with a branching process model and a colony formation assay is applicable to determination of cellular responses to low- and high-LET radiation, and suggests that the long-lasting RCD is a pivotal determinant of the surviving fraction and the RBE. (author)
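The reported linear relationship of the abortive colony size distribution on the log-log plot can be checked with an ordinary least-squares fit in log coordinates. A sketch with hypothetical colony-size counts (illustrative numbers only, not the paper's data):

```python
import math

# Hypothetical abortive-colony data: colony size -> observed frequency.
# A straight line on the log-log plot indicates power-law-like behavior.
sizes = [1, 2, 4, 8, 16]
counts = [1000, 260, 68, 17, 4]

logx = [math.log10(s) for s in sizes]
logy = [math.log10(c) for c in counts]

# Ordinary least squares in log-log coordinates
n = len(sizes)
mx = sum(logx) / n
my = sum(logy) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(logx, logy))
         / sum((x - mx) ** 2 for x in logx))
intercept = my - slope * mx
print(slope, intercept)
```

In the paper's framework, a probability read off such a fit feeds the Monte Carlo simulation of reproductive cell death over successive generations.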
Yamada, Shigeki; Ishikawa, Masatsune; Yamamoto, Kazuo
2017-01-01
Despite growing evidence on idiopathic normal-pressure hydrocephalus (NPH), viewpoints on clinical care for idiopathic NPH remain controversial. This continuing divergence of viewpoints might be due to confusing classifications of idiopathic and adult-onset congenital NPH. To elucidate the classification of NPH, we propose that adult-onset congenital NPH should be explicitly distinguished from idiopathic and secondary NPH. On the basis of conventional CT scan or MRI, idiopathic NPH was defined as narrow sulci at the high convexity concurrent with enlargement of the ventricles, basal cistern and Sylvian fissure, whereas adult-onset congenital NPH was defined as huge ventricles without high-convexity tightness. We compared clinical characteristics and cerebrospinal fluid distribution among 85 patients diagnosed with idiopathic NPH, 17 patients with secondary NPH, and 7 patients with adult-onset congenital NPH. All patients underwent 3-T MRI examinations and tap tests. The volumes of ventricles and subarachnoid spaces were measured using a 3D workstation based on T2-weighted 3D sequences. The mean intracranial volume for the patients with adult-onset congenital NPH was almost 100 mL larger than the volumes for patients with idiopathic and secondary NPH. Compared with the patients with idiopathic or secondary NPH, patients with adult-onset congenital NPH exhibited larger ventricles but normal-sized subarachnoid spaces. The mean volume ratio of the high-convexity subarachnoid space was significantly less in idiopathic NPH than in adult-onset congenital NPH, whereas the mean volume ratio of the basal cistern and Sylvian fissure in idiopathic NPH was >2 times larger than that in adult-onset congenital NPH. The symptoms of gait disturbance, cognitive impairment, and urinary incontinence in patients with adult-onset congenital NPH tended to progress more slowly compared to their progress in patients with idiopathic NPH. Cerebrospinal fluid distributions and
International Nuclear Information System (INIS)
Blasdell, R.C.; Ceperley, D.M.; Simmons, R.O.
1993-07-01
Deep inelastic neutron scattering has been used to measure the neutron Compton profile (NCP) of a series of condensed ⁴He samples at densities from 28.8 atoms/nm³ (essentially the minimum possible density in the solid phase) up to 39.8 atoms/nm³ using a chopper spectrometer at the Argonne National Laboratory Intense Pulsed Neutron Source. At the lowest density, the NCP was measured along an isochore through the hcp, bcc, and normal liquid phases. Average atomic kinetic energies are extracted from each of the data sets and are compared to both published and new path-integral Monte Carlo (PIMC) calculations as well as other theoretical predictions. In this preliminary analysis of the data, account is taken of the effects of instrumental resolution, multiple scattering, and final-state interactions. Both our measurements and the PIMC theory show that there are only small differences in the kinetic energy and longitudinal momentum distribution of isochoric helium samples, regardless of their phase or crystal structure
International Nuclear Information System (INIS)
Gigase, Yves
2007-01-01
Available in abstract form only. Full text of publication follows: The uncertainty on characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package, one has to combine these various uncertainties. This paper discusses an approach to this problem, based on the use of the log-normal distribution, which is both elegant and easy to use. It can provide, for example, quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other more complex characteristics, such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, in particular in those decision processes where the uncertainty on the amount of activity is considered to be important, such as in probabilistic risk assessment or the definition of criteria for acceptance or categorization. (author)
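A sketch of the kind of log-normal combination the abstract describes: multiplying independent log-normal factors adds their log-means and log-variances, so medians multiply and geometric standard deviations (GSDs) combine in quadrature. All numbers below are illustrative, not from the paper:

```python
import math

def combine_lognormal_factors(factors):
    """Combine independent multiplicative factors, each modeled as
    log-normal and given as (median, geometric standard deviation).
    Returns (median, GSD) of the product, which is again log-normal."""
    mu = sum(math.log(med) for med, gsd in factors)          # log-means add
    var = sum(math.log(gsd) ** 2 for med, gsd in factors)    # log-variances add
    return math.exp(mu), math.exp(math.sqrt(var))

# e.g. a scaling-factor-style estimate: measured key-nuclide activity
# times a scaling factor, each with its own log-normal uncertainty
# (hypothetical values: 5.0 with GSD 1.3, and 0.02 with GSD 2.0)
median, gsd = combine_lognormal_factors([(5.0, 1.3), (0.02, 2.0)])
lo = median / gsd ** 1.96   # approximate 95% uncertainty interval
hi = median * gsd ** 1.96
print(median, lo, hi)
```

The resulting interval is asymmetric around the median, which is exactly the "intervals that make sense" property of the log-normal approach for strictly positive quantities such as activities.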
Geravanchizadeh, Masoud; Fallah, Ali
2015-12-01
A binaural and psychoacoustically motivated intelligibility model, based on a well-known monaural microscopic model, is proposed. This model simulates a phoneme recognition task in the presence of spatially distributed speech-shaped noise in anechoic scenarios. In the proposed model, binaural advantage effects are considered by generating a feature vector for a dynamic-time-warping speech recognizer. This vector consists of three subvectors incorporating two monaural subvectors to model the better-ear hearing, and a binaural subvector to simulate the binaural unmasking effect. The binaural unit of the model is based on equalization-cancellation theory. This model operates blindly, which means separate recordings of speech and noise are not required for the predictions. Speech intelligibility tests were conducted with 12 normal hearing listeners by collecting speech reception thresholds (SRTs) in the presence of single and multiple sources of speech-shaped noise. The comparison of the model predictions with the measured binaural SRTs, and with the predictions of a macroscopic binaural model called extended equalization-cancellation, shows that this approach predicts the intelligibility in anechoic scenarios with good precision. The square of the correlation coefficient (r²) and the mean-absolute error between the model predictions and the measurements are 0.98 and 0.62 dB, respectively.
Fan, L; Shang, X; Zhu, J; Ma, B; Zhang, Q
2018-05-02
In this study, we assessed the therapeutic effects of fosfomycin tromethamine (FT) in a bacterial prostatitis (BP) rat model. The BP model was induced by Escherichia coli and was confirmed after 7 days microbiologically and histologically. Then, 25 BP rats were randomly divided into five treatment groups: model group, positive group, FT-3 day group, FT-7 day group and FT-14 day group. Ventral lobes of the prostate from all animals were removed, and serum samples were collected at the end of the experiments. Microbiological cultures and histological findings of the prostate samples demonstrated reduced bacterial growth and improved inflammatory responses in the FT-treatment groups compared with the model group, indicating that FT showed good antibacterial effects against prostatic infection induced by E. coli. Moreover, the plasma pharmacokinetics and prostatic distribution of fosfomycin were studied and compared in BP and normal rats. The concentrations of fosfomycin in samples were analysed by liquid chromatography-tandem mass spectrometry. There were no differences in plasma pharmacokinetic parameters between the two groups, but significantly higher penetration of fosfomycin into prostatic tissues was found in BP rats. We therefore suggest that FT has a good therapeutic effect on BP and might be used in treating male reproductive system diseases. © 2018 Blackwell Verlag GmbH.
Schwantes-An, Tae-Hwi; Sung, Heejong; Sabourin, Jeremy A; Justice, Cristina M; Sorant, Alexa J M; Wilson, Alexander F
2016-01-01
In this study, the effects of (a) the minor allele frequency of the single nucleotide variant (SNV), (b) the degree of departure from normality of the trait, and (c) the position of the SNVs on type I error rates were investigated in the Genetic Analysis Workshop (GAW) 19 whole exome sequence data. To test the distribution of the type I error rate, 5 simulated traits were considered: standard normal and gamma distributed traits; 2 transformed versions of the gamma trait (log10 and rank-based inverse normal transformations); and trait Q1 provided by GAW 19. Each trait was tested with 313,340 SNVs. Tests of association were performed with simple linear regression and average type I error rates were determined for minor allele frequency classes. Rare SNVs (minor allele frequency < 0.05) showed inflated type I error rates for non-normally distributed traits that increased as the minor allele frequency decreased. The inflation of average type I error rates increased as the significance threshold decreased. Normally distributed traits did not show inflated type I error rates with respect to the minor allele frequency for rare SNVs. There was no consistent effect of transformation on the uniformity of the distribution of the location of SNVs with a type I error.
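The rank-based inverse normal transformation used for one of the traits can be sketched as follows. The Blom offset c = 3/8 is our choice of convention (the abstract does not specify one), and the gamma-like data are simulated, not the workshop data:

```python
import random
from statistics import NormalDist

def rank_based_inverse_normal(values, c=3 / 8):
    """Rank-based inverse normal transformation: map ranks to standard
    normal quantiles via p = (rank - c) / (n - 2c + 1), so the
    transformed trait is approximately N(0, 1). Assumes no ties."""
    n = len(values)
    nd = NormalDist()
    order = sorted(range(n), key=lambda i: values[i])
    out = [0.0] * n
    for rank, i in enumerate(order, start=1):
        out[i] = nd.inv_cdf((rank - c) / (n - 2 * c + 1))
    return out

random.seed(1)
# A right-skewed, gamma-like trait (continuous, so ties are negligible)
trait = [random.gammavariate(1.0, 2.0) for _ in range(1001)]
z = rank_based_inverse_normal(trait)

mean_z = sum(z) / len(z)
sd_z = (sum((v - mean_z) ** 2 for v in z) / len(z)) ** 0.5
print(round(mean_z, 6), round(sd_z, 3))
```

After the transform the trait is symmetric with mean ~0 and standard deviation ~1, which is why it removes the type I error inflation seen with the untransformed gamma trait.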
Yang, Tao; Liu, Shan; Wang, Chang-Hong; Tao, Yan-Yan; Zhou, Hua; Liu, Cheng-Hai
2015-10-10
Fuzheng Huayu recipe (FZHY) is a herbal product for the treatment of liver fibrosis approved by the Chinese State Food and Drug Administration (SFDA), but its pharmacokinetics and tissue distribution had not been investigated. In this study, the liver fibrotic model was induced with intraperitoneal injection of dimethylnitrosamine (DMN), and FZHY was given orally to the model and normal rats. The plasma pharmacokinetics and tissue distribution profiles of four major bioactive components from FZHY were analyzed in the normal and fibrotic rat groups using an ultrahigh performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method. Results revealed that the bioavailabilities of danshensu (DSS), salvianolic acid B (SAB) and rosmarinic acid (ROS) in liver fibrotic rats increased 1.49-, 3.31- and 2.37-fold, respectively, compared to normal rats. There was no obvious difference in the pharmacokinetics of amygdalin (AMY) between the normal and fibrotic rats. The tissue distribution of DSS, SAB, and AMY tended to be highest in the kidney and lung. The distribution of DSS, SAB, and AMY in liver tissue of the model rats was significantly decreased compared to the normal rats. Significant differences in the pharmacokinetics and tissue distribution profiles of DSS, ROS, SAB and AMY were observed in rats with hepatic fibrosis after oral administration of FZHY. These results provide a meaningful basis for developing a clinical dosage regimen in the treatment of hepatic fibrosis by FZHY. Copyright © 2015 Elsevier B.V. All rights reserved.
Plancade, Sandra; Rozenholc, Yves; Lund, Eiliv
2012-12-11
Illumina BeadArray technology includes non-specific negative control features that allow a precise estimation of the background noise. As an alternative to the background subtraction proposed in BeadStudio, which leads to an important loss of information by generating negative values, a background correction method modeling the observed intensities as the sum of an exponentially distributed signal and normally distributed noise has been developed. Nevertheless, Wang and Ye (2012) display a kernel-based estimator of the signal distribution on Illumina BeadArrays and suggest that a gamma distribution would represent a better modeling of the signal density. Hence, the normal-exponential modeling may not be appropriate for Illumina data and background corrections derived from this model may lead to wrong estimation. We propose a more flexible modeling based on a gamma distributed signal and normally distributed background noise, and develop the associated background correction, implemented in the R package NormalGamma. Our model proves to be markedly more accurate for modeling Illumina BeadArrays: on the one hand, it is shown on two types of Illumina BeadChips that this model offers a more correct fit of the observed intensities; on the other hand, the comparison of the operating characteristics of several background correction procedures on spike-in and on normal-gamma simulated data shows high similarities, reinforcing the validation of the normal-gamma modeling. The performance of the background corrections based on the normal-gamma and normal-exponential models is compared on two dilution data sets, through testing procedures which represent various experimental designs. Surprisingly, we observe that the implementation of a more accurate parametrisation in the model-based background correction does not increase the sensitivity. These results may be explained by the operating characteristics of the estimators: the normal-gamma background correction offers an improvement
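The normal-gamma convolution model can be illustrated by simulation. This is not the NormalGamma R package, just a moment-level sanity check that, under the model, E[obs] = kθ + μ and Var[obs] = kθ² + σ² (all parameter values are made up):

```python
import random

random.seed(2)

# Model: observed intensity = gamma-distributed signal + normal noise
k, theta = 2.0, 50.0      # gamma shape and scale of the signal (illustrative)
mu, sigma = 100.0, 10.0   # mean and SD of the background noise (illustrative)

observed = [random.gammavariate(k, theta) + random.gauss(mu, sigma)
            for _ in range(20000)]

# Sample moments should match E[obs] = k*theta + mu = 200
# and Var[obs] = k*theta**2 + sigma**2 = 5100
n = len(observed)
mean = sum(observed) / n
var = sum((x - mean) ** 2 for x in observed) / (n - 1)
print(round(mean, 1), round(var, 0))
```

A model-based background correction then estimates (k, θ, μ, σ) from the observed intensities and the negative controls, and replaces each observation by the expected signal given the observation.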
Markiewicz, Erica; Fan, Xiaobing; Mustafi, Devkumar; Zamora, Marta; Conzen, Suzanne D; Karczmar, Gregory S
2017-07-01
High resolution 3D MRI was used to study contrast agent distribution and leakage in normal mouse mammary glands and glands containing in situ cancer after intra-ductal injection. Five female FVB/N mice (~19 weeks old) with no detectable mammary cancer and eight C3(1) SV40 Tag virgin female mice (~15 weeks old) with extensive in situ cancer were studied. A 34G, 45° tip Hamilton needle with a 25 μL Hamilton syringe was inserted into the tip of the nipple and approximately 15 μL of gadodiamide was injected slowly over 1 min into the nipple and throughout the duct on one side of the inguinal gland. Following injection, the mouse was placed in a 9.4 T MRI scanner, and a series of high resolution 3D T1-weighted images was acquired with a temporal resolution of 9.1 min to follow contrast agent leakage from the ducts. The first image was acquired at about 12 min after injection. The region of ductal enhancement detected in images acquired between 12 and 21 min after contrast agent injection was five times smaller in SV40 mouse mammary ducts, consistent with faster leakage of contrast agent from the SV40 ducts. The contrast agent washout rate measured between 12 min and 90 min after injection was ~20% faster (p < 0.004) in SV40 mammary ducts than in FVB/N mammary ducts. These results may be due to higher permeability of the SV40 ducts, likely due to the presence of in situ cancers. Therefore, increased permeability of ducts may indicate early stage breast cancers. Copyright © 2017 Elsevier Inc. All rights reserved.
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
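The properties of the standard normal distribution discussed in the article can be checked directly with the Python standard library; this sketch simply reproduces the familiar coverage figures used when forming confidence intervals:

```python
from statistics import NormalDist

# The standard normal distribution: mean 0, standard deviation 1
std_normal = NormalDist(mu=0, sigma=1)

# Probability mass within +/-1 SD, and within the familiar +/-1.96 SD band
within_1sd = std_normal.cdf(1) - std_normal.cdf(-1)
within_196 = std_normal.cdf(1.96) - std_normal.cdf(-1.96)
print(round(within_1sd, 4), round(within_196, 4))  # ~0.6827 and ~0.9500
```

These are the two constants behind the rules of thumb "about 68% of observations fall within one standard deviation" and "a 95% confidence interval is the estimate ± 1.96 standard errors".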
Sarikaya, Ismet; Elgazzar, Abdelhamid H; Sarikaya, Ali; Alfeeli, Mahmoud
2017-10-01
Fluorine-18 sodium fluoride (18F-NaF) PET/CT is a relatively new and high-resolution bone imaging modality. Since the use of 18F-NaF PET/CT has been increasing, it is important to accurately assess the images and be aware of normal distribution and major artifacts. In this pictorial review article, we describe the normal uptake patterns of 18F-NaF in bone tissues, particularly in complex structures, as well as its physiologic soft tissue distribution and certain artifacts seen on 18F-NaF PET/CT images.
Liu, Geng; Niu, Junjie; Zhang, Chao; Guo, Guanlin
2015-12-01
Data distribution is usually skewed severely by the presence of hot spots in contaminated sites. This causes difficulties for accurate geostatistical data transformation. Three types of typical normal distribution transformation methods termed the normal score, Johnson, and Box-Cox transformations were applied to compare the effects of spatial interpolation with normal distribution transformation data of benzo(b)fluoranthene in a large-scale coking plant-contaminated site in north China. Three normal transformation methods decreased the skewness and kurtosis of the benzo(b)fluoranthene, and all the transformed data passed the Kolmogorov-Smirnov test threshold. Cross validation showed that Johnson ordinary kriging has a minimum root-mean-square error of 1.17 and a mean error of 0.19, which was more accurate than the other two models. The area with fewer sampling points and that with high levels of contamination showed the largest prediction standard errors based on the Johnson ordinary kriging prediction map. We introduce an ideal normal transformation method prior to geostatistical estimation for severely skewed data, which enhances the reliability of risk estimation and improves the accuracy for determination of remediation boundaries.
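The effect of a power transformation on skewed data can be sketched as below; the Box-Cox family with λ = 0 reduces to the log transform. The data here are simulated from a log-normal (a stand-in for hot-spot-skewed concentrations, not the site's benzo(b)fluoranthene measurements):

```python
import math
import random

def skewness(xs):
    """Sample skewness (third standardized moment)."""
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 3 for x in xs) / (n * s2 ** 1.5)

def box_cox(xs, lam):
    """Box-Cox power transform for strictly positive data."""
    if lam == 0:
        return [math.log(x) for x in xs]
    return [(x ** lam - 1) / lam for x in xs]

random.seed(3)
# Heavily right-skewed data, as produced by contamination hot spots
data = [random.lognormvariate(0, 1) for _ in range(5000)]

s_raw = skewness(data)
s_log = skewness(box_cox(data, 0))  # lambda = 0: the log transform
print(round(s_raw, 2), round(s_log, 3))
```

The transform drives the skewness toward zero, which is the precondition for the kriging step; the paper's point is that which member of the transformation family (normal score, Johnson, Box-Cox) works best is an empirical question settled by cross validation.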
Onishi, Airin; Fujiwara, Yoshinori; Ishiwata, Kiichi; Ishii, Kenji
2017-01-01
Background: Increasing plasma glucose levels and insulin resistance can alter the distribution pattern of fluorine-18-labeled fluorodeoxyglucose (18F-FDG) in the brain and relatively reduce 18F-FDG uptake in Alzheimer's disease (AD)-related hypometabolic regions, leading to the appearance of an AD-like pattern. However, its relationship with plasma insulin levels is unclear. We aimed to compare the effects of plasma glucose levels, plasma insulin levels and insulin resistance on the appearance of the AD-like pattern in 18F-FDG images. Methods: Fifty-nine cognitively normal older subjects (age = 75.7 ± 6.4 years) underwent 18F-FDG positron emission tomography along with measurement of plasma glucose and insulin levels. As an index of insulin resistance, the Homeostasis model assessment of Insulin Resistance (HOMA-IR) was calculated. Results: Plasma glucose levels, plasma insulin levels, and HOMA-IR were 102.2 ± 8.1 mg/dL, 4.1 ± 1.9 μU/mL, and 1.0 ± 0.5, respectively. Whole-brain voxelwise analysis showed a negative correlation of 18F-FDG uptake with plasma glucose levels in the precuneus and lateral parietotemporal regions (cluster-corrected p < 0.05), and no correlation with plasma insulin levels or HOMA-IR. In the significant cluster, 18F-FDG uptake decreased by approximately 4–5% when plasma glucose levels increased by 20 mg/dL. In the precuneus region, volume-of-interest analysis confirmed a negative correlation of 18F-FDG uptake with plasma glucose levels (r = -0.376, p = 0.002), and no correlation with plasma insulin levels (r = 0.156, p = 0.12) or HOMA-IR (r = 0.096, p = 0.24). Conclusion: This study suggests that, of the three parameters, plasma glucose levels have the greatest effect on the appearance of the AD-like pattern in 18F-FDG images. PMID:28715453
Directory of Open Access Journals (Sweden)
Kenji Ishibashi
Full Text Available Increasing plasma glucose levels and insulin resistance can alter the distribution pattern of fluorine-18-labeled fluorodeoxyglucose (18F-FDG) in the brain and relatively reduce 18F-FDG uptake in Alzheimer's disease (AD)-related hypometabolic regions, leading to the appearance of an AD-like pattern. However, its relationship with plasma insulin levels is unclear. We aimed to compare the effects of plasma glucose levels, plasma insulin levels and insulin resistance on the appearance of the AD-like pattern in 18F-FDG images. Fifty-nine cognitively normal older subjects (age = 75.7 ± 6.4 years) underwent 18F-FDG positron emission tomography along with measurement of plasma glucose and insulin levels. As an index of insulin resistance, the Homeostasis model assessment of Insulin Resistance (HOMA-IR) was calculated. Plasma glucose levels, plasma insulin levels, and HOMA-IR were 102.2 ± 8.1 mg/dL, 4.1 ± 1.9 μU/mL, and 1.0 ± 0.5, respectively. Whole-brain voxelwise analysis showed a negative correlation of 18F-FDG uptake with plasma glucose levels in the precuneus and lateral parietotemporal regions (cluster-corrected p < 0.05), and no correlation with plasma insulin levels or HOMA-IR. In the significant cluster, 18F-FDG uptake decreased by approximately 4-5% when plasma glucose levels increased by 20 mg/dL. In the precuneus region, volume-of-interest analysis confirmed a negative correlation of 18F-FDG uptake with plasma glucose levels (r = -0.376, p = 0.002), and no correlation with plasma insulin levels (r = 0.156, p = 0.12) or HOMA-IR (r = 0.096, p = 0.24). This study suggests that, of the three parameters, plasma glucose levels have the greatest effect on the appearance of the AD-like pattern in 18F-FDG images.
International Nuclear Information System (INIS)
Fay, D.; Layzer, A.
1975-01-01
The Berk–Schrieffer method of strong-coupling superconductivity for nearly ferromagnetic systems is generalized to arbitrary L-state pairing and realistic (hard-core) potentials. Application to ³He yields a P-state transition but very low values for T_c and an unsatisfactory normal-state momentum distribution
Kalayeh, H. M.; Landgrebe, D. A.
1983-01-01
A criterion which measures the quality of the estimate of the covariance matrix of a multivariate normal distribution is developed. Based on this criterion, the necessary number of training samples is predicted. Experimental results which are used as a guide for determining the number of training samples are included. Previously announced in STAR as N82-28109
Ohmaru, Natsuki; Nakatsu, Takaaki; Izumi, Reishi; Mashima, Keiichi; Toki, Misako; Kobayashi, Asako; Ogawa, Hiroko; Hirohata, Satoshi; Ikeda, Satoru; Kusachi, Shozo
2011-01-01
Even high-normal albuminuria is reportedly associated with cardiovascular events. We determined the urine albumin-creatinine ratio (UACR) in spot urine samples and analyzed the UACR distribution and the prevalence of high-normal levels. The UACR was determined using immunoturbidimetry in 332 untreated asymptomatic non-diabetic Japanese patients with hypertension and in 69 control subjects. Microalbuminuria and macroalbuminuria were defined as a UACR ≥30 and ≥300 µg/mg·creatinine, respectively. The distribution patterns showed a highly skewed distribution for the lower levels, and a common logarithmic transformation produced a close fit to a Gaussian distribution with median, 25th and 75th percentile values of 22.6, 13.5 and 48.2 µg/mg·creatinine, respectively. When a high-normal UACR was set at >20 µg/mg·creatinine, 19.9% (66/332) of the hypertensive patients exhibited a high-normal UACR. Microalbuminuria and macroalbuminuria were observed in 36.1% (120/332) and 2.1% (7/332) of the patients, respectively. The UACR was significantly correlated with the systolic and diastolic blood pressures and the pulse pressure. A stepwise multivariate analysis revealed that these pressures as well as age were independent factors that increased the UACR. The UACR distribution exhibited a highly skewed pattern, with approximately 60% of untreated, non-diabetic hypertensive patients exhibiting a high-normal or larger UACR. Both hypertension and age are independent risk factors that increase the UACR. The present study indicated that a considerable percentage of patients require anti-hypertensive drugs with antiproteinuric effects at the start of treatment.
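The common-logarithmic transformation described above, which maps a highly right-skewed UACR-like distribution onto an approximately Gaussian one, can be sketched as follows. The synthetic log-normal data are hypothetical, chosen only to mimic the reported median of 22.6 µg/mg·creatinine; the shape parameter 0.9 is an arbitrary illustrative choice.

```python
import math
import random

def skewness(data):
    # Sample skewness (third standardized moment).
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / n)
    return sum(((x - mean) / sd) ** 3 for x in data) / n

random.seed(1)
# Hypothetical UACR-like values: log-normal with median ~22.6
raw = [random.lognormvariate(math.log(22.6), 0.9) for _ in range(5000)]
logs = [math.log10(x) for x in raw]
# The raw values are strongly right-skewed; their common logarithms
# are close to symmetric (Gaussian), as reported in the abstract.
```

On data like these, skewness(raw) is large and positive while skewness(logs) is near zero, which is what justifies reporting percentiles on the log scale.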
Yuan, Ke-Hai
2008-01-01
In the literature of mean and covariance structure analysis, the noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under the alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…
Moustafa, Abdelmoniem; Abi-Saleh, Bernard; El-Baba, Mohammad; Hamoui, Omar; AlJaroudi, Wael
2016-02-01
In patients presenting with non-ST-elevation myocardial infarction (NSTEMI), left anterior descending (LAD) coronary artery and three-vessel disease are the most commonly encountered culprit lesions in the presence of ST depression, while one third of patients with left circumflex (LCX) artery-related infarction have a normal ECG. We sought to determine the predictors of the presence of a culprit lesion in NSTEMI patients based on ECG, echocardiographic, and clinical characteristics. Patients admitted to the coronary care unit with the diagnosis of NSTEMI between June 2012 and December 2013 were retrospectively identified. The admission ECG was interpreted by an electrophysiologist who was blinded to the result of the coronary angiogram. Patients were dichotomized into either a normal or an abnormal ECG group. The primary endpoint was the presence of a culprit lesion. Secondary endpoints included length of stay, re-hospitalization within 60 days, and in-hospital mortality. A total of 118 patients were identified: 47 with normal and 71 with abnormal ECG. At least one culprit lesion was identified in 101 patients (86%), and significantly more among those with abnormal ECG (91.5% vs. 76.6%, P=0.041). The LAD was the most frequently detected culprit lesion in both groups. There was a higher incidence of two- and three-vessel disease in the abnormal ECG group (P=0.041). On the other hand, there was a trend toward higher LCX involvement (25% vs. 13.8%, P=0.18) and more normal coronary arteries in the normal ECG group (23.4% vs. 8.5%, P=0.041). On multivariate analysis, prior history of coronary artery disease (CAD) [odds ratio (OR) 6.4 (0.8-52)], male gender [OR 5.0 (1.5-17)], and abnormal admission ECG [OR 3.6 (1.12-12)] were independent predictors of a culprit lesion. There was no difference in secondary endpoints between those with normal and abnormal ECG. Among patients presenting with NSTEMI, prior history of CAD, male gender and abnormal admission ECG were independent predictors of a culprit lesion.
Neel, John H.; Stallings, William M.
An influential statistics text recommends the Levene test for homogeneity of variance. A recent note suggests that Levene's test is upwardly biased for small samples. Another report shows inflated alpha estimates and low power. Neither study utilized more than two sample sizes. This Monte Carlo study involved sampling from a normal population for…
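The Levene test under discussion can be computed from scratch. This minimal sketch implements the mean-centered Levene statistic (a one-way ANOVA F computed on absolute deviations from group means) for two groups and shows it reacting to a variance difference; the sample sizes and seed are arbitrary, and no F-distribution critical values are computed.

```python
import random

def levene_statistic(groups):
    # Levene's W: an ANOVA F-statistic on the absolute deviations
    # Z_ij = |x_ij - mean_i| (the mean-centered variant of the test).
    k = len(groups)
    z = [[abs(x - sum(g) / len(g)) for x in g] for g in groups]
    n = sum(len(g) for g in groups)
    zbar_i = [sum(zi) / len(zi) for zi in z]
    zbar = sum(sum(zi) for zi in z) / n
    between = sum(len(zi) * (zb - zbar) ** 2 for zi, zb in zip(z, zbar_i))
    within = sum(sum((zij - zb) ** 2 for zij in zi) for zi, zb in zip(z, zbar_i))
    return ((n - k) / (k - 1)) * between / within

random.seed(5)
# Two samples from the same normal population (H0: equal variances).
a = [random.gauss(0.0, 1.0) for _ in range(30)]
b = [random.gauss(0.0, 1.0) for _ in range(30)]
w_equal = levene_statistic([a, b])
# Tripling one group's spread should inflate the statistic.
c = [3.0 * x for x in b]
w_unequal = levene_statistic([a, c])
```

A Monte Carlo study like the one described would repeat such draws many times and count how often the statistic exceeds the nominal critical value, estimating the empirical alpha.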
International Nuclear Information System (INIS)
Silva, Mario; Nemec, Stefan F.; Dufresne, Valerie; Occhipinti, Mariaelena; Heidinger, Benedikt H.; Bankier, Alexander A.; Chamberlain, Ryan
2016-01-01
Pulmonary parametric response map (PRM) was proposed for quantitative densitometric phenotypization of chronic obstructive pulmonary disease. However, little is known about this technique in healthy subjects. The purpose of this study was to describe the normal spectrum of densitometric classification of pulmonary PRM in a group of healthy adults. Fifteen healthy volunteers underwent spirometrically monitored chest CT at total lung capacity (TLC) and functional residual capacity (FRC). The paired CT scans were analyzed by PRM for voxel-by-voxel characterization of lung parenchyma according to 4 densitometric classifications: normal lung (TLC ≥ -950 HU, FRC ≥ -856 HU); expiratory low attenuation area (LAA) (TLC ≥ -950 HU, FRC < -856 HU); dual LAA (TLC < -950 HU, FRC < -856 HU); uncharacterized (TLC < -950 HU, FRC ≥ -856 HU). The PRM spectrum was 78 % ± 10 % normal lung, 20 % ± 8 % expiratory LAA, and 1 % ± 1 % dual LAA. PRM was similar between genders; there was a moderate correlation between dual LAA and spirometrically assessed TLC (R = 0.531; p = 0.042), and between expiratory LAA and the Vol Exp/Insp ratio (R = -0.572; p = 0.026). PRM reflects the predominance of normal lung parenchyma in a group of healthy volunteers. However, PRM also confirms the presence of physiological expiratory LAA seemingly related to air trapping and a minimal amount of dual LAA likely reflecting emphysema. (orig.)
International Nuclear Information System (INIS)
Miyamoto, H.; Kubo, M.; Katori, T.
1981-01-01
Experimental investigation by 3-D photoelasticity has been carried out to measure the stress distribution of partial penetration welded nozzles attached to the bottom head of a pressure vessel. A 3-D photoelastic stress freezing method was chosen as the most effective means of observation of the stress distribution in the vicinity of the nozzle/wall weld. The experimental model was a 1:20 scale spherical bottom head. Both an axisymmetric nozzle and an asymmetric nozzle were investigated. Epoxy resin, which is a thermosetting plastic, was used as the model material. The oblique effect was examined by comparing the stress distribution of the asymmetric nozzle with that of the axisymmetric nozzle. Furthermore, the experimental results were compared with the analytical results using 3-D finite element method (FEM). The stress distributions obtained from the frozen fringe pattern of the 3-D photoelastic model were in good agreement with those by 3-D FEM. (orig.)
International Nuclear Information System (INIS)
Nakajo, M.; Shapiro, B.; Copp, J.; Kalff, V.; Gross, M.D.; Sisson, J.C.; Beierwaltes, W.H.
1983-01-01
The scintigraphic distribution of m-[131I]iodobenzylguanidine (I-131 MIBG), an adrenal medullary imaging agent, was studied to determine the patterns of uptake of this agent in man. The normal distribution of I-131 MIBG includes clear portrayal of the salivary glands, liver, spleen, and urinary bladder. The heart, middle and lower lung zones, and colon were less frequently or less clearly seen. The upper lung zones and kidneys were seldom visualized. The thyroid appeared only in cases of inadequate thyroidal blockade. The "normal" adrenal glands were seldom seen and faintly imaged in 2% at 24 hr after injection and in 16% at 48 hr, in patients shown not to have pheochromocytomas, whereas intra-adrenal, extra-adrenal, and malignant pheochromocytomas usually appeared as intense focal areas of I-131 MIBG uptake at 24 through 72 hr
Energy Technology Data Exchange (ETDEWEB)
Yasui, Hiroyuki; Takino, Toshikazu; Fugono, Jun; Sakurai, Hiromu [Department of Analytical and Bioinorganic Chemistry, Kyoto Pharmaceutical University, Kyoto (Japan); Hirunuma, Rieko; Enomoto, Shuichi [Radioisotope Technology Division, Cyclotron Center, Institute of Physical and Chemical Research (RIKEN), Wako, Saitama (Japan)
2001-05-01
Because vanadium ions such as vanadyl (VO2+) and vanadate (VO3-) ions were demonstrated to normalize blood glucose levels of diabetic animals and patients, the action mechanism of vanadium treatment has been of interest. In this study, we focused on understanding interactions among trace elements in diabetic rats, in which a multitracer technique was used. The effects of vanadyl sulfate (VS) treatment on the tissue distribution of trace vanadium (48V) and zinc (65Zn) in normal and streptozotocin (STZ)-induced diabetic rats were examined, and were evaluated in terms of the uptake ratio. The uptake ratio of both elements in tissues significantly changed between STZ-rats and those treated with VS. These results indicated that vanadium treatment in STZ-rats alters the tissue distribution of endogenous elements, suggesting the importance of the relationship between biotrace elements and pathophysiology. (author)
Nishimura, Meiko; Hayashi, Mitsuhiro; Mizutani, Yu; Takenaka, Kei; Imamura, Yoshinori; Chayahara, Naoko; Toyoda, Masanori; Kiyota, Naomi; Mukohara, Toru; Aikawa, Hiroaki; Fujiwara, Yasuhiro; Hamada, Akinobu; Minami, Hironobu
2018-04-06
The development of skin rashes is the most common adverse event observed in cancer patients treated with epidermal growth factor receptor tyrosine kinase inhibitors such as erlotinib. However, the pharmacological evidence has not been fully revealed. We examined patients with advanced pancreatic cancer who developed skin rashes after treatment with erlotinib and gemcitabine. We biopsied both the rash and adjacent normal skin tissues, and visualized and compared the distribution of erlotinib within the skin using matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI-MSI). The tissue concentration of erlotinib was also measured by liquid chromatography-tandem mass spectrometry (LC-MS/MS) with laser microdissection. Erlotinib distribution in the rashes was more heterogeneous than that in the normal skin, and the rashes contained statistically higher concentrations of erlotinib than adjacent normal skin in the superficial skin layer (229 ± 192 vs. 120 ± 103 ions/mm²; P = 0.009 in paired t-test). LC-MS/MS confirmed that the concentration of erlotinib in the skin rashes was higher than that in normal skin in the superficial skin layer (1946 ± 1258 vs. 1174 ± 662 ng/cm³; P = 0.028 in paired t-test). The results of MALDI-MSI and LC-MS/MS were well correlated (coefficient of correlation 0.879). The distribution of erlotinib in the skin tissue was visualized using non-labeled MALDI-MSI, and the erlotinib concentration in the superficial layer of the skin rashes was higher than that in the adjacent normal skin.
International Nuclear Information System (INIS)
Mattke, U.H.
1991-08-01
The fission product release during normal operation from the core of a high-temperature reactor is well known to be very low. A HTR-Modul reactor with a reduced thermal power of 170 MWth is examined under the aspect of whether the contamination with Cs-137, as the most important nuclide, will be low enough that a helium turbine in the primary circuit is possible. The program SPTRAN is the tool for the computations and simulations of fission product transport in HTRs. The program, initially developed for computations of accident events, has been enlarged for computing fission product transport under the conditions of normal operation. The theoretical basis, the programs used and the data basis are presented, followed by the results of the computations. These results are explained and discussed; moreover, the consequences and future possibilities of development are shown. (orig./HP)
An, Jing; Hu, Fangdi; Wang, Changhong; Zhang, Zijia; Yang, Li; Wang, Zhengtao
2016-09-01
1. Pinoresinol di-O-β-d-glucopyranoside (PDG), geniposide (GE), geniposidic acid (GA), aucubin (AN) and chlorogenic acid (CA) are the representative active ingredients in Eucommiae cortex (EC), which may be estrogenic. 2. The ultra-high-performance liquid chromatography/tandem mass spectrometry (UHPLC-MS/MS) method for simultaneous determination of the five ingredients showed good linearity, low limits of quantification and high extraction recoveries, as well as acceptable precision, accuracy and stability in mice plasma and tissue samples (liver, spleen, kidney and uterus). It was successfully applied to a comparative study of the pharmacokinetics and tissue distribution of PDG, GE, GA, AN and CA between normal and ovariectomized (OVX) mice. 3. The results indicated that, except for CA, the plasma and tissue concentrations of PDG, GE and GA in OVX mice were all greater than those in normal mice. AN could only be detected in the plasma and liver homogenate of normal mice; it was poorly absorbed in OVX mice and low in the other measured tissues. PDG, GE and GA seem to be better absorbed in OVX mice than in normal mice, as shown by the remarkably increased values of AUC0-∞ and Cmax. It is beneficial that PDG, GE and GA have better plasma absorption and tissue distribution in the pathological state.
Aragao, Glaucia M F; Corradini, Maria G; Normand, Mark D; Peleg, Micha
2007-11-01
Published survival curves of Escherichia coli in two growth media, with and without the presence of salt, at various temperatures and in a Greek eggplant salad having various levels of essential oil, all had a characteristic downward concavity when plotted on semilogarithmic coordinates. Some also exhibited what appeared as a 'shoulder' of considerable length. Regardless of whether a shoulder was noticed, the survival pattern could be considered as a manifestation of an underlying unimodal distribution of the cells' death times. Mathematically, the data could be described equally well by the Weibull and log-normal distribution functions, which had similar modes, means, standard deviations and coefficients of skewness. When plotted in their probability density function (PDF) form, the curves also appeared very similar visually. This enabled us to quantify and compare the effect of temperature or essential oil concentration on the organism's survival in terms of these temporal distributions' characteristics. Increased lethality was generally expressed in a shorter mean and mode, a smaller standard deviation and increased overall symmetry as judged by the distributions' degree of skewness. The 'shoulder', as expected, simply indicated that the distribution's standard deviation was much smaller than its mode. Rate models based on the two distribution functions could be used to predict non-isothermal survival patterns. They were derived on the assumption that the momentary inactivation rate is the isothermal rate at the momentary temperature at a time that corresponds to the momentary survival ratio. In this application, however, the Weibullian model with a fixed power was not only simpler and more convenient mathematically than the one based on the log-normal distribution, but it also provided more accurate estimates of the dynamic inactivation patterns.
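A minimal sketch of a fixed-power Weibullian survival model of the kind mentioned above, log10 S(t) = -b * t**n, showing the downward concavity on semilogarithmic coordinates; the parameter values are hypothetical illustrations, not values fitted to the published E. coli data.

```python
def log_survival_weibull(t, b, n):
    # Weibullian semi-logarithmic survival model: log10 S(t) = -b * t**n.
    return -b * t ** n

# A shape power n > 1 produces the downward concavity (and apparent
# 'shoulder') seen on semilogarithmic coordinates; b sets overall lethality.
b, n = 0.002, 2.0  # hypothetical parameters
curve = [log_survival_weibull(t, b, n) for t in range(0, 61, 5)]

# Negative second differences confirm the curve is concave down.
second_diffs = [curve[i + 1] - 2 * curve[i] + curve[i - 1]
                for i in range(1, len(curve) - 1)]
```

With n = 1 the same model reduces to classical log-linear (first-order) inactivation, which is why the fixed-power Weibullian form is a convenient generalization.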
International Nuclear Information System (INIS)
Majewski, Wojciech; Wesolowska, Iwona; Urbanczyk, Hubert; Hawrylewicz, Leszek; Schwierczok, Barbara; Miszczyk, Leszek
2009-01-01
Purpose: To estimate bladder movements and changes in dose distribution in the bladder and surrounding tissues associated with changes in bladder filling, and to estimate the internal treatment margins. Methods and Materials: A total of 16 patients with bladder cancer underwent planning computed tomography scans with 80- and 150-mL bladder volumes. The bladder displacements associated with the change in volume were measured. Each patient had treatment plans constructed for a 'partially empty' (80 mL) and a 'partially full' (150 mL) bladder. An additional plan was constructed for tumor irradiation alone. A subsequent 9 patients underwent sequential weekly computed tomography scanning during radiotherapy to verify the bladder movements and estimate the internal margins. Results: Bladder movements were mainly observed cranially, and the estimated internal margins were nonuniform and largest (>2 cm) anteriorly and cranially. The dose distribution in the bladder worsened if the bladder increased in volume: 70% of patients (11 of 16) would have had the bladder underdosed. The dose distribution in the surrounding normal tissues was better with a 'partially empty' bladder (the volume that received >70%, 80%, and 90% of the prescribed dose was 23%, 20%, and 15% for the rectum and 162, 144, and 123 cm³ for the intestines, respectively) than with a 'partially full' bladder (the volume that received >70%, 80%, and 90% of the prescribed dose was 28%, 24%, and 18% for the rectum and 180, 158, and 136 cm³ for the intestines, respectively). The change in bladder filling during RT was significant for the dose distribution in the intestines. Tumor irradiation alone was significantly better than whole bladder irradiation in terms of organ sparing. Conclusion: The displacements of the bladder due to volume changes were mainly related to the upper wall. The internal margins should be nonuniform, with the largest margins cranially and anteriorly. The changes in bladder filling during RT could influence the dose distribution in the bladder and intestines. The dose distribution in the rectum and bowel was slightly better with a 'partially empty' bladder.
Paiva, B; Pérez-Andrés, M; Vídriales, M-B; Almeida, J; de las Heras, N; Mateos, M-V; López-Corral, L; Gutiérrez, N C; Blanco, J; Oriol, A; Hernández, M T; de Arriba, F; de Coca, A G; Terol, M-J; de la Rubia, J; González, Y; Martín, A; Sureda, A; Schmidt-Hieber, M; Schmitz, A; Johnsen, H E; Lahuerta, J-J; Bladé, J; San-Miguel, J F; Orfao, A
2011-04-01
Disappearance of normal bone marrow (BM) plasma cells (PC) predicts malignant transformation of monoclonal gammopathy of undetermined significance (MGUS) and smoldering myeloma (SMM) into symptomatic multiple myeloma (MM). The homing, behavior and survival of normal PC, but also CD34(+) hematopoietic stem cells (HSC), B-cell precursors, and clonal PC largely depends on their interaction with stromal cell-derived factor-1 (SDF-1) expressing, potentially overlapping BM stromal cell niches. Here, we investigate the distribution, phenotypic characteristics and competitive migration capacity of these cell populations in patients with MGUS, SMM and MM vs healthy adults (HA) aged >60 years. Our results show that BM and peripheral blood (PB) clonal PC progressively increase from MGUS to MM, the latter showing a slightly more immature immunophenotype. Of note, such increased number of clonal PC is associated with progressive depletion of normal PC, B-cell precursors and CD34(+) HSC in the BM, also with a parallel increase in PB. In an ex vivo model, normal PC, B-cell precursors and CD34(+) HSC from MGUS and SMM, but not MM patients, were able to abrogate the migration of clonal PC into serial concentrations of SDF-1. Overall, our results show that progressive competition and replacement of normal BM cells by clonal PC is associated with more advanced disease in patients with MGUS, SMM and MM.
International Nuclear Information System (INIS)
Fujine, Sachio; Uchiyama, Gunzou; Sugikawa, Susumu; Maeda, Mitsuru; Tsujino, Takeshi.
1989-10-01
Tritium distribution ratios between the organic and aqueous phases were measured for the system of 30% tributyl phosphate (TBP)-normal dodecane (nDD)/uranyl nitrate-nitric acid water. It was confirmed that tritium is extracted by TBP into the organic phase in both chemical forms of tritiated water (HTO) and tritiated nitric acid (TNO3). The value of the tritium distribution ratio ranged from 0.002 to 0.005 for the conditions of 0-6 mol/L nitric acid, 0.5-800 mCi/L tritium in the aqueous phase, and 0-125 g-U/L uranium in the organic phase. The isotopic distribution coefficient of tritium between the organic and aqueous phases was observed to be about 0.95. (author)
Tsunoda, A; Mitsuoka, H; Sato, K; Kanayama, S
2000-06-01
Our purpose was to quantify the intracranial cerebrospinal fluid (CSF) volume components using an original MRI-based segmentation technique and to investigate whether a CSF volume index is useful for diagnosis of normal pressure hydrocephalus (NPH). We studied 59 subjects: 16 patients with NPH, 14 young and 13 elderly normal volunteers, and 16 patients with cerebrovascular disease. Images were acquired on a 1.5-T system, using a 3D-fast asymmetrical spin-echo (FASE) method. A region-growing method (RGM) was used to extract the CSF spaces from the FASE images. Ventricular volume (VV) and intracranial CSF volume (ICV) were measured, and a VV/ICV ratio was calculated. Mean VV and VV/ICV ratio were higher in the NPH group than in the other groups, and the differences were statistically significant, whereas the mean ICV value in the NPH group was not significantly increased. Of the 16 patients in the NPH group, 13 had VV/ICV ratios above 30%. In contrast, no subject in the other groups had a VV/ICV ratios higher than 30%. We conclude that these CSF volume parameters, especially the VV/ICV ratio, are useful for the diagnosis of NPH.
International Nuclear Information System (INIS)
Fry-Petit, A. M.; Sheckelton, J. P.; McQueen, T. M.; Rebola, A. F.; Fennie, C. J.; Mourigal, M.; Valentine, M.; Drichko, N.
2015-01-01
For over a century, vibrational spectroscopy has enhanced the study of materials. Yet, assignment of particular molecular motions to vibrational excitations has relied on indirect methods. Here, we demonstrate that applying group theoretical methods to the dynamic pair distribution function analysis of neutron scattering data provides direct access to the individual atomic displacements responsible for these excitations. Applied to the molecule-based frustrated magnet with a potential magnetic valence-bond state, LiZn2Mo3O8, this approach allows direct assignment of the constrained rotational mode of Mo3O13 clusters and internal modes of MoO6 polyhedra. We anticipate that coupling this well known data analysis technique with dynamic pair distribution function analysis will have broad application in connecting structural dynamics to physical properties in a wide range of molecular and solid state systems.
Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.
2017-04-01
Forbes Magazine published its list of the world's two thousand leading or strongest publicly traded companies (G-2000), based on four independent metrics: sales or revenues, profits, assets and market value. Each of these wealth metrics yields particular information on the corporate size or wealth of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part, and a Pareto power-law in the higher part. These two-class per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto zone is about 49% in sales per employee, and 33% after averaging over the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be fitted by Gamma or Log-normal distributions. On the other hand, Forbes classifies the G-2000 firms into 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of firms in each industry for the four metrics is divided by the total number of employees in that industry, the 82 points of the aggregate wealth distribution by industry per employee can be well fitted by quasi-exponential curves for the four metrics.
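The two-class structure described above can be imitated with synthetic data, and the Pareto tail exponent then recovered with a Hill estimator. The log-normal body, the tail exponent of 1.5, and the roughly 33% Pareto fraction below are illustrative assumptions echoing the averaged figure quoted in the abstract, not the Forbes data themselves.

```python
import math
import random

random.seed(7)

def pareto_sample(alpha, xmin):
    # Inverse-CDF sampling from a Pareto tail: x = xmin * U**(-1/alpha).
    return xmin * random.random() ** (-1.0 / alpha)

# Hypothetical wealth-per-employee sample: a log-normal body for most
# firms plus a Pareto power-law tail for the wealthiest ~33%.
body = [random.lognormvariate(0.0, 1.0) for _ in range(6700)]
tail = [pareto_sample(1.5, math.e ** 2) for _ in range(3300)]

# The Hill estimator recovers the tail exponent from upper order statistics.
data = sorted(body + tail, reverse=True)
k = 1000  # use the top-k observations, all well inside the Pareto tail
hill = k / sum(math.log(data[i] / data[k]) for i in range(k))
```

On such data the Hill estimate lands close to the true tail exponent of 1.5, while the bulk of the sample remains well described by the log-normal body.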
Yang, Tao; Liu, Shan; Zheng, Tian-Hui; Tao, Yan-Yan; Liu, Cheng-Hai
2015-05-26
Fuzheng Huayu recipe (FZHY) is formulated on the basis of Chinese medicine theory for treating liver fibrosis. The aim was to illuminate the influence of the pathological state of liver fibrosis on the pharmacokinetics and tissue distribution profiles of lignan components from FZHY. Male Wistar rats were randomly divided into a normal group and a hepatic fibrosis group (induced by dimethylnitrosamine). Six lignan components were detected and quantified by ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) in the plasma and tissues of normal and hepatic fibrosis rats. A rapid, sensitive and convenient UHPLC-MS/MS method was successfully developed for the simultaneous determination of the six lignan components in different rat biological samples. After oral administration of FZHY at a dose of 15 g/kg, the pharmacokinetic behaviors of schizandrin A (SIA), schizandrin B (SIB), schizandrin C (SIC), schisandrol A (SOA), schisandrol B (SOB) and schisantherin A (STA) were significantly changed in hepatic fibrosis rats compared with the normal rats, and their AUC(0-t) values were increased by 235.09%, 388.44%, 223.30%, 669.30%, 295.08% and 267.63%, respectively. The tissue distribution results showed that the amounts of SIA, SIB, SOA and SOB were significantly increased in the heart, lung, spleen and kidney of hepatic fibrosis rats compared with normal rats at most of the time points, indicating an altered distribution of lignan components between normal and hepatic fibrosis rats. Hepatic fibrosis could alter the pharmacokinetics and tissue distribution properties of lignan components in rats after administration of FZHY. The results might be helpful in guiding the clinical application of this medicine.
Krupskii, Pavel; Joe, Harry; Lee, David; Genton, Marc G.
2017-01-01
The multivariate Hüsler–Reiß copula is obtained as a direct extreme-value limit from the convolution of a multivariate normal random vector and an exponential random variable multiplied by a vector of constants. It is shown how the set of Hüsler–Reiß parameters can be mapped to the parameters of this convolution model. Assuming there are no singular components in the Hüsler–Reiß copula, the convolution model leads to exact and approximate simulation methods. An application of simulation is to check if the Hüsler–Reiß copula with different parsimonious dependence structures provides adequate fit to some data consisting of multivariate extremes.
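The convolution model described above can be simulated directly. The sketch below draws a correlated bivariate normal vector plus a shared Exp(1) variable scaled by a vector of constants, and checks that the shared term adds covariance c1*c2*Var(E) on top of the normal correlation. The correlation 0.5 and constants (1.0, 2.0) are arbitrary illustrative choices; this simulates the pre-limit convolution model, not the Hüsler–Reiß copula itself.

```python
import random

random.seed(3)

def convolution_sample(corr, consts):
    # One draw from the convolution model: a correlated bivariate normal
    # vector plus a single Exp(1) variable scaled by a vector of constants.
    z1 = random.gauss(0.0, 1.0)
    z2 = corr * z1 + (1.0 - corr ** 2) ** 0.5 * random.gauss(0.0, 1.0)
    e = random.expovariate(1.0)
    return (z1 + consts[0] * e, z2 + consts[1] * e)

draws = [convolution_sample(0.5, (1.0, 2.0)) for _ in range(20000)]

# The shared exponential term induces positive dependence between components:
# Cov(X, Y) = corr + c1 * c2 * Var(E) = 0.5 + 1.0 * 2.0 * 1 = 2.5 here.
mean_x = sum(d[0] for d in draws) / len(draws)
mean_y = sum(d[1] for d in draws) / len(draws)
cov_xy = sum((d[0] - mean_x) * (d[1] - mean_y) for d in draws) / len(draws)
```

Taking componentwise extremes of many such draws is what drives the model toward its extreme-value limit; the sketch stops at the convolution stage, which is the part the abstract describes explicitly.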
International Nuclear Information System (INIS)
Tarburton, J.P.; Halpern, S.E.; Hagan, P.L.; Sudora, E.; Chen, A.; Fridman, D.M.; Pfaff, A.E.
1990-01-01
Studies were performed to determine the in vitro and in vivo effects of acetylation on Fab' fragments of ZCE-025, a monoclonal anti-CEA antibody. Isoelectric focusing revealed a drop in isoelectric point of 1.7 pI units following acetylation. Biodistribution studies of acetylated and nonacetylated [111In]Fab' were performed in normal BALB/c mice and in nude mice bearing the T-380 CEA-producing human colon tumor. The acetylated fragments remained in the vascular compartment longer and had significantly diminished renal uptake of 111In compared to controls. While acetylation itself effected a 50% drop in immunoreactivity, tumor uptake of the acetylated and nonacetylated 111In-labeled Fab' fragments was comparable, with the exception of one data point, through 72 h.
International Nuclear Information System (INIS)
Ueda, J.; Kobayashi, Y.; Kenko, Y.; Koike, H.; Kubo, T.; Takano, Y.; Hara, K.; Sumitomo Hospital, Osaka; Osaka National Hospital
1988-01-01
The quantity of water, lipid and some metals was measured in autopsy specimens of 8 normal livers, 9 livers with fatty change, and in 12 livers with metastases of various origins. These parameters contribute to the CT number measured in the liver. Water played a major role in demonstration of liver metastases as a low-density area on CT. Other contributory factors include iron, magnesium and zinc. Lipid and calcium had no influence in this respect. Heavy accumulation of calcium in a metastatic lesion gives a high-density area on CT. However, even when a metastatic lesion was perceived on CT as a low-density area, the calcium content of the lesion was not always lower than that of the non-tumour region. (orig.)
Directory of Open Access Journals (Sweden)
José Raúl Machado Fernández
2018-01-01
Full Text Available We present the new LN-MoM-CA-CFAR detector, which exhibits a reduced deviation of the operational false-alarm probability from its intended design value. The solution corrects a fundamental problem of CFAR processors that has been ignored in many developments: most previously proposed schemes deal with abrupt changes in the clutter level, whereas the present solution corrects for slow statistical changes in the background signal. These have been shown to have a marked influence on the selection of the multiplicative CFAR adjustment factor, and consequently on maintaining the false-alarm probability. The authors took advantage of the high precision attainable in estimating the Log-Normal shape parameter with the MoM, and of the wide use of this distribution in clutter modelling, to create an architecture that delivers accurate results at low computational cost. After intensive processing of 100 million Log-Normal samples, a scheme was created that, improving on the performance of the classic CA-CFAR through continuous correction of its adjustment factor, operates with excellent stability, achieving a deviation of only 0.2884% for a false-alarm probability of 0.01.
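The two ingredients the abstract combines, a method-of-moments (MoM) estimate of the Log-Normal shape parameter and the classic cell-averaging CFAR threshold, can be sketched as follows. The mapping from the shape estimate to the multiplicative adjustment factor alpha is a hypothetical placeholder, not the authors' calibrated rule:

```python
import numpy as np

def lognormal_shape_mom(x):
    """Method-of-moments estimate of the Log-Normal shape parameter
    sigma from raw (linear-scale) clutter samples x, using
    sigma^2 = ln(1 + var(x) / mean(x)^2)."""
    m, v = x.mean(), x.var()
    return np.sqrt(np.log1p(v / m**2))

def ca_cfar_threshold(reference_cells, alpha):
    """Classic CA-CFAR: threshold = alpha * average of the reference cells."""
    return alpha * reference_cells.mean()

rng = np.random.default_rng(1)
clutter = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)
sigma_hat = lognormal_shape_mom(clutter)
# The detector in the paper continuously re-tunes alpha from the shape
# estimate; this exponential tuning rule is an illustrative placeholder.
alpha = np.exp(3.0 * sigma_hat)
cut = 25.0                                   # cell under test amplitude
detected = cut > ca_cfar_threshold(clutter[:64], alpha)
```

With a fixed alpha, any slow drift of the Log-Normal shape parameter moves the operational false-alarm rate away from its design value; re-estimating sigma and re-tuning alpha is what keeps it stable.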
International Nuclear Information System (INIS)
Frieling, M. von; Bradaczek, H.
1990-01-01
In regard to X-ray diffraction, Langmuir-Blodgett (LB) films consisting of lipid bilayers represent a 'one-dimensional crystal' with a very small number of unit cells in the direction of stacking. Such bounded systems yield X-ray diffraction diagrams which, in certain respects, contain more information than those of the conventional effectively infinite single crystals. This additional information consists of the profiles of the broadened reflections and their dislocation from the reciprocal-lattice points. These profiles are specific for each different structure and hence enable the direct calculation of unambiguous electron-density distributions from a single set of intensity data. At first, the Q function (the generalized Patterson function), i.e. the distance statistics of the structure sought after is calculated from the intensity data. Thereafter, the unambiguous convolution square root of the Q function must be determined, which is identical to the unknown electron-density distribution. For this purpose two mathematically completely different methods were established and compared. They were applied to diffraction patterns of Langmuir-Blodgett films of simple synthetic lipids with characteristic molecular subunits and showed identical results within the experimental resolution. This verifies the structures and the methods to calculate them. Furthermore, all features of the simple structures were compatible with the expectations. All one-dimensional electron-density distributions showed the common features of lipid bilayers. The characteristic molecular subunits can be recognized and reveal some interesting details. In general, they yield information about orientation, conformation and localization of molecular subunits and membrane components. (orig.)
Messiter, A. F.
1979-01-01
Analytical solutions are derived which incorporate additional physical effects as higher order terms for the case when the sonic line is very close to the wall. The functional form used for the undisturbed velocity profile is described to indicate how various parameters will be calculated for later comparison with experiment. The basic solutions for the pressure distribution are derived. Corrections are added for flow along a wall having longitudinal curvature and for flow in a circular pipe, and comparisons with available experimental data are shown.
DEFF Research Database (Denmark)
Danbolt, Mathias
2017-01-01
In the past couple of years there have been heated discussions in the Danish media about the use of racial stereotypes on the packaging of so-called colonial goods in supermarkets. Criticism of the racist effects of these images has often been dismissed with the claim that it is absurd to speak of racism in connection with...... something as everyday as the packaging of coffee, cocoa or chocolate. But if one examines the history of Danish branded goods, it becomes clear that the persuasive power of advertising images is not to be trifled with. The branded-goods industry has been central to the spread and normalization of racist understandings of...
Hána, T.; Eliášová, M.; Machalická, K.; Vokáč, M.
2017-10-01
Looking at current architecture, there are many examples of load-bearing glass members such as beams, panes, ribs, stairs or even columns. Most of these elements are made of laminated glass, with panes bonded by a polymer interlayer, so the transfer of shear forces between the glass panes needs to be investigated, as knowledge of it is still lacking. This transfer depends on the stiffness of the polymer material, which is affected by temperature and load duration. When designing these members, it is essential to stay on the safe side with limit cases if the exact material behaviour is not specified. Many interlayers for structural laminated-glass applications are available on the market. Most of them exhibit different properties, which need to be verified experimentally. This paper focuses on the tangent shear modulus of a PVB (polyvinyl butyral) interlayer and its effect on the stress distribution in loaded glass panes. This distribution may be determined experimentally or numerically. It makes it possible to design structural laminated-glass members more effectively with regard to price and safety and, furthermore, offers a way to extend the use of laminated glass in architectural design.
DEFF Research Database (Denmark)
Hindsø, Louise; Fuchs, Andreas; Kühl, Jørgen Tobias
2017-01-01
regional normal reference values of the left ventricle. The aim of this study was to derive reference values of regional LV myocardial thickness (LVMT) and mass (LVMM) from a healthy study group of the general population using cardiac computed tomography angiography (CCTA). We wanted to introduce LV...... myocardial distribution (LVMD) as a measure of regional variation of the LVMT. Moreover, we wanted to determine whether these parameters varied between men and women. We studied 568 (181 men; 32%) adults, free of cardiovascular disease and risk factors, who underwent 320-detector CCTA. Mean age was 55 (range...... 40-84) years. Regional LVMT and LVMM were measured, according to the American Heart Association's 17 segment model, using semi-automatic software. Mean LVMT was 6.6 mm for men and 5.4 mm for women (p < 0.05). The normal LV was thickest in the basal septum (segment 3; men = 8.3 mm; women = 7.2 mm...
Watabe, Tadashi; Naka, Sadahiro; Ikeda, Hayato; Horitsugi, Genki; Kanai, Yasukazu; Isohashi, Kayako; Ishibashi, Mana; Kato, Hiroki; Shimosegawa, Eku; Watabe, Hiroshi; Hatazawa, Jun
2014-01-01
Acetylcholinesterase (AChE) inhibitors have been used for patients with Alzheimer's disease. However, their pharmacokinetics in non-target organs other than the brain have not been clarified yet. The purpose of this study was to evaluate the relationship between the whole-body distribution of intravenously administered (11)C-Donepezil (DNP) and the AChE activity in the normal rat, with special focus on the adrenal glands. The distribution of (11)C-DNP was investigated by PET/CT in 6 normal male Wistar rats (8 weeks old, body weight = 220 ± 8.9 g). A 30-min dynamic scan was started simultaneously with an intravenous bolus injection of (11)C-DNP (45.0 ± 10.7 MBq). The whole-body distribution of the (11)C-DNP PET was evaluated based on the Vt (total distribution volume) by Logan-plot analysis. A fluorometric assay was performed to quantify the AChE activity in homogenized tissue solutions of the major organs. The PET analysis using Vt showed that the adrenal glands had the 2nd highest level of (11)C-DNP in the body (following the liver) (13.33 ± 1.08 and 19.43 ± 1.29 ml/cm(3), respectively), indicating that the distribution of (11)C-DNP was the highest in the adrenal glands, except for that in the excretory organs. The AChE activity was the third highest in the adrenal glands (following the small intestine and the stomach) (24.9 ± 1.6, 83.1 ± 3.0, and 38.5 ± 8.1 mU/mg, respectively), indicating high activity of AChE in the adrenal glands. We demonstrated the whole-body distribution of (11)C-DNP by PET and the AChE activity in the major organs by fluorometric assay in the normal rat. High accumulation of (11)C-DNP was observed in the adrenal glands, which suggested the risk of enhanced cholinergic synaptic transmission by the use of AChE inhibitors.
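Logan-plot analysis, used above to obtain Vt, fits a line to integrated tissue and plasma curves; for a reversible tracer the late-time slope equals the total distribution volume. A minimal sketch on synthetic one-tissue-compartment data, where all rate constants and the plasma input function are invented for illustration:

```python
import numpy as np

def logan_vt(t, ct, cp, t_star=10.0):
    """Estimate total distribution volume Vt by Logan graphical analysis:
    for t > t_star the plot of int(ct)/ct versus int(cp)/ct becomes
    linear with slope Vt (reversible tracer, plasma input cp)."""
    dt = t[1] - t[0]
    int_ct = np.cumsum(ct) * dt            # running integrals (rectangle rule)
    int_cp = np.cumsum(cp) * dt
    m = t > t_star                          # use only the linear late portion
    x, y = int_cp[m] / ct[m], int_ct[m] / ct[m]
    slope, _ = np.polyfit(x, y, 1)
    return slope

# Synthetic one-tissue-compartment tracer with known Vt = K1/k2 = 4.0
K1, k2 = 0.4, 0.1
t = np.arange(0.0, 60.0, 0.01)              # minutes
cp = np.exp(-0.3 * t)                       # toy plasma input function
ct = np.zeros_like(t)
for i in range(1, len(t)):                  # Euler integration of dct/dt = K1*cp - k2*ct
    ct[i] = ct[i - 1] + 0.01 * (K1 * cp[i - 1] - k2 * ct[i - 1])
print(logan_vt(t, ct, cp))                  # ~4.0
```

For the one-tissue model the Logan relation is exactly linear with slope K1/k2, so the recovered slope should match the known Vt up to discretization error.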
Franz, Marcus; Wolheim, Anke; Richter, Petra; Umbreit, Claudia; Dahse, Regine; Driemel, Oliver; Hyckel, Peter; Virtanen, Ismo; Kosmehl, Hartwig; Berndt, Alexander
2010-04-01
The contribution of stromal laminin chain expression to malignant potential, tumour stroma reorganization and vessel formation in oral squamous cell carcinoma (OSCC) is not fully understood. Therefore, the expression of the laminin chains alpha2, alpha3, alpha4, alpha5 and gamma2 in the stromal compartment/vascular structures in OSCC was analysed. Frozen tissue of OSCC (9x G1, 24x G2, 8x G3) and normal (2x)/hyperplastic (11x) oral mucosa was subjected to laminin chain and alpha-smooth muscle actin (ASMA) immunohistochemistry. Results were correlated to tumour grade. The relation of laminin chain positive vessels to total vessel number was assessed by immunofluorescence double labelling with CD31. Stromal laminin alpha2 chain significantly decreases and alpha3, alpha4, alpha5 and gamma2 chains and also ASMA significantly increase with rising grade. The amount of stromal alpha3, alpha4 and gamma2 chains significantly increased with rising ASMA positivity. There is a significant decrease in alpha3 chain positive vessels with neoplastic transformation. Mediated by myofibroblasts, OSCC development is associated with a stromal up-regulation of laminin isoforms possibly contributing to a migration promoting microenvironment. A vascular basement membrane reorganization concerning alpha3 and gamma2 chain laminins during tumour angioneogenesis is suggested.
International Nuclear Information System (INIS)
Feenstra, A.; Vaalburg, W.; Nolten, G.M.J.; Reiffers, S.; Talma, A.G.; Wiegman, T.; van der Molen, H.D.; Woldring, M.G.
1983-01-01
Tritiated 17α-methylestradiol was synthesized to investigate the potential of the carbon-11-labeled analog as an estrogen-receptor-binding radiopharmaceutical. In vitro, 17α-methylestradiol is bound with high affinity to the cytoplasmic estrogen receptor from rabbit uterus (Kd = 1.96 × 10^-10 M), and it sediments as an 8S hormone-receptor complex in sucrose gradients. The compound shows specific uptake in the uterus of the adult rat within 1 h after injection. In female rats bearing DMBA-induced tumors, specific uterine and tumor uptakes were observed, although at 30 min the tumor uptake was only 23 to 30% of the uptake in the uterus. Tritiated 17α-methylestradiol with a specific activity of 6 Ci/mmole showed a similar tissue distribution. Our results indicate that 17α-methylestradiol is promising as an estrogen-receptor-binding radiopharmaceutical.
Wang, Lihong; Gong, Zaiwu
2017-10-10
As meteorological disaster systems are large complex systems, disaster reduction programs must be based on risk analysis. Consequently, judgment by an expert based on his or her experience (also known as qualitative evaluation) is an important link in meteorological disaster risk assessment. In some complex and non-procedural meteorological disaster risk assessments, a hesitant fuzzy linguistic preference relation (HFLPR) is often used to deal with a situation in which experts may be hesitant while providing preference information of a pairwise comparison of alternatives, that is, the degree of preference of one alternative over another. This study explores hesitation from the perspective of statistical distributions, and obtains an optimal ranking of an HFLPR based on chance-restricted programming, which provides a new approach for hesitant fuzzy optimisation of decision-making in meteorological disaster risk assessments.
Deng, Li; Li, Yongzhi; Zhang, Xinshi; Chen, Bo; Deng, Yulin; Li, Yujuan
2015-10-10
A UPLC-MS method was developed for the determination of pterostilbene (PTS) in plasma and tissues of mice. PTS was separated on an Agilent Zorbax XDB-C18 column (50 × 2.1 mm, 1.8 μm) with a gradient mobile phase at a flow rate of 0.2 ml/min. Detection was performed by negative-ion electrospray ionization in multiple reaction monitoring mode. The linear calibration curves of PTS in mouse plasma and tissues ranged from 1.0 to 5000 and 0.50 to 500 ng/ml (r(2)>0.9979), respectively, with lower limits of quantification (LLOQ) between 0.5 and 2.0 ng/ml. The accuracy and precision of the assay were satisfactory. The validated method was applied to the study of bioavailability and tissue distribution of PTS in normal and Lewis lung carcinoma (LLC) bearing mice. The bioavailability of PTS (doses 14, 28 and 56 mg/kg) in normal mice was 11.9%, 13.9% and 26.4%, respectively, and the maximum level (82.1 ± 14.2 μg/g) was found in the stomach (dose 28 mg/kg). The bioavailability, peak concentration (Cmax) and time to peak concentration (Tmax) of PTS in LLC mice were increased compared with normal mice. The results indicated that the UPLC-MS method is reliable and that the bioavailability and tissue distribution of PTS in normal and LLC mice were dramatically different. Copyright © 2015 Elsevier B.V. All rights reserved.
Patschan, D; Michurina, T; Shi, H K; Dolff, S; Brodsky, S V; Vasilieva, T; Cohen-Gould, L; Winaver, J; Chander, P N; Enikolopov, G; Goligorsky, M S
2007-04-01
Nestin, a marker of multi-lineage stem and progenitor cells, is a member of intermediate filament family, which is expressed in neuroepithelial stem cells, several embryonic cell types, including mesonephric mesenchyme, endothelial cells of developing blood vessels, and in the adult kidney. We used Nestin-green fluorescent protein (GFP) transgenic mice to characterize its expression in normal and post-ischemic kidneys. Nestin-GFP-expressing cells were detected in large clusters within the papilla, along the vasa rectae, and, less prominently, in the glomeruli and juxta-glomerular arterioles. In mice subjected to 30 min bilateral renal ischemia, glomerular, endothelial, and perivascular cells showed increased Nestin expression. In the post-ischemic period, there was an increase in fluorescence intensity with no significant changes in the total number of Nestin-GFP-expressing cells. Time-lapse fluorescence microscopy performed before and after ischemia ruled out the possibility of engraftment by the circulating Nestin-expressing cells, at least within the first 3 h post-ischemia. Incubation of non-perfused kidney sections resulted in a medullary-to-cortical migration of Nestin-GFP-positive cells with the rate of expansion of their front averaging 40 microm/30 min during the first 3 h and was detectable already after 30 min of incubation. Explant matrigel cultures of the kidney and aorta exhibited sprouting angiogenesis with cells co-expressing Nestin and endothelial marker, Tie-2. In conclusion, several lines of circumstantial evidence identify a sub-population of Nestin-expressing cells with the mural cells, which are recruited in the post-ischemic period to migrate from the medulla toward the renal cortex. These migrating Nestin-positive cells may be involved in the process of post-ischemic tissue regeneration.
Wiig, Helge; Gyenge, Christina; Iversen, Per Ole; Gullberg, Donald; Tenstad, Olav
2008-05-01
The interstitial space is a dynamic microenvironment that consists of interstitial fluid and structural molecules of the extracellular matrix, such as glycosaminoglycans (hyaluronan and proteoglycans) and collagen. Macromolecules can distribute in the interstitium only in those spaces unoccupied by structural components, a phenomenon called interstitial exclusion. The exclusion phenomenon has direct consequences for plasma volume regulation. Early studies have assigned a major role to collagen as an excluding agent that accounts for the sterical (geometrical) exclusion. More recently, it has been shown that the contribution of negatively charged glycosaminoglycans might also be significant, resulting in an additional electrostatical exclusion effect. This charge effect may be of importance for drug uptake and suggests that either the glycosaminoglycans or the net charge of macromolecular substances to be delivered may be targeted to increase the available volume and uptake of macromolecular therapeutic agents in tumor tissue. Here, we provide an overview of the structural components of the interstitium and discuss the importance the sterical and electrostatical components have on the dynamics of transcapillary fluid exchange.
Obana, Koichiro; Takahashi, Tsutomu; No, Tetsuo; Kaiho, Yuka; Kodaira, Shuichi; Yamashita, Mikiya; Sato, Takeshi; Nakamura, Takeshi
2014-04-01
We describe the aftershocks of a Mw 7.4 intraplate normal-faulting earthquake that occurred 150 km east of the Ogasawara (Bonin) Islands, Japan, on 21 December 2010. It occurred beneath the outer trench slope of the Izu-Ogasawara trench, where the Pacific plate subducts beneath the Philippine Sea plate. Aftershock observations using ocean bottom seismographs (OBSs) began soon after the earthquake, and multichannel seismic reflection surveys were conducted across the aftershock area. Aftershocks were distributed in a NW-SE belt 140 km long, oblique to the N-S trench axis. They formed three subparallel lineations along a fracture zone in the Pacific plate. The OBS observations combined with data from stations on Chichi-jima and Haha-jima Islands revealed a migration of the aftershock activity. The activity in the first hour, which likely outlines the main shock rupture, was limited to an 80 km long area in the central part of the subsequent aftershock area. This first-hour activity occurred mainly around, and appears to have been influenced by, nearby large seamounts and oceanic plateaus, such as the Ogasawara Plateau and the Uyeda Ridge. Over the following days, the aftershocks expanded beyond or into these seamounts and plateaus. The aftershock distribution and migration suggest that crustal heterogeneities related to a fracture zone and to large seamounts and oceanic plateaus in the incoming Pacific plate affected the rupture of the main shock. Such preexisting structures may influence intraplate normal-faulting earthquakes in other regions of plate flexure prior to subduction.
Directory of Open Access Journals (Sweden)
Patrik Felipe Nazario
2010-01-01
Full Text Available The aim of this study was to determine the possible relationship between loss of the normal medial longitudinal arch measured by the height of the navicular bone in a static situation and variables related to plantar pressure distribution measured in a dynamic situation. Eleven men (21 ± 3 years, 74 ± 10 kg and 175 ± 4 cm participated in the study. The Novel Emed-AT System was used for the acquisition of plantar pressure distribution data (peak pressure, mean pressure, contact area, and relative load at a sampling rate of 50 Hz. The navicular drop test proposed by Brody (1982 was used to assess the height of the navicular bone for classification of the subjects. The results were compared by the Mann-Whitney U test, with the level of significance set at p ≤ 0.05. Differences were observed between the two groups in the mid-foot region for all variables studied, with the observation of higher mean values in subjects with flat feet. There were also significant differences in contact area, relative load, peak pressure, and mean pressure between groups. The present study demonstrates the importance of paying attention to subjects with flat feet because changes in plantar pressure distribution are associated with discomfort and injuries.
Wong, Wing-Cheong; Ng, Hong-Kiat; Tantoso, Erwin; Soong, Richie; Eisenhaber, Frank
2018-02-12
signal-to-noise ratio by 50% and the statistical/detection sensitivity by as high as 30% regardless of the downstream mapping and normalization methods. Most importantly, the power-law correction improves concordance in significant calls among different normalization methods of a data series by 22% on average. When presented with a higher sequencing depth (4 times difference), the improvement in concordance is asymmetrical (32% for the higher sequencing depth instance versus 13% for the lower instance) and demonstrates that the simple power-law correction can increase significant detection with higher sequencing depths. Finally, the correction dramatically enhances the statistical conclusions and elucidates the metastasis potential of the NUGC3 cell line against AGS in our dilution analysis. The finite-size effects due to undersampling generally plague transcript count data with reproducibility issues but can be minimized through a simple power-law correction of the count distribution. This distribution correction has direct implications for the biological interpretation of the study and the rigor of the scientific findings. This article was reviewed by Oliviero Carugo, Thomas Dandekar and Sandor Pongor.
International Nuclear Information System (INIS)
Matsuda, Shin; Uchida, Tatsumi; Yui, Tokuo; Kariyone, Shigeo
1982-01-01
T and B lymphocyte survival and organ distribution were studied by using 111 In-oxine labeled autologous lymphocytes in 3 normal subjects, 3 patients with chronic lymphocytic leukemia (CLL) and 9 with malignant lymphoma (ML). Disappearance curves of the labeled lymphocytes showed two exponential components in all cases. The half time of the first component was within 1 hour in all cases. That of the second one was 50.7 ± 6.4 hours for all lymphocytes, 52.0 ± 5.5 hours for T lymphocytes and 31.6 ± 4.9 hours for B lymphocytes in normal subjects, 192.6 hours for T-CLL and 57.7 ± 46.9 hours for B-CLL, and 60.2 ± 30.7 hours for T cell type of malignant lymphoma (T-ML) and 63.7 ± 24.5 hours for B cell type of malignant lymphoma (B-ML). These data might suggest that the whole-lymphocyte disappearance curve chiefly reflected the T lymphocyte disappearance curve, and that the half time of B lymphocytes was shorter than that of T lymphocytes. In T-CLL, the half time of the second component was extremely prolonged in comparison with that of normal T lymphocytes. The labeled cells accumulated in the lungs, spleen and liver immediately after the infusion, then most remarkably in the spleen 1 hour after the infusion in all cases. The radioactivity over the bone marrow was observed from 1 hour in all cases, and that over the lymph nodes was first noticed 18 hours after the infusion in T-CLL and T-ML, and 68 hours in B-CLL, but was not noticed in normal subjects and B-ML. The recovery of labeled cells in the blood was 28.5 ± 7.9% for all lymphocytes, 19.7 ± 1.9% for T lymphocytes and 11.0 ± 5.1% for B lymphocytes in normal subjects, 25.8 ± 1.6% for CLL, and 17.6 ± 11.0% for T-ML and 7.7 ± 5.2% for B-ML, respectively. (J.P.N.)
Stuart, C. A.; Wen, G.; Gustafson, W. C.; Thompson, E. A.
2000-01-01
Basal, "insulin-independent" glucose uptake into skeletal muscle is provided by glucose transporters positioned at the plasma membrane. The relative amount of the three glucose transporters expressed in muscle has not been previously quantified. Using a combination of qualitative and quantitative ribonuclease protection assay (RPA) methods, we found in normal human muscle that GLUT1, GLUT3, and GLUT4 mRNA were expressed at 90 +/- 10, 46 +/- 4, and 156 +/- 12 copies/ng RNA, respectively. Muscle was fractionated by DNase digestion and differential sedimentation into membrane fractions enriched in plasma membranes (PM) or low-density microsomes (LDM). GLUT1 and GLUT4 proteins were distributed 57% to 67% in LDM, whereas GLUT3 protein was at least 88% in the PM-enriched fractions. These data suggest that basal glucose uptake into resting human muscle could be provided in part by each of these three isoforms.
Seed, Mike; van Amerom, Joshua F P; Yoo, Shi-Joon; Al Nafisi, Bahiyah; Grosse-Wortmann, Lars; Jaeggi, Edgar; Jansz, Michael S; Macgowan, Christopher K
2012-11-26
We present the first phase contrast (PC) cardiovascular magnetic resonance (CMR) measurements of the distribution of blood flow in twelve late gestation human fetuses. These were obtained using a retrospective gating technique known as metric optimised gating (MOG). A validation experiment was performed in five adult volunteers where conventional cardiac gating was compared with MOG. Linear regression and Bland Altman plots were used to compare MOG with the gold standard of conventional gating. Measurements using MOG were then made in twelve normal fetuses at a median gestational age of 37 weeks (range 30-39 weeks). Flow was measured in the major fetal vessels and indexed to the fetal weight. There was good correlation between the conventional gated and MOG measurements in the adult validation experiment (R=0.96). Mean flows in ml/min/kg with standard deviations in the major fetal vessels were as follows: combined ventricular output (CVO) 540 ± 101, main pulmonary artery (MPA) 327 ± 68, ascending aorta (AAo) 198 ± 38, superior vena cava (SVC) 147 ± 46, ductus arteriosus (DA) 220 ± 39, pulmonary blood flow (PBF) 106 ± 59, descending aorta (DAo) 273 ± 85, umbilical vein (UV) 160 ± 62, foramen ovale (FO) 107 ± 54. Results expressed as mean percentages of the CVO with standard deviations were as follows: MPA 60 ± 4, AAo 37 ± 4, SVC 28 ± 7, DA 41 ± 8, PBF 19 ± 10, DAo 50 ± 12, UV 30 ± 9, FO 21 ± 12. This study demonstrates how PC CMR with MOG is a feasible technique for measuring the distribution of the normal human fetal circulation in late pregnancy. Our preliminary results are in keeping with findings from previous experimental work in fetal lambs.
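As a quick consistency check, the reported mean flows can be converted to percentages of the combined ventricular output. Note this divides group means, whereas the paper reports the mean of per-fetus percentages, so small discrepancies against the quoted values are expected:

```python
# Mean flows (ml/min/kg) and combined ventricular output from the study
flows = {"MPA": 327, "AAo": 198, "SVC": 147, "DA": 220,
         "PBF": 106, "DAo": 273, "UV": 160, "FO": 107}
cvo = 540

# Ratio of group means, rounded to whole percent of CVO
pct = {vessel: round(100 * flow / cvo) for vessel, flow in flows.items()}
print(pct)  # MPA comes out near the quoted 60 +/- 4 percent of CVO
```

The agreement (e.g. MPA at roughly 61% versus the reported 60 ± 4%) confirms the two tables in the abstract describe the same measurements.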
International Nuclear Information System (INIS)
Mangel, L.; Skriba, Z.; Major, T.; Polgar, C.; Fodor, J.; Somogyi, A.; Nemeth, G.
2002-01-01
The purpose of this study was to prove that by using conformal external beam radiotherapy (RT) normal brain structures can be protected even when applying an alternative approach of biological dose escalation: hypofractionation (HOF) without total dose reduction (TDR). Traditional 2-dimensional (2D) and conformal 3-dimensional (3D) treatment plans were prepared for 10 gliomas representing the subanatomical sites of the supratentorial brain. Isoeffect distributions were generated by the biologically effective dose (BED) formula to analyse the effect of conventionally fractionated (CF) and HOF schedules on both the spatial biological dose distribution and biological dose-volume histograms. A comparison was made between 2D-CF (2.0 Gy/day) and 3D-HOF (2.5 Gy/day) regimens, applying the same 60 Gy total doses. Integral biologically effective dose (IBED) and volumes receiving the biological equivalent of a dose of 54 Gy or more (V-BED54) were calculated for the lower and upper brain stem as organs at risk. The IBED values were lower with the 3D-HOF than with the 2D-CF schedule in each tumour location, with means of 22.7 ± 17.1 and 40.4 ± 16.9 Gy, respectively (p < 0.0001). The V-BED54 values were also smaller or equal in 90% of the cases, favouring the 3D-HOF scheme. The means were 2.7 ± 4.8 ccm for 3D-HOF and 10.7 ± 12.7 ccm for 2D-CF (p = 0.0006). Our results suggest that with conformal RT, fraction size can gradually be increased. HOF radiotherapy regimens without TDR shorten the treatment time and seem to be an alternative way of dose escalation in the treatment of glioblastoma.
Hussein, Laila; Medina, Alexander; Barrionnevo, Ana; Lammuela-Raventos, Rosa M; Andres-Lacueva, Cristina
2009-06-01
The urinary flavonoids are considered a reliable biomarker for the intake of polyphenol-rich foods. The aims were to assess the normal distribution of urinary polyphenol [PP] excretion among healthy male children and adolescents on a typical Egyptian diet, and to follow up the impact of nutritional intervention with tomato juice on the urinary excretion of [PP]. Forty-nine male subjects 7-14 years old collected a 24-h urine sample and filled in a dietary record during a 7-day period. In a subgroup, a daily serving of 230 g fresh tomato juice was consumed for 18 days. Total urinary [PP] excretions were measured before and after termination of the intervention program. The total urinary [PP] was analyzed, after a clean-up solid-phase extraction step, by the Folin-Ciocalteu reagent in 96-well microplates. The results were expressed as gallic acid equivalents (GAE). The urinary [PP] excretion averaged 48.6 ± 5.5 mg GAE/24 h, equivalent to 89.5 ± 8.4 mg GAE/g creatinine. The mean urinary [PP] excretion after the intervention with tomato juice (287.4 ± 64.3 mg GAE/g creatinine) was significantly higher than the respective mean baseline level (94.5 ± 8.92 mg GAE/g creatinine). Clinical laboratory reference limits for urinary polyphenols are presented for Egyptian male children and adolescents. Measuring the urinary polyphenol excretion proved a good biomarker for the dietary polyphenol intake, and the results demonstrated that tomato [PP] was highly bioavailable in the human body.
Yang, Lianrong; Meng, Xin; Kuang, Haixue
2018-04-15
A simple, highly sensitive ultra-performance liquid chromatography-electrospray ionization-mass spectrometry (UPLC-ESI-MS) method has been developed to quantify withanolide B and obakunone (internal standard, IS) in guinea pig plasma and tissues, and to compare the pharmacokinetics and tissue distribution of withanolide B in normal and psoriasis guinea pigs. After mixing with IS, plasma and tissues were pretreated by protein precipitation with methanol. Chromatographic separation was performed on a C18 column using aqueous (0.1% formic acid) and acetonitrile (0.1% formic acid) solutions at 0.4 mL/min as the mobile phase. The gradient program was selected (0-4.0 min, 2-98% B; 4.0-4.5 min, 98-2% B; and 4.5-5 min, 2% B). Detection was performed on a 4000 QTRAP UPLC-ESI-MS/MS system from AB Sciex in the multiple reaction monitoring (MRM) mode. Withanolide B and obakunone (IS) were monitored under positive ionization conditions. The optimized mass transition ion-pairs (m/z) for quantitation were 455.1/109.4 for withanolide B and 455.1/161.1 for obakunone. Copyright © 2018. Published by Elsevier B.V.
John R. Jones
1985-01-01
Quaking aspen is the most widely distributed native North American tree species (Little 1971, Sargent 1890). It grows in a great diversity of regions, environments, and communities (Harshberger 1911). Only one deciduous tree species in the world, the closely related Eurasian aspen (Populus tremula), has a wider range (Weigle and Frothingham 1911)....
Carpenter, Donald A.
2008-01-01
Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…
Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.
2003-01-01
The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally, we discuss several algorithms for
Smooth quantile normalization.
Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada
2018-04-01
Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
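As a point of reference, ordinary quantile normalization (the special case that qsmooth generalizes) can be sketched in a few lines of NumPy. This sketch omits qsmooth's group-aware shrinkage toward a within-group reference, and ties are broken arbitrarily:

```python
import numpy as np

# Plain quantile normalization -- the special case that qsmooth generalizes.
# Every sample (column) is forced onto the same reference distribution: the
# row-wise mean of the sorted columns. qsmooth instead uses a within-group
# reference so genuine between-group differences are preserved.
def quantile_normalize(x: np.ndarray) -> np.ndarray:
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)  # per-column rank of each value
    reference = np.sort(x, axis=0).mean(axis=1)        # mean quantile across samples
    return reference[ranks]

x = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
print(quantile_normalize(x))  # every column now has identical sorted values
```

After the transform, all columns share the same statistical distribution, which is exactly the assumption qsmooth relaxes when biological groups genuinely differ.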
Directory of Open Access Journals (Sweden)
Rodica Pripoaie
2015-08-01
Full Text Available This work presents the application of the Gauss-Laplace normal distribution in the case of a company that is the most modern Romanian sea-river port on the Danube, a specialized service provider with a handling capacity of approx. 20,000,000 tons/year. The Gauss-Laplace normal distribution is the best known and most widely used probability distribution, because it captures well the evolution of economic and financial phenomena. Around the average, which has the greatest frequency, gravitate values more or less distant from the average, but with the same standard deviation. It is noted that break-even analysis, although used in forecasting calculations, ignores the risk of decisional operations (regarding deviations between forecasts and achievements), which may, in certain circumstances, strongly influence the activity of the company. This risk can be taken into account by carefully studying whether the evolution of turnover follows a law of probability. In case no information exists on the law of probability of turnover, and there is no reason for one case to appear more often than another, following Laplace we consider these cases uniformly distributed; hence they are modelled by a normal distribution.
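The decision-risk point can be made concrete: under a Gauss-Laplace (normal) model for turnover, the risk that a plain break-even analysis ignores is simply the probability mass below the break-even point. The figures below are illustrative assumptions, not the port operator's data:

```python
import math

# Under a Gauss-Laplace (normal) model for annual turnover, the risk that a
# plain break-even analysis ignores is P(turnover < break-even point).
# mu, sigma and the break-even point are illustrative assumptions.
def normal_cdf(x: float, mu: float, sigma: float) -> float:
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

mu, sigma = 18_000_000, 2_500_000   # forecast tonnage and its std deviation
break_even = 15_000_000             # tonnage needed to cover fixed costs

risk = normal_cdf(break_even, mu, sigma)
print(f"P(below break-even) = {risk:.1%}")  # z = -1.2, about 11.5%
```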
Pastore, Lisa M; Young, Steven L; Manichaikul, Ani; Baker, Valerie L; Wang, Xin Q; Finkelstein, Joel S
2017-01-01
To study whether reported, but inconsistent, associations between FMR1 CGG repeat lengths in the intermediate, high normal, or low normal range differentiate women diagnosed with diminished ovarian reserve (DOR) from population controls, and whether associations vary by race/ethnic group. Case-control study. Academic and private fertility clinics. DOR cases (n = 129; 95 Whites, 22 Asian, 12 other) from five U.S. fertility clinics were clinically diagnosed, with regular menses and no fragile X syndrome family history. Normal fertility controls (n = 803; 386 Whites, 219 African-Americans, 102 Japanese, 96 Chinese) from the United States-based SWAN Study had one or more menstrual periods in the 3 months pre-enrollment, one or more pregnancies, no history of infertility or hormone therapy, and menopause at ≥46 years. Previously, the SWAN Chinese and Japanese groups had similar FMR1 CGG repeat lengths, thus they were combined. None. FMR1 CGG repeat lengths. Median CGG repeats were nearly identical by case/control group. DOR cases had fewer CGG repeats in the shorter FMR1 allele than controls among Whites, but this difference was not significant among Asians. White cases had fewer CGG repeats in the shorter allele than Asian cases. No significant differences were found in the high normal/intermediate range between cases and controls, or by race/ethnic group within cases in the longer allele. This study refutes prior reports of an association between DOR and high normal/intermediate repeats and confirms an association between DOR and low normal repeats in Whites. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
International Nuclear Information System (INIS)
Takahashi, Nobukazu; Ishida, Yoshio; Hirose, Yoshiaki; Kawano, Shigeo; Fukuoka, Syuji; Hayashida, Kohei; Takamiya, Makoto; Nonogi, Hiroshi
1995-01-01
Visual interpretation of 123I-BMIPP (BMIPP) myocardial images has difficulty in detecting mild reductions in tracer uptake. We studied the significance of the objective assessment of myocardial BMIPP maldistributions at rest by using a Bull's-eye map and its normal data file for detecting ischemic heart disease. Twenty-nine patients, 15 with prior myocardial infarction and 14 with effort angina, were studied. The initial 15-min BMIPP image was evaluated by visual analysis and by generating the extent Bull's-eye map, which exhibits regions with % uptake reduced below the mean - 2SD of 10 normal controls. The sensitivity for detecting coronary lesions in non-infarcted myocardial regions with the extent map was superior to that with visual analysis (67% vs. 33%). In the regions supplied by the stenotic coronary artery, those which were negative on visual analysis but positive in the map, and those positive in both, had a higher incidence of wall motion abnormalities and severe coronary stenosis than those with normal findings in both. These results suggest that the objective assessment based on the normal data file in a Bull's-eye polar map is clinically important for overcoming the limitation of the visual interpretation in 123I-BMIPP imaging. (author)
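The extent-map criterion described above reduces to a per-segment threshold test against the normal database. A minimal sketch, with illustrative uptake values rather than the study's actual normal data file:

```python
import numpy as np

# Sketch of the "extent map" criterion: flag segments whose % uptake falls
# below (normal mean - 2 SD). The normal values here are illustrative, not
# the study's normal data file of 10 controls.
normal_mean = np.array([72.0, 70.0, 75.0, 68.0])   # mean % uptake per segment
normal_sd = np.array([4.0, 5.0, 3.5, 4.5])         # SD per segment
patient = np.array([65.0, 58.0, 74.0, 60.0])       # patient % uptake

threshold = normal_mean - 2 * normal_sd
abnormal = patient < threshold                     # boolean extent map
print(abnormal)  # only the second segment falls below mean - 2SD
```

The same comparison applied pixel-wise over the polar map yields the "extent" of reduced uptake reported by the study.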
Directory of Open Access Journals (Sweden)
Salvador Cruz Rambaud
2015-07-01
Full Text Available This paper proposes an expression of the value of an annuity with payments of 1 unit each when the interest rate is random. In order to attain this objective, we proceed on the assumption that the non-central moments of the capitalization factor are known. Specifically, to calculate the value of these annuities, we propose two different expressions. First, we suppose that the random interest rate is normally distributed; then, we assume that it follows the beta distribution. A practical application of these two methodologies is also implemented using the R statistical software.
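A Monte Carlo simulation gives a quick cross-check on the first case (normally distributed rate). The paper itself derives closed-form expressions from the non-central moments of the capitalization factor and implements them in R; this Python sketch with illustrative parameters is only a companion to that idea:

```python
import numpy as np

# Monte Carlo companion to the paper's first case: present value of an
# n-payment annuity-immediate of 1 unit when the interest rate i is random,
# here i ~ Normal(mu, sigma). Parameters are illustrative assumptions.
rng = np.random.default_rng(0)

def annuity_pv(i: np.ndarray, n: int) -> np.ndarray:
    """Present value of n unit payments at (per-period) rate i."""
    return (1 - (1 + i) ** -n) / i

i = rng.normal(0.03, 0.005, size=100_000)  # random rate: mean 3%, sd 0.5%
pv = annuity_pv(i, n=10)
print(pv.mean())  # slightly above the PV at the mean rate (~8.53), by convexity
```

The small upward shift relative to the deterministic value illustrates why the moments of the capitalization factor, not just the mean rate, enter the closed-form valuation.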
Directory of Open Access Journals (Sweden)
Mirella Leme Franco Geraldini Sirol
2006-06-01
Full Text Available Four simulation studies were conducted to verify the distribution of the inverse of variables with normal distribution, as a function of different variances, means, truncation points and sample sizes. The simulated variables were GMD, with normal distribution, representing average daily gain, and DIAS, obtained as a multiple of the inverse of GMD, representing days to reach a fixed body weight. In all studies, the SAS® (1990) system was used for data simulation and subsequent analysis of the results. The sample means of DIAS were dependent on the standard deviations used in the simulation. Regression analyses showed a reduction in the mean and standard deviation of DIAS as the mean of GMD increased. The inclusion of a truncation point between 10 and 25% of the GMD mean reduced the mean of GMD and increased that of DIAS when the coefficient of variation of GMD exceeded 25%. The effect of group size on the means of GMD and DIAS was not significant, but the mean sample standard deviation and CV of GMD increased with group size. Because of the dependence between the mean and the standard deviation, and the variation observed in the standard deviations of DIAS as a function of group size, the use of DIAS as a selection criterion may reduce accuracy. Therefore, to replace GMD with DIAS, an analysis method robust enough to eliminate the heterogeneity of variance is required.
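The core simulation idea is easy to reproduce: draw a normally distributed GMD, apply a truncation point, and take the inverse. A minimal sketch with illustrative parameters (the study examined truncation points between 10 and 25% of the GMD mean):

```python
import numpy as np

# Sketch of the simulation idea: DIAS is a multiple of the inverse of the
# normally distributed GMD, so its mean and spread depend strongly on the
# coefficient of variation (CV) of GMD. Parameters are illustrative, not
# those used in the paper.
rng = np.random.default_rng(1)

def simulate_dias(mean_gmd: float, cv: float, weight_gain: float = 100.0,
                  n: int = 100_000) -> np.ndarray:
    gmd = rng.normal(mean_gmd, cv * mean_gmd, size=n)
    gmd = gmd[gmd > 0.1 * mean_gmd]   # truncation point at 10% of the GMD mean
    return weight_gain / gmd          # days to gain `weight_gain` units

low_cv = simulate_dias(1.0, 0.10)     # CV of 10%
high_cv = simulate_dias(1.0, 0.30)    # CV of 30%
print(low_cv.mean(), high_cv.mean())  # the mean of the inverse grows with the CV
```

This reproduces the qualitative finding that the distribution (and mean) of DIAS depends on the spread of GMD, which is why heterogeneity of variance matters for selection on DIAS.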
Hyder, Fahmeed; Herman, Peter; Bailey, Christopher J; Møller, Arne; Globinsky, Ronen; Fulbright, Robert K; Rothman, Douglas L; Gjedde, Albert
2016-05-01
Regionally variable rates of aerobic glycolysis in brain networks identified by resting-state functional magnetic resonance imaging (R-fMRI) imply regionally variable adenosine triphosphate (ATP) regeneration. When regional glucose utilization is not matched to oxygen delivery, affected regions have correspondingly variable rates of ATP and lactate production. We tested the extent to which aerobic glycolysis and oxidative phosphorylation power R-fMRI networks by measuring quantitative differences between the oxygen to glucose index (OGI) and the oxygen extraction fraction (OEF) as measured by positron emission tomography (PET) in normal human brain (resting awake, eyes closed). Regionally uniform and correlated OEF and OGI estimates prevailed, with network values that matched the gray matter means, regardless of size, location, and origin. The spatial agreement between oxygen delivery (OEF ≈ 0.4) and glucose oxidation (OGI ≈ 5.3) suggests that no specific regions have preferentially high aerobic glycolysis and low oxidative phosphorylation rates, with globally optimal maximum ATP turnover rates (VATP ≈ 9.4 µmol/g/min), in good agreement with 31P and 13C magnetic resonance spectroscopy measurements. These results imply that the intrinsic network activity in healthy human brain powers the entire gray matter with ubiquitously high rates of glucose oxidation. Reports of departures from normal brain-wide homogeneity of oxygen extraction fraction and oxygen to glucose index may be due to normalization artefacts from relative PET measurements. © The Author(s) 2016.
Christodorescu, Mihai; Kinder, Johannes; Jha, Somesh; Katzenbeisser, Stefan; Veith, Helmut
2005-01-01
Malware is code designed for a malicious purpose, such as obtaining root privilege on a host. A malware detector identifies malware and thus prevents it from adversely affecting a host. In order to evade detection by malware detectors, malware writers use various obfuscation techniques to transform their malware. There is strong evidence that commercial malware detectors are susceptible to these evasion tactics. In this paper, we describe the design and implementation of a malware normalizer ...
International Nuclear Information System (INIS)
Battisti, W.P.; Artymyshyn, R.P.; Murray, M.
1989-01-01
The plasticity of the beta 1- and beta 2-adrenergic receptor subtypes was examined in the interpeduncular nucleus (IPN) of the adult rat. The beta-adrenergic receptor antagonist 125I-pindolol (125I-PIN) was used in conjunction with the selective subtype antagonists ICI 118,551 and ICI 89,406 to determine the subnuclear distribution of beta 1- and beta 2-adrenergic receptors in this nucleus and to correlate the receptor distribution with the distribution of both noradrenergic afferents from the locus coeruleus (LC) and non-noradrenergic afferents from the fasciculus retroflexus (FR). The density of these binding sites was examined following lesions that decreased (LC lesions) or increased (FR lesions) the density of the noradrenergic projection in the IPN. Quantitative radioautography indicated that beta 1-labeled binding sites account for the larger percentage of binding sites in the IPN. The beta 1-binding sites are densest in those subnuclei that receive a noradrenergic projection from the LC: the central, rostral, and intermediate subnuclei. Beta 1-binding sites are also homogeneously distributed throughout the lateral subnuclei, where there is no detectable noradrenergic innervation. Beta 2-binding sites have a more restricted distribution. They are concentrated in the ventral half of the lateral subnuclei, where they account for 70% of total 125I-PIN binding sites. Beta 2-binding sites are also present along the ventral border of the IPN. Some of this labeling extends into the central and intermediate subnuclei. Bilateral lesions of the LC, which selectively remove noradrenergic innervation to the IPN, result in an increase in the beta 1-binding sites. Bilateral lesions of the FR, which remove the major cholinergic and peptidergic input to the IPN, elicit an increase in noradrenergic projections and a decrease in beta 1-binding sites.
International Nuclear Information System (INIS)
Perrow, C.
1989-01-01
The author has chosen numerous concrete examples to illustrate the hazardousness inherent in high-risk technologies. Starting with the TMI reactor accident in 1979, he shows that it is not only the nuclear energy sector that bears the risk of 'normal accidents', but also quite a number of other technologies and industrial sectors, or research fields. The author refers to the petrochemical industry, shipping, air traffic, large dams, mining activities, and genetic engineering, showing that due to the complexity of the systems and their manifold, rapidly interacting processes, accidents happen that cannot be thoroughly calculated, and hence are unavoidable.
Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan
2016-02-01
In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) displays a key role in ecological and economical terms. Ensuring stock sustainability requires the provision of crucial information, such as species spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas MPA). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provide a priori unbiased estimations of the spatial structure, global abundance and precision for autocorrelated data. However, their application to non-Gaussian data introduces difficulties in the analysis in conjunction with low robustness or unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and, (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
Directory of Open Access Journals (Sweden)
Jin H. Jo
2015-03-01
Full Text Available Distributed generation allows a variety of small, modular power-generating technologies to be combined with load management and energy storage systems to improve the quality and reliability of our electricity supply. As part of the US Environmental Protection Agency's effort to reduce CO2 emissions from existing power plants by 30% by 2030, distributed generation through solar photovoltaic systems provides a viable option for mitigating the negative impacts of centralized fossil fuel plants. This study conducted a detailed analysis using airborne LiDAR data to identify the rooftops in a town in Central Illinois that are suitable for distributed generation solar photovoltaic systems, and quantified their energy generation potential with an energy performance model. By utilizing the available roof space of the 9,718 buildings in the case study area, a total of 39.27 MW of solar photovoltaic systems can provide electrical generation of 53,061 MWh annually. The unique methodology utilized for this assessment of a town's solar potential provides an effective way to invest in a more sustainable energy future and ensure economic stability.
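As a sanity check on the reported totals, the implied capacity factor follows from simple arithmetic:

```python
# Implied capacity factor of the reported rooftop PV potential:
# 53,061 MWh/year from 39.27 MW of installed capacity.
capacity_mw = 39.27
annual_mwh = 53_061
hours_per_year = 8760

capacity_factor = annual_mwh / (capacity_mw * hours_per_year)
print(f"{capacity_factor:.1%}")  # about 15%, plausible for fixed rooftop PV
```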
Nonaka, Kouichi; Ohata, Ken; Ban, Shinichi; Ichihara, Shin; Takasugi, Rumi; Minato, Yohei; Tashima, Tomoaki; Matsuyama, Yasushi; Takita, Maiko; Matsuhashi, Nobuyuki; Neumann, Helmut
2015-12-16
Probe-based confocal laser endomicroscopy (pCLE) is capable of acquiring in vivo magnified cross-section images of the gastric mucosa. Intravenous injection of fluorescein sodium is used for confocal imaging. However, it is still under debate whether local administration of the dye to the mucosa is also effective for confocal imaging, as it is not yet clear whether topical application also reveals the intramucosal distribution of fluorescein. The objective of this study was to evaluate the intramucosal distribution of fluorescein sodium after topical application and to compare the distribution to the conventional intravenous injection used for confocal imaging. pCLE of the stomach uninfected with Helicobacter pylori was performed in a healthy male, employing intravenous administration and local mucosal application of fluorescein. The mucosa of the lower gastric body was biopsied 1 min and 5 min after intravenous administration or local mucosal application of fluorescein, and the distribution of fluorescein in the biopsy samples was examined histologically. Green fluorescence was already observed in the cytoplasm of fundic glandular cells in the biopsied deep mucosa 1 min after local mucosal application of fluorescein. It was also observed in the foveolar lumen and inter-foveolar lamina propria, although it was noted at only a few sites. In the tissue biopsied 5 min after the local mucosal application of fluorescein, green fluorescence was more frequently noted in the cytoplasm of fundic glandular cells than in that 1 min after the local mucosal application of fluorescein, although obvious green fluorescence was not identified in the foveolar lumen or inter-foveolar lamina propria. The distribution of intravenously administered fluorescein in the cytoplasm of fundic glandular cells was also clearly observed, similarly to that after local mucosal application of fluorescein. Green fluorescence was observed in more cells 5 min after intravenous administration compared
Dabiri, Shahriar; Talebi, Amin; Shahryari, Jahanbanoo; Meymandi, Manzoumeh Shamsi; Safizadeh, Hossein
2013-02-01
This study seeks to determine the relationships between the manifestation of myofibroblasts in the stromal tissue of hyperplastic pre-invasive breast lesions and invasive cancer by investigating clinicopathological data of patients, their effect on steroid receptor expression and HER2, and angiogenesis according to CD34 antigen expression. 100 cases of invasive ductal carcinoma were immunohistochemically investigated for the presence of smooth muscle actin (SMA), ER/PR, HER2, anti-CD34 antibody and microvessel count (MVC). Patients were scored in four different zones: invasive cancer, DCIS, fibrocystic disease ± ductal intraepithelial neoplasia (FCD ± DIN), and normal tissue. There was a significant difference in stromal myofibroblasts between all areas except for the stroma of DCIS and FCD ± DIN compared with normal areas (P = 0.054). There was a significant difference in MVC observed in all areas except for DCIS and FCD ± DIN (P < 0.001). We noted significant inverse correlations between MVC, HER2 expression, and the numbers of involved lymph nodes in invasive cancer and DCIS (P < 0.001). Most MVC were present in grade I, with the fewest observed in grade III cases, in the stroma of invasive cancer, DCIS and FCD ± DIN (P < 0.001). Angiogenesis can be observed before any significant myofibroblastic changes in the pre-invasive breast lesions. The elevated content of myofibroblasts in the stroma of the tumor may be a worse prognostic factor, and the steps from atypical epithelial hyperplasia to DCIS and then to invasive carcinoma do not appear to always be part of a linear progression.
Gundersen, H J; Seefeldt, T; Osterby, R
1980-01-01
The width of individual glomerular epithelial foot processes appears very different on electron micrographs. A method for obtaining distributions of the true width of foot processes from that of their apparent width on electron micrographs has been developed, based on geometric probability theory pertaining to a specific geometric model. Analyses of foot process width in humans and rats show a remarkable interindividual invariance, implying rigid control, and therefore great biological significance, of foot process width or a derivative thereof. The very low inter-individual variation of the true width, shown in the present paper, makes it possible to demonstrate slight changes in rather small groups of patients or experimental animals.
DEFF Research Database (Denmark)
Gildberg, Frederik Alkier; Bradley, Stephen K.; Fristed, Peter Billeskov
2012-01-01
Forensic psychiatry is an area of priority for the Danish Government. As the field expands, this calls for increased knowledge about mental health nursing practice, as this is part of the forensic psychiatry treatment offered. However, only sparse research exists in this area. The aim of this study was to investigate the characteristics of forensic mental health nursing staff interaction with forensic mental health inpatients and to explore how staff give meaning to these interactions. The project included 32 forensic mental health staff members, with over 307 hours of participant observations, 48 informal ... The intention is to establish a trusting relationship to form behaviour and perceptual-corrective care, which is characterized by staff's endeavours to change, halt, or support the patient's behaviour or perception in relation to staff's perception of normality. The intention is to support and teach the patient ...
DEFF Research Database (Denmark)
Madsen, Louise Sofia; Handberg, Charlotte
2018-01-01
BACKGROUND: The present study explored the reflections on cancer survivorship care of lymphoma survivors in active treatment. Lymphoma survivors have survivorship care needs, yet their participation in cancer survivorship care programs is still reported as low. OBJECTIVE: The aim of this study was to understand the reflections on cancer survivorship care of lymphoma survivors to aid the future planning of cancer survivorship care and overcome barriers to participation. METHODS: Data were generated in a hematological ward during 4 months of ethnographic fieldwork, including participant observation and 46 ... implying an influence on whether to participate in cancer survivorship care programs. Because of "pursuing normality," 8 of 9 participants opted out of cancer survivorship care programming due to prospects of "being cured" and perceptions of cancer survivorship care as "a continuation of the disease ..."
Normal modified stable processes
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler; Shephard, N.
2002-01-01
This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process ...
Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper
2018-03-01
Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidates' research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, this is more challenging, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) administered with CAT were then analyzed. The quantitative MSI analysis results were cross-validated against LC-MS/MS data from the same tissues. The consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and it has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Rao BN
2005-12-01
Full Text Available Abstract Background Despite the high incidence of cervical cancer reported from India, large-scale population-based studies on HPV prevalence and genotype distribution are very few from this region. In view of the clinical trials for HPV vaccines taking place in India, it is of utmost importance to understand the prevalence of HPV genotypes in various geographical regions of India. We investigated the genotype distribution of high-risk HPV types in squamous cell carcinomas and the prevalence of high-risk HPV in cervicovaginal samples in the southern state of Andhra Pradesh (AP), India. Methods HPV genotyping was done in cervical cancer specimens (n = 41) obtained from women attending a regional cancer hospital in Hyderabad. HPV-DNA testing was also done in cervicovaginal samples (n = 185) collected from women enrolled in the cervical cancer screening pilot study conducted in the rural community of Medchal Mandal, twenty kilometers from Hyderabad. Results High-risk HPV types were found in 87.8% (n = 36/41) of the squamous cell carcinomas using a PCR-based line blot assay. Among the HPV-positive cancers, the overall type distribution of the major high-risk HPV types was as follows: HPV 16 (66.7%), HPV 18 (19.4%), HPV 33 (5.6%), HPV 35 (5.6%), HPV 45 (5.6%), HPV 52 (2.8%), HPV 58 (2.8%), HPV 59 (2.8%) and HPV 73 (2.8%). Women participating in the community screening programme provided both a self-collected vaginal swab and a clinician-collected cervical swab for HPV DNA testing. Primary screening for high-risk HPV was performed using the Digene Hybrid Capture 2 (hc2) assay. All hc2-positive samples by any one method of collection were further analyzed using the Roche PCR-based line blot for genotype determination. The prevalence of high-risk HPV infection in this community-based screening population was 10.3% (19/185) using the clinician-collected and 7.0% (13/185) using the self-collected samples. The overall agreement between self-collected and clinician
Shi, Xuqin; Tang, Yuping; Zhu, Huaxu; Li, Weixia; Li, Zhenhao; Li, Wei; Duan, Jin-ao
2014-01-01
Astragali Radix (AR) and Angelicae Sinensis Radix (ASR) are frequently combined and used in China as an herbal pair called Danggui Buxue Decoction (DBD) for the treatment of blood deficiency syndrome, such as women's ailments. This study investigates the tissue distribution profiles of five major bioactive constituents (ferulic acid, caffeic acid, calycosin-7-O-β-glucoside, ononin and astragaloside IV) after oral administration of DBD in blood deficiency rats, and compares the difference between normal and blood deficiency rats. Blood deficiency was induced in rats by bleeding from the orbit at 5.0 mL kg(-1) every day, over an experimental period of 12 days. On the final day of the experimental period, both normal and blood deficiency rats were orally administered DBD, and tissue samples were collected at different time points. Ferulic acid, caffeic acid, calycosin-7-O-β-glucoside, ononin and astragaloside IV in different tissues were detected simultaneously by UPLC-TQ/MS, and histograms were drawn. The results showed that the overall trend was CLiver>CKidney>CHeart>CSpleen>CLung, CC-30min>CM-30min>CM-60min>CC-5min>CM-5min>CC-60min>CM-240min>CC-240min. The contents of the detected compounds in liver were higher than in other tissues in both normal and blood deficiency rats. Compared to normal rats, some of the compound contents in blood deficiency rats' tissues at different time points differed significantly (P ... distribution investigation in blood deficiency animals, which was conducted by bleeding. The results demonstrated that the five DBD components in normal and blood deficiency rats had obvious differences in some organs and time points, suggesting that the blood flow and perfusion rate of the organs were altered in blood deficiency animals. Copyright © 2013 Elsevier B.V. All rights reserved.
Ulbrich, Erika J; Nanz, Daniel; Leinhard, Olof Dahlqvist; Marcon, Magda; Fischer, Michael A
2018-01-01
To determine age- and gender-dependent whole-body adipose tissue and muscle volumes in healthy Swiss volunteers with Dixon MRI, in comparison with anthropometric and bioelectrical impedance (BIA) measurements. Fat-water-separated whole-body 3 Tesla MRI of 80 healthy volunteers (ages 20 to 62 years) with a body mass index (BMI) of 17.5 to 26.2 kg/m² (10 men, 10 women per decade). Age- and gender-dependent volumes of total adipose tissue (TAT), visceral adipose tissue (VAT), total abdominal subcutaneous adipose tissue (ASAT) and total abdominal adipose tissue (TAAT), and the total lean muscle tissue (TLMT), normalized for body height, were determined by semi-automatic segmentation and correlated with anthropometric and BIA measurements as well as lifestyle parameters. The TAT, ASAT, VAT, and TLMT indexes (TATi, ASATi, VATi, and TLMTi, respectively) (L/m² ± standard deviation) for women/men were 6.4 ± 1.8/5.3 ± 1.7, 1.6 ± 0.7/1.2 ± 0.5, 0.4 ± 0.2/0.8 ± 0.5, and 5.6 ± 0.6/7.1 ± 0.7, respectively. The TATi correlated strongly with ASATi (r > 0.93), VATi, BMI and BIA (r > 0.70), and TAATi (r > 0.96), and weakly with TLMTi for both genders (r > -0.34). The VAT was the only parameter showing an age dependency (r > 0.32). The BMI and BIA showed strong correlation with all MR-derived adipose tissue volumes. The TAT mass was estimated significantly lower from BIA than from MRI (both genders P muscle volumes might serve as normative values. The estimation of adipose tissue volumes was significantly lower from anthropometric and BIA measurements than from MRI. Magn Reson Med 79:449-458, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
Fukuoka, Tetsuo; Kobayashi, Kimiko; Yamanaka, Hiroki; Obata, Koichi; Dai, Yi; Noguchi, Koichi
2008-09-10
We compared the distribution of the alpha-subunit mRNAs of voltage-gated sodium channels Nav1.1-1.3 and Nav1.6-1.9 and a related channel, Nax, in histochemically identified neuronal subpopulations of the rat dorsal root ganglia (DRG). In the naïve DRG, the expression of Nav1.1 and Nav1.6 was restricted to A-fiber neurons, and they were preferentially expressed by TrkC neurons, suggesting that proprioceptive neurons possess these channels. Nav1.7, -1.8, and -1.9 mRNAs were more abundant in C-fiber neurons compared with A-fiber ones. Nax was evenly expressed in both populations. Although Nav1.8 and -1.9 were preferentially expressed by TrkA neurons, other alpha-subunits were expressed independently of TrkA expression. Actually, all IB4(+) neurons expressed both Nav1.8 and -1.9, and relatively limited subpopulations of IB4(+) neurons (3% and 12%, respectively) expressed Nav1.1 and/or Nav1.6. These findings provide useful information in interpreting the electrophysiological characteristics of some neuronal subpopulations of naïve DRG. After L5 spinal nerve ligation, Nav1.3 mRNA was up-regulated mainly in A-fiber neurons in the ipsilateral L5 DRG. Although previous studies demonstrated that nerve growth factor (NGF) and glial cell-derived neurotrophic factor (GDNF) reversed this up-regulation, the Nav1.3 induction was independent of either TrkA or GFRalpha1 expression, suggesting that the induction of Nav1.3 may be one of the common responses of axotomized DRG neurons without a direct relationship to NGF/GDNF supply. (c) 2008 Wiley-Liss, Inc.
Hesper, Tobias; Schleich, Christoph; Buchwald, Alexander; Hosalkar, Harish S; Antoch, Gerald; Krauspe, Rüdiger; Zilkens, Christoph; Bittersohl, Bernd
2018-01-01
Objective: To assess age-dependent and regional differences in T2* relaxation measurements in hip joint cartilage of asymptomatic volunteers at 3 T. Design: Three age cohorts (cohort 1: age 20-30 years, 15 individuals; cohort 2: age 30-40 years, 17 individuals; cohort 3: age 40-50 years, 15 individuals) were enrolled. T2* values were obtained in the central and peripheral cartilage of the acetabulum and the femoral head in 7 regions (anterior to superior and posterior). Results: T2* did not differ among age cohorts in acetabular cartilage (cohort 1: 24.65 ± 6.56 ms, cohort 2: 24.70 ± 4.83 ms, cohort 3: 25.81 ± 5.10 ms, P = 0.10) or femoral head cartilage (cohort 1: 27.08 ± 8.24 ms, cohort 2: 25.90 ± 7.82 ms, cohort 3: 26.50 ± 5.61 ms, P = 0.34). Analysis of the regional T2* distribution pattern indicates increased T2* values in the anterior, anterior-superior, superior-anterior, and posterior-superior aspects of acetabular and femoral head cartilage. For acetabular cartilage, higher values were observed in the central region (25.90 ± 4.80 ms vs. 24.21 ± 4.05 ms, P cartilage did not reveal such differences (26.62 ± 5.74 ms vs. 26.37 ± 5.89 ms, P = 0.44). Conclusions: The T2* analysis of presumably healthy hip joint cartilage does not seem to be stratified according to age in this population. Regional T2* variation throughout hip joint cartilage is apparent in this modality.
Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr
2012-01-01
Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via the efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. Determining the uncertainty of the efficiency gain arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate the uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap-based algorithm was used to simulate the probability distribution of the efficiency gain estimates, and the shortest 95% confidence interval was estimated from this distribution. The corresponding relative uncertainty was found to be as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however, its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method is explained and a mitigation strategy proposed. Copyright © 2011 Elsevier Ltd. All rights reserved.
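The "shortest 95% confidence interval" estimated from a bootstrap distribution, as described above, can be computed by scanning windows of consecutive order statistics. This is a generic sketch of the standard construction, not the authors' code; it assumes the bootstrap replicates of the efficiency gain are already available.

```python
def shortest_interval(replicates, level=0.95):
    """Shortest interval containing a fraction `level` of the replicates.

    Sorts the bootstrap replicates, then scans every window of
    k = round(level * n) consecutive order statistics and returns
    the narrowest one.
    """
    xs = sorted(replicates)
    n = len(xs)
    k = max(1, int(round(level * n)))
    start = min(range(n - k + 1), key=lambda i: xs[i + k - 1] - xs[i])
    return xs[start], xs[start + k - 1]
```

For skewed bootstrap distributions (as reported here), the shortest interval can differ noticeably from the equal-tailed percentile interval.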
Mixed normal inference on multicointegration
Boswijk, H.P.
2009-01-01
Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ² null distribution. The asymptotic distribution of the
Liu, Pei; Li, Wei; Li, Zhen-hao; Qian, Da-wei; Guo, Jian-ming; Shang, Er-xin; Su, Shu-lan; Tang, Yu-ping; Duan, Jin-ao
2014-07-03
Xiang-Fu-Si-Wu Decoction (XFSWD) has been widely used to treat primary dysmenorrhea in clinical practice for hundreds of years and has shown great efficacy. One fraction of XFSWD, an elution product obtained by macroporous adsorption resin from the aqueous extract solution with 60% ethanol (XFSWE), showed a great analgesic effect. The present study investigated the pharmacokinetic and tissue distribution profiles of four major bioactive constituents (berberine, protopine, tetrahydrocoptisine and tetrahydropalmatine) after oral administration of XFSWE in dysmenorrheal symptom rats, and compared the difference between normal and dysmenorrheal symptom rats. Estradiol benzoate and oxytocin were used to produce the dysmenorrheal symptom rat model. The experimental period was seven days. On the final day of the experimental period, both normal and dysmenorrheal symptom rats were orally administered XFSWE, and blood and tissue samples were collected at different time points. Berberine, protopine, tetrahydrocoptisine and tetrahydropalmatine in blood and tissue samples were determined by LC-MS/MS. Pharmacokinetic parameters were calculated from the plasma concentration-time data using non-compartmental methods. The differences in pharmacokinetic parameters among groups were tested by one-way analysis of variance (ANOVA). There were statistically significant differences (P ... between normal and dysmenorrheal symptom rats orally administered the same dosage of XFSWE. In the tissue distribution study, the results showed that the overall trend was C(Spleen)>C(Liver)>C(Kidney)>C(Uterus)>C(Heart)>C(Lung)>C(Ovary)>C(Brain)>C(Thymus), C(M-60 min)>C(M-120 min)>C(M-30 min)>C(C-60 min)>C(C-120 min)>C(C-30 min). The contents of protopine in liver, spleen and uterus were higher than in other tissues of dysmenorrheal symptom rats. Compared to normal rats, some of the compound contents in dysmenorrheal symptom rats' tissues at different time points had significant
A Robust Alternative to the Normal Distribution.
1982-07-07
for any purpose of the United States Government. Department of Statistics, Stanford University, Stanford, California. A Robust Alternative to the... Stanford University Technical Report No. 3. [5] Bhattacharya, S. K. (1966). A Modified Bessel Function Model in Life Testing. Metrika 10, 133-144
Khazanov, G. V.; Gallagher, D. L.; Gamayunov, K.
2007-01-01
It is well known that the effects of EMIC waves on RC ion and RB electron dynamics strongly depend on such particle/wave characteristics as the phase-space distribution function, frequency, wave-normal angle, wave energy, and the form of the wave spectral energy density. Therefore, realistic characteristics of EMIC waves should be properly determined by modeling the RC-EMIC wave evolution self-consistently. Such a self-consistent model has been progressively developed by Khazanov et al. [2002-2006]. It solves a system of two coupled kinetic equations: one equation describes the RC ion dynamics and the other describes the energy density evolution of EMIC waves. Using this model, we present the effectiveness of relativistic electron scattering and compare our results with previous work in this area of research.
Normal Pressure Hydrocephalus (NPH)
Normal pressure hydrocephalus is a brain disorder ... Normal pressure hydrocephalus occurs when excess cerebrospinal fluid ...
Group normalization for genomic data.
Ghandi, Mahmoud; Beer, Michael A
2012-01-01
Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
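One loose reading of the reference-probe idea described above — determine each normalized probe signal from a set of reference probes with similar responses — can be sketched as follows. This is an illustrative toy interpretation, not the published Group Normalization algorithm: the choice of similarity measure, the nearest-k selection, and the division by the reference mean are all assumptions made here for the sketch.

```python
import statistics

def reference_probe_normalize(signal, control, k=5):
    """Toy stand-in for reference-set normalization.

    For each probe i, find the k probes whose control-channel readings are
    closest to probe i's, and divide probe i's signal by the mean signal
    of that reference set. The real GN algorithm selects and uses
    reference probes differently.
    """
    normalized = []
    for i in range(len(signal)):
        ref = sorted(range(len(control)),
                     key=lambda j: abs(control[j] - control[i]))[:k]
        baseline = statistics.mean(signal[j] for j in ref)
        normalized.append(signal[i] / baseline if baseline else 0.0)
    return normalized
```

Because the baseline is local to each probe's reference set, this kind of scheme can absorb nonlinear probe effects that a single global correction (e.g. one quantile map) would miss — which is the motivation the abstract gives.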
Normalization: A Preprocessing Stage
Patro, S. Gopal Krishna; Sahu, Kishore Kumar
2015-01-01
Normalization is a pre-processing stage for any type of problem statement. It plays an especially important role in fields such as soft computing and cloud computing, where the range of the data must be scaled down or scaled up before it is used in further stages. There are many normalization techniques, namely Min-Max normalization, Z-score normalization and Decimal scaling normalization. By referring to these normalization techniques we are ...
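The three techniques named above can be written compactly; a minimal standard-library sketch:

```python
import math
import statistics

def min_max(xs, lo=0.0, hi=1.0):
    """Min-Max normalization: rescale xs linearly into [lo, hi]."""
    mn, mx = min(xs), max(xs)
    return [lo + (x - mn) * (hi - lo) / (mx - mn) for x in xs]

def z_score(xs):
    """Z-score normalization: zero mean, unit sample standard deviation."""
    mu, sigma = statistics.mean(xs), statistics.stdev(xs)
    return [(x - mu) / sigma for x in xs]

def decimal_scaling(xs):
    """Decimal scaling: divide by 10**j, with j the smallest integer
    such that max(|x| / 10**j) < 1."""
    j = math.ceil(math.log10(max(abs(x) for x in xs) + 1e-12))
    return [x / (10 ** j) for x in xs]
```

Min-Max is sensitive to outliers (they define the range), Z-score assumes a roughly bell-shaped spread, and decimal scaling only shifts the decimal point — which one is appropriate depends on the downstream algorithm.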
Baine, Fiona K; Peerbhai, Nabeelah; Krause, Amanda
2018-07-15
Huntington disease (HD) is a progressive neurodegenerative disease, characterised by a triad of movement disorder, emotional and behavioural disturbances and cognitive impairment. The underlying cause is an expanded CAG repeat in the huntingtin gene. For a small proportion of patients presenting with HD-like symptoms, the mutation in this gene is not identified and they are said to have a HD "phenocopy". South Africa has the highest number of recorded cases of an African-specific phenocopy, Huntington disease-like 2 (HDL2), caused by a repeat expansion in the junctophilin-3 gene. However, a significant proportion of black patients with clinical symptoms suggestive of HD still test negative for HD and HDL2. This study thus aimed to investigate five other loci associated with HD phenocopy syndromes - ATN1, ATXN2, ATXN7, TBP and C9orf72. In a sample of patients in whom HD and HDL2 had been excluded, a single expansion was identified in the ATXN2 gene, confirming a diagnosis of Spinocerebellar ataxia 2. The results indicate that common repeat expansion disorders do not contribute significantly to the HD-like phenotype in black South African patients. Importantly, allele sizing reveals unique distributions of normal repeat lengths across the associated loci in the African population studied. Copyright © 2018 Elsevier B.V. All rights reserved.
Adriaens, Antita; Polis, Ingeborgh; Waelbers, Tim; Vandermeulen, Eva; Dobbeleir, André; De Spiegeleer, Bart; Peremans, Kathelijne
2013-01-01
Functional imaging provides important insights into canine brain pathologies such as behavioral problems. Two (99m)Tc-labeled single photon emission computed tomography (SPECT) cerebral blood flow tracers, ethylcysteinate dimer (ECD) and hexamethylpropylene amine oxime (HMPAO), are commonly used in human medicine and have been used previously in dogs, but an intrasubject comparison of both tracers in dogs is lacking. Therefore, this study investigated whether regional distribution differences between the two tracers occur in dogs, as reported in humans. Eight beagles underwent two SPECT examinations, first with (99m)Tc-ECD and then with (99m)Tc-HMPAO. SPECT scanning was performed with a triple-head gamma camera equipped with ultrahigh-resolution parallel-hole collimators. Images were reconstructed using filtered backprojection with a Butterworth filter. Emission data were fitted to a template permitting semiquantification using predefined regions or volumes of interest (VOIs). For each VOI, perfusion indices were calculated by normalizing the regional counts per voxel to total brain counts per voxel. The obtained perfusion indices for each region for both tracers were compared with a paired Student's t-test. Significant (P < 0.05) regional differences were seen in the subcortical region and the cerebellum. Both tracers can be used to visualize regional cerebral blood flow in dogs; however, due to the observed regional differences, they are not entirely interchangeable. © 2013 Veterinary Radiology & Ultrasound.
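The semiquantification step described above — regional counts per voxel normalized to whole-brain counts per voxel, then compared between tracers with a paired t-test — can be sketched as follows. This is illustrative only; the variable names and the arrangement of the inputs are assumptions, not the study's code.

```python
import math
import statistics

def perfusion_indices(regional_counts, voxels):
    """PI_r = (counts_r / voxels_r) / (total counts / total voxels)."""
    whole_brain = sum(regional_counts) / sum(voxels)
    return [c / v / whole_brain for c, v in zip(regional_counts, voxels)]

def paired_t(a, b):
    """Paired Student's t statistic for matched samples a and b."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    return statistics.mean(d) / (statistics.stdev(d) / math.sqrt(n))
```

In the study, `a` and `b` would be the per-dog indices of one VOI under ECD and HMPAO respectively; the t statistic would then be referred to a t distribution with n-1 degrees of freedom for the P value.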
Directory of Open Access Journals (Sweden)
Celsemy E. Maia
2001-05-01
Full Text Available The objective of this work was to develop a statistically based methodology for determining the critical nutrient level in plant tissue from field data. Obtaining the critical level from a continuous probability distribution is a new proposal for the interpretation of foliar analysis, based on the reduced (standard) normal distribution. It requires data on productivity (P) and on Q, where Q is defined as the ratio between P and n i (Q = P/n i), and n i is the content of the nutrient whose critical level is sought. First, the productivity representing 90% of the maximum is computed by the equation P(90%) = 1.281552 s1 + x1, and the value of Q representing 90% of its maximum by the equation Q(90%) = 1.281552 s2 + x2, where x1 and s1 are the arithmetic mean and the standard deviation of P, and x2 and s2 the mean and the standard deviation of Q. The critical level is then obtained by NC i = (1.281552 s1 + x1)/(1.281552 s2 + x2). For the coffee crop, the foliar critical levels determined by the continuous-probability-distribution methodology fell within the reference range recommended in the literature.
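The recipe above reduces to two sample moments and the standard-normal 90th-percentile multiplier 1.281552; a minimal sketch, assuming clean field data for P and the nutrient contents (handling of missing or outlying observations is omitted):

```python
import statistics

Z90 = 1.281552  # standard normal quantile for the 90th percentile

def critical_level(p, nutrient):
    """NC = P(90%) / Q(90%), with Q_i = P_i / n_i as defined above."""
    q = [pi / ni for pi, ni in zip(p, nutrient)]
    p90 = statistics.mean(p) + Z90 * statistics.stdev(p)
    q90 = statistics.mean(q) + Z90 * statistics.stdev(q)
    return p90 / q90
```

The 1.281552 constant is simply the point where the standard normal CDF equals 0.90, so P(90%) and Q(90%) are the values exceeded by 10% of a normally distributed population with the sample's mean and standard deviation.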
International Nuclear Information System (INIS)
Pagani, M.; Salmaso, D.; Jonsson, C.; Hatherly, R.; Larsson, S.A.; Jacobsson, H.; Waegner, A.
2002-01-01
The increasing implementation of standardisation techniques in brain research and clinical diagnosis has highlighted the importance of reliable baseline data from normal control subjects for inter-subject analysis. In this context, knowledge of the regional cerebral blood flow (rCBF) distribution in normal ageing is a factor of the utmost importance. In the present study, rCBF was investigated in 50 healthy volunteers (25 men, 25 women), aged 31-78 years, who were examined at rest by means of single-photon emission tomography (SPET) using technetium-99m d,l-hexamethylpropylene amine oxime (HMPAO). After normalising the CBF data, 27 left and 27 right volumes of interest (VOIs) were selected and automatically outlined by standardisation software (computerised brain atlas). The heavy load of flow data thus obtained was reduced in number and grouped in factors by means of principal component analysis (PCA). PCA extracted 12 components explaining 81% of the variance and including the vast majority of cortical and subcortical regions. Analysis of variance and regression analyses were performed for rCBF, age and gender before PCA was applied and subsequently for each single extracted factor. There was a significantly higher CBF on the right side than on the left side (P<0.001). In the overall analysis, a significant decrease was found in CBF (P=0.05) with increasing age, and this decrease was particularly evident in the left hemisphere (P=0.006). When gender was specifically analysed, CBF was found to decrease significantly with increasing age in females (P=0.037) but not in males. Furthermore, a significant decrease in rCBF with increasing age was found in the brain vertex (P=0.05), left frontotemporal cortex (P=0.012) and temporocingulate cortex (P=0.003). By contrast, relative rCBF in central structures increased with age (P=0.001). The ability of standardisation software and PCA to identify functionally connected brain regions might contribute to a better
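The data-reduction step described above — PCA extracting the number of components needed to explain a target share (here 81%) of the variance — can be sketched with an eigen-decomposition of the covariance matrix. Illustrative only; the study used dedicated software, and the row/column layout assumed here (rows = subjects, columns = VOI flow values) is an assumption.

```python
import numpy as np

def n_components_for_variance(X, target=0.81):
    """Smallest number of principal components whose cumulative
    explained-variance fraction reaches `target`."""
    Xc = X - X.mean(axis=0)                  # center each column (VOI)
    eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]
    eigvals = np.clip(eigvals, 0.0, None)    # guard against round-off negatives
    frac = np.cumsum(eigvals) / eigvals.sum()
    return int(np.searchsorted(frac, target) + 1)
```

The extracted components group VOIs whose flow values co-vary across subjects, which is how PCA can suggest functionally connected regions.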
Energy Technology Data Exchange (ETDEWEB)
Pagani, M. [Inst. of Neurobiology and Molecular Medicine, CNR, Rome (Italy); Dept. of Hospital Physics, Karolinska Hospital, Stockholm (Sweden); Salmaso, D. [Inst. of Psychology, CNR, Rome (Italy); Jonsson, C.; Hatherly, R.; Larsson, S.A. [Dept. of Hospital Physics, Karolinska Hospital, Stockholm (Sweden); Jacobsson, H. [Dept. of Diagnostic Radiology, Karolinska Hospital, Stockholm (Sweden); Waegner, A. [Dept. of Clinical Neuroscience, Karolinska Institutet, Stockholm (Sweden); Dept. of Clinical Neuroscience, Karolinska Hospital, Stockholm (Sweden)
2002-01-01
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
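A normal probability plot pairs the ordered sample with standard-normal quantiles at plotting positions; the simultaneous intervals discussed above would then be drawn around these points. A bare-bones sketch of the point construction — the Blom plotting positions used here are one common convention, not necessarily the paper's choice:

```python
from statistics import NormalDist

def normal_plot_points(sample):
    """(theoretical quantile, ordered observation) pairs for a normal
    probability plot, using Blom plotting positions (i - 3/8)/(n + 1/4)."""
    xs = sorted(sample)
    n = len(xs)
    z = NormalDist()
    return [(z.inv_cdf((i - 0.375) / (n + 0.25)), x)
            for i, x in enumerate(xs, start=1)]
```

If the sample is normal, these pairs fall near a straight line whose slope and intercept estimate the population standard deviation and mean.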
Normalized modes at selected points without normalization
Kausel, Eduardo
2018-04-01
As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem |K - λM| = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well known is the fact that those eigenvectors can be normalized so that their modal mass μ = φᵀMφ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but are actually intrinsic properties of the pair of matrices K, M; that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but it has in turn interesting theoretical implications.
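The conventional route that this abstract contrasts with — solve the generalized eigenproblem, then scale each mode by the square root of its modal mass — looks like this (a standard textbook construction via the Cholesky factor of M, not the residue-theorem shortcut the paper proposes):

```python
import numpy as np

def mass_normalized_modes(K, M):
    """Solve K φ = λ M φ and return modes scaled so φᵀ M φ = 1.

    Reduces the generalized problem to a standard symmetric one via the
    Cholesky factorization M = L Lᵀ; the back-transformed eigenvectors
    come out automatically mass-normalized because eigh returns
    orthonormal columns.
    """
    L = np.linalg.cholesky(M)
    Linv = np.linalg.inv(L)
    A = Linv @ K @ Linv.T          # symmetric standard-form matrix
    lam, Y = np.linalg.eigh(A)     # Y has orthonormal columns
    Phi = Linv.T @ Y               # columns satisfy Φᵀ M Φ = I
    return lam, Phi
```

Note that this route necessarily computes every complete mode — which is exactly the cost the paper's residue-based result avoids when only individual components of normalized modes are needed.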
International Nuclear Information System (INIS)
Poujeol, P.
1972-06-01
This academic work reports the simultaneous study, on the same kidney, of the distribution of glomerular filtrations and the distribution of blood flow rate in the renal cortex. The author combined the technique of perfusion of sodium [14C]ferrocyanide, which allows the measurement of individual glomerular filtrations, with a technique based on the use of microspheres, which allows the assessment of blood flow distribution in the glomeruli of different nephron classes. Experiments were performed on a normal rat and on a rat submitted to a chronic NaCl overload [fr]
International Nuclear Information System (INIS)
Weissman, S.D.
1989-01-01
The foot may be thought of as a bag of bones tied tightly together and functioning as a unit. The bones are expected to maintain their alignment without causing symptoms for the patient. The author discusses the normal radiograph: the bones must have normal shape and normal alignment, the density of the soft tissues should be normal, and there should be no fractures, tumors, or foreign bodies.
The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data
Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark;
2012-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
Probability distribution relationships
Directory of Open Access Journals (Sweden)
Yousry Abdelkader
2013-05-01
Full Text Available In this paper, we are interested in showing the most famous distributions and their relations to other distributions in a set of collected diagrams. Four diagrams are sketched as networks. The first is concerned with the continuous distributions and their relations. The second presents the discrete distributions. The third depicts the famous limiting distributions. Finally, the Balakrishnan skew-normal density and its relationship with the other distributions are shown in the fourth diagram.
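One of the limiting relationships such diagrams encode, the DeMoivre-Laplace normal approximation to the binomial, can be checked directly. The parameters n and p below are illustrative choices, not from the paper:

```python
import math

# DeMoivre-Laplace: for large n, Binomial(n, p) is well approximated by
# a Normal with mean n*p and variance n*p*(1-p).
n, p = 100, 0.4
mu = n * p
sigma = math.sqrt(n * p * (1.0 - p))

def binom_pmf(k):
    # Exact binomial probability mass at k.
    return math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)

def normal_pdf(x):
    # Matching normal density, evaluated at the same point.
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))

# Near the mean the two curves agree to a few parts in a thousand.
for k in (35, 40, 45):
    print(k, binom_pmf(k), normal_pdf(k))
```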
Quantum arrival times and operator normalization
International Nuclear Information System (INIS)
Hegerfeldt, Gerhard C.; Seidel, Dirk; Gonzalo Muga, J.
2003-01-01
A recent approach to arrival times used the fluorescence of an atom entering a laser-illuminated region; the resulting arrival-time distribution was close to the axiomatic distribution of Kijowski, but not exactly equal to it, neither in limiting cases nor after compensation of reflection losses by normalization at the level of expectation values. In this paper we employ normalization at the level of operators, recently proposed in a slightly different context. We show that in this case the axiomatic arrival-time distribution of Kijowski is recovered as a limiting case. In addition, it is shown that Allcock's complex potential model is also a limit of the physically motivated fluorescence approach and is connected to Kijowski's distribution through operator normalization.
I'm breast-feeding my newborn and her bowel movements are yellow and mushy. Is this normal for baby poop? Answer from Jay L. Hoecker, M.D.: Yellow, mushy bowel movements are perfectly normal for breast-fed babies. Still, …
Visual Memories Bypass Normalization.
Bloem, Ilona M; Watanabe, Yurika L; Kibbe, Melissa M; Ling, Sam
2018-05-01
How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores: neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.
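Divisive normalization, the canonical computation this study probes, can be sketched in a few lines. The exponent, semi-saturation constant, and drive values below are illustrative assumptions, not the study's fitted parameters:

```python
def divisive_normalization(drives, sigma=0.5, n=2.0):
    """Each unit's driven input, raised to the power n, is divided by a
    pool term: sigma**n plus the summed n-th powers of all drives.
    Parameter values are illustrative, not fits to the study's data."""
    pool = sigma ** n + sum(d ** n for d in drives)
    return [d ** n / pool for d in drives]

# Hallmark of normalization: adding a second, higher-contrast stimulus
# suppresses the response to the first one.
alone = divisive_normalization([0.2, 0.0])[0]
paired = divisive_normalization([0.2, 0.8])[0]
print(alone > paired)  # True
```

The study's finding is that perception shows this suppressive interaction while working memory representations do not.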
International Nuclear Information System (INIS)
Takahashi, Nobukazu; Mitani, Isao; Sumita, Shinichi
1998-01-01
Visual interpretation of iodine-123-beta-15-(p-iodophenyl)-3(R,S)-methyl-pentadecanoic acid (¹²³I-BMIPP) myocardial images cannot easily detect mild reduction in tracer uptake. Objective assessment of myocardial ¹²³I-BMIPP maldistributions at rest was attempted using a bull's-eye map and its normal data file for detecting myocardial damage in patients with mitochondrial encephalomyopathy. Six patients, two with Kearns-Sayre syndrome and four with mitochondrial myopathy, encephalopathy, lactic acidosis, and strokelike episodes (MELAS), and 10 normal subjects were studied. Fractional myocardial uptake of ¹²³I-BMIPP was also measured by dynamic static imaging to assess the global myocardial free fatty acid uptake. These data were compared with the cardiothoracic ratio measured by chest radiography and the left ventricular ejection fraction assessed by echocardiography. An abnormal cardiothoracic ratio and lower ejection fraction were detected in only one patient with Kearns-Sayre syndrome. Abnormal fractional myocardial uptake was detected in two patients (1.61%, 1.91%), whereas abnormal regional ¹²³I-BMIPP uptake assessed by the bull's-eye map was detected in five patients (83%). All patients showed abnormal uptake in the anterior portion, and one showed progressive atrioventricular conduction abnormality and systolic dysfunction with extended ¹²³I-BMIPP abnormal uptake. The results suggest that assessment based on the normal data file in a bull's-eye polar map is clinically useful for detection of myocardial damage in patients with mitochondrial encephalomyopathy. (author)
International Nuclear Information System (INIS)
Haehlen, Peter; Elmiger, Bruno
2000-01-01
The mechanics of the Swiss NPPs' 'come and see' programme 1995-1999 were illustrated in our contributions to all PIME workshops since 1996. Now, after four annual 'waves', the whole country has been covered by the NPPs' invitation to dialogue. This makes PIME 2000 the right time to shed some light on one particular objective of this initiative: making nuclear 'normal'. The principal aim of the 'come and see' programme, namely to give the Swiss NPPs 'a voice of their own' by the end of the nuclear moratorium 1990-2000, has clearly been attained and was commented on during earlier PIMEs. It is, however, equally important that Swiss nuclear energy not only made progress in terms of public 'presence', but also in terms of being perceived as a normal part of industry, as a normal branch of the economy. The message that Swiss nuclear energy is nothing but a normal business involving normal people was stressed by several components of the multi-prong campaign: - The speakers in the TV ads were real - 'normal' - visitors' guides and not actors; - The testimonials in the print ads were all real NPP visitors - 'normal' people - and not models; - The mailings inviting a very large number of associations to 'come and see' activated a typical channel of 'normal' Swiss social life; - Spending money on ads (a new activity for Swiss NPPs) appears to have resulted in being perceived by the media as a normal branch of the economy. Today we feel that the 'normality' message has been well received by the media. In the controversy over antinuclear arguments brought forward by environmental organisations, journalists nowadays as a rule give nuclear energy a voice - a normal right to be heard. As in a 'normal' controversy, the media again actively ask themselves questions about specific antinuclear claims, much more than before 1990 when the moratorium started. The result is that in many cases such arguments are discarded by journalists because they are, e.g., found to be
... improves the chance of a good recovery. Without treatment, symptoms may worsen and cause death. What research is being done? The NINDS conducts and supports research on neurological disorders, including normal pressure hydrocephalus. Research on disorders such ...
Normality in Analytical Psychology
Myers, Steve
2013-01-01
Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262
Alternative names: Hydrocephalus - occult; Hydrocephalus - idiopathic; Hydrocephalus - adult; Hydrocephalus - communicating; Dementia - hydrocephalus; NPH. References: Ferri FF. Normal pressure hydrocephalus. In: Ferri FF, ed. … Elsevier; 2016: chap 648. Rosenberg GA. Brain edema and disorders …
Normal Functioning Family. Is there any way …
Normal growth and development (medlineplus.gov/ency/article/002456.htm). A child's growth and development can be divided into four periods: …
Runckel, Jack F.; Hieser, Gerald
1961-01-01
An investigation has been conducted at the Langley 16-foot transonic tunnel to determine the loading characteristics of flap-type ailerons located at inboard, midspan, and outboard positions on a 45 deg. sweptback-wing-body combination. Aileron normal-force and hinge-moment data have been obtained at Mach numbers from 0.80 to 1.03, at angles of attack up to about 27 deg., and at aileron deflections between approximately -15 deg. and 15 deg. Results of the investigation indicate that the loading over the ailerons was established by the wing-flow characteristics, and the loading shapes were irregular in the transonic speed range. The spanwise location of the aileron had little effect on the slope of the curves of hinge-moment coefficient against aileron deflection, but the inboard aileron had the greatest slope of the curves of hinge-moment coefficient against angle of attack and the outboard aileron the least. Hinge-moment and aileron normal-force data taken with strain-gage instrumentation are compared with data obtained from pressure measurements.
Adamson, T. C., Jr.; Liou, M. S.; Messiter, A. F.
1980-01-01
An asymptotic description is derived for the interaction between a shock wave and a turbulent boundary layer in transonic flow, for a particular limiting case. The dimensionless difference between the external flow velocity and critical sound speed is taken to be much smaller than one, but large in comparison with the dimensionless friction velocity. The basic results are derived for a flat plate, and corrections for longitudinal wall curvature and for flow in a circular pipe are also shown. Solutions are given for the wall pressure distribution and the shape of the shock wave. Solutions for the wall shear stress are obtained, and a criterion for incipient separation is derived. Simplified solutions for both the wall pressure and skin friction distributions in the interaction region are given. These results are presented in a form suitable for use in computer programs.
DEFF Research Database (Denmark)
Nissen, Nina Konstantin; Holm, Lotte; Baarts, Charlotte
2015-01-01
… provides us with knowledge about how to prevent future overweight or obesity. This paper investigates body size ideals and monitoring practices among normal-weight and moderately overweight people. Methods: The study is based on in-depth interviews combined with observations. 24 participants were recruited by strategic sampling based on self-reported BMI 18.5-29.9 kg/m2 and socio-demographic factors. Inductive analysis was conducted. Results: Normal-weight and moderately overweight people have clear ideals for their body size. Despite being normal weight or close to this, they construct a variety of practices for monitoring their bodies based on different kinds of calculations of weight and body size, observations of body shape, and measurements of bodily firmness. Biometric measurements are familiar to them, as are health authorities' recommendations. Despite not belonging to an extreme BMI category …
Normalization of satellite imagery
Kim, Hongsuk H.; Elman, Gregory C.
1990-01-01
Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance, and the tonal signature of multi-band color imagery can be directly interpreted for quantitative information about the target.
International Nuclear Information System (INIS)
Olmos, C.
1990-05-01
The restricted holonomy group of a Riemannian manifold is a compact Lie group and its representation on the tangent space is a product of irreducible representations and a trivial one. Each of the non-trivial factors is either an orthogonal representation of a connected compact Lie group which acts transitively on the unit sphere, or the isotropy representation of a single Riemannian symmetric space of rank ≥ 2. We prove that all these properties are also true for the representation on the normal space of the restricted normal holonomy group of any submanifold of a space of constant curvature. 4 refs
Research on Normal Human Plantar Pressure Test
Directory of Open Access Journals (Sweden)
Liu Xi Yang
2016-01-01
Full Text Available An FSR400 pressure sensor, an nRF905 wireless transceiver, and an MSP40 SCM are used to design an insole pressure-collection system, and LabVIEW is used to build the HMI for data acquisition. A quantity of normal human foot pressure data was collected, and the pressure distribution relations across the five stages of the swing phase during walking were statistically analyzed. The grid closeness degree is used for plantar pressure distribution pattern recognition, and simulation of the algorithm together with experimental results demonstrates that this method is feasible.
Normality in Analytical Psychology
Directory of Open Access Journals (Sweden)
Steve Myers
2013-11-01
Full Text Available Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity.
DEFF Research Database (Denmark)
Møldrup, Claus; Traulsen, Janine Morgall; Almarsdóttir, Anna Birna
2003-01-01
Objective: To consider public perspectives on the use of medicines for non-medical purposes, a usage called medically-enhanced normality (MEN). Method: Examples from the literature were combined with empirical data derived from two Danish research projects: a Delphi internet study and a Telebus...
Kivilevitch, Zvi; Achiron, Reuven; Perlman, Sharon; Gilboa, Yinon
2017-10-01
The aim of the study was to assess the sonographic feasibility of measuring the fetal pancreas and its normal development throughout pregnancy. We conducted a cross-sectional prospective study between 19 and 36 weeks' gestation. The study included singleton pregnancies with normal pregnancy follow-up. The pancreas circumference was measured. The first 90 cases were tested to assess feasibility. Two hundred ninety-seven fetuses of nondiabetic mothers were recruited during a 3-year period. The overall satisfactory visualization rate was 61.6%. The intraobserver and interobserver variability had high intraclass correlation coefficients of 0.964 and 0.967, respectively. A cubic polynomial regression best described the correlation of pancreas circumference with gestational age (r = 0.744). Pancreas circumference percentiles for each week of gestation were calculated. During the study period, we detected 2 cases with overgrowth syndrome and 1 case with an annular pancreas. In this study, we assessed the feasibility of sonography for measuring the fetal pancreas and established a normal reference range for the fetal pancreas circumference throughout pregnancy. This database can be helpful when investigating fetomaternal disorders that can involve its normal development. © 2017 by the American Institute of Ultrasound in Medicine.
Idiopathic Normal Pressure Hydrocephalus
Directory of Open Access Journals (Sweden)
Basant R. Nassar BS
2016-04-01
Full Text Available Idiopathic normal pressure hydrocephalus (iNPH) is a potentially reversible neurodegenerative disease commonly characterized by a triad of dementia, gait disturbance, and urinary disturbance. Advancements in diagnosis and treatment have aided in properly identifying and improving symptoms in patients. However, a large proportion of iNPH patients remain either undiagnosed or misdiagnosed. Using the PubMed search engine with the keywords “normal pressure hydrocephalus,” “diagnosis,” “shunt treatment,” “biomarkers,” “gait disturbances,” “cognitive function,” “neuropsychology,” “imaging,” and “pathogenesis,” articles were obtained for this review. The majority of the articles were retrieved from the past 10 years. The purpose of this review article is to aid general practitioners in further understanding current findings on the pathogenesis, diagnosis, and treatment of iNPH.
DEFF Research Database (Denmark)
Ipsen, David Hojland; Tveden-Nyborg, Pernille; Lykkesfeldt, Jens
2016-01-01
Objective: The liver coordinates lipid metabolism and may play a vital role in the development of dyslipidemia, even in the absence of obesity. Normal weight dyslipidemia (NWD) and patients with nonalcoholic fatty liver disease (NAFLD) who do not have obesity constitute a unique subset of individuals characterized by dyslipidemia and metabolic deterioration. This review examined the available literature on the role of the liver in dyslipidemia and the metabolic characteristics of patients with NAFLD who do not have obesity. Methods: PubMed was searched using the following keywords: nonobese, dyslipidemia, NAFLD, NWD, liver, and metabolically obese/unhealthy normal weight. Additionally, article bibliographies were screened, and relevant citations were retrieved. Studies were excluded if they had not measured relevant biomarkers of dyslipidemia. Results: NWD and NAFLD without obesity share a similar …
Lyerly, Anne Drapkin
2012-12-01
The concept of "normal birth" has been promoted as ideal by several international organizations, although debate about its meaning is ongoing. In this article, I examine the concept of normalcy to explore its ethical implications and raise a trio of concerns. First, in its emphasis on nonuse of technology as a goal, the concept of normalcy may marginalize women for whom medical intervention is necessary or beneficial. Second, in its emphasis on birth as a socially meaningful event, the mantra of normalcy may unintentionally divert attention from meaning in medically complicated births. Third, the emphasis on birth as a normal and healthy event may be a contributor to the long-standing tolerance for the dearth of evidence guiding the treatment of illness during pregnancy and the failure to responsibly and productively engage pregnant women in health research. Given these concerns, it is worth debating not just what "normal birth" means, but whether the term as an ideal earns its keep. © 2012, Copyright the Authors Journal compilation © 2012, Wiley Periodicals, Inc.
Aerosol lung inhalation scintigraphy in normal subjects
Energy Technology Data Exchange (ETDEWEB)
Sui, Osamu; Shimazu, Hideki
1985-03-01
We previously reported basic and clinical evaluation of aerosol lung inhalation scintigraphy with 99mTc-millimicrosphere albumin (milli MISA) and concluded that aerosol inhalation scintigraphy with 99mTc-milli MISA was useful for routine examination. But central airway deposition of aerosol particles was found not only in patients with chronic obstructive pulmonary disease (COPD) but also in normal subjects. So we performed aerosol inhalation scintigraphy in normal subjects and evaluated their scintigrams. The subjects had normal values of FEV1.0% (more than 70%) in lung function tests, no abnormal findings on chest X-ray films, and no symptoms or signs. The findings of their aerosol inhalation scintigrams were classified into 3 patterns; type I: homogeneous distribution without central airway deposit, type II: homogeneous distribution with central airway deposit, type III: inhomogeneous distribution. These patterns were compared with lung function tests. There was no significant difference between type I and type II in lung function tests. Type III differed from types I and II in its inhomogeneous distribution. This finding showed no correlation with %VC, FEV1.0%, MMF, V̇50 and V̇50/V̇25, but good correlation with V̇25 in a maximum forced expiratory flow-volume curve. The flow-volume curve is one of the sensitive methods for early detection of COPD, so the inhomogeneous distribution of type III is considered to be due to small airway dysfunction.
International Nuclear Information System (INIS)
Wagatsuma, Kazuaki
2015-01-01
This paper describes several interesting excitation phenomena occurring in a microwave-induced plasma (MIP) excited with Okamoto-cavity, especially when a small amount of oxygen was mixed with nitrogen matrix in the composition of the plasma gas. An ion-to-atom ratio of iron, which was estimated from the intensity ratio of ion to atomic lines having almost the same excitation energy, was reduced by adding oxygen gas to the nitrogen MIP, eventually contributing to an enhancement in the emission intensities of the atomic lines. Furthermore, Boltzmann plots for iron atomic lines were observed in a wide range of the excitation energy from 3.4 to 6.9 eV, indicating that plots of the atomic lines having lower excitation energies (3.4 to 4.8 eV) were well fitted on a straight line while those having more than 5.5 eV deviated upwards from the linear relationship. This overpopulation would result from any other excitation process in addition to the thermal excitation that principally determines the Boltzmann distribution. A Penning-type collision with excited species of nitrogen molecules probably explains this additional excitation mechanism, in which the resulting iron ions recombine with captured electrons, followed by cascade de-excitations between closely-spaced excited levels just below the ionization limit. As a result, these high-lying levels might be more populated than the low-lying levels of iron atom. The ionization of iron would be caused less actively in the nitrogen–oxygen plasma than in a pure nitrogen plasma, because excited species of nitrogen molecule, which can provide the ionization energy in a collision with iron atom, are consumed through collisions with oxygen molecules to cause their dissociation. It was also observed that the overpopulation occurred to a lesser extent when oxygen gas was added to the nitrogen plasma. The reason for this was also attributed to decreased number density of the excited nitrogen species due to collisions with oxygen
Normal zone soliton in large composite superconductors
International Nuclear Information System (INIS)
Kupferman, R.; Mints, R.G.; Ben-Jacob, E.
1992-01-01
The study of normal zones of finite size (normal domains) in superconductors has continuously been a subject of interest in the field of applied superconductivity. It was shown that in homogeneous superconductors normal domains are always unstable, so that if a normal domain nucleates, it will either expand or shrink. While testing the stability of large cryostable composite superconductors, a new phenomenon was found: the existence of stable propagating normal solitons. The formation of these propagating domains was shown to be a result of the high Joule power generated in the superconductor during the relatively long process of current redistribution between the superconductor and the stabilizer. Theoretical studies were performed to investigate the propagation of normal domains in large composite superconductors in the cryostable regime. Huang and Eyssa performed numerical calculations simulating the diffusion of heat and current redistribution in the conductor, and showed the existence of stable propagating normal domains. They compared the velocity of normal domain propagation with the experimental data, obtaining reasonable agreement. Dresner presented an analytical method to solve this problem if the time dependence of the Joule power is given. He performed explicit calculations of normal domain velocity assuming that the Joule power decays exponentially during the process of current redistribution. In this paper, the authors propose a system of two one-dimensional diffusion equations describing the dynamics of the temperature and the current density distributions along the conductor. Numerical simulations of the equations reconfirm the existence of propagating domains in the cryostable regime, while an analytical investigation supplies an explicit formula for the velocity of the normal domain.
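A minimal sketch of the kind of model described, two coupled one-dimensional diffusion equations with Joule heating confined to the normal zone, is given below. The constants, source terms, and initial profiles are all illustrative assumptions, not the authors' actual equations:

```python
import numpy as np

# Illustrative explicit finite-difference sketch of two coupled 1-D diffusion
# equations for temperature T and current density J along a conductor.
nx, dx, dt = 200, 0.01, 1e-5        # grid and (stable) explicit time step
D_T, D_J, Tc, q = 1.0, 0.5, 0.5, 50.0

T = np.zeros(nx)
J = np.ones(nx)
T[90:110] = 2.0   # seed a normal (resistive) zone in the middle
J[90:110] = 0.2   # current partially expelled from the normal zone

def laplacian(u):
    out = np.zeros_like(u)
    out[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return out

for _ in range(2000):
    joule = q * J**2 * (T > Tc)          # Joule heating in the normal zone only
    T = T + dt * (D_T * laplacian(T) + joule)
    J = J + dt * D_J * laplacian(J)      # current redistributes diffusively

# The normal zone has spread beyond its initial 20-cell seed.
print((T > Tc).sum())
```

The explicit time step is chosen to satisfy dt ≤ dx²/(2·D_T); the paper's own analysis instead derives the domain-wall velocity analytically.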
HIROSE,Hideo
1998-01-01
TYPES OF THE DISTRIBUTION: Normal distribution (2-parameter); Uniform distribution (2-parameter); Exponential distribution (2-parameter); Weibull distribution (2-parameter); Gumbel distribution (2-parameter); Weibull/Frechet distribution (3-parameter); Generalized extreme-value distribution (3-parameter); Gamma distribution (3-parameter); Extended gamma distribution (3-parameter); Log-normal distribution (3-parameter); Extended log-normal distribution (3-parameter); Generalized …
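For some of the 2-parameter families in this list, the maximum-likelihood fit is available in closed form. A minimal sketch for the normal and log-normal cases follows; the synthetic sample and its parameters are illustrative:

```python
import math
import random

# Closed-form maximum-likelihood estimators for two of the listed families.
def fit_normal(xs):
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, math.sqrt(var)

def fit_lognormal(xs):
    # The log-normal MLE is the normal MLE applied to log(x).
    return fit_normal([math.log(x) for x in xs])

random.seed(0)
sample = [random.gauss(5.0, 2.0) for _ in range(10000)]
mu_hat, sigma_hat = fit_normal(sample)          # close to (5.0, 2.0)

# Exponentiating a normal sample yields a log-normal sample with the
# same log-scale parameters, which fit_lognormal recovers.
log_sample = [math.exp(x) for x in sample]
lmu_hat, lsigma_hat = fit_lognormal(log_sample)
```

The 3-parameter families in the list (Weibull/Frechet, generalized extreme-value, gamma) have no such closed forms and require iterative optimization.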
International Nuclear Information System (INIS)
Mahan, G.D.
1992-01-01
The organizers requested that I give eight lectures on the theory of normal metals, ''with an eye on superconductivity.'' My job was to cover the general properties of metals. The topics were selected according to what the students would need to know for the following lectures on superconductivity. My role was to prepare the groundwork for the later lectures. The problem is that there is not yet a widely accepted theory for the mechanism which pairs the electrons. Many mechanisms have been proposed, with those of phonons and spin fluctuations having the most followers. So I tried to discuss both topics. I also introduced the tight-binding model for metals, which forms the basis for most of the work on the cuprate superconductors.
Normal vibrations in gallium arsenide
International Nuclear Information System (INIS)
Dolling, G.; Waugh, J.L.T.
1964-01-01
The triple axis crystal spectrometer at Chalk River has been used to observe coherent slow neutron scattering from a single crystal of pure gallium arsenide at 296 °K. The frequencies of normal modes of vibration propagating in the [ζ00], [ζζζ], and [0ζζ] crystal directions have been determined with a precision of between 1 and 2.5 per cent. A limited number of normal modes have also been studied at 95 and 184 °K. Considerable difficulty was experienced in obtaining well resolved neutron peaks corresponding to the two non-degenerate optic modes for very small wave-vector, particularly at 296 °K. However, from a comparison of results obtained under various experimental conditions at several different points in reciprocal space, frequencies (units 10¹² c/s) for these modes (at 296 °K) have been assigned: T 8.02 ± 0.08 and L 8.55 ± 0.2. Other specific normal modes, with their measured frequencies, are (a) (1,0,0): TO 7.56 ± 0.08, TA 2.36 ± 0.015, LO 7.22 ± 0.15, LA 6.80 ± 0.06; (b) (0.5, 0.5, 0.5): TO 7.84 ± 0.12, TA 1.86 ± 0.02, LO 7.15 ± 0.07, LA 6.26 ± 0.10; (c) (0, 0.65, 0.65): optic 8.08 ± 0.13, 7.54 ± 0.12 and 6.57 ± 0.11, acoustic 5.58 ± 0.08, 3.42 ± 0.06 and 2.36 ± 0.04. These results are generally slightly lower than the corresponding frequencies for germanium. An analysis in terms of various modifications of the dipole approximation model has been carried out. A feature of this analysis is that the charge on the gallium atom appears to be very small, about +0.04 e. The frequency distribution function has been derived from one of the force models. (author)
Individual loss reserving with the Multivariate Skew Normal distribution
Pigeon, M.; Antonio, K.; Denuit, M.
2012-01-01
The evaluation of future cash flows and solvency capital recently gained importance in general insurance. To assist in this process, our paper proposes a novel loss reserving model, designed for individual claims in discrete time. We model the occurrence of claims, as well as their reporting delay,
A Policy Representation Using Weighted Multiple Normal Distribution
Kimura, Hajime; Aramaki, Takeshi; Kobayashi, Shigenobu
In this paper, we attempt to solve a reinforcement learning problem for a 5-linked ring robot in real time, so that the real robot can learn to stand up through trial and error. On this robot, incomplete perception problems are caused by noisy sensors and cheap position-control motor systems. This incomplete perception also causes the optimum actions to vary with the progress of the learning. To cope with this problem, we adopt an actor-critic method, and we propose a new hierarchical policy representation scheme that consists of discrete action selection on the top level and continuous action selection on the low level of the hierarchy. The proposed hierarchical scheme accelerates learning on a continuous action space, and it can pursue the optimum actions varying with the progress of learning on our robotics problem. This paper compares and discusses several learning algorithms through simulations, and demonstrates the proposed method through application to the real robot.
X-ray emission from normal galaxies
International Nuclear Information System (INIS)
Speybroeck, L. van; Bechtold, J.
1981-01-01
A summary of results obtained with the Einstein Observatory is presented. There are two general categories of normal galaxy investigation being pursued - detailed studies of nearby galaxies where individual sources can be detected and possibly correlated with galactic morphology, and shorter observations of many more distant objects to determine the total luminosity distribution of normal galaxies. The principal examples of the first type are the CFA study of M31 and the Columbia study of the Large Magellanic Cloud. The Columbia normal galaxy survey is the principal example of the second type, although there also are smaller CFA programs concentrating on early galaxies and peculiar galaxies, and MIT has observed some members of the local group. (Auth.)
The self-normalized Donsker theorem revisited
Parczewski, Peter
2016-01-01
We extend the Poincaré–Borel lemma to a weak approximation of a Brownian motion via simple functionals of uniform distributions on n-spheres in the Skorokhod space $D([0,1])$. This approach is used to simplify the proof of the self-normalized Donsker theorem in Csörgő et al. (2003). Some notes on spheres with respect to $\ell_p$-norms are given.
Normal human bone marrow and its variations in MRI
International Nuclear Information System (INIS)
Vahlensieck, M.; Schmidt, H.M.
2000-01-01
The physiology and age-dependent changes of human bone marrow are described. The resulting normal distribution patterns of active and inactive bone marrow, including the various contrasts on different MR sequences, are discussed. (orig.) [de
Short proofs of strong normalization
Wojdyga, Aleksander
2008-01-01
This paper presents simple, syntactic strong normalization proofs for the simply-typed lambda-calculus and the polymorphic lambda-calculus (system F) with the full set of logical connectives and all the permutative reductions. The normalization proofs use translations of terms and types to systems for which the strong normalization property is known.
The Semiparametric Normal Variance-Mean Mixture Model
DEFF Research Database (Denmark)
Korsholm, Lars
1997-01-01
We discuss the normal variance-mean mixture model from a semi-parametric point of view, i.e. we let the mixing distribution belong to a non-parametric family. The main results are consistency of the non-parametric maximum likelihood estimator in this case, and construction of an asymptotically...... normal and efficient estimator....
Analysis of the normal optical, Michel and molecular potentials on ...
Indian Academy of Sciences (India)
Journal of Physics, June 2016, pp. 1275–1286. Analysis of the normal ... the levels are obtained for the three optical potentials to estimate the quality ... The experimental angular distribution data for the 40Ca(6Li, d)44Ti reaction ... analysed using the normal optical, Michel and molecular potentials within the framework.
Extravascular transport in normal and tumor tissues.
Jain, R K; Gerlowski, L E
1986-01-01
The transport characteristics of the normal and tumor tissue extravascular space provide the basis for the determination of the optimal dosage and schedule regimes of various pharmacological agents in detection and treatment of cancer. In order for the drug to reach the cellular space where most therapeutic action takes place, several transport steps must first occur: (1) tissue perfusion; (2) permeation across the capillary wall; (3) transport through interstitial space; and (4) transport across the cell membrane. Any of these steps, including intracellular events such as metabolism, can be the rate-limiting step to uptake of the drug, and these rate-limiting steps may be different in normal and tumor tissues. This review examines these transport limitations, first from an experimental point of view and then from a modeling point of view. Various types of experimental tumor models which have been used in animals to represent human tumors are discussed. Then, mathematical models of extravascular transport are discussed from the perspective of two approaches: compartmental and distributed. Compartmental models lump one or more sections of a tissue or body into a "compartment" to describe the time course of disposition of a substance. These models contain "effective" parameters which represent the entire compartment. Distributed models consider the structural and morphological aspects of the tissue to determine the transport properties of that tissue. These distributed models describe both the temporal and spatial distribution of a substance in tissues. Each of these modeling techniques is described in detail with applications for cancer detection and treatment in mind.
The distribution choice for the threshold of solid state relay
International Nuclear Information System (INIS)
Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang
2009-01-01
Either the normal distribution or the Weibull distribution can be accepted as the sample distribution of the threshold of a solid state relay. Using a goodness-of-fit method, the bootstrap method and a Bayesian method, the Weibull distribution is ultimately chosen. (authors)
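The selection problem in this abstract can be made concrete with a likelihood comparison. The sketch below fits both candidate distributions to a set of threshold measurements by maximum likelihood and keeps the better-scoring one; the data, the function names, and the use of a likelihood criterion (rather than the paper's specific goodness-of-fit, bootstrap and Bayesian procedures) are illustrative assumptions.

```python
import math

def normal_loglik(x):
    """Maximized log-likelihood of a normal fit (MLE mean and variance)."""
    n = len(x)
    mu = sum(x) / n
    var = sum((v - mu) ** 2 for v in x) / n
    return -0.5 * n * (math.log(2.0 * math.pi * var) + 1.0)

def weibull_mle(x):
    """Weibull shape k (bisection on the profile likelihood equation)
    and the closed-form scale lam that goes with it."""
    n = len(x)
    mean_log = sum(math.log(v) for v in x) / n
    def g(k):  # the MLE shape is the zero of g
        s0 = sum(v ** k for v in x)
        s1 = sum(v ** k * math.log(v) for v in x)
        return s1 / s0 - 1.0 / k - mean_log
    lo, hi = 1e-2, 200.0   # g(lo) < 0 and g(hi) > 0 for non-degenerate data
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = (sum(v ** k for v in x) / n) ** (1.0 / k)
    return k, lam

def weibull_loglik(x, k, lam):
    return sum(math.log(k / lam) + (k - 1.0) * math.log(v / lam) - (v / lam) ** k
               for v in x)

def choose_distribution(x):
    """Return the higher-likelihood model name plus both log-likelihoods."""
    ll_n = normal_loglik(x)
    k, lam = weibull_mle(x)
    ll_w = weibull_loglik(x, k, lam)
    return ("weibull" if ll_w > ll_n else "normal"), ll_n, ll_w

# hypothetical relay threshold measurements (arbitrary units)
thresholds = [4.10, 4.40, 4.00, 4.60, 4.30, 4.20, 4.50, 4.35, 4.15, 4.55]
```

Both models have two free parameters, so comparing maximized log-likelihoods here is equivalent to comparing AIC values.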
Computer modeling the boron compound factor in normal brain tissue
International Nuclear Information System (INIS)
Gavin, P.R.; Huiskamp, R.; Wheeler, F.J.; Griebenow, M.L.
1993-01-01
The macroscopic distribution of borocaptate sodium (Na2B12H11SH, or BSH) in normal tissues has been determined and can be accurately predicted from the blood concentration. The compound para-borono-phenylalanine (p-BPA) has also been studied in dogs and normal tissue distribution has been determined. The total physical dose required to reach a biological isoeffect appears to increase directly as the proportion of boron capture dose increases. This effect, together with knowledge of the macrodistribution, led to estimates of the influence of the microdistribution of the BSH compound. This paper reports a computer model that was used to predict the compound factor for BSH and p-BPA and, hence, the equivalent radiation in normal tissues. The compound factor would need to be calculated for other compounds with different distributions. This information is needed to design appropriate normal tissue tolerance studies for different organ systems and/or different boron compounds
Bicervical normal uterus with normal vagina | Okeke | Annals of ...
African Journals Online (AJOL)
To the best of our knowledge, only few cases of bicervical normal uterus with normal vagina exist in the literature; one of the cases had an anterior‑posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of mullerian anomalies and suggests that a complex interplay of events ...
Immunolocalization of transforming growth factor alpha in normal human tissues
DEFF Research Database (Denmark)
Christensen, M E; Poulsen, Steen Seier
1996-01-01
anchorage-independent growth of normal cells and was, therefore, considered as an "oncogenic" growth factor. Later, its immunohistochemical presence in normal human cells as well as its biological effects in normal human tissues have been demonstrated. The aim of the present investigation was to elucidate...... the distribution of the growth factor in a broad spectrum of normal human tissues. Indirect immunoenzymatic staining methods were used. The polypeptide was detected with a polyclonal as well as a monoclonal antibody. The polyclonal and monoclonal antibodies demonstrated almost identical immunoreactivity. TGF......-alpha was found to be widely distributed in cells of normal human tissues derived from all three germ layers, most often in differentiated cells. In epithelial cells, three different kinds of staining patterns were observed, either diffuse cytoplasmic, cytoplasmic in the basal parts of the cells, or distinctly...
Family business and corporate social responsibility in a sample of Dutch firms
prof. Uhlaner, L.M.; van Goor-Balk, H.J.M.; Masurel, E.
2004-01-01
This paper explores corporate social responsibility in family businesses. In particular, the research investigates family businesses in relation to a wide variety of constituent or stakeholder groups. It reports the preliminary results of focused interviews with 42 small and medium-sized Dutch
Use of In-Flight Data to Validate Mars Sample Return Autonomous RvD GNC
DEFF Research Database (Denmark)
Barrena, V.; Colmenarejo, P.; Suatoni, M.
D is based on RF, camera and LIDAR measurements. It includes design, prototyping and verification at three different levels: algorithms design and verification in a Functional Engineering Simulator, SW demonstrator verified in Real Time Avionics Test Benching and Dynamic Test Benching. Moreover...... and testing of a vision based optical stimulator (ViSOS by DTU) to enhance the on-ground validation capabilities. After checking different alternatives for the proposed HARvD-GNC experiment with PRISMA resources, an efficient but cost-effective approach was chosen. The approach is based on designing MSR......-like dedicated manoeuvres sequencing using the already existing on-board PRISMA GNC/AOCS system (based on relative GPS measurements for the closed-loop execution of the manoeuvres sequencing and acquiring RF and camera images as part of the HARvD-GNC experiment data). This option allows downloading the sensor...
Comparison of spectrum normalization techniques for univariate ...
Indian Academy of Sciences (India)
Laser-induced breakdown spectroscopy; univariate study; normalization models; stainless steel; standard error of prediction. Abstract. Analytical performance of six different spectrum normalization techniques, namely internal normalization, normalization with total light, normalization with background along with their ...
Hemoglobin levels in normal Filipino pregnant women.
Kuizon, M D; Natera, M G; Ancheta, L P; Platon, T P; Reyes, G D; Macapinlac, M P
1981-09-01
The hemoglobin concentrations during pregnancy in Filipinos belonging to the upper income group, who were prescribed 105 mg elemental iron daily, and who had acceptable levels of transferrin saturation, were examined in an attempt to define normal levels. The hemoglobin concentrations for each trimester followed a Gaussian distribution. The hemoglobin values equal to the mean minus one standard deviation were 11.4 gm/dl for the first trimester and 10.4 gm/dl for the second and third trimesters. Using these values as the lower limits of normal, in one group of pregnant women the prevalence of anemia during the last two trimesters was found lower than that obtained when WHO levels for normal were used. Groups of women with hemoglobin of 10.4 to 10.9 gm/dl (classified anemic by WHO criteria but normal in the present study) and those with 11.0 gm/dl and above could not be distinguished on the basis of their serum ferritin levels nor on the degree of decrease in their hemoglobin concentration during pregnancy. Many subjects in both groups, however, had serum ferritin levels less than 12 ng/ml which indicate poor iron stores. It might be desirable in future studies to determine the hemoglobin cut-off point that will delineate subjects who are both non-anemic and adequate in iron stores using serum ferritin levels as criterion for the latter.
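The cutoff logic used in this study (lower limit of normal = mean − 1 SD under a Gaussian model) is simple to make concrete. In the sketch below, the trimester cutoffs are the values reported above (11.4 and 10.4 gm/dl); the function names are illustrative.

```python
# trimester -> lower limit of normal (gm/dl), i.e. mean - 1 SD as reported above
CUTOFFS = {1: 11.4, 2: 10.4, 3: 10.4}

def lower_limit(mean, sd, z=1.0):
    """Lower limit of normal as mean - z*SD under a Gaussian model."""
    return mean - z * sd

def is_anemic(hb, trimester):
    """Classify a hemoglobin value (gm/dl) against the study's cutoffs."""
    return hb < CUTOFFS[trimester]
```

With these cutoffs, a second-trimester value of 10.6 gm/dl is normal, whereas the WHO criterion (11.0 gm/dl) would call it anemic, which is exactly the discrepancy the abstract discusses.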
Role of the normal gut microbiota.
Jandhyala, Sai Manasa; Talukdar, Rupjyoti; Subramanyam, Chivkula; Vuyyuru, Harish; Sasikala, Mitnala; Nageshwar Reddy, D
2015-08-07
The relation between the gut microbiota and human health is being increasingly recognised. It is now well established that a healthy gut flora is largely responsible for the overall health of the host. The normal human gut microbiota comprises two major phyla, namely Bacteroidetes and Firmicutes. Though the gut microbiota in an infant appears haphazard, it starts resembling the adult flora by the age of 3 years. Nevertheless, there exist temporal and spatial variations in the microbial distribution from the esophagus to the rectum all along the individual's life span. Developments in genome sequencing technologies and bioinformatics have now enabled scientists to study these microorganisms, their function, and microbe-host interactions in an elaborate manner both in health and disease. The normal gut microbiota imparts specific functions in host nutrient metabolism, xenobiotic and drug metabolism, maintenance of structural integrity of the gut mucosal barrier, immunomodulation, and protection against pathogens. Several factors play a role in shaping the normal gut microbiota. They include (1) the mode of delivery (vaginal or caesarean); (2) diet during infancy (breast milk or formula feeds) and adulthood (vegan based or meat based); and (3) use of antibiotics or antibiotic-like molecules that are derived from the environment or the gut commensal community. A major concern of antibiotic use is the long-term alteration of the normal healthy gut microbiota and horizontal transfer of resistance genes that could result in a reservoir of organisms with a multidrug resistant gene pool.
Normal matter storage of antiprotons
International Nuclear Information System (INIS)
Campbell, L.J.
1987-01-01
Various simple issues connected with the possible storage of antiprotons (p̄) in relative proximity to normal matter are discussed. Although equilibrium storage looks to be impossible, condensed matter systems are sufficiently rich and controllable that nonequilibrium storage is well worth pursuing. Experiments to elucidate the p̄ interactions with normal matter are suggested. 32 refs
Normal modes of vibration in nickel
Energy Technology Data Exchange (ETDEWEB)
Birgeneau, R J [Yale Univ., New Haven, Connecticut (United States); Cordes, J [Cambridge Univ., Cambridge (United Kingdom); Dolling, G; Woods, A D B
1964-07-01
The frequency-wave-vector dispersion relation, ν(q), for the normal vibrations of a nickel single crystal at 296 K has been measured for the [ζ00], [ζζ0], [ζζζ], and [0ζ1] symmetric directions using inelastic neutron scattering. The results can be described in terms of the Born-von Karman theory of lattice dynamics with interactions out to fourth-nearest neighbors. The shapes of the dispersion curves are very similar to those of copper, the normal mode frequencies in nickel being about 1.24 times the corresponding frequencies in copper. The fourth-neighbor model was used to calculate the frequency distribution function g(ν) and related thermodynamic properties. (author)
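The Born-von Karman picture used in this record can be illustrated in one dimension. The sketch below evaluates ν(q) for a monatomic chain with a few neighbor force constants from the dispersion relation m ω²(q) = Σₙ 2Kₙ(1 − cos nqa); the constants and units are hypothetical, and this is not the fourth-neighbor fit to nickel.

```python
import math

def chain_frequency(q, a=1.0, m=1.0, force_consts=(1.0, 0.1)):
    """nu(q) for a 1D monatomic Born-von Karman chain.

    force_consts[n-1] is the n-th neighbor force constant (hypothetical
    units); the dispersion relation is m*omega^2 = sum_n 2*K_n*(1 - cos(n*q*a)).
    """
    w2 = sum(2.0 * kn * (1.0 - math.cos((n + 1) * q * a))
             for n, kn in enumerate(force_consts)) / m
    return math.sqrt(w2) / (2.0 * math.pi)
```

With only a first-neighbor constant, ν vanishes at q = 0 and peaks at the zone boundary q = π/a at ν = (1/π)√(K/m), the familiar acoustic-branch shape.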
Deformation associated with continental normal faults
Resor, Phillip G.
Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5), Greece image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data and illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation that are similar to those observed by satellite radar interferometry (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ~20 km from the fault surface trace, while the folds in the western Grand Canyon only extend 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates advantages of mechanical models in exploring normal faulting processes, including incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master
Overview Report: Normal and Emergency Operation Visualization
Energy Technology Data Exchange (ETDEWEB)
Greitzer, Frank L.
2011-05-01
This is an overview report to document and illustrate methods used in a project entitled “Normal and Emergency Operations Visualization” for a utility company, conducted in 2009-2010 timeframe with funding from the utility company and the U.S. Department of Energy. The original final report (about 180 pages) for the project is not available for distribution because it alludes to findings that assessed the design of an operational system that contained proprietary information; this abridged version contains descriptions of methods and some findings to illustrate the approach used, while avoiding discussion of sensitive or proprietary information. The client has approved this abridged version of the report for unlimited distribution to give researchers and collaborators the benefit of reviewing the research concepts and methods that were applied in this study.
Fusion and normalization to enhance anomaly detection
Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.
2009-05-01
This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. However, normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study jointly fuses images of RX applied to normalized and unnormalized imagery and has a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment gathered by a hyperspectral visible/near IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability. The observed anomaly detections were fitted with a linear combination of chi square distributions, and these weights were used to help compute the target probability. Receiver Operator Characteristic (ROC) curves quantitatively assessed the target detection performance. The target detection performance is highly variable depending on the relative number of candidate bright and dark targets and false alarms, and was controlled in this study by using vegetation and street line masks. The joint Boolean OR and AND operations also generate variable performance depending on the scene. The joint SUM operation provides a reasonable compromise between the OR and AND operations and has good target detection performance. In addition, normalizing the correlation coefficient and least squares yields new transforms related to canonical correlation analysis (CCA) and a normalized image regression (NIR). Transforms based on CCA and NIR performed better than the standard approaches. Only RX detection of the unnormalized difference imagery in change detection provides adequate change detection performance.
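The RX detector and the SUM fusion described above can be sketched for a toy two-band image. The data, the chi-square(2 dof) mapping from score to probability, and the equal-weight SUM rule are illustrative assumptions, not the study's actual pipeline.

```python
import math

def rx_scores(pixels):
    """RX anomaly score (squared Mahalanobis distance to the global mean)
    for a list of two-band pixels [(b1, b2), ...]."""
    n = len(pixels)
    m1 = sum(p[0] for p in pixels) / n
    m2 = sum(p[1] for p in pixels) / n
    c11 = sum((p[0] - m1) ** 2 for p in pixels) / n
    c22 = sum((p[1] - m2) ** 2 for p in pixels) / n
    c12 = sum((p[0] - m1) * (p[1] - m2) for p in pixels) / n
    det = c11 * c22 - c12 * c12          # assumes a non-singular covariance
    i11, i22, i12 = c22 / det, c11 / det, -c12 / det
    scores = []
    for p in pixels:
        d1, d2 = p[0] - m1, p[1] - m2
        scores.append(i11 * d1 * d1 + 2.0 * i12 * d1 * d2 + i22 * d2 * d2)
    return scores

def score_to_prob(score):
    """Map an RX score to a pseudo target probability via the chi-square
    CDF with 2 degrees of freedom, which has the closed form 1 - exp(-s/2)."""
    return 1.0 - math.exp(-score / 2.0)

def fuse_sum(probs_a, probs_b):
    """SUM fusion of two per-pixel probability maps (e.g. RX on raw and
    on normalized imagery)."""
    return [(a + b) / 2.0 for a, b in zip(probs_a, probs_b)]

# toy scene: four background pixels and one bright man-made object
pixels = [(0, 0), (1, 0), (0, 1), (1, 1), (8, 9)]
scores = rx_scores(pixels)
```

A useful sanity check: with the MLE covariance, the RX scores of a sample always sum to n times the number of bands.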
Development of distributed target
Yu Hai Jun; Li Qin; Zhou Fu Xin; Shi Jin Shui; Ma Bing; Chen Nan; Jing Xiao Bing
2002-01-01
Linear induction accelerators are expected to generate small-diameter X-ray spots with high intensity. The interaction of the electron beam with plasmas generated at the X-ray converter makes the spot on the target grow with time and degrades the X-ray dose and the imaging resolving power. A distributed target has been developed which has about 24 thin 0.05-mm tantalum films distributed over 1 cm. Owing to this structure, distributing the target material over a large volume decreases the energy deposition per unit volume and hence reduces the temperature of the target surface, which in turn reduces the initial plasma formation and its expansion velocity. A comparison and analysis of the two target structures is presented using numerical calculation and experiments; the results show that the X-ray dose and normalized angular distribution of the two are basically the same, while the surface of the distributed target is not destroyed like that of the previous block target
Distribution of crushing strength of tablets
DEFF Research Database (Denmark)
Sonnergaard, Jørn
2002-01-01
The distribution of a given set of data is important since most parametric statistical tests are based on the assumption that the studied data are normally distributed. In analysis of fracture mechanics the Weibull distribution is widely used and the derived Weibull modulus is interpreted as a mate...... data from nine model tablet formulations and four commercial tablets are shown to follow the normal distribution. The importance of proper cleaning of the crushing strength apparatus is demonstrated.
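The Weibull modulus mentioned in this abstract is commonly estimated from a Weibull probability plot. The sketch below recovers the modulus as the least-squares slope of ln(−ln(1−F)) against ln(strength), with plotting positions F_i = (i − 0.5)/n; the synthetic strengths with a known modulus are an illustrative self-check, not the paper's tablet data.

```python
import math

def weibull_modulus(strengths):
    """Weibull modulus as the least-squares slope of the Weibull plot:
    y = ln(-ln(1 - F_i)) versus x = ln(strength), F_i = (i - 0.5)/n."""
    n = len(strengths)
    pts = []
    for i, s in enumerate(sorted(strengths), start=1):
        f = (i - 0.5) / n
        pts.append((math.log(s), math.log(-math.log(1.0 - f))))
    xbar = sum(x for x, _ in pts) / n
    ybar = sum(y for _, y in pts) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in pts)
    sxx = sum((x - xbar) ** 2 for x, _ in pts)
    return sxy / sxx

# self-check with synthetic strengths drawn exactly at the plotting
# positions of a Weibull(shape m_true, scale lam_true) distribution
m_true, lam_true, n = 10.0, 4.0, 25
strengths = [lam_true * (-math.log(1.0 - (i - 0.5) / n)) ** (1.0 / m_true)
             for i in range(1, n + 1)]
m_est = weibull_modulus(strengths)
```

Because the synthetic points lie exactly on the Weibull line, the regression recovers the shape parameter to floating-point accuracy; real crushing-strength data would scatter around that line.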
Complete Normal Ordering 1: Foundations
Ellis, John; Skliros, Dimitri P.
2016-01-01
We introduce a new prescription for quantising scalar field theories perturbatively around a true minimum of the full quantum effective action, which is to `complete normal order' the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all `cephalopod' Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of `complete normal ordering' (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative i...
The normal and pathological language
Espejo, Luis D.
2014-01-01
The extraordinary development that normal and pathological psychology has achieved in recent decades, thanks to the dual method of objective observation and oral survey, has enabled the inquiring spirit of the neuro-psychiatrist to penetrate the intimate mechanism of the nervous system, whose supreme manifestation is thought. It is normal psychology that explains the complicated interplay of perceptions: their methods of transmission, their centers of projection, their transformations and their synthesis to construct ...
Is normal science good science?
Directory of Open Access Journals (Sweden)
Adrianna Kępińska
2015-09-01
Full Text Available “Normal science” is a concept introduced by Thomas Kuhn in The Structure of Scientific Revolutions (1962. In Kuhn’s view, normal science means “puzzle solving”, solving problems within the paradigm—framework most successful in solving current major scientific problems—rather than producing major novelties. This paper examines Kuhnian and Popperian accounts of normal science and their criticisms to assess if normal science is good. The advantage of normal science according to Kuhn was “psychological”: subjective satisfaction from successful “puzzle solving”. Popper argues for an “intellectual” science, one that consistently refutes conjectures (hypotheses and offers new ideas rather than focus on personal advantages. His account is criticized as too impersonal and idealistic. Feyerabend’s perspective seems more balanced; he argues for a community that would introduce new ideas, defend old ones, and enable scientists to develop in line with their subjective preferences. The paper concludes that normal science has no one clear-cut set of criteria encompassing its meaning and enabling clear assessment.
nth roots of normal contractions
International Nuclear Information System (INIS)
Duggal, B.P.
1992-07-01
Given a complex separable Hilbert space H and a contraction A on H such that A^n, for some integer n ≥ 2, is normal, it is shown that if the defect operator D_A = (1 - A*A)^(1/2) is of the Hilbert-Schmidt class, then A is similar to a normal contraction, either A or A^2 is normal, and if A^2 is normal (but A is not) then there is a normal contraction N and a positive definite contraction P of trace class such that ||A - N||_1 = 1/2 ||P + P||_1 (where ||·||_1 denotes the trace norm). If T is a compact contraction such that its characteristic function admits a scalar factor, if T = A^n for some integer n ≥ 2 and contraction A with simple eigenvalues, and if both T and A satisfy a ''reductive property'', then A is a compact normal contraction. (author). 16 refs
Not Normal: the uncertainties of scientific measurements
Bailey, David C.
2017-01-01
Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
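The gap quoted above between Gaussian expectations and near-Cauchy reality at 5σ is easy to reproduce numerically. The sketch below compares two-sided 5σ tail probabilities for a standard normal and a standard Cauchy distribution (the Cauchy standing in for the "almost Cauchy" heavy-tailed t-distributions the study reports); it illustrates the order-of-magnitude claim, not the paper's dataset.

```python
import math

def normal_two_sided_tail(z):
    """P(|X| > z) for a standard normal, via the complementary error function."""
    return math.erfc(z / math.sqrt(2.0))

def cauchy_two_sided_tail(z):
    """P(|X| > z) for a standard Cauchy distribution."""
    return 1.0 - (2.0 / math.pi) * math.atan(z)

# how much more often a Cauchy produces a "5 sigma" disagreement
ratio_5sigma = cauchy_two_sided_tail(5.0) / normal_two_sided_tail(5.0)
```

The normal tail at 5σ is below one in a million, while the Cauchy tail is above one in ten, a discrepancy of roughly five orders of magnitude, consistent with the outlier frequencies the abstract describes.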
International Nuclear Information System (INIS)
Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow
2013-01-01
Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
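The "hybrid" method referred to in this abstract can be sketched for a normally distributed response. The example below assumes a linear mean model μ(d) = μ0 + βd with adversity defined as exceeding the background 99th percentile, and solves for the dose giving a chosen extra risk; the model form, parameter values, and defaults are illustrative assumptions, not EPA BMDS settings.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_quantile(p):
    """Standard normal quantile by bisection (sufficient for a sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def extra_risk(dose, mu0, sigma, beta, p0=0.01):
    """Hybrid-method extra risk for a normal response whose mean increases
    linearly with dose, mu(d) = mu0 + beta*d, adverse = high values."""
    cutoff = mu0 + sigma * norm_quantile(1.0 - p0)   # background p0 tail
    p = 1.0 - norm_cdf((cutoff - (mu0 + beta * dose)) / sigma)
    return (p - p0) / (1.0 - p0)

def bmd(mu0, sigma, beta, bmr=0.10, p0=0.01):
    """Dose at which extra risk reaches the benchmark response, by bisection."""
    lo, hi = 0.0, 1e3
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if extra_risk(mid, mu0, sigma, beta, p0) < bmr:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

At dose zero the exceedance probability equals the background rate p0, so extra risk starts at zero and grows monotonically with dose, which is what makes the bisection well posed.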
Cerebral perfusion inhomogeneity in normal volunteers
International Nuclear Information System (INIS)
Gruenwald, S.M.; Larcos, G.
1998-01-01
Full text: In the interpretation of cerebral perfusion scans, it is important to know the normal variation in perfusion which may occur between the cerebral hemispheres. For this reason 24 normal volunteers with no neurological or psychiatric history, and who were on no medications, had 99mTc-HMPAO brain SPECT studies using a single-headed gamma camera computer system. Oblique, coronal and sagittal images were reviewed separately by two experienced observers and any differences were resolved by consensus. Semi-quantitation was performed by summing two adjacent oblique slices and drawing right and left mirror-image ROIs corresponding to the mid-section level of the anterior and posterior frontal lobes, anterior and posterior parietal lobes, temporal lobes and cerebellum. From the mean counts per pixel, right:left ROI ratios and ROI:cerebellar ratios were calculated. On qualitative review 6/24 subjects had mild asymmetry in tracer distribution between right and left cerebral lobes. Semi-quantitation revealed a 5-10% difference in counts between right and left ROIs in 12/24 subjects, and an additional three subjects had a 10-20% difference in counts between right and left temporal lobes. This study demonstrates the presence of mild asymmetry of cerebral perfusion in a significant minority of normal subjects.
New Riemannian Priors on the Univariate Normal Model
Directory of Open Access Journals (Sweden)
Salem Said
2014-07-01
Full Text Available The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as “Riemannian priors”. Precisely, if {pθ ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d²(θ, θ̄) is the square of Rao's Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that this distribution gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, one thinks of members of the class represented by G(θ̄, γ) as being centered around θ̄ and lying within a typical distance determined by γ. The paper defines rigorously the Gaussian distributions G(θ̄, γ) and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, it is shown that this leads to an improvement in performance over the use of conjugate priors.
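Rao's Riemannian distance on the univariate normal model has a closed form, via the isometry of the Fisher metric with a scaled Poincaré half-plane. A sketch of that standard textbook formula (not code from the paper; the function name is illustrative):

```python
import math

def rao_distance(mu1, s1, mu2, s2):
    """Fisher-Rao distance between N(mu1, s1^2) and N(mu2, s2^2).
    The Fisher metric ds^2 = (dmu^2 + 2 dsigma^2)/sigma^2 maps onto the
    hyperbolic half-plane under (mu/sqrt(2), sigma), giving
    d = sqrt(2) * arccosh(1 + delta) with
    delta = ((mu1-mu2)^2/2 + (s1-s2)^2) / (2*s1*s2)."""
    delta = ((mu1 - mu2) ** 2 / 2.0 + (s1 - s2) ** 2) / (2.0 * s1 * s2)
    return math.sqrt(2.0) * math.acosh(1.0 + delta)
```

A density proportional to exp(−d²(θ, θ̄)/2γ²) can then be evaluated (up to its normalizing constant) directly from this distance.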
Precaval retropancreatic space: Normal anatomy
Energy Technology Data Exchange (ETDEWEB)
Lee, Yeon Hee; Kim, Ki Whang; Kim, Myung Jin; Yoo, Hyung Sik; Lee, Jong Tae [Yonsei University College of Medicine, Seoul (Korea, Republic of)
1992-07-15
The authors defined the precaval retropancreatic space as the space between the pancreatic head (with portal vein) and the IVC, and analyzed the CT findings of this space to determine its normal structures and size. We retrospectively evaluated 100 normal abdominal CT scans to identify the normal anatomic structures of the precaval retropancreatic space. We also measured the distances between these structures and calculated the minimum, maximum and mean values. At the splenoportal confluence level, the normal structures between the portal vein and IVC were vessels (21%), lymph nodes (19%), and the caudate lobe of the liver (2%), in order of frequency. The maximum AP diameter of the portocaval lymph node was 4 mm. The common bile duct (CBD) was seen in 44%; its diameter was 3 mm on average and 11 mm at maximum. The CBD was located extrapancreatic (75%) and lateral (60.6%) to the pancreatic head. At the IVC-left renal vein level, the maximum distance between the CBD and IVC was 5 mm, and the only structure between the posterior pancreatic surface and the IVC was fat tissue. Knowledge of these normal structures and measurements will be helpful in differentiating a pancreatic mass from a retropancreatic mass such as lymphadenopathy.
3j Symbols: To Normalize or Not to Normalize?
van Veenendaal, Michel
2011-01-01
The systematic use of alternative normalization constants for 3j symbols can lead to a more natural expression of quantities, such as vector products and spherical tensor operators. The redefined coupling constants directly equate tensor products to the inner and outer products without any additional square roots. The approach is extended to…
International Nuclear Information System (INIS)
Moeller, T.B.; Reif, E.
1998-01-01
This book gives answers to questions frequently heard especially from trainees and doctors not specialising in the field of radiology: Is that a normal finding? How do I decide? What are the objective criteria? The information presented is three-fold. The normal findings of the usual CT and MRI examinations are shown with high-quality pictures serving as a reference, with inscribed important additional information on measures, angles and other criteria describing the normal conditions. These criteria are further explained and evaluated in accompanying texts which also teach the systematic approach for individual picture analysis, and include a check list of major aspects, as a didactic guide for learning. The book is primarily intended for students, radiographers, radiology trainees and doctors from other medical fields, but radiology specialists will also find useful details of help in special cases. (orig./CB) [de
Marrow transfusions into normal recipients
International Nuclear Information System (INIS)
Brecher, G.
1983-01-01
During the past several years we have explored the transfusion of bone marrow into normal, nonirradiated mice. While transfused marrow proliferates readily in irradiated animals, only minimal proliferation takes place in nonirradiated recipients. It has generally been assumed that this was due to the lack of available proliferative sites in recipients with normal marrow. Last year we were able to report that the transfusion of 200 million bone marrow cells (about 2/3 of the total complement of marrow cells of a normal mouse) resulted in 20% to 25% of the recipient's marrow being replaced by donor marrow. Thus we can now study the behavior of animals that have both transfused (donor) and endogenous (recipient) marrow cells, although none of the tissues of either donor or recipient have been irradiated. With these animals we hope to investigate the nature of the peculiar phenomenon of serial exhaustion of marrow, also referred to as the limited self-replicability of stem cells.
The construction of normal expectations
DEFF Research Database (Denmark)
Quitzau, Maj-Britt; Røpke, Inge
2008-01-01
The gradual upward changes of standards in normal everyday life have significant environmental implications, and it is therefore important to study how these changes come about. The intention of the article is to analyze the social construction of normal expectations through a case study. The case...... concerns the present boom in bathroom renovations in Denmark, which offers an excellent opportunity to study the interplay between a wide variety of consumption drivers and social changes pointing toward long-term changes of normal expectations regarding bathroom standards. The study is problem-oriented...... and transdisciplinary and draws on a wide range of sociological, anthropological, and economic theories. The empirical basis comprises a combination of statistics, a review of magazine and media coverage, visits to exhibitions, and qualitative interviews. A variety of consumption drivers are identified. Among
The distribution of interlaboratory comparison data
DEFF Research Database (Denmark)
Heydorn, Kaj
2008-01-01
The distribution of mutually consistent results from interlaboratory comparisons is expected to be leptokurtic, and readers are warned against accepting conclusions based on simulations assuming normality.
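Why pooled interlaboratory results tend to be leptokurtic can be seen from a scale mixture of normals: if laboratories are unbiased but differ in precision, the pooled distribution has excess kurtosis. A hypothetical two-precision illustration using exact moments (the variances are invented for the example, not taken from the paper):

```python
# Pooled results: half the labs report with variance 1, half with
# variance 9, all unbiased.  For a scale mixture X = S*Z with Z standard
# normal and S independent of Z, kurtosis = 3*E[S^4]/(E[S^2])^2, which
# exceeds the normal value 3 unless S is constant.
variances = [1.0, 9.0]                     # the two lab variances, equally likely
m2 = sum(variances) / 2                    # E[S^2]
m4 = sum(v * v for v in variances) / 2     # E[S^4]
kurtosis = 3 * m4 / m2 ** 2
# kurtosis = 3 * 41 / 25 = 4.92 > 3: the mixture is leptokurtic.
```

Simulations that assume a normal (kurtosis 3) pool therefore understate the frequency of outlying results.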
Soares, Marcelo B.; Efstratiadis, Argiris
1997-01-01
This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.
Random Generators and Normal Numbers
Bailey, David H.; Crandall, Richard E.
2002-01-01
Pursuant to the authors' previous chaotic-dynamical model for random digits of fundamental constants, we investigate a complementary, statistical picture in which pseudorandom number generators (PRNGs) are central. Some rigorous results are achieved: We establish b-normality for constants of the form $\\sum_i 1/(b^{m_i} c^{n_i})$ for certain sequences $(m_i), (n_i)$ of integers. This work unifies and extends previously known classes of explicit normals. We prove that for coprime $b,c>1$ the...
Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.
1995-01-01
The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is
Random-Number Generator Validity in Simulation Studies: An Investigation of Normality.
Bang, Jung W.; Schumacker, Randall E.; Schlieve, Paul L.
1998-01-01
The normality of number distributions generated by various random-number generators was studied, focusing on when the random-number generator reached a normal distribution and at what sample size. Findings suggest the steps that should be followed when using a random-number generator in a Monte Carlo simulation. (SLD)
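The kind of check the study describes can be sketched by summing uniform deviates (the classic 12-uniform approximation to a standard normal) and watching the sample moments settle as the sample grows; the generator, seed, and sample size here are illustrative:

```python
import random

def approx_normal(rng):
    """Sum of 12 uniforms minus 6: mean 0, variance 1, approximately normal."""
    return sum(rng.random() for _ in range(12)) - 6.0

rng = random.Random(12345)   # fixed seed for reproducibility
n = 20000
xs = [approx_normal(rng) for _ in range(n)]
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / (n - 1)
# With n = 20000 the sample mean is near 0 and the sample variance near 1,
# as a Monte Carlo normality check would require before trusting the stream.
```

A full study would also track skewness and kurtosis across increasing n to locate the sample size at which the stream becomes indistinguishable from normal.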
Normal gravity field in relativistic geodesy
Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao
2018-02-01
Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in development of quantum sensors for applications in geodesy, including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring 3D coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of Earth's gravitational field are referred is a normal gravity field, represented in Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field that is sufficient for current and near-future practical applications. We show that in general relativity the level surface of a homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are
Complete normal ordering 1: Foundations
Directory of Open Access Journals (Sweden)
John Ellis
2016-08-01
Full Text Available We introduce a new prescription for quantising scalar field theories (in generic spacetime dimension and background) perturbatively around a true minimum of the full quantum effective action, which is to ‘complete normal order’ the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all ‘cephalopod’ Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of ‘complete normal ordering’ (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative interactions, and by using a point splitting ‘trick’ we extend this result to theories with derivative interactions, such as those appearing as non-linear σ-models in the world-sheet formulation of string theory. We focus here on theories with trivial vacua, generalising the discussion to non-trivial vacua in a follow-up paper.
Normal forms in Poisson geometry
Marcut, I.T.
2013-01-01
The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric
Is My Child's Appetite Normal?
Is My Child’s Appetite Normal? Cayla, who is 4 years old, did not finish her lunch. But she is ready to play. Her ... snack for later. That is okay! Your child’s appetite changes. Children do not grow as fast in ...
Transforming Normal Programs by Replacement
Bossi, Annalisa; Pettorossi, A.; Cocco, Nicoletta; Etalle, Sandro
1992-01-01
The replacement transformation operation, already defined in [28], is studied wrt normal programs. We give applicability conditions able to ensure the correctness of the operation wrt Fitting's and Kunen's semantics. We show how replacement can mimic other transformation operations such as thinning,
Semigroups of data normalization functions
Warrens, Matthijs J.
2016-01-01
Variable centering and scaling are functions that are typically used in data normalization. Various properties of centering and scaling functions are presented. It is shown that if we use two centering functions (or scaling functions) successively, the result depends on the order in which the
Normalizing Catastrophe: Sustainability and Scientism
Bonnett, Michael
2013-01-01
Making an adequate response to our deteriorating environmental situation is a matter of ever increasing urgency. It is argued that a central obstacle to achieving this is the way that scientism has become normalized in our thinking about environmental issues. This is taken to reflect on an underlying "metaphysics of mastery" that vitiates proper…
Neutron RBE for normal tissues
International Nuclear Information System (INIS)
Field, S.B.; Hornsey, S.
1979-01-01
RBE for various normal tissues is considered as a function of neutron dose per fraction. Results from a variety of centres are reviewed. It is shown that RBE is dependent on neutron energy and is tissue dependent, but is not specially high for the more critical tissues or for damage occurring late after irradiation. (author)
Normal and abnormal growth plate
International Nuclear Information System (INIS)
Kumar, R.; Madewell, J.E.; Swischuk, L.E.
1987-01-01
Skeletal growth is a dynamic process. A knowledge of the structure and function of the normal growth plate is essential in order to understand the pathophysiology of abnormal skeletal growth in various diseases. In this well-illustrated article, the authors provide a radiographic classification of abnormal growth plates and discuss mechanisms that lead to growth plate abnormalities
Unit Root Testing and Estimation in Nonlinear ESTAR Models with Normal and Non-Normal Errors.
Directory of Open Access Journals (Sweden)
Umair Khalil
Full Text Available Exponential Smooth Transition Autoregressive (ESTAR) models can capture non-linear adjustment of deviations from equilibrium conditions, which may explain the economic behavior of many variables that appear non-stationary from a linear viewpoint. Many researchers employ the Kapetanios test, which has a unit root as the null and a stationary nonlinear model as the alternative. However, this test statistic is based on the assumption of normally distributed errors in the DGP. Cook analyzed the size of this nonlinear unit root test in the presence of a heavy-tailed innovation process and obtained critical values for both the finite-variance and infinite-variance cases. However, Cook's test statistics are oversized. Researchers have found that using conventional tests is dangerous, though the best performance among them is achieved by a heteroscedasticity-consistent covariance matrix estimator (HCCME). The oversizing of LM tests can be reduced by employing fixed-design wild bootstrap remedies, which provide a valuable alternative to the conventional tests. In this paper the size of the Kapetanios test statistic employing heteroscedasticity-consistent covariance matrices has been derived, and the results are reported for various sample sizes in which size distortion is reduced. The properties of estimates of ESTAR models have been investigated when errors are assumed non-normal. We compare the results obtained through nonlinear least squares fitting with those of quantile regression fitting in the presence of outliers, with the error distribution taken to be a t-distribution, for various sample sizes.
Deformation around basin scale normal faults
International Nuclear Information System (INIS)
Spahic, D.
2010-01-01
in the central Vienna Basin from commercial 3D seismic data. In addition to detailed conventional fault analysis (displacement and fault shape), syn-and anticlinal structures of sedimentary horizons occurring both in hanging wall and footwall are assessed. Reverse drag geometries of variable magnitudes are found to correlate with local displacement maxima along the fault. In contrast, normal drag is observed along segment boundaries and relay zones. Thus, the detailed documentation of the distribution, type and magnitude of fault drag provides additional information on the fault evolution, as initial fault segments as well as linkage or relay zones can be identified. (author) [de
Normal lymphographic findings and diagnostic errors in the retroperitoneal space
International Nuclear Information System (INIS)
Klein, U.; Heinze, H.G.
1980-01-01
Diagnostic errors in the lymphograms of 194 normal patients are shown to be due to topographical variations of the lymph system, influx and distribution of contrast media in the lymph ducts and nodes, and degenerative changes caused by old age. (orig.) [de
SYVAC3 parameter distribution package
Energy Technology Data Exchange (ETDEWEB)
Andres, T; Skeet, A
1995-01-01
SYVAC3 (Systems Variability Analysis Code, generation 3) is a computer program that implements a method called systems variability analysis to analyze the behaviour of a system in the presence of uncertainty. This method is based on simulating the system many times to determine the variation in behaviour it can exhibit. SYVAC3 specializes in systems representing the transport of contaminants, and has several features to simplify the modelling of such systems. It provides a general tool for estimating environmental impacts from the dispersal of contaminants. This report describes a software object type (a generalization of a data type) called Parameter Distribution. This object type is used in SYVAC3, and can also be used independently. Parameter Distribution has the following subtypes: beta distribution; binomial distribution; constant distribution; lognormal distribution; loguniform distribution; normal distribution; piecewise uniform distribution; triangular distribution; and uniform distribution. Some of these distributions can be altered by correlating two parameter distribution objects. This report provides complete specifications for parameter distributions, and also explains how to use them. It should meet the needs of casual users, reviewers, and programmers who wish to add their own subtypes. (author). 30 refs., 75 tabs., 56 figs.
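The Parameter Distribution object type described above can be sketched as a small class hierarchy with a common sampling interface; SYVAC3 itself is not Python, so these class and parameter names are purely illustrative:

```python
import math
import random

class ParameterDistribution:
    """Base type: a named sampling distribution for one model parameter."""
    def sample(self, rng):
        raise NotImplementedError

class UniformDistribution(ParameterDistribution):
    def __init__(self, low, high):
        self.low, self.high = low, high
    def sample(self, rng):
        return rng.uniform(self.low, self.high)

class LognormalDistribution(ParameterDistribution):
    def __init__(self, mu, sigma):
        self.mu, self.sigma = mu, sigma   # log-scale parameters
    def sample(self, rng):
        return math.exp(rng.gauss(self.mu, self.sigma))

class ConstantDistribution(ParameterDistribution):
    def __init__(self, value):
        self.value = value
    def sample(self, rng):
        return self.value

# A systems-variability run draws one value per parameter per simulation.
rng = random.Random(1)
params = {
    "half_life": LognormalDistribution(mu=2.0, sigma=0.5),
    "porosity": UniformDistribution(0.1, 0.4),
    "depth": ConstantDistribution(500.0),
}
draw = {name: dist.sample(rng) for name, dist in params.items()}
```

New subtypes (beta, piecewise uniform, and so on) slot in by implementing `sample`, which mirrors the report's point that programmers can add their own subtypes against a fixed specification.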
Superconducting versus normal conducting cavities
Podlech, Holger
2013-01-01
One of the most important issues of high-power hadron linacs is the choice of technology with respect to superconducting or room-temperature operation. The favour for a specific technology depends on several parameters such as the beam energy, beam current, beam power and duty factor. This contribution gives an overview of the comparison between superconducting and normal conducting cavities. This includes basic radiofrequency (RF) parameters, design criteria, limitations, required RF and plug power as well as case studies.
Normal Movement Selectivity in Autism
Dinstein, Ilan; Thomas, Cibu; Humphreys, Kate; Minshew, Nancy; Behrmann, Marlene; Heeger, David J.
2010-01-01
It has been proposed that individuals with autism have difficulties understanding the goals and intentions of others because of a fundamental dysfunction in the mirror neuron system. Here, however, we show that individuals with autism exhibited not only normal fMRI responses in mirror system areas during observation and execution of hand movements, but also exhibited typical movement-selective adaptation (repetition suppression) when observing or executing the same movement repeatedly. Moveme...
Log-Normal Turbulence Dissipation in Global Ocean Models
Pearson, Brodie; Fox-Kemper, Baylor
2018-03-01
Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O(10 km). As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet, energy dissipation obeys approximate log-normality—robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.
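A practical consequence of log-normality is that a few high values dominate the mean, which the analytic moments make explicit. A small sketch of those standard log-normal moment formulas (the parameter values are illustrative, not taken from the simulations):

```python
import math

def lognormal_stats(mu, sigma):
    """Mean, median and skewness of a log-normal distribution with
    log-scale parameters mu, sigma."""
    mean = math.exp(mu + sigma ** 2 / 2.0)
    median = math.exp(mu)
    w = math.exp(sigma ** 2)
    skew = (w + 2.0) * math.sqrt(w - 1.0)   # positive for any sigma > 0
    return mean, median, skew

mean, median, skew = lognormal_stats(0.0, 1.5)
# The mean sits well above the median: rare high-dissipation events
# carry most of the integrated budget.
```

This is why sparse observations that miss the rare large events will systematically underestimate an integrated dissipation budget.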
Lithium control during normal operation
International Nuclear Information System (INIS)
Suryanarayan, S.; Jain, D.
2010-01-01
Periodic increases in lithium (Li) concentrations in the primary heat transport (PHT) system during normal operation are a generic problem at CANDU® stations. Lithiated mixed bed ion exchange resins are used at stations for pH control in the PHT system. Typically tight chemistry controls including Li concentrations are maintained in the PHT water. The reason for the Li increases during normal operation at CANDU stations such as Pickering was not fully understood. In order to address this issue a two pronged approach was employed. Firstly, PNGS-A data and information from other available sources was reviewed in an effort to identify possible factors that may contribute to the observed Li variations. Secondly, experimental studies were carried out to assess the importance of these factors in order to establish reasons for Li increases during normal operation. Based on the results of these studies, plausible mechanisms/reasons for Li increases have been identified and recommendations made for proactive control of Li concentrations in the PHT system. (author)
Normalization of Gravitational Acceleration Models
Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.
2011-01-01
Unlike the uniform density spherical shell approximations of Newton, the consequence of spaceflight in the real universe is that gravitational fields are sensitive to the nonsphericity of their generating central bodies. The gravitational potential of a nonspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities which must be removed in order to generalize the method and solve for any possible orbit, including polar orbits. Three unique algorithms have been developed to eliminate these singularities by Samuel Pines [1], Bill Lear [2], and Robert Gottlieb [3]. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear [2] and Gottlieb [3] algorithms) and formulates a general method for defining normalization parameters used to generate normalized Legendre polynomials and associated Legendre functions (ALFs) for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
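The role of a normalization parameter can be illustrated for the zonal (m = 0) case: scaling the Legendre polynomial by sqrt((2l+1)/2) gives it unit L2 norm on [−1, 1], which keeps high-degree terms numerically well behaved. A sketch using the standard Bonnet recursion (this is generic, not the Pines, Lear, or Gottlieb code):

```python
import math

def legendre(l, x):
    """Unnormalized Legendre polynomial P_l(x) via the Bonnet recursion
    (n+1) P_{n+1} = (2n+1) x P_n - n P_{n-1}."""
    p_prev, p = 1.0, x
    if l == 0:
        return p_prev
    for n in range(1, l):
        p_prev, p = p, ((2 * n + 1) * x * p - n * p_prev) / (n + 1)
    return p

def normalized_legendre(l, x):
    """P_l scaled so that the integral of its square over [-1, 1] is 1."""
    return math.sqrt((2 * l + 1) / 2.0) * legendre(l, x)

def l2_norm_squared(f, steps=20000):
    """Midpoint-rule integral of f(x)^2 over [-1, 1]."""
    h = 2.0 / steps
    return sum(f(-1.0 + (i + 0.5) * h) ** 2 for i in range(steps)) * h

# l2_norm_squared(lambda x: normalized_legendre(4, x)) is close to 1.
```

For the full gravitational problem the same idea is applied to the associated Legendre functions, with (l, m)-dependent factors chosen to match each algorithm's recursion.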
A Post-Truncation Parameterization of Truncated Normal Technical Inefficiency
Christine Amsler; Peter Schmidt; Wen-Jen Tsay
2013-01-01
In this paper we consider a stochastic frontier model in which the distribution of technical inefficiency is truncated normal. In standard notation, technical inefficiency u is distributed as N^+ (μ,σ^2). This distribution is affected by some environmental variables z that may or may not affect the level of the frontier but that do affect the shortfall of output from the frontier. We will distinguish the pre-truncation mean (μ) and variance (σ^2) from the post-truncation mean μ_*=E(u) and var...
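The link between the pre-truncation parameters (μ, σ²) and the post-truncation mean has a closed form: for u ~ N⁺(μ, σ²) (a normal truncated below at zero), E(u) = μ + σφ(μ/σ)/Φ(μ/σ). A sketch using only the standard normal pdf and cdf (illustrative, not the authors' reparameterization code):

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def post_truncation_mean(mu, sigma):
    """E(u) for u ~ N+(mu, sigma^2), i.e. N(mu, sigma^2) truncated at zero."""
    a = mu / sigma
    return mu + sigma * phi(a) / Phi(a)

# For mu = 0 this reduces to sigma * sqrt(2/pi), the half-normal mean.
```

Environmental variables z that shift μ or σ² therefore shift E(u) through this nonlinear map, which is the distinction the paper draws between pre- and post-truncation parameterizations.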
"Ser diferente é normal?"/"Being different: is it normal?"
Directory of Open Access Journals (Sweden)
Viviane Veras
2007-01-01
Full Text Available A pergunta título deste trabalho retoma o slogan “Ser diferente é normal”, que é parte da campanha criada para uma organização não-governamental que atende portadores de Síndrome de Down. O objetivo é a inclusão social da pessoa com deficiência e o primeiro passo foi propor a inclusão de um grupo de diferentes no grupo dito normal. No vídeo de lançamento da campanha, o diferente, identificado como normal, é mostrado por meio de exemplos – um negro com cabelo black-power, um skin-head, um corpo tatuado, um corpo feminino halterofílico, uma família hippie, uma garota com síndrome de Down. A visão da adolescente dançando reduz, de certo modo, o efeito imaginário que vai além da síndrome, uma vez que apenas o corpo com seus olhinhos puxados se destacam, e não se interrogam questões cognitivas. Minha proposta é refletir sobre o estatuto paradoxal do exemplo, tal como é trabalhado nesse vídeo: se, por definição, um exemplo mostra de fato seu pertencimento a uma classe, pode-se concluir que é exatamente por ser exemplar que ele se encontra fora dela, no exato momento em que a exibe e define. The question in the title of this paper refers to the slogan "Ser diferente é normal" ("It's normal to be different"), which is part of a campaign created for an NGO that supports people with Down syndrome. The objective of the campaign is to promote the social inclusion of individuals with Down syndrome, and the first step was to propose the inclusion of a group of "differents" in the so-called normal group. The film launching the campaign shows the different identified as normal by means of examples: a black man with a black-power haircut, a skinhead, a tattooed body, an over-athletic female body, a hippie family and a girl with Down syndrome. The vision of the dancing teenager lessens the imaginary effect that surpasses the syndrome, since only her body and her little oriental eyes stand out and no cognitive issues are
Hallin, M.; Piegorsch, W.; El Shaarawi, A.
2012-01-01
The random variable X taking values 0, 1, 2, …, x, … with probabilities p_λ(x) = e^(−λ)λ^x/x!, where λ ∈ R_0^+, is called a Poisson variable, and its distribution a Poisson distribution, with parameter λ. The Poisson distribution with parameter λ can be obtained as the limit, as n → ∞ and p → 0 in such a way that
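The limit described here (n → ∞, p → 0 with np = λ held fixed) is easy to verify numerically by comparing the binomial and Poisson probability mass functions; a minimal sketch:

```python
import math

def binom_pmf(k, n, p):
    """Binomial probability of k successes in n trials."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """Poisson probability of k events with rate lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 2.0
# As n grows with n*p = lam fixed, Binomial(n, lam/n) approaches Poisson(lam):
# the worst-case pointwise error over k = 0..9 shrinks roughly like 1/n.
errors = [max(abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam))
              for k in range(10))
          for n in (10, 100, 1000)]
```

The same comparison shows why the Poisson distribution is a convenient stand-in for binomial probabilities when n is large and p small.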
National Aeronautics and Space Administration — Distributed Visualization allows anyone, anywhere, to see any simulation, at any time. Development focuses on algorithms, software, data formats, data systems and...
Statistical Tests for Frequency Distribution of Mean Gravity Anomalies
African Journals Online (AJOL)
The hypothesis that a very large number of 1° x 1° mean gravity anomalies are normally distributed has been rejected at the 5% significance level, based on the χ² and the unit normal deviate tests. However, the 50 equal-area mean anomalies derived from the 1° x 1° data have been found to be normally distributed at the same ...
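A χ² goodness-of-fit test of normality like the one applied to the anomalies can be sketched as follows; the data here are synthetic, not gravity anomalies, and the binning scheme is one simple choice among many:

```python
import math
import random

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def chi2_normality_stat(data, bins=10):
    """Pearson chi-square statistic of the data against N(mean, sd^2)
    fitted to the data, using equal-width standardized bins on [-3, 3]
    plus two open tail bins."""
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    edges = [-math.inf] + [-3 + 6 * i / bins for i in range(1, bins)] + [math.inf]
    stat = 0.0
    for lo, hi in zip(edges, edges[1:]):
        p = Phi(hi if hi != math.inf else 10.0) - Phi(lo if lo != -math.inf else -10.0)
        expected = n * p
        observed = sum(1 for x in data if lo <= (x - mean) / sd < hi)
        stat += (observed - expected) ** 2 / expected
    return stat

rng = random.Random(7)
normal_like = [rng.gauss(0.0, 1.0) for _ in range(2000)]
skewed = [rng.expovariate(1.0) for _ in range(2000)]
# The statistic stays near its degrees of freedom for normal data and
# blows up for strongly skewed data, leading to rejection.
```

The statistic is then compared with a χ² critical value at the 5% level, with degrees of freedom reduced for the two fitted parameters.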
A skewed distribution with asset pricing applications
de Roon, Frans; Karehnke, P.
2017-01-01
Recent research has identified skewness and downside risk as one of the most important features of risk. We present a new distribution which makes modeling skewed risks no more difficult than normally distributed (symmetric) risks. Our distribution is a combination of the “downside” and “upside”
International Nuclear Information System (INIS)
Golubov, B I
2007-01-01
On the basis of the concept of the pointwise dyadic derivative, dyadic distributions are introduced as continuous linear functionals on the linear space D_d(R_+) of infinitely differentiable functions compactly supported in the positive half-axis R_+ together with all dyadic derivatives. The completeness of the space D'_d(R_+) of dyadic distributions is established. It is shown that a locally integrable function on R_+ generates a dyadic distribution. In addition, the space S_d(R_+) of infinitely dyadically differentiable functions on R_+ rapidly decreasing in the neighbourhood of +∞ is defined. The space S'_d(R_+) of dyadic distributions of slow growth is introduced as the space of continuous linear functionals on S_d(R_+). The completeness of the space S'_d(R_+) is established; it is proved that each integrable function on R_+ with polynomial growth at +∞ generates a dyadic distribution of slow growth. Bibliography: 25 titles.
Normal pediatric postmortem CT appearances
Energy Technology Data Exchange (ETDEWEB)
Klein, Willemijn M.; Bosboom, Dennis G.H.; Koopmanschap, Desiree H.J.L.M. [Radboud University Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Nievelstein, Rutger A.J. [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Nikkels, Peter G.J. [University Medical Center Utrecht, Department of Pathology, Utrecht (Netherlands); Rijn, Rick R. van [Academic Medical Center, Department of Radiology, Amsterdam (Netherlands)
2015-04-01
Postmortem radiology is a rapidly developing specialty that is increasingly used as an adjunct to or substitute for conventional autopsy. The goal is to find patterns of disease and possibly the cause of death. Postmortem CT images bring to light processes of decomposition most radiologists are unfamiliar with. These postmortem changes, such as the formation of gas and edema, should not be mistaken for pathological processes that occur in living persons. In this review we discuss the normal postmortem thoraco-abdominal changes and how these appear on CT images, as well as how to differentiate these findings from those of pathological processes. (orig.)
Multispectral histogram normalization contrast enhancement
Soha, J. M.; Schwartz, A. A.
1979-01-01
A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
Normal modes and continuous spectra
International Nuclear Information System (INIS)
Balmforth, N.J.; Morrison, P.J.
1994-12-01
The authors consider stability problems arising in fluids, plasmas and stellar systems that contain singularities resulting from wave-mean flow or wave-particle resonances. Such resonances lead to singularities in the differential equations determining the normal modes at the so-called critical points or layers. The locations of the singularities are determined by the eigenvalue of the problem, and as a result, the spectrum of eigenvalues forms a continuum. They outline a method to construct the singular eigenfunctions comprising the continuum for a variety of problems
Normal movement selectivity in autism.
Dinstein, Ilan; Thomas, Cibu; Humphreys, Kate; Minshew, Nancy; Behrmann, Marlene; Heeger, David J
2010-05-13
It has been proposed that individuals with autism have difficulties understanding the goals and intentions of others because of a fundamental dysfunction in the mirror neuron system. Here, however, we show that individuals with autism exhibited not only normal fMRI responses in mirror system areas during observation and execution of hand movements but also exhibited typical movement-selective adaptation (repetition suppression) when observing or executing the same movement repeatedly. Movement selectivity is a defining characteristic of neurons involved in movement perception, including mirror neurons, and, as such, these findings argue against a mirror system dysfunction in autism. Copyright 2010 Elsevier Inc. All rights reserved.
Update on normal tension glaucoma
Directory of Open Access Journals (Sweden)
Jyotiranjan Mallick
2016-01-01
Full Text Available Normal tension glaucoma (NTG) is labelled when typical glaucomatous disc changes, visual field defects and open anterior chamber angles are associated with an intraocular pressure (IOP) constantly below 21 mmHg. Chronic low vascular perfusion, Raynaud's phenomenon, migraine, nocturnal systemic hypotension and over-treated systemic hypertension are the main causes of normal tension glaucoma. Goldmann applanation tonometry, gonioscopy, slit lamp biomicroscopy, optical coherence tomography and visual field analysis are the main tools of investigation for the diagnosis of NTG. Management follows the same principles as treatment of other chronic glaucomas: to reduce IOP by a substantial amount, sufficient to prevent disabling visual loss. Treatment is generally aimed at lowering IOP by 30% from pre-existing levels, to 12-14 mmHg. Betaxolol, brimonidine, prostaglandin analogues, trabeculectomy (in refractory cases), systemic calcium channel blockers (such as nifedipine) and 24-hour monitoring of blood pressure are considered in the management of NTG. The present review summarises risk factors, causes, pathogenesis, diagnosis and management of NTG.
Normal variation of hepatic artery
International Nuclear Information System (INIS)
Kim, Inn; Nam, Myung Hyun; Rhim, Hyun Chul; Koh, Byung Hee; Seo, Heung Suk; Kim, Soon Yong
1987-01-01
This study was an analysis of the blood supply of the liver in 125 patients who underwent hepatic arteriography and abdominal aortography from Jan. 1984 to Dec. 1986 at the Department of Radiology of Hanyang University Hospital. A. Variations in extrahepatic arteries: 1. The normal extrahepatic artery pattern occurred in 106 of 125 cases (84.8%): right hepatic and left hepatic arteries arising from the hepatic artery proper, and the hepatic artery proper arising from the common hepatic artery. 2. The most common variation of the extrahepatic arteries was a replaced right hepatic artery from the superior mesenteric artery: 6 of 125 cases (4.8%). B. Variations in intrahepatic arteries: 1. The normal intrahepatic artery pattern occurred in 83 of 125 cases (66.4%): right hepatic and left hepatic arteries arising from the hepatic artery proper, and the middle hepatic artery arising from the lower portion of the umbilical point of the left hepatic artery. 2. The most common variation of the intrahepatic arteries involved the middle hepatic artery. 3. Among the variations of the middle hepatic artery, right, middle and left hepatic arteries arising from the same location on the hepatic artery proper was the most common type: 17 of 125 cases (13.6%)
Rhythm-based heartbeat duration normalization for atrial fibrillation detection.
Islam, Md Saiful; Ammour, Nassim; Alajlan, Naif; Aboalsamh, Hatim
2016-05-01
Screening of atrial fibrillation (AF) for high-risk patients including all patients aged 65 years and older is important for prevention of risk of stroke. Different technologies such as modified blood pressure monitor, single lead ECG-based finger-probe, and smart phone using plethysmogram signal have been emerging for this purpose. All these technologies use irregularity of heartbeat duration as a feature for AF detection. We have investigated a normalization method of heartbeat duration for improved AF detection. AF is an arrhythmia in which heartbeat duration generally becomes irregularly irregular. From a window of heartbeat duration, we estimate the possible rhythm of the majority of heartbeats and normalize duration of all heartbeats in the window based on the rhythm so that we can measure the irregularity of heartbeats for both AF and non-AF rhythms in the same scale. Irregularity is measured by the entropy of distribution of the normalized duration. Then we classify a window of heartbeats as AF or non-AF by thresholding the measured irregularity. The effect of this normalization is evaluated by comparing AF detection performances using duration with the normalization, without normalization, and with other existing normalizations. Sensitivity and specificity of AF detection using normalized heartbeat duration were tested on two landmark databases available online and compared with results of other methods (with/without normalization) by receiver operating characteristic (ROC) curves. ROC analysis showed that the normalization was able to improve the performance of AF detection and it was consistent for a wide range of sensitivity and specificity for use of different thresholds. Detection accuracy was also computed for equal rates of sensitivity and specificity for different methods. Using normalized heartbeat duration, we obtained 96.38% accuracy which is more than 4% improvement compared to AF detection without normalization. The proposed normalization
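The rhythm-based normalization idea can be sketched in a few lines. This is a minimal illustration, not the authors' exact algorithm: the rhythm of the majority of beats is estimated here by the window median (an assumption), durations are normalized by it, and irregularity is the Shannon entropy of the normalized-duration distribution.

```python
import numpy as np

def irregularity(rr, bins=8):
    """Entropy of the distribution of rhythm-normalized RR intervals.

    Hypothetical sketch: the dominant rhythm is taken as the median RR
    interval of the window, and every interval is expressed relative to
    it before the entropy is computed.
    """
    rhythm = np.median(rr)                     # estimated rhythm of the majority of beats
    normalized = rr / rhythm                   # scale-free heartbeat durations
    hist, _ = np.histogram(normalized, bins=bins, range=(0.5, 1.5))
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()             # Shannon entropy in bits

rng = np.random.default_rng(1)
sinus = 0.8 + rng.normal(0, 0.01, 60)          # regular (non-AF) rhythm, ~75 bpm
af = 0.8 + rng.uniform(-0.25, 0.25, 60)        # irregularly irregular intervals
print(irregularity(sinus) < irregularity(af))  # True: the AF window is more entropic
```

An AF/non-AF decision then reduces to thresholding the entropy, as the abstract describes.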
Normal tissue dose-effect models in biological dose optimisation
International Nuclear Information System (INIS)
Alber, M.
2008-01-01
Sophisticated radiotherapy techniques like intensity modulated radiotherapy with photons and protons rely on numerical dose optimisation. The evaluation of normal tissue dose distributions that deviate significantly from the common clinical routine and also the mathematical expression of desirable properties of a dose distribution is difficult. In essence, a dose evaluation model for normal tissues has to express the tissue specific volume effect. A formalism of local dose effect measures is presented, which can be applied to serial and parallel responding tissues as well as target volumes and physical dose penalties. These models allow a transparent description of the volume effect and an efficient control over the optimum dose distribution. They can be linked to normal tissue complication probability models and the equivalent uniform dose concept. In clinical applications, they provide a means to standardize normal tissue doses in the face of inevitable anatomical differences between patients and a vastly increased freedom to shape the dose, without being overly limiting like sets of dose-volume constraints. (orig.)
PHARMACOKINETIC VARIATIONS OF OFLOXACIN IN NORMAL AND FEBRILE RABBITS
Directory of Open Access Journals (Sweden)
M. AHMAD, H. RAZA, G. MURTAZA AND N. AKHTAR
2008-12-01
Full Text Available The influence of experimentally Escherichia coli-induced fever (EEIF) on the pharmacokinetics of ofloxacin was evaluated. Ofloxacin was administered @ 20 mg.kg-1 body weight intravenously to a group of eight healthy rabbits, and these results were compared to values in the same eight rabbits with EEIF. Pharmacokinetic parameters of ofloxacin in normal and febrile rabbits were determined using a two-compartment open kinetic model. Peak plasma level (Cmax) and area under the plasma concentration-time curve (AUC0-∞) in normal and febrile rabbits did not differ (P>0.05). However, the area under the first moment of the plasma concentration-time curve (AUMC0-∞) in febrile rabbits was significantly (P<0.05) higher than that in normal rabbits. Mean values for the elimination rate constant (Ke), elimination half-life (t1/2β) and apparent volume of distribution (Vd) were significantly (P<0.05) lower in febrile rabbits compared to normal rabbits, while mean residence time (MRT) and total body clearance (Cl) of ofloxacin did not show any significant difference between the normal and febrile rabbits. The clinical significance of the above results can be related to the changes in the volume of distribution and elimination half-life that illustrate an altered steady state in the febrile condition; hence, an adjustment of the dosage regimen in EEIF is required.
Is My Penis Normal? (For Teens)
... any guy who's ever worried about whether his penis is a normal size. There's a fairly wide ...
DEFF Research Database (Denmark)
Borregaard, Michael Krabbe; Hendrichsen, Ditte Katrine; Nachman, Gøsta Støger
2008-01-01
Living organisms are distributed over the entire surface of the planet. The distribution of the individuals of each species is not random; on the contrary, they are strongly dependent on the biology and ecology of the species, and vary over different spatial scales. The structure of whole...... populations reflects the location and fragmentation pattern of the habitat types preferred by the species, and the complex dynamics of migration, colonization, and population growth taking place over the landscape. Within these, individuals are distributed among each other in regular or clumped patterns......, depending on the nature of intraspecific interactions between them: while the individuals of some species repel each other and partition the available area, others form groups of varying size, determined by the fitness of each group member. The spatial distribution pattern of individuals again strongly......
Striving for the unknown normal
DEFF Research Database (Denmark)
Nielsen, Mikka
During the last decade, more and more people have received prescriptions for ADHD drug treatment, and simultaneously the legitimacy of the ADHD diagnosis has been heavily debated among both professionals and laymen. Based on an anthropological fieldwork among adults with ADHD, I illustrate how...... the ADHD diagnosis both answers and produces existential questions on what counts as normal behaviour and emotions. The diagnosis helps the diagnosed to identify, accept and handle problems by offering concrete explanations and solutions to diffuse experienced problems. But the diagnostic process...... is not only a clarifying procedure with a straight plan for treatment and direct effects. It is also a messy affair. In a process of experimenting with drugs and attempting to determine how or whether the medication eliminates the correct symptoms the diagnosed is put in an introspective, self...
IIH with normal CSF pressures?
Directory of Open Access Journals (Sweden)
Soh Youn Suh
2013-01-01
Full Text Available Idiopathic intracranial hypertension (IIH) is a condition of raised intracranial pressure (ICP) in the absence of space-occupying lesions. ICP is usually measured by lumbar puncture, and a cerebrospinal fluid (CSF) pressure above 250 mm H2O is one of the diagnostic criteria of IIH. Recently, we have encountered two patients who complained of headaches and exhibited disc swelling without an increased ICP. We prescribed acetazolamide and followed both patients frequently because of the definite disc swelling with IIH-related symptoms. Symptoms and signs resolved in both patients after they started taking acetazolamide. It is generally known that an elevated ICP, as measured by lumbar puncture, is the most important diagnostic sign of IIH. However, these cases caution that, even when CSF pressure is within the normal range, suspicion should be raised when a patient has papilledema with related symptoms, since untreated papilledema may cause progressive and irreversible visual loss.
International Nuclear Information System (INIS)
Gruenemeyer, D.
1991-01-01
This paper reports on a Distribution Automation (DA) system, which enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements
Modeling error distributions of growth curve models through Bayesian methods.
Zhang, Zhiyong
2016-06-01
Growth curve models are widely used in the social and behavioral sciences. However, typical growth curve models often assume that the errors are normally distributed, although non-normal data may be even more common than normal data. In order to avoid possible statistical inference problems in blindly assuming normality, a general Bayesian framework is proposed to flexibly model normal and non-normal data through the explicit specification of the error distributions. A simulation study shows that when the distribution of the error is correctly specified, one can avoid the loss in the efficiency of standard error estimates. A real example on the analysis of mathematical ability growth data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 is used to show the application of the proposed methods. Instructions and code on how to conduct growth curve analysis with both normal and non-normal error distributions using the MCMC procedure of SAS are provided.
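The paper's analyses use the MCMC procedure of SAS; as a rough non-Bayesian analogue, the sketch below fits a linear growth curve by maximum likelihood under normal versus Student-t error specifications on simulated heavy-tailed data. All settings (sample sizes, degrees of freedom, starting values) are illustrative assumptions.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(2)
t_pts = np.arange(5.0)                      # five measurement occasions
# 200 subjects sharing a linear growth curve, with heavy-tailed
# (Student-t, 3 df) residuals standing in for non-normal data
y = 10 + 2 * t_pts + stats.t.rvs(df=3, size=(200, 5), random_state=rng)

def neg_loglik(params, dist):
    """Negative log-likelihood of a linear growth curve with the given
    error distribution (scale parametrized on the log scale)."""
    a, b, log_s = params
    s = np.exp(log_s)
    resid = (y - (a + b * t_pts)) / s
    return -(dist.logpdf(resid).sum() - y.size * np.log(s))

fit_norm = optimize.minimize(neg_loglik, [8, 1, 0], args=(stats.norm,), method="Nelder-Mead")
fit_t = optimize.minimize(neg_loglik, [8, 1, 0], args=(stats.t(df=3),), method="Nelder-Mead")
print(fit_t.fun < fit_norm.fun)  # True: correctly specified t errors fit better
```

The gap between the two likelihoods mirrors the paper's point: explicitly specifying the error distribution pays off when the data are genuinely non-normal.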
CT in normal pressure hydrocephalus
International Nuclear Information System (INIS)
Fujita, Katsuzo; Nogaki, Hidekazu; Noda, Masaya; Kusunoki, Tadaki; Tamaki, Norihiko
1981-01-01
CT scans were obtained on 33 patients (aged 31 to 73 years) with the diagnosis of normal pressure hydrocephalus. In each case, the diagnosis was made on the basis of the symptoms, CT and cisternographic findings. Underlying diseases of normal pressure hydrocephalus were ruptured aneurysms (21 cases), arteriovenous malformations (2 cases), head trauma (1 case), cerebrovascular accidents (1 case) and idiopathic (8 cases). Sixteen of 33 patients showed marked improvement; five, moderate or minimal improvement; and twelve, no change. The results were compared with CT findings and clinical response to shunting. CT findings were classified into five types, based on the degree of periventricular hypodensity (P.V.H.), the extent of brain damage by underlying diseases, and the degree of cortical atrophy. In 17 cases of type (I), CT showed the presence of P.V.H. with or without minimal frontal lobe damage and no cortical atrophy. Good surgical improvement was achieved in all cases of type (I) by shunting. In 4 cases of type (II), CT showed the presence of P.V.H. and severe brain damage without cortical atrophy. Fair clinical improvement was achieved in 2 cases (50%) by shunting. In one case of type (III), CT showed the absence of P.V.H. without brain damage or cortical atrophy. No clinical improvement was obtained by shunting in this type. In 9 cases of type (IV) with mild cortical atrophy, fair clinical improvement was achieved in two cases (22%) and no improvement in 7 cases. In 2 cases of type (V) with moderate or marked cortical atrophy, no clinical improvement was obtained by shunting. In conclusion, it appeared from the present study that there was a good correlation between the result of shunting and the type of CT, and the clinical response to shunting might be predicted by classification of CT findings. (author)
A One-Sample Test for Normality with Kernel Methods
Kellner , Jérémie; Celisse , Alain
2015-01-01
We propose a new one-sample test for normality in a Reproducing Kernel Hilbert Space (RKHS). Namely, we test the null-hypothesis of belonging to a given family of Gaussian distributions. Hence our procedure may be applied either to test data for normality or to test parameters (mean and covariance) if data are assumed Gaussian. Our test is based on the same principle as the MMD (Maximum Mean Discrepancy) which is usually used for two-sample tests such as homogeneity or independence testing. O...
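The sketch below is a loose two-sample illustration of the MMD principle the abstract builds on, comparing a data sample against reference Gaussian draws; it is not the paper's actual one-sample RKHS procedure, and the kernel bandwidth and sample sizes are arbitrary assumptions.

```python
import numpy as np

def rbf_mmd2(x, y, gamma=0.5):
    """Biased MMD^2 estimate with an RBF kernel (illustrative sketch)."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

rng = np.random.default_rng(3)
ref = rng.normal(size=(300, 1))            # reference N(0,1) draws
gauss = rng.normal(size=(300, 1))          # data that really are normal
expo = rng.exponential(size=(300, 1)) - 1  # same mean, clearly non-normal
print(rbf_mmd2(gauss, ref) < rbf_mmd2(expo, ref))  # True: non-normal data sit farther away
```

A larger MMD from the Gaussian reference signals departure from normality, which is the intuition behind the kernel test.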
Normal Spin Asymmetries in Elastic Electron-Proton Scattering
International Nuclear Information System (INIS)
M. Gorchtein; P.A.M. Guichon; M. Vanderhaeghen
2004-01-01
We discuss the two-photon exchange contribution to observables which involve lepton helicity flip in elastic lepton-nucleon scattering. This contribution is accessed through the single spin asymmetry for a lepton beam polarized normal to the scattering plane. We estimate this beam normal spin asymmetry at large momentum transfer using a parton model and we express the corresponding amplitude in terms of generalized parton distributions. We further discuss this observable in the quasi-RCS kinematics which may be dominant at certain kinematical conditions and find it to be governed by the photon helicity-flip RCS amplitudes
Probability distributions with truncated, log and bivariate extensions
Thomopoulos, Nick T
2018-01-01
This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...
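As a small illustration of the truncated shapes the book tabulates, the sketch below uses SciPy's `truncnorm` for a left-truncated standard normal. The percentile-based "spread ratio" here is a crude stand-in for the book's statistic, not its actual definition.

```python
import numpy as np
from scipy import stats

# Left-truncated standard normal with support [-0.5, inf)
a, b = -0.5, np.inf
tn = stats.truncnorm(a, b)

# Cutting off the left tail shifts the mean right of 0 and
# shrinks the variance below 1.
mean, var = tn.stats(moments="mv")
print(mean > 0, var < 1)

# A crude spread ratio (illustrative, not the book's definition):
# how far the 90th and 10th percentiles sit from the median.
q10, q50, q90 = tn.ppf([0.1, 0.5, 0.9])
spread_ratio = (q90 - q50) / (q50 - q10)
print(spread_ratio > 1)  # True: the right tail is longer, so the shape is skewed
```

Varying the truncation point `a` sweeps out the "wide variety of shapes" the description mentions, from nearly normal to strongly skewed.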
Characterizing Normal Groundwater Chemistry in Hawaii
Tachera, D.; Lautze, N. C.; Thomas, D. M.; Whittier, R. B.; Frazer, L. N.
2017-12-01
Hawaii is dependent on groundwater resources, yet how water moves through the subsurface is not well understood in many locations across the state. As marine air moves across the islands, water evaporates from the ocean, along with trace amounts of sea-salt ions, and interacts with anthropogenic and volcanic aerosols (e.g. sulfuric acid, ammonium sulfate, HCl), creating a slightly more acidic rain. When this rain falls, it has a chemical signature distinctive of past processes. As this precipitation infiltrates through soil it may pick up another distinctive chemical signature associated with land use and degree of soil development, and as it flows through the underlying geology, its chemistry is influenced by the host rock. We are currently conducting an investigation of groundwater chemistry in selected aquifer areas of Hawaii, having diverse land use, land cover, and soil development conditions, in an effort to investigate and document what may be considered a "normal" water chemistry for an area. Through this effort, we believe we can better assess anomalies due to contamination events, hydrothermal alteration, and other processes, and we can use this information to better understand groundwater flow direction. The project has compiled a large amount of precipitation, soil, and groundwater chemistry data in the three focus areas distributed across the State of Hawaii. Statistical analyses of these data sets will be performed in an effort to determine what is "normal" and what is anomalous chemistry for a given area. Where possible, results will be used to trace groundwater flow paths. Methods and preliminary results will be presented.
Experimental microangiographic study in normal rabbit liver
International Nuclear Information System (INIS)
Kim, Yoon Gyoo; Park, Jong Yeon; Han, Kook Sang; Moon, Ki Ho; Choi, Chang Ho; Han, Koon Taek; Lee, Suck Hong; Kim, Byung Soo
1994-01-01
Microangiography is an experimental radiologic technique for evaluation of the morphology and function of small vessels. The purpose of this study is to introduce a good microangiographic technique and to present the microangiographic appearance of the normal hepatic vascular pattern. Five white rabbits weighing 2.5-2.9 kg were used. Polyethylene catheters were inserted in the portal vein and then in the IVC. Heparinized normal saline (2 cc/1000 cc) was infused through the portal vein and blood was drained to the IVC. Barium suspension was infused via the catheter placed in the portal vein until the liver surface showed satisfactory barium filling. The liver was removed and the preparation was fixed in 10% formalin for 7 days. After fixation, the liver was sectioned at 1-2 mm thickness. The slices were radiographed on a high-resolution plate using a Faxitron unit. H-E staining of liver tissue was also done. The microbarium was well distributed in all small vessels without filling defects, and we could identify the hexagonal classic liver lobule, in which the central vein was located centrally and the portal veins at the periphery. The magnified view showed numerous sinusoids, with less dye in the central portion of the lobule, although the central vein itself was well filled by microbarium. The peripheral portion of the lobule was well filled with microbarium. Thus we could identify the diamond-shaped liver acinus, in which the central veins lie at the periphery and the center of the liver acinus is the terminal portal vein that grows out from a small portal space. Three acini made up a complex acinus, and an acinar agglomerate was composed of three or four complex acini. It is considered that the liver acinus pattern of Rappaport is more acceptable on microangiography than the classic concept of the hepatic lobule
Yang, Shan; Tong, Xiangqian
2016-01-01
Power flow calculation and short-circuit calculation are the basis of theoretical research for distribution networks with inverter-based distributed generation. The similarity of the equivalent models of inverter-based distributed generation during normal and fault conditions of the distribution network, and the differences between power flow and short-circuit calculation, are analyzed in this paper. Then an integrated power flow and short circuit calculation method for distribution network with inverte...
GC-Content Normalization for RNA-Seq Data
2011-01-01
Background Transcriptome sequencing (RNA-Seq) has become the assay of choice for high-throughput studies of gene expression. However, as is the case with microarrays, major technology-related artifacts and biases affect the resulting expression measures. Normalization is therefore essential to ensure accurate inference of expression levels and subsequent analyses thereof. Results We focus on biases related to GC-content and demonstrate the existence of strong sample-specific GC-content effects on RNA-Seq read counts, which can substantially bias differential expression analysis. We propose three simple within-lane gene-level GC-content normalization approaches and assess their performance on two different RNA-Seq datasets, involving different species and experimental designs. Our methods are compared to state-of-the-art normalization procedures in terms of bias and mean squared error for expression fold-change estimation and in terms of Type I error and p-value distributions for tests of differential expression. The exploratory data analysis and normalization methods proposed in this article are implemented in the open-source Bioconductor R package EDASeq. Conclusions Our within-lane normalization procedures, followed by between-lane normalization, reduce GC-content bias and lead to more accurate estimates of expression fold-changes and tests of differential expression. Such results are crucial for the biological interpretation of RNA-Seq experiments, where downstream analyses can be sensitive to the supplied lists of genes. PMID:22177264
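EDASeq implements loess- and quantile-based corrections; the sketch below is only a simplified stand-in for within-lane GC-content normalization, rescaling GC-stratified gene bins to a common median on simulated counts. All numbers and the bias model are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n_genes = 2000
gc = rng.uniform(0.3, 0.7, n_genes)                  # per-gene GC fraction
base = rng.lognormal(mean=5, sigma=1, size=n_genes)  # "true" expression levels
# simulate a sample-specific GC effect: GC-rich genes are over-counted
counts = base * (1 + 2 * (gc - 0.5))

# Within-lane normalization sketch: stratify genes into GC bins and
# rescale each bin so its median count matches the global median
# (a simple stand-in for the loess-based methods in EDASeq).
bins = np.digitize(gc, np.quantile(gc, np.linspace(0, 1, 11)[1:-1]))
global_med = np.median(counts)
normed = counts.copy()
for b in np.unique(bins):
    mask = bins == b
    normed[mask] *= global_med / np.median(counts[mask])

# After normalization, the count-GC correlation should shrink toward zero.
before = abs(np.corrcoef(gc, counts)[0, 1])
after = abs(np.corrcoef(gc, normed)[0, 1])
print(after < before)  # True: the GC trend has been flattened
```

Removing the within-lane trend first, then applying between-lane scaling, is the ordering the abstract recommends.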
International Nuclear Information System (INIS)
Lou, C.
2002-01-01
An advection-diffusion model has been set up to describe normal grain growth. In this model grains are divided into different groups according to their topological classes (number of sides of a grain). Topological transformations are modelled by advective and diffusive flows governed by advective and diffusive coefficients respectively, which are assumed to be proportional to topological classes. The ordinary differential equations governing self-similar time-independent grain size distribution can be derived analytically from continuity equations. It is proved that the time-independent distributions obtained by solving the ordinary differential equations have the same form as the time-dependent distributions obtained by solving the continuity equations. The advection-diffusion model is extended to describe the stagnation of normal grain growth in thin films. Grain boundary grooving prevents grain boundaries from moving, and the correlation between neighbouring grains accelerates the stagnation of normal grain growth. After introducing grain boundary grooving and the correlation between neighbouring grains into the model, the grain size distribution is close to a lognormal distribution, which is usually found in experiments. A vertex computer simulation of normal grain growth has also been carried out to make a cross comparison with the advection-diffusion model. The result from the simulation did not verify the assumption that the advective and diffusive coefficients are proportional to topological classes. Instead, we have observed that topological transformations usually occur on certain topological classes. This suggests that the advection-diffusion model can be improved by making a more realistic assumption on topological transformations. (author)
A note on inconsistent families of discrete multivariate distributions
Ghosh, Sugata; Dutta, Subhajit; Genton, Marc G.
2017-01-01
We construct a d-dimensional discrete multivariate distribution for which any proper subset of its components belongs to a specific family of distributions. However, the joint d-dimensional distribution fails to belong to that family and in other words, it is ‘inconsistent’ with the distribution of these subsets. We also address preservation of this ‘inconsistency’ property for the symmetric Binomial distribution, and some discrete distributions arising from the multivariate discrete normal distribution.
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1976-01-01
Application of the normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of the data to the distributions was compared through probability plotting (special graph paper provides a visual check) and W-test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal fit the least number of data groups considered in this study
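A quick modern analogue of the W-test-and-plotting workflow, assuming SciPy in place of probability paper: the lognormal sample below is synthetic, and the comparison mirrors the abstract's finding that surveillance-style data fit a normal far better after a log transform.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# synthetic surveillance-style data: environmental concentrations are
# frequently lognormal, as the 50% lognormal fit rate above suggests
data = rng.lognormal(mean=1.0, sigma=0.6, size=200)

# Shapiro-Wilk W test on the raw data and on its logarithms
w_raw, p_raw = stats.shapiro(data)
w_log, p_log = stats.shapiro(np.log(data))
print(p_log > p_raw)  # True: the log transform greatly improves the normal fit

# Weibull fitted by maximum likelihood, location fixed at zero
shape, loc, scale = stats.weibull_min.fit(data, floc=0)
print(shape > 0 and scale > 0)
```

`stats.probplot(data, dist="norm")` would supply the visual check that probability paper used to provide.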
Mass distributions in disk galaxies
Martinsson, Thomas; Verheijen, Marc; Bershady, Matthew; Westfall, Kyle; Andersen, David; Swaters, Rob
We present results on luminous and dark matter mass distributions in disk galaxies from the DiskMass Survey. As expected for normal disk galaxies, stars dominate the baryonic mass budget in the inner region of the disk; however, at about four optical scale lengths (h_R) the atomic gas starts to
Score distributions in information retrieval
Arampatzis, A.; Robertson, S.; Kamps, J.
2009-01-01
We review the history of modeling score distributions, focusing on the mixture of normal-exponential by investigating the theoretical as well as the empirical evidence supporting its use. We discuss previously suggested conditions which valid binary mixture models should satisfy, such as the
Directory of Open Access Journals (Sweden)
Zuzana ANDRÁSSYOVÁ
2012-07-01
Full Text Available The study analyses data in order to improve the quality of statistical tools in the assembly processes of automobile seats. Normal distribution of the variables is one of the essential conditions for analysing, examining, and improving manufacturing processes (e.g., manufacturing process capability), although there are increasingly many approaches to handling non-normal data. The probability distribution of the measured data is first tested for goodness of fit between the empirical distribution and the theoretical normal distribution by hypothesis testing in StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each quality characteristic (Safety Regulation, S/R) individually. The study processes the measured data of an airbag's assembly, aiming to obtain normally distributed data and apply statistical process control to it. The results lead to rejection of the null hypothesis (the measured variables do not follow the normal distribution), so data transformation, supported by Minitab 15, is required. Since even this approach does not yield normally distributed data, a procedure should be proposed that leads to a quality output of the whole statistical control of manufacturing processes.
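The test-then-transform step described above can be sketched as follows. The study used StatGraphics and Minitab; here scipy serves as a stand-in, and the skewed synthetic data are purely illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical skewed measurements from an assembly process (illustrative)
raw = rng.gamma(shape=2.0, scale=1.5, size=200)

_, p_before = stats.shapiro(raw)          # normality test before transformation
transformed, lam = stats.boxcox(raw)      # Box-Cox requires positive data
_, p_after = stats.shapiro(transformed)   # normality test after transformation

print(f"Shapiro p before: {p_before:.4f}, after Box-Cox: {p_after:.4f} (lambda={lam:.2f})")
```

As in the study, a rejected null hypothesis before transformation motivates the transform; whether the transformed data pass depends on the data, which is exactly the failure mode the abstract reports.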
DEFF Research Database (Denmark)
Glaveanu, Vlad Petre
This book challenges the standard view that creativity comes only from within an individual by arguing that creativity also exists ‘outside’ of the mind or more precisely, that the human mind extends through the means of action into the world. The notion of ‘distributed creativity’ is not commonly...... used within the literature and yet it has the potential to revolutionise the way we think about creativity, from how we define and measure it to what we can practically do to foster and develop creativity. Drawing on cultural psychology, ecological psychology and advances in cognitive science......, this book offers a basic framework for the study of distributed creativity that considers three main dimensions of creative work: sociality, materiality and temporality. Starting from the premise that creativity is distributed between people, between people and objects and across time, the book reviews...
Normalization of emotion control scale
Directory of Open Access Journals (Sweden)
Hojatoolah Tahmasebian
2014-09-01
Full Text Available Background: Emotion control skill teaches individuals how to identify their emotions and how to express and control them in various situations. The aim of this study was to normalize and measure the internal and external validity and reliability of the emotion control test. Methods: This standardization study was carried out on a statistical population including all pupils, students, teachers, nurses and university professors in Kermanshah in 2012, using Williams' emotion control scale. The subjects included 1,500 people (810 females and 690 males) who were selected by stratified random sampling. Williams' (1997) emotion control scale was used to collect the required data. The Emotional Control Scale is a tool for measuring the degree of control people have over their emotions. This scale has four subscales: anger, depressed mood, anxiety and positive affect. The collected data were analyzed in SPSS using correlation and Cronbach's alpha tests. Results: The internal consistency of the questionnaire, reported by Cronbach's alpha, was acceptable for the emotional control scale, and the correlation between the subscales of the test and between the items of the questionnaire was significant at the 0.01 confidence level. Conclusion: The validity of the emotion control scale among pupils, students, teachers, nurses and university professors in Iran is within an acceptable range, and the test items were correlated with each other, making them appropriate for measuring emotion control.
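Cronbach's alpha, the internal-consistency statistic used in the study, can be computed directly from an item-score matrix. The simulated one-factor data below are purely illustrative:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)     # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(3)
# Hypothetical scale: 4 items driven by one latent factor plus noise
latent = rng.normal(size=(200, 1))
scores = latent + 0.8 * rng.normal(size=(200, 4))
alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.2f}")
```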
Digital Pupillometry in Normal Subjects
Rickmann, Annekatrin; Waizel, Maria; Kazerounian, Sara; Szurman, Peter; Wilhelm, Helmut; Boden, Karl T.
2017-01-01
ABSTRACT The aim of this study was to evaluate the pupil size of normal subjects at different illumination levels with a novel pupillometer. The pupil size of healthy study participants was measured with an infrared-video PupilX pupillometer (MEye Tech GmbH, Alsdorf, Germany) at five different illumination levels (0, 0.5, 4, 32, and 250 lux). Measurements were performed by the same investigator. Ninety images were executed during a measurement period of 3 seconds. The absolute linear camera resolution was approximately 20 pixels per mm. This cross-sectional study analysed 490 eyes of 245 subjects (mean age: 51.9 ± 18.3 years, range: 6–87 years). On average, pupil diameter decreased with increasing light intensities for both eyes, with a mean pupil diameter of 5.39 ± 1.04 mm at 0 lux, 5.20 ± 1.00 mm at 0.5 lux, 4.70 ± 0.97 mm at 4 lux, 3.74 ± 0.78 mm at 32 lux, and 2.84 ± 0.50 mm at 250 lux illumination. Furthermore, it was found that anisocoria increased by 0.03 mm per life decade for all illumination levels (R2 = 0.43). Anisocoria was higher under scotopic and mesopic conditions. This study provides additional information to the current knowledge concerning age- and light-related pupil size and anisocoria as a baseline for future patient studies. PMID:28228832
International Nuclear Information System (INIS)
Verdaguer, E.
1983-01-01
The short wavelength normal modes of self-gravitating rotating polytropic discs in the Bardeen approximation are studied. The discs' oscillations can be seen in terms of two types of modes: the p-modes whose driving forces are pressure forces and the r-modes driven by Coriolis forces. As a consequence of differential rotation coupling between the two takes place and some mixed modes appear, their properties can be studied under the assumption of weak coupling and it is seen that they avoid the crossing of the p- and r-modes. The short wavelength analysis provides a basis for the classification of the modes, which can be made by using the properties of their phase diagrams. The classification is applied to the large wavelength modes of differentially rotating discs with strong coupling and to a uniformly rotating sequence with no coupling, which have been calculated in previous papers. Many of the physical properties and qualitative features of these modes are revealed by the analysis. (author)
AC distribution system for TFTR pulsed loads
International Nuclear Information System (INIS)
Carroll, R.F.; Ramakrishnan, S.; Lemmon, G.N.; Moo, W.I.
1977-01-01
This paper outlines the AC distribution system associated with the Tokamak Fusion Test Reactor and discusses the significant areas related to design, protection, and equipment selection, particularly where there is a departure from normal utility and industrial applications
Notes on power of normality tests of error terms in regression models
Energy Technology Data Exchange (ETDEWEB)
Střelec, Luboš [Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemědělská 1, Brno, 61300 (Czech Republic)
2015-03-10
Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. Normally distributed stochastic errors are therefore necessary for inferences that are not misleading, which explains the necessity and importance of robust tests of normality. The aim of this contribution is thus to discuss normality testing of error terms in regression models. We introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
Notes on power of normality tests of error terms in regression models
International Nuclear Information System (INIS)
Střelec, Luboš
2015-01-01
Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results from the usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. Normally distributed stochastic errors are therefore necessary for inferences that are not misleading, which explains the necessity and importance of robust tests of normality. The aim of this contribution is thus to discuss normality testing of error terms in regression models. We introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
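The residual-normality check motivating such tests can be sketched in Python. The simulated regression, and the use of the classical Shapiro-Wilk and Jarque-Bera tests (rather than the RT class, which the abstract does not define), are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 150
x = rng.uniform(0, 10, n)
# Linear model y = 2 + 0.5 x with normally distributed errors
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, n)

# Ordinary least squares via polyfit; normality tests inspect the residuals
b1, b0 = np.polyfit(x, y, 1)
resid = y - (b0 + b1 * x)

w, p_shapiro = stats.shapiro(resid)
jb, p_jb = stats.jarque_bera(resid)
print(f"slope={b1:.3f}; Shapiro-Wilk p={p_shapiro:.3f}, Jarque-Bera p={p_jb:.3f}")
```

Large p-values here are consistent with normal errors; small ones would warn that t- and F-based inference on the coefficients may be unreliable.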
Van Steen, Maarten
2017-01-01
For this third edition of "Distributed Systems," the material has been thoroughly revised and extended, integrating principles and paradigms into nine chapters: 1. Introduction 2. Architectures 3. Processes 4. Communication 5. Naming 6. Coordination 7. Replication 8. Fault tolerance 9. Security A separation has been made between basic material and more specific subjects. The latter have been organized into boxed sections, which may be skipped on first reading. To assist in understanding the more algorithmic parts, example programs in Python have been included. The examples in the book leave out many details for readability, but the complete code is available through the book's Website, hosted at www.distributed-systems.net.
Quasi-normal modes from non-commutative matrix dynamics
Aprile, Francesco; Sanfilippo, Francesco
2017-09-01
We explore similarities between the process of relaxation in the BMN matrix model and the physics of black holes in AdS/CFT. Focusing on Dyson-fluid solutions of the matrix model, we perform numerical simulations of the real-time dynamics of the system. By quenching the equilibrium distribution we study quasi-normal oscillations of scalar single trace observables, we isolate the lowest quasi-normal mode, and we determine its frequencies as a function of the energy. Considering the BMN matrix model as a truncation of N=4 SYM, we also compute the frequencies of the quasi-normal modes of the dual scalar fields in the AdS5-Schwarzschild background. We compare the results, and we find a surprising similarity.
International Nuclear Information System (INIS)
Wu Yuanfang; Liu Lianshou
1990-01-01
From the study of even and odd multiplicity distributions for hadron-hadron collision in different rapidity windows, we propose a simple picture for charge correlation with nonzero correlation length and calculate the multiplicity distributions and the normalized moments in different rapidity windows at different energies. The results explain the experimentally observed coincidence and separation of even and odd distributions and also the anomalous energy dependence of normalized moments in narrow rapidity windows. The reason for the separation of even-odd distributions, appearing first at large multiplicities, is shown to be energy conservation. The special role of no-particle events in narrow rapidity windows is pointed out
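The normalized moments discussed above are simple to estimate from a sample of event multiplicities. The Poisson toy data below are illustrative only; for a Poisson distribution with mean λ, C₂ = 1 + 1/λ:

```python
import numpy as np

def normalized_moments(multiplicities, qmax=4):
    """C_q = <n^q> / <n>^q estimated from a sample of event multiplicities."""
    n = np.asarray(multiplicities, dtype=float)
    mean = n.mean()
    return {q: (n**q).mean() / mean**q for q in range(2, qmax + 1)}

rng = np.random.default_rng(11)
mult = rng.poisson(lam=8.0, size=100_000)  # toy multiplicity sample
C = normalized_moments(mult)
print({q: round(v, 3) for q, v in C.items()})
```

In a narrow rapidity window the sample would be restricted to particles inside the window, which is where the energy-dependence anomalies described in the abstract appear.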
On the distribution of DDO galaxies
International Nuclear Information System (INIS)
Sharp, N.A.; Jones, B.J.T.; Jones, J.E.
1978-01-01
The distribution of DDO galaxies on the sky and their relationship to normal galaxies have been examined. The results appear to contradict the universality of the luminosity function for galaxies. They also indicate that DDO galaxies are preferentially found in the vicinity of normal galaxies, but not uniformly in that they tend to avoid clusters. This may be due to the dependence of distribution upon morphological type. (author)
A Generic Procedure for BRDF Normalization of Remotely Sensed Data
Energy Technology Data Exchange (ETDEWEB)
D. Yuan
2003-04-01
A generic procedure for Bidirectional Reflectance Distribution Function (BRDF) normalization for airborne multispectral images has been developed and implemented as an add-on module of ENVI at the U.S. Department of Energy's Remote Sensing Laboratory. The main advantage of this procedure is that it does not require multiple image acquisitions over the same area for establishing empirical BRDF functions.
Defining the "normal" postejaculate urinalysis.
Mehta, Akanksha; Jarow, Jonathan P; Maples, Pat; Sigman, Mark
2012-01-01
Although sperm have been shown to be present in the postejaculate urinalysis (PEU) of both fertile and infertile men, the number of sperm present in the PEU of the general population has never been well defined. The objective of this study was to describe the semen and PEU findings in both the general and infertile population, in order to develop a better appreciation for "normal." Infertile men (n = 77) and control subjects (n = 71) were prospectively recruited. Exclusion criteria included azoospermia and medications known to affect ejaculation. All men underwent a history, physical examination, semen analysis, and PEU. The urine was split into 2 containers: PEU1, the initial voided urine, and PEU2, the remaining voided urine. Parametric statistical methods were applied for data analysis to compare sperm concentrations in each sample of semen and urine between the 2 groups of men. Controls had higher average semen volume (3.3 ± 1.6 vs 2.0 ± 1.4 mL, P sperm concentrations (112 million vs 56.2 million, P = .011), compared with infertile men. The presence of sperm in urine was common in both groups, but more prevalent among infertile men (98.7% vs 88.7%, P = .012), in whom it comprised a greater proportion of the total sperm count (46% vs 24%, P = .022). The majority of sperm present in PEU were seen in PEU1 of both controls (69%) and infertile men (88%). An association was noted between severe oligospermia (sperm counts in PEU (sperm in the urine compared with control, there is a large degree of overlap between the 2 populations, making it difficult to identify a specific threshold to define a positive test. Interpretation of a PEU should be directed by whether the number of sperm in the urine could affect subsequent management.
Turbocharging Normalization in Highland Conditions
Directory of Open Access Journals (Sweden)
I. V. Filippov
2017-01-01
Full Text Available Compressors of various types, including turbochargers, are used to supply compressed air for many production processes. The actual performance of turbochargers used in highland conditions differs significantly from the certified values, and the parameters of the compressed air do not always guarantee smooth and efficient operation for consumers. The paper presents research results on turbochargers of the 4CI 425MX4 type, "CENTAC" series, manufactured by the Ingersoll-Rand Company. The research was conducted under industrial highland conditions in a difficult climatic environment; turbochargers running in highland conditions have hardly been investigated before. The combination of low atmospheric pressure with high intake-air temperature causes abnormal operating conditions for a turbocharger. Only N. M. Barannikov has published theoretical studies of such operating conditions; no practical research results are available. To normalize turbocharger operation, mechanical pressurization of the suction pipe was adopted. Based on theoretical analysis, a TurboMAX MAX500 blower was chosen as the supercharger. The next stage of the theoretical work was to construct the characteristics of the 4CI 425MX4 turbocharger with a mechanical supercharger in the suction pipe. The boost minimizes the time during which additional compressors are needed when the intake-air parameters change, and it ensures smooth and efficient operation for consumers. To verify the theoretical results, namely the technique for recalculating the turbocharger characteristics under real suction conditions, experimental research was carried out. The average error between the experimental and theoretical data is 2.9783%, which confirms the validity of the technique used for reducing the turbocharger characteristics to real suction conditions.
Vaginal Discharge: What's Normal, What's Not
KidsHealth / For Teens. What Is Vaginal Discharge? Vaginal discharge is fluid that comes from ...
Should Japan Become a Normal Country
National Research Council Canada - National Science Library
Yildiz, Ahmet
2005-01-01
This thesis evaluates Japanese geopolitical change in the post-Cold War era. It does so by analyzing Japan's history, its foreign policy since 1945, its reasons for becoming a normal country, and the impact of its normalization...
International Nuclear Information System (INIS)
Baidillah, Marlin R; Takei, Masahiro
2017-01-01
A nonlinear normalization model, called the exponential model, has been developed for electrical capacitance tomography (ECT) with external electrodes under gap permittivity conditions. The exponential normalization model is proposed based on the inherently nonlinear relationship between the mixture permittivity and the measured capacitance due to the gap permittivity of the inner wall. The parameters of the exponential equation are derived from an exponential fitting curve based on simulation, and a scaling function is added to adjust for the experimental system conditions. The exponential normalization model was applied to two-dimensional low- and high-contrast dielectric distribution phantoms in simulation and experimental studies. The proposed normalization model was compared with other normalization models, i.e. the Parallel, Series, Maxwell and Böttcher models. Based on the comparison of image reconstruction results, the exponential model reliably predicts the nonlinear normalization of measured capacitance for both low- and high-contrast dielectric distributions. (paper)
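The idea of deriving normalization parameters from an exponential fitting curve can be sketched as below. The particular parameterization and the synthetic data are assumptions for illustration, not the paper's actual model:

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_model(c, a, b):
    """Assumed exponential normalization curve mapping normalized
    capacitance c in [0, 1] to a normalized permittivity response.
    Hypothetical parameterization, not the paper's exact equation."""
    return (np.exp(a * c) - 1.0) / (np.exp(a) - 1.0) * b

# Synthetic calibration data: true parameters a=3, b=1, plus small noise
c = np.linspace(0.0, 1.0, 20)
noisy = exp_model(c, 3.0, 1.0) + np.random.default_rng(5).normal(0, 0.01, c.size)

# Fit the exponential curve to recover the normalization parameters
(a_fit, b_fit), _ = curve_fit(exp_model, c, noisy, p0=[1.0, 1.0])
print(f"a={a_fit:.2f}, b={b_fit:.2f}")
```

In the paper's workflow the fitting data come from simulation, and a separate scaling function then adapts the fitted curve to the experimental system.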
On dose distribution comparison
International Nuclear Information System (INIS)
Jiang, Steve B; Sharp, Greg C; Neicu, Toni; Berbeco, Ross I; Flampouri, Stella; Bortfeld, Thomas
2006-01-01
In radiotherapy practice, one often needs to compare two dose distributions. Especially with the wide clinical implementation of intensity-modulated radiation therapy, software tools for quantitative dose (or fluence) distribution comparison are required for patient-specific quality assurance. Dose distribution comparison is not a trivial task since it has to be performed in both dose and spatial domains in order to be clinically relevant. Each of the existing comparison methods has its own strengths and weaknesses and there is room for improvement. In this work, we developed a general framework for comparing dose distributions. Using a new concept called maximum allowed dose difference (MADD), the comparison in both dose and spatial domains can be performed entirely in the dose domain. Formulae for calculating MADD values for various comparison methods, such as composite analysis and gamma index, have been derived. For convenience in clinical practice, a new measure called normalized dose difference (NDD) has also been proposed, which is the dose difference at a point scaled by the ratio of MADD to the predetermined dose acceptance tolerance. Unlike the simple dose difference test, NDD works in both low and high dose gradient regions because it considers both dose and spatial acceptance tolerances through MADD. The new method has been applied to a test case and a clinical example. It was found that the new method combines the merits of the existing methods (accurate, simple, clinically intuitive and insensitive to dose grid size) and can easily be implemented into any dose/intensity comparison tool
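The NDD idea described above can be sketched in one dimension. The specific MADD form used here (combining the dose tolerance with the spatial tolerance scaled by the local reference-dose gradient) and the logistic test profile are assumptions for illustration, not necessarily the paper's exact definitions:

```python
import numpy as np

def ndd(dose_eval, dose_ref, dx, dose_tol, dist_tol):
    """Normalized dose difference (sketch). MADD is assumed to combine
    the dose tolerance with the spatial tolerance times the local
    reference-dose gradient; the NDD is the raw dose difference scaled
    by dose_tol / MADD."""
    grad = np.gradient(dose_ref, dx)
    madd = np.sqrt(dose_tol**2 + (dist_tol * np.abs(grad))**2)
    return (dose_eval - dose_ref) * dose_tol / madd

x = np.linspace(0.0, 10.0, 201)            # position in cm
ref = 100.0 / (1.0 + np.exp(-(x - 5.0)))   # steep, penumbra-like profile (%)
ev = 100.0 / (1.0 + np.exp(-(x - 5.1)))    # the same profile shifted by 1 mm
ndd_vals = ndd(ev, ref, x[1] - x[0], dose_tol=3.0, dist_tol=0.3)
print(f"max |NDD| = {np.abs(ndd_vals).max():.2f}")
```

Because MADD grows with the local gradient, a small spatial shift in the high-gradient penumbra produces a modest NDD rather than the huge value a plain dose-difference test would give, which is the clinical motivation for the method.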
International Nuclear Information System (INIS)
Hassanein, A.; Konkashbaev, I.
1999-01-01
The structure of a collisionless scrape-off-layer (SOL) plasma in tokamak reactors is being studied to define the electron distribution function and the corresponding sheath potential between the divertor plate and the edge plasma. The collisionless model is shown to be valid during the thermal phase of a plasma disruption, as well as during the newly desired low-recycling normal phase of operation with low-density, high-temperature, edge plasma conditions. An analytical solution is developed by solving the Fokker-Planck equation for electron distribution and balance in the SOL. The solution is in good agreement with numerical studies using Monte-Carlo methods. The analytical solutions provide an insight to the role of different physical and geometrical processes in a collisionless SOL during disruptions and during the enhanced phase of normal operation over a wide range of parameters
TRASYS form factor matrix normalization
Tsuyuki, Glenn T.
1992-01-01
A method has been developed for adjusting a TRASYS enclosure form factor matrix to unity. This approach is not limited to closed geometries, and in fact, it is primarily intended for use with open geometries. The purpose of this approach is to prevent optimistic form factors to space. In this method, nodal form factor sums are calculated within 0.05 of unity using TRASYS, although deviations as large as 0.10 may be acceptable, and then, a process is employed to distribute the difference amongst the nodes. A specific example has been analyzed with this method, and a comparison was performed with a standard approach for calculating radiation conductors. In this comparison, hot and cold case temperatures were determined. Exterior nodes exhibited temperature differences as large as 7 C and 3 C for the hot and cold cases, respectively when compared with the standard approach, while interior nodes demonstrated temperature differences from 0 C to 5 C. These results indicate that temperature predictions can be artificially biased if the form factor computation error is lumped into the individual form factors to space.
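The redistribution step described above can be sketched as a simple row renormalization. Proportional scaling is one plausible scheme and need not match TRASYS's exact rule:

```python
import numpy as np

def normalize_form_factors(F, tol=0.05):
    """Distribute each row's deviation from unity across its entries,
    proportionally to the entries themselves (an assumed redistribution
    scheme; the TRASYS procedure may differ in detail)."""
    F = np.asarray(F, dtype=float).copy()
    sums = F.sum(axis=1)
    if np.any(np.abs(sums - 1.0) > tol):
        raise ValueError("a row sum deviates from unity by more than tol")
    return F / sums[:, None]   # each row now sums exactly to 1

F = np.array([[0.30, 0.42, 0.26],    # sums to 0.98 (optimistic to space)
              [0.50, 0.29, 0.24]])   # sums to 1.03
G = normalize_form_factors(F)
print(G.sum(axis=1))
```

The tolerance check mirrors the text: sums within 0.05 of unity are adjusted, while larger deviations indicate the enclosure model itself should be revisited.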
powerbox: Arbitrarily structured, arbitrary-dimension boxes and log-normal mocks
Murray, Steven G.
2018-05-01
powerbox creates density grids (or boxes) with an arbitrary two-point distribution (i.e. power spectrum). The software works in any number of dimensions, creates Gaussian or Log-Normal fields, and measures power spectra of output fields to ensure consistency. The primary motivation for creating the code was the simple creation of log-normal mock galaxy distributions, but the methodology can be used for other applications.
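The log-normal mock recipe powerbox implements can be sketched with the standard Fourier "colouring" trick: draw white noise, weight its Fourier modes by the square root of the target power spectrum, and exponentiate. The normalization conventions below are assumptions and will not match powerbox's internals exactly:

```python
import numpy as np

def gaussian_field(n, boxlen, pk, seed=0):
    """2-D Gaussian random field with target isotropic power spectrum pk(k),
    built by colouring white noise in Fourier space (standard recipe;
    conventions are illustrative)."""
    rng = np.random.default_rng(seed)
    kfreq = 2.0 * np.pi * np.fft.fftfreq(n, d=boxlen / n)
    kx, ky = np.meshgrid(kfreq, kfreq, indexing="ij")
    k = np.sqrt(kx**2 + ky**2)
    amp = np.zeros_like(k)
    amp[k > 0] = np.sqrt(pk(k[k > 0]))        # zero DC mode -> zero-mean field
    noise = rng.normal(size=(n, n))
    field = np.fft.ifft2(np.fft.fft2(noise) * amp).real
    return field / field.std()                 # normalize to unit variance

delta_g = gaussian_field(64, 100.0, lambda k: k**-2.0)
density = np.exp(delta_g)   # log-normal transform: strictly positive field
print(density.min() > 0, round(delta_g.std(), 3))
```

Exponentiating the Gaussian field yields a strictly positive, skewed field, which is why log-normal mocks are a popular stand-in for galaxy density distributions.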
A note on totally normal spaces
International Nuclear Information System (INIS)
Zougdani, H.K.
1990-10-01
In this note we give the necessary and sufficient condition for a topological space X such that the product space X x Y is totally normal for any (non discrete) metric space Y, and we show that a totally normal p-space need not be a perfectly normal in general, which makes Theorem 2 doubtful. (author). 6 refs
Manufacturing technology for practical Josephson voltage standards
International Nuclear Information System (INIS)
Kohlmann, Johannes; Kieler, Oliver
2016-01-01
In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series circuits for voltage standards. We first summarize some foundations of Josephson voltage standards and sketch the concept and the layout of the circuits, before describing the manufacturing technology for modern practical Josephson voltage standards.
Neutron scattering by normal liquids
Energy Technology Data Exchange (ETDEWEB)
Gennes, P.G. de [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires
1961-07-01
Neutron data on motions in normal liquids well below the critical point are reviewed and classified according to the order of magnitude of the momentum transfers ℏq and energy transfers ℏω. For large momentum transfers a perfect gas model is valid. For smaller q and incoherent scattering, the major effects are related to the existence of two characteristic times: the period of oscillation of an atom in its cell, and the average lifetime of the atom in a definite cell. Various interpolation schemes covering both time scales are discussed. For coherent scattering and intermediate q, the energy spread is expected to show a minimum whenever q corresponds to a diffraction peak. For very small q the standard macroscopic description of density fluctuations is applicable. The limits of the various q and ω domains and the validity of the various approximations are discussed by a method of moments. The possibility of observing discrete transitions due to internal degrees of freedom in polyatomic molecules, in spite of the 'Doppler width' caused by translational motions, is also examined. (author)
Normal distal pulmonary vein anatomy
Directory of Open Access Journals (Sweden)
Wiesława Klimek-Piotrowska
2016-01-01
Full Text Available Background. It is well known that the pulmonary veins (PVs, especially their myocardial sleeves play a critical role in the initiation and maintenance of atrial fibrillation. Understanding the PV anatomy is crucial for the safety and efficacy of all procedures performed on PVs. The aim of this study was to present normal distal PV anatomy and to create a juxtaposition of all PV ostium variants.Methods. A total of 130 randomly selected autopsied adult human hearts (Caucasian were examined. The number of PVs ostia was evaluated and their diameter was measured. The ostium-to-last-tributary distance and macroscopic presence of myocardial sleeves were also evaluated.Results. Five hundred forty-one PV ostia were identified. Four classical PV ostia patterns (two left and two right PVs were observed in 70.8% of all cases. The most common variant was the classical pattern with additional middle right PV (19.2%, followed by the common ostium for the left superior and the inferior PVs (4.44%. Mean diameters of PV ostia (for the classical pattern were: left superior = 13.8 ± 2.9 mm; left inferior = 13.3 ± 3.4 mm; right superior = 14.3 ± 2.9 mm; right inferior = 13.7 ± 3.3 mm. When present, the additional middle right PV ostium had the smallest PV ostium diameter in the heart (8.2 ± 4.1 mm. The mean ostium-to-last-tributary (closest to the atrium distances were: left superior = 15.1 ± 4.6 mm; left inferior = 13.5 ± 4.0 mm; right superior = 11.8 ± 4.0 mm; right inferior = 11.0 ± 3.7 mm. There were no statistically significant differences between sexes in ostia diameters and ostium-to-last-tributary distances.Conclusion. Only 71% of the cases have four standard pulmonary veins. The middle right pulmonary vein is present in almost 20% of patients. Presented data can provide useful information for the clinicians during interventional procedures or radiologic examinations of PVs.
Normal Anti-Invariant Submanifolds of Paraquaternionic Kähler Manifolds
Directory of Open Access Journals (Sweden)
Novac-Claudiu Chiriac
2006-12-01
Full Text Available We introduce normal anti-invariant submanifolds of paraquaternionic Kähler manifolds and study the geometric structures induced on them. We obtain necessary and sufficient conditions for the integrability of the distributions defined on a normal anti-invariant submanifold. Also, we present characterizations of local (global) anti-invariant products.
International Nuclear Information System (INIS)
Lee, Jin Hwa; Yoon, Seong Kuk; Choi, Sun Seob; Nam, Kyung Jin; Cho, Se Heon; Kim, Dae Cheol; Kim, Jung Il; Kim, Eun Kyung
2006-01-01
We wanted to evaluate the clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast. From Apr 2003 to Feb 2005, 107 patients with 113 palpable abnormalities who had combined normal sonographic and normal mammographic findings were retrospectively studied. The evaluated parameters included age of the patients, the clinical referrals, the distribution of the locations of the palpable abnormalities, whether there was a past surgical history, the mammographic densities and the sonographic echo patterns (purely hyperechoic fibrous tissue, mixed fibroglandular breast tissue, predominantly isoechoic glandular tissue and isoechoic subcutaneous fat tissue) at the sites of clinical concern, whether there was a change in imaging and/or the physical examination results at follow-up, and whether there were biopsy results. This study period was chosen to allow a follow-up period of at least 12 months. The patients' ages ranged from 22 to 66 years (mean age: 48.8 years) and 62 (58%) of the 107 patients were between 41 and 50 years old (58%). The most common location of the palpable abnormalities was the upper outer portion of the breast (45%) and most of the mammographic densities were dense patterns (BI-RADS Type 3 or 4: 91%). Our cases showed similar distribution for all the types of sonographic echo patterns. 23 patients underwent biopsy; all the biopsy specimens were benign. For the 84 patients with 90 palpable abnormalities who were followed, there was no interval development of breast cancer in the areas of clinical concern. Our results suggest that we can follow up and prevent unnecessary biopsies in women with palpable abnormalities when both the mammography and ultrasonography show normal tissue, but this study was limited by its small sample size. Therefore, a larger study will be needed to better define the negative predictive value of combined normal sonographic and mammographic findings
Statistical distributions as applied to environmental surveillance data
International Nuclear Information System (INIS)
Speer, D.R.; Waite, D.A.
1975-09-01
Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of any data group to the various distributions. The significance of fitting statistical distributions to the data is discussed
Lindström, Robin; Rosvall, Tobias
2013-01-01
A performance analysis was carried out on a SAAB 2000 as a reference object. Various methods for powering aircraft in a more environmentally friendly way were evaluated together with distributed propulsion. After investigation, electric motors combined with zinc-air batteries were chosen to power the SAAB 2000 with distributed propulsion. A performance analysis was carried out on this aircraft in the same way as for the original SAAB 2000. The results were compared, and the conclusion was that the range was too short for the configuration to...
Quasihomogeneous distributions
von Grudzinski, O
1991-01-01
This is a systematic exposition of the basics of the theory of quasihomogeneous (in particular, homogeneous) functions and distributions (generalized functions). A major theme is the method of taking quasihomogeneous averages. It serves as the central tool for the study of the solvability of quasihomogeneous multiplication equations and of quasihomogeneous partial differential equations with constant coefficients. Necessary and sufficient conditions for solvability are given. Several examples are treated in detail, among them the heat and the Schrödinger equation. The final chapter is devoted to quasihomogeneous wave front sets and their application to the description of singularities of quasihomogeneous distributions, in particular to quasihomogeneous fundamental solutions of the heat and of the Schrödinger equation.
Effects of normalization on quantitative traits in association test
2009-01-01
Background: Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed, and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results: We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that the rank-based transformation gives generally the best and most consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion: For small sample sizes or genetic effects, the improvement in sensitivity for the rank transformation outweighs the slight increase in false positive rate. However, for large sample sizes and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest. PMID:20003414
Effects of normalization on quantitative traits in association test
Directory of Open Access Journals (Sweden)
Yap Von Bing
2009-12-01
Background: Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed, and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results: We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that the rank-based transformation gives generally the best and most consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion: For small sample sizes or genetic effects, the improvement in sensitivity for the rank transformation outweighs the slight increase in false positive rate. However, for large sample sizes and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest.
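The rank-based transformation this abstract favors is commonly implemented as a rank-based inverse normal transformation: each trait value is replaced by the normal quantile of its rank. A minimal sketch, assuming Blom's rank offset (c = 3/8), which is one common convention and not necessarily the one used in the study:

```python
# Sketch of a rank-based inverse normal transformation (Blom's offset).
import numpy as np
from scipy import stats

def rank_inverse_normal(x, c=3.0 / 8.0):
    """Map values to normal scores via their ranks (ties get average rank)."""
    x = np.asarray(x, dtype=float)
    ranks = stats.rankdata(x)                        # ranks 1..n
    quantiles = (ranks - c) / (len(x) - 2 * c + 1)   # strictly inside (0, 1)
    return stats.norm.ppf(quantiles)

# A heavily skewed synthetic trait becomes close to normal after transformation.
skewed = np.random.default_rng(1).exponential(size=500)
z = rank_inverse_normal(skewed)
print(stats.shapiro(z).pvalue)  # high p-value: consistent with normality
```

The transformation is monotone, so it preserves the ordering of trait values while discarding the original scale, which is why it is robust across trait distributions.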
Binns, Lewis A.; Valachis, Dimitris; Anderson, Sean; Gough, David W.; Nicholson, David; Greenway, Phil
2002-07-01
Previously, we have developed techniques for Simultaneous Localization and Map Building based on the augmented state Kalman filter. Here we report the results of experiments conducted over multiple vehicles each equipped with a laser range finder for sensing the external environment, and a laser tracking system to provide highly accurate ground truth. The goal is simultaneously to build a map of an unknown environment and to use that map to navigate a vehicle that otherwise would have no way of knowing its location, and to distribute this process over several vehicles. We have constructed an on-line, distributed implementation to demonstrate the principle. In this paper we describe the system architecture, the nature of the experimental set up, and the results obtained. These are compared with the estimated ground truth. We show that distributed SLAM has a clear advantage in the sense that it offers a potential super-linear speed-up over single vehicle SLAM. In particular, we explore the time taken to achieve a given quality of map, and consider the repeatability and accuracy of the method. Finally, we discuss some practical implementation issues.
DEFF Research Database (Denmark)
Kriegbaum, Mette Camilla; Jacobsen, Benedikte; Füchtbauer, Annette
2016-01-01
of C4.4A in normal physiology and cancer progression. The unchallenged C4.4A-deficient mice were viable, fertile, born in a normal Mendelian distribution and, surprisingly, displayed normal development of squamous epithelia. The C4.4A-deficient mice were, nonetheless, significantly lighter than...
Empirical evaluation of data normalization methods for molecular classification.
Huang, Huei-Chung; Qin, Li-Xuan
2018-01-01
Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers, an increasingly important application of microarrays in the era of personalized medicine. In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy.
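The abstract does not name the three normalization methods it evaluates, so as a generic illustration only, here is quantile normalization, one widely used microarray normalization method (this simple version ignores tie-averaging):

```python
# Sketch of quantile normalization: force every column (one array/sample per
# column) to share the same distribution, namely the mean of the sorted columns.
# Generic illustration; not necessarily one of the three methods in the study.
import numpy as np

def quantile_normalize(X):
    """Quantile-normalize the columns of a 2-D array (rows = features)."""
    order = np.argsort(X, axis=0)                  # per-column sort order
    ranks = np.argsort(order, axis=0)              # rank of each entry
    mean_sorted = np.sort(X, axis=0).mean(axis=1)  # shared reference distribution
    return mean_sorted[ranks]                      # map ranks back to values

X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
Q = quantile_normalize(X)
print(Q)  # every column is now a permutation of the same reference values
```

After normalization, differences between samples that stem purely from handling (scale or location shifts of whole arrays) are removed, while the within-sample ranking of features is preserved.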
MR guided spatial normalization of SPECT scans
International Nuclear Information System (INIS)
Crouch, B.; Barnden, L.R.; Kwiatek, R.
2010-01-01
Full text: In SPECT population studies where magnetic resonance (MR) scans are also available, the higher resolution of the MR scans allows for an improved spatial normalization of the SPECT scans. In this approach, the SPECT images are first coregistered to their corresponding MR images by a linear (affine) transformation which is calculated using SPM's mutual information maximization algorithm. Non-linear spatial normalization maps are then computed either directly from the MR scans using SPM's built-in spatial normalization algorithm, or from segmented T1 MR images using DARTEL, an advanced diffeomorphism-based spatial normalization algorithm. We compare these MR based methods to standard SPECT based spatial normalization for a population of 27 fibromyalgia patients and 25 healthy controls with spin-echo T1 scans. We identify significant perfusion deficits in prefrontal white matter in FM patients, with the DARTEL based spatial normalization procedure yielding stronger statistics than the standard SPECT based spatial normalization. (author)
Anomalous normal mode oscillations in semiconductor microcavities
Energy Technology Data Exchange (ETDEWEB)
Wang, H. [Univ. of Oregon, Eugene, OR (United States). Dept. of Physics; Hou, H.Q.; Hammons, B.E. [Sandia National Labs., Albuquerque, NM (United States)
1997-04-01
Semiconductor microcavities as a composite exciton-cavity system can be characterized by two normal modes. Under an impulsive excitation by a short laser pulse, optical polarizations associated with the two normal modes have a π phase difference. The total induced optical polarization is then expected to exhibit a sin²(Ωt)-like oscillation, where 2Ω is the normal mode splitting, reflecting a coherent energy exchange between the exciton and cavity. In this paper the authors present experimental studies of normal mode oscillations using three-pulse transient four-wave mixing (FWM). The result reveals, surprisingly, that when the cavity is tuned far below the exciton resonance, the normal mode oscillation in the polarization is cos²(Ωt)-like, in contrast to what is expected from the simple normal mode model. This anomalous normal mode oscillation reflects the important role of virtual excitation of electronic states in semiconductor microcavities.
Rare Earth Elements Distribution in Beryl
International Nuclear Information System (INIS)
El Gawish, H.K.; Nada, N.; Ghaly, W.A.; Helal, A.I.
2012-01-01
Laser ablation method is applied to a double focusing inductively coupled plasma mass spectrometer to determine the rare earth element distribution in some selected beryl samples. White, green and blue beryl samples are selected from the Egyptian eastern desert. Chondrite-normalized distribution plots of the rare earth elements in the selected beryl samples are investigated.
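Chondrite normalization itself is an element-wise division of measured concentrations by chondritic reference values. A minimal sketch; the numbers below are placeholders for illustration, not the study's measurements or its chosen reference set:

```python
# Sketch of chondrite normalization: divide each measured REE concentration
# (ppm) by its chondritic reference value. All numbers here are illustrative.
sample_ppm = {"La": 3.2, "Ce": 7.1, "Nd": 4.0}           # hypothetical beryl data
chondrite_ppm = {"La": 0.237, "Ce": 0.613, "Nd": 0.457}  # illustrative references

normalized = {el: sample_ppm[el] / chondrite_ppm[el] for el in sample_ppm}
print(normalized)
```

Plotting the normalized values against the elements in order of atomic number gives the familiar chondrite-normalized REE pattern, from which enrichment or depletion relative to chondrite is read off directly.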
Distribution of age at menopause in two Danish samples
DEFF Research Database (Denmark)
Boldsen, J L; Jeune, B
1990-01-01
We analyzed the distribution of reported age at natural menopause in two random samples of Danish women (n = 176 and n = 150) to determine the shape of the distribution and to disclose any possible trends in the distribution parameters. It was necessary to correct the frequencies of the reported ages for the effect of differing ages at reporting. The corrected distribution of age at menopause differs from the normal distribution in the same way in both samples. Both distributions could be described by a mixture of two normal distributions. It appears that most of the parameters of the normal distribution mixtures remain unchanged over a 50-year time lag. The position of the distribution, that is, the mean age at menopause, however, increases slightly but significantly.
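Describing a distribution as a mixture of two normals, as done here for the menopause-age data, is typically handled with the EM algorithm. A minimal univariate sketch on synthetic ages; the component parameters and sample sizes are invented for illustration, not taken from the paper:

```python
# Minimal EM for a two-component univariate normal mixture (synthetic data).
import numpy as np

def fit_two_normal_mixture(x, iters=200):
    """Fit weights, means, and standard deviations of a 2-normal mixture by EM."""
    x = np.asarray(x, dtype=float)
    # Crude initialization: lower and upper quartiles as starting means.
    mu = np.array([np.quantile(x, 0.25), np.quantile(x, 0.75)])
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        # (the 1/sqrt(2*pi) constant cancels in the normalization).
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / sd
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations.
        n_k = resp.sum(axis=0)
        w = n_k / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / n_k
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
    return w, mu, sd

rng = np.random.default_rng(2)
ages = np.concatenate([rng.normal(44, 4, 60), rng.normal(51, 2.5, 240)])
w, mu, sd = fit_two_normal_mixture(ages)
print(f"weights={w}  means={mu}  sds={sd}")
```

A useful sanity check on any converged fit is that the mixture's weighted mean equals the sample mean, an exact identity of the EM M-step.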
Multiple imputation in the presence of non-normal data.
Lee, Katherine J; Carlin, John B
2017-02-20
Multiple imputation (MI) is becoming increasingly popular for handling missing data. Standard approaches for MI assume normality for continuous variables (conditionally on the other variables in the imputation model). However, it is unclear how to impute non-normally distributed continuous variables. Using simulation and a case study, we compared various transformations applied prior to imputation, including a novel non-parametric transformation, to imputation on the raw scale and using predictive mean matching (PMM) when imputing non-normal data. We generated data from a range of non-normal distributions, and set 50% to missing completely at random or missing at random. We then imputed missing values on the raw scale, following a zero-skewness log, Box-Cox or non-parametric transformation and using PMM with both type 1 and 2 matching. We compared inferences regarding the marginal mean of the incomplete variable and the association with a fully observed outcome. We also compared results from these approaches in the analysis of depression and anxiety symptoms in parents of very preterm compared with term-born infants. The results provide novel empirical evidence that the decision regarding how to impute a non-normal variable should be based on the nature of the relationship between the variables of interest. If the relationship is linear in the untransformed scale, transformation can introduce bias irrespective of the transformation used. However, if the relationship is non-linear, it may be important to transform the variable to accurately capture this relationship. A useful alternative is to impute the variable using PMM with type 1 matching. Copyright © 2016 John Wiley & Sons, Ltd.
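One of the pre-imputation transformations compared above, Box-Cox, can be sketched as follows. This is a generic illustration using SciPy on synthetic data, not the authors' MI pipeline:

```python
# Sketch: Box-Cox transform a skewed variable before imputation, then
# back-transform afterwards. Data are synthetic; generic illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
y = rng.lognormal(0.0, 0.7, size=400)  # skewed, strictly positive variable

# Box-Cox chooses lambda by maximum likelihood to make the result near-normal.
y_bc, lam = stats.boxcox(y)
print(f"lambda={lam:.2f}  skew before={stats.skew(y):.2f}  "
      f"after={stats.skew(y_bc):.2f}")

# After imputing on the transformed scale, invert the transform:
# y = (lam * y_bc + 1) ** (1 / lam) for lam != 0, else y = exp(y_bc).
```

As the abstract cautions, such a transformation helps only when the relationship of interest is non-linear on the raw scale; if it is linear, transforming can itself introduce bias.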
Beare, Brendan K.
2009-01-01
Suppose that X and Y are random variables. We define a replicating function to be a function f such that f(X) and Y have the same distribution. In general, the set of replicating functions for a given pair of random variables may be infinite. Suppose we have some objective function, or cost function, defined over the set of replicating functions, and we seek to estimate the replicating function with the lowest cost. We develop an approach to estimating the cheapest replicating function that i...
2007-01-01
Please note that starting from 1 March 2007, the mail distribution and collection times will be modified for the following buildings: 6, 8, 9, 10, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 29, 69, 40, 70, 101, 102, 109, 118, 152, 153, 154, 155, 166, 167, 169, 171, 174, 261, 354, 358, 576, 579 and 580. Complementary Information on the new times will be posted on the entry doors and left in the mail boxes of each building. TS/FM Group
Stewart, Stan
2004-01-01
Switchgear plays a fundamental role within the power supply industry. It is required to isolate faulty equipment, divide large networks into sections for repair purposes, reconfigure networks in order to restore power supplies and control other equipment. This book begins with the general principles of the switchgear function and leads on to discuss topics such as interruption techniques, fault level calculations, switching transients and electrical insulation; making this an invaluable reference source. Solutions to practical problems associated with distribution switchgear are also included.
The "normal" elongation of river basins
Castelltort, Sebastien
2013-04-01
The spacing between major transverse rivers at the front of Earth's linear mountain belts consistently scales with about half of the mountain half-width [1], despite strong differences in climate and rock uplift rates. Like other empirical measures describing drainage network geometry, this result seems to indicate that the form of river basins, among other properties of landscapes, is invariant. Paradoxically, in many current landscape evolution models, the patterns of drainage network organization, as seen for example in drainage density and channel spacing, seem to depend on both climate [2-4] and tectonics [5]. Hovius' observation [1] is one of several unexplained "laws" in geomorphology that still shed mystery on how water, and rivers in particular, shape the Earth's landscapes. This narrow range of drainage network shapes found in the Earth's orogens is classically regarded as an optimal catchment geometry that embodies a "most probable state" in the uplift-erosion system of a linear mountain belt. River basins currently having an aspect ratio away from this geometry are usually considered unstable and expected to re-equilibrate over geological time-scales. Here I show that the Length/Width~2 aspect ratio of drainage basins in linear mountain belts is the natural expectation of sampling a uniform or normal distribution of basin shapes, and bears no information on the geomorphic processes responsible for landscape development. This finding also applies to Hack's [6] law of river basin areas and lengths, a close parent of Hovius' law. [1] Hovius, N. Basin Res. 8, 29-44 (1996) [2] Simpson, G. & Schlunegger, F. J. Geophys. Res. 108, 2300 (2003) [3] Tucker, G. & Bras, R. Water Resour. Res. 34, 2751-2764 (1998) [4] Tucker, G. & Slingerland, R. Water Resour. Res. 33, 2031-2047 (1997) [5] Tucker, G. E. & Whipple, K. X. J. Geophys. Res. 107, 1-1 (2002) [6] Hack, J. US Geol. Surv. Prof. Pap. 294-B (1957)
Knops, Z.F.; Maintz, J.B.A.; Viergever, M.A.; Pluim, J.P.W.; Gee, J.C.; Maintz, J.B.A.; Vannier, M.W.
2003-01-01
A method for the efficient re-binning and shading-based correction of intensity distributions of the images prior to normalized mutual information-based registration is presented. Our intensity distribution re-binning method is based on the K-means clustering algorithm as opposed to the generally