WorldWideScience

Sample records for statistical density operators

  1. Density operators in quantum mechanics

    International Nuclear Information System (INIS)

    Burzynski, A.

    1979-01-01

A brief discussion and résumé of the density-operator formalism as it occurs in modern physics (quantum optics, quantum statistical physics, quantum theory of radiation) is presented. In particular, we emphasize the projection-operator method, the application of spectral theorems, and the superoperator formalism in operator Hilbert spaces (of Hilbert-Schmidt type). The paper includes an appendix on direct sums and direct products of spaces and operators, and on problems of reducibility for operator classes using projection operators. (author)

  2. Nonequilibrium statistical operator in hot-electron transport theory

    International Nuclear Information System (INIS)

    Xing, D.Y.; Liu, M.

    1991-09-01

The nonequilibrium statistical operator method developed by Zubarev is generalized and applied to the study of hot-electron transport in semiconductors. The steady-state balance equations for momentum and energy are derived to the lowest order in the electron-lattice coupling. We show that the derived balance equations are exactly the same as those obtained by Lei and Ting. This equivalence stems from the fact that, to linear order in the electron-lattice coupling, the two statistical density matrices have identical effects when used to calculate the average value of a dynamical operator. The application to steady-state and transient hot-electron transport in multivalley semiconductors is also discussed. (author). 28 refs, 1 fig

  3. Nonequilibrium statistical Zubarev's operator and Green's functions for an inhomogeneous electron gas

    Directory of Open Access Journals (Sweden)

P. Kostrobii

    2006-01-01

Nonequilibrium properties of an inhomogeneous electron gas are studied using D.N. Zubarev's method of the nonequilibrium statistical operator. Generalized transport equations for the mean values of the inhomogeneous operators of electron number density, momentum density, and total energy density are obtained for weakly and strongly nonequilibrium states. We derive a chain of equations for the Green's functions which connects the commutative time-dependent "density-density", "momentum-momentum" and "enthalpy-enthalpy" Green's functions with reduced Green's functions of the generalized transport coefficients and with Green's functions for higher-order memory kernels in the case of a weakly nonequilibrium, spatially inhomogeneous electron gas.

  4. Statistical density of nuclear excited states

    Directory of Open Access Journals (Sweden)

    V. M. Kolomietz

    2015-10-01

A semi-classical approximation is applied to the calculation of single-particle and statistical level densities in excited nuclei. Landau's conception of quasi-particles with nucleon effective mass m* < m is used. The approach provides a correct description of the continuum contribution to the level density for realistic finite-depth potentials. It is shown that the continuum states do not significantly affect the thermodynamic calculations at sufficiently small temperatures T ≤ 1 MeV but strongly reduce the results for the excitation energy at high temperatures. Using a standard Woods-Saxon potential and a nucleon effective mass m* = 0.7m, the A-dependence of the statistical level-density parameter K was evaluated in good qualitative agreement with experimental data.

  5. Planar-channeling spatial density under statistical equilibrium

    International Nuclear Information System (INIS)

    Ellison, J.A.; Picraux, S.T.

    1978-01-01

    The phase-space density for planar channeled particles has been derived for the continuum model under statistical equilibrium. This is used to obtain the particle spatial probability density as a function of incident angle. The spatial density is shown to depend on only two parameters, a normalized incident angle and a normalized planar spacing. This normalization is used to obtain, by numerical calculation, a set of universal curves for the spatial density and also for the channeled-particle wavelength as a function of amplitude. Using these universal curves, the statistical-equilibrium spatial density and the channeled-particle wavelength can be easily obtained for any case for which the continuum model can be applied. Also, a new one-parameter analytic approximation to the spatial density is developed. This parabolic approximation is shown to give excellent agreement with the exact calculations

  6. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    International Nuclear Information System (INIS)

    Dai, Wu-Sheng; Xie, Mi

    2013-01-01

In this paper, we give a general discussion of the calculation of the statistical distribution from a given operator relation among the creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose-Einstein and Fermi-Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are Gentile distributions with different maximum occupation numbers, determined by the deformation parameter q. This implies that many results given in the literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion of calculating the statistical distribution from relations among the creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► An argument that many results on q-deformation distributions in the literature are inaccurate or incomplete
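As a numerical illustration of the claim in this abstract, the mean occupation number of the Gentile distribution with maximum occupation n can be computed directly from the truncated grand-canonical sum. The sketch below is not from the paper; it is a stdlib-Python check that n = 1 reproduces Fermi-Dirac and that large n approaches Bose-Einstein.

```python
import math

def gentile_occupation(x, n_max):
    """Mean occupation number for Gentile statistics with maximum
    occupation n_max, where x = beta * (epsilon - mu).
    Computed directly from the truncated grand-canonical sum."""
    weights = [math.exp(-k * x) for k in range(n_max + 1)]
    z = sum(weights)
    return sum(k * w for k, w in enumerate(weights)) / z

x = 0.5
# n_max = 1 reproduces the Fermi-Dirac distribution 1 / (e^x + 1)
fd = 1.0 / (math.exp(x) + 1.0)
assert abs(gentile_occupation(x, 1) - fd) < 1e-12
# large n_max approaches the Bose-Einstein distribution 1 / (e^x - 1)
be = 1.0 / (math.exp(x) - 1.0)
assert abs(gentile_occupation(x, 200) - be) < 1e-9
```

The truncated sum is exactly the partition function of a mode that admits at most n_max particles, which is what makes both limits fall out of one formula.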

  7. Lorentz-covariant reduced-density-operator theory for relativistic-quantum-information processing

    International Nuclear Information System (INIS)

    Ahn, Doyeol; Lee, Hyuk-jae; Hwang, Sung Woo

    2003-01-01

In this paper, we derive a Lorentz-covariant quantum Liouville equation for the density operator describing relativistic quantum-information processing from the Tomonaga-Schwinger equation, and an exact formal solution for the reduced density operator is obtained using the projection-operator technique and functional calculus. When all members of the family of hypersurfaces become flat hyperplanes, it is shown that our results agree with those of the nonrelativistic case, which is valid only in some specified reference frame. To show that our formulation can be applied to practical problems, we derive the polarization of the vacuum in quantum electrodynamics up to second order. The formulation presented in this work is general and could be applied to related fields such as quantum electrodynamics and relativistic statistical mechanics

  8. Statistics of peaks in cosmological nonlinear density fields

    International Nuclear Information System (INIS)

    Suginohara, Tatsushi; Suto, Yasushi.

    1990-06-01

The distribution of high-density peaks in the universe is examined using N-body simulations. Nonlinear evolution of the underlying density field significantly changes the statistical properties of the peaks compared with the analytic results valid for a random Gaussian field. In particular, the abundances and correlations of the initial density peaks are discussed in the context of biased galaxy formation theory. (author)

  9. A torque-measuring micromotor provides operator independent measurements marking four different density areas in maxillae.

    Science.gov (United States)

    Di Stefano, Danilo Alessio; Arosio, Paolo; Piattelli, Adriano; Perrotti, Vittoria; Iezzi, Giovanna

    2015-02-01

Bone density at the implant placement site is a key factor in obtaining primary stability of the fixture, which, in turn, is a prognostic factor for osseointegration and the long-term success of an implant-supported rehabilitation. Recently, an implant motor with a bone density measurement probe has been introduced. The aim of the present study was to test whether the bone densities registered by the implant motor are objective, i.e., independent of the operator performing the measurement. A total of 3704 bone density measurements, performed by means of the implant motor, were registered by 39 operators at different implant sites during routine activity. Bone density measurements were grouped according to their distribution across the jaws. Specifically, four different areas were distinguished: a pre-antral zone (between the first right and first left maxillary premolars) and a sub-antral zone (more distal) in the maxilla, and an interforaminal zone (between and including the first left and first right mandibular premolars) and a retroforaminal zone (more distal) in the mandible. A statistical comparison was performed to check the inter-operator variability of the collected data. The device produced consistent and operator-independent bone density values at each tooth position, showing reliable bone-density measurement. The implant motor proved to be a helpful tool for properly planning implant placement and loading irrespective of the operator using it.
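The abstract does not specify which statistical comparison was used. One standard way to test inter-operator variability is a one-way ANOVA F statistic over readings grouped by operator; the sketch below uses invented synthetic readings (three hypothetical operators, same zone) purely to show the mechanics.

```python
import random

random.seed(0)

# Hypothetical synthetic data: bone-density readings (arbitrary units)
# from three operators measuring the same pre-antral zone.
groups = [[random.gauss(55.0, 8.0) for _ in range(30)] for _ in range(3)]

def oneway_f(groups):
    """One-way ANOVA F statistic: between-group over within-group variance."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

f = oneway_f(groups)
# With all operators drawing from the same distribution, F stays small;
# operator-dependent bias would inflate the between-group term and F.
```

An F value near 1 is consistent with operator independence, which is the conclusion the paper reports for the real measurements.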

  10. A look at the links between drainage density and flood statistics

    Directory of Open Access Journals (Sweden)

    A. Montanari

    2009-07-01

We investigate the links between the drainage density of a river basin and selected flood statistics, namely the mean, standard deviation, coefficient of variation and coefficient of skewness of annual maximum series of peak flows. The investigation is carried out through a three-stage analysis. First, a numerical simulation is performed using a spatially distributed hydrological model in order to highlight how flood statistics change with varying drainage density. Second, a conceptual hydrological model is used to analytically derive the dependence of flood statistics on drainage density. Third, real-world data from 44 watersheds located in northern Italy are analysed. The three-level analysis suggests that a critical value of the drainage density exists for which a minimum is attained in both the coefficient of variation and the absolute value of the skewness coefficient. Such minima in the flood statistics correspond to a minimum of the flood quantile for a given exceedance probability (i.e., recurrence interval). Therefore, the results of this study may provide useful indications for flood risk assessment in ungauged basins.

  11. Projected evolution superoperators and the density operator

    International Nuclear Information System (INIS)

    Turner, R.E.; Dahler, J.S.; Snider, R.F.

    1982-01-01

    The projection operator method of Zwanzig and Feshbach is used to construct the time dependent density operator associated with a binary scattering event. The formula developed to describe this time dependence involves time-ordered cosine and sine projected evolution (memory) superoperators. Both Schroedinger and interaction picture results are presented. The former is used to demonstrate the equivalence of the time dependent solution of the von Neumann equation and the more familiar frequency dependent Laplace transform solution. For two particular classes of projection superoperators projected density operators are shown to be equivalent to projected wave functions. Except for these two special cases, no projected wave function analogs of projected density operators exist. Along with the decoupled-motions approximation, projected interaction picture density operators are applied to inelastic scattering events. Simple illustrations are provided of how this formalism is related to previously established results for two-state processes, namely, the theory of resonant transfer events, the first order Magnus approximation, and the Landau-Zener theory

  12. Quantum Statistical Operator and Classically Chaotic Hamiltonian ...

    African Journals Online (AJOL)

Quantum Statistical Operator and Classically Chaotic Hamiltonian System. ... Journal of the Nigerian Association of Mathematical Physics ... In a Hamiltonian system, the von Neumann statistical operator is used to tease out the quantum consequences of (classical) chaos engendered by the nonlinear coupling of the system to its ...

  13. Operation statistics of KEKB

    International Nuclear Information System (INIS)

    Kawasumi, Takeshi; Funakoshi, Yoshihiro

    2008-01-01

The KEKB accelerator has been operated since December 1998. We achieved the design peak luminosity of 10.00/nb/s; the present record is 17.12/nb/s. Detailed data on KEKB operation are important for evaluating KEKB performance and suggesting directions for performance enhancement. To estimate accelerator availability, we have classified all KEKB machine time into the following seven categories: (1) Physics Run, (2) Machine Study, (3) Machine Tuning, (4) Beam Tuning, (5) Trouble, (6) Maintenance, (7) Others. In this paper we report the operation statistics of the KEKB accelerator. (author)
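The seven-category breakdown lends itself to a simple availability calculation. The figures below are invented for illustration (they are not KEKB data), and "availability" is taken here as the fraction of machine time not lost to trouble, one common convention; the paper may define it differently.

```python
# Hypothetical machine-time breakdown (hours) over one run period,
# using the seven categories from the KEKB classification.
machine_time = {
    "Physics Run": 5000.0,
    "Machine Study": 600.0,
    "Machine Tuning": 400.0,
    "Beam Tuning": 300.0,
    "Trouble": 350.0,
    "Maintenance": 250.0,
    "Others": 100.0,
}

total = sum(machine_time.values())            # 7000.0 hours
# Availability as the fraction of time not lost to faults ("Trouble")
availability = (total - machine_time["Trouble"]) / total    # 0.95
# Fraction of machine time delivered to the physics programme
physics_fraction = machine_time["Physics Run"] / total      # ~0.714
```

Tracking both numbers separately matters: a machine can be highly available yet deliver little physics time if tuning and studies dominate.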

  14. 14 CFR Section 19 - Uniform Classification of Operating Statistics

    Science.gov (United States)

    2010-01-01

Title 14 (Aeronautics and Space); Office of the Secretary, Department of Transportation; Air Carriers; Operating Statistics Classifications; Section 19: Uniform Classification of Operating Statistics.

  15. Particle-hole state densities for statistical multi-step compound reactions

    International Nuclear Information System (INIS)

    Oblozinsky, P.

    1986-01-01

    An analytical relation is derived for the density of particle-hole bound states applying the equidistant-spacing approximation and the Darwin-Fowler statistical method. The Pauli exclusion principle as well as the finite depth of the potential well are taken into account. The set of densities needed for calculations of multi-step compound reactions is completed by deriving the densities of accessible final states for escape and damping. (orig.)

  16. Projection operator techniques in nonequilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Grabert, H.

    1982-01-01

This book is an introduction to the application of the projection-operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection-operator technique and statistical thermodynamics, the Fokker-Planck and master-equation approaches are described, together with response theory. Then, as applications, the damped harmonic oscillator, simple fluids, and spin relaxation are considered. (HSI)

  17. Completely contained and remotely operated digital density meter

    International Nuclear Information System (INIS)

    Goergen, C.R.

    1979-10-01

    A completely contained and remotely operated density determination system having unique features was designed, fabricated, and installed at the Savannah River Plant. The system, based on a Mettler calculating digital density meter, provides more precise and accurate results than the falling drop technique for measuring densities. The system is fast, simple, easy to operate, and has demonstrated both reliability and durability

  18. Statistical density modification using local pattern matching

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2003-01-01

Statistical density modification can make use of local patterns of density found in protein structures to improve crystallographic phases. A method for improving crystallographic phases is presented that is based on the preferential occurrence of certain local patterns of electron density in macromolecular electron-density maps. The method focuses on the relationship between the value of electron density at a point in the map and the pattern of density surrounding this point. Patterns of density that can be superimposed by rotation about the central point are considered equivalent.

Standard templates are created from experimental or model electron-density maps by clustering and averaging local patterns of electron density. The clustering is based on correlation coefficients after rotation to maximize the correlation. Experimental or model maps are also used to create histograms relating the value of electron density at the central point to the correlation coefficient of the density surrounding this point with each member of the set of standard patterns. These histograms are then used to estimate the electron density at each point in a new experimental electron-density map, using the pattern of electron density at points surrounding that point and the correlation coefficient of this density to each of the set of standard templates, again after rotation to maximize the correlation.

The method is strengthened by excluding any information from the point in question from both the templates and the local pattern of density in the calculation. A function based on the origin of the Patterson function is used to remove information about the electron density at the point in question from nearby electron density. This allows an estimation of the electron density at each point in a map using only information from other points in the process. The resulting estimates of electron density are shown to have errors that are nearly independent of the errors in the original map using
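The core numerical operation in this scheme is a correlation coefficient between a local density pattern and each standard template (after a rotation search, omitted here). A minimal sketch with hypothetical flattened density patterns, not taken from the paper:

```python
def correlation(a, b):
    """Pearson correlation coefficient between two flattened density patterns."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

# Hypothetical standard templates (flattened 3x3 local density patterns)
templates = [
    [0.0, 1.0, 0.0, 1.0, 2.0, 1.0, 0.0, 1.0, 0.0],   # peak-like
    [2.0, 1.0, 2.0, 1.0, 0.0, 1.0, 2.0, 1.0, 2.0],   # hole-like
]
# A noisy local pattern from a new map, resembling the peak template
pattern = [0.1, 0.9, 0.2, 1.1, 1.8, 0.9, 0.1, 1.0, 0.0]
best = max(range(len(templates)),
           key=lambda i: correlation(pattern, templates[i]))
# 'best' indexes the template most similar to the local pattern
```

In the full method this best-matching correlation, together with the histograms described above, yields a probability distribution for the density value at the central point.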

  19. Statistical operation of nuclear power plants

    International Nuclear Information System (INIS)

    Gauzit, Maurice; Wilmart, Yves

    1976-01-01

A comparison of the statistical operating results of nuclear power stations, as published in the literature, shows that the values given for availability and load factor often differ considerably from each other. This may be due to different definitions of these terms, or even to poor translation from one language into another. A critical analysis of these terms is proposed, as well as the choice of a parameter giving a quantitative idea of the actual quality of the operation obtained. The second section gives, on a homogeneous basis and from the results supplied by 83 nuclear power stations now in operation, a statistical analysis of their operating results: in particular, the two light-water lines during 1975, as well as the evolution in terms of age of the units, and the starting conditions of the units during their first two operating years. The test values thus obtained are also compared with those assumed a priori in some economic studies [fr

  20. Statistical mechanics of low-density parity-check codes

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 2268502 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)

    2004-02-13

    We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)
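At the level of mechanics, every code in the LDPC family is defined by a sparse parity-check matrix H, and both the statistical-mechanics mapping and practical decoding start from the syndrome Hx mod 2. The toy sketch below uses a hypothetical 4x8 matrix, far smaller and denser than a real Gallager code, purely to show the syndrome test.

```python
# Hypothetical tiny parity-check matrix H (4 checks x 8 bits); real LDPC
# codes use large sparse matrices, but the syndrome computation is the same.
H = [
    [1, 1, 0, 1, 0, 0, 0, 1],
    [0, 1, 1, 0, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 1, 1, 0],
]

def syndrome(H, word):
    """Parity checks mod 2: an all-zero syndrome means a valid codeword."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

# The all-zero word is always a codeword; flipping one bit violates
# exactly the checks that involve that bit.
assert syndrome(H, [0] * 8) == [0, 0, 0, 0]
assert syndrome(H, [1, 0, 0, 0, 0, 0, 0, 0]) == [1, 0, 1, 0]
```

In the Ising-spin mapping reviewed in the paper, each row of H becomes a multi-spin interaction, and decoding corresponds to finding low-energy spin configurations consistent with the syndrome.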

  1. Statistical mechanics of low-density parity-check codes

    International Nuclear Information System (INIS)

    Kabashima, Yoshiyuki; Saad, David

    2004-01-01

    We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)

  2. Local Finite Density Theory, Statistical Blocking and Color Superconductivity

    OpenAIRE

    Ying, S.

    2000-01-01

The motivation for the development of a local finite-density theory is discussed. One of the problems related to an instability in the baryon-number fluctuation of the chiral-symmetry-breaking phase of the quark system in the local theory is shown to exist. This instability is removed by taking into account statistical blocking effects in the quark propagator, which depend on a macroscopic statistical blocking parameter ε. This new framework is then applied to...

  3. Zubarev's Nonequilibrium Statistical Operator Method in the Generalized Statistics of Multiparticle Systems

    Science.gov (United States)

    Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.

    2018-01-01

We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.

  4. Experimental investigation of statistical density function of decaying radioactive sources

    International Nuclear Information System (INIS)

    Salma, I.; Zemplen-Papp, E.

    1991-01-01

The validity of the Poisson and the modified Poisson statistical density functions P_λ(k) for observing k events in a short time interval is investigated experimentally in radioactive-decay detection for various measuring times. The experiments measuring radioactive decay were performed with 89mY using a multichannel analyzer. According to the results, Poisson statistics adequately describes the counting experiment for short measuring times. (author) 13 refs.; 4 figs
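The Poisson character tested in the paper can be illustrated numerically: for a source whose activity is effectively constant over the measurement, counts in equal short intervals have variance equal to their mean (index of dispersion ≈ 1). The sketch below is not the authors' procedure, just a stdlib-Python simulation of that property.

```python
import math
import random

random.seed(1)

def poisson_sample(lam):
    """Draw one Poisson variate via Knuth's method:
    multiply uniforms until the product falls below e^{-lam}."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Counts per short interval for a source with constant mean rate 4.0
counts = [poisson_sample(4.0) for _ in range(20000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
dispersion = var / mean   # ~1 for Poisson-distributed counts
```

Significant decay during the measurement would break the constant-rate assumption; the modified density function in the abstract addresses exactly that regime.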

  5. High density operation on the HT-7 superconducting tokamak

    International Nuclear Information System (INIS)

    Xiang Gao

    2000-01-01

The structure of the operation region has been studied in the HT-7 superconducting tokamak, and progress on the extension of the HT-7 ohmic discharge operation region is reported. A density corresponding to 1.2 times the Greenwald limit was achieved with RF boronization. The density limit appears to be connected to the impurity content and the edge parameters, so the best results are obtained with very clean plasmas and peaked electron density profiles. The peaking factors of the electron density profiles for different currents and line-averaged densities were observed. The density behaviour and the fuelling efficiency for gas puffing (20-30%), pellet injection (70-80%) and molecular beam injection (40-50%) were studied. The core crash sawteeth and MHD behaviour induced by an injected pellet were observed, and these events were correlated with changes of the current profile and reversed magnetic shear. The MARFE phenomena on HT-7 are summarized. The best correlation has been found between the total input ohmic power and the product of the edge line-averaged density and Z_eff. With RF boronization, HT-7 could easily be operated MARFE-free in the high-density region. (author)

  6. Nuclear Level Densities for Modeling Nuclear Reactions: An Efficient Approach Using Statistical Spectroscopy

    International Nuclear Information System (INIS)

    Calvin W. Johnson

    2005-01-01

The general goal of the project is to develop and implement computer codes and input files to compute nuclear densities of states. Such densities are important input to calculations of statistical neutron capture and are difficult to access experimentally. In particular, we focus on calculating densities for nuclides in the mass range A ∼ 50-100. We use statistical spectroscopy, a moments method based upon a microscopic framework, the interacting shell model. Second-year goals and milestones: develop two or three competing interactions (based upon surface-delta, Gogny, and NN-scattering) suitable for application to nuclei up to A = 100; begin calculations for nuclides with A = 50-70

  7. Securing co-operation from persons supplying statistical data

    Science.gov (United States)

    Aubenque, M. J.; Blaikley, R. M.; Harris, F. Fraser; Lal, R. B.; Neurdenburg, M. G.; Hernández, R. de Shelly

    1954-01-01

    Securing the co-operation of persons supplying information required for medical statistics is essentially a problem in human relations, and an understanding of the motivations, attitudes, and behaviour of the respondents is necessary. Before any new statistical survey is undertaken, it is suggested by Aubenque and Harris that a preliminary review be made so that the maximum use is made of existing information. Care should also be taken not to burden respondents with an overloaded questionnaire. Aubenque and Harris recommend simplified reporting. Complete population coverage is not necessary. Neurdenburg suggests that the co-operation and support of such organizations as medical associations and social security boards are important and that propaganda should be directed specifically to the groups whose co-operation is sought. Informal personal contacts are valuable and desirable, according to Blaikley, but may have adverse effects if the right kind of approach is not made. Financial payments as an incentive in securing co-operation are opposed by Neurdenburg, who proposes that only postage-free envelopes or similar small favours be granted. Blaikley and Harris, on the other hand, express the view that financial incentives may do much to gain the support of those required to furnish data; there are, however, other incentives, and full use should be made of the natural inclinations of respondents. Compulsion may be necessary in certain instances, but administrative rather than statutory measures should be adopted. Penalties, according to Aubenque, should be inflicted only when justified by imperative health requirements. The results of surveys should be made available as soon as possible to those who co-operated, and Aubenque and Harris point out that they should also be of practical value to the suppliers of the information. Greater co-operation can be secured from medical persons who have an understanding of the statistical principles involved; Aubenque and

  8. Infinite statistics and the SU(1, 1) phase operator

    International Nuclear Information System (INIS)

    Gerry, Christopher C

    2005-01-01

A few years ago, Agarwal (1991 Phys. Rev. A 44 8398) showed that the Susskind-Glogower phase operators, expressible in terms of Bose operators, provide a realization of the algebra for particles obeying infinite statistics. In this paper we show that the SU(1, 1) phase operators, constructed in terms of the elements of the su(1, 1) Lie algebra, also provide a realization of the algebra of infinite statistics. There are many realizations of the su(1, 1) algebra in terms of single- or multimode Bose operators, three of which are discussed along with their corresponding phase states. The Susskind-Glogower phase operator is a special case of the SU(1, 1) phase operator associated with the Holstein-Primakoff realization of su(1, 1). (letter to the editor)

  9. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis

  10. Wind power statistics and an evaluation of wind energy density

    Energy Technology Data Exchange (ETDEWEB)

    Jamil, M.; Parsa, S.; Majidi, M. [Materials and Energy Research Centre, Tehran (Iran, Islamic Republic of)

    1995-11-01

In this paper, statistical data from fifty days of wind-speed measurements at the MERC solar site are used to find the wind energy density and other wind characteristics with the help of the Weibull probability distribution function. It is emphasized that the Weibull and Rayleigh probability functions are useful tools for wind-energy-density estimation but are not quite appropriate for properly fitting actual wind data with low mean speed and short-time records. One has to use either the actual wind data (histogram) or look for a better fit with other models of the probability function. (Author)
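The Weibull-based energy-density estimate the abstract refers to has a closed form: for scale c and shape k, the mean wind power density is (1/2)ρc³Γ(1 + 3/k). The sketch below uses illustrative values (not the MERC data) and cross-checks the formula by Monte Carlo sampling of Weibull-distributed speeds.

```python
import math
import random

random.seed(2)

RHO = 1.225  # air density at sea level, kg/m^3

def weibull_energy_density(c, k):
    """Mean wind power density (W/m^2) for Weibull scale c (m/s), shape k:
    E = 0.5 * rho * c^3 * Gamma(1 + 3/k)."""
    return 0.5 * RHO * c**3 * math.gamma(1.0 + 3.0 / k)

# Illustrative site parameters (k = 2 is the Rayleigh special case)
c, k = 6.0, 2.0
analytic = weibull_energy_density(c, k)   # ~175.9 W/m^2

# Cross-check: average 0.5*rho*v^3 over speeds sampled via inverse CDF
speeds = [c * (-math.log(1.0 - random.random())) ** (1.0 / k)
          for _ in range(200000)]
direct = 0.5 * RHO * sum(v**3 for v in speeds) / len(speeds)
# 'direct' and 'analytic' agree to within sampling error (about 1%)
```

The cubic dependence on speed is why a fitted distribution that misses the high-speed tail, the paper's warning about short low-wind records, badly misestimates the energy density.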

  11. Spectral density and a family of Dirac operators

    International Nuclear Information System (INIS)

    Niemi, A.J.

    1985-01-01

The spectral density for a class of Dirac operators is investigated by relating its even and odd parts to the Riemann zeta-function and to the eta-invariant of Atiyah, Patodi and Singer. Asymptotic expansions are studied, and a 'hidden' supersymmetry is revealed and used to relate the Dirac operator to a supersymmetric quantum mechanics. A general method for the computation of the odd spectral density is developed, and various applications are discussed. In particular, the connection to the fermion number and a relation between the odd spectral density and certain ratios of Jost functions and relative phase shifts are pointed out. Chiral symmetry breaking is investigated using methods analogous to those applied in the investigation of the fermion number, and related to supersymmetry breaking in the corresponding quantum-mechanical model. (orig.)

  12. Statistical algorithm for automated signature analysis of power spectral density data

    International Nuclear Information System (INIS)

    Piety, K.R.

    1977-01-01

    A statistical algorithm has been developed and implemented on a minicomputer system for on-line, surveillance applications. Power spectral density (PSD) measurements on process signals are the performance signatures that characterize the "health" of the monitored equipment. Statistical methods provide a quantitative basis for automating the detection of anomalous conditions. The surveillance algorithm has been tested on signals from neutron sensors, proximeter probes, and accelerometers to determine its potential for monitoring nuclear reactors and rotating machinery.
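
A minimal sketch of this kind of PSD-based surveillance, assuming a baseline library of "healthy" spectra; the function names and the 3-sigma decision rule are illustrative, not the report's actual algorithm.

```python
import numpy as np
from scipy.signal import welch

def psd_signature(x, fs):
    """Welch power spectral density of a process signal."""
    return welch(x, fs=fs, nperseg=256)

def anomaly_bins(baseline_psds, new_psd, n_sigma=3.0):
    """Flag frequency bins where the new PSD departs from the baseline mean
    by more than n_sigma baseline standard deviations."""
    mu = baseline_psds.mean(axis=0)
    sd = baseline_psds.std(axis=0)
    return np.abs(new_psd - mu) > n_sigma * sd

# Baseline: 20 "healthy" white-noise records; test signal adds a 50 Hz line.
rng = np.random.default_rng(0)
fs = 1000.0
t = np.arange(4096) / fs
baseline = np.array([psd_signature(rng.standard_normal(t.size), fs)[1]
                     for _ in range(20)])
f, new = psd_signature(rng.standard_normal(t.size)
                       + 5.0 * np.sin(2 * np.pi * 50.0 * t), fs)
mask = anomaly_bins(baseline, new)  # flags bins near the injected 50 Hz line
```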

  13. Nonequilibrium Statistical Operator Method and Generalized Kinetic Equations

    Science.gov (United States)

    Kuzemsky, A. L.

    2018-01-01

    We consider some principal problems of nonequilibrium statistical thermodynamics in the framework of the Zubarev nonequilibrium statistical operator approach. We present a brief comparative analysis of some approaches to describing irreversible processes based on the concept of nonequilibrium Gibbs ensembles and their applicability to describing nonequilibrium processes. We discuss the derivation of generalized kinetic equations for a system in a heat bath. We obtain and analyze a damped Schrödinger-type equation for a dynamical system in a heat bath. We study the dynamical behavior of a particle in a medium taking the dissipation effects into account. We consider the scattering problem for neutrons in a nonequilibrium medium and derive a generalized Van Hove formula. We show that the nonequilibrium statistical operator method is an effective, convenient tool for describing irreversible processes in condensed matter.

  14. On nonequilibrium many-body systems. 1: The nonequilibrium statistical operator method

    International Nuclear Information System (INIS)

    Algarte, A.C.S.; Vasconcellos, A.R.; Luzzi, R.; Sampaio, A.J.C.

    1985-01-01

    The theoretical aspects involved in the treatment of many-body systems strongly departed from equilibrium are discussed. The nonequilibrium statistical operator (NSO) method is considered in detail. Using Jaynes' maximum entropy formalism complemented with an ad hoc hypothesis, a nonequilibrium statistical operator is obtained. This approach introduces irreversibility from the outset, and we recover statistical operators like those of Green-Mori and Zubarev as particular cases. The connection with Generalized Thermodynamics and the construction of nonlinear transport equations are briefly described. (Author)

  15. Statistical properties of kinetic and total energy densities in reverberant spaces

    DEFF Research Database (Denmark)

    Jacobsen, Finn; Molares, Alfonso Rodriguez

    2010-01-01

    Many acoustical measurements, e.g., measurement of sound power and transmission loss, rely on determining the total sound energy in a reverberation room. The total energy is usually approximated by measuring the mean-square pressure (i.e., the potential energy density) at a number of discrete positions. The idea of measuring the total energy density instead of the potential energy density, on the assumption that the former quantity varies less with position than the latter, goes back to the 1930s. However, the phenomenon was not analyzed until the late 1970s, and then only for the region of high... With the advent of a three-dimensional particle velocity transducer, it has become somewhat easier to measure total rather than only potential energy density in a sound field. This paper examines the ensemble statistics of kinetic and total sound energy densities in reverberant enclosures theoretically...

  16. Structure and representation of correlation functions and the density matrix for a statistical wave field in optics

    International Nuclear Information System (INIS)

    Sudarshan, E.C.G.; Mukunda, N.

    1978-03-01

    A systematic structure analysis of the correlation functions of statistical quantum optics is carried out. From a suitably defined auxiliary two-point function, the excited modes in the wave field are identified. The relative simplicity of the higher order correlation functions emerges as a by-product, and the conditions under which these are made pure are derived. These results depend in a crucial manner on the notion of coherence indices and of unimodular coherence indices. A new class of approximate expressions for the density operator of a statistical wave field is worked out based on discrete characteristic sets. These are even more economical than the diagonal coherent state representations. An appreciation of the subtleties of quantum theory obtains. Certain implications for the physics of light beams are cited. 28 references

  17. Density scaling and quasiuniversality of flow-event statistics for athermal plastic flows

    DEFF Research Database (Denmark)

    Lerner, Edan; Bailey, Nicholas; Dyre, J. C.

    2014-01-01

    Athermal steady-state plastic flows were simulated for the Kob-Andersen binary Lennard-Jones system and its repulsive version in which the sign of the attractive terms is changed to a plus. Properties evaluated include the distributions of energy drops, stress drops, and strain intervals between the flow events. We show that simulations at a single density in conjunction with an equilibrium-liquid simulation at the same density allow one to predict the plastic flow-event statistics at other densities. This is done by applying the recently established "hidden scale invariance" of simple liquids...

  18. The role of statistics in operations research: Some personal reflections

    African Journals Online (AJOL)

    The role of statistics in operations research: Some personal reflections. Statistics has a very important role to play in Operations Research (OR), yet many ...

  19. A STATISTICAL STUDY OF THE MASS AND DENSITY STRUCTURE OF INFRARED DARK CLOUDS

    International Nuclear Information System (INIS)

    Peretto, N.; Fuller, G. A.

    2010-01-01

    How and when the mass distribution of stars in the Galaxy is set is one of the main issues of modern astronomy. Here, we present a statistical study of mass and density distributions of infrared dark clouds (IRDCs) and fragments within them. These regions are pristine molecular gas structures and progenitors of stars and so provide insights into the initial conditions of star formation. This study makes use of an IRDC catalog, the largest sample of IRDC column density maps to date, containing a total of ∼11,000 IRDCs with column densities exceeding N_H2 = 1×10^22 cm^-2 and over 50,000 single-peaked IRDC fragments. The large number of objects constitutes an important strength of this study, allowing a detailed analysis of the completeness of the sample and so statistically robust conclusions. Using a statistical approach to assigning distances to clouds, the mass and density distributions of the clouds and the fragments within them are constructed. The mass distributions show a steepening of the slope when switching from IRDCs to fragments, in agreement with previous results for similar structures. IRDCs and fragments are divided into unbound/bound objects by assuming Larson's relation and calculating their virial parameter. IRDCs are mostly gravitationally bound, while a significant fraction of the fragments are not. The density distribution of gravitationally unbound fragments shows a steep characteristic slope, ΔN/Δlog(n) ∝ n^(-4.0±0.5), rather independent of the range of fragment mass. However, the incompleteness limit at a number density of ∼10^3 cm^-3 does not allow us to exclude a potential lognormal density distribution. In contrast, gravitationally bound fragments show a characteristic density peak at n ≅ 10^4 cm^-3 but the shape of the density distributions changes with the range of fragment masses. An explanation for this could be the differential dynamical evolution of the fragment density with respect to their mass as more massive

  20. Quantum mechanics as applied mathematical statistics

    International Nuclear Information System (INIS)

    Skala, L.; Cizek, J.; Kapsa, V.

    2011-01-01

    Basic mathematical apparatus of quantum mechanics like the wave function, probability density, probability density current, coordinate and momentum operators, corresponding commutation relation, Schroedinger equation, kinetic energy, uncertainty relations and continuity equation is discussed from the point of view of mathematical statistics. It is shown that the basic structure of quantum mechanics can be understood as generalization of classical mechanics in which the statistical character of results of measurement of the coordinate and momentum is taken into account and the most important general properties of statistical theories are correctly respected.

  1. Quantum-statistical kinetic equations

    International Nuclear Information System (INIS)

    Loss, D.; Schoeller, H.

    1989-01-01

    Considering a homogeneous normal quantum fluid consisting of identical interacting fermions or bosons, the authors derive an exact quantum-statistical generalized kinetic equation with a collision operator given as explicit cluster series where exchange effects are included through renormalized Liouville operators. This new result is obtained by applying a recently developed superoperator formalism (Liouville operators, cluster expansions, symmetrized projectors, P_q-rule, etc.) to nonequilibrium systems described by a density operator ρ(t) which obeys the von Neumann equation. By means of this formalism a factorization theorem is proven (being essential for obtaining closed equations), and partial resummations (leading to renormalized quantities) are performed. As an illustrative application, the quantum-statistical versions (including exchange effects due to Fermi-Dirac or Bose-Einstein statistics) of the homogeneous Boltzmann (binary collisions) and Choh-Uhlenbeck (triple collisions) equations are derived.

  2. Operation and control of high density tokamak reactors

    International Nuclear Information System (INIS)

    Attenberger, S.E.; McAlees, D.G.

    1976-01-01

    The incentive for high density operation of a tokamak reactor is discussed. The plasma size required to attain ignition is determined. Ignition is found to be possible in a relatively small system provided other design criteria are met. These criteria are described and the technology developments and operating procedures required by them are outlined. The parameters for such a system and its dynamic behavior during the operating cycle are also discussed

  3. Exact statistical results for binary mixing and reaction in variable density turbulence

    Science.gov (United States)

    Ristorcelli, J. R.

    2017-02-01

    We report a number of rigorous statistical results on binary active scalar mixing in variable density turbulence. The study is motivated by mixing between pure fluids with very different densities and whose density intensity is of order unity. Our primary focus is the derivation of exact mathematical results for mixing in variable density turbulence, and we point out the potential fields of application of the results. A binary one-step reaction is invoked to derive a metric to assess the state of mixing. The mean reaction rate in variable density turbulent mixing can be expressed, in closed form, using the first order Favre mean variables and the Reynolds averaged density variance, ⟨ρ²⟩. We show that the normalized density variance ⟨ρ²⟩ reflects the reduction of the reaction due to mixing and is a mix metric. The result is mathematically rigorous. The result is the variable density analog of the normalized mass fraction variance ⟨c²⟩ used in constant density turbulent mixing. As a consequence, we demonstrate that use of the analogous normalized Favre variance of the mass fraction, c̃″², as a mix metric is not theoretically justified in variable density turbulence. We additionally derive expressions relating various second order moments of the mass fraction, specific volume, and density fields. The central role of the density-specific volume covariance ⟨ρv⟩ is highlighted; it is a key quantity with considerable dynamical significance linking various second order statistics. For laboratory experiments, we have developed exact relations between the Reynolds scalar variance ⟨c²⟩, its Favre analog c̃″², and various second moments including ⟨ρv⟩. For moment closure models that evolve ⟨ρv⟩ and not ⟨ρ²⟩, we provide a novel expression for ⟨ρ²⟩ in terms of a rational function of ⟨ρv⟩ that avoids recourse to Taylor series methods (which do not converge for large density differences). We have derived

  4. Wigner Function of Density Operator for Negative Binomial Distribution

    International Nuclear Information System (INIS)

    Xu Xinglei; Li Hongqi

    2008-01-01

    By using the technique of integration within an ordered product (IWOP) of operator we derive Wigner function of density operator for negative binomial distribution of radiation field in the mixed state case, then we derive the Wigner function of squeezed number state, which yields negative binomial distribution by virtue of the entangled state representation and the entangled Wigner operator

  5. Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function

    Science.gov (United States)

    Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.

    2011-01-01

    In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.

  6. Statistical measurement of power spectrum density of large aperture optical component

    International Nuclear Information System (INIS)

    Xu Jiancheng; Xu Qiao; Chai Liqun

    2010-01-01

    According to the requirement of ICF, a method based on statistical theory has been proposed to measure the power spectrum density (PSD) of large aperture optical components. The method breaks the large-aperture wavefront into small regions, and obtains the PSD of the large-aperture wavefront by weighted averaging of the PSDs of the regions, where the weight factor is each region's area. Simulation and experiment demonstrate the effectiveness of the proposed method. They also show that the obtained PSDs of the large-aperture wavefront by the statistical method and the sub-aperture stitching method fit well when the number of small regions is no less than 8 × 8. The statistical method is not sensitive to translation stage errors and environmental instabilities, thus it is appropriate for PSD measurement during the process of optical fabrication. (authors)
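
The area-weighted averaging step described above can be sketched as follows; the helper name is hypothetical, and the per-region PSDs would come from the wavefront measurement.

```python
import numpy as np

def stitched_psd(region_psds, region_areas):
    """Area-weighted average of per-region PSDs: the weight of each region
    is its area, normalized so the weights sum to one."""
    w = np.asarray(region_areas, dtype=float)
    return np.average(np.asarray(region_psds), axis=0, weights=w / w.sum())

# Two regions, the first three times larger than the second.
psd = stitched_psd([[0.0, 2.0], [4.0, 6.0]], [3.0, 1.0])
```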

  7. Statistical inference of level densities from resolved resonance parameters

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1983-08-01

    Level densities are most directly obtained by counting the resonances observed in the resolved resonance range. Even in the measurements, however, weak levels are invariably missed, so that one has to estimate their number and add it to the raw count. The main categories of missing-level estimators are discussed in the present review, viz. (I) ladder methods including those based on the theory of Hamiltonian matrix ensembles (Dyson-Mehta statistics), (II) methods based on comparison with artificial cross section curves (Monte Carlo simulation, Garrison's autocorrelation method), (III) methods exploiting the observed neutron width distribution by means of Bayesian or more approximate procedures such as maximum-likelihood, least-squares or moment methods, with various recipes for the treatment of detection thresholds and resolution effects. The language of mathematical statistics is employed to clarify the basis of, and the relationship between, the various techniques. Recent progress in the treatment of resolution effects, detection thresholds and p-wave admixture is described. (orig.)
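
One ingredient of the width-distribution estimators (category III) can be sketched as follows, assuming Porter-Thomas statistics for the reduced widths and an idealized sharp detection threshold; real estimators also model resolution effects, so this is illustrative only.

```python
from scipy import stats

def missing_level_correction(n_observed, threshold):
    """Correct an observed resonance count for weak missed levels.

    Assumes Porter-Thomas statistics: reduced widths x = Gamma/<Gamma> follow
    a chi-squared distribution with one degree of freedom, and levels with
    x below `threshold` (in units of the mean width) are never detected.
    The detected fraction is P(x > threshold) = chi2.sf(threshold, df=1).
    """
    p_detected = stats.chi2.sf(threshold, df=1)
    return n_observed / p_detected

# A cut at half the mean width already hides about half the levels, so the
# raw count must be inflated by roughly a factor of two.
n_true_est = missing_level_correction(100, 0.5)
```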

  8. Operation and control of high density tokamak reactors

    International Nuclear Information System (INIS)

    Attenberger, S.E.; McAlees, D.G.

    1976-01-01

    The incentive for high density operation of a tokamak reactor was discussed. It is found that high density permits ignition in a relatively small, moderately elongated plasma with a moderate magnetic field strength. Under these conditions, neutron wall loadings of approximately 4 MW/m² must be tolerated. The sensitivity analysis with respect to impurity effects shows that impurity control will most likely be necessary to achieve the desired plasma conditions. The charge exchange sputtered impurities are found to have an important effect, so that maintaining a low neutral density in the plasma is critical. If it is assumed that neutral beams will be used to heat the plasma to ignition, high energy injection is required (approximately 250 keV) when heating is accomplished at full density. A scenario is outlined where the ignition temperature is established at low density and then the fueling rate is increased to attain ignition. This approach may permit beams with energies being developed for use in TFTR to be successfully used to heat a high density device of the type described here to ignition

  9. Use of a mixture statistical model in studying malaria vectors density.

    Directory of Open Access Journals (Sweden)

    Olayidé Boussari

    Vector control is a major step in the process of malaria control and elimination. This requires vector counts and appropriate statistical analyses of these counts. However, vector counts are often overdispersed. A non-parametric mixture of Poisson model (NPMP) is proposed to allow for overdispersion and better describe vector distribution. Mosquito collections using the Human Landing Catches as well as collection of environmental and climatic data were carried out from January to December 2009 in 28 villages in Southern Benin. A NPMP regression model with "village" as random effect is used to test statistical correlations between malaria vectors density and environmental and climatic factors. Furthermore, the villages were ranked using the latent classes derived from the NPMP model. Based on this classification of the villages, the impacts of four vector control strategies implemented in the villages were compared. Vector counts were highly variable and overdispersed with an important proportion of zeros (75%). The NPMP model had a good aptitude to predict the observed values and showed that: (i) proximity to freshwater body, market gardening, and high levels of rain were associated with high vector density; (ii) water conveyance, cattle breeding, and vegetation index were associated with low vector density. The 28 villages could then be ranked according to the mean vector number as estimated by the random part of the model after adjustment on all covariates. The NPMP model made it possible to describe the distribution of the vector across the study area. The villages were ranked according to the mean vector density after taking into account the most important covariates. This study demonstrates the necessity and possibility of adapting methods of vector counting and sampling to each setting.
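
The flavor of such a mixture model can be sketched with a parametric two-component Poisson mixture fitted by EM. This is a deliberately simplified stand-in for the non-parametric NPMP with village random effects described above; all names and parameter values are illustrative.

```python
import numpy as np

def poisson_mixture_em(counts, n_iter=200):
    """EM fit of a two-component Poisson mixture to overdispersed counts.

    Returns component weights pi and Poisson rates lam.  A low-rate
    component absorbs the excess zeros; a high-rate component captures
    the heavy tail of the count distribution.
    """
    y = np.asarray(counts, dtype=float)
    lam = np.array([max(0.1 * y.mean(), 1e-3), y.mean() + 1.0])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibilities (the log(y!) term cancels across components)
        logp = (y[:, None] * np.log(lam[None, :]) - lam[None, :]
                + np.log(pi[None, :]))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update component weights and Poisson rates
        pi = r.mean(axis=0)
        lam = (r * y[:, None]).sum(axis=0) / r.sum(axis=0)
    return pi, lam
```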

  10. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    Science.gov (United States)

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

    Recent studies pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data according to several contexts like fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample sizes in the estimation process. This small sample size induces errors in the estimated HOS parameters, restraining real-time and precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine both kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to the small sample size effect and more accurate in sEMG PDF shape screening applications.
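
The small-sample scatter of HOS parameters that motivates the functional approach can be seen directly; the Gaussian surrogate data below are illustrative, not real sEMG records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# For a Gaussian source, skewness and excess kurtosis are both 0 in the
# large-sample limit, but short records scatter widely around 0 -- the
# estimation problem the functional (CSM-inspired) statistics aim to tame.
short_skews = [stats.skew(rng.standard_normal(50)) for _ in range(500)]
long_skew = stats.skew(rng.standard_normal(50_000))
scatter = np.std(short_skews)  # large for 50-sample records
```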

  11. Statistical theory of electron densities

    International Nuclear Information System (INIS)

    Pratt, L.R.; Hoffman, G.G.; Harris, R.A.

    1988-01-01

    An optimized Thomas--Fermi theory is proposed which retains the simplicity of the original theory and is a suitable reference theory for Monte Carlo density functional treatments of condensed materials. The key ingredient of the optimized theory is a neighborhood sampled potential which contains effects of the inhomogeneities in the one-electron potential. In contrast to the traditional Thomas--Fermi approach, the optimized theory predicts a finite electron density in the vicinity of a nucleus. Consideration of the example of an ideal electron gas subject to a central Coulomb field indicates that implementation of the approach is straightforward. The optimized theory is found to fail completely when a classically forbidden region is approached. However, these circumstances are not of primary interest for calculations of interatomic forces. It is shown how the energy functional of the density may be constructed by integration of a generalized Hellmann--Feynman relation. This generalized Hellmann--Feynman relation proves to be equivalent to the variational principle of density functional quantum mechanics, and, therefore, the present density theory can be viewed as a variational consequence of the constructed energy functional

  12. A statistical analysis of the elastic distortion and dislocation density fields in deformed crystals

    KAUST Repository

    Mohamed, Mamdouh S.

    2015-05-18

    The statistical properties of the elastic distortion fields of dislocations in deforming crystals are investigated using the method of discrete dislocation dynamics to simulate dislocation structures and dislocation density evolution under tensile loading. Probability distribution functions (PDF) and pair correlation functions (PCF) of the simulated internal elastic strains and lattice rotations are generated for tensile strain levels up to 0.85%. The PDFs of simulated lattice rotation are compared with sub-micrometer resolution three-dimensional X-ray microscopy measurements of rotation magnitudes and deformation length scales in 1.0% and 2.3% compression strained Cu single crystals to explore the linkage between experiment and the theoretical analysis. The statistical properties of the deformation simulations are analyzed through determinations of the Nye and Kröner dislocation density tensors. The significance of the magnitudes and the length scales of the elastic strain and the rotation parts of dislocation density tensors are demonstrated, and their relevance to understanding the fundamental aspects of deformation is discussed.

  13. A statistical analysis of the elastic distortion and dislocation density fields in deformed crystals

    KAUST Repository

    Mohamed, Mamdouh S.; Larson, Ben C.; Tischler, Jon Z.; El-Azab, Anter

    2015-01-01

    The statistical properties of the elastic distortion fields of dislocations in deforming crystals are investigated using the method of discrete dislocation dynamics to simulate dislocation structures and dislocation density evolution under tensile loading. Probability distribution functions (PDF) and pair correlation functions (PCF) of the simulated internal elastic strains and lattice rotations are generated for tensile strain levels up to 0.85%. The PDFs of simulated lattice rotation are compared with sub-micrometer resolution three-dimensional X-ray microscopy measurements of rotation magnitudes and deformation length scales in 1.0% and 2.3% compression strained Cu single crystals to explore the linkage between experiment and the theoretical analysis. The statistical properties of the deformation simulations are analyzed through determinations of the Nye and Kröner dislocation density tensors. The significance of the magnitudes and the length scales of the elastic strain and the rotation parts of dislocation density tensors are demonstrated, and their relevance to understanding the fundamental aspects of deformation is discussed.

  14. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    Science.gov (United States)

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.

  15. Wind energy statistics

    International Nuclear Information System (INIS)

    Holttinen, H.; Tammelin, B.; Hyvoenen, R.

    1997-01-01

    The recording, analyzing and publishing of statistics of wind energy production has been reorganized in cooperation between VTT Energy, the Finnish Meteorological Institute (FMI Energy) and the Finnish Wind Energy Association (STY), supported by the Ministry of Trade and Industry (KTM). VTT Energy has developed a database that contains both monthly data and information on the wind turbines, sites and operators involved. The monthly production figures together with component failure statistics are collected from the operators by VTT Energy, who produces the final wind energy statistics to be published in Tuulensilmae and reported to energy statistics in Finland and abroad (Statistics Finland, Eurostat, IEA). To be able to verify the annual and monthly wind energy potential against the average wind energy climate, a production index is adopted. The index gives the expected wind energy production at various areas in Finland, calculated using real wind speed observations, air density and a power curve for a typical 500 kW wind turbine. FMI Energy has produced the average figures for four weather stations using the data from 1985-1996, and produces the monthly figures. (orig.)

  16. Breakdown of the Siegert theorem and the many-body charge density operators

    International Nuclear Information System (INIS)

    Hyuga, H.; Ohtsubo, H.

    1978-01-01

    The exchange charge density operator is studied in the two-boson exchange model with consistent treatment of the exchange current and nuclear wave functions. A non-vanishing exchange charge density operator even in the static limit, which leads to the breakdown of the Siegert theorem, is found. (Auth.)

  17. Nuclear Level Densities for Modeling Nuclear Reactions: An Efficient Approach Using Statistical Spectroscopy: Annual Scientific Report July 2004

    International Nuclear Information System (INIS)

    Calvin W. Johnson

    2004-01-01

    The general goal of the project is to develop and implement computer codes and input files to compute nuclear densities of states. Such densities are important input into calculations of statistical neutron capture, and are difficult to access experimentally. In particular, we will focus on calculating densities for nuclides in the mass range A ≈ 50-100. We use statistical spectroscopy, a moments method based upon a microscopic framework, the interacting shell model. In this report we present our progress for the past year.

  18. Probability density cloud as a geometrical tool to describe statistics of scattered light.

    Science.gov (United States)

    Yaitskova, Natalia

    2017-04-01

    First-order statistics of scattered light is described using the representation of the probability density cloud, which visualizes a two-dimensional distribution for complex amplitude. The geometric parameters of the cloud are studied in detail and are connected to the statistical properties of phase. The moment-generating function for intensity is obtained in a closed form through these parameters. An example of exponentially modified normal distribution is provided to illustrate the functioning of this geometrical approach.
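
For the canonical case of a circular Gaussian "cloud" centered at the origin (fully developed speckle), the intensity moments are known in closed form and can be checked numerically. This is an illustrative special case, not the paper's exponentially modified normal example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Circular complex-Gaussian amplitude: the probability density cloud is an
# isotropic 2-D Gaussian, so intensity I = |A|^2 is exponential with mean 1
# and moment-generating function 1/(1 - s) for s < 1, i.e. E[I^m] = m!.
n = 200_000
amp = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
intensity = np.abs(amp) ** 2
moments = [intensity.mean(), np.mean(intensity**2)]  # close to [1, 2]
```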

  19. Experimental study of high density foods for the Space Operations Center

    Science.gov (United States)

    Ahmed, S. M.

    1981-01-01

    The experimental study of high density foods for the Space Operations Center is described. A sensory evaluation of the high density foods was conducted first to test the acceptability of the products. A shelf-life study of the high density foods was also conducted for three different time lengths at three different temperatures. The nutritional analysis of the high density foods is at present incomplete.

  20. Spectra of random operators with absolutely continuous integrated density of states

    International Nuclear Information System (INIS)

    Rio, Rafael del

    2014-01-01

    The structure of the spectrum of random operators is studied. It is shown that if the density of states measure of some subsets of the spectrum is zero, then these subsets are empty. In particular, it follows that absolute continuity of the integrated density of states implies that the singular spectrum of ergodic operators is either empty or of positive measure. Our results apply to Anderson and alloy type models, perturbed Landau Hamiltonians, almost periodic potentials, and models which are not ergodic

  1. Spectra of random operators with absolutely continuous integrated density of states

    Energy Technology Data Exchange (ETDEWEB)

    Rio, Rafael del, E-mail: delrio@iimas.unam.mx, E-mail: delriomagia@gmail.com [Departamento de Fisica Matematica, Instituto de Investigaciones en Matemáticas Aplicadas y en Sistemas, Universidad Nacional Autónoma de México, C.P. 04510, México D.F. (Mexico)

    2014-04-15

    The structure of the spectrum of random operators is studied. It is shown that if the density of states measure of some subsets of the spectrum is zero, then these subsets are empty. In particular, it follows that absolute continuity of the integrated density of states implies that the singular spectrum of ergodic operators is either empty or of positive measure. Our results apply to Anderson and alloy type models, perturbed Landau Hamiltonians, almost periodic potentials, and models which are not ergodic.

  2. Statistical study of density fluctuations in the tore supra tokamak

    International Nuclear Information System (INIS)

    Devynck, P.; Fenzi, C.; Garbet, X.; Laviron, C.

    1998-03-01

    It is believed that radial anomalous transport in tokamaks is caused by plasma turbulence. Using an infrared laser scattering technique on the Tore Supra tokamak, statistical properties of the density fluctuations are studied as a function of scale, in ohmic as well as additional heating regimes using the lower hybrid or ion cyclotron frequencies. The probability distributions are compared to a Gaussian in order to estimate the role of intermittency, which is found to be negligible. The temporal behaviour of the three-dimensional spectrum is thoroughly discussed; its multifractal character is reflected in the singularity spectrum. The autocorrelation coefficients are discussed, as well as their long-time incoherence and statistical independence. We also put forward the existence of fluctuation transfer between two distinct but close wavenumbers. A clearer picture is thus obtained of the way energy is transferred through the turbulent scales. (author)
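The intermittency check in the record above compares the measured fluctuation distribution against a Gaussian. A minimal sketch of such a test (not the authors' analysis; the sample data are synthetic stand-ins) uses the excess kurtosis, which vanishes for a Gaussian and grows for intermittent, heavy-tailed signals:

```python
import random

def excess_kurtosis(xs):
    """Fourth standardized moment minus 3: zero for a Gaussian,
    positive for heavy-tailed (intermittent) fluctuation statistics."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (m2 * m2) - 3.0

rng = random.Random(0)
gaussian = [rng.gauss(0.0, 1.0) for _ in range(100_000)]
# Laplace-distributed stand-in for an intermittent signal (excess kurtosis 3)
heavy = [rng.expovariate(1.0) * rng.choice((-1.0, 1.0)) for _ in range(100_000)]

print(round(excess_kurtosis(gaussian), 2))  # near 0
print(round(excess_kurtosis(heavy), 2))     # near 3
```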

  3. Statistical analysis of first period of operation of FTU Tokamak

    International Nuclear Information System (INIS)

    Crisanti, F.; Apruzzese, G.; Frigione, D.; Kroegler, H.; Lovisetto, L.; Mazzitelli, G.; Podda, S.

    1996-09-01

    On the FTU Tokamak, plasma physics operations started on 20/4/90. The first plasma had a plasma current Ip = 0.75 MA for about a second. The experimental phase lasted until 7/7/94, when a long shut-down began for installing the toroidal limiter on the inner side of the vacuum vessel. In these four years of operation, plasma experiments were successfully exploited, e.g. experiments of single and multiple pellet injection; full current drive up to Ip = 300 kA was obtained by using waves at the lower hybrid frequency; analysis of ohmic plasma parameters with different materials (from low-Z silicon to high-Z tungsten) as the plasma facing element was performed. In this work a statistical analysis of the full period of operation is presented. Moreover, a comparison with statistical data from other Tokamaks is attempted.

  4. Statistical factors affecting the success of nuclear operations

    International Nuclear Information System (INIS)

    Sunder, S.; Stephenson, J.R.; Hochman, D.

    1999-01-01

    In this article, the authors present a statistical analysis to determine the operational, financial, technical, and managerial factors that most significantly affect the success of nuclear operations. The study analyzes data for over 70 nuclear plants and 40 operating companies over a period of five years in order to draw conclusions that the authors hope will be of interest to utility companies and public utility commissions as they seek ways to improve rates of success in nuclear operations. Some of these conclusions will not be surprising (for example, that older plants have heavier maintenance requirements), but others are less intuitive. For instance, the observation that operators of fewer plants have lower costs suggests that any experience-curve benefits associated with managing multiple nuclear facilities are overshadowed by the logistic problems of multiple facilities. After presenting a brief history of nuclear power in America, the authors outline the motivations of the study and the methodology of their analysis. They end the article with the results of the study and discuss some of the managerial implications of these findings.

  5. Statistical mechanics of high-density bond percolation

    Science.gov (United States)

    Timonin, P. N.

    2018-05-01

    High-density (HD) percolation describes the percolation of specific κ-clusters, which are the compact sets of sites each connected to at least κ nearest filled sites. It takes place in the classical patterns of independently distributed sites or bonds in which the ordinary percolation transition also exists. Hence, the study of the series of κ-type HD percolations amounts to the description of the structure of classical clusters, for which κ-clusters constitute κ-cores nested one into another. Such data are needed for the description of a number of physical, biological, and information properties of complex systems on random lattices, graphs, and networks, ranging from magnetic properties of semiconductor alloys to anomalies in supercooled water and clustering in biological and social networks. Here we present a statistical mechanics approach to study HD bond percolation on an arbitrary graph. It is shown that the generating function for the κ-clusters' size distribution can be obtained from the partition function of a specific q-state Potts-Ising model in the q → 1 limit. Using this approach we find exact κ-clusters' size distributions for the Bethe lattice and the Erdős-Rényi graph. The application of the method to Euclidean lattices is also discussed.
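As the abstract notes, κ-clusters are the κ-cores nested inside ordinary percolation clusters. A small illustrative sketch (not the paper's analytic Potts-model method): the standard peeling algorithm that extracts the k-core of an Erdős-Rényi random graph, with the graph size and mean degree chosen arbitrarily for the example:

```python
import random
from collections import defaultdict

def k_core(nodes, edges, k):
    """Repeatedly strip vertices with fewer than k neighbors; every
    vertex that survives keeps at least k neighbors inside the core."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    core = set(nodes)
    changed = True
    while changed:
        changed = False
        for u in list(core):
            if len(adj[u] & core) < k:
                core.discard(u)
                changed = True
    return core

# Erdos-Renyi graph G(n, p) with mean degree c = 5 (arbitrary test values)
random.seed(1)
n, c = 2000, 5.0
p = c / (n - 1)
edges = [(u, v) for u in range(n) for v in range(u + 1, n) if random.random() < p]

core3 = k_core(range(n), edges, 3)
print(len(core3) / n)  # fraction of vertices belonging to the 3-core
```

Cores are nested by construction: the 4-core is always contained in the 3-core, mirroring the nesting of κ-clusters described in the abstract.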

  6. Statistical process control: separating signal from noise in emergency department operations.

    Science.gov (United States)

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
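The control-limit construction described above can be sketched with a minimal individuals/moving-range (XmR) chart. The data are invented for illustration; 2.66 is the standard XmR constant 3/d2 with d2 = 1.128 for subgroups of size two:

```python
import statistics

def xmr_limits(data):
    """Individuals-chart control limits from the average moving range."""
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = statistics.fmean(moving_ranges)
    center = statistics.fmean(data)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# hypothetical daily median ED door-to-provider times (minutes)
times = [32, 28, 35, 30, 31, 29, 34, 33, 27, 36, 30, 58]
lcl, center, ucl = xmr_limits(times)

# points outside the limits indicate special cause variation (signal);
# points inside reflect common cause variation (noise)
signals = [t for t in times if t < lcl or t > ucl]
print(signals)  # the 58-minute day falls above the upper control limit
```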

  7. Durability of Low Platinum Fuel Cells Operating at High Power Density

    Energy Technology Data Exchange (ETDEWEB)

    Polevaya, Olga [Nuvera Fuel Cells Inc.; Blanchet, Scott [Nuvera Fuel Cells Inc.; Ahluwalia, Rajesh [Argonne National Lab; Borup, Rod [Los-Alamos National Lab; Mukundan, Rangachary [Los-Alamos National Lab

    2014-03-19

    Understanding and improving the durability of cost-competitive fuel cell stacks is imperative to successful deployment of the technology. Stacks will need to operate well beyond today's state-of-the-art rated power density with very low platinum loading in order to achieve the cost targets set forth by DOE ($15/kW) and ultimately be competitive with incumbent technologies. An accelerated cost-reduction path presented by Nuvera focused on substantially increasing power density to address non-PGM material costs as well as platinum. The study developed a practical understanding of the degradation mechanisms impacting durability of fuel cells with low platinum loading (≤0.2 mg/cm2) operating at high power density (≥1.0 W/cm2) and worked out approaches for improving the durability of low-loaded, high-power stack designs. Of specific interest is the impact of combining low platinum loading with high power density operation, as this offers the best chance of achieving long-term cost targets. A design-of-experiments approach was utilized to reveal and quantify the sensitivity of durability-critical material properties to high current density at two levels of platinum loading (the more conventional 0.45 mgPt/cm2 and the much lower 0.2 mgPt/cm2) across several cell architectures. We studied the relevance of selected component accelerated stress tests (AST) to fuel cell operation in power-producing mode. New stress tests (NST) were designed to investigate the sensitivity to the addition of electrical current on the ASTs, along with combined humidity and load cycles, and, eventually, to relate to the combined city/highway drive cycle. Changes in the cathode electrochemical surface area (ECSA) and average oxygen partial pressure on the catalyst layer with aging under AST and NST protocols were compared based on the number of completed cycles. Studies showed elevated sensitivity of Pt growth to the potential limits and the initial particle size distribution. The ECSA loss

  8. Density limit in FTU tokamak during Ohmic operation

    International Nuclear Information System (INIS)

    Frigione, D.; Pieroni, L.

    1993-01-01

    Understanding the physical mechanisms that regulate the density limit in a Tokamak is very important in view of a future fusion reactor. On one hand density enters as a factor in the figure of merit needed to achieve a burning plasma, and on the other hand a high edge density is a prerequisite for avoiding excessive erosion of the first wall and for limiting the impurity influx into the hot plasma core. Furthermore a reactor should work in a safe zone of the operational parameters in order to avoid disruptive instabilities. The density limit problem has been tackled since the 70's, but so far a unique physics picture has still not emerged. In the last few years, due to the availability of better diagnostics, especially for the plasma edge, the use of pellet injectors to fuel the plasma and the experience gained on many different Tokamaks, a consensus has been reached on the edge density as the real parameter responsible for the density limit. There are still two main mechanisms invoked to explain this limit: one refers to the power balance between the heat conducted and/or convected across the plasma radius and the power lost by impurity line radiation at the edge. When the latter overcomes the former, shrinking of the current channel occurs, which leads to instabilities due to tearing modes (usually the m/n = 2/1) and then to disruption. The other explanation, at present valid only for divertor machines, is based on the particle and energy balance in the scrape-off layer (SOL). The limit on the edge density is then associated with the thermal collapse of the divertor plasma. In this work we describe the experiments on the density limit in FTU with Ohmic heating, the reasons why we also believe that the limit is on the edge density, and discuss its relation to a simple model based on the SOL power balance valid for a limiter Tokamak. (author) 7 refs., 4 figs

  9. Feedback control of plasma density and heating power for steady state operation in LHD

    Energy Technology Data Exchange (ETDEWEB)

    Kamio, Shuji, E-mail: kamio@nifs.ac.jp; Kasahara, Hiroshi; Seki, Tetsuo; Saito, Kenji; Seki, Ryosuke; Nomura, Goro; Mutoh, Takashi

    2015-12-15

    Highlights: • We upgraded a control system for steady state operation in LHD. • This system contains a gas fueling system and an ICRF power control system. • An automatic power boost system is also attached for stable operation. • As a result, we achieved a long pulse of up to 48 min at an electron density of more than 1 × 10^19 m^-3. - Abstract: For steady state operation, a feedback control system for plasma density and heating power was developed in the Large Helical Device (LHD). In order to achieve a record long pulse discharge, stable plasma density and heating power are needed. This system contains radio frequency (RF) heating power control, interlocks, gas fueling, automatic RF phase control, ion cyclotron range of frequencies (ICRF) antenna position control, and a graphical user interface (GUI). Using the density control system, the electron density was controlled to the target density, and using the RF heating power control system, the RF power injection could be kept stable. As a result of using this system, we achieved a long pulse of up to 48 min at an electron density of more than 1 × 10^19 m^-3. Further, the ICRF hardware experienced no critical accidents during the 17th LHD experiment campaign in 2013.

  10. Exploration of one-dimensional plasma current density profile for K-DEMO steady-state operation

    Energy Technology Data Exchange (ETDEWEB)

    Kang, J.S. [Seoul National University, Seoul 151-742 (Korea, Republic of); Jung, L. [National Fusion Research Institute, Daejeon (Korea, Republic of); Byun, C.-S.; Na, D.H.; Na, Y.-S. [Seoul National University, Seoul 151-742 (Korea, Republic of); Hwang, Y.S., E-mail: yhwang@snu.ac.kr [Seoul National University, Seoul 151-742 (Korea, Republic of)

    2016-11-01

    Highlights: • One-dimensional current density and its optimization for the K-DEMO are explored. • The plasma current density profile is calculated with an integrated simulation code. • The impact of self and external heating profiles is considered self-consistently. • A current density profile is identified as a reference profile by minimizing heating power. - Abstract: A concept study for the Korean demonstration fusion reactor (K-DEMO) is in progress, and basic design parameters have been proposed targeting high magnetic field operation with an ITER-sized machine. High magnetic field operation is a favorable approach to enlarging relative plasma performance without increasing the normalized beta or plasma current. The exploration of the one-dimensional current density profile and its optimization process for K-DEMO steady-state operation is reported in this paper. Numerical analysis is conducted with an integrated plasma simulation code package incorporating a transport code with equilibrium and current drive modules. Operation regimes are addressed with zero-dimensional system analysis. The one-dimensional plasma current density profile is calculated based on equilibrium, bootstrap current analysis, and thermal transport analysis. The impact of self and external heating profiles on these parameters is considered self-consistently, where thermal power balance and 100% non-inductive current drive are the main constraints during the whole exploration procedure. Current and pressure profiles are identified as a reference steady-state profile by minimizing the external heating power at the desired fusion power.

  11. Remotely operable compact instruments for measuring atmospheric CO2 and CH4 column densities at surface monitoring sites

    Directory of Open Access Journals (Sweden)

    I. Morino

    2010-08-01

    Remotely operable compact instruments for measuring atmospheric CO2 and CH4 column densities were developed in two independent systems: one utilizing a grating-based desktop optical spectrum analyzer (OSA) with resolution sufficient to resolve rotational lines of CO2 and CH4 in the regions of 1565–1585 and 1674–1682 nm, respectively; the other an application of an optical fiber Fabry-Perot interferometer (FFPI) to obtain the CO2 column density. Direct sunlight was collimated via a small telescope installed on a portable sun tracker and then transmitted through an optical fiber into the OSA or the FFPI for optical analysis. The near-infrared spectra of the OSA were retrieved by a least squares spectral fitting algorithm. The CO2 and CH4 column densities deduced were in excellent agreement with those measured by a high-resolution Fourier transform spectrometer. The rovibronic lines in the wavelength region of 1570–1575 nm were analyzed by the FFPI. The I0 and I values in the Beer-Lambert law equation used to obtain the CO2 column density were deduced by modulating the temperature of the FFPI, which yielded the CO2 column density with a statistical error of less than 0.2% for six hours of measurement.
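The Beer-Lambert step mentioned above reduces, for a single absorption line of cross-section σ, to N = ln(I0/I)/σ. A toy sketch with invented numbers (the cross-section and intensities below are illustrative, not values from the paper):

```python
import math

def column_density(i0, i, sigma_cm2):
    """Beer-Lambert law I = I0 * exp(-sigma * N), solved for the
    column density N in molecules per cm^2."""
    return math.log(i0 / i) / sigma_cm2

sigma = 1.8e-23     # illustrative line cross-section, cm^2
i0, i = 1.00, 0.96  # hypothetical off-line and on-line relative intensities
n_col = column_density(i0, i, sigma)
print(f"{n_col:.3e} molecules/cm^2")
```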

  12. Microstructure characterisation of solid oxide electrolysis cells operated at high current density

    DEFF Research Database (Denmark)

    Bowen, Jacob R.; Bentzen, Janet Jonna; Chen, Ming

    High temperature solid oxide cells can be operated either as fuel cells or electrolysis cells for efficient power generation or production of hydrogen from steam or synthesis gas (H2 + CO) from steam and CO2, respectively. When operated under harsh conditions, they often exhibit microstructural degradation of cell components in relation to the loss of electrochemical performance specific to the mode of operation. Thus descriptive microstructure characterization methods are required in combination with electrochemical characterization methods to decipher degradation mechanisms. In the present work ... quantified using the mean linear intercept method as a function of current density and correlated to increases in serial resistance. The above structural changes are then compared in terms of electrode degradation observed during the co-electrolysis of steam and CO2 at current densities up to -1.5 A cm-2 ...

  13. Theoretical remarks on the statistics of three discriminants in Piety's automated signature analysis of PSD [Power Spectral Density] data

    International Nuclear Information System (INIS)

    Behringer, K.; Spiekerman, G.

    1984-01-01

    Piety (1977) proposed an automated signature analysis of power spectral density data. Eight statistical decision discriminants are introduced. For nearly all the discriminants, improved confidence statements can be made. The statistical characteristics of the last three discriminants, which are applications of non-parametric tests, are considered. (author)

  14. Density by Moduli and Lacunary Statistical Convergence

    Directory of Open Access Journals (Sweden)

    Vinod K. Bhardwaj

    2016-01-01

    We have introduced and studied a new concept of f-lacunary statistical convergence, where f is an unbounded modulus. It is shown that, under certain conditions on a modulus f, the concepts of lacunary strong convergence with respect to a modulus f and f-lacunary statistical convergence are equivalent on bounded sequences. We further characterize those θ for which S_θ^f = S^f, where S_θ^f and S^f denote the sets of all f-lacunary statistically convergent sequences and f-statistically convergent sequences, respectively. A general description of the inclusion between two arbitrary lacunary methods of f-statistical convergence is given. Finally, we give an S_θ^f-analog of the Cauchy criterion for convergence, and a Tauberian theorem for S_θ^f-convergence is also proved.

  15. Density functional representation of quantum chemistry. II. Local quantum field theories of molecular matter in terms of the charge density operator do not work

    International Nuclear Information System (INIS)

    Primas, H.; Schleicher, M.

    1975-01-01

    A comprehensive review of the attempts to rephrase molecular quantum mechanics in terms of the particle density operator and the current density or phase density operator is given. All pertinent investigations which have come to our attention suffer from severe mathematical inconsistencies and are not adequate to the few-body problem of quantum chemistry. The origin of the failure of these attempts is investigated, and it is shown that a realization of a local quantum field theory of molecular matter in terms of observables would presuppose the solution of many highly nontrivial mathematical problems.

  16. Time Evolution of the Wigner Operator as a Quasi-density Operator in Amplitude Dissipative Channel

    Science.gov (United States)

    Yu, Zhisong; Ren, Guihua; Yu, Ziyang; Wei, Chenhuinan; Fan, Hongyi

    2018-06-01

    For developing quantum mechanics theory in phase space, we explore how the Wigner operator Δ(α, α*) ≡ (1/π) : e^{-2(α* - a†)(α - a)} :, when viewed as a quasi-density operator corresponding to the Wigner quasiprobability distribution, evolves in an amplitude damping channel with damping constant κ. We derive that it evolves into (1/(T + 1)) : exp[-(2/(T + 1)) (α* e^{-κt} - a†)(α e^{-κt} - a)] :, where T ≡ 1 - e^{-2κt}. This in turn helps to directly obtain the final state ρ(t) out of the dissipative channel from the initial classical function corresponding to the initial ρ(0). Throughout the work, the method of integration within ordered product (IWOP) of operators is employed.

  17. Justification of the density functional method in classical and quantum statistical mechanics

    International Nuclear Information System (INIS)

    Dinariev, O.Yu.

    2000-01-01

    The connection between the phenomenological description of a multi-component mixture based on an entropy functional containing terms quadratic in the component density and temperature gradients, on the one hand, and the description in the framework of classical and quantum statistical mechanics, on the other hand, was investigated. Explicit expressions for the entropy functional in the classical and quantum theory were derived. A quadratic approximation for the case of small disturbances of the uniform state was then calculated. In this approximation the terms quadratic in the gradients were singled out, which permits calculation of the relevant phenomenological coefficients from first principles.

  18. Definition and density operator for unpolarized fermion state

    International Nuclear Information System (INIS)

    Prakash, H.

    1981-04-01

    The unpolarized state of fermions is defined as one which does not change under rotations in spin space. It is shown that, for a fermion field with a specified value of the particle momentum, the density operator is of the form ρ = (1 - 2a - b)|0,0><0,0| + a(|1,0><1,0| + |0,1><0,1|) + b|1,1><1,1|, where |n1, n2> is the occupation number state having occupancies n1 and n2 in the two spin modes, and a and b are positive quantities which are less than one and satisfy 1 - 2a - b >= 0. (author)

  19. From statistical mechanics out of equilibrium to transport equations

    International Nuclear Information System (INIS)

    Balian, R.

    1995-01-01

    These lecture notes give a synthetic view of the foundations of non-equilibrium statistical mechanics. The purpose is to establish the transport equations satisfied by the relevant variables, starting from the microscopic dynamics. The Liouville representation is introduced, and a projection associates with any density operator, for a given choice of relevant observables, a reduced density operator. An exact integro-differential equation for the relevant variables is thereby derived. A short-memory approximation then yields the transport equations. A relevant entropy which characterizes the coarseness of the description is associated with each level of description. As an illustration, the classical gas, with its three levels of description and with the Chapman-Enskog method, is discussed. (author). 3 figs., 5 refs

  20. Effects of tillage operations and plant density on leaf spot disease ...

    African Journals Online (AJOL)

    Two seasons of experiments conducted in 2002 and 2003 revealed that tillage operations significantly influenced leaf spot disease severity, percentage lodging (3.14; 2.08) and grain yield (3.02; 3.84) in 2002 and 2003, respectively. Plant density also had a significant effect on leaf spot disease severity, percentage lodging ...

  1. Quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C.

    2010-01-01

    Quantum mechanics can emerge from classical statistics. A typical quantum system describes an isolated subsystem of a classical statistical ensemble with infinitely many classical states. The state of this subsystem can be characterized by only a few probabilistic observables. Their expectation values define a density matrix if they obey a 'purity constraint'. Then all the usual laws of quantum mechanics follow, including Heisenberg's uncertainty relation, entanglement and a violation of Bell's inequalities. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. Born's rule for quantum mechanical probabilities follows from the probability concept for a classical statistical ensemble. In particular, we show how the non-commuting properties of quantum operators are associated to the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem. As an illustration, we discuss a classical statistical implementation of a quantum computer.

  2. Quantum formalism for classical statistics

    Science.gov (United States)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  3. Connection between perturbation theory, projection-operator techniques, and statistical linearization for nonlinear systems

    International Nuclear Information System (INIS)

    Budgor, A.B.; West, B.J.

    1978-01-01

    We employ the equivalence between Zwanzig's projection-operator formalism and perturbation theory to demonstrate that the approximate-solution technique of statistical linearization for nonlinear stochastic differential equations corresponds to the lowest-order β truncation in both the consolidated perturbation expansions and in the ''mass operator'' of a renormalized Green's function equation. Other consolidated equations can be obtained by selectively modifying this mass operator. We particularize the results of this paper to the Duffing anharmonic oscillator equation

  4. Computer-aided assessment of breast density: comparison of supervised deep learning and feature-based statistical learning.

    Science.gov (United States)

    Li, Songfeng; Wei, Jun; Chan, Heang-Ping; Helvie, Mark A; Roubidoux, Marilyn A; Lu, Yao; Zhou, Chuan; Hadjiiski, Lubomir M; Samala, Ravi K

    2018-01-09

    Breast density is one of the most significant factors that is associated with cancer risk. In this study, our purpose was to develop a supervised deep learning approach for automated estimation of percentage density (PD) on digital mammograms (DMs). The input 'for processing' DMs was first log-transformed, enhanced by a multi-resolution preprocessing scheme, and subsampled to a pixel size of 800 µm  ×  800 µm from 100 µm  ×  100 µm. A deep convolutional neural network (DCNN) was trained to estimate a probability map of breast density (PMD) by using a domain adaptation resampling method. The PD was estimated as the ratio of the dense area to the breast area based on the PMD. The DCNN approach was compared to a feature-based statistical learning approach. Gray level, texture and morphological features were extracted and a least absolute shrinkage and selection operator was used to combine the features into a feature-based PMD. With approval of the Institutional Review Board, we retrospectively collected a training set of 478 DMs and an independent test set of 183 DMs from patient files in our institution. Two experienced mammography quality standards act radiologists interactively segmented PD as the reference standard. Ten-fold cross-validation was used for model selection and evaluation with the training set. With cross-validation, DCNN obtained a Dice's coefficient (DC) of 0.79  ±  0.13 and Pearson's correlation (r) of 0.97, whereas feature-based learning obtained DC  =  0.72  ±  0.18 and r  =  0.85. For the independent test set, DCNN achieved DC  =  0.76  ±  0.09 and r  =  0.94, while feature-based learning achieved DC  =  0.62  ±  0.21 and r  =  0.75. Our DCNN approach was significantly better and more robust than the feature-based learning approach for automated PD estimation on DMs, demonstrating its potential use for automated density reporting as well as
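Both figures of merit in this record, percentage density and Dice's coefficient, are simple area ratios. A toy sketch on an invented 4x4 probability map (not the authors' DCNN pipeline; the 0.5 dense threshold is an assumption for illustration):

```python
def percent_density(pmd, mask, threshold=0.5):
    """PD = dense area / breast area; dense pixels are probability-map
    values at or above the threshold, restricted to the breast mask."""
    dense = breast = 0
    for prow, mrow in zip(pmd, mask):
        for p, m in zip(prow, mrow):
            if m:
                breast += 1
                if p >= threshold:
                    dense += 1
    return dense / breast

def dice(a, b):
    """Dice's coefficient between two binary segmentations (nested lists)."""
    inter = sum(x and y for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    size_a = sum(x for row in a for x in row)
    size_b = sum(x for row in b for x in row)
    return 2.0 * inter / (size_a + size_b)

# invented 4x4 probability map: 12 breast pixels (nonzero), 6 of them >= 0.5
pmd = [[0.9, 0.8, 0.2, 0.0],
       [0.7, 0.6, 0.1, 0.0],
       [0.3, 0.9, 0.2, 0.0],
       [0.1, 0.6, 0.1, 0.0]]
mask = [[p > 0.0 for p in row] for row in pmd]
seg_a = [[p >= 0.5 for p in row] for row in pmd]   # one segmentation
seg_b = [[p >= 0.25 for p in row] for row in pmd]  # a looser reference

print(percent_density(pmd, mask))  # 6 / 12 = 0.5
print(dice(seg_a, seg_b))
```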

  5. Computer-aided assessment of breast density: comparison of supervised deep learning and feature-based statistical learning

    Science.gov (United States)

    Li, Songfeng; Wei, Jun; Chan, Heang-Ping; Helvie, Mark A.; Roubidoux, Marilyn A.; Lu, Yao; Zhou, Chuan; Hadjiiski, Lubomir M.; Samala, Ravi K.

    2018-01-01

    Breast density is one of the most significant factors that is associated with cancer risk. In this study, our purpose was to develop a supervised deep learning approach for automated estimation of percentage density (PD) on digital mammograms (DMs). The input ‘for processing’ DMs was first log-transformed, enhanced by a multi-resolution preprocessing scheme, and subsampled to a pixel size of 800 µm  ×  800 µm from 100 µm  ×  100 µm. A deep convolutional neural network (DCNN) was trained to estimate a probability map of breast density (PMD) by using a domain adaptation resampling method. The PD was estimated as the ratio of the dense area to the breast area based on the PMD. The DCNN approach was compared to a feature-based statistical learning approach. Gray level, texture and morphological features were extracted and a least absolute shrinkage and selection operator was used to combine the features into a feature-based PMD. With approval of the Institutional Review Board, we retrospectively collected a training set of 478 DMs and an independent test set of 183 DMs from patient files in our institution. Two experienced mammography quality standards act radiologists interactively segmented PD as the reference standard. Ten-fold cross-validation was used for model selection and evaluation with the training set. With cross-validation, DCNN obtained a Dice’s coefficient (DC) of 0.79  ±  0.13 and Pearson’s correlation (r) of 0.97, whereas feature-based learning obtained DC  =  0.72  ±  0.18 and r  =  0.85. For the independent test set, DCNN achieved DC  =  0.76  ±  0.09 and r  =  0.94, while feature-based learning achieved DC  =  0.62  ±  0.21 and r  =  0.75. Our DCNN approach was significantly better and more robust than the feature-based learning approach for automated PD estimation on DMs, demonstrating its potential use for automated density reporting as

  6. Quantum statistics of dense gases and nonideal plasmas

    CERN Document Server

    Ebeling, Werner; Filinov, Vladimir

    2017-01-01

    The aim of this book is the pedagogical exploration of the basic principles of quantum-statistical thermodynamics as applied to various states of matter – ranging from rare gases to astrophysical matter with high-energy density. The reader will learn in this work that thermodynamics and quantum statistics are still the concepts on which even the most advanced research is operating - despite a flood of modern concepts, classical entities like temperature, pressure, energy and entropy are shown to remain fundamental. The physics of gases, plasmas and high-energy density matter is still a growing field and even though solids and liquids dominate our daily life, more than 99 percent of the visible Universe is in the state of gases and plasmas and the overwhelming part of matter exists at extreme conditions connected with very large energy densities, such as in the interior of stars. This text, combining material from lectures and advanced seminars given by the authors over many decades, is a must-have intr...

  7. Statistical Engineering in Air Traffic Management Research

    Science.gov (United States)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  8. The statistics of maxima in primordial density perturbations

    International Nuclear Information System (INIS)

    Peacock, J.A.; Heavens, A.F.

    1985-01-01

    An investigation has been made of the hypothesis that protogalaxies/protoclusters form at the sites of maxima in a primordial field of normally distributed density perturbations. Using a mixture of analytic and numerical techniques, the properties of the maxima have been studied. The results provide a natural mechanism for biased galaxy formation in which galaxies do not necessarily follow the large-scale density. Methods for obtaining the true autocorrelation function of the density field and implications for Microwave Background studies are discussed. (author)

  9. Degradation of Solid Oxide Electrolysis Cells Operated at High Current Densities

    DEFF Research Database (Denmark)

    Tao, Youkun; Ebbesen, Sune Dalgaard; Mogensen, Mogens Bjerg

    2014-01-01

    In this work the durability of solid oxide cells for co-electrolysis of steam and carbon dioxide (45 % H2O + 45 % CO2 + 10 % H2) at high current densities was investigated. The tested cells are Ni-YSZ electrode supported, with a YSZ electrolyte and either a LSM-YSZ or LSCF-CGO oxygen electrode. ... A current density of -1.5 and -2.0 A/cm2 was applied to the cell and the gas conversion was 45 % and 60 %, respectively. The cells were operated for a period of up to 700 hours. The electrochemical analysis revealed significant performance degradation for the ohmic process, oxygen ion interfacial transfer...

  10. Global quantum discord and matrix product density operators

    Science.gov (United States)

    Huang, Hai-Lin; Cheng, Hong-Guang; Guo, Xiao; Zhang, Duo; Wu, Yuyin; Xu, Jian; Sun, Zhao-Yu

    2018-06-01

    In a previous study, we have proposed a procedure to study global quantum discord in 1D chains whose ground states are described by matrix product states [Z.-Y. Sun et al., Ann. Phys. 359, 115 (2015)]. In this paper, we show that with a very simple generalization, the procedure can be used to investigate quantum mixed states described by matrix product density operators, such as quantum chains at finite temperatures and 1D subchains in high-dimensional lattices. As an example, we study the global discord in the ground state of a 2D transverse-field Ising lattice, focusing on the scaling behavior of global discord in 1D sub-chains of the lattice. We find that, for any strength of the magnetic field, global discord always scales linearly with the length of the sub-chains. In addition, global discord and the so-called "discord density" can be used to indicate the quantum phase transition in the model. Furthermore, based upon our numerical results, we make some reliable predictions about the scaling of global discord defined on the n × n sub-squares in the lattice.

  11. Statistical analysis of disruptions in JET

    International Nuclear Information System (INIS)

    De Vries, P.C.; Johnson, M.F.; Segui, I.

    2009-01-01

    The disruption rate (the percentage of discharges that disrupt) in JET was found to drop steadily over the years. Recent campaigns (2005-2007) show a yearly averaged disruption rate of only 6% while from 1991 to 1995 this was often higher than 20%. Besides the disruption rate, the so-called disruptivity, or the likelihood of a disruption depending on the plasma parameters, has been determined. The disruptivity of plasmas was found to be significantly higher close to the three main operational boundaries for tokamaks: the low-q, high-density and β-limits. The frequency at which JET operated close to the density limit increased sixfold over the last decade; however, only a small reduction in disruptivity was found. Similarly, the disruptivity close to the low-q and β-limits was found to be unchanged. The most significant reduction in disruptivity was found far from the operational boundaries, leading to the conclusion that the improved disruption rate is due to a better technical capability of operating JET, rather than safer operation close to the physics limits. The statistics showed that a simple protection system was able to mitigate the forces of a large fraction of disruptions, although it has so far proved more difficult to ameliorate the heat flux.

  12. DAMPING OF ELECTRON DENSITY STRUCTURES AND IMPLICATIONS FOR INTERSTELLAR SCINTILLATION

    International Nuclear Information System (INIS)

    Smith, K. W.; Terry, P. W.

    2011-01-01

    The forms of electron density structures in kinetic Alfven wave (KAW) turbulence are studied in connection with scintillation. The focus is on small scales L ∼ 10^8-10^10 cm where the KAW regime is active in the interstellar medium, principally within turbulent H II regions. Scales at 10 times the ion gyroradius and smaller are inferred to dominate scintillation in the theory of Boldyrev et al. From numerical solutions of a decaying KAW turbulence model, structure morphology reveals two types of localized structures, filaments and sheets, and shows that they arise in different regimes of resistive and diffusive damping. Minimal resistive damping yields localized current filaments that form out of Gaussian-distributed initial conditions. When resistive damping is large relative to diffusive damping, sheet-like structures form. In the filamentary regime, each filament is associated with a non-localized magnetic and density structure, circularly symmetric in cross section. Density and magnetic fields have Gaussian statistics (as inferred from Gaussian-valued kurtosis) while density gradients are strongly non-Gaussian, more so than current. This enhancement of non-Gaussian statistics in a derivative field is expected since gradient operations enhance small-scale fluctuations. The enhancement of density gradient kurtosis over current kurtosis is not obvious, yet it suggests that modest density fluctuations may yield large scintillation events during pulsar signal propagation. In the sheet regime the same statistical observations hold, despite the absence of localized filamentary structures. Probability density functions are constructed from statistical ensembles in both regimes, showing clear formation of long, highly non-Gaussian tails.
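
    Kurtosis is the statistic used above to flag non-Gaussianity. A generic numerical sketch (synthetic signals, not the KAW model itself) showing that excess kurtosis is near zero for Gaussian data and strongly positive for an intermittent, heavy-tailed signal:

```python
import numpy as np

def excess_kurtosis(x):
    """Fisher (excess) kurtosis: 0 for Gaussian statistics, >0 for heavy tails."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2)**2 - 3.0

rng = np.random.default_rng(0)
gauss = rng.normal(size=200_000)                        # Gaussian reference
heavy = gauss * np.exp(0.5 * rng.normal(size=200_000))  # intermittent, heavy-tailed

print(excess_kurtosis(gauss))   # close to 0
print(excess_kurtosis(heavy))   # clearly positive
```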

  13. Communication: satisfying fermionic statistics in the modeling of open time-dependent quantum systems with one-electron reduced density matrices.

    Science.gov (United States)

    Head-Marsden, Kade; Mazziotti, David A

    2015-02-07

    For an open, time-dependent quantum system, Lindblad derived the most general modification of the quantum Liouville equation in the Markovian approximation that models environmental effects while preserving the non-negativity of the system's density matrix. While Lindblad's modification is correct for N-electron density matrices, solution of the Liouville equation with a Lindblad operator causes the one-electron reduced density matrix (1-RDM) to violate the Pauli exclusion principle. Consequently, after a short time, the 1-RDM is not representable by an ensemble N-electron density matrix (not ensemble N-representable). In this communication, we derive the necessary and sufficient constraints on the Lindbladian matrix within the Lindblad operator to ensure that the 1-RDM remains N-representable for all time. The theory is illustrated by considering the relaxation of an excitation in several molecules (F2, N2, CO, and BeH2) subject to environmental noise.
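
    The trace preservation and positivity guaranteed by Lindblad's form can be checked numerically in the simplest setting. A single-qubit sketch (the Hamiltonian, jump operator and rate here are arbitrary illustrative choices, not the molecular systems of the paper):

```python
import numpy as np

# Pauli-z, an illustrative Hamiltonian, and a single jump (decay) operator
sz = np.array([[1, 0], [0, -1]], complex)
L = np.array([[0, 1], [0, 0]], complex)
H = 0.5 * sz
gamma, dt = 0.2, 0.001

def lindblad_rhs(rho):
    """Right-hand side of the Lindblad master equation for one jump operator."""
    comm = -1j * (H @ rho - rho @ H)
    diss = gamma * (L @ rho @ L.conj().T
                    - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return comm + diss

rho = np.array([[0.5, 0.5], [0.5, 0.5]], complex)  # pure |+> state
for _ in range(5000):
    rho = rho + dt * lindblad_rhs(rho)             # simple Euler integration

print(np.trace(rho).real)            # ≈ 1: trace is preserved
print(np.linalg.eigvalsh(rho).min()) # ≥ 0 (up to integration error): positivity
```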

  14. Statistical separability and the impossibility of the superluminal quantum communication

    International Nuclear Information System (INIS)

    Zhang Qiren

    2004-01-01

    The authors analyse the relation and the difference between the quantum correlation of two points in space and the communication between them. The statistical separability of two points in space is defined and proven. From this statistical separability, the authors prove that superluminal quantum communication between different points is impossible. To emphasize the compatibility between quantum theory and relativity, the authors write the von Neumann equation of density-operator evolution in multi-time form. (author)

  15. Current density distribution mapping in PEM fuel cells as an instrument for operational measurements

    Energy Technology Data Exchange (ETDEWEB)

    Geske, M.; Heuer, M.; Heideck, G.; Styczynski, Z. A. [Otto-von-Guericke University Magdeburg, Chair Electric Power Networks and Renewable Energy Sources, Magdeburg (Germany)

    2010-07-01

    A newly developed measurement system for current density distribution mapping has enabled a new approach for operational measurements in proton exchange membrane fuel cells (PEMFC). Taking into account previously constructed measurement systems, a method based on a multi layer printed circuit board was chosen for the development of the new system. This type of system consists of a sensor, a special electronic device and the control and visualization PC. For the acquisition of the current density distribution values, a sensor device was designed and installed within a multilayer printed circuit board with integrated shunt resistors. Varying shunt values can be taken into consideration with a newly developed and evaluated calibration method. The sensor device was integrated in a PEM fuel cell stack to prove the functionality of the whole measurement system. A software application was implemented to visualize and save the measurement values. Its functionality was verified by operational measurements within a PEMFC system. Measurement accuracy and possible negative reactions of the sensor device during PEMFC operation are discussed in detail in this paper. The developed system enables operational measurements for different operating phases of PEM fuel cells. Additionally, this can be seen as a basis for new opportunities of optimization for fuel cell design and operation modes. (author)

  16. Current Density Distribution Mapping in PEM Fuel Cells as An Instrument for Operational Measurements

    Directory of Open Access Journals (Sweden)

    Martin Geske

    2010-04-01

    Full Text Available A newly developed measurement system for current density distribution mapping has enabled a new approach for operational measurements in proton exchange membrane fuel cells (PEMFC. Taking into account previously constructed measurement systems, a method based on a multi layer printed circuit board was chosen for the development of the new system. This type of system consists of a sensor, a special electronic device and the control and visualization PC. For the acquisition of the current density distribution values, a sensor device was designed and installed within a multilayer printed circuit board with integrated shunt resistors. Varying shunt values can be taken into consideration with a newly developed and evaluated calibration method. The sensor device was integrated in a PEM fuel cell stack to prove the functionality of the whole measurement system. A software application was implemented to visualize and save the measurement values. Its functionality was verified by operational measurements within a PEMFC system. Measurement accuracy and possible negative reactions of the sensor device during PEMFC operation are discussed in detail in this paper. The developed system enables operational measurements for different operating phases of PEM fuel cells. Additionally, this can be seen as a basis for new opportunities of optimization for fuel cell design and operation modes.

  17. Operation of a semiconductor opening switch at ultrahigh current densities

    International Nuclear Information System (INIS)

    Lyubutin, S. K.; Rukin, S. N.; Slovikovsky, B. G.; Tsyranov, S. N.

    2012-01-01

    The operation of a semiconductor opening switch (SOS diode) at cutoff current densities of tens of kA/cm^2 is studied. In experiments, the maximum reverse current density reached 43 kA/cm^2 for ∼40 ns. Experimental data on SOS diodes with a p+-p-n-n+ structure and a p-n junction depth from 145 to 180 μm are presented. The dynamics of electron-hole plasma in the diode at pumping and current cutoff stages is studied by numerical simulation methods. It is shown that current cutoff is associated with the formation of an electric field region in a thin (∼45 μm) layer of the structure’s heavily doped p-region, in which the acceptor concentration exceeds 10^16 cm^-3, and the current cutoff process depends weakly on the p-n junction depth.

  18. Common approximations for density operators may lead to imaginary entropy

    International Nuclear Information System (INIS)

    Lendi, K.; Amaral Junior, M.R. do

    1983-01-01

    The meaning and validity of usual second order approximations for density operators are illustrated with the help of a simple exactly soluble two-level model in which all relevant quantities can easily be controlled. This leads to exact upper bound error estimates which help to select more precisely permissible correlation times as frequently introduced if stochastic potentials are present. A final consideration of information entropy reveals clearly the limitations of this kind of approximation procedures. (Author) [pt

  19. Statistical analysis on failure-to-open/close probability of motor-operated valve in sodium system

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1998-08-01

    The objective of this work is to develop basic data for examination of the efficiency of preventive maintenance and actuation tests from the standpoint of failure probability. This work consists of a statistical trend analysis of valve failure probability in a failure-to-open/close mode, as a function of time since installation and time since the last open/close action, based on field data of operating and failure experience. In this work, terms both dependent on and independent of time were considered in the failure probability. The linear aging model was modified and applied to the first term. In this model there are two terms, with failure rates proportional to time since installation and to time since the last open/close demand. Because of their sufficient statistical population, motor-operated valves (MOVs) in sodium systems were selected for analysis from the CORDS database, which contains operating data and failure data of components in fast reactors and sodium test facilities. According to these data, the functional parameters were statistically estimated to quantify the valve failure probability in a failure-to-open/close mode, with consideration of uncertainty. (J.P.N.)
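
    The modified linear aging model described above amounts to a per-demand failure probability with a constant term plus terms proportional to time since installation and time since the last demand. A sketch with purely hypothetical parameter values (the abstract does not give the fitted numbers):

```python
def demand_failure_probability(p0, a, b, t_install, t_last_demand):
    """Modified linear aging model as described in the abstract: a constant
    term plus terms proportional to time since installation and time since
    the last open/close demand. All parameter values are illustrative only."""
    return p0 + a * t_install + b * t_last_demand

# Hypothetical parameters: baseline per-demand probability and 1/year slopes
p = demand_failure_probability(p0=1e-4, a=2e-6, b=5e-5,
                               t_install=10.0, t_last_demand=0.5)
print(p)  # 1e-4 + 2e-5 + 2.5e-5 = 1.45e-4
```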

  20. Improvement of the environmental and operational characteristics of vehicles through decreasing the motor fuel density.

    Science.gov (United States)

    Magaril, Elena

    2016-04-01

    The environmental and operational characteristics of motor transport, one of the main consumers of motor fuel and source of toxic emissions, soot, and greenhouse gases, are determined to a large extent by the fuel quality which is characterized by many parameters. Fuel density is one of these parameters and it can serve as an indicator of fuel quality. It has been theoretically substantiated that an increased density of motor fuel has a negative impact both on the environmental and operational characteristics of motor transport. The use of fuels with a high density leads to an increase in carbonization within the engine, adversely affecting the vehicle performance and increasing environmental pollution. A program of technological measures targeted at reducing the density of the fuel used was offered. It includes a solution to the problem posed by changes in the refining capacities ratio and the temperature range of gasoline and diesel fuel boiling, by introducing fuel additives and adding butanes to the gasoline. An environmental tax has been developed which allows oil refineries to have a direct impact on the production of fuels with improved environmental performance, taking into account the need to minimize the density of the fuel within a given category of quality.

  1. A Statistical Study of Eiscat Electron and Ion Temperature Measurements In The E-region

    Science.gov (United States)

    Hussey, G.; Haldoupis, C.; Schlegel, K.; Bösinger, T.

    Motivated by the large EISCAT data base, which covers over 15 years of common programme operation, and previous statistical work with EISCAT data (e.g., C. Haldoupis, K. Schlegel, and G. Hussey, Auroral E-region electron density gradients measured with EISCAT, Ann. Geophysicae, 18, 1172-1181, 2000), a detailed statistical analysis of electron and ion EISCAT temperature measurements has been undertaken. This study was specifically concerned with the statistical dependence of heating events on other ambient parameters such as the electric field and electron density. The results showed previously reported dependences, such as the electron temperature being directly correlated with the ambient electric field and inversely related to the electron density. However, these correlations were found to be also dependent upon altitude. There was also evidence of the so-called "Schlegel effect" (K. Schlegel, Reduced effective recombination coefficient in the disturbed polar E-region, J. Atmos. Terr. Phys., 44, 183-185, 1982); that is, the heated electron gas leads to increases in electron density through a reduction in the recombination rate. This paper will present the statistical heating results and attempt to offer physical explanations and interpretations of the findings.

  2. High density operation in pulsator

    International Nuclear Information System (INIS)

    Klueber, O.; Cannici, B.; Engelhardt, W.; Gernhardt, J.; Glock, E.; Karger, F.; Lisitano, G.; Mayer, H.M.; Meisel, D.; Morandi, P.

    1976-03-01

    This report summarizes the results of experiments at high electron densities (>10^14 cm^-3) which have been achieved by pulsed gas inflow during the discharge. At these densities a regime is established which is characterized by β_p > 1, n_i ≈ n_e, T_i ≈ T_e and τ_E ∝ n_e. Thus the toroidal magnetic field contributes considerably to the plasma confinement and the ions constitute almost half of the plasma pressure. Furthermore, the confinement is appreciably improved and the plasma becomes impermeable to hot neutrals. (orig.) [de

  3. Dynamic Statistical Models for Pyroclastic Density Current Generation at Soufrière Hills Volcano

    Science.gov (United States)

    Wolpert, Robert L.; Spiller, Elaine T.; Calder, Eliza S.

    2018-05-01

    To mitigate volcanic hazards from pyroclastic density currents, volcanologists generate hazard maps that provide long-term forecasts of areas of potential impact. Several recent efforts in the field develop new statistical methods for application of flow models to generate fully probabilistic hazard maps that both account for, and quantify, uncertainty. However, a limitation to the use of most statistical hazard models, and a key source of uncertainty within them, is the time-averaged nature of the datasets by which the volcanic activity is statistically characterized. Where the level, or directionality, of volcanic activity frequently changes, e.g. during protracted eruptive episodes, or at volcanoes that are classified as persistently active, it is not appropriate to make short-term forecasts based on longer time-averaged metrics of the activity. Thus, here we build, fit and explore dynamic statistical models for the generation of pyroclastic density currents from Soufrière Hills Volcano (SHV) on Montserrat, including their respective collapse directions and flow volumes, based on 1996-2008 flow datasets. The development of this approach allows for short-term behavioral changes to be taken into account in probabilistic volcanic hazard assessments. We show that collapses from the SHV lava dome follow a clear pattern, and that a series of smaller flows in a given direction often culminate in a larger collapse, after which the directionality of the flows changes. Such models enable short-term forecasting (weeks to months) that can reflect evolving conditions such as dome and crater morphology changes and non-stationary eruptive behavior such as extrusion rate variations. For example, the probability of inundation of the Belham Valley in the first 180 days of a forecast period is about twice as high for lava domes facing Northwest toward that valley as it is for domes pointing East toward the Tar River Valley. As rich multi-parametric volcano monitoring dataset become

  4. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  5. Solution of the statistical bootstrap with Bose statistics

    International Nuclear Information System (INIS)

    Engels, J.; Fabricius, K.; Schilling, K.

    1977-01-01

    A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density

  6. Statistics for demodulation RFI in inverting operational amplifier circuits

    Science.gov (United States)

    Sutu, Y.-H.; Whalen, J. J.

    An investigation was conducted with the objective to determine statistical variations for RFI demodulation responses in operational amplifier (op amp) circuits. Attention is given to the experimental procedures employed, a three-stage op amp LED experiment, NCAP (Nonlinear Circuit Analysis Program) simulations of demodulation RFI in 741 op amps, and a comparison of RFI in four op amp types. Three major recommendations for future investigations are presented on the basis of the obtained results. One is concerned with the conduction of additional measurements of demodulation RFI in inverting amplifiers, while another suggests the employment of an automatic measurement system. It is also proposed to conduct additional NCAP simulations in which parasitic effects are accounted for more thoroughly.

  7. Statistical correlations in an ideal gas of particles obeying fractional exclusion statistics.

    Science.gov (United States)

    Pellegrino, F M D; Angilella, G G N; March, N H; Pucci, R

    2007-12-01

    After a brief discussion of the concepts of fractional exchange and fractional exclusion statistics, we report partly analytical and partly numerical results on thermodynamic properties of assemblies of particles obeying fractional exclusion statistics. The effect of dimensionality is one focal point, the ratio μ/k_BT of chemical potential to thermal energy being obtained numerically as a function of a scaled particle density. Pair correlation functions are also presented as a function of the statistical parameter, with Friedel oscillations developing close to the fermion limit, for sufficiently large density.
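
    For reference, the mean occupation in Haldane-Wu fractional exclusion statistics follows from Wu's transcendental equation w^g (1+w)^(1-g) = e^((ε-μ)/k_BT) with n = 1/(w+g); g = 1 recovers Fermi-Dirac and g → 0 Bose-Einstein. A numerical sketch using stdlib bisection (a generic illustration, not the paper's own calculations):

```python
import math

def occupation(x, g, tol=1e-12):
    """Mean occupation n for exclusion parameter g at x = (eps - mu)/kT.
    Solves Wu's equation w^g (1+w)^(1-g) = e^x, then n = 1/(w + g)."""
    if g == 0.0:
        return 1.0 / (math.exp(x) - 1.0)          # Bose-Einstein limit
    f = lambda w: g * math.log(w) + (1.0 - g) * math.log1p(w) - x
    lo, hi = 1e-300, 1e300                        # f is monotone increasing in w
    while hi - lo > tol * max(1.0, lo):
        mid = math.sqrt(lo * hi)                  # bisect in log space
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 1.0 / (0.5 * (lo + hi) + g)

x = 1.0
print(occupation(x, 1.0))   # Fermi-Dirac: 1/(e+1) ≈ 0.2689
print(occupation(x, 0.5))   # "semions": intermediate occupation
print(occupation(x, 0.0))   # Bose-Einstein: 1/(e-1) ≈ 0.5820
```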

  8. Positive-Operator Valued Measure (POVM) Quantization

    Directory of Open Access Journals (Sweden)

    Jean Pierre Gazeau

    2014-12-01

    Full Text Available We present a general formalism for giving a measure space paired with a separable Hilbert space a quantum version based on a normalized positive operator-valued measure. The latter are built from families of density operators labeled by points of the measure space. We especially focus on various probabilistic aspects of these constructions. Simple or more elaborate examples illustrate the procedure: circle, two-sphere, plane and half-plane. Links with Positive-Operator Valued Measure (POVM) quantum measurement and quantum statistical inference are sketched.
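
    The defining property of any POVM (positive elements summing to the identity, so outcome probabilities tr(E_k ρ) sum to one) can be illustrated with the standard qubit "trine" POVM, a generic textbook example not taken from the paper:

```python
import numpy as np

# Trine POVM on a qubit: three sub-normalized projectors onto states at 120
# degrees on the Bloch circle; they sum to the 2x2 identity (completeness)
angles = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
elements = []
for th in angles:
    psi = np.array([np.cos(th / 2), np.sin(th / 2)])
    elements.append((2.0 / 3.0) * np.outer(psi, psi))

total = sum(elements)
print(np.allclose(total, np.eye(2)))   # True: POVM completeness

# Outcome probabilities for a density operator rho: p_k = tr(E_k rho)
rho = np.array([[1, 0], [0, 0]], float)        # pure |0><0|
probs = [np.trace(E @ rho) for E in elements]
print(np.isclose(sum(probs), 1.0))             # True: probabilities sum to 1
```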

  9. The Canopy Graph and Level Statistics for Random Operators on Trees

    International Nuclear Information System (INIS)

    Aizenman, Michael; Warzel, Simone

    2006-01-01

    For operators with homogeneous disorder, it is generally expected that there is a relation between the spectral characteristics of a random operator in the infinite setup and the distribution of the energy gaps in its finite volume versions, in corresponding energy ranges. Whereas pure point spectrum of the infinite operator goes along with Poisson level statistics, it is expected that purely absolutely continuous spectrum would be associated with gap distributions resembling the corresponding random matrix ensemble. We prove that on regular rooted trees, which exhibit both spectral types, the eigenstate point process always has a Poissonian limit. However, we also find that this does not contradict the picture described above if that is carefully interpreted, as the relevant limit of finite trees is not the infinite homogeneous tree graph but rather a single-ended 'canopy graph.' For this tree graph, the random Schroedinger operator is proven here to have only pure-point spectrum at any strength of the disorder. For more general single-ended trees it is shown that the spectrum is always singular - pure point, possibly with a singular continuous component, which is proven to occur in some cases.

  10. Non-linearity consideration when analyzing reactor noise statistical characteristics. [BWR

    Energy Technology Data Exchange (ETDEWEB)

    Kebadze, B V; Adamovski, L A

    1975-06-01

    Statistical characteristics of boiling water reactor noise in the vicinity of the stability threshold are studied. The reactor is considered as a non-linear system affected by random perturbations. To solve the non-linear problem, the principle of statistical linearization is used. It is shown that the halfwidth of the resonance peak in the neutron power noise spectral density, as well as the reciprocal of the noise dispersion, which are used in predicting a stable operation threshold, are different from zero both within and beyond the stability boundary, the determination of which was based on linear criteria.
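
    Statistical linearization, as used above, replaces a nonlinearity f(x) by an equivalent gain k = E[x f(x)]/E[x^2] for a zero-mean Gaussian input; for f(x) = x^3 this gives k = 3σ^2. A Monte Carlo sketch of this textbook case (a generic illustration, not the reactor model itself):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.7
x = rng.normal(0.0, sigma, size=1_000_000)   # zero-mean Gaussian input

# Statistical linearization: equivalent gain k = E[x f(x)] / E[x^2]
f = x**3
k_mc = np.mean(x * f) / np.mean(x**2)

print(k_mc)              # Monte Carlo estimate of the equivalent gain
print(3 * sigma**2)      # analytic result for f(x) = x^3: k = 3 sigma^2
```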

  11. Spectral statistics of chaotic many-body systems

    International Nuclear Information System (INIS)

    Dubertrand, Rémy; Müller, Sebastian

    2016-01-01

    We derive a trace formula that expresses the level density of chaotic many-body systems as a smooth term plus a sum over contributions associated to solutions of the nonlinear Schrödinger (or Gross–Pitaevskii) equation. Our formula applies to bosonic systems with discretised positions, such as the Bose–Hubbard model, in the semiclassical limit as well as in the limit where the number of particles is taken to infinity. We use the trace formula to investigate the spectral statistics of these systems, by studying interference between solutions of the nonlinear Schrödinger equation. We show that in the limits taken the statistics of fully chaotic many-particle systems becomes universal and agrees with predictions from the Wigner–Dyson ensembles of random matrix theory. The conditions for Wigner–Dyson statistics involve a gap in the spectrum of the Frobenius–Perron operator, leaving the possibility of different statistics for systems with weaker chaotic properties. (paper)
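
    The Wigner–Dyson signature referred to above is level repulsion in the nearest-neighbor spacing distribution. A generic random-matrix sketch (a GOE sample with a crude unfolding, unrelated to the paper's Bose–Hubbard computations) showing that small spacings are strongly suppressed relative to Poisson statistics:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
a = rng.normal(size=(n, n))
goe = (a + a.T) / 2                 # GOE-distributed real symmetric matrix
ev = np.linalg.eigvalsh(goe)        # sorted eigenvalues

# Crude "unfolding": keep the central eigenvalues, where the spectral
# density is slowly varying, and normalize spacings to unit mean
central = ev[n // 4 : 3 * n // 4]
s = np.diff(central)
s /= s.mean()

# Level repulsion: P(s < 0.1) is ~0.008 for GOE vs ~0.095 for Poisson spacings
print(np.mean(s < 0.1))
```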

  12. Density fluctuation measurements via reflectometry on DIII-D during L- and H-mode operation

    International Nuclear Information System (INIS)

    Doyle, E.J.; Lehecka, T.; Luhmann, N.C. Jr.; Peebles, W.A.; Philipona, R.

    1990-01-01

    The unique ability of reflectometers to provide radial density fluctuation measurements with high spatial resolution (of the order of centimeters or less) is ideally suited to the study of the edge plasma modifications associated with H-mode operation. Consequently, attention has been focused on the study of these phenomena, since an improved understanding of the physics of H-mode plasmas is essential if a predictive capability for machine performance is to be developed. In addition, DIII-D is ideally suited for such studies since it is a major device noted for its robust H-mode operation and excellent basic plasma profile diagnostic information. The reflectometer system normally used for fluctuation studies is an O-mode, homodyne system utilizing 7 discrete channels spanning 15-75 GHz, with corresponding critical densities of 2.8×10^18 to 7×10^19 m^-3. The Gunn diode sources in this system are only narrowly tunable in frequency, so the critical densities are essentially fixed. An X-mode system, utilizing a frequency-tunable BWO source, has also been used to obtain fluctuation data and, in particular, to 'fill in the gaps' between the discrete O-mode channels. (author) 12 refs., 5 figs

  13. Electrical Conductivity of Charged Particle Systems and Zubarev's Nonequilibrium Statistical Operator Method

    Science.gov (United States)

    Röpke, G.

    2018-01-01

    One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.
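The linear-response route to the conductivity mentioned here is conventionally expressed through a current–current correlation (Green–Kubo) formula; the form below is the standard textbook expression, shown for orientation rather than as this paper's specific result:

```latex
\sigma_{\mathrm{dc}} \;=\; \frac{1}{3\,V\,k_B T}
\int_0^{\infty} \! dt \,
\left\langle \mathbf{J}(t)\cdot\mathbf{J}(0) \right\rangle_{\mathrm{eq}}
```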

  14. Summary of Key Operating Statistics: Data Collected from the 2009 Annual Institutional Report

    Science.gov (United States)

    Accrediting Council for Independent Colleges and Schools, 2010

    2010-01-01

    The Accrediting Council for Independent Colleges and Schools (ACICS) provides the Summary of Key Operating Statistics (KOS) as an annual review of the performance and key measurements of the more than 800 private post-secondary institutions we accredit. This edition of the KOS contains information based on the 2009 Annual Institutional Reports…

  15. Statistical Physics and Light-Front Quantization

    Energy Technology Data Exchange (ETDEWEB)

    Raufeisen, J

    2004-08-12

    Light-front quantization has important advantages for describing relativistic statistical systems, particularly systems for which boost invariance is essential, such as the fireball created in a heavy-ion collision. In this paper the authors develop light-front field theory at finite temperature and density, with special attention to quantum chromodynamics. They construct the most general form of the statistical operator allowed by the Poincaré algebra and show that there are no zero-mode related problems when describing phase transitions. They then demonstrate a direct connection between densities in light-front thermal field theory and the parton distributions measured in hard scattering experiments. The approach thus generalizes the concept of a parton distribution to finite temperature. In light-front quantization, the gauge-invariant Green's functions of a quark in a medium can be defined in terms of just 2-component spinors and have a much simpler spinor structure than the equal-time fermion propagator. From the Green's function, the authors introduce the new concept of a light-front density matrix, whose matrix elements are related to forward and to off-diagonal parton distributions. Furthermore, they explain how thermodynamic quantities can be calculated in discretized light-cone quantization, which is applicable at high chemical potential and is not plagued by fermion-doubling problems.

  16. Structural characterization and condition for measurement statistics preservation of a unital quantum operation

    International Nuclear Information System (INIS)

    Lee, Kai-Yan; Fung, Chi-Hang Fred; Chau, H F

    2013-01-01

    We investigate the necessary and sufficient condition for a convex cone of positive semidefinite operators to be fixed by a unital quantum operation ϕ acting on finite-dimensional quantum states. By reducing this problem to the problem of simultaneous diagonalization of the Kraus operators associated with ϕ, we can completely characterize the kinds of quantum states that are fixed by ϕ. Our work has several applications. It gives a simple proof of the structural characterization of a unital quantum operation that acts on finite-dimensional quantum states—a result not explicitly mentioned in earlier studies. It also provides a necessary and sufficient condition for determining what kind of measurement statistics is preserved by a unital quantum operation. Finally, our result clarifies and extends the work of Størmer by giving a proof of a reduction theorem on the unassisted and entanglement-assisted classical capacities, coherent information, and minimal output Renyi entropy of a unital channel acting on a finite-dimensional quantum state. (paper)

  17. Effect of low density H-mode operation on edge and divertor plasma parameters

    International Nuclear Information System (INIS)

    Maingi, R.; Mioduszewski, P.K.; Cuthbertson, J.W.

    1994-07-01

    We present a study of the impact of H-mode operation at low density on divertor plasma parameters on the DIII-D tokamak. The line-average density in H-mode was scanned by variation of the particle exhaust rate, using the recently installed divertor cryo-condensation pump. The maximum decrease (50%) in line-average electron density was accompanied by a factor of 2 increase in the edge electron temperature, and 10% and 20% reductions in the measured core and divertor radiated power, respectively. The measured total power to the inboard divertor target increased by a factor of 3, with the major contribution coming from a factor of 5 increase in the peak heat flux very close to the inner strike point. The measured increase in power at the inboard divertor target was approximately equal to the measured decrease in core and divertor radiation

  18. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps.

    Science.gov (United States)

    Pittiglio, Claudia; Khomenko, Sergei; Beltran-Alcrudo, Daniel

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine scale environmental variables. 1) First a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e. subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled at 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each different step using input as well as independent data. 
We discuss advantages and limits of the method and its

  19. Electron density and temperature in NIO1 RF source operated in oxygen and argon

    Science.gov (United States)

    Barbisan, M.; Zaniol, B.; Cavenago, M.; Pasqualotto, R.; Serianni, G.; Zanini, M.

    2017-08-01

    The NIO1 experiment, built and operated at Consorzio RFX, hosts an RF negative ion source from which it is possible to produce a beam of up to 130 mA of H- ions, accelerated up to 60 kV. For the preliminary tests of the extraction system the source has been operated in oxygen, whose high electronegativity allows useful levels of extracted beam current to be reached. The efficiency of negative ion extraction is strongly influenced by the electron density and temperature close to the Plasma Grid, i.e. the grid of the acceleration system which faces the source. To support the tests, these parameters have been measured by means of the Optical Emission Spectroscopy diagnostic. This technique has involved the use of an oxygen-argon mixture to produce the plasma in the source. The intensities of specific Ar I and Ar II lines have been measured along lines of sight close to the Plasma Grid and have been interpreted with the ADAS package to obtain the desired information. This work describes the diagnostic hardware, the analysis method and the measured values of electron density and temperature as a function of the main source parameters (RF power, pressure, bias voltage and magnetic filter field). The main results show that not only the electron density but also the electron temperature increases with RF power; both decrease with increasing magnetic filter field. Variations of source pressure and plasma grid bias voltage appear to affect only the electron temperature and the electron density, respectively.

  20. Emergence of quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C

    2009-01-01

    The conceptual setting of quantum mechanics is subject to an ongoing debate from its beginnings until now. The consequences of the apparent differences between quantum statistics and classical statistics range from philosophical interpretations to practical issues such as quantum computing. In this note we demonstrate how quantum mechanics can emerge from classical statistical systems. We discuss conditions and circumstances for this to happen. Quantum systems describe isolated subsystems of classical statistical systems with infinitely many states. While infinitely many classical observables 'measure' properties of the subsystem and its environment, the state of the subsystem can be characterized by the expectation values of only a few probabilistic observables. They define a density matrix, and all the usual laws of quantum mechanics follow. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. In particular, we show how the non-commuting properties of quantum operators are associated with the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem.
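The claim that a few expectation values suffice to characterize the subsystem state can be illustrated for a single qubit, where the expectation values r_i of the three Pauli observables fix the density matrix via ρ = (I + Σᵢ rᵢσᵢ)/2 (a generic textbook illustration, not the paper's specific construction):

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def density_matrix(rx: float, ry: float, rz: float) -> np.ndarray:
    """Build rho from the three expectation values <sigma_i> = r_i."""
    return 0.5 * (I2 + rx * SX + ry * SY + rz * SZ)

# |r| = 1 gives a pure state on the Bloch sphere
rho = density_matrix(0.0, 0.6, 0.8)
print(np.trace(rho).real)                 # unit trace
print(np.allclose(rho, rho.conj().T))     # hermitian
```

The three numbers (r_x, r_y, r_z) play exactly the role of the "few probabilistic observables" in the abstract: they determine ρ completely.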

  1. On a decomposition theorem for density operators of a pure quantum state

    International Nuclear Information System (INIS)

    Giannoni, M.J.

    1979-03-01

    Conditions for the existence of a decomposition of a hermitian projector ρ into two hermitian and time-reversal invariant operators ρ0 and χ of the form ρ = e^(iχ) ρ0 e^(-iχ) are investigated. Sufficient conditions are given, and an explicit construction of a decomposition is performed when they are fulfilled. A stronger theorem of existence and uniqueness is studied. All the proofs are valid for any p-body reduced density operator of a pure state of a system of bosons as well as fermions. The decomposition studied in this work has already been used in Nuclear Physics, and may be of interest in other fields of Physics

  2. Effective pile-up density as a measure of the experimental data quality for High-Luminosity LHC operational scenarios.

    CERN Document Server

    Medina Medrano, Luis Eduardo; Arduini, Gianluigi; Napsuciale, Mauro

    2018-01-01

    The High-Luminosity LHC (HL-LHC) experiments will operate at an unprecedented level of event pile-up from proton-proton collisions at 14 TeV center-of-mass energy. In this paper we study the performance of the baseline and a series of alternative scenarios in terms of the delivered integrated luminosity and its quality (pile-up density). A new figure-of-merit is introduced, the effective pile-up density, a concept that reflects the expected detector efficiency in the reconstruction of event vertices for a given operational scenario, acting as a link between the machine and experimental sides. Alternative scenarios have been proposed either to improve the baseline performance, or to provide operational schemes in the case of particular limitations. Simulations of the evolution of optimum fills with the latest set of parameters of the HL-LHC are performed with β*-levelling, and results are discussed in terms of both the integrated luminosity and the effective pile-up density. The crab kissing scheme, a propose...

  3. Global statistics of liquid water content and effective number density of water clouds over ocean derived from combined CALIPSO and MODIS measurements

    Science.gov (United States)

    Hu, Y.; Vaughan, M.; McClain, C.; Behrenfeld, M.; Maring, H.; Anderson, D.; Sun-Mack, S.; Flittner, D.; Huang, J.; Wielicki, B.; Minnis, P.; Weimer, C.; Trepte, C.; Kuehn, R.

    2007-03-01

    This study presents an empirical relation that links layer integrated depolarization ratios, the extinction coefficients, and effective radii of water clouds, based on Monte Carlo simulations of CALIPSO lidar observations. Combined with cloud effective radius retrieved from MODIS, cloud liquid water content and effective number density of water clouds are estimated from CALIPSO lidar depolarization measurements in this study. Global statistics of the cloud liquid water content and effective number density are presented.

  4. Energy-density field approach for low- and medium-frequency vibroacoustic analysis of complex structures using a statistical computational model

    Science.gov (United States)

    Kassem, M.; Soize, C.; Gagliardini, L.

    2009-06-01

    In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.

  5. Image Denoising via Bayesian Estimation of Statistical Parameter Using Generalized Gamma Density Prior in Gaussian Noise Model

    Science.gov (United States)

    Kittisuwan, Pichid

    2015-03-01

    The application of image processing in industry has shown remarkable success over the last decade, for example, in security and telecommunication systems. The denoising of natural image corrupted by Gaussian noise is a classical problem in image processing. So, image denoising is an indispensable step during image processing. This paper is concerned with dual-tree complex wavelet-based image denoising using Bayesian techniques. One of the cruxes of the Bayesian image denoising algorithms is to estimate the statistical parameter of the image. Here, we employ maximum a posteriori (MAP) estimation to calculate local observed variance with generalized Gamma density prior for local observed variance and Laplacian or Gaussian distribution for noisy wavelet coefficients. Evidently, our selection of prior distribution is motivated by efficient and flexible properties of generalized Gamma density. The experimental results show that the proposed method yields good denoising results.
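The shrinkage idea behind such Bayesian wavelet denoisers can be illustrated with the simplest Gaussian special case, where a noisy coefficient y is scaled toward zero by x̂ = σ_x²/(σ_x² + σ_n²)·y. The paper's generalized-Gamma MAP rule is more elaborate; this is a generic stand-in showing the mechanism:

```python
def gaussian_shrink(y: float, sig_x2: float, sig_n2: float) -> float:
    """MMSE shrinkage of a noisy wavelet coefficient y, assuming a
    zero-mean Gaussian signal prior (variance sig_x2) and additive
    Gaussian noise (variance sig_n2)."""
    return sig_x2 / (sig_x2 + sig_n2) * y

# Large estimated signal variance -> coefficient mostly kept;
# small estimated signal variance -> coefficient shrunk toward zero.
print(gaussian_shrink(4.0, 9.0, 1.0))   # 3.6
print(gaussian_shrink(4.0, 0.25, 1.0))  # 0.8
```

Estimating the local variance sig_x2 from the data, via MAP with a prior as in the paper, is what distinguishes one Bayesian denoiser from another.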

  6. Treatment of automotive industry oily wastewater by electrocoagulation: statistical optimization of the operational parameters.

    Science.gov (United States)

    GilPavas, Edison; Molina-Tirado, Kevin; Gómez-García, Miguel Angel

    2009-01-01

    An electrocoagulation process was used for the treatment of oily wastewater generated by an automotive industry in Medellín (Colombia). An electrochemical cell consisting of four parallel electrodes (Fe and Al) in bipolar configuration was implemented. A multifactorial experimental design was used for evaluating the influence of several parameters, including: type and arrangement of electrodes, pH, and current density. Oil and grease removal was defined as the response variable for the statistical analysis. Additionally, the BOD(5), COD, and TOC were monitored during the treatment process. According to the results, at the optimum parameter values (current density = 4.3 mA/cm(2), distance between electrodes = 1.5 cm, Fe as anode, and pH = 12) it was possible to reach ca. 95% oil and grease removal, with COD removal and mineralization of 87.4% and 70.6%, respectively. A final biodegradability (BOD(5)/COD) of 0.54 was reached.

  7. A statistical model for estimation of fish density including correlation in size, space, time and between species from research survey data

    DEFF Research Database (Denmark)

    Nielsen, J. Rasmus; Kristensen, Kasper; Lewy, Peter

    2014-01-01

    Trawl survey data with high spatial and seasonal coverage were analysed using a variant of the Log Gaussian Cox Process (LGCP) statistical model to estimate unbiased relative fish densities. The model estimates correlations between observations according to time, space, and fish size and includes...

  8. Marine Traffic Density Over Port Klang, Malaysia Using Statistical Analysis of AIS Data: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Masnawi MUSTAFFA

    2016-12-01

    Full Text Available Port Klang, Malaysia is the 13th busiest port in the world and the busiest port in Malaysia; its capacity is expected to be able to meet demand until 2018. Although statistics published by the Port Klang Authority show that many ships use this port, those numbers count only ships entering Port Klang, and no study has investigated how dense the traffic is in Port Klang and the surrounding sea, including the Strait of Malacca. This paper investigates traffic density over Port Klang and its surrounding sea using statistical analysis of AIS data. As a preliminary study, AIS data were collected for 7 days only, to represent a typical week of daily traffic. From these data, hourly and daily numbers of vessels, vessel classifications and sizes, and traffic paths are plotted.
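The hourly vessel counting described here can be sketched as a simple binning pass over AIS position reports (the record layout below is hypothetical; real AIS messages carry an MMSI identifier and a timestamp, among other fields):

```python
from collections import Counter
from datetime import datetime

# Hypothetical minimal AIS records: (mmsi, iso_timestamp)
records = [
    ("533000001", "2016-06-01T00:15:00"),
    ("533000002", "2016-06-01T00:40:00"),
    ("533000001", "2016-06-01T01:05:00"),
    ("533000003", "2016-06-01T01:50:00"),
]

# Count distinct vessels per hour: a vessel reporting several times
# within the same hour is counted once.
per_hour = Counter()
seen = set()
for mmsi, ts in records:
    hour = datetime.fromisoformat(ts).replace(minute=0, second=0)
    if (mmsi, hour) not in seen:
        seen.add((mmsi, hour))
        per_hour[hour] += 1

for hour, n in sorted(per_hour.items()):
    print(hour.isoformat(), n)
```

Daily counts and per-class breakdowns follow the same pattern with a coarser bin key.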

  9. Radiation transport in statistically inhomogeneous rocks

    International Nuclear Information System (INIS)

    Lukhminskij, B.E.

    1975-01-01

    A study has been made of radiation transfer in statistically inhomogeneous rocks. Account has been taken of the statistical character of rock composition through randomization of density. Formulas are summarized for sigma-distribution, homogeneous density, the Simpson and Cauchy distributions. Consideration is given to the statistics of mean square ranges in a medium, simulated by the jump Markov random function. A quantitative criterion of rock heterogeneity is proposed

  10. Scalar and configuration traces of operators in large spectroscopic spaces

    International Nuclear Information System (INIS)

    Chang, B.D.; Wong, S.S.M.

    1978-01-01

    In statistical spectroscopic calculations, the primary input is the trace of products of powers of Hamiltonian and excitation operators. The lack of a systematic approach to trace evaluation has in the past been one of the major difficulties in applying statistical spectroscopic methods. A general method with a simple derivation is described here to evaluate the scalar and configuration traces for operators expressed either in the m-scheme or in the fully coupled JT scheme. It is shown to be an effective method by actually programming it on a computer. Implications for future applications of statistical spectroscopy in the areas of level density, strength functions and perturbation theory are also briefly discussed. (Auth.)
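The central role of traces of Hamiltonian powers can be shown numerically: for a Hamiltonian matrix H in a space of dimension d, the p-th spectral moment is tr(H^p)/d, and the first two moments already fix the centroid and width of the level distribution used in statistical spectroscopy. This is a generic numerical sketch with a random matrix, not the authors' coupled-scheme method:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random real symmetric "Hamiltonian" in a d-dimensional space
d = 200
A = rng.standard_normal((d, d))
H = (A + A.T) / 2.0

# Moments from traces, without any diagonalisation
m1 = np.trace(H) / d              # centroid
m2 = np.trace(H @ H) / d          # second moment
width = np.sqrt(m2 - m1**2)       # spectral width

# Cross-check against the eigenvalues themselves
eig = np.linalg.eigvalsh(H)
assert np.isclose(m1, eig.mean())
assert np.isclose(width, eig.std())
print(round(float(width), 3))
```

The point of trace methods is that the left-hand quantities can be computed directly from the operator representation, avoiding the diagonalisation used here only as a check.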

  11. [Development of a software standardizing optical density with operation settings related to several limitations].

    Science.gov (United States)

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop software that standardizes optical density and normalizes the procedures and results of standardization, in order to effectively solve several problems generated during standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to solve the problems that one might encounter during standardization. Matlab GUI was used as the development tool. The software was tested with the results of the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (WINDOWS XP/WIN 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effects of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is professional software based on I-STOD. It can be easily operated and can effectively standardize the test results of indirect ELISA.

  12. Statistical implications in Monte Carlo depletions - 051

    International Nuclear Information System (INIS)

    Zhiwen, Xu; Rhodes, J.; Smith, K.

    2010-01-01

    As a result of steady advances in computer power, continuous-energy Monte Carlo depletion analysis is attracting considerable attention for reactor burnup calculations. The typical Monte Carlo analysis is set up as a combination of a Monte Carlo neutron transport solver and a fuel burnup solver. Note that the burnup solver is a deterministic module. The statistical errors in Monte Carlo solutions are introduced into nuclide number densities and propagated along fuel burnup. This paper works towards an understanding of the statistical implications of Monte Carlo depletion, including both statistical bias and statistical variations in depleted fuel number densities. The deterministic Studsvik lattice physics code, CASMO-5, is modified to model the Monte Carlo depletion. The statistical bias in depleted number densities is found to be negligible compared to the statistical variations, which, in turn, demonstrates the correctness of the Monte Carlo depletion method. Meanwhile, the statistical variation in number densities generally increases with burnup. Several possible ways of reducing the statistical errors are discussed: 1) increasing the number of individual Monte Carlo histories; 2) increasing the number of time steps; 3) running additional independent Monte Carlo depletion cases. Finally, a new Monte Carlo depletion methodology, called the batch depletion method, is proposed, which consists of performing a set of independent Monte Carlo depletions and is thus capable of estimating the overall statistical errors, including both the local statistical error and the propagated statistical error. (authors)
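The batch depletion method amounts to running N independent depletions and estimating the overall statistical error from the spread of the batch results. A minimal sketch of that estimate, with illustrative numbers rather than actual CASMO-5 output:

```python
import math

# Hypothetical end-of-cycle number densities of one nuclide from
# N independent Monte Carlo depletion runs (atoms/barn-cm)
batches = [7.02e-4, 6.97e-4, 7.05e-4, 6.99e-4, 7.01e-4]

n = len(batches)
mean = sum(batches) / n
# Sample variance (ddof = 1) over independent batches captures both
# the local statistical error and the error propagated through burnup
var = sum((x - mean) ** 2 for x in batches) / (n - 1)
std_err = math.sqrt(var / n)

print(f"mean = {mean:.4e}, standard error = {std_err:.2e}")
```

Because the batches are statistically independent, no covariance terms between time steps need to be tracked explicitly; the spread of the final densities contains them all.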

  13. Infrared thermography for wood density estimation

    Science.gov (United States)

    López, Gamaliel; Basterra, Luis-Alfonso; Acuña, Luis

    2018-03-01

    Infrared thermography (IRT) is becoming a commonly used technique to non-destructively inspect and evaluate wood structures. Based on the radiation emitted by all objects, this technique enables the remote visualization of the surface temperature without making contact using a thermographic device. The process of transforming radiant energy into temperature depends on many parameters, and interpreting the results is usually complicated. However, some works have analyzed the operation of IRT and expanded its applications, as found in the latest literature. This work analyzes the effect of density on the thermodynamic behavior of timber to be determined by IRT. The cooling of various wood samples has been registered, and a statistical procedure that enables one to quantitatively estimate the density of timber has been designed. This procedure represents a new method to physically characterize this material.

  14. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
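The equilibrium carrier statistics such a book develops can be illustrated in the non-degenerate (Boltzmann) limit, n = N_c exp(-(E_c - E_F)/kT), with the effective density of states N_c = 2(2π m* kT/h²)^(3/2). The sketch below uses silicon-like parameters purely for illustration:

```python
import math

K_B = 1.380649e-23     # Boltzmann constant, J/K
H = 6.62607015e-34     # Planck constant, J*s
M0 = 9.1093837015e-31  # electron rest mass, kg
Q = 1.602176634e-19    # elementary charge, C (eV -> J)

def effective_dos(m_eff: float, T: float) -> float:
    """Effective conduction-band density of states N_c in m^-3."""
    return 2.0 * (2.0 * math.pi * m_eff * K_B * T / H**2) ** 1.5

def carrier_density(m_eff: float, T: float, ec_minus_ef_ev: float) -> float:
    """Non-degenerate electron density for E_c - E_F given in eV."""
    return effective_dos(m_eff, T) * math.exp(-ec_minus_ef_ev * Q / (K_B * T))

# Silicon-like density-of-states mass at room temperature
nc = effective_dos(1.08 * M0, 300.0)   # roughly 2.8e25 m^-3 (2.8e19 cm^-3)
print(f"{nc:.2e}")
```

For degenerate material the exponential must be replaced by the Fermi–Dirac integral F_{1/2}, which is the regime where the full statistics of the book are needed.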

  15. Statistical analysis of field data for aircraft warranties

    Science.gov (United States)

    Lakey, Mary J.

    Air Force and Navy maintenance data collection systems were researched to determine their scientific applicability to the warranty process. New and unique algorithms were developed to extract failure distributions, which were then used to characterize how selected families of equipment typically fail. Families of similar equipment were identified in terms of function, technology and failure patterns. Statistical analyses and applications such as goodness-of-fit tests, maximum likelihood estimation and derivation of confidence intervals for the probability density function parameters were applied to characterize the distributions and their failure patterns. Statistical and reliability theory, with relevance to equipment design and operational failures, were also determining factors in characterizing the failure patterns of the equipment families. Inferences about the families with relevance to warranty needs were then made.

  16. Matrix product operators, matrix product states, and ab initio density matrix renormalization group algorithms

    Science.gov (United States)

    Chan, Garnet Kin-Lic; Keselman, Anna; Nakatani, Naoki; Li, Zhendong; White, Steven R.

    2016-07-01

    Current descriptions of the ab initio density matrix renormalization group (DMRG) algorithm use two superficially different languages: an older language of the renormalization group and renormalized operators, and a more recent language of matrix product states and matrix product operators. The same algorithm can appear dramatically different when written in the two different vocabularies. In this work, we carefully describe the translation between the two languages in several contexts. First, we describe how to efficiently implement the ab initio DMRG sweep using a matrix product operator based code, and the equivalence to the original renormalized operator implementation. Next we describe how to implement the general matrix product operator/matrix product state algebra within a pure renormalized operator-based DMRG code. Finally, we discuss two improvements of the ab initio DMRG sweep algorithm motivated by matrix product operator language: Hamiltonian compression, and a sum over operators representation that allows for perfect computational parallelism. The connections and correspondences described here serve to link the future developments with the past and are important in the efficient implementation of continuing advances in ab initio DMRG and related algorithms.

  17. Nuclear level density

    International Nuclear Information System (INIS)

    Cardoso Junior, J.L.

    1982-10-01

    Experimental data show that the number of nuclear states increases rapidly with increasing excitation energy. The properties of highly excited nuclei are important for many nuclear reactions, mainly those that proceed via processes of the compound nucleus type. In this case, it is sufficient to know the statistical properties of the nuclear levels. The first of these is the nuclear level density function. Several theoretical models which describe the level density are presented. Statistical mechanics and quantum mechanics formalisms, as well as semi-empirical results, are analysed and discussed. (Author) [pt
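The rapid growth of the level count with excitation energy is captured by the classic Fermi-gas (Bethe) estimate, ρ(U) ≈ (√π/12) e^(2√(aU)) / (a^(1/4) U^(5/4)). A sketch with an illustrative level-density parameter a (not tied to any specific nucleus discussed in the work):

```python
import math

def fermi_gas_level_density(U: float, a: float) -> float:
    """Bethe level density (levels per MeV) at excitation energy U (MeV),
    with level-density parameter a (MeV^-1)."""
    return (math.sqrt(math.pi) / 12.0) \
        * math.exp(2.0 * math.sqrt(a * U)) \
        / (a ** 0.25 * U ** 1.25)

# Illustrative value, roughly a ~ A/8 MeV^-1 for a mid-mass nucleus
a = 18.0
densities = [fermi_gas_level_density(U, a) for U in (2.0, 5.0, 10.0, 20.0)]
for U, rho in zip((2.0, 5.0, 10.0, 20.0), densities):
    print(U, f"{rho:.3e}")
```

The exponential dependence on √U is why level-by-level bookkeeping becomes hopeless at high excitation and statistical descriptions take over.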

  18. Refilling process in the plasmasphere: a 3-D statistical characterization based on Cluster density observations

    Directory of Open Access Journals (Sweden)

    G. Lointier

    2013-02-01

    Full Text Available The Cluster mission offers an excellent opportunity to investigate the evolution of the plasma population in a large part of the inner magnetosphere, explored near its orbit's perigee, over a complete solar cycle. The WHISPER sounder, on board each satellite of the mission, is particularly suitable for studying the electron density in this region, between 0.2 and 80 cm−3. Compiling WHISPER observations during 1339 perigee passes distributed over more than three years of the Cluster mission, we present first results of a statistical analysis dedicated to the study of the electron density morphology and dynamics along and across magnetic field lines between L = 2 and L = 10. In this study, we examine a specific topic: the refilling of the plasmasphere and trough regions during extended periods of quiet magnetic conditions. To do so, we survey the evolution of the ap index during the days preceding each perigee crossing and sort electron density profiles along the orbit into three classes, namely after respectively less than 2 days, between 2 and 4 days, and more than 4 days of quiet magnetic conditions (ap ≤ 15 nT) following an active episode (ap > 15 nT). This leads to three independent data subsets. Comparisons between density distributions in the 3-D plasmasphere and trough regions at the three stages of quiet magnetosphere provide novel views about the distribution of matter inside the inner magnetosphere during several days of low activity. Clear signatures of a refilling process inside an expanding plasmasphere in formation are noted. A plasmapause-like boundary, at L ~ 6 for all MLT sectors, is formed after 3 to 4 days and expands somewhat further after that. In the outer part of the plasmasphere (L ~ 8), latitudinal profiles of median density values vary essentially according to the MLT sector considered rather than according to the refilling duration. The shape of these density profiles indicates that magnetic flux tubes are not

  19. A new electron density model of the plasmasphere for operational applications and services

    Science.gov (United States)

    Jakowski, Norbert; Hoque, Mohammed Mainul

    2018-03-01

    The Earth's plasmasphere contributes essentially to total electron content (TEC) measurements from ground or satellite platforms. Furthermore, as an integral part of space weather, associated plasmaspheric phenomena must be addressed in conjunction with ionosphere weather monitoring by operational space weather services. For supporting space weather services and mitigating propagation errors in Global Navigation Satellite Systems (GNSS) applications we have developed the empirical Neustrelitz plasmasphere model (NPSM). The model consists of an upper L-shell dependent part and a lower altitude dependent part, both described by specific exponential decays. Here the McIlwain parameter L defines the geomagnetic field lines in a centered dipole model of the geomagnetic field. The coefficients of the developed approaches are successfully fitted to numerous electron density data derived from dual-frequency GPS measurements on board the CHAMP satellite mission from 2000 to 2005. The data are utilized for fitting up to the L shell L = 3 because a previous validation has shown good agreement with IMAGE/RPI measurements up to this value. Using the solar radio flux index F10.7 as the only external parameter, the model is robust in operation and, with 40 coefficients, fast and sufficiently accurate to be used as a background model for estimating TEC or electron density profiles in near real time GNSS applications and services. In addition, the model approach is sensitive to ionospheric coupling, resulting in anomalies such as the Nighttime Winter Anomaly and the related Mid-Summer Nighttime Anomaly, and even shows a slight compression of the dayside plasmasphere due to solar wind pressure. Modelled electron density and TEC values agree with estimates reported in the literature in similar cases.

  20. Advanced calibration, adjustment, and operation of a density and sound speed analyzer

    International Nuclear Information System (INIS)

    Fortin, Tara J.; Laesecke, Arno; Freund, Malte; Outcalt, Stephanie

    2013-01-01

    Highlights: ► Detail important considerations for reference-quality measurements of thermophysical property data with benchtop instruments. ► Density and speed of sound of isooctane and speed of sound of toluene at (278 K to 343 K) and atmospheric pressure. ► Experimental data compared to available literature data and equations of state. - Abstract: Benchtop measurement systems have emerged as powerful tools in the ongoing quest for thermophysical property data. We demonstrate that these instruments can yield results of high quality if operated in an informed manner. The importance of sample purity, of reproducibility over repeatability, of expanded calibration and adjustment protocols, and of rigorous uncertainty estimates is emphasized. We report measurement results at ambient atmospheric pressure and temperatures from 343 K to 278 K, including expanded uncertainty estimates, for the density and speed of sound of isooctane and for the speed of sound of toluene. These data are useful for validating the performance of such instruments.

  1. The density of states for almost periodic Schroedinger operators and the frequency module: a counter-example

    International Nuclear Information System (INIS)

    Bellissard, J.

    1981-07-01

    We exhibit an example of a one-dimensional discrete Schroedinger operator with an almost periodic potential for which the steps of the density of states do not belong to the frequency module. This example is suggested by K-theory

  2. Density estimation by maximum quantum entropy

    International Nuclear Information System (INIS)

    Silver, R.N.; Wallstrom, T.; Martz, H.F.

    1993-01-01

    A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets
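    As a point of comparison, the classical maximum-entropy construction that the quantum method generalizes can be sketched quickly: on a finite support with a single moment constraint, the entropy-maximizing distribution takes a Gibbs (exponential) form whose parameter is fixed by the constraint. The support and target mean below are arbitrary illustrative choices, and this classical warm-up is not the paper's quantum-entropy estimator.

```python
import math

def maxent_mean(values, target_mean, tol=1e-10):
    """Maximum-entropy distribution on a finite support with a fixed mean.

    The solution has the Gibbs form p_i proportional to exp(-beta * x_i);
    beta is found by bisection, using that the mean decreases in beta.
    """
    def mean(beta):
        w = [math.exp(-beta * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    lo, hi = -50.0, 50.0                     # bracket: mean(lo) > target > mean(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) > target_mean:
            lo = mid                          # need a larger beta to lower the mean
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# Entropy-maximizing distribution on {0,...,5} with mean 1.5.
p = maxent_mean(list(range(6)), target_mean=1.5)
```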

  3. Demonstration of statistical approaches to identify component's ageing by operational data analysis-A case study for the ageing PSA network

    International Nuclear Information System (INIS)

    Rodionov, Andrei; Atwood, Corwin L.; Kirchsteiger, Christian; Patrik, Milan

    2008-01-01

    The paper presents some results of a case study on the 'Demonstration of statistical approaches to identify component ageing by operational data analysis', carried out in the framework of the EC JRC Ageing PSA Network. Several techniques (visual evaluation, and nonparametric and parametric hypothesis tests) were proposed and applied in order to demonstrate the capabilities, advantages and limitations of statistical approaches for identifying component ageing from operational data. Engineering considerations are outside the scope of the present study
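    The abstract does not spell out the individual tests, but a standard parametric choice for detecting ageing in failure data of this kind is the Laplace trend test, sketched below on invented failure times (the observation window and data are hypothetical, for illustration only).

```python
import math

def laplace_trend_test(failure_times, t_obs):
    """Laplace (centroid) test statistic for trend in a failure point process.

    Under a homogeneous Poisson process (no ageing) the statistic is
    approximately standard normal; large positive values indicate an
    increasing failure rate, i.e. ageing.
    """
    n = len(failure_times)
    centroid = sum(failure_times) / n
    return (centroid - t_obs / 2.0) / (t_obs * math.sqrt(1.0 / (12.0 * n)))

# Failures clustering late in the observation window suggest ageing.
u_ageing = laplace_trend_test([60, 70, 80, 85, 90, 95], t_obs=100.0)
# Evenly spread failures give no evidence of a trend.
u_flat = laplace_trend_test([10, 30, 50, 70, 90], t_obs=100.0)
```

Comparing the statistic against a normal quantile (e.g. 1.645 for a one-sided 5% test) gives the accept/reject decision.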

  4. Extension of electron cyclotron heating at ASDEX Upgrade with respect to high density operation

    Directory of Open Access Journals (Sweden)

    Schubert Martin

    2017-01-01

    Full Text Available The ASDEX Upgrade electron cyclotron resonance heating system operates at 105 GHz and 140 GHz with flexible launching geometry and polarization. In 2016, four gyrotrons with 10 s pulse length and output power close to 1 MW per unit were available. The system is presently being extended to a total of eight similar units. High heating power combined with high plasma density operation will be part of the future ASDEX Upgrade experiment programme. For the electron cyclotron resonance heating, an O-2 mode scheme is proposed, which is compatible with the expected high plasma densities. It may, however, suffer from incomplete single-pass absorption. The situation can be improved significantly by installing holographic mirrors on the inner column, which allow a second pass of the unabsorbed fraction of the millimetre-wave beam. Since the beam path in the plasma is subject to refraction, the beam position on the holographic mirror has to be controlled; thermocouples built into the mirror surface are used for this purpose. As a protective measure, the tiles of the heat shield on the inner column were modified to increase the shielding against unabsorbed millimetre-wave power.

  5. Whole brain analysis of postmortem density changes of grey and white matter on computed tomography by statistical parametric mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nishiyama, Yuichi; Mori, Hiroshi; Katsube, Takashi; Kitagaki, Hajime [Shimane University Faculty of Medicine, Department of Radiology, Izumo-shi, Shimane (Japan); Kanayama, Hidekazu; Tada, Keiji; Yamamoto, Yasushi [Shimane University Hospital, Department of Radiology, Izumo-shi, Shimane (Japan); Takeshita, Haruo [Shimane University Faculty of Medicine, Department of Legal Medicine, Izumo-shi, Shimane (Japan); Kawakami, Kazunori [Fujifilm RI Pharma, Co., Ltd., Tokyo (Japan)

    2017-06-15

    This study examined the usefulness of statistical parametric mapping (SPM) for investigating postmortem changes on brain computed tomography (CT). This retrospective study included 128 patients (23 - 100 years old) without cerebral abnormalities who underwent unenhanced brain CT before and after death. The antemortem CT (AMCT) scans and postmortem CT (PMCT) scans were spatially normalized using our original brain CT template, and postmortem changes of CT values (in Hounsfield units; HU) were analysed by the SPM technique. Compared with AMCT scans, 58.6 % and 98.4 % of PMCT scans showed loss of the cerebral sulci and an unclear grey matter (GM)-white matter (WM) interface, respectively. SPM analysis revealed a significant decrease in cortical GM density within 70 min after death on PMCT scans, suggesting cytotoxic brain oedema. Furthermore, there was a significant increase in the density of the WM, lenticular nucleus and thalamus more than 120 min after death. The SPM technique demonstrated typical postmortem changes on brain CT scans, and revealed that the unclear GM-WM interface on early PMCT scans is caused by a rapid decrease in cortical GM density combined with a delayed increase in WM density. SPM may be useful for assessment of whole brain postmortem changes. (orig.)

  6. Whole brain analysis of postmortem density changes of grey and white matter on computed tomography by statistical parametric mapping

    International Nuclear Information System (INIS)

    Nishiyama, Yuichi; Mori, Hiroshi; Katsube, Takashi; Kitagaki, Hajime; Kanayama, Hidekazu; Tada, Keiji; Yamamoto, Yasushi; Takeshita, Haruo; Kawakami, Kazunori

    2017-01-01

    This study examined the usefulness of statistical parametric mapping (SPM) for investigating postmortem changes on brain computed tomography (CT). This retrospective study included 128 patients (23 - 100 years old) without cerebral abnormalities who underwent unenhanced brain CT before and after death. The antemortem CT (AMCT) scans and postmortem CT (PMCT) scans were spatially normalized using our original brain CT template, and postmortem changes of CT values (in Hounsfield units; HU) were analysed by the SPM technique. Compared with AMCT scans, 58.6 % and 98.4 % of PMCT scans showed loss of the cerebral sulci and an unclear grey matter (GM)-white matter (WM) interface, respectively. SPM analysis revealed a significant decrease in cortical GM density within 70 min after death on PMCT scans, suggesting cytotoxic brain oedema. Furthermore, there was a significant increase in the density of the WM, lenticular nucleus and thalamus more than 120 min after death. The SPM technique demonstrated typical postmortem changes on brain CT scans, and revealed that the unclear GM-WM interface on early PMCT scans is caused by a rapid decrease in cortical GM density combined with a delayed increase in WM density. SPM may be useful for assessment of whole brain postmortem changes. (orig.)

  7. Statistical modeling of Earth's plasmasphere

    Science.gov (United States)

    Veibell, Victoir

    The behavior of plasma near Earth's geosynchronous orbit is of vital importance to both satellite operators and magnetosphere modelers because it has a significant influence on energy transport, ion composition, and induced currents. The system is highly complex in both time and space, making the forecasting of extreme space weather events difficult. This dissertation examines the behavior and statistical properties of plasma mass density near geosynchronous orbit by using both linear and nonlinear models, as well as epoch analyses, in an attempt to better understand the physical processes that precipitate and drive its variations. It is shown that while equatorial mass density does vary significantly on an hourly timescale when a drop in the disturbance storm time index (Dst) is observed, it does not vary significantly between the day of a Dst event onset and the day immediately following. It is also shown that increases in equatorial mass density were not, on average, preceded or followed by any significant change in the examined solar wind or geomagnetic variables, including Dst, despite prior results that considered a few selected events and found a notable influence. It is verified that equatorial mass density and solar activity, via the F10.7 index, have a strong correlation, which is stronger over longer timescales such as 27 days than over an hourly timescale. It is then shown that this connection affects the behavior of equatorial mass density most during periods of strong solar activity, leading to large mass density reactions to Dst drops for high values of F10.7. It is also shown that equatorial mass density behaves differently before and after events depending on the value of F10.7 at the onset of an equatorial mass density event or a Dst event, and that a southward interplanetary magnetic field at onset leads to slowed mass density growth after event onset.
    These behavioral differences provide insight into how solar and geomagnetic
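    A minimal sketch of the timescale-dependent correlation described above, using synthetic data in place of the dissertation's measurements: a slow, F10.7-like driver plus fast noise correlates only moderately hour by hour, and much more strongly after daily averaging.

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(27 * 24 * 10)                        # ten 27-day rotations, hourly
driver = np.sin(2 * np.pi * hours / (27 * 24))         # slow F10.7-like signal
density = driver + rng.normal(0.0, 1.0, hours.size)    # density = signal + fast noise

def corr(a, b):
    """Pearson correlation coefficient."""
    return float(np.corrcoef(a, b)[0, 1])

def daily_means(x):
    """Block-average an hourly series to daily resolution."""
    return x.reshape(-1, 24).mean(axis=1)

corr_hourly = corr(driver, density)
# Averaging suppresses the fast noise, so the correlation with the
# slow driver strengthens over the longer timescale.
corr_daily = corr(daily_means(driver), daily_means(density))
```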

  8. SEGMENTATION AND CLASSIFICATION OF CERVICAL CYTOLOGY IMAGES USING MORPHOLOGICAL AND STATISTICAL OPERATIONS

    Directory of Open Access Journals (Sweden)

    S Anantha Sivaprakasam

    2017-02-01

    Full Text Available Cervical cancer, a disease in which malignant (cancer) cells form in the tissues of the cervix, is the fourth leading cause of cancer death among women worldwide. Cervical cancer can be prevented and/or cured if it is diagnosed at the pre-cancerous lesion stage or earlier. A common physical examination technique widely used in screening, called the Papanicolaou test or Pap test, is used to detect cell abnormalities. Due to the intricacy of the cell's nature, automating this procedure is still a herculean task for the pathologist. This paper addresses these challenges with a simple and novel method to segment and classify cervical cells automatically. The primary step of the procedure is pre-processing, in which de-noising, de-correlation and segregation of colour components are carried out. Then, two new techniques put forward in this paper, Morphological and Statistical Edge-based segmentation and Morphological and Statistical Region-based segmentation, are applied to each colour component of the image to segment the nuclei from the cervical image. Finally, all segmented colour components are combined to produce the final segmentation result. After the nuclei are extracted, morphological features are computed from them. Both techniques outperformed standard segmentation techniques; moreover, the edge-based technique outperformed the region-based one. Finally, the nuclei are classified based on the morphological values, and the segmentation accuracy is echoed in the classification accuracy. The overall segmentation accuracy is 97%.
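    As an illustration of the morphological ingredient, a generic morphological-gradient edge detector (dilation minus erosion) is sketched below on a synthetic "nucleus"; this is a textbook stand-in, not the paper's exact edge- or region-based procedure.

```python
import numpy as np

def morph_gradient(img):
    """Morphological gradient (dilation minus erosion) with a 3x3 square
    structuring element, implemented with plain numpy padding and shifts."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    shifts = np.stack([p[i:i + h, j:j + w] for i in range(3) for j in range(3)])
    return shifts.max(axis=0) - shifts.min(axis=0)   # dilation - erosion

# Toy "nucleus": a bright disc on a dark background; the gradient responds
# only on the boundary, which a simple threshold then extracts.
yy, xx = np.mgrid[0:32, 0:32]
img = ((yy - 16) ** 2 + (xx - 16) ** 2 <= 64).astype(float)
edges = morph_gradient(img) > 0.5
```

In a full pipeline these edge maps, computed per colour component, would be combined and the enclosed nuclei measured for morphological features.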

  9. Using features of local densities, statistics and HMM toolkit (HTK) for offline Arabic handwriting text recognition

    Directory of Open Access Journals (Sweden)

    El Moubtahij Hicham

    2017-12-01

    Full Text Available This paper presents an analytical approach to an offline handwritten Arabic text recognition system. It is based on the Hidden Markov Model Toolkit (HTK) without explicit segmentation. The first phase is preprocessing, where the data are introduced into the system after quality enhancement. Then, a set of characteristics (local-density features and statistical features) is extracted using the technique of sliding windows. Subsequently, the resulting feature vectors are fed to the Hidden Markov Model Toolkit (HTK). The simple database “Arabic-Numbers” and the IFN/ENIT database are used to evaluate the performance of the system. Keywords: Hidden Markov Models (HMM), HMM Toolkit (HTK), Sliding windows
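    A sketch of the sliding-window local-density feature extraction described above. The window width, step, band count, and the synthetic stroke are invented for illustration; the paper's actual feature set and parameters may differ.

```python
import numpy as np

def sliding_density_features(binary_img, win=8, step=4, bands=4):
    """Local ink-density features from a right-to-left sliding window.

    Each window yields the foreground-pixel density of the whole frame
    plus per-horizontal-band densities, the kind of local-density vector
    that would be written to an HTK feature file for HMM training.
    """
    h, w = binary_img.shape
    feats = []
    for x in range(w - win, -1, -step):            # slide right to left
        frame = binary_img[:, x:x + win]
        band_dens = frame.reshape(bands, h // bands, win).mean(axis=(1, 2))
        feats.append(np.concatenate(([frame.mean()], band_dens)))
    return np.array(feats)

img = np.zeros((16, 32), dtype=float)
img[6:10, 8:24] = 1.0                              # a synthetic horizontal stroke
F = sliding_density_features(img)                  # one feature vector per frame
```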

  10. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities.

    Science.gov (United States)

    Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell

    2017-01-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse its current biogeographical patterns and predict its future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of the related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods for designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.

  11. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities

    Directory of Open Access Journals (Sweden)

    Khalifa M. Al-Kindi

    2017-08-01

    Full Text Available In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse its current biogeographical patterns and predict its future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of the related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods for designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.

  12. The dynamics of variable-density turbulence

    International Nuclear Information System (INIS)

    Sandoval, D.L.

    1995-11-01

    The dynamics of variable-density turbulent fluids are studied by direct numerical simulation. The flow is incompressible, so that acoustic waves are decoupled from the problem, implying that density is not a thermodynamic variable. Changes in density occur due to molecular mixing. The velocity field is, in general, divergent. A pseudo-spectral numerical technique is used to solve the equations of motion. Three-dimensional simulations are performed using a grid of 128³ points. Two types of problems are studied: (1) the decay of isotropic, variable-density turbulence, and (2) buoyancy-generated turbulence in a fluid with large density fluctuations. In the case of isotropic, variable-density turbulence, the overall statistical decay behavior, for the cases studied, is relatively unaffected by the presence of density variations when the initial density and velocity fields are statistically independent. The results for this case are in quantitative agreement with previous numerical and laboratory results. In this case, the initial density field has a bimodal probability density function (pdf) which evolves in time towards a Gaussian distribution. The pdf of the density field is symmetric about its mean value throughout its evolution. If the initial velocity and density fields are statistically dependent, however, the decay process is significantly affected by the density fluctuations. For the case of buoyancy-generated turbulence, variable-density departures from the Boussinesq approximation are studied. The results of the buoyancy-generated turbulence are compared with variable-density model predictions. Both a one-point (engineering) model and a two-point (spectral) model are tested against the numerical data. Some deficiencies in these variable-density models are discussed and modifications are suggested

  13. Density limit in ASDEX discharges with peaked density profiles

    International Nuclear Information System (INIS)

    Staebler, A.; Niedermeyer, H.; Loch, R.; Mertens, V.; Mueller, E.R.; Soeldner, F.X.; Wagner, F.

    1989-01-01

    Results concerning the density limit in OH and NI-heated ASDEX discharges with the usually observed broad density profiles have been reported earlier: In ohmic discharges with high q_a (q-cylindrical is used throughout this paper) the Murakami parameter (n_e R/B_t) is a good scaling parameter. At high densities edge cooling is observed, causing the plasma to shrink until an m=2 instability terminates the discharge. When approaching q_a = 2 the density limit is no longer proportional to I_p; a minimum exists in n_e,max(q_a) at q_a ∼ 2.15. With NI heating the density limit increases less than proportionally to the heating power; the behaviour during the pre-disruptive phase is rather similar to that of OH discharges. There are specific operating regimes on ASDEX leading to discharges with strongly peaked density profiles: the improved ohmic confinement regime, counter neutral injection, and multipellet injection. These regimes are characterized by enhanced energy and particle confinement. The operational density limit for these discharges is, therefore, of great interest, bearing in mind furthermore that high central densities are favourable for achieving high fusion yields. In addition, further insight into the mechanisms of the density limit observed in tokamaks may be obtained by comparing plasmas with rather different density profiles at their maximum attainable densities. 7 refs., 2 figs

  14. Chapter 7: High-Density H-Mode Operation in ASDEX Upgrade

    International Nuclear Information System (INIS)

    Stober, Joerg Karl; Lang, Peter Thomas; Mertens, Vitus

    2003-01-01

    Recent results are reported on the maximum achievable H-mode density and the behavior of pedestal density and central density peaking as this limit is approached. The maximum achievable H-mode density roughly scales as the Greenwald density, though a dependence on B_t is clearly observed. In contrast to the stiff temperature profiles, the density profiles seem to allow more shape variation, and especially with high-field-side pellet injection, strongly peaked profiles with good confinement have been achieved. Also, spontaneous density peaking at high densities is observed in ASDEX Upgrade, which is related to the generally observed large time constants for density profile equilibration. The equilibrated density profile shapes depend strongly on the heat-flux profile, in the sense that central heating leads to significantly flatter profiles

  15. Wigner Function: from Ensemble Average of Density Operator to Its One Matrix Element in Entangled Pure States

    Institute of Scientific and Technical Information of China (English)

    FAN Hong-Yi

    2002-01-01

    We show that the Wigner function W = Tr(Δρ) (an ensemble average of the density operator ρ, where Δ is the Wigner operator) can be expressed as a matrix element of ρ in entangled pure states. In doing so, converting from quantum master equations to time-evolution equations for the Wigner functions becomes direct and concise. The entangled states are defined in an enlarged Fock space with a fictitious degree of freedom.

  16. A reciprocal of Coleman's theorem and the quantum statistics of systems with spontaneous symmetry breaking

    International Nuclear Information System (INIS)

    Chaichian, M.; Montonen, C.; Perez Rojas, H.

    1991-01-01

    The completely different conservation properties of charges associated with unbroken and broken symmetries are discussed. The impossibility of establishing a conservation law for nondegenerate Hilbert space representations in the broken case leads to a reciprocal of Coleman's theorem. The quantum statistical implication is that these charges cannot be introduced as conserved operators in the density matrix. (orig.)

  17. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques attach a mathematical measure of worth to each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and, ultimately, reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and is therefore valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as radar track data, into the analysis
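    The idea of a sequential safety-boundary search can be illustrated in one dimension: along a monotone slice of the parameter space, each simulation run tells us on which side of the boundary we are, and bisection homes in on the crossing. This toy stand-in (the quadratic "simulator" and threshold are invented) omits the design-of-experiments machinery and uncertainty quantification of the actual approach.

```python
def find_boundary(simulator, lo, hi, tol=1e-3):
    """Bisection for the safety boundary along a monotone 1-D parameter
    slice: the point where the simulator flips from safe to unsafe."""
    assert simulator(lo) and not simulator(hi)   # boundary is bracketed
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if simulator(mid):
            lo = mid                              # still safe: move lower bracket up
        else:
            hi = mid                              # unsafe: move upper bracket down
    return 0.5 * (lo + hi)

# Hypothetical criterion: the configuration is safe while a conflict
# metric (here x^2) stays below 1.
boundary = find_boundary(lambda x: x * x < 1.0, lo=0.0, hi=2.0)
```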

  18. ϕ-statistically quasi Cauchy sequences

    Directory of Open Access Journals (Sweden)

    Bipan Hazarika

    2016-04-01

    Full Text Available Let P denote the space whose elements are finite sets of distinct positive integers. Given any element σ of P, we denote by p(σ) the sequence {p_n(σ)} such that p_n(σ) = 1 for n ∈ σ and p_n(σ) = 0 otherwise. Further, P_s = {σ ∈ P : ∑_{n=1}^∞ p_n(σ) ≤ s}, i.e. P_s is the set of those σ whose support has cardinality at most s. Let (ϕ_n) be a non-decreasing sequence of positive integers such that nϕ_{n+1} ≤ (n+1)ϕ_n for all n ∈ N; the class of all such sequences (ϕ_n) is denoted by Φ. Let E ⊆ N. The number δ_ϕ(E) = lim_{s→∞} (1/ϕ_s)|{k ∈ σ, σ ∈ P_s : k ∈ E}| is said to be the ϕ-density of E. A sequence (x_n) of points in R is ϕ-statistically convergent (or S_ϕ-convergent) to a real number ℓ if, for every ε > 0, the set {n ∈ N : |x_n − ℓ| ≥ ε} has ϕ-density zero. We introduce ϕ-statistical ward continuity of a real function. A real function is ϕ-statistically ward continuous if it preserves ϕ-statistically quasi-Cauchy sequences, where a sequence (x_n) is called ϕ-statistically quasi-Cauchy (or S_ϕ-quasi-Cauchy) if (Δx_n) = (x_{n+1} − x_n) is ϕ-statistically convergent to 0, i.e. if for every ε > 0 the set {n ∈ N : |x_{n+1} − x_n| ≥ ε} has ϕ-density zero. We also introduce the concept of ϕ-statistical ward compactness and obtain results related to ϕ-statistical ward continuity, ϕ-statistical ward compactness, statistical ward continuity, ward continuity, ward compactness, ordinary compactness, uniform continuity, ordinary continuity, δ-ward continuity, and slowly oscillating continuity.
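    The quickest way to build intuition for these density-based notions is the classical special case, natural density, which the ϕ-density generalizes. The standard example x_n = √n is statistically quasi-Cauchy: for any ε > 0 the set of indices with |x_{n+1} − x_n| ≥ ε is finite, so its density shrinks to zero as the horizon grows, as the numerical sketch below shows.

```python
import math

def natural_density(indicator, N):
    """Empirical natural density of {n <= N : indicator(n)}, the classical
    special case that the phi-density generalizes."""
    return sum(1 for n in range(1, N + 1) if indicator(n)) / N

# For x_n = sqrt(n), |x_{n+1} - x_n| = 1/(sqrt(n+1) + sqrt(n)), which
# exceeds eps = 0.01 only for finitely many n (roughly n <= 2500).
eps = 0.01
def big_jump(n):
    return abs(math.sqrt(n + 1) - math.sqrt(n)) >= eps

d_small = natural_density(big_jump, 10_000)    # finite bad set / small horizon
d_large = natural_density(big_jump, 100_000)   # same bad set / larger horizon
```

Because the bad set is finite, the empirical density decreases toward zero as N grows, which is exactly the quasi-Cauchy condition in the density sense.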

  19. Statistical quantization of GUT models and phase diagrams of W condensation for the Universe with finite fermion density

    International Nuclear Information System (INIS)

    Kalashnikov, O.K.; Razumov, L.V.; Perez Rojas, H.

    1990-01-01

    The problems of statistical quantization for grand-unified-theory models are studied using as an example the Weinberg-Salam model with finite fermion density under the conditions of neutral and electric charge conservation. The relativistic R_γ gauge with an arbitrary parameter is used, and the one-loop effective potential together with its extremum equations is found. We demonstrate (and this is our main result) that the thermodynamic potential obtained from the effective one, after the mass shell for ξ is used, remains gauge dependent if all temperature ranges (not only the leading high-temperature terms) are considered. The contradiction detected within the calculational scheme is eliminated after the model studied is redefined with the aid of terms which are proportional to the ''non-Abelian'' chemical potential and equal to zero identically when the unitary gauge is fixed. The phase diagrams of W condensation are established and all their peculiarities are displayed. We find, for a universe with zero neutral charge density, that the W condensate occurs at any small fermion density ρ and appears first near the point of symmetry restoration. For all ρ≠0 this condensate exists only in a finite-temperature domain and evaporates completely or partially as T goes to zero

  20. The intensity detection of single-photon detectors based on photon counting probability density statistics

    International Nuclear Information System (INIS)

    Zhang Zijing; Song Jie; Zhao Yuan; Wu Long

    2017-01-01

    Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates with fixed width and count the number of triggered sampling gates, from which the photon counting probability can be obtained to estimate the echo signal intensity. In this paper, we not only count the number of triggered sampling gates, but also record the triggered time positions of the photon counting pulses. The photon counting probability density distribution is obtained through the statistics of a series of triggered time positions. Then the Minimum Variance Unbiased Estimation (MVUE) method is used to estimate the echo signal intensity. Compared with conventional methods, this method improves the estimation accuracy of the echo signal intensity owing to the acquisition of more detection information. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of the echo signal intensity is discussed, and a high-accuracy intensity image is acquired under low-light-level environments. (paper)
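    The conventional counting approach mentioned above can be sketched directly: with Poisson-distributed photoelectrons of mean λ per gate, a gate triggers with probability 1 − exp(−λ), so the trigger fraction can be inverted for λ. This is only the baseline counting estimator; the paper's MVUE additionally exploits the recorded trigger time positions.

```python
import math
import random

def estimate_intensity(triggers, n_gates):
    """Invert the Poisson no-click probability.

    With mean photon number lam per gate, P(trigger) = 1 - exp(-lam),
    so lam = -ln(1 - k/N). Baseline counting estimator only.
    """
    p_hat = triggers / n_gates
    return -math.log(1.0 - p_hat)

# Simulate N fixed-width sampling gates illuminated at a known intensity.
random.seed(1)
lam_true = 0.5
N = 200_000
p_trigger = 1.0 - math.exp(-lam_true)
k = sum(1 for _ in range(N) if random.random() < p_trigger)
lam_hat = estimate_intensity(k, N)
```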

  1. Operational specification and forecasting advances for Dst, LEO thermospheric densities, and aviation radiation dose and dose rate

    Science.gov (United States)

    Tobiska, W. Kent

    Space weather's effects upon the near-Earth environment are due to dynamic changes in the energy transfer processes from the Sun's photons, particles, and fields. Among the space environment domains affected by space weather, the magnetosphere, thermosphere, and even the troposphere are key regions. Space Environment Technologies (SET) has developed and is producing innovative space weather applications. Key operational systems for providing timely information about the effects of space weather on these domains are SET's Magnetosphere Alert and Prediction System (MAPS), LEO Alert and Prediction System (LAPS), and Automated Radiation Measurements for Aviation Safety (ARMAS) system. MAPS provides a forecast Dst index out to 6 days through the data-driven, redundant-data-stream Anemomilos algorithm. Anemomilos uses observational proxies for the magnitude, location, and velocity of solar ejecta events. This forecast index is used by satellite operations to characterize upcoming geomagnetic storms, for example. In addition, an ENLIL/Rice Dst prediction out to several days has also been developed and will be described. LAPS is the SET fully redundant operational system providing recent-history, current-epoch, and forecast solar and geomagnetic indices for use in operational versions of the JB2008 thermospheric density model. The thermospheric densities produced by that system, driven by the LAPS data, are forecast out to 72 hours to provide global mass densities for satellite operators. ARMAS is a project that has successfully demonstrated the operation of a micro dosimeter on aircraft to capture the real-time radiation environment due to Galactic Cosmic Rays and Solar Energetic Particles. The dose and dose rates are captured on aircraft, downlinked in real time via the Iridium satellites, processed on the ground, incorporated into the most recent NAIRAS global radiation climatology data runs, and made available to end users via the web and

  2. The effect of retained intramedullary nails on tibial bone mineral density.

    Science.gov (United States)

    Allen, J C; Lindsey, R W; Hipp, J A; Gugala, Z; Rianon, N; LeBlanc, A

    2008-07-01

    Intramedullary nailing has become a standard treatment for adult tibial shaft fractures. Retained intramedullary nails have been associated with stress shielding, although their long-term effect on tibial bone mineral density is currently unclear. The purpose of this study was to determine whether retained tibial intramedullary nails decrease tibial bone mineral density in patients with successfully treated fractures. Patients treated with statically locked intramedullary nails for isolated, unilateral tibial shaft fractures were studied. Inclusion required that the fracture had healed radiographically and that the patient had returned to the pre-injury activity level. Data on patient demographics, fracture type, surgical technique, implant, and post-operative functional status were tabulated. Dual-energy X-ray absorptiometry was used to measure bone mineral density in selected regions of the affected tibia and the contralateral intact tibia. Image reconstruction software was employed to ensure symmetry of the studied regions. Twenty patients (mean age 43; range 22-77 years) were studied at a mean of 29 months (range 5-60 months) following intramedullary nailing. There was a statistically significant reduction of mean bone mineral density in tibiae with retained intramedullary nails (1.02 g/cm² versus 1.06 g/cm²; P=0.04). A significantly greater decrease in bone mineral density was detected in the reamed versus non-reamed tibiae (-7% versus +6%, respectively; P<0.05). The present study demonstrates a small, but statistically significant, overall bone mineral density decrease in healed tibiae with retained nails. Intramedullary reaming appears to be a factor potentiating the reduction of tibial bone mineral density with long-term nail retention.

  3. Matrix product density operators: Renormalization fixed points and boundary theories

    Energy Technology Data Exchange (ETDEWEB)

    Cirac, J.I. [Max-Planck-Institut für Quantenoptik, Hans-Kopfermann-Str. 1, D-85748 Garching (Germany); Pérez-García, D., E-mail: dperezga@ucm.es [Departamento de Análisis Matemático, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain); ICMAT, Nicolas Cabrera, Campus de Cantoblanco, 28049 Madrid (Spain); Schuch, N. [Max-Planck-Institut für Quantenoptik, Hans-Kopfermann-Str. 1, D-85748 Garching (Germany); Verstraete, F. [Department of Physics and Astronomy, Ghent University (Belgium); Vienna Center for Quantum Technology, University of Vienna (Austria)

    2017-03-15

    We consider the tensors generating matrix product states and density operators in a spin chain. For pure states, we revisit the renormalization procedure introduced in (Verstraete et al., 2005) and characterize the tensors corresponding to the fixed points. We relate them to states possessing zero correlation length and saturation of the area law, as well as to tensors generating ground states of local and commuting Hamiltonians. For mixed states, we introduce the concept of renormalization fixed points and characterize the corresponding tensors. We also relate them to finite correlation length, saturation of the area law, and tensors generating Gibbs states of local and commuting Hamiltonians. One of the main results of this work is that the resulting fixed points can be associated with the boundary theories of two-dimensional topological states, through the bulk-boundary correspondence introduced in (Cirac et al., 2011).

  4. Robust statistical reconstruction for charged particle tomography

    Science.gov (United States)

    Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W

    2013-10-08

    Systems and methods for charged particle detection, including statistical reconstruction of object-volume scattering density profiles from charged particle tomographic data. The probability distribution of charged particle scattering is determined using a statistical multiple-scattering model, and a substantially maximum-likelihood estimate of the object-volume scattering density is computed with an expectation-maximization (ML/EM) algorithm to reconstruct the object-volume scattering density. The presence and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic-ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles or cargo. The method can be implemented as a computer program executable on a computer.
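The core estimation step can be illustrated with a toy model. In multiple Coulomb scattering, the scattering angle is approximately zero-mean Gaussian with variance proportional to the scattering density times the path length; the sketch below (a single-voxel simplification under that assumption, not the patented ML/EM reconstruction) shows the resulting maximum-likelihood estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-voxel scattering model (an assumption for illustration):
# each muon's scattering angle is theta ~ N(0, lam * L), where lam is the
# voxel's scattering density and L the muon's path length through the voxel.
true_lam, L, n = 2.5, 0.1, 10_000
theta = rng.normal(0.0, np.sqrt(true_lam * L), size=n)

# Maximum-likelihood estimate of lam for this Gaussian model:
# lam_hat = mean(theta^2) / L
lam_hat = float(np.mean(theta**2) / L)
print(round(lam_hat, 2))
```

A full ML/EM reconstruction iterates a comparable update per voxel while distributing each muon's observed scattering over all voxels along its track.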

  5. Dynamic statistical information theory

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In recent years we extended Shannon's static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state-variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information-symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density and the nonlinear evolution equations of Boltzmann dynamic entropy density and dynamic information density, which describe the evolution law of dynamic entropy and dynamic information, respectively. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion and production in state-variable space inside the systems and in coordinate space in the transmission processes; and that the time rate of change of the dynamic information densities originates from their drift, diffusion and dissipation in state-variable space inside the systems and in coordinate space in the transmission processes. Entropy and information are thus combined with the state of the systems and its law of motion. Furthermore, we presented the formulas for the two kinds of entropy production rates and information dissipation rates, and the expressions for the two kinds of drift information flows and diffusion information flows. We proved that the two kinds of information dissipation rates (or the decrease rates of the total information) are equal to their corresponding entropy production rates (or the increase rates of the total entropy) in the same dynamic system. We obtained the formulas of two kinds of dynamic mutual information and dynamic channel
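The drift-diffusion-production structure described above can be written generically; the following is an illustrative reconstruction of that structure, with all symbols assumed, not the paper's exact equations.

```latex
% Generic drift-diffusion-production / dissipation form (illustrative symbols):
%   s = dynamic entropy density, i = dynamic information density,
%   v = drift velocity, D = diffusion coefficient,
%   sigma_s = entropy production rate, delta_i = information dissipation rate.
\frac{\partial s}{\partial t}
  = -\nabla\cdot(\mathbf{v}\, s) + \nabla\cdot(D\,\nabla s) + \sigma_s,
\qquad
\frac{\partial i}{\partial t}
  = -\nabla\cdot(\mathbf{v}\, i) + \nabla\cdot(D\,\nabla i) - \delta_i .
```

The sign difference between the last terms mirrors the abstract's statement that entropy is produced while information is dissipated at an equal rate.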

  6. Lidar measurements of plume statistics

    DEFF Research Database (Denmark)

    Ejsing Jørgensen, Hans; Mikkelsen, T.

    1993-01-01

    of measured crosswind concentration profiles, the following statistics were obtained: 1) mean profile, 2) root-mean-square profile, 3) fluctuation intensities, and 4) intermittency factors. Furthermore, some experimentally determined probability density functions (pdf's) of the fluctuations are presented. All the measured statistics are referred to a fixed and a 'moving' frame of reference, the latter being defined as a frame of reference from which the (low-frequency) plume meander is removed. Finally, the measured statistics are compared with statistics on concentration fluctuations obtained with a simple puff...

  7. Deep convolutional neural network for mammographic density segmentation

    Science.gov (United States)

    Wei, Jun; Li, Songfeng; Chan, Heang-Ping; Helvie, Mark A.; Roubidoux, Marilyn A.; Lu, Yao; Zhou, Chuan; Hadjiiski, Lubomir; Samala, Ravi K.

    2018-02-01

    Breast density is one of the most significant factors for cancer risk. In this study, we proposed a supervised deep learning approach for automated estimation of percentage density (PD) on digital mammography (DM). The deep convolutional neural network (DCNN) was trained to estimate a probability map of breast density (PMD). PD was calculated as the ratio of the dense area to the breast area, based on the probability of each pixel belonging to the dense or fatty region at a decision threshold of 0.5. The DCNN estimate was compared to a feature-based statistical learning approach, in which gray-level, texture and morphological features were extracted from each ROI and the least absolute shrinkage and selection operator (LASSO) was used to select and combine the useful features to generate the PMD. The reference PD of each image was provided by two experienced MQSA radiologists. With IRB approval, we retrospectively collected 347 DMs from patient files at our institution. The 10-fold cross-validation results showed a strong correlation (r=0.96) between the DCNN estimate and the radiologists' interactive segmentation, whereas the feature-based statistical learning approach had a correlation of r=0.78 with the radiologists' segmentation. The difference between the DCNN segmentation and the radiologists' was significantly smaller than that between the feature-based learning approach and the radiologists (p < 0.05). The DCNN approach has the potential to replace radiologists' interactive thresholding in PD estimation on DMs.
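The PD computation described above reduces to a simple ratio once the probability map is available. The sketch below assumes a hypothetical probability map `pmd` and breast mask; only the thresholding rule at 0.5 comes from the abstract.

```python
import numpy as np

# Hypothetical probability map of breast density (PMD) and breast mask;
# only the 0.5 decision threshold comes from the abstract.
pmd = np.array([[0.90, 0.70, 0.20],
                [0.60, 0.40, 0.10],
                [0.30, 0.80, 0.05]])
breast_mask = np.ones_like(pmd, dtype=bool)   # pixels inside the breast

# PD = dense area / breast area, with "dense" decided at probability > 0.5
dense = (pmd > 0.5) & breast_mask
pd_percent = 100.0 * dense.sum() / breast_mask.sum()
print(round(pd_percent, 1))   # 4 of 9 pixels exceed the threshold
```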

  8. Statistical mechanics of anyons

    International Nuclear Information System (INIS)

    Arovas, D.P.

    1985-01-01

    We study the statistical mechanics of a two-dimensional gas of free anyons - particles which interpolate between Bose-Einstein and Fermi-Dirac character. Thermodynamic quantities are discussed in the low-density regime. In particular, the second virial coefficient is evaluated by two different methods and is found to exhibit a simple, periodic, but nonanalytic behavior as a function of the statistics-determining parameter. (orig.)

  9. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. A probability density estimate is then determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic for visualizing the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
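The single-order-statistics idea can be sketched as follows: if the trial CDF F is correct, then F applied to the sorted sample behaves like uniform order statistics, whose means and variances are known in closed form. The code below is an illustrative reconstruction, not the authors' scoring function; it computes scaled quantile residuals for a normal sample against the true normal CDF.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# If F is the correct CDF, u_(k) = F(x_(k)) behaves like the k-th uniform
# order statistic, with mean k/(n+1) and variance k(n-k+1)/((n+1)^2 (n+2)).
n = 500
x = np.sort(rng.normal(size=n))
u = norm.cdf(x)                       # trial CDF applied to the sorted sample

k = np.arange(1, n + 1)
mean_u = k / (n + 1)
var_u = k * (n - k + 1) / ((n + 1) ** 2 * (n + 2))

# Scaled quantile residuals: roughly centred on zero when F fits the data
residuals = (u - mean_u) / np.sqrt(var_u)
print(residuals.shape)
```

Plotting `residuals` against `k` gives the diagnostic the abstract describes: systematic departures from zero indicate a misfitting trial CDF.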

  10. Statistical analysis of first period of operation of FTU Tokamak; Analisi statistica del primo periodo di operazioni del Tokamak FTU

    Energy Technology Data Exchange (ETDEWEB)

    Crisanti, F; Apruzzese, G; Frigione, D; Kroegler, H; Lovisetto, L; Mazzitelli, G; Podda, S [ENEA, Centro Ricerche Frascati, Rome (Italy). Dip. Energia

    1996-09-01

    On the FTU tokamak, plasma physics operations started on 20/4/90. The first plasma had a plasma current Ip=0.75 MA for about a second. The experimental phase lasted until 7/7/94, when a long shutdown began for installing the toroidal limiter on the inner side of the vacuum vessel. In these four years of operation, plasma experiments were successfully carried out, e.g. experiments with single and multiple pellet injection; full current drive up to Ip=300 kA was obtained using waves at the Lower Hybrid frequency; and ohmic plasma parameters were analysed with different materials (from low-Z silicon to high-Z tungsten) as the plasma-facing element. In this work a statistical analysis of the full period of operation is presented. Moreover, a comparison with statistical data from other tokamaks is attempted.

  11. Level densities

    International Nuclear Information System (INIS)

    Ignatyuk, A.V.

    1998-01-01

    For any application of the statistical theory of nuclear reactions it is very important to obtain the parameters of the level density description from reliable experimental data. The cumulative numbers of low-lying levels and the average spacings between neutron resonances are usually used as such data. The level density parameters fitted to such data are compiled in the RIPL Starter File for the three models most frequently used in practical calculations: i) For the Gilbert-Cameron model the parameters of the Beijing group, based on a rather recent compilation of the neutron resonance and low-lying level densities and included in the beijing-gc.dat file, are chosen as recommended. As alternative versions the parameters provided by other groups are given in the files jaeri-gc.dat, bombay-gc.dat and obninsk-gc.dat. Additionally, the iljinov-gc.dat and mengoni-gc.dat files include sets of level density parameters that take into account the damping of shell effects at high energies. ii) For the back-shifted Fermi gas model the beijing-bs.dat file is selected as the recommended one. Alternative parameters of the Obninsk group are given in the obninsk-bs.dat file and those of Bombay in bombay-bs.dat. iii) For the generalized superfluid model the Obninsk group parameters included in the obninsk-bcs.dat file are chosen as recommended, and the beijing-bcs.dat file is included as an alternative set of parameters. iv) For the microscopic approach to level densities the files are: obninsk-micro.for - FORTRAN 77 source for the microscopic statistical level density code developed in Obninsk by Ignatyuk and coworkers; moller-levels.gz - Moeller single-particle level and ground-state deformation database; moller-levels.for - retrieval code for the Moeller single-particle level scheme. (author)

  12. Safeguarding subcriticality during loading and shuffling operations in the higher density of the RSG-GAS's silicide core

    International Nuclear Information System (INIS)

    Sembiring, T.M.; Kuntoro, I.

    2003-01-01

    The core conversion program of the RSG-GAS reactor is to convert the all-oxide core to an all-silicide core. The silicide equilibrium core with a fuel meat density of 3.55 gU cm -3 is optimal for the RSG-GAS reactor, and it can significantly increase the operation cycle length from 25 to 32 full-power days. Nevertheless, the subcriticality of the shutdown core and the shutdown margin are lower than those of the oxide core. Therefore, the deviation of the subcriticality condition in the higher-density silicide core caused by fuel loading and shuffling errors had to be reanalysed. The objective of this work is to analyse whether the subcriticality of the shutdown core is sufficient to cope with the worst condition caused by an error during loading and shuffling operations. The calculations were carried out using the 2-dimensional multigroup neutron diffusion code Batan-FUEL. For fuel handling errors, the calculated results showed that the subcriticality condition of the shutdown higher-density silicide equilibrium core of RSG-GAS can be maintained. Therefore, all fuel management steps fixed in the present reactor operation manual can be applied in the higher-density silicide equilibrium core of the RSG-GAS reactor. (author)

  13. Global statistics of liquid water content and effective number density of water clouds over ocean derived from combined CALIPSO and MODIS measurements

    OpenAIRE

    Y. Hu; M. Vaughan; C. McClain; M. Behrenfeld; H. Maring; D. Anderson; S. Sun-Mack; D. Flittner; J. Huang; B. Wielicki; P. Minnis; C. Weimer; C. Trepte; R. Kuehn

    2007-01-01

    This study presents an empirical relation that links layer-integrated depolarization ratios, extinction coefficients, and effective radii of water clouds, based on Monte Carlo simulations of CALIPSO lidar observations. Combined with the cloud effective radius retrieved from MODIS, cloud liquid water content and effective number density of water clouds are estimated from CALIPSO lidar depolarization measurements. Global statistics of the cloud liquid water...

  14. Impact of a high density GPS network on the operational forecast

    Directory of Open Access Journals (Sweden)

    C. Faccani

    2005-01-01

    Full Text Available Global Positioning System Zenith Total Delay (GPS ZTD) data can provide information about water vapour in the atmosphere. Their assimilation into the analysis used to initialize a model can then improve the weather forecast, giving the right amount of moisture and reducing the model spin-up. In the last year, a high-density GPS network has been created over the Basilicata region (southern Italy) by the Italian Space Agency in the framework of a national project named MAGIC2, the Italian follow-on of the EC project MAGIC. Daily operational data assimilation experiments have been performed since December 2003. The results show that the assimilation of GPS ZTD improves the forecast, especially during the transition from winter to spring, even though a rather coarse model resolution (9 km) is used.

  15. Statistical symmetries in physics

    International Nuclear Information System (INIS)

    Green, H.S.; Adelaide Univ., SA

    1994-01-01

    Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of gl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs

  16. Statistical comparisons of Savannah River anemometer data applied to quality control of instrument networks

    International Nuclear Information System (INIS)

    Porch, W.M.; Dickerson, M.H.

    1976-08-01

    Continuous monitoring of extensive meteorological instrument arrays is a requirement in the study of important mesoscale atmospheric phenomena. These phenomena include pollution-transport prediction from continuous area sources, one-time releases of toxic materials, and wind-energy prospecting in areas of topographic enhancement of the wind. Quality control techniques that can be applied to these data to determine whether the instruments are operating within their prescribed tolerances were investigated. Savannah River Plant data were analyzed with both independent and comparative statistical techniques. The independent techniques calculate the mean, standard deviation, moments about the mean, kurtosis, skewness, probability density distribution, cumulative probability and power spectra. The comparative techniques include covariance, cross-spectral analysis and two-dimensional probability density. At present the calculating and plotting routines for these statistical techniques do not reside in a single code, so it is difficult to ascribe independent memory size and computation time accurately. However, given the flexibility of a data system which includes simple and fast-running statistics at the instrument end of the data network (ASF) and more sophisticated techniques at the computational end (ACF), a proper balance will be attained. These techniques are described in detail and preliminary results are presented
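A few of the independent statistics listed above (mean, standard deviation, skewness, kurtosis) follow directly from standardized moments; the sketch below uses a synthetic Weibull wind-speed record as a stand-in for anemometer data, since the Savannah River data themselves are not available here.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical wind-speed record (m/s); Weibull shape 2 is a common wind model
wind = 5.0 * rng.weibull(2.0, size=1000)

mean = wind.mean()
std = wind.std(ddof=1)
z = (wind - mean) / std

skewness = float(np.mean(z**3))              # third standardized moment
excess_kurtosis = float(np.mean(z**4) - 3.0)  # fourth moment minus Gaussian's 3
print(round(mean, 2), round(std, 2))
```

A simple quality-control rule could flag an instrument whose skewness or kurtosis drifts outside a historical envelope.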

  17. Introduction to Statistics - eNotes

    DEFF Research Database (Denmark)

    Brockhoff, Per B.; Møller, Jan Kloppenborg; Andersen, Elisabeth Wreford

    2015-01-01

    Online textbook used in the introductory statistics courses at DTU. It provides a basic introduction to applied statistics for engineers. The necessary elements from probability theory are introduced (stochastic variable, density and distribution function, mean and variance, etc.) and thereafter the most basic statistical analysis methods are presented: confidence bands, hypothesis testing, simulation, simple and multiple regression, ANOVA and analysis of contingency tables. Examples with the software R are included for all presented theory and methods.

  18. Local-scale modelling of density-driven flow for the phases of repository operation and post-closure at Beberg

    International Nuclear Information System (INIS)

    Jaquet, O.; Siegel, P.

    2004-09-01

    A hydrogeological model was developed for Beberg with the aim of evaluating the impact of a repository (for the operational and post-closure phases) while accounting for the effects of density-driven flow. Two embedded scales were taken into account for this modelling study: a local scale, at which the granitic medium was considered a continuum, and a repository scale, where the medium is fractured and was therefore regarded as discrete. The following step-wise approach was established to model density-driven flow at both scales: (a) modelling fracture networks at the repository scale, (b) upscaling the hydraulic properties to a continuum at the local scale and (c) modelling density-driven flow to evaluate the repository impact at the local scale. The results demonstrate the strong impact of the repository on the flow field during the phase of operation. The distribution of the salt concentration is affected by a large upconing effect with increased relative concentration and by the presence of fracture zones carrying freshwater from the surface. The concentrations obtained for the reference case, expressed as a percentage of the maximum (prescribed) value in the model, are as follows: ca 30% for the phase of desaturation, and ca 20% for the resaturation phase. For the reference case, the impact of repository operations is no longer visible after a resaturation period of about 20 years after repository closure; under resaturation conditions, evidence of the operational phase has already disappeared in terms of the observed hydraulic and concentration fields. Sensitivity calculations have proven the importance of explicitly discretising repository tunnels when assessing resaturation time and maximum concentration values. Furthermore, the definition of a fixed potential as boundary condition along the model's top surface is likely to provide underestimated values for the maximum concentration and overestimated flow rates in the

  19. Origin of structure: statistical characterization of the primordial density fluctuations and the collapse of the wave function

    Energy Technology Data Exchange (ETDEWEB)

    León, Gabriel [Departamento de Física, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Ciudad Universitaria - Pab. I, Buenos Aires 1428 (Argentina); Sudarsky, Daniel, E-mail: gleon@df.uba.ar, E-mail: sudarsky@nucleares.unam.mx [Instituto de Ciencias Nucleares, Universidad Nacional Autónoma de México, México D.F. 04510, México (Mexico)

    2015-06-01

    The statistical properties of the primordial density perturbations have been considered in the past decade as a powerful probe of the physical processes taking place in the early universe. Within the inflationary paradigm, the properties of the bispectrum are one of the keys that serve to discriminate among competing scenarios concerning the details of the origin of cosmological perturbations. However, all of these scenarios, based on the conventional approach to the so-called 'quantum-to-classical transition' during inflation, lack the ability to point out the precise physical mechanism responsible for generating the inhomogeneity and anisotropy of our universe, starting from the exactly homogeneous and isotropic vacuum state associated with the early inflationary regime. In past works, we have shown that proposals involving a spontaneous dynamical reduction of the quantum state provide plausible explanations for the birth of said primordial inhomogeneities and anisotropies. In the present manuscript we show that, when considered within the context of such proposals, the characterization of the spectrum and bispectrum turns out to be quite different from that found in the traditional approach; in particular, some of the statistical features must be treated in a different way, leading to rather different conclusions.

  20. A multivariate statistical methodology for detection of degradation and failure trends using nuclear power plant operational data

    International Nuclear Information System (INIS)

    Samanta, P.K.; Teichmann, T.

    1990-01-01

    In this paper, a multivariate statistical method is presented and demonstrated as a means for analyzing nuclear power plant transients (or events) and safety system performance for detection of malfunctions and degradations within the course of the event based on operational data. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients (due to lack of easily accessible operational data). Such an approach, once fully developed, can be used to detect failure trends and patterns and so can lead to prevention of conditions with serious safety implications

  1. Weighted A-statistical convergence for sequences of positive linear operators.

    Science.gov (United States)

    Mohiuddine, S A; Alotaibi, Abdullah; Hazarika, Bipan

    2014-01-01

    We introduce the notion of weighted A-statistical convergence of a sequence, where A represents the nonnegative regular matrix. We also prove the Korovkin approximation theorem by using the notion of weighted A-statistical convergence. Further, we give a rate of weighted A-statistical convergence and apply the classical Bernstein polynomial to construct an illustrative example in support of our result.
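The underlying notion can be made explicit. A standard formulation from the summability literature (reconstructed here, not quoted from the paper) is:

```latex
% A sequence x = (x_k) is A-statistically convergent to L, for a nonnegative
% regular matrix A = (a_{nk}), if the A-weighted density of the indices where
% x_k stays away from L vanishes:
\lim_{n\to\infty}\ \sum_{k \,:\, |x_k - L| \ge \varepsilon} a_{nk} \;=\; 0
\qquad \text{for every } \varepsilon > 0 .
```

Taking A to be the Cesàro matrix (a_{nk} = 1/n for k ≤ n, 0 otherwise) recovers ordinary statistical convergence.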

  2. Effects of heat and water transport on the performance of polymer electrolyte membrane fuel cell under high current density operation

    International Nuclear Information System (INIS)

    Tabuchi, Yuichiro; Shiomi, Takeshi; Aoki, Osamu; Kubo, Norio; Shinohara, Kazuhiko

    2010-01-01

    Key challenges to the acceptance of polymer electrolyte membrane fuel cells (PEMFCs) for automobiles are cost reduction and improvement of power density for compactness. To address these challenges, further improvement in fuel cell performance is required. In particular, under higher-current-density operation, water and heat transport in PEMFCs has a considerable effect on cell performance. In this study, the impact of heat and water transport on cell performance under high current density was investigated by experimental evaluation of the liquid water distribution and by numerical validation. The liquid water distribution in the MEA between rib and channel areas was evaluated by neutron radiography. In order to exclude the effects of liquid water in the gas channels and of reactant concentration gradients along the flow direction, a differential cell was used. The experimental results suggested that liquid water under the channel changed dramatically with rib/channel width. From the numerical study, it was found that the change in liquid water distribution was significantly affected by the temperature distribution in the MEA between rib and channel areas. In addition, not only heat transport but also water transport through the membrane significantly affected the cell performance under high-current-density operation.

  3. Statistics of AUV's Missions for Operational Ocean Observation at the South Brazilian Bight.

    Science.gov (United States)

    dos Santos, F. A.; São Tiago, P. M.; Oliveira, A. L. S. C.; Barmak, R. B.; Miranda, T. C.; Guerra, L. A. A.

    2016-02-01

    The high costs and logistics limitations of ship-based data collection represent an obstacle to persistent in-situ data collection. Satellite-operated Autonomous Underwater Vehicles (AUVs), or gliders as these AUVs are generally known in the scientific community, are presented as an inexpensive and reliable alternative for long-term, real-time ocean monitoring of important parameters such as temperature, salinity, water quality and acoustics. This work focuses on the performance statistics and the reliability for continuous operation of a fleet of seven gliders navigating in Santos Basin, Brazil, since March 2013. Glider performance was evaluated by the number of standby days versus the number of operating days; the number of missions interrupted by (1) equipment failure, (2) weather, or (3) accident versus the number of successful missions; and the amount and quality of data collected. From the start of operations in March 2013 to the preparation of this work (July 2015), a total of 16 glider missions were accomplished, with gliders operating on 728 of the 729 days elapsed. Of this total, 11 missions were successful, 3 missions were interrupted due to equipment failure, and 2 gliders were lost. Most of the identified issues were observed in communication with the glider (when recovery was necessary) or in the optode sensors (when remote settings solved the problem). The average duration of a successful mission was 103 days, while interrupted ones ended on average after 7 days. The longest mission lasted 139 days, performing 859 continuous profiles and covering a distance of 2734 km. Together, the two projects performed 6856 dives, an average of 9.5 profiles per day, or one profile every 2.5 hours, during 2 consecutive years.

  4. Quasi-steady-state operation around operational limit in HT-7

    International Nuclear Information System (INIS)

    Li, J.; Xie, J.K.; Wan, B.N.; Luo, J.R.; Gao, X.; Zhao, Y.; Yang, Y.; Kuang, G.L.; Bao, Y.; Ding, B.J.; Wan, Y.X.

    2001-01-01

    Efforts have been made on the HT-7 tokamak to extend the stable operation boundaries. Extensive RF boronization and siliconization have been used, and a wider operational Hugill diagram was obtained. The transient density reached 1.3 times the Greenwald density limit in ohmic discharges. A stationary high-performance discharge with q a =2.1 was obtained after siliconization. Confinement improvement was obtained due to a significant reduction of the electron thermal diffusivity χ e in the outer region of the plasma. An improved confinement phase was also observed with LHCD in the density range of 70%∼120% of the Greenwald density limit. The weakly hollow current density profile was attributed to off-axis LHW power deposition. Code simulations and measurements showed good agreement on off-axis LHW deposition. Supersonic molecular beam injection has been successfully used to obtain stable high-density operation in the range of the Greenwald density limit. (author)
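The Greenwald limit referred to above is conventionally n_G = I_p/(π a²), with I_p in MA, the minor radius a in m, and n_G in units of 10^20 m^-3. The sketch below evaluates it for illustrative parameters; the numerical values are assumptions, not taken from the abstract.

```python
import math

# Greenwald density limit n_G = Ip / (pi * a^2):
#   Ip in MA, minor radius a in m, n_G in units of 10^20 m^-3.
def greenwald_density(ip_ma: float, a_m: float) -> float:
    return ip_ma / (math.pi * a_m ** 2)

# Illustrative parameters (assumed, not from the abstract):
n_g = greenwald_density(0.2, 0.27)   # e.g. Ip = 200 kA, a = 0.27 m
print(round(n_g, 2))
```

"1.3 times the Greenwald limit" then means a line-averaged density 30% above this value.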

  5. The relation between herbivore density and relative resource ...

    African Journals Online (AJOL)

    The relation between kudu density and the relative density of habitat patches in each landscape was significant, with exponential models producing more significant statistics than linear models. Regressions of resource density against animal density are useful to understand 'carrying capacity' for wild herbivores, and ...

  6. Weighted A-Statistical Convergence for Sequences of Positive Linear Operators

    Directory of Open Access Journals (Sweden)

    S. A. Mohiuddine

    2014-01-01

    Full Text Available We introduce the notion of weighted A-statistical convergence of a sequence, where A represents the nonnegative regular matrix. We also prove the Korovkin approximation theorem by using the notion of weighted A-statistical convergence. Further, we give a rate of weighted A-statistical convergence and apply the classical Bernstein polynomial to construct an illustrative example in support of our result.

  7. Implication of nonintegral occupation number and Fermi-Dirac statistics in the local-spin-density approximation applied to finite systems

    International Nuclear Information System (INIS)

    Dhar, S.

    1989-01-01

    In electronic-structure calculations for finite systems using the local-spin-density (LSD) approximation, it is assumed that the eigenvalues of the Kohn-Sham equation should obey Fermi-Dirac (FD) statistics. In order to comply with this assumption for some of the transition-metal atoms, a nonintegral occupation number is used which also minimizes the total energy. It is shown here that for finite systems it is not necessary that the eigenvalues of the Kohn-Sham equation obey FD statistics. It is also shown that the Kohn-Sham exchange potential used in all LSD models is correct only for integer occupation number. With a noninteger occupation number the LSD exchange potential will be smaller than that given by the Kohn-Sham potential. Ab initio self-consistent spin-polarized calculations have been performed numerically for the total energy of an iron atom. It is found that the ground state belongs to the 3d 6 4s 2 configuration. The ionization potentials of all the Fe n+ ions are reported and are in agreement with experiment

  8. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    Science.gov (United States)

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after a stabilization process. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. There are some measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
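    The individual-process step of such an SPC analysis can be sketched with a standard individuals (I) chart; the times and 3-sigma limits below are illustrative, not the study's data:

```python
import statistics

def individuals_chart(times, sigma_mult=3.0):
    """Individuals control chart: estimate sigma from the average moving
    range (d2 = 1.128 for subgroups of size 2) and flag out-of-control points."""
    moving_ranges = [abs(b - a) for a, b in zip(times, times[1:])]
    sigma_hat = statistics.mean(moving_ranges) / 1.128
    center = statistics.mean(times)
    ucl = center + sigma_mult * sigma_hat
    lcl = center - sigma_mult * sigma_hat
    outliers = [i for i, t in enumerate(times) if not lcl <= t <= ucl]
    return center, lcl, ucl, outliers

# Example: hypothetical operative times (minutes) with one aberrant case
center, lcl, ucl, out = individuals_chart([50, 52, 49, 51, 50, 48, 51, 49, 50, 120])
```

    Points flagged this way would be excluded and the limits recomputed until the process is stable, in the spirit of the study's phase I analysis.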

  9. Statistical thermodynamics understanding the properties of macroscopic systems

    CERN Document Server

    Fai, Lukong Cornelius

    2012-01-01

    Basic Principles of Statistical Physics; Microscopic and Macroscopic Description of States; Basic Postulates; Gibbs Ergodic Assumption; Gibbsian Ensembles; Experimental Basis of Statistical Mechanics; Definition of Expectation Values; Ergodic Principle and Expectation Values; Properties of Distribution Function; Relative Fluctuation of an Additive Macroscopic Parameter; Liouville Theorem; Gibbs Microcanonical Ensemble; Microcanonical Distribution in Quantum Mechanics; Density Matrix; Density Matrix in Energy Representation; Entropy; Thermodynamic Functions; Temperature; Adiabatic Processes; Pressure; Thermodynamic Identity; Laws of Th

  10. Statistical aspects of nuclear structure

    International Nuclear Information System (INIS)

    Parikh, J.C.

    1977-01-01

    The statistical properties of energy levels and a statistical approach to transition strengths are discussed in relation to nuclear structure studies at high excitation energies. It is shown that the calculations can be extended to the ground state domain also. The discussion is based on the study of random matrix theory of level density and level spacings, using the Gaussian Orthogonal Ensemble (GOE) concept. The short range and long range correlations are also studied statistically. The polynomial expansion method is used to obtain excitation strengths. (A.K.)

  11. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  12. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  13. Educating America's Workforce: Summary of Key Operating Statistics. Data collected from the 2007 and 2008 Annual Institutional Reports

    Science.gov (United States)

    Accrediting Council for Independent Colleges and Schools, 2009

    2009-01-01

    This special edition of the Key Operating Statistics (KOS) contains information based on the 2007 and 2008 Annual Institutional Reports (AIR) submitted by ACICS-accredited institutions. The AIR is submitted on September 15 each year by ACICS-accredited institutions. It reflects activity during a reporting year that begins on July 1 and concludes…

  14. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  15. Density by moduli and Wijsman lacunary statistical convergence of sequences of sets

    Directory of Open Access Journals (Sweden)

    Vinod K Bhardwaj

    2017-01-01

    Full Text Available Abstract The main object of this paper is to introduce and study a new concept of f-Wijsman lacunary statistical convergence of sequences of sets, where f is an unbounded modulus. The definition of Wijsman lacunary strong convergence of sequences of sets is extended to a definition of Wijsman lacunary strong convergence with respect to a modulus for sequences of sets, and it is shown that, under certain conditions on a modulus f, the concepts of Wijsman lacunary strong convergence with respect to a modulus f and f-Wijsman lacunary statistical convergence are equivalent on bounded sequences. We further characterize those θ for which $\mathit{WS}_{\theta}^{f} = \mathit{WS}^{f}$, where $\mathit{WS}_{\theta}^{f}$ and $\mathit{WS}^{f}$ denote the sets of all f-Wijsman lacunary statistically convergent sequences and f-Wijsman statistically convergent sequences, respectively.

  16. Adjustments of the TaD electron density reconstruction model with GNSS-TEC parameters for operational application purposes

    Directory of Open Access Journals (Sweden)

    Belehaki Anna

    2012-12-01

    Full Text Available Validation results on the latest version of the TaD model (TaDv2) show realistic reconstruction of the electron density profiles (EDPs) with an average error of 3 TECU, similar to the error obtained from GNSS-TEC calculated parameters. The work presented here aims to further improve the accuracy of the TaD topside reconstruction by adjusting the TEC parameter calculated from the TaD model with the TEC parameter calculated by GNSS receivers transmitting RINEX files, co-located with the Digisondes. The performance of the new version is tested during a storm period, demonstrating further improvements with respect to the previous version. Statistical comparison of modeled and observed TEC confirms the validity of the proposed adjustment. A significant benefit of the proposed upgrade is that it facilitates the real-time implementation of TaD. The model needs a reliable measure of the scale height at the peak height, which is supposed to be provided by Digisondes. Often, the automatic scaling software fails to correctly calculate the scale height at the peak, Hm, due to interferences in the received signal. Consequently, the model-estimated topside scale height is wrongly calculated, leading to unrealistic results for the modeled EDP. The proposed TEC adjustment forces the model to correctly reproduce the topside scale height, despite the inaccurate values of Hm. This adjustment is very important for the application of TaD in an operational environment.

  17. Statistical decay of giant resonances

    International Nuclear Information System (INIS)

    Dias, H.; Teruya, N.; Wolynec, E.

    1986-01-01

    Statistical calculations to predict the neutron spectrum resulting from the decay of Giant Resonances are discussed. The dependence of the results on the optical potential parametrization and on the level density of the residual nucleus is assessed. A Hauser-Feshbach calculation is performed for the decay of the monopole giant resonance in 208Pb using the experimental levels of 207Pb from a recent compilation. The calculated statistical decay is in excellent agreement with recent experimental data, showing that the decay of this resonance is dominantly statistical, as predicted by continuum RPA calculations. (Author) [pt

  18. Statistical decay of giant resonances

    International Nuclear Information System (INIS)

    Dias, H.; Teruya, N.; Wolynec, E.

    1986-02-01

    Statistical calculations to predict the neutron spectrum resulting from the decay of Giant Resonances are discussed. The dependence of the results on the optical potential parametrization and on the level density of the residual nucleus is assessed. A Hauser-Feshbach calculation is performed for the decay of the monopole giant resonance in 208Pb using the experimental levels of 207Pb from a recent compilation. The calculated statistical decay is in excellent agreement with recent experimental data, showing that decay of this resonance is dominantly statistical, as predicted by continuum RPA calculations. (Author) [pt

  19. 517 DWELLING DENSITY VARIABILITY ACROSS GOVERNMENT ...

    African Journals Online (AJOL)

    Osondu

    confidence level, apartment type had no significant effect on dwelling density in ... words: dwelling density, home spaces, housing units, multifamily apartments ... spaces for work, Obateru (2005) defined .... of Statistics Year Book, 2008; Seeling et al., ... stress. The bedroom and habitable room indicators show similar trend.

  20. Statistics of Local Extremes

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose

    2003-01-01

    A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated...
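    A moment-based model of this kind starts from the lower four sample moments; a minimal sketch of their computation (the bandwidth parameter is not modeled here):

```python
import math

def lower_four_moments(x):
    """Sample mean, variance, skewness and (non-excess) kurtosis --
    the inputs a four-moment model of local extremes would require."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    sd = math.sqrt(var)
    skew = sum((v - mean) ** 3 for v in x) / (n * sd**3)
    kurt = sum((v - mean) ** 4 for v in x) / (n * var**2)
    return mean, var, skew, kurt
```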

  1. Optimizing refiner operation with statistical modelling

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, G [Noranda Research Centre, Pointe Claire, PQ (Canada)

    1997-02-01

    The impact of refining conditions on the energy efficiency of the process and on the handsheet quality of a chemi-mechanical pulp was studied as part of a series of pilot scale refining trials. Statistical models of refiner performance were constructed from these results, and non-linear optimization of process conditions was conducted. Optimization results indicated that increasing the ratio of specific energy applied in the first stage led to a reduction of some 15 per cent in the total energy requirement. The strategy can also be used to obtain significant increases in pulp quality for a given energy input. 20 refs., 6 tabs.

  2. Automated Material Accounting Statistics System at Rockwell Hanford Operations

    International Nuclear Information System (INIS)

    Eggers, R.F.; Giese, E.W.; Kodman, G.P.

    1986-01-01

    The Automated Material Accounting Statistics System (AMASS) was developed under the sponsorship of the U.S. Nuclear Regulatory Commission. The AMASS was developed when it was realized that classical methods of error propagation, based only on measured quantities, did not properly control the false alarm rate and that errors other than measurement errors affect inventory differences. The classical assumptions that (1) the mean value of the inventory difference (ID) for a particular nuclear material processing facility is zero, and (2) the variance of the inventory difference is due only to errors in measured quantities are overly simplistic. The AMASS provides a valuable statistical tool for estimating the true mean value and variance of the ID data produced by a particular material balance area. In addition, it provides statistical methods of testing both individual and cumulative sums of IDs, taking into account the estimated mean value and total observed variance of the ID.
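    Testing individual and cumulative sums of IDs against an estimated mean and variance can be sketched as a simple z-test; this is a hypothetical interface for illustration, not the AMASS implementation:

```python
import math

def cumulative_id_test(ids, mean_hat, var_hat, z_crit=1.96):
    """For each period, z-test the single inventory difference (ID) and the
    cumulative sum of IDs against an estimated per-period mean and variance.
    Returns a (single_flagged, cumulative_flagged) pair per period."""
    flags = []
    cum = 0.0
    for n, d in enumerate(ids, start=1):
        cum += d
        # Under independence, Var(sum of n IDs) = n * var_hat.
        z_single = (d - mean_hat) / math.sqrt(var_hat)
        z_cum = (cum - n * mean_hat) / math.sqrt(n * var_hat)
        flags.append((abs(z_single) > z_crit, abs(z_cum) > z_crit))
    return flags

# A persistent small bias trips the cumulative test before any single test
flags = cumulative_id_test([1.0] * 5, mean_hat=0.0, var_hat=1.0)
```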

  3. Statistical Modeling for the Effect of Rotor Speed, Yarn Twist and Linear Density on Production and Quality Characteristics of Rotor Spun Yarn

    Directory of Open Access Journals (Sweden)

    Farooq Ahmed Arain

    2012-01-01

    Full Text Available The aim of this study was to develop a statistical model for the effect of RS (Rotor Speed), YT (Yarn Twist) and YLD (Yarn Linear Density) on production and quality characteristics of rotor spun yarn. Cotton yarns of 30, 35 and 40 tex were produced on a rotor spinning machine at different rotor speeds (i.e. 70000, 80000, 90000 and 100000 rpm) and with different twist levels (i.e. 450, 500, 550, 600 and 700 tpm). Yarn production (g/hr) and quality characteristics were determined for all the experiments. Based on the results, models were developed using response surface regression in the MINITAB 16 statistical tool. The developed models not only characterize the intricate relationships among the factors but may also be used to predict the yarn production and quality characteristics at any level of factors within the range of experimental values.
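    A response-surface model of this kind can be sketched with ordinary least squares on synthetic data; the factor ranges follow the abstract, but the response coefficients below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic factor settings: rotor speed (krpm), twist (tpm), linear density (tex)
RS = rng.uniform(70, 100, 60)
YT = rng.uniform(450, 700, 60)
YLD = rng.uniform(30, 40, 60)

# Hypothetical true response surface with one interaction term (illustrative,
# not the paper's fitted model)
y = 5.0 + 0.8 * RS - 0.01 * YT + 0.3 * YLD + 0.002 * RS * YT

# Response-surface fit: linear terms plus the RS*YT interaction
X = np.column_stack([np.ones_like(RS), RS, YT, YLD, RS * YT])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

    On noiseless data the least-squares fit recovers the generating coefficients, so the fitted surface can be used to predict the response at any factor setting inside the experimental range.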

  4. HI column density distribution function at z=0 : Connection to damped Ly alpha statistics

    NARCIS (Netherlands)

    Zwaan, Martin; Verheijen, MAW; Briggs, FH

    We present a measurement of the HI column density distribution function f(N-HI) at the present epoch for column densities > 10(20) cm(-2). These high column densities compare to those measured in damped Ly alpha lines seen in absorption against background quasars. Although observationally rare, it

  5. Density dependence of the nuclear energy-density functional

    Science.gov (United States)

    Papakonstantinou, Panagiota; Park, Tae-Sun; Lim, Yeunhwan; Hyun, Chang Ho

    2018-01-01

    Background: The explicit density dependence in the coupling coefficients entering the nonrelativistic nuclear energy-density functional (EDF) is understood to encode effects of three-nucleon forces and dynamical correlations. The necessity for the density-dependent coupling coefficients to assume the form of a preferably small fractional power of the density ρ is empirical and the power is often chosen arbitrarily. Consequently, precision-oriented parametrizations risk overfitting in the regime of saturation and extrapolations in dilute or dense matter may lose predictive power. Purpose: Beginning with the observation that the Fermi momentum k_F, i.e., the cubic root of the density, is a key variable in the description of Fermi systems, we first wish to examine if a power hierarchy in a k_F expansion can be inferred from the properties of homogeneous matter in a domain of densities, which is relevant for nuclear structure and neutron stars. For subsequent applications we want to determine a functional that is of good quality but not overtrained. Method: For the EDF, we fit systematically polynomial and other functions of ρ^(1/3) to existing microscopic, variational calculations of the energy of symmetric and pure neutron matter (pseudodata) and analyze the behavior of the fits. We select a form and a set of parameters, which we found robust, and examine the parameters' naturalness and the quality of resulting extrapolations. Results: A statistical analysis confirms that low-order terms such as ρ^(1/3) and ρ^(2/3) are the most relevant ones in the nuclear EDF beyond lowest order. It also hints at a different power hierarchy for symmetric vs. pure neutron matter, supporting the need for more than one density-dependent term in nonrelativistic EDFs. The functional we propose easily accommodates known or adopted properties of nuclear matter near saturation. More importantly, upon extrapolation to dilute or asymmetric matter, it reproduces a range of existing microscopic
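    The fitting step, polynomials in ρ^(1/3) fitted by least squares to pseudodata, can be sketched as follows; the pseudodata coefficients are invented for illustration, not the microscopic variational results used in the paper:

```python
import numpy as np

# Pseudodata: energy per particle on a density grid (fm^-3); the expansion
# coefficients below are hypothetical, chosen only to exercise the fit.
rho = np.linspace(0.04, 0.32, 15)
c_true = np.array([-90.0, 180.0, -65.0])   # coefficients of rho^(1/3), rho^(2/3), rho
E = sum(c * rho ** ((i + 1) / 3) for i, c in enumerate(c_true))

# Least-squares fit of the low-order k_F (i.e. rho^(1/3)) hierarchy
A = np.column_stack([rho ** ((i + 1) / 3) for i in range(3)])
c_fit, *_ = np.linalg.lstsq(A, E, rcond=None)
```

    With real pseudodata carrying theoretical uncertainties, the same design matrix would feed a statistical analysis of which powers of ρ^(1/3) are actually constrained.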

  6. Density functional theory

    International Nuclear Information System (INIS)

    Das, M.P.

    1984-07-01

    The state of the art of the density functional formalism (DFT) is reviewed. The theory is quantum statistical in nature; its simplest version is the well-known Thomas-Fermi theory. The DFT is a powerful formalism in which one can treat the effect of interactions in inhomogeneous systems. After some introductory material, the DFT is outlined from the two basic theorems, and various generalizations of the theorems appropriate to several physical situations are pointed out. Next, various approximations to the density functionals are presented and some practical schemes, discussed; the approximations include an electron gas of almost constant density and an electron gas of slowly varying density. Then applications of DFT in various diverse areas of physics (atomic systems, plasmas, liquids, nuclear matter) are mentioned, and its strengths and weaknesses are pointed out. In conclusion, more recent developments of DFT are indicated

  7. Generalized t-statistic for two-group classification.

    Science.gov (United States)

    Komori, Osamu; Eguchi, Shinto; Copas, John B

    2015-06-01

    In the classic discriminant model of two multivariate normal distributions with equal variance matrices, the linear discriminant function is optimal both in terms of the log likelihood ratio and in terms of maximizing the standardized difference (the t-statistic) between the means of the two distributions. In a typical case-control study, normality may be sensible for the control sample but heterogeneity and uncertainty in diagnosis may suggest that a more flexible model is needed for the cases. We generalize the t-statistic approach by finding the linear function which maximizes a standardized difference but with data from one of the groups (the cases) filtered by a possibly nonlinear function U. We study conditions for consistency of the method and find the function U which is optimal in the sense of asymptotic efficiency. Optimality may also extend to other measures of discriminatory efficiency such as the area under the receiver operating characteristic curve. The optimal function U depends on a scalar probability density function which can be estimated non-parametrically using a standard numerical algorithm. A lasso-like version for variable selection is implemented by adding L1-regularization to the generalized t-statistic. Two microarray data sets in the study of asthma and various cancers are used as motivating examples. © 2014, The International Biometric Society.
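    With no case filtering (U the identity), the direction maximizing the standardized difference reduces to the classic linear discriminant w = S⁻¹(m₁ − m₀); a minimal sketch on synthetic two-group data (the data and separation are invented for illustration):

```python
import numpy as np

def max_t_direction(X0, X1):
    """Direction maximizing the standardized mean difference (t-statistic)
    between two samples; with U = identity this is the linear discriminant
    w = S^-1 (m1 - m0) with pooled covariance S."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S = np.cov(np.vstack([X0 - m0, X1 - m1]).T)   # pooled covariance estimate
    return np.linalg.solve(S, m1 - m0)

rng = np.random.default_rng(1)
X0 = rng.normal(0.0, 1.0, (200, 2))               # "controls"
X1 = rng.normal([1.5, 0.0], 1.0, (200, 2))        # "cases", separated in dim 0
w = max_t_direction(X0, X1)
```

    The generalized method of the paper would additionally pass the case sample through a nonlinear filter U before computing the standardized difference.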

  8. Central depression in nucleonic densities: Trend analysis in the nuclear density functional theory approach

    Science.gov (United States)

    Schuetrumpf, B.; Nazarewicz, W.; Reinhard, P.-G.

    2017-08-01

    Background: The central depression of nucleonic density, i.e., a reduction of density in the nuclear interior, has been attributed to many factors. For instance, bubble structures in superheavy nuclei are believed to be due to the electrostatic repulsion. In light nuclei, the mechanism behind the density reduction in the interior has been discussed in terms of shell effects associated with occupations of s orbits. Purpose: The main objective of this work is to reveal mechanisms behind the formation of central depression in nucleonic densities in light and heavy nuclei. To this end, we introduce several measures of the internal nucleonic density. Through the statistical analysis, we study the information content of these measures with respect to nuclear matter properties. Method: We apply nuclear density functional theory with Skyrme functionals. Using the statistical tools of linear least square regression, we inspect correlations between various measures of central depression and model parameters, including nuclear matter properties. We study bivariate correlations with selected quantities as well as multiple correlations with groups of parameters. Detailed correlation analysis is carried out for 34Si, for which a bubble structure has been reported recently, for 48Ca, and for the N = 82, 126, and 184 isotonic chains. Results: We show that the central depression in medium-mass nuclei is very sensitive to shell effects, whereas for superheavy systems it is firmly driven by the electrostatic repulsion. An appreciable semibubble structure in proton density is predicted for 294Og, which is currently the heaviest nucleus known experimentally. Conclusion: Our correlation analysis reveals that the central density indicators in nuclei below 208Pb carry little information on parameters of nuclear matter; they are predominantly driven by shell structure. On the other hand, in superheavy nuclei there exists a clear relationship between the central nucleonic density and the symmetry energy.

  9. Operation Statistics of the CERN Accelerators Complex for 2003

    CERN Document Server

    CERN. Geneva; Baird, S A; Rey, A; Steerenberg, R; CERN. Geneva. AB Department

    2004-01-01

    This report gives an overview of the performance of the different Accelerators (Linacs, PS Booster, PS, AD and SPS) of the CERN Accelerator Complex for 2003. It includes scheduled activities, beam availabilities, beam intensities and an analysis of faults and breakdowns by system and by beam. MORE INFORMATION by using the OP Statistics Tool: http://eLogbook.web.cern.ch/eLogbook/statistics.php and on the SPS HomePage: http://ab-div-op-sps.web.cern.ch/ab-div-op-sps/SPSss.html

  10. A statistical model to estimate the local vulnerability to severe weather

    Science.gov (United States)

    Pardowitz, Tobias

    2018-06-01

    We present a spatial analysis of weather-related fire brigade operations in Berlin. By comparing operation occurrences to insured losses for a set of severe weather events, we demonstrate the representativeness and usefulness of such data in the analysis of weather impacts on local scales. We investigate factors influencing the local rate of operation occurrence. While this rate depends on multiple factors - many of which are not available - we focus on publicly available quantities. These include topographic features, land use information based on satellite data and information on urban structure based on data from the OpenStreetMap project. After identifying suitable predictors such as housing coverage or local density of the road network, we set up a statistical model to predict the average occurrence frequency of local fire brigade operations. Such a model can be used to determine potential hotspots for weather impacts even in areas or cities where no systematic records are available and can thus serve as a basis for a broad range of tools or applications in emergency management and planning.
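    A count model of occurrence frequency against such predictors can be sketched as a log-linear (Poisson) regression; the predictor and coefficients below are hypothetical placeholders, not the study's fitted model:

```python
import numpy as np

def poisson_fit(X, y, iters=25):
    """Poisson regression E[y] = exp(X @ b), fitted by Newton-Raphson
    (equivalent to iteratively reweighted least squares)."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ b)
        grad = X.T @ (y - mu)                 # score vector
        H = X.T @ (mu[:, None] * X)           # Fisher information
        b = b + np.linalg.solve(H, grad)
    return b

rng = np.random.default_rng(2)
housing = rng.uniform(0, 1, 500)              # hypothetical housing-coverage fraction
X = np.column_stack([np.ones(500), housing])
y = rng.poisson(np.exp(0.2 + 1.0 * housing))  # simulated operation counts per cell
b = poisson_fit(X, y)
```

    The fitted rate exp(X @ b) can then be mapped over grid cells to highlight potential hotspots.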

  11. A unified treatment of dynamics and scattering in classical and quantum statistical mechanics

    International Nuclear Information System (INIS)

    Prugovecki, E.

    1978-01-01

    The common formal features of classical and quantum statistical mechanics are investigated at three separate levels: at the level of L^2 spaces of wave-packets on Γ-space, of Liouville spaces B^2 consisting of density operators constructed from such wave-packets, and of phase-space representation spaces P of Γ-distribution functions. It is shown that at the last level the formal similarities become so outstanding that all key quantities in P-space, such as Liouville operators, Hamiltonian functions, position and momentum observables, etc., are represented by expressions which to the zeroth order in ħ coincide in the classical and quantum case, and in some instances coincide completely. Scattering theory on the B^2 Liouville spaces takes on the same formal appearance for classical and quantum statistical mechanics, and to the zeroth order in ħ it coincides in both cases. This makes possible the formulation of a classical approximation to quantum scattering, and of a computational scheme for determining ρ^(out) from ρ^(in) for successive orders of ħ. (Auth.)

  12. Quantum theory with an energy operator defined as a quartic form of the momentum

    Energy Technology Data Exchange (ETDEWEB)

    Bezák, Viktor, E-mail: bezak@fmph.uniba.sk

    2016-09-15

    Quantum theory of the non-harmonic oscillator defined by the energy operator proposed by Yurke and Buks (2006) is presented. Although these authors considered a specific problem related to a model of transmission lines in a Kerr medium, our ambition is not to discuss the physical substantiation of their model. Instead, we consider the problem from an abstract, logically deductive, viewpoint. Using the Yurke–Buks energy operator, we focus attention on the imaginary-time propagator. We derive it as a functional of the Mehler kernel and, alternatively, as an exact series involving Hermite polynomials. For a statistical ensemble of identical oscillators defined by the Yurke–Buks energy operator, we calculate the partition function, average energy, free energy and entropy. Using the diagonal element of the canonical density matrix of this ensemble in the coordinate representation, we define a probability density, which appears to be a deformed Gaussian distribution. A peculiarity of this probability density is that it may reveal, when plotted as a function of the position variable, a shape with two peaks located symmetrically with respect to the central point.

  13. A statistical study of high coronal densities from X-ray line-ratios of Mg XI

    Science.gov (United States)

    Linford, G. A.; Lemen, J. R.; Strong, K. T.

    1991-01-01

    An X-ray line-ratio density diagnostic was applied to 50 Mg XI spectra of flaring active regions on the Sun recorded by the Flat Crystal Spectrometer on the SMM. The plasma density is derived from R, the flux ratio of the forbidden to intercombination lines of the He-like ion Mg XI. The R ratio for Mg XI is only density sensitive when the electron density exceeds a critical value (about 10^12 cm^-3), the low-density limit (LDL). This theoretical value of the low-density limit is uncertain as it depends on complex atomic theory. Reported coronal densities above 10^12 cm^-3 are uncommon. In this study, the distribution of R ratio values about the LDL is estimated and empirical values are derived for the 1st and 2nd moments of this distribution from the 50 Mg XI spectra. From these derived parameters, the percentage of observations indicating densities above this limit is derived.
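    The inversion of a density-sensitive line ratio can be sketched with the schematic two-level form R(n_e) = R0 / (1 + n_e/Nc); the values of R0 and Nc below are illustrative placeholders, not the Mg XI atomic data:

```python
def density_from_ratio(R, R0=2.7, Nc=6.2e12):
    """Invert a schematic density-sensitive ratio R(n_e) = R0 / (1 + n_e/Nc).
    R0 is the low-density limit of the ratio and Nc the critical density
    (cm^-3); both are hypothetical numbers for illustration."""
    if R >= R0:
        return 0.0          # at or below the low-density limit: no constraint
    return Nc * (R0 / R - 1.0)
```

    Ratios at the low-density limit yield no density constraint, while smaller measured ratios map monotonically to higher electron densities.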

  14. Beyond quantum microcanonical statistics

    International Nuclear Information System (INIS)

    Fresch, Barbara; Moro, Giorgio J.

    2011-01-01

    Descriptions of molecular systems usually refer to two distinct theoretical frameworks. On the one hand the quantum pure state, i.e., the wavefunction, of an isolated system is determined to calculate molecular properties and their time evolution according to the unitary Schroedinger equation. On the other hand a mixed state, i.e., a statistical density matrix, is the standard formalism to account for thermal equilibrium, as postulated in the microcanonical quantum statistics. In the present paper an alternative treatment relying on a statistical analysis of the possible wavefunctions of an isolated system is presented. In analogy with the classical ergodic theory, the time evolution of the wavefunction determines the probability distribution in the phase space pertaining to an isolated system. However, this alone cannot account for a well defined thermodynamical description of the system in the macroscopic limit, unless a suitable probability distribution for the quantum constants of motion is introduced. We present a workable formalism assuring the emergence of typical values of thermodynamic functions, such as the internal energy and the entropy, in the large size limit of the system. This allows the identification of macroscopic properties independently of the specific realization of the quantum state. A description of material systems in agreement with equilibrium thermodynamics is then derived without constraints on the physical constituents and interactions of the system. Furthermore, the canonical statistics is recovered in all generality for the reduced density matrix of a subsystem.

  15. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2004-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of high-sampled full-scale time series measurements...... are consistent, given the inevitable uncertainties associated with the model as well as with the extreme value data analysis. Keywords: statistical model, extreme wind conditions, statistical analysis, turbulence, wind loading, wind shear, wind turbines.

  16. Inference of missing data and chemical model parameters using experimental statistics

    Science.gov (United States)

    Casey, Tiernan; Najm, Habib

    2017-11-01

    A method for determining the joint parameter density of Arrhenius rate expressions through the inference of missing experimental data is presented. This approach proposes noisy hypothetical data sets from target experiments and accepts those which agree with the reported statistics, in the form of nominal parameter values and their associated uncertainties. The data exploration procedure is formalized using Bayesian inference, employing maximum entropy and approximate Bayesian computation methods to arrive at a joint density on data and parameters. The method is demonstrated in the context of reactions in the H2-O2 system for predictive modeling of combustion systems of interest. Work supported by the US DOE BES CSGB. Sandia National Labs is a multimission lab managed and operated by Nat. Technology and Eng'g Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell Intl, for the US DOE NCSA under contract DE-NA-0003525.
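    The accept/reject step of this data-exploration procedure can be sketched as rejection ABC: propose a parameter from the prior, simulate a hypothetical data set, and keep parameters whose summary statistic matches the reported one. The target and tolerance below are invented for illustration:

```python
import math
import random

random.seed(3)

def abc_rejection(reported_mean, reported_se, n_obs, prior, n_trials=20000, tol=0.05):
    """Rejection ABC: accept a proposed parameter if the mean of a simulated
    hypothetical data set falls within tol of the reported nominal value.
    Per-observation noise is scaled so the sample mean has sd = reported_se."""
    accepted = []
    for _ in range(n_trials):
        theta = prior()
        data = [random.gauss(theta, reported_se * math.sqrt(n_obs)) for _ in range(n_obs)]
        if abs(sum(data) / n_obs - reported_mean) < tol:
            accepted.append(theta)
    return accepted

# Reported statistic: mean 1.0 with standard error 0.1 from 25 observations
post = abc_rejection(1.0, 0.1, 25, prior=lambda: random.uniform(0.0, 2.0))
```

    The accepted values approximate the joint density on data and parameters marginalized to the parameter; in the paper the same idea is applied to Arrhenius rate parameters with maximum-entropy data proposals.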

  17. Statistical assessment of numerous Monte Carlo tallies

    International Nuclear Information System (INIS)

    Kiedrowski, Brian C.; Solomon, Clell J.

    2011-01-01

    Four tests are developed to assess the statistical reliability of collections of tallies that number in thousands or greater. To this end, the relative-variance density function is developed and its moments are studied using simplified, non-transport models. The statistical tests are performed upon the results of MCNP calculations of three different transport test problems and appear to show that the tests are appropriate indicators of global statistical quality. (author)

  18. High beta tokamak operation in DIII-D limited at low density/collisionality by resistive tearing modes

    International Nuclear Information System (INIS)

    La Haye, R.J.; Lao, L.L.; Strait, E.J.; Taylor, T.S.

    1997-01-01

    The maximum operational high beta in single-null divertor (SND) long pulse tokamak discharges in the DIII-D tokamak with a cross-sectional shape similar to the proposed International Thermonuclear Experimental Reactor (ITER) device is found to be limited by the onset of resistive instabilities that have the characteristics of neoclassically destabilized tearing modes. There is a soft limit due to the onset of an m/n=3/2 rotating tearing mode that saturates at low amplitude and a hard limit at slightly higher beta due to the onset of an m/n=2/1 rotating tearing mode that grows, slows down and locks. By operating at higher density and thus collisionality, the practical beta limit due to resistive tearing modes approaches the ideal magnetohydrodynamic (MHD) limit. (author). 15 refs, 4 figs

  19. Determination of absolute Ba densities during dimming operation of fluorescent lamps by laser-induced fluorescence measurements

    International Nuclear Information System (INIS)

    Hadrath, S; Beck, M; Garner, R C; Lieder, G; Ehlbeck, J

    2007-01-01

    Investigations of fluorescent lamps (FLs) are often focused on the electrodes, since the lifetime of the lamps is typically limited by electrode lifetime and durability. During steady-state operation, the work-function-lowering emitter material, in particular barium, is lost. Greater barium losses occur under dimming conditions, in which reduced discharge currents lead to increased cathode falls, the result of the otherwise diminished heating of the electrode by the bombarding plasma ions. In this work the barium density near the electrodes of FLs operating in high-frequency dimming mode is investigated using the high-sensitivity method of laser-induced fluorescence. From these measurements we infer barium loss for a range of discharge currents and auxiliary coil heating currents. We show that the Ba loss can very easily be reduced by moderate auxiliary coil heating.

  20. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. These include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau's Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  1. Referenceless Prediction of Perceptual Fog Density and Perceptual Image Defogging.

    Science.gov (United States)

    Choi, Lark Kwon; You, Jaehee; Bovik, Alan Conrad

    2015-11-01

    We propose a referenceless perceptual fog density prediction model based on natural scene statistics (NSS) and fog aware statistical features. The proposed model, called the Fog Aware Density Evaluator (FADE), predicts the visibility of a foggy scene from a single image without reference to a corresponding fog-free image, without dependence on salient objects in the scene, without side geographical camera information, without estimating a depth-dependent transmission map, and without training on human-rated judgments. FADE only makes use of measurable deviations from statistical regularities observed in natural foggy and fog-free images. The fog aware statistical features that define the perceptual fog density index derive from a space domain NSS model and the observed characteristics of foggy images. FADE not only predicts perceptual fog density for the entire image, but also provides a local fog density index for each patch. The fog density predicted by FADE correlates well with human judgments of fog density obtained in a subjective study on a large foggy image database. As applications, FADE not only accurately assesses the performance of defogging algorithms designed to enhance the visibility of foggy images, but is also well suited for image defogging. A new FADE-based referenceless perceptual image defogging method, dubbed DEnsity of Fog Assessment-based DEfogger (DEFADE), achieves better results for darker, denser foggy images as well as on standard foggy images than state-of-the-art defogging methods. A software release of FADE and DEFADE is available online for public use: http://live.ece.utexas.edu/research/fog/index.html.

  2. Variable density management in riparian reserves: lessons learned from an operational study in managed forests of western Oregon, USA.

    Science.gov (United States)

    Samuel Chan; Paul Anderson; John Cissel; Larry Lateen; Charley Thompson

    2004-01-01

    A large-scale operational study has been undertaken to investigate variable density management in conjunction with riparian buffers as a means to accelerate development of late-seral habitat, facilitate rare species management, and maintain riparian functions in 40-70 year-old headwater forests in western Oregon, USA. Upland variable retention treatments include...

  3. Statistical Design of an Adaptive Synthetic X̄ Control Chart with Run Rule on Service and Management Operation

    Directory of Open Access Journals (Sweden)

    Shucheng Yu

    2016-01-01

    An improved synthetic X̄ control chart based on a hybrid adaptive scheme and a run rule scheme is introduced to enhance the statistical performance of the traditional synthetic X̄ control chart for service and management operations. The proposed hybrid adaptive scheme considers both variable sampling interval and variable sample size. The properties of the proposed chart are obtained using a Markov chain approach. An extensive set of numerical results is presented to test the effectiveness of the proposed model in detecting small and moderate shifts in the process mean. The results show that the proposed chart detects small and moderate shifts in the process more quickly than the standard synthetic X̄ chart and the CUSUM chart.

  4. Mammography density estimation with automated volumetric breast density measurement

    International Nuclear Information System (INIS)

    Ko, Su Yeon; Kim, Eun Kyung; Kim, Min Jung; Moon, Hee Jung

    2014-01-01

    To compare automated volumetric breast density measurement (VBDM) with radiologists' evaluations based on the Breast Imaging Reporting and Data System (BI-RADS), and to identify the factors associated with technical failure of VBDM. In this study, 1129 women aged 19-82 years who underwent mammography from December 2011 to January 2012 were included. Breast density evaluations by radiologists based on BI-RADS and by VBDM (Volpara Version 1.5.1) were compared. The agreement in interpreting breast density between radiologists and VBDM was determined based on four density grades (D1, D2, D3, and D4) and a binary classification of fatty (D1-2) vs. dense (D3-4) breast using kappa statistics. The association between technical failure of VBDM and patient age, total breast volume, fibroglandular tissue volume, history of partial mastectomy, the frequency of mass > 3 cm, and breast density was analyzed. The agreement between breast density evaluations by radiologists and VBDM was fair (k value = 0.26) when the four density grades (D1/D2/D3/D4) were used and moderate (k value = 0.47) for the binary classification (D1-2/D3-4). Twenty-seven women (2.4%) showed failure of VBDM. Small total breast volume, history of partial mastectomy, and high breast density were significantly associated with technical failure of VBDM (p = 0.001 to 0.015). There is fair to moderate agreement in breast density evaluation between radiologists and VBDM. Technical failure of VBDM may be related to small total breast volume, a history of partial mastectomy, and high breast density.
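The kappa agreement reported in this record can be illustrated with a short sketch. The grade labels below are invented for demonstration only (they are not the study's data); the collapse from four BI-RADS grades to the binary fatty/dense split follows the record's D1-2 vs. D3-4 convention.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # observed agreement
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # expected agreement from the marginal label frequencies
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_exp = sum(ca[k] * cb.get(k, 0) for k in ca) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical density grades: radiologist vs. automated VBDM
radiologist = ["D1", "D2", "D2", "D3", "D4", "D3", "D1", "D4"]
vbdm        = ["D1", "D3", "D2", "D3", "D3", "D4", "D2", "D4"]

kappa_4grade = cohen_kappa(radiologist, vbdm)

# Collapse to the binary fatty (D1-2) vs. dense (D3-4) classification
to_binary = lambda g: "fatty" if g in ("D1", "D2") else "dense"
kappa_binary = cohen_kappa([to_binary(g) for g in radiologist],
                           [to_binary(g) for g in vbdm])
```

As in the study, the binary collapse typically yields a higher kappa than the four-grade comparison, because near-miss grade disagreements (e.g. D3 vs. D4) become agreements.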

  5. Statistics of polarization speckle: theory versus experiment

    DEFF Research Database (Denmark)

    Wang, Wei; Hanson, Steen Grüner; Takeda, Mitsuo

    2010-01-01

    In this paper, we review our recent work on the statistical properties of polarization speckle, described by stochastic Stokes parameters fluctuating in space. Based on the Gaussian assumption for the random electric field components and a polar interferometer, we investigated theoretically and experimentally the statistics of the Stokes parameters of polarization speckle, including the probability density function of the Stokes parameters together with the spatial degree of polarization, the autocorrelation of the Stokes vector, and the statistics of spatial derivatives of the Stokes parameters.

  6. What Exactly is a Parton Density?

    International Nuclear Information System (INIS)

    Collins, J.C.

    2003-01-01

    I give an account of the definitions of parton densities, both the conventional ones, integrated over parton transverse momentum, and unintegrated transverse-momentum-dependent densities. The aim is to get a precise and correct definition of a parton density as the target expectation value of a suitable quantum mechanical operator, so that a clear connection to non-perturbative QCD is provided. Starting from the intuitive ideas in the parton model that predate QCD, we will see how the simplest operator definitions suffer from divergences. Corrections to the definition are needed to eliminate the divergences. An improved definition of unintegrated parton densities is proposed. (author)

  7. Ocean Drilling Program: Web Site Access Statistics

    Science.gov (United States)

    ODP/TAMU Science Operator: Ocean Drilling Program web site access statistics. Overview pages report statistics for JOIDES members and for the Janus database, archived monthly beginning October 1997.

  8. ITER Experts' meeting on density limits

    International Nuclear Information System (INIS)

    Borrass, K.; Igitkhanov, Y.L.; Uckan, N.A.

    1989-12-01

    The necessity of achieving a prescribed wall load or fusion power essentially determines the plasma pressure in a device like ITER. The range of operation densities and temperatures compatible with this condition is constrained by the problems of power exhaust and the disruptive density limit. The maximum allowable heat loads on the divertor plates and the maximum allowable sheath edge temperature practically impose a lower limit on the operating densities, whereas the disruptive density limit imposes an upper limit. For most of the density limit scalings proposed in the past an overlap of the two constraints or at best a very narrow accessible density range is predicted for ITER. Improved understanding of the underlying mechanisms is therefore a crucial issue in order to provide a more reliable basis for extrapolation to ITER and to identify possible ways of alleviating the problem

  9. High density internal transport barriers for burning plasma operation

    Energy Technology Data Exchange (ETDEWEB)

    Ridolfini, V Pericoli [Associazione EURATOM-ENEA sulla Fusione, CR Frascati, Rome (Italy); Barbato, E [Associazione EURATOM-ENEA sulla Fusione, CR Frascati, Rome (Italy); Buratti, P [Associazione EURATOM-ENEA sulla Fusione, CR Frascati, Rome (Italy)] (and others)

    2005-12-15

    A tokamak plasma with internal transport barriers (ITBs) is the best candidate for a steady ITER operation, since the high energy confinement allows working at plasma currents (I{sub p}) lower than the reference scenario. To build and sustain an ITB at the ITER high density ({>=}10{sup 20} m{sup -3}) and largely dominant electron (e{sup -}) heating is not trivial in most existing tokamaks. FTU can instead meet both requests, thanks to its radiofrequency heating systems, lower hybrid (LH, up to 1.9 MW) and electron cyclotron (EC up to 1.2 MW). By the combined use of them, ITBs are obtained up to peak densities n{sub e0} > 1.3 x 10{sup 20} m{sup -3}, with central e{sup -} temperatures T{sub e0} {approx} 5.5 keV, and are sustained for as long as the heating pulse is applied (>35 confinement times, {tau}{sub E}). At n{sub e0} {approx} 0.8 x 10{sup 20} m{sup -3} T{sub e0} can be larger than 11 keV. Almost full current drive (CD) and an overall good steadiness is attained within about one {tau}{sub E}, 20 times faster than the ohmic current relaxation time. The ITB extends over a central region with an almost flat or slightly reversed q profile and q{sub min} {approx} 1.3 that is fully sustained by off-axis lower hybrid current drive. Consequent to this is the beneficial good alignment of the bootstrap current, generated by the ITB large pressure gradients, with the LH driven current. Reflectometry shows a clear change in the turbulence close to the ITB radius, consistent with the reduced e{sup -} transport. Ions (i{sup +}) are significantly heated via collisions, but thermal equilibrium with electrons cannot be attained since the e{sup -}-i{sup +} equipartition time is always 4-5 times longer than {tau}{sub E}. No degradation of the overall ion transport, rather a reduction of the i{sup +} heat diffusivity, is observed inside the ITB. The global confinement has been improved up to 1.6 times over the scaling predictions. 
The ITB radius can be controlled by adjusting the

  10. Epidemiological reference ranges for low-density lipoprotein ...

    African Journals Online (AJOL)

    Although there is widespread acceptance that total cholesterol (TC) value reference ranges should be based on epidemiological rather than statistical considerations, the epidemiological action limits for low-density lipoprotein cholesterol (LDL-C) are still incomplete and only statistical reference ranges for apolipoprotein B ...

  11. The Abdominal Aortic Aneurysm Statistically Corrected Operative Risk Evaluation (AAA SCORE) for predicting mortality after open and endovascular interventions.

    Science.gov (United States)

    Ambler, Graeme K; Gohel, Manjit S; Mitchell, David C; Loftus, Ian M; Boyle, Jonathan R

    2015-01-01

    Accurate adjustment of surgical outcome data for risk is vital in an era of surgeon-level reporting. Current risk prediction models for abdominal aortic aneurysm (AAA) repair are suboptimal. We aimed to develop a reliable risk model for in-hospital mortality after intervention for AAA, using rigorous contemporary statistical techniques to handle missing data. Using data collected during a 15-month period in the United Kingdom National Vascular Database, we applied multiple imputation methodology together with stepwise model selection to generate preoperative and perioperative models of in-hospital mortality after AAA repair, using two thirds of the available data. Model performance was then assessed on the remaining third of the data by receiver operating characteristic curve analysis and compared with existing risk prediction models. Model calibration was assessed by Hosmer-Lemeshow analysis. A total of 8088 AAA repair operations were recorded in the National Vascular Database during the study period, of which 5870 (72.6%) were elective procedures. Both preoperative and perioperative models showed excellent discrimination, with areas under the receiver operating characteristic curve of .89 and .92, respectively. This was significantly better than any of the existing models (area under the receiver operating characteristic curve for best comparator model, .84 and .88; P < .001). We have developed and validated preoperative and perioperative risk models for AAA repair. These models were carefully developed with rigorous statistical methodology and significantly outperform existing methods for both elective cases and overall AAA mortality. These models will be invaluable for both preoperative patient counseling and accurate risk adjustment of published outcome data. Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
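Model discrimination in this record is summarized by the area under the receiver operating characteristic curve. A minimal sketch of the rank-based AUC computation is shown below; the risk scores and outcomes are invented for illustration and are not National Vascular Database data.

```python
def roc_auc(scores, outcomes):
    """AUC as P(random positive case scores higher than a random
    negative case), counting ties as half; equivalent to the
    normalized Mann-Whitney U statistic."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted mortality risks and observed in-hospital deaths
risk  = [0.9, 0.6, 0.5, 0.2, 0.8, 0.1]
death = [1,   0,   1,   0,   1,   0]
auc = roc_auc(risk, death)  # 1.0 would indicate perfect discrimination
```

An AUC near .5 means the model is no better than chance; the record's .89 and .92 indicate that the fitted models rank patients who died above survivors in roughly nine of ten positive/negative pairs.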

  12. The temporal variability of species densities

    International Nuclear Information System (INIS)

    Redfearn, A.; Pimm, S.L.

    1993-01-01

    Ecologists use the term 'stability' to mean a number of different things (Pimm 1984a). One use is to equate stability with low variability in population density over time (henceforth, temporal variability). Temporal variability varies greatly from species to species, so what affects it? There are at least three sets of factors: the variability of extrinsic abiotic factors, food web structure, and the intrinsic features of the species themselves. We can measure temporal variability using at least three statistics: the coefficient of variation of density (CV); the standard deviation of the logarithms of density (SDL); and the variance in the differences between logarithms of density for pairs of consecutive years (called annual variability, hence AV, by Wolda 1978). There are advantages and disadvantages to each measure (Williamson 1984), though in our experience the measures are strongly correlated across sets of taxonomically related species. The increasing availability of long-term data sets allows one to calculate these statistics for many species and so to begin to understand the various causes of species differences in temporal variability.

  13. Density functional theory calculations of H/D isotope effects on polymer electrolyte membrane fuel cell operations

    Energy Technology Data Exchange (ETDEWEB)

    Yanase, Satoshi; Oi, Takao [Sophia Univ., Tokyo (Japan). Faculty of Science and Technology

    2015-10-01

    To elucidate hydrogen isotope effects observed between fuel and exhaust hydrogen gases during polymer electrolyte membrane fuel cell operations, H-to-D reduced partition function ratios (RPFRs) for the hydrogen species in the Pt catalyst phase of the anode and the electrolyte membrane phase of the fuel cell were evaluated by density functional theory calculations on model species of the two phases. The evaluation yielded 3.2365 as the value of the equilibrium constant of the hydrogen isotope exchange reaction between the two phases at 39 C, which was close to the experimentally estimated value of 3.46-3.99 at the same temperature. It was indicated that H{sup +} ions on the Pt catalyst surface of the anode and H species in the electrolyte membrane phase were isotopically in equilibrium with one another during fuel cell operations.

  14. Compatibility of advanced tokamak plasma with high density and high radiation loss operation in JT-60U

    International Nuclear Information System (INIS)

    Takenaga, H.; Asakura, N.; Kubo, H.; Higashijima, S.; Konoshima, S.; Nakano, T.; Oyama, N.; Ide, S.; Fujita, T.; Takizuka, T.; Kamada, Y.; Miura, Y.; Porter, G.D.; Rognlien, T.D.; Rensink, M.E.

    2005-01-01

    Compatibility of advanced tokamak plasmas with high density and high radiation loss has been investigated in both reversed shear (RS) plasmas and high β_p H-mode plasmas with a weak positive shear on JT-60U. In the RS plasmas, the operation regime is extended to high density above the Greenwald density (n_GW) with high confinement (HH_y2 > 1) and high radiation loss fraction (f_rad > 0.9) by tailoring the internal transport barriers (ITBs). High confinement of HH_y2 = 1.2 is sustained even with 80% radiation from the main plasma enhanced by accumulated metal impurity. The divertor radiation is enhanced by Ne seeding and the ratio of the divertor radiation to the total radiation is increased from 20% without seeding to 40% with Ne seeding. In the high β_p H-mode plasmas, high confinement (HH_y2 = 0.96) is maintained at high density (n̄_e/n_GW = 0.92) with high radiation loss fraction (f_rad ∼ 1) by utilizing high-field-side pellets and Ar injection. The high n̄_e/n_GW is obtained due to the formation of a clear density ITB. Strong core-edge parameter linkage is observed, as well as without Ar injection. In this linkage, the pedestal β_p, defined as β_p^ped = p_ped/(B_p²/2μ_0) where p_ped is the plasma pressure at the pedestal top, is enhanced with the total β_p. The radiation profile in the main plasma is peaked due to Ar accumulation inside the ITB and the measured central radiation is ascribed to Ar. The impurity transport analyses indicate that Ar accumulation by a factor of 2 more than the electron, as observed in the high β_p H-mode plasma, is acceptable even with a peaked density profile in a fusion reactor with impurity seeding. (author)

  15. Experimental study on working characteristics of density lock

    International Nuclear Information System (INIS)

    Sun Furong; Yan Changqi; Gu Haifeng

    2011-01-01

    The working principle of the density lock is introduced in this paper, and an experimental loop was built to study the working performance of the density lock in the system under steady-state operation and pump-trip conditions. The results show that under steady-state operating conditions the density lock remains closed over long-term operation, separating the passive residual heat removal circuit from the primary circuit. As a result, the passive residual heat removal circuit remains in a non-operating state, which does not influence the normal operation of the reactor. Under pump-trip conditions, the density lock opens automatically and quickly, connecting the primary circuit and the passive residual heat removal system. Natural circulation is well established between the two systems and is sufficient to ensure removal of the residual heat. (authors)

  16. Density regulation in Northeast Atlantic fish populations: Density dependence is stronger in recruitment than in somatic growth.

    Science.gov (United States)

    Zimmermann, Fabian; Ricard, Daniel; Heino, Mikko

    2018-05-01

    Population regulation is a central concept in ecology, yet in many cases its presence and the underlying mechanisms are difficult to demonstrate. The current paradigm maintains that marine fish populations are predominantly regulated by density-dependent recruitment. While it is known that density-dependent somatic growth can be present too, its general importance remains unknown and most practical applications neglect it. This study aimed to close this gap by for the first time quantifying and comparing density dependence in growth and recruitment over a large set of fish populations. We fitted density-dependent models to time-series data on population size, recruitment and age-specific weight from commercially exploited fish populations in the Northeast Atlantic Ocean and the Baltic Sea. Data were standardized to enable a direct comparison within and among populations, and estimated parameters were used to quantify the impact of density regulation on population biomass. Statistically significant density dependence in recruitment was detected in a large proportion of populations (70%), whereas for density dependence in somatic growth the prevalence of density dependence depended heavily on the method (26% and 69%). Despite age-dependent variability, the density dependence in recruitment was consistently stronger among age groups and between alternative approaches that use weight-at-age or weight increments to assess growth. Estimates of density-dependent reduction in biomass underlined these results: 97% of populations with statistically significant parameters for growth and recruitment showed a larger impact of density-dependent recruitment on population biomass. The results reaffirm the importance of density-dependent recruitment in marine fishes, yet they also show that density dependence in somatic growth is not uncommon. Furthermore, the results are important from an applied perspective because density dependence in somatic growth affects productivity and
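The record above fits density-dependent models to recruitment time series without specifying them in detail here; as a generic illustration of how density dependence in recruitment is quantified, the sketch below linearizes a Ricker stock-recruitment model, ln(R/S) = a − bS, and estimates the density-dependent term b by ordinary least squares. The parameter values and data are invented, not the paper's.

```python
import math

def fit_ricker(S, R):
    """OLS fit of the linearized Ricker model ln(R/S) = a - b*S."""
    y = [math.log(r / s) for s, r in zip(S, R)]
    n, sx, sy = len(S), sum(S), sum(y)
    sxx = sum(s * s for s in S)
    sxy = sum(s * v for s, v in zip(S, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = -slope                 # strength of density dependence
    a = (sy - slope * sx) / n  # density-independent productivity
    return a, b

# Synthetic spawner biomass S and recruitment R from known a=1.2, b=0.004
S = [50.0, 100.0, 200.0, 400.0, 800.0]
R = [s * math.exp(1.2 - 0.004 * s) for s in S]
a_hat, b_hat = fit_ricker(S, R)  # recovers a and b from noise-free data
```

With real survey data the residual scatter around the fitted line, and the statistical significance of b, would indicate how strongly recruitment is density regulated.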

  17. Probability density function shape sensitivity in the statistical modeling of turbulent particle dispersion

    Science.gov (United States)

    Litchford, Ron J.; Jeng, San-Mou

    1992-01-01

    The performance of a recently introduced statistical transport model for turbulent particle dispersion is studied here for rigid particles injected into a round turbulent jet. Both uniform and isosceles triangle pdfs are used. The statistical sensitivity to parcel pdf shape is demonstrated.

  18. Cluster observations of near-Earth magnetospheric lobe plasma densities – a statistical study

    Directory of Open Access Journals (Sweden)

    K. R. Svenes

    2008-09-01

    The Cluster mission has enabled a study of the near-Earth magnetospheric lobes throughout the waning part of solar cycle 23. During the first seven years of the mission the satellites crossed this region of space regularly from about July to October. We have obtained new and more accurate plasma densities in this region based on spacecraft potential measurements from the EFW instrument. The plasma densities are found by converting the potential measurements using a functional relationship between these two parameters. Our observations show that throughout this period a full two thirds of the measurements were contained in the range 0.007–0.092 cm−3 irrespective of solar wind conditions or geomagnetic activity. In fact, the most probable density encountered was 0.047 cm−3, staying roughly constant throughout the entire observation period. The plasma population in this region seems to reflect an equilibrium situation in which the density is independent of the solar wind conditions or geomagnetic activity. However, the high-density tail of the population (n_e > 0.2 cm−3) seemed to decrease with the waning solar cycle. This points to a source region influenced by the diminishing solar UV/EUV intensity. Noting that the quiet-time polar wind has just such a development and that it is magnetically coupled to the lobes, it seems likely that this is a prominent source for the lobe plasma.

  19. Statistics of strain rates and surface density function in a flame-resolved high-fidelity simulation of a turbulent premixed bluff body burner

    Science.gov (United States)

    Sandeep, Anurag; Proch, Fabian; Kempf, Andreas M.; Chakraborty, Nilanjan

    2018-06-01

    The statistical behavior of the surface density function (SDF, the magnitude of the reaction progress variable gradient) and of the strain rates which govern the evolution of the SDF has been analyzed using a three-dimensional flame-resolved simulation database of a turbulent lean premixed methane-air flame in a bluff-body configuration. It has been found that the turbulence intensity increases with the distance from the burner, changing the flame curvature distribution and increasing the probability of negative curvature in the downstream direction. The curvature dependences of the dilatation rate ∇·u and the displacement speed S_d give rise to variations of these quantities in the axial direction. These variations affect the nature of the alignment between the progress variable gradient and the local principal strain rates, which in turn affects the mean flame normal strain rate; this assumes positive values close to the burner but becomes increasingly negative as the effect of turbulence grows with axial distance from the burner exit. The axial distance dependences of the curvature and displacement speed also induce a considerable variation in the mean value of the curvature stretch. The axial distance dependences of the dilatation rate and flame normal strain rate govern the behavior of the flame tangential strain rate, whose mean value increases in the downstream direction. The current analysis indicates that the statistical behaviors of the different strain rates and the displacement speed, and their curvature dependences, need to be included in the modeling of flame surface density and scalar dissipation rate in order to accurately capture their local behaviors.

  20. Statistical Description of Segregation in a Powder Mixture

    DEFF Research Database (Denmark)

    Chapiro, Alexander; Stenby, Erling Halfdan

    1996-01-01

    In this paper we apply the statistical mechanics of powders to describe a segregated state in a mixture of grains of different sizes. Variation of the density of a packing with depth arising due to changes of particle configurations is studied. The statistical mechanics of powders is generalized...

  1. Damage detection of engine bladed-disks using multivariate statistical analysis

    Science.gov (United States)

    Fang, X.; Tang, J.

    2006-03-01

    The timely detection of damage in aero-engine bladed-disks is an extremely important and challenging research topic. Bladed-disks have high modal density and, in particular, their vibration responses are subject to significant uncertainties due to manufacturing tolerance (blade-to-blade difference or mistuning), operating condition changes and sensor noise. In this study, we present a new methodology for the on-line damage detection of engine bladed-disks using their vibratory responses during spin-up or spin-down operations, which can be measured by the blade-tip-timing sensing technique. We apply a principal component analysis (PCA)-based approach for data compression, feature extraction, and denoising. The non-model-based damage detection is achieved by analyzing the change between response features of the healthy structure and of the damaged one. We facilitate such comparison by incorporating Hotelling's T² statistical analysis, which yields damage declaration with a given confidence level. The effectiveness of the method is demonstrated by case studies.
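The PCA-plus-Hotelling's-T² step described in this record can be sketched as follows. The feature vectors here are synthetic random data standing in for measured blade responses; the dimensions, the number of retained components, and the shift magnitude are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
# Baseline ("healthy") vibratory response features: n samples x p features
X = rng.normal(size=(200, 5))
mu = X.mean(axis=0)
Xc = X - mu

# PCA via SVD of the centered data; retain k principal components
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
pc_var = s[:k] ** 2 / (len(X) - 1)  # baseline variance along each PC

def hotelling_t2(x):
    """T^2 of a new observation in the retained PC subspace:
    sum of squared PC scores, each scaled by its baseline variance."""
    z = (x - mu) @ Vt[:k].T
    return float(np.sum(z ** 2 / pc_var))

# A sample at the baseline mean gives T^2 = 0; a sample shifted along
# the first PC (a stand-in for a "damaged" response) scores much higher
# and would be flagged against a confidence threshold, e.g. one derived
# from the F-distribution.
t2_healthy = hotelling_t2(mu)
t2_damaged = hotelling_t2(mu + 5.0 * Vt[0])
```

In practice the threshold is set from the baseline data at a chosen confidence level, which is what gives the "damage declaration with a given confidence level" mentioned in the abstract.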

  2. Multivariate density estimation theory, practice, and visualization

    CERN Document Server

    Scott, David W

    2015-01-01

    David W. Scott, PhD, is Noah Harding Professor in the Department of Statistics at Rice University. The author of over 100 published articles, papers, and book chapters, Dr. Scott is also Fellow of the American Statistical Association (ASA) and the Institute of Mathematical Statistics. He is recipient of the ASA Founder's Award and the Army Wilks Award. His research interests include computational statistics, data visualization, and density estimation. Dr. Scott is also Coeditor of Wiley Interdisciplinary Reviews: Computational Statistics and previous Editor of the Journal of Computational and

  3. A real-space stochastic density matrix approach for density functional electronic structure.

    Science.gov (United States)

    Beck, Thomas L

    2015-12-21

    The recent development of real-space grid methods has led to more efficient, accurate, and adaptable approaches for large-scale electrostatics and density functional electronic structure modeling. With the incorporation of multiscale techniques, linear-scaling real-space solvers are possible for density functional problems if localized orbitals are used to represent the Kohn-Sham energy functional. These methods still suffer from high computational and storage overheads, however, due to extensive matrix operations related to the underlying wave function grid representation. In this paper, an alternative stochastic method is outlined that aims to solve directly for the one-electron density matrix in real space. In order to illustrate aspects of the method, model calculations are performed for simple one-dimensional problems that display some features of the more general problem, such as spatial nodes in the density matrix. This orbital-free approach may prove helpful considering a future involving increasingly parallel computing architectures. Its primary advantage is the near-locality of the random walks, allowing for simultaneous updates of the density matrix in different regions of space partitioned across the processors. In addition, it allows for testing and enforcement of the particle number and idempotency constraints through stabilization of a Feynman-Kac functional integral as opposed to the extensive matrix operations in traditional approaches.

  4. A statistical model for horizontal mass flux of erodible soil

    International Nuclear Information System (INIS)

    Babiker, A.G.A.G.; Eltayeb, I.A.; Hassan, M.H.A.

    1986-11-01

    It is shown that the mass flux of erodible soil transported horizontally by a statistically distributed wind flow itself follows a statistical distribution. An explicit expression for the probability density function, p.d.f., of the flux is derived for the case in which the wind speed has a Weibull distribution. The statistical distribution of a mass flux characterized by a generalized Bagnold formula is found to be Weibull in the case of zero threshold speed. Analytic and numerical values for the average horizontal mass flux of soil are obtained for various values of the wind parameters by evaluating the first moment of the flux density function. (author)
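
    The first-moment computation described above can be sketched numerically. The snippet below is an illustrative reconstruction, not the authors' code: it assumes a generalized Bagnold-type flux q(u) = C·u²·(u − u_t) above a threshold speed u_t and a Weibull wind-speed p.d.f., and evaluates the average flux by quadrature. For zero threshold the cubic law admits the closed form E[q] = C·c³·Γ(1 + 3/k), which the example uses as a check.

```python
import math

def weibull_pdf(u, k, c):
    """Weibull wind-speed density with shape k and scale c."""
    return (k / c) * (u / c) ** (k - 1) * math.exp(-((u / c) ** k))

def bagnold_flux(u, u_t, C=1.0):
    """Generalized Bagnold-type horizontal mass flux (assumed form)."""
    return C * u * u * (u - u_t) if u > u_t else 0.0

def mean_flux(k, c, u_t, C=1.0, n=20000, u_max=60.0):
    """First moment of the flux density: trapezoidal quadrature of q(u) p(u)."""
    du = u_max / n
    total = 0.0
    for i in range(n + 1):
        u = i * du
        w = 0.5 if i in (0, n) else 1.0
        total += w * bagnold_flux(u, u_t, C) * weibull_pdf(u, k, c)
    return total * du

# Zero-threshold check against the closed form C c^3 Gamma(1 + 3/k)
exact = 8.0 ** 3 * math.gamma(1.0 + 3.0 / 2.0)
approx = mean_flux(k=2.0, c=8.0, u_t=0.0)
```

    A nonzero threshold speed reduces the mean flux, consistent with the Weibull form of the flux distribution holding only in the zero-threshold limit.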

  5. Fast response densitometer for measuring liquid density

    Science.gov (United States)

    1972-01-01

    Densitometer was developed which produces linear voltage proportional to changes in density of flowing liquid hydrogen. Unit has fast response time and good system stability, statistical variation, and thermal equilibrium. System accuracy is 2 percent of total density span. Basic design may be altered to include measurement of other flowing materials.

  6. The effects of dynamics on statistical emission

    International Nuclear Information System (INIS)

    Friedman, W.A.

    1989-01-01

    The dynamical processes which occur during the disassembly of an excited nuclear system influence predictions arising from a statistical treatment of the decay of that system. Changes, during the decay period, in such collective properties as angular momentum, density, and kinetic energy of the emitting source affect both the mass and energy spectra of the emitted fragments. This influence will be examined. The author will explore the influence of nuclear compressibility on the decay process, in order to determine what information can be learned about this property from the products of decay. He will compare two disparate scenarios of decay: a succession of binary decays, each governed by statistics; and a full microcanonical distribution at a single freeze-out density. The author hopes to learn from the general nature of these two statistical predictions when one or the other might be more realistic, and what signatures resulting from the two models might be used to determine which accounts best for specific experimental results.

  7. Relating N2O emissions during biological nitrogen removal with operating conditions using multivariate statistical techniques.

    Science.gov (United States)

    Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E

    2018-04-26

    Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using binary segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed in sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding to NO3-N concentrations less than 1 mg/L in the upstream plug-flow reactor (middle of the oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights on the effect of operating conditions on N2O emissions in each sub-period, and can be integrated into N2O emissions data processing at wastewater treatment plants.
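
    Spearman's rank correlation, the dependency measure used in the study above, is simply Pearson correlation applied to average ranks. A minimal pure-Python sketch (illustrative only; the example data are hypothetical, not from the monitoring campaign):

```python
import math

def average_ranks(xs):
    """Ranks starting at 1; tied values share the average of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for t in range(i, j + 1):
            ranks[order[t]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Pearson correlation of the rank vectors of x and y."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / math.sqrt(vx * vy)

# A monotone but nonlinear relationship still gives rho = 1
rho = spearman([1.0, 2.0, 3.0, 4.0], [1.0, 4.0, 9.0, 16.0])
```

    Because only ranks enter the statistic, it is robust to the kind of skewed, spiky emission data described above, which is one reason it is preferred over Pearson correlation in such studies.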

  8. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  9. Level density of radioactive doubly-magic nucleus 56Ni

    International Nuclear Information System (INIS)

    Santhosh Kumar, S.; Rengaiyan, R.; Victor Babu, A.; Preetha, P.

    2012-01-01

    In this work the single particle energies are obtained by diagonalising the Nilsson Hamiltonian in the cylindrical basis and are generated up to N = 11 shells for the isotopes of Ni from A = 48-70, emphasizing the three magic nuclei, viz. 48Ni, 56Ni and 68Ni. The statistical quantities such as excitation energy, level density parameter and nuclear level density, which play important roles in nuclear structure and nuclear reactions, can be calculated theoretically by means of the statistical or partition function method. Hence the statistical model approach is followed to probe the dynamical properties of the nucleus at the microscopic level.

  10. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose; Larsen, Gunner Chr.

    2005-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of full-scale measurements recorded with a high sampling rate...

  11. Statistical mechanics of violent relaxation

    International Nuclear Information System (INIS)

    Shu, F.H.

    1978-01-01

    We reexamine the foundations of Lynden-Bell's statistical mechanical discussion of violent relaxation in collisionless stellar systems. We argue that Lynden-Bell's formulation in terms of a continuum description introduces unnecessary complications, and we consider a more conventional formulation in terms of particles. We then find the exclusion principle discovered by Lynden-Bell to be quantitatively important only at phase densities where two-body encounters are no longer negligible. Since the dynamical basis for the exclusion principle vanishes in such cases anyway, Lynden-Bell statistics always reduces in practice to Maxwell-Boltzmann statistics when applied to stellar systems. Lynden-Bell also found the equilibrium distribution function generally to be a sum of Maxwellians with velocity dispersions dependent on the phase density at star formation. We show that this difficulty vanishes in the particulate description for an encounterless stellar system as long as stars of different masses are initially well mixed in phase space. Our methods also demonstrate the equivalence between Gibbs's formalism, which uses the microcanonical ensemble, and Boltzmann's formalism, which uses a coarse-grained continuum description. In addition, we clarify the concept of irreversible behavior on a macroscopic scale for an encounterless stellar system. Finally, we comment on the use of unusual macroscopic constraints to simulate the effects of incomplete relaxation.

  12. U.S. nuclear plant statistics, 8th Edition

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    Wolf Creek was the lowest cost nuclear plant in 1992 according to the annual plant rankings in UDI's comprehensive annual statistical factbook for US nuclear power plants (operating, under construction, deferred, canceled or retired). The book covers operating and maintenance expenses for the past year (1992), annual and lifetime performance statistics, capitalization expenses and changes in capitalization, construction cost information, joint ownership of plants and canceled plants. First published for CY1984 statistics

  13. Beginning R The Statistical Programming Language

    CERN Document Server

    Gardener, Mark

    2012-01-01

    Conquer the complexities of this open source statistical language. R is fast becoming the de facto standard for statistical computing and analysis in science, business, engineering, and related fields. This book examines this complex language using simple statistical examples, showing how R operates in a user-friendly context. Both students and workers in fields that require extensive statistical analysis will find this book helpful as they learn to use R for simple summary statistics, hypothesis testing, creating graphs, regression, and much more. It covers formula notation, complex statistics

  14. Review of DIII-D H-Mode Density Limit Studies

    International Nuclear Information System (INIS)

    Maingi, R.; Mahdavi, M.A.

    2005-01-01

    Density limit studies over the past 10 yr on DIII-D have successfully identified several processes that limit plasma density in various operating modes. The recent focus of these studies has been on maintenance of the high-density operational window with good H-mode level energy confinement. We find that detachment and onset of multifaceted axisymmetric radiation from the edge (MARFE), fueling efficiency, particle confinement, and magnetohydrodynamic activity can impose density limits in certain regimes. By studying these processes, we have devised techniques with either pellets or gas fueling and divertor pumping to achieve line average density above Greenwald scaling, relying on increasing the ratio of pedestal to separatrix density, as well as density profile peaking. The scaling of several of these processes to next-step devices (e.g., the International Thermonuclear Experimental Reactor) has indicated that sufficiently high pedestal density can be achieved with conventional fueling techniques while ensuring divertor partial detachment needed for heat flux reduction. One density limit process requiring further study is neoclassical tearing mode (NTM) onset, and techniques for avoidance/mitigation of NTMs need additional development in present-day devices operated at high density

  15. Symmetry and statistics

    International Nuclear Information System (INIS)

    French, J.B.

    1974-01-01

    The concepts of statistical behavior and symmetry are presented from the point of view of many body spectroscopy. Remarks are made on methods for the evaluation of moments, particularly widths, for the purpose of giving a feeling for the types of mathematical structures encountered. Applications involving ground state energies, spectra, and level densities are discussed. The extent to which Hamiltonian eigenstates belong to irreducible representations is mentioned. (4 figures, 1 table) (U.S.)

  16. Statistical study of TCV disruptivity and H-mode accessibility

    International Nuclear Information System (INIS)

    Martin, Y.; Deschenaux, C.; Lister, J.B.; Pochelon, A.

    1997-01-01

    Optimising tokamak operation consists of finding a path, in a multidimensional parameter space, which leads to the desired plasma characteristics and avoids hazardous regions. Typically the desirable regions are the domain where an L-mode to H-mode transition can occur and then, in the H-mode, where ELMs and the required high density can be maintained. The regions to avoid are those with a high rate of disruptivity. On TCV, learning the safe and successful paths is achieved empirically. This will no longer be possible in a machine like ITER, since only a small percentage of disrupted discharges will be tolerable. An a priori knowledge of the hazardous regions in ITER is therefore mandatory. This paper presents the results of a statistical analysis of the occurrence of disruptions in TCV. (author) 4 figs

  17. Density structures inside the plasmasphere: Cluster observations

    DEFF Research Database (Denmark)

    Darrouzet, F.; Decreau, P.M.E.; De Keyser, J.

    2004-01-01

    The electron density profiles derived from the EFW and WHISPER instruments on board the four Cluster spacecraft reveal density structures inside the plasmasphere and at its outer boundary, the plasmapause. We have conducted a statistical study to characterize these density structures. We focus...... on the plasmasphere crossing on 11 April 2002, during which Cluster observed several density irregularities inside the plasmasphere, as well as a plasmaspheric plume. We derive the density gradient vectors from simultaneous density measurements by the four spacecraft. We also determine the normal velocity...... of the boundaries of the plume and of the irregularities from the time delays between those boundaries in the four individual density profiles, assuming they are planar. These new observations yield novel insights about the occurrence of density irregularities, their geometry and their dynamics. These in...

  18. Thermodynamics for Fractal Statistics

    OpenAIRE

    da Cruz, Wellington

    1998-01-01

    We consider for an anyon gas its thermodynamic properties, taking into account the fractal statistics obtained by us recently. This approach describes the anyonic excitations in terms of equivalence classes labeled by a fractal parameter or Hausdorff dimension $h$. An exact equation of state is obtained in the high-temperature and low-temperature limits, for gases with a constant density of states.

  19. Effects of watershed densities of animal feeding operations on nutrient concentrations and estrogenic activity in agricultural streams

    Science.gov (United States)

    Ciparis, Serena; Iwanowicz, Luke R.; Voshell, J. Reese

    2012-01-01

    Application of manures from animal feeding operations (AFOs) as fertilizer on agricultural land can introduce nutrients and hormones (e.g. estrogens) to streams. A landscape-scale study was conducted in the Shenandoah River watershed (Virginia, USA) in order to assess the relationship between densities of AFOs in watersheds of agricultural streams and in-stream nutrient concentrations and estrogenic activity. The effect of wastewater treatment plants (WWTPs) on nutrients and estrogenic activity was also evaluated. During periods of high and low flow, dissolved inorganic nitrogen (DIN) and orthophosphate (PO4-P) concentrations were analyzed and estrogens/estrogenic compounds were extracted and quantified as 17β-estradiol equivalents (E2Eq) using a bioluminescent yeast estrogen screen. Estrogenic activity was measurable in the majority of collected samples, and 20% had E2Eq concentrations > 1 ng/L. Relatively high concentrations of DIN (> 1000 μg/L) were also frequently detected. During all sampling periods, there were strong relationships between watershed densities of AFOs and in-stream concentrations of DIN (R2 = 0.56–0.81) and E2Eq (R2 = 0.39–0.75). Relationships between watershed densities of AFOs and PO4-P were weaker, but were also significant (R2 = 0.27–0.57). When combined with the effect of watershed AFO density, streams receiving WWTP effluent had higher concentrations of PO4-P than streams without WWTP discharges, and PO4-P was the only analyte with a consistent relationship to WWTPs. The results of this study suggest that as the watershed density of AFOs increases, there is a proportional increase in the potential for nonpoint source pollution of agricultural streams and their receiving waters by nutrients, particularly DIN, and compounds that can cause endocrine disruption in aquatic organisms.
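
    The R² values reported above summarize simple regressions of in-stream concentrations on watershed AFO density. A sketch of that computation using ordinary least squares (the function name and the numbers below are made up for illustration, not the study's data):

```python
def ols_r_squared(x, y):
    """Coefficient of determination for a simple linear regression y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1.0 - ss_res / ss_tot

# Hypothetical example: AFO density (per km^2) vs DIN concentration (ug/L)
afo_density = [0.1, 0.3, 0.5, 0.8, 1.2]
din = [250.0, 700.0, 1100.0, 1900.0, 2600.0]
r2 = ols_r_squared(afo_density, din)  # near 1 for near-linear data
```

    An R² of 0.56–0.81, as found for DIN, means that watershed AFO density alone explains over half of the between-stream variance in that analyte.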

  1. Level densities in nuclear physics

    International Nuclear Information System (INIS)

    Beckerman, M.

    1978-01-01

    In the independent-particle model nucleons move independently in a central potential. There is a well-defined set of single-particle orbitals, each nucleon occupies one of these orbitals subject to Fermi statistics, and the total energy of the nucleus is equal to the sum of the energies of the individual nucleons. The basic question is the range of validity of this Fermi gas description and, in particular, the roles of the residual interactions and collective modes. A detailed examination of experimental level densities in light-mass systems is given to provide some insight into these questions. Level densities over the first 10 MeV or so in excitation energy, as deduced from neutron and proton resonance data and from spectra of low-lying bound levels, are discussed. To exhibit some of the salient features of these data, comparisons to independent-particle (shell) model calculations are presented. Shell structure is predicted to manifest itself through discontinuities in the single-particle level density at the Fermi energy and through variations in the occupancy of the valence orbitals. These predictions are examined through combinatorial calculations performed with the Grover [Phys. Rev., 157, 832(1967); 185, 1303(1969)] odometer method. Before the discussion of the experimental results, statistical mechanical level densities for spherical nuclei are reviewed. After consideration of deformed nuclei, the conclusions resulting from this work are drawn. 7 figures, 3 tables
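
    As one concrete example of the statistical mechanical level densities reviewed in such work, the standard Bethe Fermi-gas form gives ρ(E) = (√π / 12) · a^(−1/4) E^(−5/4) · exp(2√(aE)) for excitation energy E and level density parameter a. A minimal sketch (the parameter value a = 8 MeV⁻¹ is an illustrative assumption, not from the paper):

```python
import math

def bethe_level_density(E, a):
    """Bethe Fermi-gas level density; E in MeV, a in 1/MeV."""
    return (math.sqrt(math.pi) / (12.0 * a ** 0.25 * E ** 1.25)
            * math.exp(2.0 * math.sqrt(a * E)))

# The level density rises steeply with excitation energy
rho_5 = bethe_level_density(5.0, a=8.0)
rho_10 = bethe_level_density(10.0, a=8.0)
```

    The exponential factor exp(2√(aE)) dominates, which is why level counts grow by orders of magnitude over the first 10 MeV of excitation discussed above.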

  2. Probing topological relations between high-density and low-density regions of 2MASS with hexagon cells

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yongfeng [American Physical Society, San Diego, CA (United States); Xiao, Weike, E-mail: yongfeng.wu@maine.edu [Department of Astronautics Engineering, Harbin Institute of Technology, P.O. Box 345, Heilongjiang Province 150001 (China)

    2014-02-01

    We introduced a new two-dimensional (2D) hexagon technique for probing the topological structure of the universe in which we mapped regions of the sky with high and low galaxy densities onto a 2D lattice of hexagonal unit cells. We defined filled cells as corresponding to high-density regions and empty cells as corresponding to low-density regions. The numbers of filled cells and empty cells were kept the same by controlling the size of the cells. By analyzing the six sides of each hexagon, we could obtain and compare the statistical topological properties of high-density and low-density regions of the universe in order to have a better understanding of the evolution of the universe. We applied this hexagonal method to Two Micron All Sky Survey data and discovered significant topological differences between the high-density and low-density regions. Both regions had significant (>5σ) topological shifts from both the binomial distribution and the random distribution.
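
    One way to picture the side-based statistic is to count, for each filled hexagon, how many of its six neighbours are also filled, and compare the resulting histogram with the binomial expectation for randomly placed cells. The sketch below uses axial hexagon coordinates; it is an illustrative reconstruction, not the authors' pipeline.

```python
from math import comb

# Axial-coordinate offsets of the six neighbours of a hexagonal cell
HEX_DIRS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def filled_side_histogram(filled):
    """Histogram over filled cells of the number of filled neighbours (0..6).

    `filled` is a set of (q, r) axial coordinates of high-density cells.
    """
    hist = [0] * 7
    for (q, r) in filled:
        n = sum((q + dq, r + dr) in filled for dq, dr in HEX_DIRS)
        hist[n] += 1
    return hist

def binomial_expectation(n_cells, p):
    """Expected histogram if each of the six sides were independently filled."""
    return [n_cells * comb(6, m) * p ** m * (1 - p) ** (6 - m)
            for m in range(7)]

# Tiny example: a fully filled 5x5 rhombus of cells
grid = {(q, r) for q in range(5) for r in range(5)}
hist = filled_side_histogram(grid)
```

    A statistically significant shift of the observed histogram away from `binomial_expectation` is the kind of topological signal (>5σ in the paper) that distinguishes clustered high-density regions from a random field.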

  3. On a curvature-statistics theorem

    International Nuclear Information System (INIS)

    Calixto, M; Aldaya, V

    2008-01-01

    The spin-statistics theorem in quantum field theory relates the spin of a particle to the statistics obeyed by that particle. Here we investigate an interesting correspondence or connection between curvature (κ = ±1) and quantum statistics (Fermi-Dirac and Bose-Einstein, respectively). The interrelation between both concepts is established through vacuum coherent configurations of zero modes in quantum field theory on the compact O(3) and noncompact O(2,1) (spatial) isometry subgroups of de Sitter and anti-de Sitter spaces, respectively. The high frequency limit is retrieved as a (zero curvature) group contraction to the Newton-Hooke (harmonic oscillator) group. We also make some comments on the physical significance of the vacuum energy density and the cosmological constant problem.

  4. On a curvature-statistics theorem

    Energy Technology Data Exchange (ETDEWEB)

    Calixto, M [Departamento de Matematica Aplicada y Estadistica, Universidad Politecnica de Cartagena, Paseo Alfonso XIII 56, 30203 Cartagena (Spain); Aldaya, V [Instituto de Astrofisica de Andalucia, Apartado Postal 3004, 18080 Granada (Spain)], E-mail: Manuel.Calixto@upct.es

    2008-08-15

    The spin-statistics theorem in quantum field theory relates the spin of a particle to the statistics obeyed by that particle. Here we investigate an interesting correspondence or connection between curvature (κ = ±1) and quantum statistics (Fermi-Dirac and Bose-Einstein, respectively). The interrelation between both concepts is established through vacuum coherent configurations of zero modes in quantum field theory on the compact O(3) and noncompact O(2,1) (spatial) isometry subgroups of de Sitter and anti-de Sitter spaces, respectively. The high frequency limit is retrieved as a (zero curvature) group contraction to the Newton-Hooke (harmonic oscillator) group. We also make some comments on the physical significance of the vacuum energy density and the cosmological constant problem.

  5. Functional statistics and related fields

    CERN Document Server

    Bongiorno, Enea; Cao, Ricardo; Vieu, Philippe

    2017-01-01

    This volume collects the latest methodological and applied contributions on functional, high-dimensional and other complex data, related statistical models and tools, as well as operator-based statistics. It contains selected and refereed contributions presented at the Fourth International Workshop on Functional and Operatorial Statistics (IWFOS 2017), held in A Coruña, Spain, from 15 to 17 June 2017. The series of IWFOS workshops was initiated by the Working Group on Functional and Operatorial Statistics at the University of Toulouse in 2008. Since then, many of the major advances in functional statistics and related fields have been periodically presented and discussed at the IWFOS workshops.

  6. Operational limits of high density H-modes in ASDEX Upgrade

    International Nuclear Information System (INIS)

    Mertens, V.; Borrass, K.; Kaufmann, M.; Lang, P.T.; Lang, R.; Mueller, H.W.; Neuhauser, J.; Schneider, R.; Schweinzer, J.; Suttrop, W.

    2001-01-01

    Systematic investigations of H-mode density limit (H→L-mode back transition) plasmas with gas fuelling, and alternatively with additional pellet injection from the magnetic high-field side (HFS), are being performed in the new closed divertor configuration DV-II. The resulting database, covering a wide range of the externally controllable plasma parameters Ip, Bt and Pheat, confirms that the H-mode threshold power exceeds the generally accepted prediction Pheat(L→H) ∝ n̄Bt dramatically when one approaches Greenwald densities. Additionally, in contrast to the Greenwald scaling, a moderate Bt-dependence of the H-mode density limit is found. The limit is observed to coincide with divertor detachment and a strong increase of the edge thermal transport, which has, however, no detrimental effect on global τE. The pellet injection scheme from the magnetic high-field side, developed recently on ASDEX Upgrade, leads to fast particle drifts which are, contrary to the standard injection from the low-field side, directed into the plasma core. This markedly improves the pellet particle fuelling efficiency. The responsible physical mechanism, the diamagnetic particle drift of the pellet ablatant, was successfully verified recently. Other increased particle losses, on respectively different time scales after the ablation process, however, still persist. Generally, a clear gain in achievable density and plasma stored energy is achieved with stationary HFS pellet injection compared to gas puffing. (author)

  7. Long-range corrected density functional theory with accelerated Hartree-Fock exchange integration using a two-Gaussian operator [LC-ωPBE(2Gau)].

    Science.gov (United States)

    Song, Jong-Won; Hirao, Kimihiko

    2015-10-14

    Since the advent of the hybrid functional in 1993, it has become a main quantum chemical tool for the calculation of energies and properties of molecular systems. Following the introduction of the long-range corrected hybrid scheme for density functional theory a decade later, the applicability of the hybrid functional has been further amplified due to the resulting increased performance on orbital energy, excitation energy, non-linear optical property, barrier height, and so on. Nevertheless, the high cost associated with the evaluation of Hartree-Fock (HF) exchange integrals remains a bottleneck for the broader and more active applications of hybrid functionals to large molecular and periodic systems. Here, we propose a very simple yet efficient method for the computation of the long-range corrected hybrid scheme. It uses a modified two-Gaussian attenuating operator instead of the error function for the long-range HF exchange integral. As a result, the two-Gaussian HF operator, which mimics the shape of the error function operator, reduces computational time dramatically (e.g., about 14 times acceleration in C diamond calculation using periodic boundary condition) and enables lower scaling with system size, while maintaining the improved features of the long-range corrected density functional theory.

  8. Strong density of a class of simple operators

    International Nuclear Information System (INIS)

    Somasundaram, S.; Mohammad, N.

    1991-08-01

    An algebra of simple operators has been shown to be strongly dense in the algebra of all bounded linear operators on function spaces of a compact (not necessarily abelian) group. Further, it is proved that the same result is also true for L²(G) if G is a locally compact (not necessarily compact) abelian group. (author). 6 refs

  9. Correlations between different methods of UO2 pellet density measurement

    International Nuclear Information System (INIS)

    Yanagisawa, Kazuaki

    1977-07-01

    Density of UO2 pellets was measured by three different methods, i.e., geometrical, water-immersion and meta-xylene-immersion, and the results were treated statistically to find out the correlations between the methods. The UO2 pellets are of six kinds but with the same specifications. The correlations are linear 1:1 for pellets of 95% theoretical density and above, but no such correlations exist below that level, where the values vary statistically due to the interaction between open and closed pores. (auth.)
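
    The two immersion measurements rest on Archimedes' principle, while the geometrical density uses the pellet's cylindrical dimensions. A minimal sketch of the three computations (the function names, example masses and the 10.96 g/cm³ theoretical density of UO2 are our additions for illustration, not values from the report):

```python
import math

RHO_TD_UO2 = 10.96  # theoretical density of UO2, g/cm^3

def geometric_density(mass_g, diameter_cm, height_cm):
    """Mass over cylinder volume."""
    volume = math.pi * (diameter_cm / 2.0) ** 2 * height_cm
    return mass_g / volume

def immersion_density(m_dry_g, m_suspended_g, rho_fluid):
    """Archimedes: displaced volume = (m_dry - m_suspended) / rho_fluid."""
    return m_dry_g * rho_fluid / (m_dry_g - m_suspended_g)

def percent_td(rho):
    """Density as a percentage of theoretical density."""
    return 100.0 * rho / RHO_TD_UO2

# Example: a pellet weighing 10.41 g dry and 9.44 g suspended in water
rho = immersion_density(10.41, 9.44, rho_fluid=1.0)
```

    Open porosity is what separates the methods: water or meta-xylene can penetrate open pores, shrinking the apparent displaced volume, which is why the correlations break down below about 95% of theoretical density.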

  10. Parity and the spin-statistics connection

    Indian Academy of Sciences (India)

    A simple demonstration of the spin-statistics connection for general causal fields is obtained by using the parity operation to exchange spatial coordinates in the scalar product of a locally commuting field operator, evaluated at position x, with the same field operator evaluated at -x, at equal times.

  11. The statistics of galaxies: beyond correlation functions

    International Nuclear Information System (INIS)

    Lachieze-Rey, M.

    1988-01-01

    I mention some normalization problems encountered when estimating the 2-point correlation functions in samples of galaxies of different average densities. I present some aspects of the void probability function as a statistical indicator, free of such normalization problems. Finally, I suggest a new statistical approach to account, in a synthetic way, for those aspects of the galaxy distribution that a conventional method is unable to characterize.

  12. Statistical summary 1990-91

    International Nuclear Information System (INIS)

    1991-01-01

    The information contained in this statistical summary leaflet summarizes in bar charts or pie charts Nuclear Electric's performance in 1990-91 in the areas of finance, plant and plant operations, safety, commercial operations and manpower. It is intended that the information will provide a basis for comparison in future years. The leaflet also includes a summary of Nuclear Electric's environmental policy statement. (UK)

  13. Operation of ADITYA Thomson scattering system: measurement of temperature and density

    International Nuclear Information System (INIS)

    Thomas, Jinto; Pillai, Vishal; Singh, Neha; Patel, Kiran; Lingeshwari, G.; Hingrajiya, Zalak; Kumar, Ajai

    2015-01-01

    ADITYA Thomson scattering (TS) system is a single-point measurement system operated using a 10 J ruby laser and a 1 meter grating spectrometer. Multi-slit optical fibers are arranged at the image plane of the spectrometer so that each fiber slit collects a 2 nm band of the scattered spectrum. Each slit of the fiber bundle is coupled to high-gain photomultiplier tubes (PMTs). A standard white light source is used to calibrate the optical fiber transmission, and the laser light itself is used to calibrate the relative gain of the PMTs. Rayleigh scattering has been performed for the absolute calibration of the TS system. The temperature of ADITYA plasma has been calculated using the conventional method of estimation (from the slope of the logarithmic intensity vs the square of delta lambda). It has been observed that the core temperature of ADITYA tokamak plasma is in the range of 300 to 600 eV for different plasma shots, and the density 2-3 × 10^13 /cc. The time evolution of the plasma discharge has been studied by firing the laser at different times of the discharge, assuming the shots are identical. In some of the discharges, the velocity distribution appears to be non-Maxwellian. (author)
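
    The "slope method" mentioned above exploits the Gaussian shape of the Thomson-scattered spectrum: plotting ln(intensity) against (Δλ)² gives a straight line of slope −1/Δλ_e², where, in the non-relativistic limit, Δλ_e² = 8 λ0² sin²(θ/2) Te/(m_e c²). The sketch below fits that slope to synthetic, noiseless data and recovers the input temperature; the specific numbers (694.3 nm ruby line, 90° scattering, 2 nm sampling) are illustrative assumptions, not the instrument's calibration.

```python
import math

ME_C2_EV = 511000.0     # electron rest energy, eV
LAMBDA0 = 694.3         # ruby laser wavelength, nm
THETA = math.pi / 2.0   # assumed scattering angle

def width_sq(te_ev):
    """1/e half-width squared of the scattered spectrum, nm^2."""
    return 8.0 * LAMBDA0 ** 2 * math.sin(THETA / 2.0) ** 2 * te_ev / ME_C2_EV

def te_from_slope(dl, log_i):
    """Least-squares slope of ln(I) vs (delta lambda)^2, converted to Te."""
    x = [d * d for d in dl]
    n = len(x)
    mx, my = sum(x) / n, sum(log_i) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, log_i))
             / sum((a - mx) ** 2 for a in x))
    return -ME_C2_EV / (8.0 * LAMBDA0 ** 2 * math.sin(THETA / 2.0) ** 2 * slope)

# Synthetic spectrum at Te = 400 eV, sampled every 2 nm (one per fiber slit)
te_in = 400.0
dl = [2.0 * i for i in range(1, 11)]
log_i = [-(d * d) / width_sq(te_in) for d in dl]
te_out = te_from_slope(dl, log_i)
```

    The same fit also flags non-Maxwellian shots: if the measured points bend away from a straight line in the ln(I) vs (Δλ)² plot, the Gaussian assumption behind the slope method no longer holds.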

  14. Statistics of excitations in the electron glass model

    Science.gov (United States)

    Palassini, Matteo

    2011-03-01

    We study the statistics of elementary excitations in the classical electron glass model of localized electrons interacting via the unscreened Coulomb interaction in the presence of disorder. We reconsider the long-standing puzzle of the exponential suppression of the single-particle density of states near the Fermi level, by measuring accurately the density of states of charged and electron-hole pair excitations via finite temperature Monte Carlo simulation and zero-temperature relaxation. We also investigate the statistics of large charge rearrangements after a perturbation of the system, which may shed some light on the slow relaxation and glassy phenomena recently observed in a variety of Anderson insulators. In collaboration with Martin Goethe.

  15. Density-dependent feedbacks can mask environmental drivers of populations

    DEFF Research Database (Denmark)

    Dahlgren, Johan Petter

    I present some results from studies identifying environmental drivers of vital rates and population dynamics when controlling for intraspecific density statistically or experimentally, show that density dependence can be strong even in populations of slow-growing species in stressful habitats, an...

  16. Two methods for isolating the lung area of a CT scan for density information

    International Nuclear Information System (INIS)

    Hedlund, L.W.; Anderson, R.F.; Goulding, P.L.; Beck, J.W.; Effmann, E.L.; Putman, C.E.

    1982-01-01

    Extracting density information from irregularly shaped tissue areas of CT scans requires automated methods when many scans are involved. We describe two computer methods that automatically isolate the lung area of a CT scan. Each starts from a single, operator-specified point in the lung. The first method follows the steep density-gradient boundary between lung and adjacent tissues; this tracking method is useful for estimating the overall density and total area of lung in a scan because all pixels within the lung area are available for statistical sampling. The second method finds all contiguous pixels of lung that are within the CT number range of air to water and are not part of strong density-gradient edges; this method is useful for estimating density and area of the lung parenchyma. Structures within the lung area that are surrounded by strong density-gradient edges, such as large blood vessels, airways and nodules, are excluded from the lung sample, while lung areas with diffuse borders, such as an area of mild or moderate edema, are retained. Both methods were tested on scans from an animal model of pulmonary edema and were found to be effective in isolating normal and diseased lungs. These methods are also suitable for isolating other organ areas of CT scans that are bounded by density-gradient edges.
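
    The second method (contiguous pixels within the air-to-water CT range, excluding strong-gradient edges) can be sketched as a seeded flood fill. The HU limits and gradient threshold below are illustrative guesses, not the authors' values:

```python
import numpy as np
from collections import deque

def grow_lung_region(img, seed, lo=-1000, hi=0, max_grad=300):
    """Flood-fill from `seed`, keeping 4-connected pixels whose CT number
    lies in [lo, hi] (air to water) and that do not sit across a strong
    density-gradient edge (|step from the pixel we grew from| <= max_grad)."""
    mask = np.zeros(img.shape, dtype=bool)
    mask[seed] = True
    q = deque([seed])
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < img.shape[0] and 0 <= nc < img.shape[1]):
                continue
            if mask[nr, nc]:
                continue
            in_range = lo <= img[nr, nc] <= hi
            small_step = abs(int(img[nr, nc]) - int(img[r, c])) <= max_grad
            if in_range and small_step:
                mask[nr, nc] = True
                q.append((nr, nc))
    return mask

# Toy scan: air-filled lung (-850 HU) inside soft tissue (+40 HU),
# with a dense "vessel" (+100 HU) that the fill must exclude
scan = np.full((9, 9), 40, dtype=np.int16)
scan[2:7, 2:7] = -850
scan[4, 4] = 100
lung = grow_lung_region(scan, seed=(3, 3))
```

    On the toy scan the mask covers the 5×5 lung block minus the vessel pixel, mirroring how vessels and nodules are excluded from the parenchyma sample.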

  17. 14 CFR 291.41 - Financial and statistical reporting-general.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Financial and statistical reporting-general... (AVIATION PROCEEDINGS) ECONOMIC REGULATIONS CARGO OPERATIONS IN INTERSTATE AIR TRANSPORTATION Reporting Rules § 291.41 Financial and statistical reporting—general. (a) Carriers providing cargo operations in...

  18. Determining the Limiting Current Density of Vanadium Redox Flow Batteries

    Directory of Open Access Journals (Sweden)

    Jen-Yu Chen

    2014-09-01

    Full Text Available All-vanadium redox flow batteries (VRFBs) are used as energy storage systems for intermittent renewable power sources. The performance of VRFBs depends on the materials of key components and on operating conditions, such as current density, electrolyte flow rate and electrolyte composition. Mass-transfer overpotential is affected by the electrolyte flow rate and electrolyte composition, which are related to the limiting current density. In order to investigate the effect of operating conditions on mass-transfer overpotential, this study established a relationship between the limiting current density and the operating conditions. First, electrolyte solutions with different states of charge were prepared and used in a single cell to obtain discharging polarization curves under various operating conditions. The experimental results were then analyzed and are discussed in this paper. Finally, this paper proposes a limiting current density as a function of operating conditions. The result helps predict the effect of operating conditions on cell performance in a mathematical model.
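
    The abstract does not give the fitted function; a common form for such a relationship, j_lim = n·F·k_m·c with a power-law mass-transfer coefficient k_m = a·v^b, is sketched below with purely illustrative coefficients:

```python
# Sketch of j_lim = n * F * k_m * c, with the mass-transfer coefficient
# modeled as k_m = a * v**b.  The coefficients a and b are illustrative
# placeholders, not the values fitted in the paper.
F = 96485.0        # Faraday constant, C/mol
N_ELECTRONS = 1    # electrons transferred per vanadium redox event

def limiting_current_density(conc_mol_m3, velocity_m_s, a=1.6e-4, b=0.4):
    """Limiting current density (A/m^2) at bulk concentration c (mol/m^3)
    and electrolyte flow velocity v (m/s)."""
    k_m = a * velocity_m_s ** b
    return N_ELECTRONS * F * k_m * conc_mol_m3

# Higher flow rate -> thinner diffusion layer -> higher limiting current
j_slow = limiting_current_density(1000.0, 0.005)   # 1 M, 5 mm/s
j_fast = limiting_current_density(1000.0, 0.02)    # 1 M, 20 mm/s
```

    The power-law exponent b encodes how strongly convection thins the diffusion layer; it would be fitted from the polarization-curve data.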

  19. Detection of density dependence requires density manipulations and calculation of lambda.

    Science.gov (United States)

    Fowler, N L; Overath, R Deborah; Pease, Craig M

    2006-03-01

    To investigate density-dependent population regulation in the perennial bunchgrass Bouteloua rigidiseta, we experimentally manipulated density by removing adults or adding seeds to replicate quadrats in a natural population for three annual intervals. We monitored the adjacent control quadrats for 14 annual intervals. We constructed a population projection matrix for each quadrat in each interval, calculated lambda, and did a life table response experiment (LTRE) analysis. We tested the effects of density upon lambda by comparing experimental and control quadrats, and by an analysis of the 15-year observational data set. As measured by effects on lambda and on N(t+1)/N(t) in the experimental treatments, negative density dependence was strong: the population was being effectively regulated. The relative contributions of different matrix elements to treatment effects on lambda differed among years and treatments; overall the pattern was one of small contributions by many different life cycle stages. In contrast, density dependence could not be detected using only the observational (control quadrats) data, even though this data set covered a much longer time span. Nor did experimental effects on separate matrix elements reach statistical significance. These results suggest that ecologists may fail to detect density dependence when it is present if they have only descriptive, not experimental, data, do not have data for the entire life cycle, or analyze life cycle components separately.
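
    The lambda computed from each quadrat's projection matrix is the dominant eigenvalue of that matrix. A minimal sketch with a hypothetical three-stage matrix (invented for illustration, not the Bouteloua rigidiseta data):

```python
import numpy as np

# Hypothetical 3-stage (seed, juvenile, adult) projection matrix.
# Column j gives per-capita contributions of stage j this year to each
# stage next year.
A = np.array([
    [0.00, 0.00, 4.50],   # seeds produced per adult
    [0.30, 0.40, 0.00],   # germination and juvenile survival
    [0.00, 0.25, 0.85],   # maturation and adult survival
])

# lambda is the dominant eigenvalue; lambda > 1 means growth, and density
# dependence shows up as lambda declining at higher densities
lam = max(abs(np.linalg.eigvals(A)))

# One-step growth N(t+1)/N(t) converges to lambda once the stage
# distribution stabilizes
n = np.array([100.0, 20.0, 10.0])
for _ in range(50):
    n = A @ n
growth = (A @ n).sum() / n.sum()
```

    Comparing lambda between density-manipulated and control quadrats is exactly the test the abstract argues is needed, since single matrix elements may show no detectable effect.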

  20. A performance analysis for MHD power cycles operating at maximum power density

    International Nuclear Information System (INIS)

    Sahin, Bahri; Kodal, Ali; Yavuz, Hasbi

    1996-01-01

    An analysis of the thermal efficiency of a magnetohydrodynamic (MHD) power cycle at maximum power density for a constant velocity type MHD generator has been carried out. The irreversibilities at the compressor and the MHD generator are taken into account. The results obtained from power density analysis were compared with those of maximum power analysis. It is shown that by using the power density criteria the MHD cycle efficiency can be increased effectively. (author)

  1. The statistical analysis of anisotropies

    International Nuclear Information System (INIS)

    Webster, A.

    1977-01-01

    One of the many uses to which a radio survey may be put is an analysis of the distribution of the radio sources on the celestial sphere to find out whether they are bunched into clusters or lie in preferred regions of space. There are many methods of testing for clustering in point processes, and since they are not all equally good this contribution is presented as a brief guide to what seems to be the best of them. The radio sources certainly do not show very strong clustering and may well be entirely unclustered, so if a statistical method is to be useful it must be both powerful and flexible. A statistic is powerful in this context if it can efficiently distinguish a weakly clustered distribution of sources from an unclustered one, and it is flexible if it can be applied in a way which avoids mistaking defects in the survey for true peculiarities in the distribution of sources. The paper divides clustering statistics into two classes: number density statistics and log N/log S statistics. (Auth.)

  2. Single-particle energies and density of states in density functional theory

    Science.gov (United States)

    van Aggelen, H.; Chan, G. K.-L.

    2015-07-01

    Time-dependent density functional theory (TD-DFT) is commonly used as the foundation to obtain neutral excited states and transition weights in DFT, but does not allow direct access to density of states and single-particle energies, i.e. ionisation energies and electron affinities. Here we show that by extending TD-DFT to a superfluid formulation, which involves operators that break particle-number symmetry, we can obtain the density of states and single-particle energies from the poles of an appropriate superfluid response function. The standard Kohn-Sham eigenvalues emerge as the adiabatic limit of the superfluid response under the assumption that the exchange-correlation functional has no dependence on the superfluid density. The Kohn-Sham eigenvalues can thus be interpreted as approximations to the ionisation energies and electron affinities. Beyond this approximation, the formalism provides an incentive for creating a new class of density functionals specifically targeted at accurate single-particle eigenvalues and bandgaps.

  3. Statistics of spatially integrated speckle intensity difference

    DEFF Research Database (Denmark)

    Hanson, Steen Grüner; Yura, Harold

    2009-01-01

    We consider the statistics of the spatially integrated speckle intensity difference obtained from two separated finite collecting apertures. For fully developed speckle, closed-form analytic solutions for both the probability density function and the cumulative distribution function are derived here for both arbitrary values of the mean number of speckles contained within an aperture and the degree of coherence of the optical field. Additionally, closed-form expressions are obtained for the corresponding nth statistical moments.

  4. Quantum information density scaling and qubit operation time constraints of CMOS silicon-based quantum computer architectures

    Science.gov (United States)

    Rotta, Davide; Sebastiano, Fabio; Charbon, Edoardo; Prati, Enrico

    2017-06-01

    range of a silicon complementary metal-oxide-semiconductor quantum processor to be within 1 and 100 GHz. Such constraint limits the feasibility of fault-tolerant quantum information processing with complementary metal-oxide-semiconductor technology only to the most advanced nodes. The compatibility with classical complementary metal-oxide-semiconductor control circuitry is discussed, focusing on the cryogenic complementary metal-oxide-semiconductor operation required to bring the classical controller as close as possible to the quantum processor and to enable interfacing thousands of qubits on the same chip via time-division, frequency-division, and space-division multiplexing. The operation time range prospected for cryogenic control electronics is found to be compatible with the operation time expected for qubits. By combining the forecast of the development of scaled technology nodes with operation time and classical circuitry constraints, we derive a maximum quantum information density for logical qubits of 2.8 and 4 Mqb/cm2 for the 10 and 7-nm technology nodes, respectively, for the Steane code. The density is one and two orders of magnitude less for surface codes and for concatenated codes, respectively. Such values provide a benchmark for the development of fault-tolerant quantum algorithms by circuital quantum information based on silicon platforms and a guideline for other technologies in general.

  5. Licensed operating reactors. Operating units status report, data as of 2-28-79

    International Nuclear Information System (INIS)

    1979-03-01

    This report is divided into three sections: the first contains monthly highlights and statistics for commercial operating units, and errata from previously reported data; the second is a compilation of detailed information on each unit, provided by NRC Regional Offices and IE Headquarters; and the third section is an appendix containing comparative statistics of U.S. nuclear/fossil capacity, identification of nuclear power plants within regional Electric Reliability Councils, the relative status of U.S. nuclear electric production to all U.S. electric production by state, and selected Edison Electric Institute operating statistics. Throughout the report, statistical factors for periods greater than a report month, or for more than one unit, are computed as the arithmetic average of each unit's individual operating statistics. The statistical factors for each unit for the report month are computed from actual power production for the month. The factors for the life-span of each unit (the ''Cumulative'' column) are reported by the utility and are not entirely re-computed by NRC. Utility power production data is checked for consistency with previously submitted statistics.

  6. Bounded Densities and Their Derivatives

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, V.

    2009-01-01

    This paper describes how one can compute interval-valued statistical measures given limited information about the underlying distribution. The particular focus is on a bounded derivative of a probability density function and its combination with other available statistical evidence for computing quantities of interest. To be able to utilise the evidence about the derivative it is suggested to adapt the ‘conventional’ problem statement to variational calculus, and the way to do so is demonstrated. A number of examples are given throughout the paper.

  7. Low Density Supersonic Decelerators

    Data.gov (United States)

    National Aeronautics and Space Administration — The Low-Density Supersonic Decelerator project will demonstrate the use of inflatable structures and advanced parachutes that operate at supersonic speeds to more...

  8. Fission level densities

    International Nuclear Information System (INIS)

    Maslov, V.M.

    1998-01-01

    Fission level densities (or fissioning nucleus level densities at fission saddle deformations) are required for statistical model calculations of actinide fission cross sections. The Back-shifted Fermi-Gas Model, the Constant Temperature Model and the Generalized Superfluid Model (GSM) are widely used for the description of level densities at stable deformations. These models provide approximately identical level density descriptions at excitations close to the neutron binding energy. It is at low excitation energies that they are discrepant, while this energy region is crucial for fission cross section calculations. A drawback of the back-shifted Fermi-gas model and traditional constant temperature model approaches is that it is difficult to include pair correlations, collective effects and shell effects in a consistent way. Pair, shell and collective properties of the nucleus do not reduce just to the renormalization of the level density parameter a, but influence the energy dependence of level densities. These effects turn out to be important because they seem to depend upon deformation of either the equilibrium or the saddle point. These effects are easily introduced within the GSM approach. Fission barriers are another key ingredient involved in the fission cross section calculations. Fission level density and barrier parameters are strongly interdependent. This is the reason for including fission barrier parameters along with the fission level densities in the Starter File. The recommended file is maslov.dat - fission barrier parameters. The recent version of actinide fission barrier data obtained in Obninsk (obninsk.dat) should only be considered as a guide for selection of initial parameters. These data are included in the Starter File, together with the fission barrier parameters recommended by CNDC (beijing.dat), for completeness. (author)
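
    The back-shifted Fermi-gas form mentioned above can be sketched as follows; the parameter values are illustrative actinide-like numbers, not the Starter File's recommendations:

```python
import math

def bsfg_level_density(e_mev, a=25.0, delta=0.5, sigma=5.0):
    """Back-shifted Fermi-gas total level density (1/MeV):
        rho(U) = exp(2*sqrt(a*U)) / (12*sqrt(2) * sigma * a**(1/4) * U**(5/4)),
    with effective excitation U = E - delta.
    a: level-density parameter (1/MeV), delta: back-shift (MeV),
    sigma: spin cut-off parameter.  All values are illustrative."""
    u = e_mev - delta
    if u <= 0.0:
        return 0.0
    return math.exp(2.0 * math.sqrt(a * u)) / (
        12.0 * math.sqrt(2.0) * sigma * a**0.25 * u**1.25)

rho_low = bsfg_level_density(3.0)   # low excitation, where models disagree
rho_bn = bsfg_level_density(6.0)    # near a typical neutron binding energy
```

    The steep exponential growth in exp(2·sqrt(aU)) is why small differences in the back-shift delta at low excitation translate into large discrepancies between the models in exactly the region that matters for fission cross sections.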

  9. VALIDATION OF SPRING OPERATED PRESSURE RELIEF VALVE TIME TO FAILURE AND THE IMPORTANCE OF STATISTICALLY SUPPORTED MAINTENANCE INTERVALS

    Energy Technology Data Exchange (ETDEWEB)

    Gross, R; Stephen Harris, S

    2009-02-18

    The Savannah River Site operates a Relief Valve Repair Shop certified by the National Board of Pressure Vessel Inspectors to NB-23, The National Board Inspection Code. Local maintenance forces perform inspection, testing, and repair of approximately 1200 spring-operated relief valves (SORV) each year as the valves are cycled in from the field. The Site now has over 7000 certified test records in the Computerized Maintenance Management System (CMMS); a summary of that data is presented in this paper. In previous papers, several statistical techniques were used to investigate failure on demand and failure rates, including a quantal response method for predicting the failure probability as a function of time in service. The non-conservative failure mode for SORV is commonly termed 'stuck shut', defined by industry as the valve opening at greater than or equal to 1.5 times the cold set pressure. The actual time to failure is typically not known, only that failure occurred some time since the last proof test (censored data). This paper attempts to validate the assumptions underlying the statistical lifetime prediction results using Monte Carlo simulation. It employs an aging model for lift pressure as a function of set pressure, valve manufacturer, and a time-related aging effect. This paper attempts to answer two questions: (1) what is the predicted failure rate over the chosen maintenance/inspection interval; and (2) do we understand aging sufficiently well to estimate risk when basing proof test intervals on proof test results?
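
    The Monte Carlo idea (an aging model for lift pressure, with failure defined as lift at or above 1.5 times the cold set pressure) might be sketched as below. The drift distributions are invented for illustration and are not the paper's fitted model:

```python
import random

def simulate_stuck_shut_fraction(years_in_service, n=20000, seed=1):
    """Monte Carlo sketch: each valve's lift-pressure ratio starts near 1.0
    (as-left condition) and drifts upward with a random per-valve aging
    rate; 'stuck shut' means lift >= 1.5x cold set pressure (ratio >= 1.5).
    Distribution parameters are illustrative placeholders."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        ratio0 = rng.gauss(1.0, 0.05)            # as-left lift ratio
        rate = max(0.0, rng.gauss(0.02, 0.02))   # aging, ratio units/year
        if ratio0 + rate * years_in_service >= 1.5:
            failures += 1
    return failures / n

# Longer proof-test intervals accumulate more aging and more failures
p5 = simulate_stuck_shut_fraction(5.0)
p10 = simulate_stuck_shut_fraction(10.0)
```

    Because only the proof-test outcome is observed (censored data), a real analysis would compare these simulated interval failure fractions against the quantal-response estimates rather than against observed failure times.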

  10. Evaluation of macromolecular electron-density map quality using the correlation of local r.m.s. density

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    The correlation of local r.m.s. density is shown to be a good measure of the presence of distinct solvent and macromolecule regions in macromolecular electron-density maps. It has recently been shown that the standard deviation of local r.m.s. electron density is a good indicator of the presence of distinct regions of solvent and protein in macromolecular electron-density maps [Terwilliger & Berendzen (1999). Acta Cryst. D55, 501–505]. Here, it is demonstrated that a complementary measure, the correlation of local r.m.s. density in adjacent regions of the unit cell, is also a good measure of the presence of distinct solvent and protein regions. The correlation of local r.m.s. density is essentially a measure of how contiguous the solvent (and protein) regions are in the electron-density map. This statistic can be calculated in real space or in reciprocal space and has potential uses in evaluation of heavy-atom solutions in the MIR and MAD methods as well as for evaluation of trial phase sets in ab initio phasing procedures
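
    The statistic can be illustrated on a toy 2D map: compute the r.m.s. density in small boxes, then correlate the values in adjacent boxes; a contiguous solvent/protein split yields a high correlation. This is only a real-space sketch, with arbitrary box size and map contents:

```python
import numpy as np

def local_rms(map2d, box=4):
    """Local r.m.s. density on a coarse grid of non-overlapping boxes."""
    h = (map2d.shape[0] // box) * box
    w = (map2d.shape[1] // box) * box
    blocks = map2d[:h, :w].reshape(h // box, box, w // box, box)
    return blocks.std(axis=(1, 3))

def adjacent_correlation(rms):
    """Pearson correlation of local r.m.s. values in horizontally adjacent
    boxes -- high when solvent and macromolecule regions are contiguous."""
    return np.corrcoef(rms[:, :-1].ravel(), rms[:, 1:].ravel())[0, 1]

rng = np.random.default_rng(0)
# Contiguous split: high-variation "protein" half next to a flat "solvent" half
split_map = np.hstack([rng.normal(0, 1.0, (32, 16)),
                       rng.normal(0, 0.1, (32, 16))])
# No distinct regions: uniform noise everywhere
flat_map = rng.normal(0, 1.0, (32, 32))

c_split = adjacent_correlation(local_rms(split_map))
c_flat = adjacent_correlation(local_rms(flat_map))
```

    A map with contiguous regions scores high (c_split), while a featureless map scores near zero (c_flat), which is why the statistic can rank trial phase sets.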

  11. Model SM-1 ballast density gauge

    International Nuclear Information System (INIS)

    Gao Weixiang; Fang Jidong; Zhang Xuejuan; Zhang Reilin; Gao Wanshan

    1990-05-01

    The ballast density is one of the principal parameters of roadbed operating state. It strongly affects railroad stability, the accumulation of residual roadbed deformation, and the amount of work required for railroad maintenance. The SM-1 ballast density gauge determines the density of ballast from the attenuation of γ-rays passing through it. Its fundamentals, construction, specifications, applications and economic benefits are described.

  12. Statistical shape and appearance models of bones.

    Science.gov (United States)

    Sarkalkan, Nazli; Weinans, Harrie; Zadpoor, Amir A

    2014-03-01

    When applied to bones, statistical shape models (SSM) and statistical appearance models (SAM) respectively describe the mean shape and mean density distribution of bones within a certain population as well as the main modes of variations of shape and density distribution from their mean values. The availability of this quantitative information regarding the detailed anatomy of bones provides new opportunities for diagnosis, evaluation, and treatment of skeletal diseases. The potential of SSM and SAM has been recently recognized within the bone research community. For example, these models have been applied for studying the effects of bone shape on the etiology of osteoarthritis, improving the accuracy of clinical osteoporotic fracture prediction techniques, design of orthopedic implants, and surgery planning. This paper reviews the main concepts, methods, and applications of SSM and SAM as applied to bone. Copyright © 2013 Elsevier Inc. All rights reserved.
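
    The mean-plus-modes construction behind an SSM can be sketched with PCA via SVD on toy landmark data; the shapes and the single mode of variation are invented for illustration:

```python
import numpy as np

# Toy "population" of 20 shapes, each 5 landmarks (x, y) flattened to a
# 10-vector, generated as a mean shape plus ONE dominant mode of variation.
rng = np.random.default_rng(42)
mean_shape = np.array([0, 0, 1, 0, 2, 0, 1, 1, 0, 1], dtype=float)
mode = np.zeros(10)
mode[[7, 9]] = 1.0                        # some shapes are "taller"
coeffs = rng.normal(0.0, 1.0, (20, 1))
shapes = mean_shape + coeffs * mode + rng.normal(0, 0.01, (20, 10))

# SSM: mean shape + principal modes of variation via SVD of centered data
mu = shapes.mean(axis=0)
_, s, vt = np.linalg.svd(shapes - mu, full_matrices=False)
explained = s**2 / np.sum(s**2)           # variance fraction per mode

# Synthesize a plausible new shape: mean plus 2 standard deviations of mode 1
sd1 = s[0] / np.sqrt(len(shapes) - 1)
new_shape = mu + 2.0 * sd1 * vt[0]
```

    An SAM works the same way with density values appended to the shape vector, so shape and density co-vary in the modes; real pipelines first align the shapes (e.g. Procrustes), which is omitted here.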

  13. The reliability of cone-beam computed tomography to assess bone density at dental implant recipient sites: a histomorphometric analysis by micro-CT.

    Science.gov (United States)

    González-García, Raúl; Monje, Florencio

    2013-08-01

    The aim of this study was to objectively assess the reliability of cone-beam computed tomography (CBCT) as a tool to pre-operatively determine radiographic bone density (RBD) from the density values provided by the system, analyzing its relationship with histomorphometric bone density, expressed as bone volumetric fraction (BV/TV), assessed by micro-CT of bone biopsies at the site of insertion of dental implants in the maxillary bones. Thirty-nine bone biopsies of the maxillary bones at the sites of 39 dental implants from 31 edentulous healthy patients were analyzed. The NobelGuide™ software was used for implant planning, which also allowed fabrication of individual stereolithographic surgical guides. The analysis of CBCT images allowed pre-operative determination of mean density values of implant recipient sites along the major axis of the planned implants (axial RBD). Stereolithographic surgical guides were used to guide implant insertion and also to extract cylindrical bone biopsies from the core of the exact implant site. Further analysis of several osseous micro-structural variables including BV/TV was performed by micro-CT of the extracted bone biopsies. Mean axial RBD was 478 ± 212 (range: 144-953). A statistically significant difference (P = 0.02) was observed among density values of the cortical bone of the upper maxilla and mandible. A high positive Pearson's correlation coefficient (r = 0.858) was observed between axial RBD and BV/TV assessed by micro-CT at the site of dental implants in the maxillary bones. Pre-operative estimation of density values by CBCT is a reliable tool to objectively determine bone density. © 2012 John Wiley & Sons A/S.

  14. Detecting reduced bone mineral density from dental radiographs using statistical shape models

    NARCIS (Netherlands)

    Allen, P.D.; Graham, J.; Farnell, D.J.J.; Harrison, E.J.; Jacobs, R.; Nicopoulou-Karyianni, K.; Lindh, C.; van der Stelt, P.F.; Horner, K.; Devlin, H.

    2007-01-01

    We describe a novel method of estimating reduced bone mineral density (BMD) from dental panoramic tomograms (DPTs), which show the entire mandible. Careful expert width measurement of the inferior mandibular cortex has been shown to be predictive of BMD in hip and spine osteopenia and osteoporosis.

  15. Comparative analysis of human and bovine teeth: radiographic density

    Directory of Open Access Journals (Sweden)

    Jefferson Luis Oshiro Tanaka

    2008-12-01

    Full Text Available Since bovine teeth have been used as substitutes for human teeth in in vitro dental studies, the aim of this study was to compare the radiographic density of bovine teeth with that of human teeth to evaluate their usability for radiographic studies. Thirty bovine and twenty human teeth were cut transversally in 1 millimeter-thick slices. The slices were X-rayed using a digital radiographic system and an intraoral X-ray machine at 65 kVp and 7 mA. The exposure time (0.08 s and the target-sensor distance (40 cm were standardized for all the radiographs. The radiographic densities of the enamel, coronal dentin and radicular dentin of each slice were obtained separately using the "histogram" tool of Adobe Photoshop 7.0 software. The mean radiographic densities of the enamel, coronal dentin and radicular dentin were calculated by the arithmetic mean of the slices of each tooth. One-way ANOVA demonstrated statistically significant differences between the densities of bovine and human enamel and coronal dentin (p < 0.05, but not radicular dentin (p > 0.05. Based on the results, the authors concluded that: (a the radiographic density of bovine enamel is significantly higher than that of human enamel; (b the radiodensity of bovine coronal dentin is statistically lower than the radiodensity of human coronal dentin; bovine radicular dentin is also less radiodense than human radicular dentin, although this difference was not statistically significant; (c bovine teeth should be used with care in radiographic in vitro studies.

  16. Basic statistics with Microsoft Excel: a review.

    Science.gov (United States)

    Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-06-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. Mathematical functions underpin statistical concepts, particularly the mean, median and mode, along with frequency and frequency distributions associated with histograms and graphical representations, and all of these can be evaluated through spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that govern the operation of spreadsheets in Microsoft Excel.
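
    As a check on the spreadsheet functions the review discusses, the same statistics can be computed outside Excel; the Excel formula names are shown in the comments for comparison:

```python
import statistics
from collections import Counter

data = [4, 7, 7, 8, 9, 10, 10, 10, 12, 13]

mean = statistics.mean(data)      # Excel: =AVERAGE(A1:A10)
median = statistics.median(data)  # Excel: =MEDIAN(A1:A10)
mode = statistics.mode(data)      # Excel: =MODE(A1:A10)

# Frequency distribution behind a histogram: value -> count
freq = Counter(data)
```

    For this sample the mean is 9.0, the median 9.5 (average of the two middle values) and the mode 10, matching what the corresponding Excel formulas return.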

  17. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  18. Beyond δ: Tailoring marked statistics to reveal modified gravity

    Science.gov (United States)

    Valogiannis, Georgios; Bean, Rachel

    2018-01-01

    Models that seek to explain cosmic acceleration through modifications to general relativity (GR) evade stringent Solar System constraints through a restoring, screening mechanism. Down-weighting the high-density, screened regions in favor of the low-density, unscreened ones offers the potential to enhance the amount of information carried in such modified gravity models. In this work, we assess the performance of a new "marked" transformation and perform a systematic comparison with the clipping and logarithmic transformations, in the context of ΛCDM and the symmetron and f(R) modified gravity models. Performance is measured in terms of the fractional boost in the Fisher information and the signal-to-noise ratio (SNR) for these models relative to the statistics derived from the standard density distribution. We find that all three statistics provide improved Fisher boosts over the basic density statistics. The model parameters for the marked and clipped transformations that best enhance signals and the Fisher boosts are determined. We also show that the mark is useful both as a Fourier- and real-space transformation; a marked correlation function also enhances the SNR relative to the standard correlation function, and can on mildly nonlinear scales show a significant difference between the ΛCDM and the modified gravity models. Our results demonstrate how a series of simple analytical transformations could dramatically increase the predicted information extracted on deviations from GR from large-scale surveys, and give the prospect for a much more feasible potential detection.
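
    The "marked" transformation down-weights high-density regions. One common functional form for such a mark (which may differ from this paper's exact definition) is sketched below:

```python
import numpy as np

def mark(delta, delta_s=0.6, p=1.0):
    """Density-dependent mark m = [(1 + delta_s) / (1 + delta_s + delta)]**p:
    overdense (screened) cells get small marks, underdense (unscreened)
    cells large ones.  delta_s and p are the tunable model parameters."""
    return ((1.0 + delta_s) / (1.0 + delta_s + delta)) ** p

# delta is the density contrast rho/rho_mean - 1 on each grid cell;
# the marked field is then analyzed in place of the density field
deltas = np.array([-0.5, 0.0, 2.0])
marks = mark(deltas)
```

    Tuning delta_s and p sets how aggressively screened overdensities are suppressed, which is the optimization the Fisher-boost comparison performs.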

  19. High density internal transport barriers for burning plasma operation

    International Nuclear Information System (INIS)

    Pericoli Ridolfini, V.

    2005-01-01

    One of the proposed ITER scenarios foresees the creation and sustainment of an internal transport barrier (ITB) in order to improve the confinement properties of the hot core plasma. The most stringent requirements are: the ITB must be sustained with electron heating only, with no or very small external momentum source; the strong collisional coupling at the envisaged density (line average > 1.0×10^20 m^-3) must not prevent the barrier existence; and the bootstrap current created by the large induced gradients must have a radial profile consistent with that requested by the barrier creation and sustainment. On all these items the studies carried out in FTU in the same density range (ne0 ≈ 1.5×10^20 m^-3) provide encouraging prospects. With pure electron heating and current drive (LH+ECH), steady electron barriers are generated and maintained with central electron temperature > 5.0 keV. Almost full CD conditions are established, with a bootstrap current close to 25% of the total and well aligned with that driven by the LH waves and responsible for building the barrier. The clear change in the density fluctuations close to the ITB radius, observed by reflectometry, indicates stabilization of turbulence that is consistent with the drop of the thermal electron diffusivity inside the ITB to very low values, χe < 0.5 m^2/s, estimated by the transport analysis. The 10-fold neutron rate increase testifies to significant collisional ion heating, even though ΔTi0/Ti0 usually does not exceed 40%, because the electron-ion equipartition time, always 4-5 times longer than the energy confinement time, does not allow thermal equilibrium with electrons to be attained. The ion thermal diffusivity inside the barrier must be lowered to the neoclassical level to account for the observed Ti(r) profiles, clearly indicating at least a non-degraded ion transport. The global confinement in turn improves by 1.6 times above the FTU L-scaling. The ITB radius can be controlled by varying the LH power deposition profile

  20. Statistics of the hubble diagram. II. The form of the luminosity function and density variations with application to quasars

    International Nuclear Information System (INIS)

    Turner, E.L.

    1979-01-01

    New techniques for deriving a luminosity function (LF) and a spatial density distribution rho(r) from magnitude-redshift data are presented. These techniques do not require iterative improvement of an initially guessed solution or the adoption of arbitrary analytic forms; instead, they provide explicit numerical estimates of the LF and rho(r). Thus, sources of systematic uncertainty are eliminated. This is achieved at the cost of an increase in the statistical noise. As in Paper I of this series, it is necessary to assume that the LF does not vary in functional form. An internal test of this assumption is described. These techniques are illustrated by application to a sample of 3CR and 4C quasars. The radio luminosity function is found to be a steep power law with no features. The optical luminosity function is found to be a shallow power law cut off roughly exponentially above a characteristic luminosity L_opt*(Z), corresponding roughly to M_B = -22 - 6 log(1+Z). The comoving density evolution is not well fitted by any simple function of 1+Z [e.g., (1+Z)^6 errs by factors as large as ≈5 at some redshifts] but is well represented by an exponential of look-back time. Specific analytic fits and numerical tabulations are given for each of these functions. The constant-LF-form assumption is found to be a reasonable first approximation for the quasars. Other possible applications of the new methods to problems in extragalactic and stellar astronomy are suggested

  1. The influence of changes in the VVER-1000 fuel assembly shape during operation on the power density distribution

    Energy Technology Data Exchange (ETDEWEB)

    Shishkov, L. K., E-mail: Shishkov-LK@nrcki.ru; Gorodkov, S. S.; Mikailov, E. F.; Sukhino-Homenko, E. A.; Sumarokova, A. S., E-mail: Sumarokova-AS@nrcki.ru [National Research Center Kurchatov Institute (Russian Federation)

    2016-12-15

    A new approach to calculation of the coefficients of sensitivity of the fuel pin power to deviations in gap sizes between fuel assemblies of the VVER-1000 reactor during its operation is proposed. It is shown that the calculations by the MCU code should be performed for a full-size model of the core to take the interference of the gap influence into account. In order to reduce the conservatism of calculations, the coolant density and coolant temperature feedbacks should be taken into account, as well as the fuel burnup.

  2. On the statistical interpretation of quantum mechanics: evolution of the density matrix

    International Nuclear Information System (INIS)

    Benzecri, J.P.

    1986-01-01

    Without attempting to identify ontological interpretation with a mathematical structure, we reduce philosophical speculation to five theses. In the discussion of these, a central role is devoted to the mathematical problem of the evolution of the density matrix. This article relates to the first three of these five theses. [fr]

  3. Analysis and experimental study on hydraulic balance characteristics in density lock

    International Nuclear Information System (INIS)

    Gu Haifeng; Yan Changqi; Sun Furong

    2009-01-01

    Using a simplified theoretical model, the hydraulic balance condition that must be met in the density lock, when the reactor operates normally and the density lock is closed, is obtained. The main parameters influencing this condition are analyzed, and the results show that the hydraulic balance in the density lock is self-stabilizing within a certain range. Meanwhile, a simulated experimental loop is built and the self-stability characteristic is verified experimentally. Moreover, an experimental study is carried out on flow changes of the working fluids in the primary circuit during stable operation. The experimental results show that after a change of operating parameters breaks the hydraulic balance, the hydraulic balance in the density lock recovers quickly, owing to the self-stability characteristic, without affecting the sealing performance of the density lock or the normal operation of the reactor. (authors)

  4. Statistical study of undulator radiated power by a classical detection system in the mm-wave regime

    Directory of Open Access Journals (Sweden)

    A. Eliran

    2009-05-01

    Full Text Available The statistics of FEL spontaneous emission power detected with a detector integration time much larger than the slippage time has been measured in many previous works at high frequencies. In such cases the quantum (shot) noise generated in the detection process is dominant. We have measured spontaneous emission in the Israeli electrostatic-accelerator FEL (EA-FEL), which operates at mm wavelengths. In this regime the detector is based on a diode rectifier, for which the detector quantum noise is negligible. The measurements were repeated numerous times in order to create a sample space with sufficient data for evaluating the statistical features of the radiated power. The probability density function of the radiated power was found and its moments were calculated. The results of analytical and numerical models are compared to those obtained in experimental measurements.

  5. Photon statistics, antibunching and squeezed states

    International Nuclear Information System (INIS)

    Leuchs, G.

    1986-01-01

    This paper describes the status and future prospects of experiments on photon antibunching and squeezed states. Light correlation is presented in the framework of classical electrodynamics. The extension to quantized radiation fields is discussed, and an introduction to the basic principles related to photon statistics, antibunching and squeezed states is presented. The effect of linear attenuation (beam splitters, neutral density filters, and detector quantum efficiency) on the detected signal is discussed. Experiments on the change of photon statistics by the nonlinear interaction of radiation fields with matter are described, and some experimental observations of antibunching and sub-Poissonian photon statistics in resonance fluorescence, together with possible schemes for the generation and detection of squeezed states, are examined

  6. Securing cooperation from persons supplying statistical data.

    Science.gov (United States)

    AUBENQUE, M J; BLAIKLEY, R M; HARRIS, F F; LAL, R B; NEURDENBURG, M G; DE SHELLY HERNANDEZ, R

    1954-01-01

    Securing the co-operation of persons supplying information required for medical statistics is essentially a problem in human relations, and an understanding of the motivations, attitudes, and behaviour of the respondents is necessary. Before any new statistical survey is undertaken, it is suggested by Aubenque and Harris that a preliminary review be made so that the maximum use is made of existing information. Care should also be taken not to burden respondents with an overloaded questionnaire. Aubenque and Harris recommend simplified reporting. Complete population coverage is not necessary. Neurdenburg suggests that the co-operation and support of such organizations as medical associations and social security boards are important and that propaganda should be directed specifically to the groups whose co-operation is sought. Informal personal contacts are valuable and desirable, according to Blaikley, but may have adverse effects if the right kind of approach is not made. Financial payments as an incentive in securing co-operation are opposed by Neurdenburg, who proposes that only postage-free envelopes or similar small favours be granted. Blaikley and Harris, on the other hand, express the view that financial incentives may do much to gain the support of those required to furnish data; there are, however, other incentives, and full use should be made of the natural inclinations of respondents. Compulsion may be necessary in certain instances, but administrative rather than statutory measures should be adopted. Penalties, according to Aubenque, should be inflicted only when justified by imperative health requirements. The results of surveys should be made available as soon as possible to those who co-operated, and Aubenque and Harris point out that they should also be of practical value to the suppliers of the information. Greater co-operation can be secured from medical persons who have an understanding of the statistical principles involved; Aubenque and Neurdenburg

  7. A question of separation: disentangling tracer bias and gravitational non-linearity with counts-in-cells statistics

    Science.gov (United States)

    Uhlemann, C.; Feix, M.; Codis, S.; Pichon, C.; Bernardeau, F.; L'Huillier, B.; Kim, J.; Hong, S. E.; Laigle, C.; Park, C.; Shin, J.; Pogosyan, D.

    2018-02-01

    Starting from a very accurate model for density-in-cells statistics of dark matter based on large deviation theory, a bias model for the tracer density in spheres is formulated. It adopts a mean bias relation based on a quadratic bias model to relate the log-densities of dark matter to those of mass-weighted dark haloes in real and redshift space. The validity of the parametrized bias model is established using a parametrization-independent extraction of the bias function. This average bias model is then combined with the dark matter PDF, neglecting any scatter around it: it nevertheless yields an excellent model for densities-in-cells statistics of mass tracers that is parametrized in terms of the underlying dark matter variance and three bias parameters. The procedure is validated on measurements of both the one- and two-point statistics of subhalo densities in the state-of-the-art Horizon Run 4 simulation showing excellent agreement for measured dark matter variance and bias parameters. Finally, it is demonstrated that this formalism allows for a joint estimation of the non-linear dark matter variance and the bias parameters using solely the statistics of subhaloes. Having verified that galaxy counts in hydrodynamical simulations sampled on a scale of 10 Mpc h-1 closely resemble those of subhaloes, this work provides important steps towards making theoretical predictions for density-in-cells statistics applicable to upcoming galaxy surveys like Euclid or WFIRST.

  8. A Primer on Receiver Operating Characteristic Analysis and Diagnostic Efficiency Statistics for Pediatric Psychology: We Are Ready to ROC

    Science.gov (United States)

    2014-01-01

    Objective To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Method Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Results Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores ≥30 had a diagnostic likelihood ratio of 7.4. Conclusions This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses. PMID:23965298
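As a companion to the record above, the core ROC quantities it names (the curve itself, the area under it, and a diagnostic likelihood ratio) can be sketched in a few lines. This is an illustrative numpy sketch on synthetic scores, not the SPSS/R scripts from the paper's appendices; all names and the toy data are hypothetical.

```python
import numpy as np

def roc_curve(scores, labels):
    """ROC points (FPR, TPR) obtained by sweeping a threshold down the scores."""
    order = np.argsort(-scores)            # descending score order
    labels = labels[order]
    tps = np.cumsum(labels)                # true positives at each cut
    fps = np.cumsum(1 - labels)            # false positives at each cut
    tpr = np.concatenate(([0.0], tps / labels.sum()))
    fpr = np.concatenate(([0.0], fps / (len(labels) - labels.sum())))
    return fpr, tpr

def auc(fpr, tpr):
    """Trapezoidal area under the ROC curve."""
    return np.trapz(tpr, fpr)

def diagnostic_likelihood_ratio(scores, labels, threshold):
    """DLR+ = sensitivity / (1 - specificity) for the rule 'score >= threshold'."""
    pred = scores >= threshold
    sens = np.mean(pred[labels == 1])
    spec = np.mean(~pred[labels == 0])
    return sens / (1.0 - spec)

# toy data: the 'disordered' group tends to score higher (hypothetical)
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(1.5, 1.0, 100)])
labels = np.concatenate([np.zeros(500, dtype=int), np.ones(100, dtype=int)])

fpr, tpr = roc_curve(scores, labels)
area = auc(fpr, tpr)
dlr = diagnostic_likelihood_ratio(scores, labels, 1.0)
```

With well-separated groups the area approaches 1, and a DLR well above 1 indicates that scores over the threshold substantially raise the posterior odds of the diagnosis.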

  9. Advances in statistical multisource-multitarget information fusion

    CERN Document Server

    Mahler, Ronald PS

    2014-01-01

    This is the sequel to the 2007 Artech House bestselling title, Statistical Multisource-Multitarget Information Fusion. That earlier book was a comprehensive resource for an in-depth understanding of finite-set statistics (FISST), a unified, systematic, and Bayesian approach to information fusion. The cardinalized probability hypothesis density (CPHD) filter, which was first systematically described in the earlier book, has since become a standard multitarget detection and tracking technique, especially in research and development. Since 2007, FISST has inspired a considerable amount of research

  10. Statistical Properties of Clear and Dark Duration Lengths

    Czech Academy of Sciences Publication Activity Database

    Brabec, Marek; Paulescu, M.; Badescu, V.

    2017-01-01

    Roč. 153, 1 September (2017), s. 508-518 ISSN 0038-092X Institutional support: RVO:67985807 Keywords : sunshine duration * clouds * solar irradiance * statistical lifetime modeling * cox regression * censored data Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability Impact factor: 4.018, year: 2016

  11. Density in Liquids.

    Science.gov (United States)

    Nesin, Gert; Barrow, Lloyd H.

    1984-01-01

    Describes a fourth-grade unit on density which introduces a concept useful in the study of chemistry and procedures appropriate to the chemistry laboratory. The hands-on activities, which use simple equipment and household substances, are at the level of thinking Piaget describes as concrete operational. (BC)

  12. On the statistical interpretation of quantum mechanics: evolution of the density matrix

    International Nuclear Information System (INIS)

    Benzecri, J.-P.

    1986-01-01

    Using two classical examples (the Young slit experiment and coherent and incoherent crystal diffraction of neutrons) we show in a general framework, that for a system viewed as consisting of two components, depolarisation of the density matrix by one of these can result from the application of the Schroedinger equation to the global system [fr

  13. Edge operational space for high density/high confinement ELMY H-modes in JET

    International Nuclear Information System (INIS)

    Sartori, R.; Saibene, G.; Loarte, A.

    2002-01-01

    This paper discusses how the proximity to the L-H threshold affects the confinement of ELMy H-modes at high density. The largest reduction in confinement at high density is observed at the transition from the Type I to the Type III ELMy regime. At medium plasma triangularity, δ≅0.3 (where δ is the average triangularity at the separatrix), JET experiments show that by increasing the margin above the L-H threshold power and maintaining the edge temperature above the critical temperature for the transition to Type III ELMs, it is possible to avoid the degradation of the pedestal pressure with density, normally observed at lower power. As a result, the range of achievable densities (both in the core and in the pedestal) is increased. At high power above the L-H threshold power the core density was equal to the Greenwald limit with H97≅0.9. There is evidence that a mixed regime of Type I and Type II ELMs has been obtained at this intermediate triangularity, possibly as a result of this increase in density. At higher triangularity, δ≅0.5, the power required to achieve similar results is lower. (author)

  14. Statistics of Smoothed Cosmic Fields in Perturbation Theory. I. Formulation and Useful Formulae in Second-Order Perturbation Theory

    Science.gov (United States)

    Matsubara, Takahiko

    2003-02-01

    We formulate a general method for perturbative evaluations of statistics of smoothed cosmic fields and provide useful formulae for application of the perturbation theory to various statistics. This formalism is an extensive generalization of the method used by Matsubara, who derived a weakly nonlinear formula of the genus statistic in a three-dimensional density field. After describing the general method, we apply the formalism to a series of statistics, including genus statistics, level-crossing statistics, Minkowski functionals, and a density extrema statistic, regardless of the dimensions in which each statistic is defined. The relation between the Minkowski functionals and other geometrical statistics is clarified. These statistics can be applied to several cosmic fields, including three-dimensional density field, three-dimensional velocity field, two-dimensional projected density field, and so forth. The results are detailed for second-order theory of the formalism. The effect of the bias is discussed. The statistics of smoothed cosmic fields as functions of rescaled threshold by volume fraction are discussed in the framework of second-order perturbation theory. In CDM-like models, their functional deviations from linear predictions plotted against the rescaled threshold are generally much smaller than that plotted against the direct threshold. There is still a slight meatball shift against rescaled threshold, which is characterized by asymmetry in depths of troughs in the genus curve. A theory-motivated asymmetry factor in the genus curve is proposed.
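For orientation, the linear-theory benchmark against which such weakly nonlinear deviations are measured can be recalled (a standard result for Gaussian random fields, quoted here from memory rather than from the paper itself): the genus per unit volume at threshold ν = δ/σ is

```latex
g(\nu) \;\propto\; \left(1 - \nu^2\right) e^{-\nu^2/2}
```

so the linear genus curve is symmetric in ν, and the "meatball shift" mentioned above appears as an asymmetry between the depths of the troughs near ν ≈ ±1 once second-order terms are included.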

  15. Air Carrier Traffic Statistics.

    Science.gov (United States)

    2013-11-01

    This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...

  16. Air Carrier Traffic Statistics.

    Science.gov (United States)

    2012-07-01

    This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...

  17. Uniform Statistical Convergence on Time Scales

    Directory of Open Access Journals (Sweden)

    Yavuz Altin

    2014-01-01

    Full Text Available We introduce the concepts of m- and (λ, m)-uniform density of a set and of m- and (λ, m)-uniform statistical convergence on an arbitrary time scale. We also define the m-uniform Cauchy function on a time scale. Furthermore, some relations between these new notions are obtained.

  18. Gamow-Jordan vectors and non-reducible density operators from higher-order S-matrix poles

    International Nuclear Information System (INIS)

    Bohm, A.; Loewe, M.; Maxson, S.; Patuleanu, P.; Puentmann, C.; Gadella, M.

    1997-01-01

    In analogy to Gamow vectors that are obtained from first-order resonance poles of the S-matrix, one can also define higher-order Gamow vectors which are derived from higher-order poles of the S-matrix. An S-matrix pole of r-th order at z_R = E_R - iΓ/2 leads to r generalized eigenvectors of order k = 0, 1, ..., r-1, which are also Jordan vectors of degree k+1 with generalized eigenvalue E_R - iΓ/2. The Gamow-Jordan vectors are elements of a generalized complex eigenvector expansion, whose form suggests the definition of a state operator (density matrix) for the microphysical decaying state of this higher-order pole. This microphysical state is a mixture of non-reducible components. In spite of the fact that the k-th order Gamow-Jordan vectors have the polynomial time dependence which one always associates with higher-order poles, the microphysical state obeys a purely exponential decay law. copyright 1997 American Institute of Physics

  19. Statistical gamma transitions in 174Hf

    Energy Technology Data Exchange (ETDEWEB)

    Farris, L P; Cizewski, J A; Brinkman, M J; Henry, R G; Lee, C S [Rutgers--the State Univ., New Brunswick, NJ (United States); Khoo, T L; Janssens, R V.F.; Moore, E F; Carpenter, M P; Ahmad, I; Lauritsen, T [Argonne National Lab., IL (United States); Kolata, J J; Beard, K B; Ye, B; Garg, U [Notre Dame Univ., IN (United States); Kaplan, M S; Saladin, J X; Winchell, D [Pittsburgh Univ., PA (United States)

    1992-08-01

    The statistical spectrum extracted from the 172Yb(α,2n)174Hf reaction was fit with Monte Carlo simulations using a modified GDR E1 strength function and several formulations of the level density. (author). 15 refs., 1 tab., 3 figs.

  20. Statistical Analysis of Model Data for Operational Space Launch Weather Support at Kennedy Space Center and Cape Canaveral Air Force Station

    Science.gov (United States)

    Bauman, William H., III

    2010-01-01

    The 12-km resolution North American Mesoscale (NAM) model (MesoNAM) is used by the 45th Weather Squadron (45 WS) Launch Weather Officers at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) to support space launch weather operations. The 45 WS tasked the Applied Meteorology Unit to conduct an objective statistics-based analysis of MesoNAM output compared to wind tower mesonet observations and then develop an operational tool to display the results. The National Centers for Environmental Prediction began running the current version of the MesoNAM in mid-August 2006. The period of record for the dataset was 1 September 2006 - 31 January 2010. The AMU evaluated MesoNAM hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12 and 18 UTC. The MesoNAM forecast winds, temperature and dew point were compared to the observed values of these parameters from the sensors in the KSC/CCAFS wind tower network. The data sets were stratified by model initialization time, month and onshore/offshore flow for each wind tower. Statistics computed included bias (mean difference), standard deviation of the bias, root mean square error (RMSE) and a hypothesis test for bias = 0. Twelve wind towers located in close proximity to key launch complexes were used for the statistical analysis, with the sensors on the towers positioned at varying heights to include 6 ft, 30 ft, 54 ft, 60 ft, 90 ft, 162 ft, 204 ft and 230 ft depending on the launch vehicle and associated weather launch commit criteria being evaluated. These twelve wind towers support activities for the Space Shuttle (launch and landing), Delta IV, Atlas V and Falcon 9 launch vehicles. For all twelve towers, the results indicate a diurnal signal in the bias of temperature (T) and a weaker but discernible diurnal signal in the bias of dewpoint temperature (T(sub d)) in the MesoNAM forecasts. Also, the standard deviation of the bias and RMSE of T, T(sub d), wind speed and wind
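The verification statistics listed in this record (bias, standard deviation of the bias, RMSE, and a hypothesis test for bias = 0) are straightforward to compute; a minimal sketch, with a hypothetical toy forecast/observation pair, might look like:

```python
import numpy as np

def verification_stats(forecast, observed):
    """Bias (mean difference), standard deviation of the bias, RMSE,
    and a one-sample t statistic for the hypothesis bias = 0."""
    diff = np.asarray(forecast, dtype=float) - np.asarray(observed, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)                   # standard deviation of the bias
    rmse = np.sqrt(np.mean(diff ** 2))      # root mean square error
    t = bias / (sd / np.sqrt(diff.size))    # t statistic for H0: bias = 0
    return bias, sd, rmse, t

# hypothetical toy data: four matched forecast/observation pairs
bias, sd, rmse, t = verification_stats([1.0, 2.0, 3.0, 4.0],
                                       [0.5, 2.0, 3.5, 4.0])
```

In an evaluation like the AMU's, these four numbers would be computed per tower, per sensor height, and per stratification (initialization time, month, onshore/offshore flow).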

  1. A New Statistical Tool: Scalar Score Function

    Czech Academy of Sciences Publication Activity Database

    Fabián, Zdeněk

    2011-01-01

    Roč. 2, - (2011), s. 109-116 ISSN 1934-7332 R&D Projects: GA ČR GA205/09/1079 Institutional research plan: CEZ:AV0Z10300504 Keywords : statistics * inference function * data characteristics * point estimates * heavy tails Subject RIV: BB - Applied Statistics, Operational Research

  2. A Balanced Approach to Adaptive Probability Density Estimation

    Directory of Open Access Journals (Sweden)

    Julio A. Kovacs

    2017-04-01

    Full Text Available Our development of a Fast Mutual Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it on molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical application domains such as bioinformatics, signal processing and econometrics.
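BADE itself is not reproduced here, but the general idea it builds on, adaptive density estimation with a per-point bandwidth driven by a nearest-neighbor search, can be illustrated with a short sketch (a generic adaptive kernel density estimator, not the authors' method; the function name and parameters are invented):

```python
import numpy as np

def adaptive_kde(data, grid, k=20, min_bw=1e-3):
    """Gaussian mixture density estimate in which each sample's bandwidth
    is its distance to the k-th nearest neighbour, so smoothing widens
    automatically in sparse regions (generic scheme, not BADE itself)."""
    data = np.asarray(data, dtype=float)
    dists = np.abs(data[:, None] - data[None, :])          # pairwise distances
    h = np.maximum(np.sort(dists, axis=1)[:, k], min_bw)   # per-point bandwidth
    z = (grid[:, None] - data[None, :]) / h[None, :]
    kernels = np.exp(-0.5 * z ** 2) / (np.sqrt(2 * np.pi) * h[None, :])
    return kernels.mean(axis=1)                            # average of the kernels

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, 400)        # uneven sampling in the tails
grid = np.linspace(-6.0, 6.0, 601)
dens = adaptive_kde(data, grid)
mass = np.trapz(dens, grid)             # total probability mass, close to 1
```

The brute-force pairwise distance matrix is the quadratic-cost step that an efficient nearest-neighbor search (as in BADE) replaces for large data sizes.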

  3. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...

  4. Inference of core barrel motion from neutron noise spectral density. [PWR]

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.C.; Shahrokhi, F.; Kryter, R.C.

    1977-03-15

    A method was developed for inference of core barrel motion from the following statistical descriptors: cross-power spectral density, autopower spectral density, and amplitude probability density. To quantify the core barrel motion in a typical pressurized water reactor (PWR), a scale factor was calculated in both one- and two-dimensional geometries using forward, variational, and perturbation methods of discrete ordinates neutron transport. A procedure for selection of the proper frequency band limits for the statistical descriptors was developed. It was found that although perturbation theory is adequate for the calculation of the scale factor, two-dimensional geometric effects are important enough to rule out the use of a one-dimensional approximation for all but the crudest calculations. It was also found that contributions of gamma rays can be ignored and that the results are relatively insensitive to the cross-section set employed. The proper frequency band for the statistical descriptors is conveniently determined from the coherence and phase information from two ex-core power range neutron monitors positioned diametrically across the reactor vessel. Core barrel motion can then be quantified from the integral of the band-limited cross-power spectral density of two diametrically opposed ex-core monitors or, if the coherence between the pair is greater than or equal to 0.7, from a properly band-limited amplitude probability density function. Wide-band amplitude probability density functions were demonstrated to yield erroneous estimates for the magnitude of core barrel motion.
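The statistical descriptors used in this record, the cross-power spectral density of two ex-core monitors and the coherence and phase derived from it, are commonly estimated by the averaged-periodogram (Welch) route. A compact numpy sketch on two synthetic "detector" signals sharing a common tone (all names and parameters hypothetical) is:

```python
import numpy as np

def welch_spectra(x, y, fs=1.0, nperseg=256):
    """Welch-style estimates of the auto- and cross-power spectral densities
    of x and y, plus the magnitude-squared coherence |Pxy|^2 / (Pxx * Pyy)."""
    step = nperseg // 2                       # 50% segment overlap
    win = np.hanning(nperseg)
    scale = fs * (win ** 2).sum()             # density normalization
    starts = range(0, len(x) - nperseg + 1, step)
    n = nperseg // 2 + 1
    Pxx = np.zeros(n); Pyy = np.zeros(n); Pxy = np.zeros(n, dtype=complex)
    for s in starts:                          # average windowed periodograms
        X = np.fft.rfft(win * x[s:s + nperseg])
        Y = np.fft.rfft(win * y[s:s + nperseg])
        Pxx += np.abs(X) ** 2 / scale
        Pyy += np.abs(Y) ** 2 / scale
        Pxy += np.conj(X) * Y / scale
    m = len(starts)
    Pxx /= m; Pyy /= m; Pxy /= m
    freqs = np.fft.rfftfreq(nperseg, 1.0 / fs)
    coh = np.abs(Pxy) ** 2 / (Pxx * Pyy)
    return freqs, Pxy, coh

# two noisy channels sharing one common oscillation (a stand-in for the
# coherent core-barrel-motion component seen by two ex-core detectors)
rng = np.random.default_rng(2)
t = np.arange(8192)
common = np.sin(2 * np.pi * 0.1 * t)
x = common + 0.3 * rng.standard_normal(t.size)
y = common + 0.3 * rng.standard_normal(t.size)
freqs, Pxy, coh = welch_spectra(x, y)
idx = np.argmin(np.abs(freqs - 0.1))          # bin nearest the shared tone
```

The coherence is near 1 only where the two channels share a common source, which is exactly the criterion (coherence ≥ 0.7) the record uses for selecting the frequency band.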

  5. TRACING THE STAR-FORMATION-DENSITY RELATION TO z ≈ 2

    Energy Technology Data Exchange (ETDEWEB)

    Quadri, Ryan F.; Williams, Rik J. [Carnegie Observatories, Pasadena, CA 91101 (United States); Franx, Marijn; Hildebrandt, Hendrik, E-mail: quadri@obs.carnegiescience.edu [Leiden Observatory, Leiden University, NL-2300 RA Leiden (Netherlands)

    2012-01-10

    Recent work has shown that the star formation (SF)-density relation, in which galaxies with low SF rates are preferentially found in dense environments, is still in place at z ≈ 1, but the situation becomes less clear at higher redshifts. We use mass-selected samples drawn from the UKIDSS Ultra-Deep Survey to show that galaxies with quenched SF tend to reside in dense environments out to at least z ≈ 1.8. Over most of this redshift range we are able to demonstrate that this SF-density relation holds even at fixed stellar mass. The environmental quenching of SF appears to operate with similar efficiency on all galaxies regardless of stellar mass. Nevertheless, the environment plays a greater role in the buildup of the red sequence at lower masses, whereas other quenching processes dominate at higher masses. In addition to a statistical analysis of environmental densities, we investigate a cluster at z = 1.6, and show that the central region has an elevated fraction of quiescent objects relative to the field. Although the uncertainties are large, the environmental quenching efficiency in this cluster is consistent with that of galaxy groups and clusters at z ≈ 0. In this work we rely on photometric redshifts and describe some of the pitfalls that large redshift errors can present.

  6. Path-integral computation of superfluid densities

    International Nuclear Information System (INIS)

    Pollock, E.L.; Ceperley, D.M.

    1987-01-01

    The normal and superfluid densities are defined by the response of a liquid to sample boundary motion. The free-energy change due to uniform boundary motion can be calculated by path-integral methods from the distribution of the winding number of the paths around a periodic cell. This provides a conceptually and computationally simple way of calculating the superfluid density for any Bose system. The linear-response formulation relates the superfluid density to the momentum-density correlation function, which has a short-ranged part related to the normal density and, in the case of a superfluid, a long-ranged part whose strength is proportional to the superfluid density. These facts are discussed in the context of path-integral computations and demonstrated for liquid 4He along the saturated vapor-pressure curve. Below the experimental superfluid transition temperature the computed superfluid fractions agree with the experimental values to within the statistical uncertainties of a few percent in the computations. The computed transition is broadened by finite-sample-size effects
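The winding-number estimator described above can be stated compactly. In the standard three-dimensional form (reproduced here from memory as a reader's aid; prefactor conventions vary between references), the superfluid fraction of N particles of mass m in a periodic cell of side L at inverse temperature β is

```latex
\frac{\rho_s}{\rho} \;=\; \frac{m\,\langle \mathbf{W}^2 \rangle\, L^2}{3\,\hbar^2 \beta N},
\qquad
\mathbf{W} \;=\; \frac{1}{L} \sum_{i=1}^{N} \bigl(\mathbf{r}_i(\hbar\beta) - \mathbf{r}_i(0)\bigr),
```

where W is the integer-valued winding vector counting how many times the imaginary-time paths wrap around the periodic cell. Only exchange cycles that wind around the cell contribute, which is why this estimator is both conceptually and computationally simple.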

  7. Relativistic density functional theory with picture-change corrected electron density based on infinite-order Douglas-Kroll-Hess method

    Science.gov (United States)

    Oyama, Takuro; Ikabata, Yasuhiro; Seino, Junji; Nakai, Hiromi

    2017-07-01

    This Letter proposes a density functional treatment based on the two-component relativistic scheme at the infinite-order Douglas-Kroll-Hess (IODKH) level. The exchange-correlation energy and potential are calculated using the electron density based on the picture-change corrected density operator transformed by the IODKH method. Numerical assessments indicated that the picture-change uncorrected density functional terms generate significant errors, on the order of hartree for heavy atoms. The present scheme was found to reproduce the energetics in the four-component treatment with high accuracy.

  8. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of extreme response and failure probability of structures subjected to ultimate design loads is essential for the structural design of wind turbines according to the new standard IEC61400-1. This task is the focus of the present paper, by virtue of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-megawatt wind turbines, for illustrative purposes, in the case of given mean wind speeds and turbulence levels, is investigated through the scheme of extreme value distribution instead of any of the approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits

  9. Tucker tensor analysis of Matern functions in spatial statistics

    KAUST Repository

    Litvinenko, Alexander

    2018-04-20

    Low-rank Tucker tensor methods in spatial statistics: 1. Motivation: improve statistical models; 2. Motivation: disadvantages of matrices; 3. Tools: Tucker tensor format; 4. Tensor approximation of the Matern covariance function via FFT; 5. Typical statistical operations in the Tucker tensor format; 6. Numerical experiments
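The outline mentions "typical statistical operations in the Tucker tensor format"; as a concrete illustration of the format itself, a truncated higher-order SVD (one standard route to a Tucker decomposition, not necessarily the construction used in this work) can be sketched with numpy:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: the chosen mode becomes the row index."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Multilinear (mode-n) product T x_mode M."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd(T, ranks):
    """Truncated higher-order SVD: T ~ G x_1 U1 x_2 U2 x_3 U3,
    with factor U_n from the SVD of the mode-n unfolding."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    G = T
    for m, U in enumerate(factors):
        G = mode_multiply(G, U.T, m)    # project onto the factor subspaces
    return G, factors

# build an exactly rank-(2,2,2) tensor, compress it, and reconstruct
rng = np.random.default_rng(3)
core = rng.standard_normal((2, 2, 2))
Us = [np.linalg.qr(rng.standard_normal((n, 2)))[0] for n in (8, 9, 10)]
T = core
for m, U in enumerate(Us):
    T = mode_multiply(T, U, m)

G, factors = hosvd(T, (2, 2, 2))
That = G
for m, U in enumerate(factors):
    That = mode_multiply(That, U, m)
err = np.linalg.norm(That - T) / np.linalg.norm(T)
```

The payoff for spatial statistics is storage: the 8x9x10 tensor is represented by a 2x2x2 core plus three thin factors, and many statistical operations (contractions, quadratic forms) can be carried out directly in this compressed format.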

  10. Correlation density matrices for one-dimensional quantum chains based on the density matrix renormalization group

    International Nuclear Information System (INIS)

    Muender, W; Weichselbaum, A; Holzner, A; Delft, Jan von; Henley, C L

    2010-01-01

    A useful concept for finding numerically the dominant correlations of a given ground state in an interacting quantum lattice system in an unbiased way is the correlation density matrix (CDM). For two disjoint, separated clusters, it is defined to be the density matrix of their union minus the direct product of their individual density matrices and contains all the correlations between the two clusters. We show how to extract from the CDM a survey of the relative strengths of the system's correlations in different symmetry sectors and the nature of their decay with distance (power law or exponential), as well as detailed information on the operators carrying long-range correlations and the spatial dependence of their correlation functions. To achieve this goal, we introduce a new method of analysing the CDM, termed the dominant operator basis (DOB) method, which identifies in an unbiased fashion a small set of operators for each cluster that serve as a basis for the dominant correlations of the system. We illustrate this method by analysing the CDM for a spinless extended Hubbard model that features a competition between charge density correlations and pairing correlations, and show that the DOB method successfully identifies their relative strengths and dominant correlators. To calculate the ground state of this model, we use the density matrix renormalization group, formulated in terms of a variational matrix product state (MPS) approach within which subsequent determination of the CDM is very straightforward. In an extended appendix, we give a detailed tutorial introduction to our variational MPS approach for ground state calculations for one-dimensional quantum chain models. We present in detail how MPSs overcome the problem of large Hilbert space dimensions in these models and describe all the techniques needed for handling them in practice.

  11. Statistics and finance an introduction

    CERN Document Server

    Ruppert, David

    2004-01-01

    This textbook emphasizes the applications of statistics and probability to finance. Students are assumed to have had a prior course in statistics, but no background in finance or economics. The basics of probability and statistics are reviewed and more advanced topics in statistics, such as regression, ARMA and GARCH models, the bootstrap, and nonparametric regression using splines, are introduced as needed. The book covers the classical methods of finance such as portfolio theory, CAPM, and the Black-Scholes formula, and it introduces the somewhat newer area of behavioral finance. Applications and use of MATLAB and SAS software are stressed. The book will serve as a text in courses aimed at advanced undergraduates and masters students in statistics, engineering, and applied mathematics as well as quantitatively oriented MBA students. Those in the finance industry wishing to know more statistics could also use it for self-study. David Ruppert is the Andrew Schultz, Jr. Professor of Engineering, School of Oper...

  12. East African Journal of Statistics: Editorial Policies

    African Journals Online (AJOL)

    Focus and Scope. EAJOSTA publishes the latest findings in applied and theoretical statistics. The journal also accepts papers in operations research, financial mathematics and actuarial sciences, all considered part of applied statistics. Articles must present original research that has not been accepted for publication ...

  13. The asymptotic behavior of Frobenius-Perron operator with local lower-bound function

    International Nuclear Information System (INIS)

    Ding Yiming

    2003-01-01

    Let (X,Σ,μ) be a σ-finite measure space, S:X→X a nonsingular transformation, and P_S: L^1 → L^1 the Frobenius-Perron operator associated with S. It is proved that if P_S satisfies the local lower-bound function condition, then for every f ∈ D the sequence {P_S^n f} converges strongly to a stationary density of P_S as n→∞. The statistical stability of S is also addressed via the local lower-bound function method
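An illustrative sketch (my own, not from the paper): for the dyadic map S(x) = 2x mod 1, the Frobenius-Perron operator has the closed form (P_S f)(x) = (f(x/2) + f((x+1)/2))/2, and the iterates {P_S^n f} converge to the stationary (uniform) density f* = 1 on [0, 1).

```python
# Frobenius-Perron operator of the dyadic map S(x) = 2x mod 1:
#   (P_S f)(x) = (f(x/2) + f((x+1)/2)) / 2.
# Iterating it drives any starting density toward the uniform density f* = 1.

def fp_dyadic(f):
    """Apply the Frobenius-Perron operator of the dyadic map to density f."""
    return lambda x: 0.5 * (f(x / 2) + f((x + 1) / 2))

f = lambda x: 2.0 * x        # an initial density on [0, 1)
for _ in range(12):          # iterate P_S twelve times
    f = fp_dyadic(f)

print(abs(f(0.25) - 1.0) < 1e-3)   # True: iterates approach the uniform density
```

The error halves with each application of the operator here, a simple numerical illustration of the strong convergence to a stationary density that the paper proves under the lower-bound condition.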

  14. From Quality to Information Quality in Official Statistics

    Directory of Open Access Journals (Sweden)

    Kenett Ron S.

    2016-12-01

    The term quality of statistical data, developed and used in official statistics and international organizations such as the International Monetary Fund (IMF and the Organisation for Economic Co-operation and Development (OECD, refers to the usefulness of summary statistics generated by producers of official statistics. Similarly, in the context of survey quality, official agencies such as Eurostat, National Center for Science and Engineering Statistics (NCSES, and Statistics Canada have created dimensions for evaluating the quality of a survey and its ability to report ‘accurate survey data’.

  15. Density Distribution Sunflower Plots

    Directory of Open Access Journals (Sweden)

    William D. Dupont

    2003-01-01

    Density distribution sunflower plots are used to display high-density bivariate data. They are useful for data where a conventional scatter plot is difficult to read due to overstriking of the plot symbol. The x-y plane is subdivided into a lattice of regular hexagonal bins of width w specified by the user. The user also specifies the values of l, d, and k that affect the plot as follows. Individual observations are plotted when there are fewer than l observations per bin, as in a conventional scatter plot. Each bin with l to d observations contains a light sunflower. Other bins contain a dark sunflower. In a light sunflower each petal represents one observation. In a dark sunflower, each petal represents k observations. (A dark sunflower with p petals represents between pk - k/2 and pk + k/2 observations.) The user can control the sizes and colors of the sunflowers. By selecting appropriate colors and sizes for the light and dark sunflowers, plots can be obtained that give both the overall sense of the data density distribution as well as the number of data points in any given region. The use of this graphic is illustrated with data from the Framingham Heart Study. A documented Stata program, called sunflower, is available to draw these graphs. It can be downloaded from the Statistical Software Components archive at http://ideas.repec.org/c/boc/bocode/s430201.html . (Journal of Statistical Software 2003; 8(3): 1-5. Posted at http://www.jstatsoft.org/index.php?vol=8 .)
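A minimal sketch (my own illustration, not Dupont's Stata code) of the per-bin classification rule described above; the threshold values l, d, and k below are invented for the example.

```python
# Sunflower-plot symbol rule: bins with fewer than l observations show the
# individual points, bins with l..d observations get a light sunflower (one
# petal per observation), and denser bins get a dark sunflower whose petals
# each stand for k observations.

def sunflower_symbol(count, l=3, d=12, k=5):
    """Return (kind, petals) for a hexagonal bin holding `count` observations."""
    if count < l:
        return ("points", count)          # plot individual observations
    if count <= d:
        return ("light", count)           # one petal per observation
    return ("dark", round(count / k))     # one petal per k observations

print(sunflower_symbol(2))    # ('points', 2)
print(sunflower_symbol(7))    # ('light', 7)
print(sunflower_symbol(52))   # ('dark', 10)
```

The rounding in the dark-sunflower branch is what makes a dark sunflower with p petals represent between pk - k/2 and pk + k/2 observations.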

  16. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
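A hedged illustration of the diagnostic-test measures mentioned above (sensitivity, specificity, accuracy, likelihood ratios), computed from a hypothetical 2x2 confusion matrix; the counts below are invented for the example.

```python
# Diagnostic-test summary measures from true/false positives and negatives.

def diagnostic_stats(tp, fp, fn, tn):
    """Return (sensitivity, specificity, accuracy, LR+, LR-)."""
    sens = tp / (tp + fn)                  # P(test positive | disease present)
    spec = tn / (tn + fp)                  # P(test negative | disease absent)
    acc = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sens / (1 - spec)             # positive likelihood ratio
    lr_neg = (1 - sens) / spec             # negative likelihood ratio
    return sens, spec, acc, lr_pos, lr_neg

sens, spec, acc, lr_pos, lr_neg = diagnostic_stats(tp=90, fp=20, fn=10, tn=180)
print(sens, spec, acc)   # 0.9 0.9 0.9
```

Varying the decision threshold of a test traces out pairs of (sensitivity, 1 - specificity), which is exactly what a receiver operating characteristic curve plots.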

  17. Characteristics of PEMFC operating at high current density with low external humidification

    International Nuclear Information System (INIS)

    Fan, Linhao; Zhang, Guobin; Jiao, Kui

    2017-01-01

    Highlights: • PEMFC with low humidity and high current density is studied by numerical simulation. • At high current density, water production lowers external humidification requirement. • A steady anode circulation status without external humidification is demonstrated. • The corresponding detailed internal water transfer path in the PEMFC is illustrated. • Counter-flow is superior to co-flow at low anode external humidification. - Abstract: A three-dimensional multiphase numerical model for a proton exchange membrane fuel cell (PEMFC) is developed to study the fuel cell performance and water transport properties with low external humidification. The results show that sufficient external humidification is necessary to prevent polymer electrolyte dehydration at low current density, while at high current density the water produced in the cathode CL is enough to humidify the polymer electrolyte without external humidification, by flowing back and forth between the anode and cathode across the membrane. Furthermore, a steady anode circulation status without external humidification is demonstrated in this study, and the corresponding detailed internal water transfer path is illustrated. Additionally, it is found that the water balance under the counter-flow arrangement is superior to co-flow at low anode external humidification.

  18. Functional integral approach to classical statistical dynamics

    International Nuclear Information System (INIS)

    Jensen, R.V.

    1980-04-01

    A functional integral method is developed for the statistical solution of nonlinear stochastic differential equations which arise in classical dynamics. The functional integral approach provides a very natural and elegant derivation of the statistical dynamical equations that have been derived using the operator formalism of Martin, Siggia, and Rose

  19. Clinical Decision Support: Statistical Hopes and Challenges

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Zvárová, Jana

    2016-01-01

    Roč. 4, č. 1 (2016), s. 30-34 ISSN 1805-8698 Grant - others:Nadační fond na podporu vědy(CZ) Neuron Institutional support: RVO:67985807 Keywords : decision support * data mining * multivariate statistics * psychiatry * information based medicine Subject RIV: BB - Applied Statistics, Operational Research

  20. Study on the effect of moderator density reactivity for Kartini reactor

    International Nuclear Information System (INIS)

    Budi Rohman; Widarto

    2009-01-01

    One of the important characteristics of water-cooled reactors is the change of reactivity due to a change in the density of the coolant or moderator. This parameter generally has a negative value and plays a significant role in preventing power excursions during operation. Many thermal-hydraulic codes for nuclear reactors require this parameter as input to account for reactivity feedback due to an increase in moderator voids and the subsequent decrease in moderator density during operation. The Kartini reactor is cooled and moderated by water; therefore, it is essential to study the effect of the change in moderator density and to determine the value of the void or moderator density reactivity coefficient in order to characterize the reactor's behavior resulting from the presence of vapor or a change of moderator density during operation. Analysis with the MCNP code shows that the reactivity of the core decreases with decreasing moderator density. The analysis estimates the void or moderator density reactivity coefficient for the Kartini reactor to be -2.17×10^-4 Δρ/% void. (author)

  1. Cylinders out of a top hat: counts-in-cells for projected densities

    Science.gov (United States)

    Uhlemann, Cora; Pichon, Christophe; Codis, Sandrine; L'Huillier, Benjamin; Kim, Juhan; Bernardeau, Francis; Park, Changbom; Prunet, Simon

    2018-06-01

    Large deviation statistics is implemented to predict the statistics of cosmic densities in cylinders applicable to photometric surveys. It yields analytical predictions, accurate to a few per cent, for the one-point probability distribution function (PDF) of densities in concentric or compensated cylinders, and also captures the density dependence of their angular clustering (cylinder bias). All predictions are found to be in excellent agreement with the cosmological simulation Horizon Run 4 in the quasi-linear regime, where standard perturbation theory normally breaks down. These results are combined with a simple local bias model that relates dark matter and tracer densities in cylinders and validated on simulated halo catalogues. This formalism can be used to probe cosmology with existing and upcoming photometric surveys like DES, Euclid or WFIRST containing billions of galaxies.

  2. Hydrological impact of high-density small dams in a humid catchment, Southeast China

    Science.gov (United States)

    Lu, W.; Lei, H.; Yang, D.

    2017-12-01

    The Jiulong River basin is a humid catchment with a drainage area of 14,741 km2; however, it has over 1000 hydropower stations within it. Catchments with such high-density small dams are rare in China, yet little is known about the impact of high-density small dams on streamflow changes. To what extent the large number of dams alters the hydrologic patterns is a fundamental scientific issue for water resources management, flood control, and aquatic ecological environment protection. Firstly, trend and change point analyses are applied to determine the characteristics of inter-annual streamflow. Based on the detected change point, the study period is divided into two sub-periods, the "natural" and "disturbed" periods. Then, a geomorphology-based hydrological model (GBHM) and the fixing-changing method are adopted to evaluate the relative contributions of climate variations and damming to the changes in streamflow at each temporal scale (i.e., from daily and monthly to annual). Based on the simulated natural streamflow, the impact of dam construction on hydrologic alteration and the aquatic ecological environment will be evaluated. The hydrologic signatures that will be investigated include flood peak, seasonality of streamflow, and the inter-annual variability of streamflow. In particular, the impacts of damming on the aquatic ecological environment will be investigated using eco-flow metrics and indicators of hydrologic alteration (IHA), which comprise 33 individual streamflow statistics closely related to the aquatic ecosystem. The results of this study are expected to provide a reference for reservoir operation that considers both the ecological and economic benefits of such operations in a catchment with high-density dams.
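A generic sketch (not the study's actual method) of the change-point idea mentioned above: scan every split of the series and pick the one that maximizes the between-segment mean difference, a crude stand-in for formal trend and change-point tests. The streamflow values below are invented.

```python
# Simple change-point scan: the best split separates a "natural" period from
# a "disturbed" period with a markedly different mean.

def change_point(series):
    """Index at which splitting the series best separates the two segment means."""
    def score(t):
        a, b = series[:t], series[t:]
        return abs(sum(a) / len(a) - sum(b) / len(b))
    return max(range(2, len(series) - 1), key=score)

# hypothetical annual streamflow: a stable period, then lower flows after
# dam construction
flow = [100, 103, 98, 101, 99, 102, 100, 70, 72, 69, 71, 68, 73, 70]
print(change_point(flow))   # 7: the disturbed period starts at index 7
```

Production change-point methods (Pettitt, CUSUM) add a significance test on top of a scan like this one.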

  3. FAA statistical handbook of aviation

    Science.gov (United States)

    1994-01-01

    This report presents statistical information pertaining to the Federal Aviation Administration, the National Airspace System, Airports, Airport Activity, U.S. Civil Air Carrier Fleet, U.S. Civil Air Carrier Operating Data, Airmen, General Aviation Ai...

  4. Study of nuclear level density parameter and its temperature dependence

    International Nuclear Information System (INIS)

    Nasrabadi, M. N.; Behkami, A. N.

    2000-01-01

    The nuclear level density ρ is the basic ingredient required for theoretical studies of nuclear reactions and structure. It describes the statistical nuclear properties and is expressed as a function of various constants of motion such as the number of particles, excitation energy and angular momentum. In this work the energy and spin dependence of the nuclear level density will be presented and discussed. In addition, the level density parameter α will be extracted from this level density information, and its temperature and mass dependence will be obtained

  5. Football fever: goal distributions and non-Gaussian statistics

    Science.gov (United States)

    Bittner, E.; Nußbaumer, A.; Janke, W.; Weigel, M.

    2009-02-01

    Analyzing football score data with statistical techniques, we investigate how the not purely random, but highly co-operative nature of the game is reflected in averaged properties such as the probability distributions of scored goals for the home and away teams. As it turns out, especially the tails of the distributions are not well described by the Poissonian or binomial model resulting from the assumption of uncorrelated random events. Instead, a good effective description of the data is provided by less basic distributions such as the negative binomial one or the probability densities of extreme value statistics. To understand this behavior from a microscopic point of view, however, no waiting time problem or extremal process need be invoked. Instead, modifying the Bernoulli random process underlying the Poissonian model to include a simple component of self-affirmation seems to describe the data surprisingly well and allows one to understand the observed deviation from Gaussian statistics. The phenomenological distributions used before can be understood as special cases within this framework. We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the “FIFA World Cup” series, and found the proposed models to be applicable rather universally. In particular, here we analyze the results of the German women’s premier football league and consider the two separate German men’s premier leagues in the East and West during the cold war times as well as the unified league after 1990 to see how scoring in football and the component of self-affirmation depend on cultural and political circumstances.
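A toy sketch of the self-affirmation mechanism described above (parameters invented, not fitted to the paper's data): each goal raises the scoring probability for the rest of the match, which over-disperses the goal distribution relative to a plain binomial model.

```python
import random

# Bernoulli process with self-affirmation: a goal raises the per-opportunity
# scoring probability by kappa, fattening the tail of the goal distribution.

def goals_with_self_affirmation(T=90, p0=0.02, kappa=0.01, rng=random):
    p, goals = p0, 0
    for _ in range(T):
        if rng.random() < p:
            goals += 1
            p += kappa           # success reinforces future success
    return goals

random.seed(1)
sample = [goals_with_self_affirmation() for _ in range(10000)]
mean = sum(sample) / len(sample)
var = sum((g - mean) ** 2 for g in sample) / len(sample)
print(var > mean)                # True: over-dispersed, unlike a plain binomial
```

A plain binomial model has variance slightly below its mean; the positive correlation introduced by reinforcement pushes the variance above the mean, the qualitative signature the paper attributes to self-affirmation.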

  6. Relationships between brightness of nighttime lights and population density

    Science.gov (United States)

    Naizhuo, Z.

    2012-12-01

    Brightness of nighttime lights has been proven to be a good proxy for socioeconomic and demographic statistics. Moreover, satellite nighttime lights data have been used to spatially disaggregate amounts of gross domestic product (GDP), fossil fuel carbon dioxide emissions, and electric power consumption (Ghosh et al., 2010; Oda and Maksyutov, 2011; Zhao et al., 2012). Spatial disaggregations were performed in these previous studies based on assumed linear relationships between the digital number (DN) value of pixels in the nighttime light images and socioeconomic data. However, the reliability of the linear relationships was never tested due to a lack of relatively high-spatial-resolution (equal to or finer than 1 km × 1 km) statistical data. With the similar assumption that brightness linearly correlates with population, Bharti et al. (2011) used nighttime light data as a proxy for population density and then developed a model of seasonal fluctuations of measles in West Africa. The Oak Ridge National Laboratory used sub-national census population data and high-spatial-resolution remotely sensed images to produce the LandScan population raster datasets. The LandScan population datasets have a 1 km × 1 km spatial resolution, which is consistent with the spatial resolution of the nighttime light images. Therefore, in this study I selected the 2008 LandScan population data as baseline reference data and the contiguous United States as the study area. Relationships between the DN value of pixels in the 2008 Defense Meteorological Satellite Program's Operational Linescan System (DMSP-OLS) stable light image and population density were established. Results showed that an exponential function can more accurately reflect the relationship between luminosity and population density than a linear function. Additionally, a certain number of saturated pixels with a DN value of 63 exist in urban core areas. If directly using the exponential function to estimate the population density for the whole brightly

  7. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...
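A sketch of the normalized spacings mentioned above (my construction, not the paper's exact test statistic): D_i = (n - i + 1)(X_(i) - X_(i-1)) for the ordered sample. For an exponential sample (constant hazard) the D_i are i.i.d. exponential with the same mean; an increasing hazard rate makes later normalized spacings stochastically smaller.

```python
import random

# Normalized spacings of an ordered sample; for Exp(1) data each spacing has
# mean 1, so the early and late halves should have similar averages.

def normalized_spacings(sample):
    xs = sorted(sample)
    n = len(xs)
    prev, out = 0.0, []
    for i, x in enumerate(xs, start=1):
        out.append((n - i + 1) * (x - prev))
        prev = x
    return out

random.seed(0)
expo = [random.expovariate(1.0) for _ in range(5000)]
d = normalized_spacings(expo)
first, second = d[:2500], d[2500:]
print(sum(first) / 2500, sum(second) / 2500)   # both near 1 for constant hazard
```

A hazard-rate test can compare these two half-sample averages (or a rank statistic on the D_i): a systematic downward trend in the spacings is evidence of an increasing hazard rate.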

  8. Testing the statistical isotropy of large scale structure with multipole vectors

    International Nuclear Information System (INIS)

    Zunckel, Caroline; Huterer, Dragan; Starkman, Glenn D.

    2011-01-01

    A fundamental assumption in cosmology is that of statistical isotropy - that the Universe, on average, looks the same in every direction in the sky. Statistical isotropy has recently been tested stringently using cosmic microwave background data, leading to intriguing results on large angular scales. Here we apply some of the same techniques used in the cosmic microwave background to the distribution of galaxies on the sky. Using the multipole vector approach, where each multipole in the harmonic decomposition of galaxy density field is described by unit vectors and an amplitude, we lay out the basic formalism of how to reconstruct the multipole vectors and their statistics out of galaxy survey catalogs. We apply the algorithm to synthetic galaxy maps, and study the sensitivity of the multipole vector reconstruction accuracy to the density, depth, sky coverage, and pixelization of galaxy catalog maps.

  9. Impact of neutral density fluctuations on gas puff imaging diagnostics

    Science.gov (United States)

    Wersal, C.; Ricci, P.

    2017-11-01

    A three-dimensional turbulence simulation of the SOL and edge regions of a toroidally limited tokamak is carried out. The simulation couples self-consistently the drift-reduced two-fluid Braginskii equations to a kinetic equation for neutral atoms. A diagnostic neutral gas puff on the low-field side midplane is included and the impact of neutral density fluctuations on D_α light emission investigated. We find that neutral density fluctuations affect the D_α emission. In particular, at a radial distance from the gas puff smaller than the neutral mean free path, neutral density fluctuations are anti-correlated with plasma density, electron temperature, and D_α fluctuations. It follows that the neutral fluctuations reduce the D_α emission in most of the observed region and, therefore, have to be taken into account when interpreting the amplitude of the D_α emission. On the other hand, higher order statistical moments (skewness, kurtosis) and turbulence characteristics (such as correlation length, or the autocorrelation time) are not significantly affected by the neutral fluctuations. At distances from the gas puff larger than the neutral mean free path, a non-local shadowing effect influences the neutral density fluctuations. There, the D_α fluctuations are correlated with the neutral density fluctuations, and the high-order statistical moments and measurements of other turbulence properties are strongly affected by the neutral density fluctuations.

  10. Time-dependent density-functional tight-binding method with the third-order expansion of electron density.

    Science.gov (United States)

    Nishimoto, Yoshio

    2015-09-07

    We develop a formalism for the calculation of excitation energies and excited state gradients for the self-consistent-charge density-functional tight-binding method with the third-order contributions of a Taylor series of the density functional theory energy with respect to the fluctuation of electron density (time-dependent density-functional tight-binding (TD-DFTB3)). The formulation of the excitation energy is based on the existing time-dependent density functional theory and the older TD-DFTB2 formulae. The analytical gradient is computed by solving Z-vector equations, and it requires one to calculate the third-order derivative of the total energy with respect to density matrix elements due to the inclusion of the third-order contributions. The comparison of adiabatic excitation energies for selected small and medium-size molecules using the TD-DFTB2 and TD-DFTB3 methods shows that the inclusion of the third-order contributions does not affect excitation energies significantly. A different set of parameters, which are optimized for DFTB3, slightly improves the prediction of adiabatic excitation energies statistically. The application of TD-DFTB for the prediction of absorption and fluorescence energies of cresyl violet demonstrates that TD-DFTB3 reproduced the experimental fluorescence energy quite well.

  11. Three-dimensional electromagnetic strong turbulence. I. Scalings, spectra, and field statistics

    International Nuclear Information System (INIS)

    Graham, D. B.; Robinson, P. A.; Cairns, Iver H.; Skjaeraasen, O.

    2011-01-01

    The first fully three-dimensional (3D) simulations of large-scale electromagnetic strong turbulence (EMST) are performed by numerically solving the electromagnetic Zakharov equations for electron thermal speeds ν_e with ν_e/c ≥ 0.025. The results of these simulations are presented, focusing on scaling behavior, energy density spectra, and field statistics of the Langmuir (longitudinal) and transverse components of the electric fields during steady-state strong turbulence, where multiple wave packets collapse simultaneously and the system is approximately statistically steady in time. It is shown that for ν_e/c ≳ 0.17 strong turbulence is approximately electrostatic and can be explained using the electrostatic two-component model. For ν_e/c ≲ 0.17 the power-law behaviors of the scalings, spectra, and field statistics differ from the electrostatic predictions and results because ν_e/c is sufficiently low to allow transverse modes to become trapped in density wells. The results are compared with those of past 3D electrostatic strong turbulence (ESST) simulations and 2D EMST simulations. For number density perturbations, the scaling behavior, spectra, and field statistics are shown to be only weakly dependent on ν_e/c, whereas the Langmuir and transverse scalings, spectra, and field statistics are shown to be strongly dependent on ν_e/c. Three-dimensional EMST is shown to have features in common with 2D EMST, such as a two-component structure and trapping of transverse modes, which depend on ν_e/c.

  12. Bayesian error estimation in density-functional theory

    DEFF Research Database (Denmark)

    Mortensen, Jens Jørgen; Kaasbjerg, Kristen; Frederiksen, Søren Lund

    2005-01-01

    We present a practical scheme for performing error estimates for density-functional theory calculations. The approach, which is based on ideas from Bayesian statistics, involves creating an ensemble of exchange-correlation functionals by comparing with an experimental database of binding energies...

  13. Quantum entanglement and teleportation using statistical correlations

    Indian Academy of Sciences (India)

    Administrator

    Abstract. A study of quantum teleportation using two and three-particle correlated density matrix is presented. A criterion based on standard quantum statistical correlations employed in the many-body virial expansion is used to determine the extent of entanglement for a 2N-particle system. A relation between the probability ...

  14. Statistical Analysis of Radio Propagation Channel in Ruins Environment

    Directory of Open Access Journals (Sweden)

    Jiao He

    2015-01-01

    The cellphone-based localization system for search and rescue in complex high-density ruins has attracted great interest in recent years, where the radio channel characteristics are critical for the design and development of such a system. This paper presents a spatial smoothing estimation via rotational invariance technique (SS-ESPRIT) for radio channel characterization of high-density ruins. The radio propagations at three typical mobile communication bands (0.9, 1.8, and 2 GHz) are investigated in two different scenarios. Channel parameters, such as arrival time, delays, and complex amplitudes, are statistically analyzed. Furthermore, a channel simulator is built based on these statistics. By comparison analysis of average excess delay and delay spread, the validation results show a good agreement between the measurements and channel modeling results.

  15. Development of statistical analysis code for meteorological data (W-View)

    International Nuclear Information System (INIS)

    Tachibana, Haruo; Sekita, Tsutomu; Yamaguchi, Takenori

    2003-03-01

    A computer code (W-View: Weather View) was developed to analyze meteorological data statistically based on 'the guideline of meteorological statistics for the safety analysis of nuclear power reactor' (Nuclear Safety Commission on January 28, 1982; revised on March 29, 2001). The code provides statistical meteorological data to assess the public dose in cases of normal operation and severe accident, as required for a nuclear reactor operating license. It was revised from the original version, which ran on a large office computer, to enable personal computer users to analyze meteorological data simply and conveniently and to produce statistical tables and figures of meteorology. (author)

  16. Obtaining sub-daily new snow density from automated measurements in high mountain regions

    Science.gov (United States)

    Helfricht, Kay; Hartl, Lea; Koch, Roland; Marty, Christoph; Olefs, Marc

    2018-05-01

    The density of new snow is operationally monitored by meteorological or hydrological services at daily time intervals, or occasionally measured in local field studies. However, meteorological conditions and thus settling of the freshly deposited snow rapidly alter the new snow density until measurement. Physically based snow models and nowcasting applications make use of hourly weather data to determine the water equivalent of the snowfall and snow depth. In previous studies, a number of empirical parameterizations were developed to approximate the new snow density by meteorological parameters. These parameterizations are largely based on new snow measurements derived from local in situ measurements. In this study a data set of automated snow measurements at four stations located in the European Alps is analysed for several winter seasons. Hourly new snow densities are calculated from the height of new snow and the water equivalent of snowfall. Considering the settling of the new snow and the old snowpack, the average hourly new snow density is 68 kg m-3, with a standard deviation of 9 kg m-3. Seven existing parameterizations for estimating new snow densities were tested against these data, and most calculations overestimate the hourly automated measurements. Two of the tested parameterizations were capable of simulating low new snow densities observed at sheltered inner-alpine stations. The observed variability in new snow density from the automated measurements could not be described with satisfactory statistical significance by any of the investigated parameterizations. Applying simple linear regressions between new snow density and wet bulb temperature based on the measurements' data resulted in significant relationships (r2 > 0.5 and p ≤ 0.05) for single periods at individual stations only. Higher new snow density was calculated for the highest elevated and most wind-exposed station location. Whereas snow measurements using ultrasonic devices and snow
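An illustrative least-squares fit of new snow density against wet-bulb temperature, the kind of simple linear regression mentioned above; the data points below are invented, whereas the study fitted real station data.

```python
# Ordinary least-squares fit of y (new snow density, kg m^-3) on
# x (wet-bulb temperature, deg C).

def linfit(xs, ys):
    """Return (slope, intercept) of the least-squares line y = slope*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# hypothetical (wet-bulb temperature, new snow density) pairs
data = [(-10, 55), (-7, 60), (-5, 64), (-3, 70), (-1, 78), (0, 82)]
slope, intercept = linfit([t for t, _ in data], [r for _, r in data])
print(round(slope, 2), round(intercept, 1))   # 2.72 80.0
```

A positive slope here mirrors the common empirical finding that warmer (wetter) snowfalls deposit denser new snow; the study reports that such fits were statistically significant only for single periods at individual stations.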

  17. CRISS power spectral density

    International Nuclear Information System (INIS)

    Vaeth, W.

    1979-04-01

    The correlation of signal components at different frequencies, such as higher harmonics, cannot be detected by a normal power spectral density measurement, since this technique correlates only components at the same frequency. This paper describes a special method for measuring the correlation of two signal components at different frequencies: the CRISS power spectral density. From this new function in frequency analysis, the correlation of two components can be determined quantitatively, whether they stem from one signal or from two different signals. The principle of the method, suitable for the higher harmonics of a signal as well as for any other frequency combination, is shown for the digital frequency analysis technique. Two examples of CRISS power spectral densities demonstrate the operation of the new method.
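A sketch of the underlying idea, my own illustration rather than the CRISS implementation: split a signal into segments, take the DFT coefficient at a frequency and at its second harmonic in each segment, and correlate the two coefficient sequences across segments. A shared amplitude at the two frequencies then shows up as a cross-frequency correlation that an ordinary PSD cannot see.

```python
import cmath, math, random

# Correlate the fundamental and second-harmonic DFT amplitudes across
# segments of a signal whose two components share a common random amplitude.

def dft_coeff(x, k):
    """DFT coefficient of sequence x at bin k."""
    n = len(x)
    return sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))

random.seed(0)
seg_len, n_seg, k = 64, 50, 4
segs = []
for _ in range(n_seg):
    amp = 1 + random.random()           # shared amplitude -> cross correlation
    segs.append([amp * math.sin(2 * math.pi * k * t / seg_len)
                 + amp * math.sin(2 * math.pi * 2 * k * t / seg_len)
                 for t in range(seg_len)])

a = [abs(dft_coeff(s, k)) for s in segs]       # fundamental amplitudes
b = [abs(dft_coeff(s, 2 * k)) for s in segs]   # second-harmonic amplitudes
ma, mb = sum(a) / n_seg, sum(b) / n_seg
num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
print(round(num / den, 2))    # 1.0: the two harmonics share a common amplitude
```

Two independent signals at the same two frequencies would instead give a correlation near zero, which is how a cross-frequency statistic distinguishes one source from two.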

  18. In-Depth Investigation of Statistical and Physicochemical Properties on the Field Study of the Intermittent Filling of Large Water Tanks

    Directory of Open Access Journals (Sweden)

    Do-Hwan Kim

    2017-01-01

    Large-demand customers, generally high-density dwellings and buildings, have dedicated ground or elevated water tanks to consistently supply drinking water to residents. Online field measurements for the Nonsan-2 district meter area demonstrated that intermittent replenishment by large-demand customers can disrupt the normal operation of a water distribution system by drawing large quantities of water in short times when filling the tanks from distribution mains. Based on the previous results of field measurements of hydraulic and water quality parameters, statistical analysis is performed on the measured data in terms of autocorrelation, power spectral density, and cross-correlation. The statistical results show that an intermittent filling interval of 6.7 h and a diurnal demand pattern of 23.3 h are detected through autocorrelation analyses, the similarities of the flow-pressure and the turbidity-particle count data are confirmed as a function of frequency through power spectral density analyses, and a strong cross-correlation is observed in the flow-pressure and turbidity-particle count analyses. In addition, physicochemical results show that the intermittent refill of storage tanks by large-demand customers induces abnormal flow and pressure fluctuations and results in transient-induced turbid flow mainly composed of fine particles ranging from 2 to 4 μm and containing Fe, Si, and Al.
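A sketch of the autocorrelation analysis described above, applied to a synthetic flow series (invented signal with 10-minute sampling) carrying a 6.7 h filling cycle: the cycle appears as the dominant autocorrelation peak away from lag zero.

```python
import math

# Recover a periodic filling interval from the sample autocorrelation function.

def autocorr(x, lag):
    """Sample autocorrelation of series x at the given lag."""
    n = len(x)
    m = sum(x) / n
    var = sum((v - m) ** 2 for v in x)
    cov = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag))
    return cov / var

dt_h = 1 / 6                                # 10-minute samples, in hours
period_h = 6.7                              # filling interval to recover
series = [math.sin(2 * math.pi * i * dt_h / period_h) for i in range(2000)]

# search past the short-lag decay for the first periodic peak
best = max(range(20, 100), key=lambda lag: autocorr(series, lag))
print(round(best * dt_h, 1))                # 6.7
```

On the real tank data the same peak-picking idea yields the 6.7 h filling interval, with a second peak near 23.3 h reflecting the diurnal demand pattern.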

  19. Comparison of low density and high density pedicle screw instrumentation in Lenke 1 adolescent idiopathic scoliosis.

    Science.gov (United States)

    Shen, Mingkui; Jiang, Honghui; Luo, Ming; Wang, Wengang; Li, Ning; Wang, Lulu; Xia, Lei

    2017-08-02

    The correlation between implant density and deformity correction has not yet led to a precise conclusion in adolescent idiopathic scoliosis (AIS). The aim of this study was to evaluate the effects of low density (LD) and high density (HD) pedicle screw instrumentation in terms of the clinical, radiological and Scoliosis Research Society (SRS)-22 outcomes in Lenke 1 AIS. We retrospectively reviewed 62 consecutive Lenke 1 AIS patients who underwent posterior spinal arthrodesis using all-pedicle screw instrumentation with a minimum follow-up of 24 months. The implant density was defined as the number of screws per spinal level fused. Patients were then divided into two groups according to the average implant density for the entire study. The LD group (n = 28) had fewer than 1.61 screws per level, while the HD group (n = 34) had more than 1.61 screws per level. The radiographs were analysed preoperatively, postoperatively and at final follow-up. The perioperative and SRS-22 outcomes were also assessed. Independent sample t tests were used for comparisons between the two groups. Comparisons between the two groups showed no significant differences in the correction of the main thoracic curve and thoracic kyphosis, blood transfusion, hospital stay, and SRS-22 scores. Compared with the HD group, the LD group had a shorter operating time (278.4 vs. 331.0 min, p = 0.004), less blood loss (823.6 vs. 1010.9 ml, p = 0.048), and fewer pedicle screws needed (15.1 vs. 19.6). Both low density and high density pedicle screw instrumentation achieved satisfactory deformity correction in Lenke 1 AIS patients. However, the operating time and blood loss were reduced, and the implant costs were decreased, with the use of low screw density constructs.

  20. NOx, Soot, and Fuel Consumption Predictions under Transient Operating Cycle for Common Rail High Power Density Diesel Engines

    Directory of Open Access Journals (Sweden)

    N. H. Walke

    2016-01-01

    Full Text Available Diesel engines presently face the challenge of controlling NOx and soot emissions on transient cycles, both to meet stricter emission norms and to control emissions during field operations. Development of a simulation tool for NOx and soot emissions prediction on transient operating cycles has become the most important objective, since it can significantly reduce the experimentation time and cost required for tuning these emissions. Hence, in this work, a 0D comprehensive predictive model has been formulated by selecting and coupling appropriate combustion and emissions models with engine cycle models. The selected combustion and emissions models are further modified to improve their prediction accuracy over the full operating zone. Responses of the combustion and emissions models have been validated for load and “start of injection” changes. Model-predicted transient fuel consumption, air handling system parameters, and NOx and soot emissions are in good agreement with measured data on a turbocharged high power density common rail engine for the “nonroad transient cycle” (NRTC). It can be concluded that 0D models can be used for prediction of transient emissions on modern engines. Extension of the formulated approach to transient emissions prediction for other applications and fuels is also discussed.

  1. On exact and approximate exchange-energy densities

    DEFF Research Database (Denmark)

    Springborg, Michael; Dahl, Jens Peder

    1999-01-01

    Based on correspondence rules between quantum-mechanical operators and classical functions in phase space we construct exchange-energy densities in position space. Whereas these are not unique but depend on the chosen correspondence rule, the exchange potential is unique. We calculate this exchange-energy density for 15 closed-shell atoms, and compare it with kinetic- and Coulomb-energy densities. It is found that it has a dominating local-density character, but electron-shell effects are recognizable. The approximate exchange-energy functionals that have been proposed so far are found to account only...

  2. Probing NWP model deficiencies by statistical postprocessing

    DEFF Research Database (Denmark)

    Rosgaard, Martin Haubjerg; Nielsen, Henrik Aalborg; Nielsen, Torben S.

    2016-01-01

    The objective in this article is twofold. On one hand, a Model Output Statistics (MOS) framework for improved wind speed forecast accuracy is described and evaluated. On the other hand, the approach explored identifies unintuitive explanatory value from a diagnostic variable in an operational....... Based on the statistical model candidates inferred from the data, the lifted index NWP model diagnostic is consistently found among the NWP model predictors of the best performing statistical models across sites....

  3. Statistical learning across development: Flexible yet constrained

    Directory of Open Access Journals (Sweden)

    Lauren eKrogh

    2013-01-01

    Full Text Available Much research in the past two decades has documented infants’ and adults' ability to extract statistical regularities from auditory input. Importantly, recent research has extended these findings to the visual domain, demonstrating learners' sensitivity to statistical patterns within visual arrays and sequences of shapes. In this review we discuss both auditory and visual statistical learning to elucidate both the generality of and constraints on statistical learning. The review first outlines the major findings of the statistical learning literature with infants, followed by discussion of statistical learning across domains, modalities, and development. The second part of this review considers constraints on statistical learning. The discussion focuses on two categories of constraint: constraints on the types of input over which statistical learning operates and constraints based on the state of the learner. The review concludes with a discussion of possible mechanisms underlying statistical learning.

  4. Statistical convergence of a non-positive approximation process

    International Nuclear Information System (INIS)

    Agratini, Octavian

    2011-01-01

    Highlights: → A general class of approximation processes is introduced. → The A-statistical convergence is studied. → Applications in quantum calculus are delivered. - Abstract: Starting from a general sequence of linear and positive operators of discrete type, we associate its r-th order generalization. This construction involves high order derivatives of a signal, and it loses the positivity property. Considering that the initial approximation process is A-statistically uniformly convergent, we prove that the property is inherited by the new sequence. Also, our result includes information about the uniform convergence. Two applications in q-Calculus are presented. We study q-analogues of both the Meyer-Koenig and Zeller operators and the Stancu operators.

  5. Scattering of Neutrons on Fluctuations of the Density of the Thin Films

    Directory of Open Access Journals (Sweden)

    S. G. ABDULVAHABOVA

    2016-11-01

    Full Text Available The cross section for neutron scattering on density fluctuations of thin films is obtained in the framework of the quantum theory of multiple scattering in the quasielastic approximation. Inhomogeneity can be caused by dynamic density fluctuations and can be statistical in nature. Fluctuations in the density of the scattering material scatter the neutron wave. The probability of a collision between a neutron and an atomic nucleus depends on the number of neutrons and on their velocity. The formulas have been obtained under the assumption that the imaginary part of the optical potential is a local operator. It was determined that scattering on density fluctuations does not contribute to the attenuation of the coherent neutron wave. In the approximation of a thin target, the solution of the equation for the total scattering amplitude is identical to the expression obtained in the usual eikonal approximation and differs significantly, at least functionally, from the solution for the case of a thick target. The reflection and refraction of neutron waves in matter have been investigated in detail, and the details of their dispersion law have been studied. The results also show that the total cross section for scattering by the complete target becomes universal and does not depend on the cross section for scattering by one nucleus. Keywords: 25.40-Ep

  6. Statistical effect of interactions on particle creation in expanding universe

    International Nuclear Information System (INIS)

    Kodama, Hideo

    1982-01-01

    The statistical effect of interactions, which drives many-particle systems toward equilibrium, is expected to change the qualitative and quantitative features of particle creation in an expanding universe. To investigate this problem a simplified model called the finite-time reduction model is formulated and applied to scalar particle creation in the radiation-dominated Friedmann universe. The number density of created particles and the entropy production due to particle creation are estimated. The result for the number density is compared with that in the conventional free field theory. It is shown that the statistical effect increases the particle creation and lengthens the active creation period. As for the entropy production, it is shown to be negligible for scalar particles in the Friedmann universe. (author)

  7. Statistical inference for noisy nonlinear ecological dynamic systems.

    Science.gov (United States)

    Wood, Simon N

    2010-08-26

    Chaotic ecological dynamic systems defy conventional statistical analysis. Systems with near-chaotic dynamics are little better. Such systems are almost invariably driven by endogenous dynamic processes plus demographic and environmental process noise, and are only observable with error. Their sensitivity to history means that minute changes in the driving noise realization, or the system parameters, will cause drastic changes in the system trajectory. This sensitivity is inherited and amplified by the joint probability density of the observable data and the process noise, rendering it useless as the basis for obtaining measures of statistical fit. Because the joint density is the basis for the fit measures used by all conventional statistical methods, this is a major theoretical shortcoming. The inability to make well-founded statistical inferences about biological dynamic models in the chaotic and near-chaotic regimes, other than on an ad hoc basis, leaves dynamic theory without the methods of quantitative validation that are essential tools in the rest of biological science. Here I show that this impasse can be resolved in a simple and general manner, using a method that requires only the ability to simulate the observed data on a system from the dynamic model about which inferences are required. The raw data series are reduced to phase-insensitive summary statistics, quantifying local dynamic structure and the distribution of observations. Simulation is used to obtain the mean and the covariance matrix of the statistics, given model parameters, allowing the construction of a 'synthetic likelihood' that assesses model fit. This likelihood can be explored using a straightforward Markov chain Monte Carlo sampler, but one further post-processing step returns pure likelihood-based inference. I apply the method to establish the dynamic nature of the fluctuations in Nicholson's classic blowfly experiments.
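
    The synthetic-likelihood recipe described here can be sketched on a toy model (an AR(1) process with an unknown coefficient, standing in for the blowfly dynamics; all settings below are illustrative): reduce each series to summary statistics, estimate their mean and covariance by simulation for each candidate parameter, and score the observed summaries under a Gaussian log likelihood.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(a, n=300):
    """Toy stochastic model: AR(1) with coefficient a and unit noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = a * x[t - 1] + rng.normal()
    return x

def summaries(x):
    """Phase-insensitive summary statistics of a series."""
    xc = x - x.mean()
    lag1 = np.dot(xc[1:], xc[:-1]) / np.dot(xc, xc)
    return np.array([x.std(), lag1])

def synthetic_loglik(theta, s_obs, m=200):
    """Gaussian log likelihood of observed summaries under m simulations."""
    S = np.array([summaries(simulate(theta)) for _ in range(m)])
    mu, cov = S.mean(axis=0), np.cov(S.T)
    d = s_obs - mu
    sign, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + d @ np.linalg.solve(cov, d))

s_obs = summaries(simulate(0.6))              # "observed" data, true a = 0.6
grid = np.arange(0.3, 0.91, 0.05)
ll = [synthetic_loglik(a, s_obs) for a in grid]
best = grid[int(np.argmax(ll))]
print(best)                                   # near the true value 0.6
```

    Wood's method replaces this grid search with a Markov chain Monte Carlo sampler over the synthetic likelihood, but the simulate/summarize/score structure is the same.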

  8. Universality of correlations of levels with discrete statistics

    OpenAIRE

    Brezin, Edouard; Kazakov, Vladimir

    1999-01-01

    We study the statistics of a system of N random levels with integer values, in the presence of a logarithmic repulsive potential of Dyson type. This problem arises in sums over representations (Young tableaux) of GL(N) in various matrix problems and in the study of statistics of partitions for the permutation group. The model is generalized to include an external source and its correlators are found in closed form for any N. We reproduce the density of levels in the large N and double scalin...

  9. Comparison of Danish dichotomous and BI-RADS classifications of mammographic density.

    Science.gov (United States)

    Hodge, Rebecca; Hellmann, Sophie Sell; von Euler-Chelpin, My; Vejborg, Ilse; Andersen, Zorana Jovanovic

    2014-06-01

    In the Copenhagen mammography screening program from 1991 to 2001, mammographic density was classified either as fatty or mixed/dense. This dichotomous mammographic density classification system is unique internationally, and has not been validated before. To compare the Danish dichotomous mammographic density classification system from 1991 to 2001 with the density BI-RADS classifications, in an attempt to validate the Danish classification system. The study sample consisted of 120 mammograms taken in Copenhagen in 1991-2001, which tested false positive, and which were in 2012 re-assessed and classified according to the BI-RADS classification system. We calculated inter-rater agreement between the Danish dichotomous mammographic classification as fatty or mixed/dense and the four-level BI-RADS classification by the linear weighted Kappa statistic. Of the 120 women, 32 (26.7%) were classified as having fatty and 88 (73.3%) as mixed/dense mammographic density, according to Danish dichotomous classification. According to BI-RADS density classification, 12 (10.0%) women were classified as having predominantly fatty (BI-RADS code 1), 46 (38.3%) as having scattered fibroglandular (BI-RADS code 2), 57 (47.5%) as having heterogeneously dense (BI-RADS 3), and five (4.2%) as having extremely dense (BI-RADS code 4) mammographic density. The inter-rater variability assessed by weighted kappa statistic showed a substantial agreement (0.75). The dichotomous mammographic density classification system utilized in early years of Copenhagen's mammographic screening program (1991-2001) agreed well with the BI-RADS density classification system.
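
    The linear weighted kappa used above can be computed as follows (a minimal sketch with illustrative confusion matrices, not the study's 120 mammograms):

```python
import numpy as np

def linear_weighted_kappa(confusion):
    """Linear weighted kappa from a raters' confusion matrix (k x k)."""
    M = np.asarray(confusion, dtype=float)
    k = M.shape[0]
    p = M / M.sum()                          # joint proportions
    rows, cols = p.sum(axis=1), p.sum(axis=0)
    i, j = np.indices((k, k))
    w = np.abs(i - j) / (k - 1)              # linear disagreement weights
    observed = (w * p).sum()                 # observed weighted disagreement
    expected = (w * np.outer(rows, cols)).sum()  # chance-expected disagreement
    return 1.0 - observed / expected

kappa_perfect = linear_weighted_kappa(np.eye(3) * 10)   # perfect agreement
kappa_partial = linear_weighted_kappa([[2, 1], [1, 2]])  # mostly diagonal
print(kappa_perfect)   # 1.0
print(kappa_partial)   # 1/3
```

    Values near 0.75, as reported above, fall in the conventional "substantial agreement" band.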

  10. Few-particle quantum dynamics–comparing nonequilibrium Green functions with the generalized Kadanoff–Baym ansatz to density operator theory

    International Nuclear Information System (INIS)

    Hermanns, S; Bonitz, M; Balzer, K

    2013-01-01

    The nonequilibrium description of quantum systems requires, for more than two or three particles, the use of a reduced description to be numerically tractable. Two possible approaches are based on either reduced density matrices or nonequilibrium Green functions (NEGF). Both concepts are formulated in terms of hierarchies of coupled equations—the Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY) hierarchy for the reduced density operators and the Martin-Schwinger hierarchy (MS) for the Green functions, respectively. In both cases, similar approximations are introduced to decouple the hierarchy, yet many questions regarding the correspondence of the two approaches remain open. Here we analyze this correspondence by studying the generalized Kadanoff–Baym ansatz (GKBA) that reduces the NEGF to a single-time theory. Starting from the BBGKY hierarchy we present the approximations that are necessary to recover the GKBA result, both with Hartree-Fock propagators (HF-GKBA) and with propagators in the second Born approximation. To test the quality of the HF-GKBA, we study the dynamics of a 4-electron Hubbard nanocluster starting from a strong nonequilibrium initial state and compare to exact results and to the Wang-Cassing approximation to the BBGKY hierarchy presented recently by Akbari et al. [1].

  11. Statistical analysis of JET disruptions

    International Nuclear Information System (INIS)

    Tanga, A.; Johnson, M.F.

    1991-07-01

    In the operation of JET, as of any tokamak, many discharges are terminated by a major disruption. The disruptive termination of a discharge is usually an unwanted event which may cause damage to the structure of the vessel. In a reactor, disruptions are potentially a very serious problem, hence the importance of studying them and devising methods to avoid them. Statistical information has been collected about the disruptions which have occurred at JET over a long span of operations. The analysis is focused on the operational aspects of the disruptions rather than on the underlying physics. (Author)

  12. Fundamental link between system theory and statistical mechanics

    International Nuclear Information System (INIS)

    Atmanspacher, H.; Scheingraber, H.

    1987-01-01

    A fundamental link between system theory and statistical mechanics has been found to be established by the Kolmogorov entropy. By this quantity the temporal evolution of dynamical systems can be classified into regular, chaotic, and stochastic processes. Since K represents a measure for the internal information creation rate of dynamical systems, it provides an approach to irreversibility. The formal relationship to statistical mechanics is derived by means of an operator formalism originally introduced by Prigogine. For a Liouville operator L and an information operator M tilde acting on a distribution in phase space, it is shown that i[L, M tilde] = KI (I = identity operator). As a first consequence of this equivalence, a relation is obtained between the chaotic correlation time of a system and Prigogine's concept of a finite duration of presence. Finally, the existence of chaos in quantum systems is discussed with respect to the existence of a quantum mechanical time operator

  13. Development of statistical analysis code for meteorological data (W-View)

    Energy Technology Data Exchange (ETDEWEB)

    Tachibana, Haruo; Sekita, Tsutomu; Yamaguchi, Takenori [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    A computer code (W-View: Weather View) was developed to analyze meteorological data statistically based on 'the guideline of meteorological statistics for the safety analysis of nuclear power reactor' (Nuclear Safety Commission on January 28, 1982; revised on March 29, 2001). The code provides the statistical meteorological data needed to assess the public dose in cases of normal operation and severe accident and to obtain the license for nuclear reactor operation. This code was revised from the original version, which ran on a large office computer, to enable a personal computer user to analyze the meteorological data simply and conveniently and to produce statistical tables and figures of meteorology. (author)

  14. Statistics and Biomedical Informatics in Forensic Sciences

    Czech Academy of Sciences Publication Activity Database

    Zvárová, Jana

    2009-01-01

    Roč. 20, č. 6 (2009), s. 743-750 ISSN 1180-4009. [TIES 2007. Annual Meeting of the International Environmental Society /18./. Mikulov, 16.08.2007-20.08.2007] Institutional research plan: CEZ:AV0Z10300504 Keywords : biomedical informatics * biomedical statistics * genetic information * forensic dentistry Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.000, year: 2009

  15. Statistical and physical evolution of QSO's

    International Nuclear Information System (INIS)

    Caditz, D.; Petrosian, V.

    1989-09-01

    The relationship between the physical evolution of discrete extragalactic sources, the statistical evolution of the observed population of sources, and the cosmological model is discussed. Three simple forms of statistical evolution: pure luminosity evolution (PLE), pure density evolution (PDE), and generalized luminosity evolution (GLE), are considered in detail together with what these forms imply about the physical evolution of individual sources. Two methods are used to analyze the statistical evolution of the observed distribution of QSO's (quasars) from combined flux-limited samples. It is shown that both PLE and PDE are inconsistent with the data over the redshift range 0 < z < 2.2, and that a more complicated form of evolution such as GLE is required, independent of the cosmological model. This result is important for physical models of AGN, and in particular for the accretion disk model, which recent results show may be inconsistent with PLE.

  16. IBM SPSS statistics 19 made simple

    CERN Document Server

    Gray, Colin D

    2012-01-01

    This new edition of one of the most widely read textbooks in its field introduces the reader to data analysis with the most powerful and versatile statistical package on the market: IBM SPSS Statistics 19. Each new release of SPSS Statistics features new options and other improvements. There remains a core of fundamental operating principles and techniques which have continued to apply to all releases issued in recent years and have been proved to be worth communicating in a small volume. This practical and informal book combines simplicity and clarity of presentation with a comprehensive trea

  17. Nonequilibrium thermodynamics of interacting tunneling transport: variational grand potential, density functional formulation and nature of steady-state forces

    International Nuclear Information System (INIS)

    Hyldgaard, P

    2012-01-01

    The standard formulation of tunneling transport rests on an open-boundary modeling. There, conserving approximations to nonequilibrium Green function or quantum statistical mechanics provide consistent but computationally costly approaches; alternatively, the use of density-dependent ballistic-transport calculations (e.g., Lang 1995 Phys. Rev. B 52 5335), here denoted ‘DBT’, provides computationally efficient (approximate) atomistic characterizations of the electron behavior but has until now lacked a formal justification. This paper presents an exact, variational nonequilibrium thermodynamic theory for fully interacting tunneling and provides a rigorous foundation for frozen-nuclei DBT calculations as a lowest-order approximation to an exact nonequilibrium thermodynamic density functional evaluation. The theory starts from the complete electron nonequilibrium quantum statistical mechanics and I identify the operator for the nonequilibrium Gibbs free energy which, generally, must be treated as an implicit solution of the fully interacting many-body dynamics. I demonstrate a minimal property of a functional for the nonequilibrium thermodynamic grand potential which thus uniquely identifies the solution as the exact nonequilibrium density matrix. I also show that the uniqueness-of-density proof from a closely related Lippmann-Schwinger collision density functional theory (Hyldgaard 2008 Phys. Rev. B 78 165109) makes it possible to express the variational nonequilibrium thermodynamic description as a single-particle formulation based on universal electron-density functionals; the full nonequilibrium single-particle formulation improves the DBT method, for example, by a more refined account of Gibbs free energy effects. I illustrate a formal evaluation of the zero-temperature thermodynamic grand potential value which I find is closely related to the variation in the scattering phase shifts and hence to Friedel density oscillations. This paper also discusses the

  18. Binomial vs poisson statistics in radiation studies

    International Nuclear Information System (INIS)

    Foster, J.; Kouris, K.; Spyrou, N.M.; Matthews, I.P.; Welsh National School of Medicine, Cardiff

    1983-01-01

    The processes of radioactive decay, decay and growth of radioactive species in a radioactive chain, prompt emission(s) from nuclear reactions, conventional activation and cyclic activation are discussed with respect to their underlying statistical density function. By considering the transformation(s) that each nucleus may undergo it is shown that all these processes are fundamentally binomial. Formally, when the number of experiments N is large and the probability of success p is close to zero, the binomial is closely approximated by the Poisson density function. In radiation and nuclear physics, N is always large: each experiment can be conceived of as the observation of the fate of each of the N nuclei initially present. Whether p, the probability that a given nucleus undergoes a prescribed transformation, is close to zero depends on the process and nuclide(s) concerned. Hence, although a binomial description is always valid, the Poisson approximation is not always adequate. Therefore further clarification is provided as to when the binomial distribution must be used in the statistical treatment of detected events. (orig.)
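
    The closing claim is easy to check numerically (a minimal sketch with illustrative N and p, comparing the binomial pmf with its Poisson approximation at λ = Np):

```python
import math

def binom_pmf(k, n, p):
    """Binomial probability of k successes in n trials."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    """Poisson probability of k events with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Many nuclei, tiny per-nucleus transformation probability: N large, p -> 0.
n, p = 100_000, 2e-5
lam = n * p                      # lambda = 2 expected transformations
max_diff = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam)) for k in range(20))
print(max_diff)                  # small; bounded above by about 2*N*p**2 = 8e-5
```

    Repeating this with p no longer close to zero (say p = 0.3) makes the discrepancy obvious, which is exactly when the binomial treatment of the passage becomes necessary.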

  19. Urinary density measurement and analysis methods in neonatal unit care

    Directory of Open Access Journals (Sweden)

    Maria Vera Lúcia Moreira Leitão Cardoso

    2013-09-01

    Full Text Available The objective was to assess urine collection methods, through cotton in contact with the genitalia and through a urinary collector, to measure urinary density in newborns. This is a quantitative intervention study carried out in a neonatal unit of Fortaleza-CE, Brazil, in 2010. The sample consisted of 61 newborns randomly chosen to compose the study group. Most neonates were full term (31; 50.8%) and male (33; 54%). Data on urinary density measurement through the cotton and collector methods presented statistically significant differences (p<0.05). The analysis of interquartile ranges between subgroups resulted in statistical differences between urinary collector/reagent strip (1005) and cotton/reagent strip (1010); however, there was no difference between urinary collector/refractometer (1008) and cotton/refractometer. Therefore, further research should be conducted with larger samples, using the methods investigated in this study and, whenever possible, comparing urine density values to laboratory tests.

  20. Statistical physics of medical ultrasonic images

    International Nuclear Information System (INIS)

    Wagner, R.F.; Insana, M.F.; Brown, D.G.; Smith, S.W.

    1987-01-01

    The physical and statistical properties of backscattered signals in medical ultrasonic imaging are reviewed in terms of: 1) the radiofrequency signal; 2) the envelope (video or magnitude) signal; and 3) the density of samples in simple and in compounded images. There is a wealth of physical information in backscattered signals in medical ultrasound. This information is contained in the radiofrequency spectrum - which is not typically displayed to the viewer - as well as in the higher statistical moments of the envelope or video signal - which are not readily accessed by the human viewer of typical B-scans. This information may be extracted from the detected backscattered signals by straightforward signal processing techniques at low resolution

  1. Two-mode Gaussian density matrices and squeezing of photons

    International Nuclear Information System (INIS)

    Tucci, R.R.

    1992-01-01

    In this paper, the authors generalize to 2-mode states the 1-mode state results obtained in a previous paper. The authors study 2-mode Gaussian density matrices. The authors find a linear transformation which maps the two annihilation operators, one for each mode, into two new annihilation operators that are uncorrelated and unsqueezed. This allows the authors to express the density matrix as a product of two 1-mode density matrices. The authors find general conditions under which 2-mode Gaussian density matrices become pure states. Possible pure states include the 2-mode squeezed pure states commonly mentioned in the literature, plus other pure states never mentioned before. The authors discuss the entropy and thermodynamic laws (Second Law, Fundamental Equation, and Gibbs-Duhem Equation) for the 2-mode states being considered

  2. Computing Science and Statistics: Volume 24. Graphics and Visualization

    Science.gov (United States)

    1993-03-20

    Mike West, Institute of Statistics & Decision Sciences, Duke University, Durham NC 27708, USA: abstract fragment on density estimation techniques, including the ratio-of-uniforms method (Statistics and Computing). [Remaining text of this scanned proceedings entry is garbled reference and address matter.]

  3. Statistical methods of estimating mining costs

    Science.gov (United States)

    Long, K.R.

    2011-01-01

    Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
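
    Taylor's Rule is commonly fit as a power law relating operating rate to tonnage. A hedged sketch with synthetic data (the coefficients below are illustrative stand-ins, not the USGS estimates) shows the usual log-log least-squares reestimation:

```python
import numpy as np

# Synthetic deposits: capacity = a * tonnage**b with lognormal scatter.
# true_a and true_b are illustrative; b near 0.75 is the classic Taylor value.
rng = np.random.default_rng(7)
tonnage = 10 ** rng.uniform(5, 9, 60)        # ore tonnage, 1e5 to 1e9 t
true_a, true_b = 0.0143, 0.75
capacity = true_a * tonnage**true_b * np.exp(rng.normal(0.0, 0.2, 60))

# Ordinary least squares in log-log space recovers the exponent.
b_hat, log_a_hat = np.polyfit(np.log(tonnage), np.log(capacity), 1)
print(b_hat, np.exp(log_a_hat))              # exponent close to 0.75
```

    The same regression structure extends to the capital and operating cost models described above by adding predictors such as strip ratio and distance from infrastructure.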

  4. Density Variations in the Earth's Magnetospheric Cusps

    Science.gov (United States)

    Walsh, B. M.; Niehof, J.; Collier, M. R.; Welling, D. T.; Sibeck, D. G.; Mozer, F. S.; Fritz, T. A.; Kuntz, K. D.

    2016-01-01

    Seven years of measurements from the Polar spacecraft are surveyed to monitor the variations of plasma density within the magnetospheric cusps. The spacecraft's orbital precession from 1998 through 2005 allows for coverage of both the northern and southern cusps from low altitude out to the magnetopause. In the mid- and high-altitude cusps, plasma density scales well with the solar wind density (n_cusp/n_sw ≈ 0.8). This trend is fairly steady for radial distances greater than 4 R_E. At low altitudes (r < 4 R_E) the density increases with decreasing altitude and even exceeds the solar wind density due to contributions from the ionosphere. The density of high charge state oxygen (charge state above +2) also displays a positive trend with solar wind density within the cusp. A multifluid simulation with the Block-Adaptive-Tree Solar Wind Roe-Type Upwind Scheme MHD model was run to monitor the relative contributions of the ionosphere and solar wind plasma within the cusp. The simulation provides results similar to the statistical measurements from Polar and confirms the presence of ionospheric plasma at low altitudes.

  5. Statistical process control for residential treated wood

    Science.gov (United States)

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  6. Statistical methods to monitor the West Valley off-gas system

    International Nuclear Information System (INIS)

    Eggett, D.L.

    1990-01-01

    This paper reports on the off-gas system for the ceramic melter operated at the West Valley Demonstration Project at West Valley, NY, which is monitored during melter operation. A one-at-a-time method of monitoring the parameters of the off-gas system is not statistically sound. Therefore, multivariate statistical methods appropriate for the monitoring of many correlated parameters will be used. Monitoring a large number of parameters increases the probability of a false out-of-control signal. If the parameters being monitored are statistically independent, the control limits can be easily adjusted to obtain the desired probability of a false out-of-control signal. The principal component (PC) scores have desirable statistical properties when the original variables are distributed as multivariate normals. Two statistics derived from the PC scores and used to form multivariate control charts are outlined and their distributional properties reviewed.
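
    One standard statistic built from PC scores is Hotelling's T-squared, which folds all correlated parameters into a single monitored quantity with a known reference distribution. The sketch below uses made-up in-control data standing in for off-gas parameters; it is an illustration of the general technique, not the West Valley implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical in-control data for four correlated off-gas parameters
# (illustrative stand-ins for temperatures, flows, pressures)
n, p = 500, 4
mix = np.array([[1.0, 0.0, 0.0, 0.0],
                [0.8, 0.6, 0.0, 0.0],
                [0.5, 0.3, 0.8, 0.0],
                [0.2, 0.2, 0.4, 0.9]])
X = rng.standard_normal((n, p)) @ mix.T

# Principal-component scores: project centred data on covariance eigenvectors
Xc = X - X.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
scores = Xc @ evecs

# Hotelling's T^2 sums squared PC scores scaled by their variances; under
# multivariate normality it is approximately chi-square with p d.o.f.
t2 = np.sum(scores**2 / evals, axis=1)

chi2_99 = 13.28            # 99th percentile of chi-square(4)
frac = np.mean(t2 > chi2_99)
print(f"fraction beyond the 99% control limit: {frac:.3f}")
```

In control, roughly 1% of observations exceed this limit by construction, which is how the false-alarm rate stays fixed no matter how many correlated parameters are monitored.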

  7. A statistical model for porous structure of rocks

    Institute of Scientific and Technical Information of China (English)

    JU Yang; YANG YongMing; SONG ZhenDuo; XU WenJing

    2008-01-01

    The geometric features and the distribution properties of pores in rocks were investigated by means of CT scanning tests of sandstones. The centroidal coordinates of pores and the statistical characteristics of pore distance, quantity, and size, together with their probability density functions, were formulated in this paper. The Monte Carlo method and a random number generating algorithm were employed to generate two series of random numbers with the desired statistical characteristics and probability density functions, upon which the random distributions of pore position, distance, and quantity were determined. A three-dimensional porous structural model of sandstone was constructed based on the FLAC3D program and the information on pore position and distribution that the series of random numbers defined. On the basis of the modelling, Brazil split tests of rock discs were carried out to examine the stress distribution, the pattern of element failure, and the inosculation of failed elements. The simulation indicated that the proposed model was consistent with the realistic porous structure of rock in terms of the statistical properties of pores and geometric similarity. The built-up model disclosed the influence of pores on the stress distribution, the failure mode of material elements, and the inosculation of failed elements.
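
    The Monte Carlo step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the specimen size, pore count, and size distribution parameters are all assumed values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Generate a 3-D pore population with prescribed statistics
box = np.array([50.0, 50.0, 50.0])     # specimen edge lengths, mm (assumed)
n_pores = rng.poisson(400)             # pore count (assumed mean)

centers = rng.uniform(0.0, 1.0, (n_pores, 3)) * box   # centroid positions
radii = rng.lognormal(-1.0, 0.4, n_pores)             # pore sizes, mm

# Nearest-neighbour spacing statistics from pairwise centroid distances
d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
nn = d.min(axis=1)

# Volume fraction occupied by (spherical) pores
por = (4.0 / 3.0) * np.pi * np.sum(radii**3) / box.prod()
print(f"{n_pores} pores, mean spacing {nn.mean():.2f} mm, porosity {por:.2%}")
```

A pore field generated this way can then be mapped onto a numerical grid (FLAC3D in the paper) to weaken the elements that fall inside pores.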


  9. Application of statistical physics approaches to complex organizations

    Science.gov (United States)

    Matia, Kaushik

    The first part of this thesis studies two different kinds of financial markets, namely, the stock market and the commodity market. Stock price fluctuations display certain scale-free statistical features that are not unlike those found in strongly-interacting physical systems. The possibility that new insights can be gained using concepts and methods developed to understand scale-free physical phenomena has stimulated considerable research activity in the physics community. In the first part of this thesis a comparative study of stocks and commodities is performed in terms of the probability density function and correlations of price fluctuations. It is found that the probability density of the stock price fluctuation has a power-law functional form with an exponent 3, which is similar across different markets around the world. We present an autoregressive model to explain the origin of this power-law functional form of the probability density function of the price fluctuation. The first part also presents the discovery of unique features of the Indian economy, which we find displays a scale-dependent probability density function. In the second part of this thesis we quantify the statistical properties of fluctuations of complex systems like business firms and world scientific publications. We analyze the class size of these systems, where units agglomerate to form classes. We find that the width of the probability density function of the growth rate decays with class size as a power law with an exponent beta which is universal, in the sense that beta is independent of the system studied. We also identify two further scaling exponents: one connecting the unit size to the class size, and another connecting the number of units to the class size, where products are units and firms are classes. Finally we propose a generalized preferential attachment model to describe the class size distribution. This model is successful in explaining the growth rate and class
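
    Tail exponents such as the cubic law for price fluctuations are typically estimated from the largest observations. A minimal sketch with the Hill estimator on synthetic data (Pareto draws with a known tail exponent of 3; not market data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "price fluctuations" with tail P(X > x) ~ x**-3, mimicking
# the cubic power law reported for stock returns (illustrative data only)
x = rng.pareto(3.0, 100_000) + 1.0

# Hill estimator: recover the tail exponent from the k largest observations
k = 1_000
tail = np.sort(x)[-k:]                  # top-k order statistics
alpha_hat = k / np.sum(np.log(tail / tail[0]))
print(f"Hill estimate of the tail exponent: {alpha_hat:.2f}")
```

The choice of k trades bias against variance; in empirical work the estimate is usually plotted over a range of k to check stability.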

  10. Statistical analysis of dragline monitoring data

    Energy Technology Data Exchange (ETDEWEB)

    Mirabediny, H.; Baafi, E.Y. [University of Tehran, Tehran (Iran)

    1998-07-01

    Dragline monitoring systems are normally the best tool used to collect data on the machine performance and operational parameters of a dragline operation. This paper discusses results of a time study using data from a dragline monitoring system captured over a four month period. Statistical summaries of the time study in terms of average values, standard deviation and frequency distributions showed that the mode of operation and the geological conditions have a significant influence on the dragline performance parameters. 6 refs., 14 figs., 3 tabs.

  11. Statistical summary of commercial jet aircraft accidents : worldwide operations, 1959-2009

    Science.gov (United States)

    2010-07-01

    The accident statistics presented in this summary are confined to worldwide commercial jet airplanes that are heavier than 60,000 pounds maximum gross weight. Within that set of airplanes, there are two groups excluded: 1) Airplanes manufactured in...

  12. The structure and statistics of interstellar turbulence

    International Nuclear Information System (INIS)

    Kritsuk, A G; Norman, M L; Ustyugov, S D

    2017-01-01

    We explore the structure and statistics of multiphase, magnetized ISM turbulence in the local Milky Way by means of driven periodic box numerical MHD simulations. Using the higher-order-accurate piecewise-parabolic method on a local stencil (PPML), we carry out a small parameter survey varying the mean magnetic field strength and density while fixing the rms velocity to observed values. We quantify numerous characteristics of the transient and steady-state turbulence, including its thermodynamics and phase structure, kinetic and magnetic energy power spectra, structure functions, and distribution functions of density, column density, pressure, and magnetic field strength. The simulations reproduce many observables of the local ISM, including molecular clouds, such as the ratio of turbulent to mean magnetic field at 100 pc scale, the mass and volume fractions of thermally stable H I, the lognormal distribution of column densities, the mass-weighted distribution of thermal pressure, and the linewidth-size relationship for molecular clouds. Our models predict the shape of magnetic field probability density functions (PDFs), which are strongly non-Gaussian, and the relative alignment of magnetic field and density structures. Finally, our models show how the observed low rates of star formation per free-fall time are controlled by the multiphase thermodynamics and large-scale turbulence. (paper)

  13. Probabilistic and Statistical Aspects of Quantum Theory

    CERN Document Server

    Holevo, Alexander S

    2011-01-01

    This book is devoted to aspects of the foundations of quantum mechanics in which probabilistic and statistical concepts play an essential role. The main part of the book concerns the quantitative statistical theory of quantum measurement, based on the notion of positive operator-valued measures. During the past years there has been substantial progress in this direction, stimulated to a great extent by new applications such as Quantum Optics, Quantum Communication and high-precision experiments. The questions of statistical interpretation, quantum symmetries, theory of canonical commutation re

  14. Semiclassical analysis, Witten Laplacians, and statistical mechanics

    CERN Document Server

    Helffer, Bernard

    2002-01-01

    This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S

  15. Effect of liquid density differences on boiling two-phase flow stability

    International Nuclear Information System (INIS)

    Furuya, Masahiro; Manera, Annalisa; Bragt, David D.B.; Hagen, Tim H.J.J. van der; Kruijf, Willy J.M.de

    2002-01-01

    In order to investigate the effect of the liquid density dependence on local fluid temperature on thermal-hydraulic stability, a linear stability analysis is performed for a boiling natural circulation loop with an adiabatic riser. Type-I and Type-II instabilities were investigated according to Fukuda-Kobori's classification. Type-I instability is dominant when the flow quality is low, while Type-II instability is relevant at high flow quality. Type-II instability is well known as the typical density wave oscillation. Neglecting liquid density differences yields estimates of Type-II instability margins that are too small, due to both a change in system-dynamics features and in the operational point. On the other hand, neglecting liquid density differences yields estimates of Type-I stability margins that are too large, especially due to a change in the operational point. Neglecting density differences is thus non-conservative in this case. Therefore, it is highly recommended to include the liquid density dependence on fluid subcooling in the stability analysis if a flow loop with an adiabatic riser is operated under low flow quality conditions. (author)

  16. Impact of connection density on regional cost differences for network operators in the Netherlands

    International Nuclear Information System (INIS)

    2009-04-01

    The Dutch Office of Energy Regulation ('Energiekamer') has an obligation to investigate the extent to which the electricity and gas distribution businesses (DNOs) in the Netherlands face different structural environments that result in regional cost differences which, in turn, could justify tariff differences. On the basis of previous studies, Energiekamer has identified 'water crossings' and 'local taxes' as allowable regional differences. To account for them, Energiekamer has introduced an adjustment to the regulated revenues formula in order to guarantee a level-playing field to the Dutch DNOs. In addition to these factors, it has been claimed that connection density may have an impact on distribution costs and that, therefore, regulated revenues should be adjusted to compensate for regional differences in connection density between DNOs. However, so far, the research in this field has been unable to identify a sufficiently robust relationship between cost and connection density to support this claim. In order to address this issue, Energiekamer has asked Frontier Economics and Consentec to further investigate the relationship between connection density and distribution costs in the Netherlands. Therefore, our analysis has aimed at determining whether, and to what extent, connection density in the Netherlands is a significant driver of the costs of electricity and gas distribution networks. The following three questions are answered: (1) Is connection density a significant cost driver in electricity and gas networks in the Netherlands?; (2) If so, which functional form (e.g. U-shaped) does this relationship have in the Netherlands?; (3) Finally, based on the evidence collected, is the influence of connection density sufficiently well-determined to be considered a regional difference in the Dutch regulatory framework?

  17. Kolmogorov complexity, pseudorandom generators and statistical models testing

    Czech Academy of Sciences Publication Activity Database

    Šindelář, Jan; Boček, Pavel

    2002-01-01

    Roč. 38, č. 6 (2002), s. 747-759 ISSN 0023-5954 R&D Projects: GA ČR GA102/99/1564 Institutional research plan: CEZ:AV0Z1075907 Keywords : Kolmogorov complexity * pseudorandom generators * statistical models testing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.341, year: 2002

  18. A note on asymptotic normality in the thermodynamic limit at low densities

    DEFF Research Database (Denmark)

    Jensen, J.L.

    1991-01-01

    We consider a continuous statistical mechanical system with a pair interaction in a region λ tending to infinity. For low densities asymptotic normality of the canonical statistic is proved, both in the grand canonical ensemble and in the canonical ensemble. The results are illustrated through...

  19. Corneal Endothelial Cell Density and Morphology in Healthy Turkish Eyes

    Directory of Open Access Journals (Sweden)

    Ceyhun Arıcı

    2014-01-01

    Full Text Available Purpose. To describe the normative values of corneal endothelial cell density, morphology, and central corneal thickness in healthy Turkish eyes. Methods. Specular microscopy was performed in 252 eyes of 126 healthy volunteers (M : F, 42 : 84). Parameters studied included mean endothelial cell density (MCD), mean cell area (MCA), coefficient of variation (CV) in cell size, percentage of hexagonal cells, and central corneal thickness (CCT). Results. The mean age of volunteers was 44.3±13.5 (range, 20 to 70) years. There was a statistically significant decrease in MCD (P<0.001; correlation, −0.388) and percentage of hexagonal cells (P<0.001; correlation, −0.199) with age. There was also a statistically significant increase in MCA (P<0.001; correlation, 0.363) with increasing age. There was no statistically significant difference in MCD, MCA, CV in cell size, percentage of hexagonal cells, and CCT between genders, and there was also no significant difference in these parameters between fellow eyes of subjects. Conclusions. Normative data for the endothelium in the Turkish population are reported. Endothelial cell density in Turkish eyes is lower than that described in Japanese, American, Chinese, and Filipino eyes and higher than that described in Indian, Thai, and Iranian eyes.

  20. Statistical Network Analysis for Functional MRI: Mean Networks and Group Comparisons.

    Directory of Open Access Journals (Sweden)

    Cedric E Ginestet

    2014-05-01

    Full Text Available Comparing networks in neuroscience is hard, because the topological properties of a given network are necessarily dependent on the number of edges of that network. This problem arises in the analysis of both weighted and unweighted networks. The term density is often used in this context, in order to refer to the mean edge weight of a weighted network, or to the number of edges in an unweighted one. Comparing families of networks is therefore statistically difficult because differences in topology are necessarily associated with differences in density. In this review paper, we consider this problem from two different perspectives: (i) the construction of summary networks, such as how to compute and visualize the mean network from a sample of network-valued data points; and (ii) how to test for topological differences, when two families of networks also exhibit significant differences in density. In the first instance, we show that the task of summarizing a family of networks can be addressed by either adopting a mass-univariate approach, which produces a statistical parametric network (SPN), or by directly computing the mean network, provided that a metric has been specified on the space of all networks with a given number of nodes. In the second part of this review, we then highlight the inherent problems associated with the comparison of topological functions of families of networks that differ in density. In particular, we show that a wide range of topological summaries, such as global efficiency and network modularity, are highly sensitive to differences in density. Moreover, these problems are not restricted to unweighted metrics, as we demonstrate that the same issues remain present when considering the weighted versions of these metrics. We conclude by encouraging caution when reporting such statistical comparisons, and by emphasizing the importance of constructing summary networks.
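
    The density pitfall is easy to demonstrate numerically. The sketch below uses made-up binary networks standing in for thresholded fMRI connectivity; the edge probability and the majority-vote threshold are illustrative assumptions, not values from the review.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical sample of 20 binary connectivity networks on 30 nodes
n_subj, n_nodes, p_edge = 20, 30, 0.2
upper = rng.random((n_subj, n_nodes, n_nodes)) < p_edge
upper = np.triu(upper, 1)
adj = upper | upper.transpose(0, 2, 1)       # symmetric, no self-loops

def density(a):
    """Edges present divided by edges possible."""
    n = a.shape[0]
    return a[np.triu_indices(n, 1)].sum() / (n * (n - 1) / 2)

# Mass-univariate summary ("statistical parametric network"): keep an
# edge if a majority of subjects express it (threshold is illustrative)
spn = adj.mean(axis=0) > 0.5

mean_d = float(np.mean([density(a) for a in adj]))
spn_d = density(spn)
print(f"mean subject density: {mean_d:.3f}, SPN density: {spn_d:.3f}")
```

The summary network ends up far sparser than any individual subject's network, so any topological metric computed on it is confounded with density, which is exactly the caution the review urges.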

  1. The algebraic geometry of Harper operators

    International Nuclear Information System (INIS)

    Li, Dan

    2011-01-01

    Following an approach developed by Gieseker, Knoerrer and Trubowitz for discretized Schroedinger operators, we study the spectral theory of Harper operators in dimensions 2 and 1, as a discretized model of magnetic Laplacians, from the point of view of algebraic geometry. We describe the geometry of an associated family of Bloch varieties and compute their density of states. Finally, we also compute some spectral functions based on the density of states. We discuss the difference between the cases with rational or irrational parameters: for the two-dimensional Harper operator, the compactification of the Bloch variety is an ordinary variety in the rational case and an ind-pro-variety in the irrational case. This gives rise, at the algebro-geometric level of Bloch varieties, to a phenomenon similar to the Hofstadter butterfly in the spectral theory. In dimension 2, the density of states can be expressed in terms of period integrals over Fermi curves, where the resulting elliptic integrals are independent of the parameters. In dimension 1, for the almost Mathieu operator, with a similar argument, we find the usual dependence of the spectral density on the parameter, which gives rise to the well-known Hofstadter butterfly picture. (paper)

  2. FEATURES BASED ON NEIGHBORHOOD PIXELS DENSITY - A STUDY AND COMPARISON

    Directory of Open Access Journals (Sweden)

    Satish Kumar

    2016-02-01

    Full Text Available In optical character recognition applications, the feature extraction method(s) used to recognize document images play an important role. Features are properties of the pattern that can be statistical, structural, and/or based on transforms or series expansions. Structural features are difficult to compute, particularly from hand-printed images, but the structure of the strokes present inside hand-printed images can be estimated using statistical means. In this paper three features are proposed that are based on the distribution of B/W pixels in the neighborhood of a pixel in an image. We name these features Spiral Neighbor Density, Layer Pixel Density, and Ray Density. The recognition performance of these features has been compared with two more features, Neighborhood Pixels Weight and Total Distances in Four Directions, already studied in our work. We have used more than 20000 Devanagari handwritten character images for conducting experiments. The experiments are conducted with two classifiers, i.e., PNN and k-NN.

  3. Statistical mixing and aggregation in Feller diffusion

    International Nuclear Information System (INIS)

    Anteneodo, C; Duarte Queirós, S M

    2009-01-01

    We consider Feller mean-reverting square-root diffusion, which has been applied to model a wide variety of processes with linearly state-dependent diffusion, such as stochastic volatility and interest rates in finance, and neuronal and population dynamics in the natural sciences. We focus on the statistical mixing (or superstatistical) process in which the parameter related to the mean value can fluctuate—a plausible mechanism for the emergence of heavy-tailed distributions. We obtain analytical results for the associated probability density function (both stationary and time-dependent), its correlation structure and aggregation properties. Our results are applied to explain the statistics of stock traded volume at different aggregation scales
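
    The mixing mechanism can be sketched numerically: simulate the Feller (square-root) diffusion with a reversion level that itself fluctuates across realisations. The scheme, parameter values, and gamma mixing law below are illustrative assumptions, not the paper's analytical setup.

```python
import numpy as np

rng = np.random.default_rng(5)

def feller_path(theta, kappa=2.0, sigma=0.5, x0=1.0, dt=1e-3, n=2_000):
    """Euler scheme for dX = kappa*(theta - X) dt + sigma*sqrt(X) dW."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        xp = max(x[i - 1], 0.0)          # full truncation keeps sqrt real
        x[i] = (x[i - 1] + kappa * (theta - xp) * dt
                + sigma * np.sqrt(xp * dt) * rng.standard_normal())
    return x

# Statistical mixing: the reversion level theta fluctuates between
# realisations (a gamma mixing law is an illustrative choice)
thetas = rng.gamma(2.0, 1.0, size=150)
finals = np.array([feller_path(t)[-1] for t in thetas])
print(f"mixture mean {finals.mean():.2f}, std {finals.std():.2f}")
```

Because each path relaxes toward its own theta, the cross-sectional distribution of the mixed ensemble is broader than any single-theta stationary law, which is the mechanism the paper invokes for heavy tails.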

  4. An Optimization Principle for Deriving Nonequilibrium Statistical Models of Hamiltonian Dynamics

    Science.gov (United States)

    Turkington, Bruce

    2013-08-01

    A general method for deriving closed reduced models of Hamiltonian dynamical systems is developed using techniques from optimization and statistical estimation. Given a vector of resolved variables, selected to describe the macroscopic state of the system, a family of quasi-equilibrium probability densities on phase space corresponding to the resolved variables is employed as a statistical model, and the evolution of the mean resolved vector is estimated by optimizing over paths of these densities. Specifically, a cost function is constructed to quantify the lack-of-fit to the microscopic dynamics of any feasible path of densities from the statistical model; it is an ensemble-averaged, weighted, squared-norm of the residual that results from submitting the path of densities to the Liouville equation. The path that minimizes the time integral of the cost function determines the best-fit evolution of the mean resolved vector. The closed reduced equations satisfied by the optimal path are derived by Hamilton-Jacobi theory. When expressed in terms of the macroscopic variables, these equations have the generic structure of governing equations for nonequilibrium thermodynamics. In particular, the value function for the optimization principle coincides with the dissipation potential that defines the relation between thermodynamic forces and fluxes. The adjustable closure parameters in the best-fit reduced equations depend explicitly on the arbitrary weights that enter into the lack-of-fit cost function. Two particular model reductions are outlined to illustrate the general method. In each example the set of weights in the optimization principle contracts into a single effective closure parameter.
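
    The construction above can be summarized schematically. This is a minimal sketch in symbols, with the functional setting assumed from the abstract (quasi-equilibrium path, Liouville operator L, weight operator W):

```latex
% Lack-of-fit residual of a quasi-equilibrium path \tilde\rho(t) against
% the Liouville equation, and the best-fit variational principle (schematic):
\sigma(t) = \partial_t \tilde\rho(t) + L\,\tilde\rho(t), \qquad
\min_{\tilde\rho(\cdot)} \int_0^T
  \big\langle \sigma(t),\, W\,\sigma(t) \big\rangle \, dt .
```

The minimizing path, characterized via Hamilton-Jacobi theory, yields the closed equations for the mean resolved vector.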

  5. Statistical learning: a powerful mechanism that operates by mere exposure.

    Science.gov (United States)

    Aslin, Richard N

    2017-01-01

    How do infants learn so rapidly and with little apparent effort? In 1996, Saffran, Aslin, and Newport reported that 8-month-old human infants could learn the underlying temporal structure of a stream of speech syllables after only 2 min of passive listening. This demonstration of what was called statistical learning, involving no instruction, reinforcement, or feedback, led to dozens of confirmations of this powerful mechanism of implicit learning in a variety of modalities, domains, and species. These findings reveal that infants are not nearly as dependent on explicit forms of instruction as we might have assumed from studies of learning in which children or adults are taught facts such as math or problem solving skills. Instead, at least in some domains, infants soak up the information around them by mere exposure. Learning and development in these domains thus appear to occur automatically and with little active involvement by an instructor (parent or teacher). The details of this statistical learning mechanism are discussed, including how exposure to specific types of information can, under some circumstances, generalize to never-before-observed information, thereby enabling transfer of learning. WIREs Cogn Sci 2017, 8:e1373. doi: 10.1002/wcs.1373 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  6. Procedure for Uranium-Molybdenum Density Measurements and Porosity Determination

    Energy Technology Data Exchange (ETDEWEB)

    Prabhakaran, Ramprashad [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Devaraj, Arun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Joshi, Vineet V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lavender, Curt A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-08-13

    The purpose of this document is to provide guidelines for preparing uranium-molybdenum (U-Mo) specimens, performing density measurements, and computing sample porosity. Typical specimens (solids) will be sheared to small rectangular foils, disks, or pieces of metal. A mass balance, solid density determination kit, and a liquid of known density will be used to determine the density of U-Mo specimens using the Archimedes principle. A standard test weight of known density would be used to verify proper operation of the system. By measuring the density of a U-Mo sample, it is possible to determine its porosity.
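
    The computation described above reduces to two formulas. The sketch below follows the standard Archimedes-principle calculation; the sample masses, liquid density, and the nominal theoretical density used for the porosity step are made-up illustrative values, not numbers from the procedure.

```python
# Archimedes-principle density and porosity computation (illustrative).

def archimedes_density(m_air, m_immersed, rho_liquid):
    """rho = m_air * rho_liquid / (m_air - m_immersed)."""
    return m_air * rho_liquid / (m_air - m_immersed)

def porosity(rho_measured, rho_theoretical):
    """Fraction of the sample volume not occupied by solid metal."""
    return 1.0 - rho_measured / rho_theoretical

# Masses in grams, liquid density in g/cm^3 (assumed values)
rho = archimedes_density(m_air=12.400, m_immersed=11.672, rho_liquid=0.998)
por = porosity(rho, rho_theoretical=17.2)
print(f"density: {rho:.2f} g/cm^3, porosity: {por:.1%}")
```

The standard test weight mentioned in the procedure plays the role of a check: running the same two measurements on it should reproduce its certified density before any U-Mo sample is measured.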

  7. Variation of level density parameter with angular momentum in 119Sb

    International Nuclear Information System (INIS)

    Aggarwal, Mamta; Kailas, S.

    2015-01-01

    Nuclear level density (NLD), a basic ingredient of the statistical model, has been a subject of interest for several decades, as it plays an important role in the understanding of a wide variety of nuclear reactions. There have been various efforts towards the precise determination of NLD and the study of its dependence on excitation energy and angular momentum, as it is crucial in the determination of cross-sections. Here we report results of theoretical calculations in a microscopic framework to understand the experimental results on the inverse level density parameter (k) extracted for different angular momentum regions of 119 Sb corresponding to different γ-ray multiplicities, obtained by comparing the experimental neutron energy spectra with statistical model predictions, where an increase in the level density with increasing angular momentum is predicted. The dependence of NLD and neutron emission spectra on temperature and spin was studied in our earlier works, where the influence of structural transitions due to angular momentum and temperature on the level density of states and the neutron emission probability was shown

  8. The use of statistical models in heavy-ion reactions studies

    International Nuclear Information System (INIS)

    Stokstad, R.G.

    1984-01-01

    This chapter reviews the use of statistical models to describe nuclear level densities and the decay of equilibrated nuclei. The statistical models of nuclear structure and nuclear reactions presented here have wide application in the analysis of heavy-ion reaction data. Applications are illustrated with examples of gamma-ray decay, the emission of light particles and heavier clusters of nucleons, and fission. In addition to the compound nucleus, the treatment of equilibrated fragments formed in binary reactions is discussed. The statistical model is shown to be an important tool for the identification of products from nonequilibrium decay

  9. Investigation of Physical Processes Limiting Plasma Density in DIII--D

    Science.gov (United States)

    Maingi, R.

    1996-11-01

    Understanding the physical processes which limit operating density is crucial in achieving peak performance in confined plasmas. Studies from many of the world's tokamaks have indicated the existence of an operational density limit (the Greenwald limit, n^GW_max; M. Greenwald et al., Nucl. Fusion 28 (1988) 2199) which is proportional to the plasma current and independent of heating power. Several theories have reproduced the current dependence, but the lack of a heating power dependence in the data has presented an enigma. This limit impacts the International Thermonuclear Experimental Reactor (ITER) because the nominal operating density for ITER is 1.5 × n^GW_max. In DIII-D, experiments are being conducted to understand the physical processes which limit operating density in H-mode discharges; these processes include X-point MARFE formation, high core recycling and neutral pressure, resistive MHD stability, and core radiative collapse. These processes affect plasma properties, i.e. edge/scrape-off layer conduction and radiation, edge pressure gradient and plasma current density profile, and core radiation, which in turn restrict the accessible density regime. With divertor pumping and D2 pellet fueling, core neutral pressure is reduced and X-point MARFE formation is effectively eliminated. Injection of the largest-sized pellets does cause transient formation of divertor MARFEs which occasionally migrate to the X-point, but these are rapidly extinguished in pumped discharges in the time between pellets. In contrast to Greenwald et al., it is found that the density relaxation time after pellets is largely independent of the density relative to the Greenwald limit. Fourier analysis of Mirnov oscillations indicates the de-stabilization and growth of rotating, tearing-type modes (m/n = 2/1) when the injected pellets cause large density perturbations, and these modes often reduce energy confinement back to L-mode levels. We are examining the mechanisms for de

  10. Electron and current density measurements on tokamak plasmas

    International Nuclear Information System (INIS)

    Lammeren, A.C.A.P. van.

    1991-01-01

    The first part of this thesis describes the Thomson-scattering diagnostic as it was present at the TORTUR tokamak. For the first time with this diagnostic a complete tangential scattering spectrum was recorded during one single laser pulse. From this scattering spectrum the local current density was derived. Small deviations from the expected gaussian scattering spectrum were observed, indicating the non-Maxwellian character of the electron-velocity distribution. The second part of this thesis describes the multi-channel interferometer/polarimeter diagnostic which was constructed, built, and operated on the Rijnhuizen Tokamak Project (RTP) tokamak. The diagnostic was operated routinely, yielding the development of the density profiles for every discharge. When ECRH (Electron Cyclotron Resonance Heating) is switched on, the density profile broadens, the central density decreases and the total density increases; the opposite takes place when ECRH is switched off. The influence of MHD (magnetohydrodynamics) activity on the density was clearly observable. In the central region of the plasma it was measured that in hydrogen discharges the so-called sawtooth collapse is preceded by an m=1 instability which grows rapidly. An increase in radius of this m=1 mode of 1.5 cm just before the crash is observed. In hydrogen discharges the sawtooth-induced density pulse shows an asymmetry for the high- and low-field side propagation. This asymmetry disappeared for helium discharges. From the location of the maximum density variations during an m=2 mode the position of the q=2 surface is derived. The density profiles are measured during the energy quench phase of a plasma disruption. A fast flattening and broadening of the density profile is observed. (author). 95 refs.; 66 figs.; 7 tabs

  11. Wave-function functionals for the density

    International Nuclear Information System (INIS)

    Slamet, Marlina; Pan Xiaoyin; Sahni, Viraht

    2011-01-01

    We extend the idea of the constrained-search variational method for the construction of wave-function functionals ψ[χ] of functions χ. The search is constrained to those functions χ such that ψ[χ] reproduces the density ρ(r) while simultaneously leading to an upper bound to the energy. The functionals are thereby normalized and automatically satisfy the electron-nucleus coalescence condition. The functionals ψ[χ] are also constructed to satisfy the electron-electron coalescence condition. The method is applied to the ground state of the helium atom to construct functionals ψ[χ] that reproduce the density as given by the Kinoshita correlated wave function. The expectations of the single-particle operators W = Σ_i r_i^n, n = -2, -1, 1, 2, and W = Σ_i δ(r_i) are exact, as must be the case. The expectations of the kinetic energy operator W = -(1/2)Σ_i ∇_i², the two-particle operators W = Σ u^n, n = -2, -1, 1, 2, where u = |r_i - r_j|, and the energy are accurate. We note that the construction of such functionals ψ[χ] is an application of the Levy-Lieb constrained-search definition of density functional theory. It is thereby possible to rigorously determine which functional ψ[χ] is closer to the true wave function.

  12. Statistical modelling with quantile functions

    CERN Document Server

    Gilchrist, Warren

    2000-01-01

    Galton used quantiles more than a hundred years ago in describing data. Tukey and Parzen used them in the 60s and 70s in describing populations. Since then, the authors of many papers, both theoretical and practical, have used various aspects of quantiles in their work. Until now, however, no one had put all the ideas together to form what turns out to be a general approach to statistics. Statistical Modelling with Quantile Functions does just that. It systematically examines the entire process of statistical modelling, starting with using the quantile function to define continuous distributions. The author shows that by using this approach, it becomes possible to develop complex distributional models from simple components. A modelling kit can be developed that applies to the whole model - deterministic and stochastic components - and this kit operates by adding, multiplying, and transforming distributions rather than data. Statistical Modelling with Quantile Functions adds a new dimension to the practice of stati...

  13. Thermodynamics of a one-dimensional ideal gas with fractional exclusion statistics

    International Nuclear Information System (INIS)

    Murthy, M.V.N.; Shankar, R.

    1994-01-01

    We show that the particles in the Calogero-Sutherland model obey fractional exclusion statistics as defined by Haldane. We construct anyon number densities and derive the energy distribution function. We show that the partition function factorizes in the form characteristic of an ideal gas. The virial expansion is exactly computable and interestingly it is only the second virial coefficient that encodes the statistics information

  14. Density equalizing map projections (cartograms) in public health applications

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.

    1998-05-01

    In studying geographic disease distributions, one normally compares rates among arbitrarily defined geographic subareas (e.g. census tracts), thereby sacrificing some of the geographic detail of the original data. The sparser the data, the larger the subareas must be in order to calculate stable rates. This dilemma is avoided with the technique of Density Equalizing Map Projections (DEMP)©. Boundaries of geographic subregions are adjusted to equalize population density over the entire study area. Case locations plotted on the transformed map should have a uniform distribution if the underlying disease risk is constant. On the transformed map, the statistical analysis of the observed distribution is greatly simplified. Even for sparse distributions, the statistical significance of a supposed disease cluster can be calculated with validity. The DEMP algorithm was applied to a data set previously analyzed with conventional techniques; namely, 401 childhood cancer cases in four counties of California. The distribution of cases on the transformed map was analyzed visually and statistically. To check the validity of the method, the identical analysis was performed on 401 artificial cases randomly generated under the assumption of uniform risk. No statistically significant evidence for geographic non-uniformity of rates was found, in agreement with the original analysis performed by the California Department of Health Services.
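The simplified statistical analysis on the transformed map can be illustrated with a toy uniformity check. The grid size and point sets below are invented for illustration; the idea is only that, after density equalization, case locations under constant risk should pass a standard chi-square test against a uniform distribution, while a genuine cluster should not:

```python
import random

# After a density-equalizing transform, case locations should be uniform over
# the study area if risk is constant. A simple check: bin the points into a
# grid and compute Pearson's chi-square statistic against equal cell counts.
def chi_square_uniform(points, nx=5, ny=5):
    counts = [[0] * ny for _ in range(nx)]
    for x, y in points:  # points assumed scaled to the unit square
        i = min(int(x * nx), nx - 1)
        j = min(int(y * ny), ny - 1)
        counts[i][j] += 1
    n = len(points)
    expected = n / (nx * ny)
    return sum((c - expected) ** 2 / expected for row in counts for c in row)

random.seed(1)
# 401 cases under uniform risk vs. 401 cases crowded into one corner
uniform_cases = [(random.random(), random.random()) for _ in range(401)]
clustered = [(0.3 * random.random(), 0.3 * random.random()) for _ in range(401)]

stat_u = chi_square_uniform(uniform_cases)
stat_c = chi_square_uniform(clustered)
# with 24 degrees of freedom the statistic should hover near 24 for uniform
# data and be far larger for the clustered data
print(round(stat_u, 1), round(stat_c, 1))
```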

  15. Statistical Indicators for Religious Studies: Indicators of Level and Structure

    Science.gov (United States)

    Herteliu, Claudiu; Isaic-Maniu, Alexandru

    2009-01-01

    Using statistical indicators as vectors of information about the operational status of a phenomenon, including a religious one, is unanimously accepted. By introducing a system of statistical indicators we can also analyze the interfacing areas of a phenomenon. In this context, we have elaborated a system of statistical indicators specific to the…

  16. Statistical variability and confidence intervals for planar dose QA pass rates

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, Daniel W.; Nelms, Benjamin E.; Attwood, Kristopher; Kumaraswamy, Lalith; Podgorsak, Matthew B. [Department of Physics, State University of New York at Buffalo, Buffalo, New York 14260 (United States) and Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, New York 14263 (United States); Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Department of Biostatistics, Roswell Park Cancer Institute, Buffalo, New York 14263 (United States); Department of Radiation Medicine, Roswell Park Cancer Institute, Buffalo, New York 14263 (United States); Department of Molecular and Cellular Biophysics and Biochemistry, Roswell Park Cancer Institute, Buffalo, New York 14263 (United States) and Department of Physiology and Biophysics, State University of New York at Buffalo, Buffalo, New York 14214 (United States)

    2011-11-15

    Purpose: The most common metric for comparing measured to calculated dose, such as for pretreatment quality assurance of intensity-modulated photon fields, is a pass rate (%) generated using percent difference (%Diff), distance-to-agreement (DTA), or some combination of the two (e.g., gamma evaluation). For many dosimeters, the grid of analyzed points corresponds to an array with a low areal density of point detectors. In these cases, the pass rates for any given comparison criteria are not absolute but exhibit statistical variability that is a function, in part, of the detector sampling geometry. In this work, the authors analyze the statistics of various methods commonly used to calculate pass rates and propose methods for establishing confidence intervals for pass rates obtained with low-density arrays. Methods: Dose planes were acquired for 25 prostate and 79 head and neck intensity-modulated fields via diode array and electronic portal imaging device (EPID), and matching calculated dose planes were created via a commercial treatment planning system. Pass rates for each dose plane pair (both centered to the beam central axis) were calculated with several common comparison methods: %Diff/DTA composite analysis and gamma evaluation, using absolute dose comparison with both local and global normalization. Specialized software was designed to selectively sample the measured EPID response (very high data density) down to discrete points to simulate low-density measurements. The software was used to realign the simulated detector grid at many simulated positions with respect to the beam central axis, thereby altering the low-density sampled grid. Simulations were repeated with 100 positional iterations using a 1 detector/cm² uniform grid, a 2 detector/cm² uniform grid, and similar random detector grids. For each simulation, %Diff/DTA composite pass rates were calculated with various %Diff/DTA criteria and for both local and global %Diff normalization
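A minimal sketch of a %Diff/DTA composite pass rate, assuming a coarse uniform grid and global normalization; the tolerances, grid spacing, and nearest-neighbour DTA search are illustrative simplifications, not the authors' software:

```python
import math

# A point passes if its dose difference is within tol_percent of the global
# maximum, OR some calculated point within dta_mm agrees to the same tolerance.
def pass_rate(measured, calculated, spacing_mm=10.0,
              tol_percent=3.0, dta_mm=3.0):
    rows, cols = len(measured), len(measured[0])
    norm = max(max(row) for row in calculated)  # global normalization
    passed = 0
    for i in range(rows):
        for j in range(cols):
            diff = 100.0 * abs(measured[i][j] - calculated[i][j]) / norm
            if diff <= tol_percent:
                passed += 1
                continue
            # distance-to-agreement: search neighbouring calculated points
            ok = False
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        dist = spacing_mm * math.hypot(di, dj)
                        close = abs(measured[i][j] - calculated[ii][jj])
                        if dist <= dta_mm and 100.0 * close / norm <= tol_percent:
                            ok = True
            if ok:
                passed += 1
    return 100.0 * passed / (rows * cols)

calc = [[10.0, 20.0], [30.0, 40.0]]  # toy 2x2 dose planes
meas = [[10.5, 20.0], [30.0, 45.0]]
print(pass_rate(meas, calc))  # 3 of 4 points pass -> 75.0
```

The statistical variability the paper studies arises because shifting such a sparse grid relative to the dose distribution changes which points land in high-gradient regions, and hence the pass rate itself.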

  17. spatial statistics of poultry production in anambra state of nigeria

    African Journals Online (AJOL)

    user

    case study. Spatial statistics toolbox in ArcGIS was used to generate point density map which reveal the regional .... Global Positioning System (GPS) .... report generated is shown in Figure . .... for the analysis of crime incident locations. Ned.

  18. Statistical mechanics of dense plasmas and implications for the plasma polarization shift

    International Nuclear Information System (INIS)

    Rogers, F.J.

    1984-01-01

    A brief description of the statistical mechanics of reacting, dense plasmas is given. The results do not support a Debye-like polarization shift at low density. It is shown that the electronic charge density factors into a strongly quantum mechanical part that is not much affected by many-body correlations and a weakly quantum mechanical part that is considerably affected by many-body correlations. The few-body charge density is obtained from direct solution of the Schroedinger equation and the many-body charge density is obtained from the hypernetted chain equation through the introduction of a pseudopotential

  19. Landslide susceptibility mapping using GIS-based statistical models and Remote sensing data in tropical environment.

    Science.gov (United States)

    Shahabi, Himan; Hashim, Mazlan

    2015-04-22

    This research presents the results of GIS-based statistical models for the generation of landslide susceptibility maps using a geographic information system (GIS) and remote-sensing data for the Cameron Highlands area in Malaysia. Ten factors including slope, aspect, soil, lithology, NDVI, land cover, distance to drainage, precipitation, distance to fault, and distance to road were extracted from SAR data, SPOT 5 and WorldView-1 images. The relationships between the detected landslide locations and these ten related factors were identified by using GIS-based statistical models including analytical hierarchy process (AHP), weighted linear combination (WLC) and spatial multi-criteria evaluation (SMCE) models. The landslide inventory map, which has a total of 92 landslide locations, was created based on numerous resources such as digital aerial photographs, AIRSAR data, WorldView-1 images, and field surveys. Then, 80% of the landslide inventory was used for training the statistical models and the remaining 20% was used for validation purposes. The validation results using the relative landslide density index (R-index) and receiver operating characteristic (ROC) demonstrated that the SMCE model (accuracy 96%) is better in prediction than the AHP (accuracy 91%) and WLC (accuracy 89%) models. These landslide susceptibility maps would be useful for hazard mitigation purposes and regional planning.
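The ROC validation step rests on the AUC, which can be computed directly from its rank interpretation: the probability that a randomly chosen landslide cell receives a higher susceptibility score than a randomly chosen stable cell. A sketch with invented scores:

```python
# Rank-sum (Mann-Whitney) formulation of the ROC area under the curve.
def roc_auc(pos_scores, neg_scores):
    wins = ties = 0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(pos_scores) * len(neg_scores))

landslide = [0.9, 0.8, 0.75, 0.6]   # susceptibility at known landslide sites
stable = [0.2, 0.4, 0.5, 0.7, 0.1]  # susceptibility at non-landslide sites
print(roc_auc(landslide, stable))   # 19 of 20 pairs ranked correctly -> 0.95
```

An AUC of 0.5 corresponds to random ranking; the paper's reported accuracies (96%, 91%, 89%) are comparisons of this kind between the three models.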

  20. An Invitation to Algebraic Statistics: New Outlook and Opportunities

    OpenAIRE

    Çetin, Eyüp

    2012-01-01

    Algebra, a branch of pure mathematics, now advances statistics and operations research of applied mathematics. This synergy is called algebraic statistics as a new discipline. Algebraic statistics offers statisticians, management scientists, business researchers, econometricians and algebraists new opportunities, horizons and connections to advance their fields and related application areas. In this effort, this young, vibrant, quickly growing, and active discipline is briefly discussed and s...

  1. Mammographic Breast Density in a Cohort of Medically Underserved Women

    Science.gov (United States)

    2015-12-01

    was a training year and during Years 2 through 4 a case-control study of obesity, insulin resistance and mammographic breast density was conducted. A...factors including health literacy, and to collect anthropometric measurements and fasting blood, 3) to assay blood for select hormones and growth...factors, 4) to perform statistical analyses to determine the associations between obesity and insulin resistance and mammographic breast density, and 5

  2. Gregor Mendel, His Experiments and Their Statistical Evaluation

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2014-01-01

    Roč. 99, č. 1 (2014), s. 87-99 ISSN 1211-8788 Institutional support: RVO:67985807 Keywords : Mendel * history of genetics * Mendel-Fisher controversy * statistical analysis * binomial distribution * numerical simulation Subject RIV: BB - Applied Statistics, Operational Research http://www.mzm.cz/fileadmin/user_upload/publikace/casopisy/amm_sb_99_1_2014/08kalina.pdf

  3. Meson phase space density from interferometry

    International Nuclear Information System (INIS)

    Bertsch, G.F.

    1993-01-01

    The interferometric analysis of meson correlations provides a measure of the average phase space density of the mesons in the final state. The quantity is a useful indicator of the statistical properties of the system, and it can be extracted with a minimum of model assumptions. Values obtained from recent measurements are consistent with the thermal value, but do not rule out superradiance effects

  4. Multivariate statistical methods and data mining in particle physics (4/4)

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.
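Kernel-based PDE, one of the listed methods, can be sketched in one dimension: estimate p(x|signal) and p(x|background) with Gaussian kernels over the training samples and classify by the larger density (equivalently, cut on their likelihood ratio). The bandwidth and the toy training events below are illustrative assumptions:

```python
import math

# Kernel density estimate with Gaussian kernels of bandwidth h.
def kde(x, sample, h=0.5):
    norm = 1.0 / (len(sample) * h * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in sample)

signal = [1.8, 2.0, 2.2, 2.1, 1.9]        # toy signal training events
background = [-0.1, 0.0, 0.3, 0.2, -0.2]  # toy background training events

def classify(x):
    # decide by comparing the two estimated class densities at x
    return "signal" if kde(x, signal) > kde(x, background) else "background"

print(classify(2.0), classify(0.1))
```

In a real HEP analysis the same construction runs in many variables at once, which is where the trade-offs discussed in the lectures (statistical power versus computation and systematics) appear.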

  5. Multivariate statistical methods and data mining in particle physics (2/4)

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.

  6. Multivariate statistical methods and data mining in particle physics (1/4)

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    The lectures will cover multivariate statistical methods and their applications in High Energy Physics. The methods will be viewed in the framework of a statistical test, as used e.g. to discriminate between signal and background events. Topics will include an introduction to the relevant statistical formalism, linear test variables, neural networks, probability density estimation (PDE) methods, kernel-based PDE, decision trees and support vector machines. The methods will be evaluated with respect to criteria relevant to HEP analyses such as statistical power, ease of computation and sensitivity to systematic effects. Simple computer examples that can be extended to more complex analyses will be presented.

  7. Lower hybrid current drive at ITER-relevant high plasma densities

    International Nuclear Information System (INIS)

    Cesario, R.; Amicucci, L.; Cardinali, A.; Castaldo, C.; Marinucci, M.; Panaccione, L.; Pericoli-Ridolfini, V.; Tuccillo, A. A.; Tudisco, O.; Calabro, G.

    2009-01-01

    Recent experiments indicated that a further non-inductive current, besides bootstrap, should be necessary for developing advanced scenario for ITER. The lower hybrid current drive (LHCD) should provide such tool, but its effectiveness was still not proved in operations with ITER-relevant density of the plasma column periphery. Progress of the LH deposition modelling is presented, performed considering the wave physics of the edge, and different ITER-relevant edge parameters. Operations with relatively high edge electron temperatures are expected to reduce the LH || spectral broadening and, consequently, enabling the LH power to propagate also in high density plasmas ( || is the wavenumber component aligned to the confinement magnetic field). New results of FTU experiments are presented, performed by following the aforementioned modeling: they indicate that, for the first time, the LHCD conditions are established by operating at ITER-relevant high edge densities.

  8. Statistical hydrodynamics of lattice-gas automata

    OpenAIRE

    Grosfils, Patrick; Boon, Jean-Pierre; Brito López, Ricardo; Ernst, M. H.

    1993-01-01

    We investigate the space and time behavior of spontaneous thermohydrodynamic fluctuations in a simple fluid modeled by a lattice-gas automaton and develop the statistical-mechanical theory of thermal lattice gases to compute the dynamical structure factor, i.e., the power spectrum of the density correlation function. A comparative analysis of the theoretical predictions with our lattice gas simulations is presented. The main results are (i) the spectral function of the lattice-gas fluctuation...

  9. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering.  Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanat ory frameworks, but become powerfu...

  10. Spectral statistics in chiral-orthogonal disordered systems

    International Nuclear Information System (INIS)

    Evangelou, S N; Katsanos, D E

    2003-01-01

    We describe the singularities in the averaged density of states and the corresponding statistics of the energy levels in two- (2D) and three-dimensional (3D) chiral symmetric and time-reversal invariant disordered systems, realized in bipartite lattices with real off-diagonal disorder. For off-diagonal disorder of zero mean, we obtain a singular density of states in 2D which becomes much less pronounced in 3D, while the level statistics can be described by a semi-Poisson distribution with mostly critical fractal states in 2D and the Wigner surmise with mostly delocalized states in 3D. For logarithmic off-diagonal disorder of large strength, we find behaviour indistinguishable from ordinary disorder with strong localization in any dimension but in addition one-dimensional 1/|E| Dyson-like asymptotic spectral singularities. The off-diagonal disorder is also shown to enhance the propagation of two interacting particles similarly to systems with diagonal disorder. Although disordered models with chiral symmetry differ from non-chiral ones due to the presence of spectral singularities, both share the same qualitative localization properties except at the chiral symmetry point E=0 which is critical
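The spacing statistics mentioned here (Poisson for localized states, semi-Poisson for critical states, the Wigner surmise for delocalized states) have standard closed forms, each normalized and with unit mean spacing. A small sketch, with a helper (our own, for illustration) that extracts normalized nearest-neighbour spacings from a spectrum:

```python
import math

# Reference nearest-neighbour spacing distributions (s in units of the mean):
def poisson(s):        # uncorrelated levels (strong localization)
    return math.exp(-s)

def semi_poisson(s):   # intermediate / critical statistics
    return 4.0 * s * math.exp(-2.0 * s)

def wigner(s):         # GOE-like level repulsion (delocalized states)
    return (math.pi / 2.0) * s * math.exp(-math.pi * s * s / 4.0)

def spacings(levels):
    # normalized nearest-neighbour spacings of a sorted spectrum
    levels = sorted(levels)
    gaps = [b - a for a, b in zip(levels, levels[1:])]
    mean = sum(gaps) / len(gaps)
    return [g / mean for g in gaps]

# all three densities integrate to 1 (checked by a crude Riemann sum)
for p in (poisson, semi_poisson, wigner):
    total = sum(p(i * 0.001) * 0.001 for i in range(20000))
    print(round(total, 2))
```

Note the small-s behaviour that distinguishes them: Poisson is finite at s=0 (no level repulsion), while both semi-Poisson and Wigner vanish linearly.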

  11. Baryon density in alternative BBN models

    International Nuclear Information System (INIS)

    Kirilova, D.

    2002-10-01

    We present recent determinations of the cosmological baryon density ρ_b, extracted from different kinds of observational data. The baryon density range is not very wide and is usually interpreted as an indication of consistency. It is interesting to note that all other determinations give a higher baryon density than the standard big bang nucleosynthesis (BBN) model. The differences of the ρ_b values from the BBN-predicted one (the most precise today) may be due to the statistical and systematic errors in observations. However, they may be an indication of new physics. Hence, it is interesting to study alternative BBN models, and the possibility to resolve the discrepancies. We discuss alternative cosmological scenarios: a BBN model with decaying particles (m ∼ MeV, τ ∼ sec) and BBN with electron-sterile neutrino oscillations, which permit relaxing the BBN constraints on the baryon content of the Universe. (author)

  12. High-Density Signal Interface Electromagnetic Radiation Prediction for Electromagnetic Compatibility Evaluation.

    Energy Technology Data Exchange (ETDEWEB)

    Halligan, Matthew

    2017-11-01

    Radiated power calculation approaches for practical scenarios of incomplete high-density interface characterization information and incomplete incident power information are presented. The suggested approaches build upon a method that characterizes power losses through the definition of power loss constant matrices. Potential radiated power estimates include using total power loss information, partial radiated power loss information, worst case analysis, and statistical bounding analysis. A method is also proposed to calculate radiated power when incident power information is not fully known for non-periodic signals at the interface. Incident data signals are modeled from a two-state Markov chain where bit state probabilities are derived. The total spectrum for windowed signals is postulated as the superposition of spectra from individual pulses in a data sequence. Statistical bounding methods are proposed as a basis for the radiated power calculation due to the complexity of finding a radiated power probability density function statistically.
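The two-state Markov chain model for incident data signals can be sketched as follows. The transition probabilities are invented for illustration; the stationary bit-state probability follows from the balance equation of the chain:

```python
import random

# Two-state Markov chain for data-signal bits. p01 = P(next=1 | current=0),
# p10 = P(next=0 | current=1); these values are illustrative only.
def simulate_bits(n, p01=0.3, p10=0.2, seed=42):
    rng = random.Random(seed)
    bit, bits = 0, []
    for _ in range(n):
        if bit == 0:
            bit = 1 if rng.random() < p01 else 0
        else:
            bit = 0 if rng.random() < p10 else 1
        bits.append(bit)
    return bits

def stationary_p1(p01, p10):
    # solve pi1 = pi0*p01 + pi1*(1 - p10) with pi0 + pi1 = 1
    return p01 / (p01 + p10)

bits = simulate_bits(200000)
empirical = sum(bits) / len(bits)
print(round(empirical, 2), round(stationary_p1(0.3, 0.2), 2))
```

The derived bit-state probabilities are what feed the superposed pulse spectra and, ultimately, the statistical bounds on radiated power.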

  13. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  14. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    The article is devoted to the issues of statistical analysis of results of computer-based testing for evaluation of educational achievements of students. The issues are relevant due to the fact that computer-based testing in Russian universities has become an important method for evaluation of educational achievements of students and quality of the modern educational process. Usage of modern methods and programs for statistical analysis of results of computer-based testing and assessment of quality of developed tests is a pressing problem for every university teacher. The article shows how the authors solve this problem using their own program “StatInfo”. For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database, formation of queries, generation of reports, lists, and matrices of answers for statistical analysis of quality of test items. Methodology, experience and some results of its usage by university teachers are described in the article. Related topics of test development, models, algorithms, technologies, and software for large-scale computer-based testing have been discussed by the authors in their previous publications, which are presented in the reference list.

  15. Successful operation of continuous reactors at short retention times results in high-density, fast-rate Dehalococcoides dechlorinating cultures.

    Science.gov (United States)

    Delgado, Anca G; Fajardo-Williams, Devyn; Popat, Sudeep C; Torres, César I; Krajmalnik-Brown, Rosa

    2014-03-01

    The discovery of Dehalococcoides mccartyi reducing perchloroethene and trichloroethene (TCE) to ethene was a key landmark for bioremediation applications at contaminated sites. D. mccartyi-containing cultures are typically grown in batch-fed reactors. On the other hand, continuous cultivation of these microorganisms has been described only at long hydraulic retention times (HRTs). We report the cultivation of a representative D. mccartyi-containing culture in continuous stirred-tank reactors (CSTRs) at a short, 3-d HRT, using TCE as the electron acceptor. We successfully operated 3-d HRT CSTRs for up to 120 days and observed sustained dechlorination of TCE at influent concentrations of 1 and 2 mM TCE to ≥97% ethene, coupled to the production of 10^12 D. mccartyi cells L_culture^-1. These outcomes were possible in part by using a medium with low bicarbonate concentrations (5 mM) to minimize the excessive proliferation of microorganisms that use bicarbonate as an electron acceptor and compete with D. mccartyi for H2. The maximum conversion rates for the CSTR-produced culture were 0.13 ± 0.016, 0.06 ± 0.018, and 0.02 ± 0.007 mmol Cl^- L_culture^-1 h^-1, respectively, for TCE, cis-dichloroethene, and vinyl chloride. The CSTR operation described here provides the fastest laboratory cultivation rate of high-cell-density Dehalococcoides cultures reported in the literature to date. This cultivation method provides a fundamental scientific platform for potential future operations of such a system at larger scales.

  16. Statistical margin to DNB safety analysis approach for LOFT

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1982-01-01

    A method was developed and used for LOFT thermal safety analysis to estimate the statistical margin to DNB for the hot rod, and to base safety analysis on desired DNB probability limits. This method is an advanced approach using response surface analysis methods, a very efficient experimental design, and a 2nd-order response surface equation with a 2nd-order error propagation analysis to define the MDNBR probability density function. Calculations for limiting transients were used in the response surface analysis thereby including transient interactions and trip uncertainties in the MDNBR probability density

  17. The contributions of breast density and common genetic variation to breast cancer risk.

    Science.gov (United States)

    Vachon, Celine M; Pankratz, V Shane; Scott, Christopher G; Haeberle, Lothar; Ziv, Elad; Jensen, Matthew R; Brandt, Kathleen R; Whaley, Dana H; Olson, Janet E; Heusinger, Katharina; Hack, Carolin C; Jud, Sebastian M; Beckmann, Matthias W; Schulz-Wendtland, Ruediger; Tice, Jeffrey A; Norman, Aaron D; Cunningham, Julie M; Purrington, Kristen S; Easton, Douglas F; Sellers, Thomas A; Kerlikowske, Karla; Fasching, Peter A; Couch, Fergus J

    2015-05-01

    We evaluated whether a 76-locus polygenic risk score (PRS) and Breast Imaging Reporting and Data System (BI-RADS) breast density were independent risk factors within three studies (1643 case patients, 2397 control patients) using logistic regression models. We incorporated the PRS odds ratio (OR) into the Breast Cancer Surveillance Consortium (BCSC) risk-prediction model while accounting for its attributable risk and compared five-year absolute risk predictions between models using area under the curve (AUC) statistics. All statistical tests were two-sided. BI-RADS density and PRS were independent risk factors across all three studies (P_interaction = .23). Relative to those with scattered fibroglandular densities and average PRS (2nd quartile), women with extreme density and highest-quartile PRS had 2.7-fold (95% confidence interval [CI] = 1.74 to 4.12) increased risk, while those with low density and PRS had reduced risk (OR = 0.30, 95% CI = 0.18 to 0.51). PRS added independent information (P …). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
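Independence of the two risk factors in a logistic model means their contributions add on the log-odds scale, i.e. their odds ratios multiply. A sketch of that arithmetic with hypothetical component odds ratios (chosen only to show the mechanics, not taken from the study):

```python
import math

# Under independence (no interaction), log-odds contributions add, so
# odds ratios relative to a common reference category multiply.
def combined_or(or_density, or_prs):
    return math.exp(math.log(or_density) + math.log(or_prs))

# Hypothetical component ORs for density and PRS versus the reference
# (scattered densities, 2nd-quartile PRS):
high = combined_or(1.8, 1.5)  # high density x high PRS
low = combined_or(0.6, 0.5)   # low density x low PRS
print(round(high, 2), round(low, 2))
```

With these illustrative inputs the combined ORs come out to 2.7 and 0.30, showing how joint categories like those reported can arise from two multiplicative independent factors.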

  18. Phase flow and statistical structure of Galton-board systems

    International Nuclear Information System (INIS)

    Lue, A.; Brenner, H.

    1993-01-01

    Galton boards, found in museum exhibits devoted to science and technology, are often used to demonstrate visually the ubiquity of so-called ''laws of probability'' via an experimental realization of normal distributions. A detailed theoretical study of Galton-board phase-space dynamics and statistical behavior is presented. The study is based on a simple inelastic-collision model employing a particle falling through a spatially periodic lattice of rigid, convex scatterers. We show that such systems exhibit indeterminate behavior through the presence of strange attractors or strange repellers in phase space; nevertheless, we also show that these systems exhibit regular and predictable behavior under specific circumstances. Phase-space strange attractors, periodic attractors, and strange repellers are present in numerical simulations, confirming results anticipated from geometric analysis. The system's geometry (dictated by lattice geometry and density as well as the direction of gravity) is observed to play a dominant role in stability, phase-flow topology, and statistical observations. Smale horseshoes appear to exist in the low-lattice-density limit and may exist in other regimes. These horseshoes are generated by homoclinic orbits whose existence is dictated by system characteristics. The horseshoes lead directly to deterministic chaos in the system. Strong evidence exists for ergodicity in all attractors. Phase-space complexities are manifested at all observed levels, particularly statistical ones. Consequently, statistical observations are critically dependent upon system details. Under well-defined circumstances, these observations display behavior which does not constitute a realization of the ''laws of probability.''
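For contrast with the deterministic inelastic-collision model studied here, the idealized textbook Galton board is memoryless: each row deflects the bead left or right with probability 1/2, so the final bin is binomially distributed and approaches a normal distribution for many rows. A quick simulation of that idealized picture:

```python
import random

# An idealized, memoryless Galton board: the bin index is the number of
# rightward deflections over n_rows independent fair coin flips per bead.
def galton(n_beads=20000, n_rows=10, seed=3):
    rng = random.Random(seed)
    bins = [0] * (n_rows + 1)
    for _ in range(n_beads):
        k = sum(rng.random() < 0.5 for _ in range(n_rows))
        bins[k] += 1
    return bins

bins = galton()
mean_bin = sum(k * c for k, c in enumerate(bins)) / sum(bins)
print(bins)
print(round(mean_bin, 1))  # binomial mean n_rows/2, here close to 5
```

The paper's point is precisely that a physically faithful board need not reproduce this binomial/normal picture: geometry and inelastic collisions can make the statistics depend critically on system details.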

  19. Statistics on exponential averaging of periodograms

    Energy Technology Data Exchange (ETDEWEB)

    Peeters, T.T.J.M. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Ciftcioglu, Oe. [Istanbul Technical Univ. (Turkey). Dept. of Electrical Engineering

    1994-11-01

    The algorithm of exponential averaging applied to subsequent periodograms of a stochastic process is used to estimate the power spectral density (PSD). For an independent process, assuming the periodogram estimates to be distributed according to a χ² distribution with 2 degrees of freedom, the probability density function (PDF) of the PSD estimate is derived. A closed expression is obtained for the moments of the distribution. Surprisingly, the proof of this expression features some new insights into the partitions and Euler's infinite product. For large values of the time constant of the averaging process, examination of the cumulant generating function shows that the PDF approximates the Gaussian distribution. Although restrictions for the statistics are seemingly tight, simulation of a real process indicates a wider applicability of the theory. (orig.).
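The exponential averaging recursion described in this record can be sketched numerically; the following is a minimal illustration with white noise (segment length and time constant are arbitrary choices, not those of the ECN study):

```python
import numpy as np

def exp_avg_psd(x, nseg=256, alpha=0.05):
    """Exponential averaging of subsequent periodograms:
    S_k = (1 - alpha) * S_{k-1} + alpha * P_k,
    where P_k is the raw periodogram of the k-th segment and
    1/alpha plays the role of the averaging time constant."""
    s = None
    for k in range(len(x) // nseg):
        seg = x[k * nseg:(k + 1) * nseg]
        p = np.abs(np.fft.rfft(seg)) ** 2 / nseg  # raw periodogram, chi^2-like per bin
        s = p if s is None else (1 - alpha) * s + alpha * p
    return s

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)   # white noise: flat true PSD
s = exp_avg_psd(x)
print(float(s.mean()))             # close to 1 for unit-variance white noise
```

Each raw periodogram bin fluctuates like a χ² variable with 2 degrees of freedom; the exponential average suppresses that variance by roughly a factor α/(2 − α), consistent with the near-Gaussian limit discussed in the abstract.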

  20. Statistics on exponential averaging of periodograms

    International Nuclear Information System (INIS)

    Peeters, T.T.J.M.; Ciftcioglu, Oe.

    1994-11-01

    The algorithm of exponential averaging applied to subsequent periodograms of a stochastic process is used to estimate the power spectral density (PSD). For an independent process, assuming the periodogram estimates to be distributed according to a χ² distribution with 2 degrees of freedom, the probability density function (PDF) of the PSD estimate is derived. A closed expression is obtained for the moments of the distribution. Surprisingly, the proof of this expression features some new insights into the partitions and Euler's infinite product. For large values of the time constant of the averaging process, examination of the cumulant generating function shows that the PDF approximates the Gaussian distribution. Although restrictions for the statistics are seemingly tight, simulation of a real process indicates a wider applicability of the theory. (orig.)

  1. Value of information to improve daily operations in high-density logistics

    NARCIS (Netherlands)

    Viet, Nguyen Quoc; Behdani, Behzad; Bloemhof, Jacqueline

    2018-01-01

    Agro-food logistics is increasingly challenged to ensure that a wide variety of high-quality products are always available at retail stores. This paper discusses high-density logistics issues caused by more frequent and smaller orders from retailers. Through a case study of the distribution process

  2. Statistical decisions under nonparametric a priori information

    International Nuclear Information System (INIS)

    Chilingaryan, A.A.

    1985-01-01

    The basic module of the applied program package for statistical analysis of the ANI experiment data is described. By means of this module, the tasks of choosing the theoretical model which most adequately fits the experimental data, selecting events of a definite type, and identifying elementary particles are carried out. To solve these problems, Bayesian rules, the leave-one-out test and KNN (K Nearest Neighbour) adaptive density estimation are utilized
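The KNN adaptive density estimation mentioned here can be illustrated in one dimension; this is a generic sketch (the choice of k and the sample are arbitrary), not the ANI package code:

```python
import numpy as np

def knn_density(x, sample, k=50):
    """k-nearest-neighbour density estimate at point x (1-D):
    f(x) ~= k / (n * 2 * r_k), where r_k is the distance from x
    to its k-th nearest sample point, so the effective bandwidth
    adapts to the local density of the data."""
    sample = np.asarray(sample)
    n = len(sample)
    r_k = np.partition(np.abs(sample - x), k - 1)[k - 1]
    return k / (n * 2.0 * r_k)

rng = np.random.default_rng(1)
data = rng.standard_normal(5000)
est = knn_density(0.0, data)   # true N(0,1) density at 0 is ~0.399
```

Unlike a fixed-bandwidth histogram, the neighbourhood shrinks where data are dense and widens where they are sparse, which is the "adaptive" property exploited in classification tasks like the one described.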

  3. Variability of footprint ridge density and its use in estimation of sex in forensic examinations.

    Science.gov (United States)

    Krishan, Kewal; Kanchan, Tanuj; Pathania, Annu; Sharma, Ruchika; DiMaggio, John A

    2015-10-01

    The present study deals with a comparatively new biometric parameter of footprints called footprint ridge density. The study attempts to evaluate sex-dependent variations in ridge density in different areas of the footprint and its usefulness in discriminating sex in the young adult population of north India. The sample for the study consisted of 160 young adults (121 females) from north India. The left and right footprints were taken from each subject according to the standard procedures. The footprints were analysed using a 5 mm × 5 mm square and the ridge density was calculated in four different well-defined areas of the footprints. These were: F1 - the great toe on its proximal and medial side; F2 - the medial ball of the footprint, below the triradius (the triradius is a Y-shaped group of ridges on finger balls, palms and soles which forms the basis of ridge counting in identification); F3 - the lateral ball of the footprint, towards the most lateral part; and F4 - the heel in its central part where the maximum breadth at heel is cut by a perpendicular line drawn from the most posterior point on heel. This value represents the number of ridges in a 25 mm² area and reflects the ridge density value. Ridge densities analysed on different areas of footprints were compared with each other using the Friedman test for related samples. The total footprint ridge density was calculated as the sum of the ridge density in the four areas of footprints included in the study (F1 + F2 + F3 + F4). The results show that the mean footprint ridge density was higher in females than males in all the designated areas of the footprints. The sex differences in footprint ridge density were observed to be statistically significant in the analysed areas of the footprint, except for the heel region of the left footprint. The total footprint ridge density was also observed to be significantly higher among females than males. A statistically significant correlation

  4. Teaching Statistics from the Operating Table: Minimally Invasive and Maximally Educational

    Science.gov (United States)

    Nowacki, Amy S.

    2015-01-01

    Statistics courses that focus on data analysis in isolation, discounting the scientific inquiry process, may not motivate students to learn the subject. By involving students in other steps of the inquiry process, such as generating hypotheses and data, students may become more interested and vested in the analysis step. Additionally, such an…

  5. The algebraic geometry of Harper operators

    Science.gov (United States)

    Li, Dan

    2011-10-01

    Following an approach developed by Gieseker, Knörrer and Trubowitz for discretized Schrödinger operators, we study the spectral theory of Harper operators in dimensions 2 and 1, as a discretized model of magnetic Laplacians, from the point of view of algebraic geometry. We describe the geometry of an associated family of Bloch varieties and compute their density of states. Finally, we also compute some spectral functions based on the density of states. We discuss the difference between the cases with rational or irrational parameters: for the two-dimensional Harper operator, the compactification of the Bloch variety is an ordinary variety in the rational case and an ind-pro-variety in the irrational case. This gives rise, at the algebro-geometric level of Bloch varieties, to a phenomenon similar to the Hofstadter butterfly in the spectral theory. In dimension 2, the density of states can be expressed in terms of period integrals over Fermi curves, where the resulting elliptic integrals are independent of the parameters. In dimension 1, for the almost Mathieu operator, with a similar argument, we find the usual dependence of the spectral density on the parameter, which gives rise to the well-known Hofstadter butterfly picture.
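The one-dimensional almost Mathieu operator mentioned above can be probed numerically by diagonalizing a finite truncation; a minimal sketch (chain length, boundary conditions and λ = 1 are arbitrary choices, not taken from the paper):

```python
import numpy as np

def almost_mathieu_spectrum(alpha, theta=0.0, lam=1.0, n=200):
    """Eigenvalues of the almost Mathieu operator
    (H psi)_m = psi_{m+1} + psi_{m-1} + 2*lam*cos(2*pi*(alpha*m + theta))*psi_m,
    truncated to an n-site chain with open boundaries."""
    diag = 2 * lam * np.cos(2 * np.pi * (alpha * np.arange(n) + theta))
    h = np.diag(diag) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.linalg.eigvalsh(h)

# Rational flux alpha = p/q splits the spectrum into q bands; sweeping
# alpha and stacking the spectra traces out the Hofstadter butterfly.
ev = almost_mathieu_spectrum(alpha=1 / 3)
```

For λ = 1 the spectrum lies within [−4, 4]; plotting `ev` against a sweep of α reproduces the butterfly structure the abstract refers to.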

  6. Density meter algorithm and system for estimating sampling/mixing uncertainty

    International Nuclear Information System (INIS)

    Shine, E.P.

    1986-01-01

    The Laboratories Department at the Savannah River Plant (SRP) has installed a six-place density meter with an automatic sampling device. This paper describes the statistical software developed to analyze the density of uranyl nitrate solutions using this automated system. The purpose of this software is twofold: to estimate the sampling/mixing and measurement uncertainties in the process and to provide a measurement control program for the density meter. Non-uniformities in density are analyzed both analytically and graphically. The mean density and its limit of error are estimated. Quality control standards are analyzed concurrently with process samples and used to control the density meter measurement error. The analyses are corrected for concentration due to evaporation of samples waiting to be analyzed. The results of this program have been successful in identifying sampling/mixing problems and controlling the quality of analyses

  7. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  8. Statistical clustering of primordial black holes

    Energy Technology Data Exchange (ETDEWEB)

    Carr, B J [Cambridge Univ. (UK). Inst. of Astronomy

    1977-04-01

    It is shown that Meszaros' theory of galaxy formation, in which galaxies form from the density perturbations associated with the statistical fluctuation in the number density of primordial black holes, must be modified if the black holes are initially surrounded by regions of lower radiation density than average (as is most likely). However, even in this situation, the sort of effect Meszaros envisages does occur and could in principle cause galactic mass-scales to bind at the conventional time. In fact, the requirement that galaxies should not form prematurely implies that black holes could not have a critical density in the mass range above 10⁵ M☉. If the mass spectrum of primordial black holes falls off more slowly than m⁻³ (as expected), then the biggest black holes have the largest clustering effect. In this case the black hole clustering theory of galaxy formation reduces to the black hole seed theory of galaxy formation, in which each galaxy becomes bound under the gravitational influence of a single black hole nucleus. The seed theory could be viable only if the early Universe had a soft equation of state until a time exceeding 10⁻⁴ s or if something prevented black hole formation before 1 s.

  9. On the relation between the statistical γ-decay and the level density in 162Dy

    International Nuclear Information System (INIS)

    Henden, L.; Bergholt, L.; Guttormsen, M.; Rekstad, J.; Tveter, T.S.

    1994-12-01

    The level density of low-spin states (0–10ℏ) in ¹⁶²Dy has been determined from the ground state up to approximately 6 MeV of excitation energy. Levels in the excitation region up to 8 MeV were populated by means of the ¹⁶³Dy(³He, α) reaction, and the first-generation γ-rays in the decay of these states have been isolated. The energy distribution of the first-generation γ-rays provides a new source of information about the nuclear level density over a wide energy region. A broad peak is observed in the first-generation spectra, and the authors suggest an interpretation in terms of enhanced M1 transitions between different high-j Nilsson orbitals. 30 refs., 9 figs., 2 tabs

  10. Imaging investigation of metabolic and endocrine bone disease of vertebral density

    International Nuclear Information System (INIS)

    Cai Yuezeng; Tian Xiali; Li Jingxue

    2006-01-01

    Objective: To probe the vertebral density imaging features of metabolic and endocrine bone disease and to characterize the regional distribution of bone trabeculae in the sandwich spine. Methods: Thirty-six patients with abnormal bone density on radiograms were collected in this study. Twelve patients with sandwich spine underwent lumbar CT scan. Thirty-two healthy volunteers serving as a control group underwent lumbar CT scan as well. CT values of the two groups were measured from different portions of the vertebral body and then analysed. Twenty-two patients underwent dual-energy x-ray absorptiometry (DXA). One patient underwent bone histomorphometry. Results: Abnormal density included decreased and increased density. Decreased density was found in different portions in all patients, and was divided into general and regional types. Increased density was obvious in vertebrae, including diffusely increased density and sandwich spine. The mean CT values of the superior, middle and inferior portions of the sandwich vertebral body were (259.94±18.08), (182.96±34.85) and (270.34±19.40) HU. The mean CT values of both the superior and inferior portions of the sandwich vertebral body were higher than those of the control group. The mean CT values of the superior and inferior portions of the sandwich spine were higher than that of the middle portion. The difference in mean CT values between the superior and inferior portions had no statistical significance. The difference in CT values among the regions of the superior and inferior portions had no statistical significance (F=0.457, 0.462, P>0.05). The difference in CT values among the regions of the middle portion had statistical significance (F=4.539, P<0.05). The DXA measurement of sandwich spine showed high, normal and low BMD. Conclusion: It is useful to measure the superior and inferior portions of the sandwich vertebral body if QCT is performed. The sandwich spine sign can be used as an imaging index of state evaluation. Increased density in

  11. Advanced intermediate temperature sodium-nickel chloride batteries with ultra-high energy density

    Science.gov (United States)

    Li, Guosheng; Lu, Xiaochuan; Kim, Jin Y.; Meinhardt, Kerry D.; Chang, Hee Jung; Canfield, Nathan L.; Sprenkle, Vincent L.

    2016-02-01

    Sodium-metal halide batteries have been considered as one of the more attractive technologies for stationary electrical energy storage; however, they are not used for broader applications despite their relatively well-known redox system. One of the roadblocks hindering market penetration is the high operating temperature. Here we demonstrate that planar sodium-nickel chloride batteries can be operated at an intermediate temperature of 190 °C with ultra-high energy density. A specific energy density of 350 Wh kg⁻¹, higher than that of conventional tubular sodium-nickel chloride batteries (280 °C), is obtained for planar sodium-nickel chloride batteries operated at 190 °C over a long-term cell test (1,000 cycles), which is attributed to the slower particle growth of the cathode materials at the lower operating temperature. Results reported here demonstrate that planar sodium-nickel chloride batteries operated at an intermediate temperature could greatly benefit this traditional energy storage technology by improving battery energy density, cycle life and reducing material costs.

  12. Comparison of Breast Density Between Synthesized Versus Standard Digital Mammography.

    Science.gov (United States)

    Haider, Irfanullah; Morgan, Matthew; McGow, Anna; Stein, Matthew; Rezvani, Maryam; Freer, Phoebe; Hu, Nan; Fajardo, Laurie; Winkler, Nicole

    2018-06-12

    To evaluate perceptual difference in breast density classification using synthesized mammography (SM) compared with standard or full-field digital mammography (FFDM) for screening. This institutional review board-approved, retrospective, multireader study evaluated breast density in 200 patients who underwent baseline screening mammograms during which both SM and FFDM were obtained contemporaneously from June 1, 2016, through November 30, 2016. Qualitative breast density was independently assigned by seven readers initially evaluating FFDM alone. Then, in a separate session, these same readers assigned breast density using synthetic views alone on the same 200 patients, again blinded to each other's assignments. Qualitative density assessment was based on BI-RADS fifth edition. Interreader agreement was evaluated with the κ statistic using 95% confidence intervals. Testing for homogeneity in paired proportions was performed using McNemar's test with a level of significance of .05. Across the SM and standard 2-D data sets, McNemar's test (P = .32) demonstrated that the minimal density transitions between FFDM and SM are not statistically significant density shifts. Taking clinical significance into account, only 8 of 200 (4%) patients had a clinically significant transition (dense versus not dense). There was substantial interreader agreement, with an overall κ in FFDM of 0.71 (minimum 0.53, maximum 0.81) and an overall SM κ average of 0.63 (minimum 0.56, maximum 0.87). Overall, subjective breast density assignment by radiologists on SM is similar to density assignment on standard 2-D mammograms.
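The interreader agreement statistic used in this study (Cohen's κ) is simple to compute directly; a minimal sketch with hypothetical toy labels, not the study's data:

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    a, b = np.asarray(a), np.asarray(b)
    po = (a == b).mean()                            # observed agreement
    pe = sum((a == c).mean() * (b == c).mean()      # agreement expected by chance
             for c in np.union1d(a, b))
    return (po - pe) / (1 - pe)

# Hypothetical BI-RADS density categories (a-d) from two readers.
r1 = ["a", "b", "b", "c", "d", "b", "c", "c"]
r2 = ["a", "b", "c", "c", "d", "b", "b", "c"]
kappa = cohens_kappa(r1, r2)   # ~0.64 for these toy labels
```

Values around 0.6-0.8, like the overall κ of 0.71 (FFDM) and 0.63 (SM) reported above, are conventionally read as substantial agreement.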

  13. Football goal distributions and extremal statistics

    Science.gov (United States)

    Greenhough, J.; Birch, P. C.; Chapman, S. C.; Rowlands, G.

    2002-12-01

    We analyse the distributions of the number of goals scored by home teams, away teams, and the total scored in the match, in domestic football games from 169 countries between 1999 and 2001. The probability density functions (PDFs) of goals scored are too heavy-tailed to be fitted over their entire ranges by Poisson or negative binomial distributions which would be expected for uncorrelated processes. Log-normal distributions cannot include zero scores and here we find that the PDFs are consistent with those arising from extremal statistics. In addition, we show that it is sufficient to model English top division and FA Cup matches in the seasons of 1970/71-2000/01 on Poisson or negative binomial distributions, as reported in analyses of earlier seasons, and that these are not consistent with extremal statistics.
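The Poisson versus negative binomial comparison made in this record can be sketched on synthetic counts (a gamma-Poisson mixture, which is exactly negative binomial, stands in for the goal data; the fit uses a simple method of moments, not the authors' procedure):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic over-dispersed counts standing in for goals per match.
goals = rng.poisson(rng.gamma(shape=2.0, scale=0.7, size=2000))

# Poisson MLE: lambda-hat is simply the sample mean.
lam = goals.mean()
ll_pois = stats.poisson.logpmf(goals, lam).sum()

# Negative binomial by the method of moments (requires variance > mean).
m, v = goals.mean(), goals.var(ddof=1)
p = m / v
r = m * p / (1 - p)
ll_nb = stats.nbinom.logpmf(goals, r, p).sum()
# ll_nb exceeds ll_pois: the heavier tail fits over-dispersed counts better.
```

Comparing log-likelihoods (or tail fits) in this way is how one sees that heavy-tailed score distributions reject the Poisson model over their full range, as the abstract reports.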

  14. Applications of quantum entropy to statistics

    International Nuclear Information System (INIS)

    Silver, R.N.; Martz, H.F.

    1994-01-01

    This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles, such as conservation of information and smoothness, are proposed to set statistical regularization and other hyperparameters. ME provides an alternative to hierarchical Bayes methods
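The von Neumann entropy that replaces Shannon entropy here is straightforward to evaluate from a density matrix; a minimal numerical sketch (not the authors' method):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues
    of the (Hermitian, unit-trace) density matrix rho."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                  # 0 * log 0 -> 0 by convention
    return float(-(w * np.log(w)).sum())

rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: S = 0
rho_mixed = np.eye(2) / 2                       # maximally mixed qubit: S = ln 2
print(von_neumann_entropy(rho_pure), von_neumann_entropy(rho_mixed))
```

For diagonal density matrices this reduces to the Shannon entropy of the eigenvalue distribution, which is why the quantum form strictly generalizes the classical ME setting.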

  15. The Effect of Obesity on Bone Mineral Density in Primary Fibromyalgia Cases - Original Investigation

    Directory of Open Access Journals (Sweden)

    Bahadır Yesevi

    2005-12-01

    Fibromyalgia is a chronic musculoskeletal disease of unknown etiology, characterized by tender points in various areas of the body and widespread musculoskeletal pain, in which metabolic, immunologic and neuroendocrine abnormalities are seen. In this study, 45 female patients were enrolled according to the 1990 ACR fibromyalgia criteria. They were divided into 3 groups of 15 patients each; normal, preobese and obese, depending on body mass index. Bone mineral density of the lumbar spine and femur was measured using dual-energy x-ray absorptiometry. The presence of depression was investigated with the Hamilton Depression Scale. The bone mineral density of the L1-4 region in fibromyalgia patients of normal body weight was within the normal range, and there was no statistically significant difference between the groups. In contrast, femur bone mineral density values were found to be statistically significantly osteopenic compared with the obese group. There was a negative statistical correlation between depression and lumbar bone mineral density, whereas in the femur bone mineral density was preserved in preobese and obese fibromyalgia patients. The number of studies on this subject is not sufficient, and the patient numbers in current studies are low. Further studies, with larger patient numbers and more detailed protocols, are needed. (Osteoporoz Dünyasından 2005; 4: 148-150

  16. Local Fitting of the Kohn-Sham Density in a Gaussian and Plane Waves Scheme for Large-Scale Density Functional Theory Simulations.

    Science.gov (United States)

    Golze, Dorothea; Iannuzzi, Marcella; Hutter, Jürg

    2017-05-09

    A local resolution-of-the-identity (LRI) approach is introduced in combination with the Gaussian and plane waves (GPW) scheme to enable large-scale Kohn-Sham density functional theory calculations. In GPW, the computational bottleneck is typically the description of the total charge density on real-space grids. Introducing the LRI approximation, the linear scaling of the GPW approach with respect to system size is retained, while the prefactor for the grid operations is reduced. The density fitting is an O(N) scaling process implemented by approximating the atomic pair densities by an expansion in one-center fit functions. The computational cost for the grid-based operations becomes negligible in LRIGPW. The self-consistent field iteration is up to 30 times faster for periodic systems dependent on the symmetry of the simulation cell and on the density of grid points. However, due to the overhead introduced by the local density fitting, single point calculations and complete molecular dynamics steps, including the calculation of the forces, are effectively accelerated by up to a factor of ∼10. The accuracy of LRIGPW is assessed for different systems and properties, showing that total energies, reaction energies, intramolecular and intermolecular structure parameters are well reproduced. LRIGPW yields also high quality results for extended condensed phase systems such as liquid water, ice XV, and molecular crystals.

  17. Diffusion-Based Density-Equalizing Maps: an Interdisciplinary Approach to Visualizing Homicide Rates and Other Georeferenced Statistical Data

    Science.gov (United States)

    Mazzitello, Karina I.; Candia, Julián

    2012-12-01

    In every country, public and private agencies allocate extensive funding to collect large-scale statistical data, which in turn are studied and analyzed in order to determine local, regional, national, and international policies regarding all aspects relevant to the welfare of society. One important aspect of that process is the visualization of statistical data with embedded geographical information, which most often relies on archaic methods such as maps colored according to graded scales. In this work, we apply nonstandard visualization techniques based on physical principles. We illustrate the method with recent statistics on homicide rates in Brazil and their correlation to other publicly available data. This physics-based approach provides a novel tool that can be used by interdisciplinary teams investigating statistics and model projections in a variety of fields such as economics and gross domestic product research, public health and epidemiology, sociodemographics, political science, business and marketing, and many others.

  18. A generalized operational formula based on total electronic densities to obtain 3D pictures of the dual descriptor to reveal nucleophilic and electrophilic sites accurately on closed-shell molecules.

    Science.gov (United States)

    Martínez-Araya, Jorge I

    2016-09-30

    By means of the conceptual density functional theory, the so-called dual descriptor (DD) has been adapted to be used in any closed-shell molecule that presents degeneracy in its frontier molecular orbitals. The latter is of paramount importance because a correct description of local reactivity will allow to predict the most favorable sites on a molecule to undergo nucleophilic or electrophilic attacks; on the contrary, an incomplete description of local reactivity might have serious consequences, particularly for those experimental chemists who need insight into the reactivity of chemical reagents before using them in synthesis to obtain a new compound. In the present work, the old approach based only on electronic densities of frontier molecular orbitals is replaced by the most accurate procedure that implies the use of total electronic densities, thus keeping consistency with the essential principle of DFT in which the electronic density is the fundamental variable and not the molecular orbitals. As a result of the present work, the DD will be able to properly describe local reactivities only in terms of total electronic densities. To test the proposed operational formula, 12 very common molecules were selected as the original definition of the DD was not able to describe their local reactivities properly. The ethylene molecule was additionally used to test the capability of the proposed operational formula to reveal a correct local reactivity even in absence of degeneracy in frontier molecular orbitals. © 2016 Wiley Periodicals, Inc.
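For orientation, the standard finite-difference form of the dual descriptor in terms of total electronic densities (the paper's generalized operational formula extends this to the degenerate-frontier-orbital case) reads:

```latex
\Delta f(\mathbf{r}) \;=\; f^{+}(\mathbf{r}) - f^{-}(\mathbf{r})
\;\approx\; \rho_{N+1}(\mathbf{r}) \;-\; 2\,\rho_{N}(\mathbf{r}) \;+\; \rho_{N-1}(\mathbf{r}),
```

where $f^{+}(\mathbf{r}) = \rho_{N+1}(\mathbf{r}) - \rho_{N}(\mathbf{r})$ and $f^{-}(\mathbf{r}) = \rho_{N}(\mathbf{r}) - \rho_{N-1}(\mathbf{r})$ are the Fukui functions built from the total densities of the $(N\pm1)$- and $N$-electron systems. Regions with $\Delta f(\mathbf{r}) > 0$ are predicted to favor nucleophilic attack (electrophilic sites) and regions with $\Delta f(\mathbf{r}) < 0$ to favor electrophilic attack.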

  19. Density dependent hadron field theory

    International Nuclear Information System (INIS)

    Fuchs, C.; Lenske, H.; Wolter, H.H.

    1995-01-01

    A fully covariant approach to a density dependent hadron field theory is presented. The relation between in-medium NN interactions and field-theoretical meson-nucleon vertices is discussed. The medium dependence of nuclear interactions is described by a functional dependence of the meson-nucleon vertices on the baryon field operators. As a consequence, the Euler-Lagrange equations lead to baryon rearrangement self-energies which are not obtained when only a parametric dependence of the vertices on the density is assumed. It is shown that the approach is energy-momentum conserving and thermodynamically consistent. Solutions of the field equations are studied in the mean-field approximation. Descriptions of the medium dependence in terms of the baryon scalar and vector density are investigated. Applications to infinite nuclear matter and finite nuclei are discussed. Density dependent coupling constants obtained from Dirac-Brueckner calculations with the Bonn NN potentials are used. Results from Hartree calculations for energy spectra, binding energies, and charge density distributions of ¹⁶O, ⁴⁰,⁴⁸Ca, and ²⁰⁸Pb are presented. Comparisons to data strongly support the importance of rearrangement in a relativistic density dependent field theory. Most striking is the simultaneous improvement of charge radii, charge densities, and binding energies. The results indicate the appearance of a new "Coester line" in the nuclear matter equation of state

  20. Modeling of Dissipation Element Statistics in Turbulent Non-Premixed Jet Flames

    Science.gov (United States)

    Denker, Dominik; Attili, Antonio; Boschung, Jonas; Hennig, Fabian; Pitsch, Heinz

    2017-11-01

    The dissipation element (DE) analysis is a method for analyzing and compartmentalizing turbulent scalar fields. DEs can be described by two parameters, namely the Euclidean distance l between their extremal points and the scalar difference in the respective points Δϕ . The joint probability density function (jPDF) of these two parameters P(Δϕ , l) is expected to suffice for a statistical reconstruction of the scalar field. In addition, reacting scalars show a strong correlation with these DE parameters in both premixed and non-premixed flames. Normalized DE statistics show a remarkable invariance towards changes in Reynolds numbers. This feature of DE statistics was exploited in a Boltzmann-type evolution equation based model for the probability density function (PDF) of the distance between the extremal points P(l) in isotropic turbulence. Later, this model was extended for the jPDF P(Δϕ , l) and then adapted for the use in free shear flows. The effect of heat release on the scalar scales and DE statistics is investigated and an extended model for non-premixed jet flames is introduced, which accounts for the presence of chemical reactions. This new model is validated against a series of DNS of temporally evolving jet flames. European Research Council Project "Milestone".

  1. High Power Density Power Electronic Converters for Large Wind Turbines

    DEFF Research Database (Denmark)

    Senturk, Osman Selcuk

    In large wind turbines (in MW and multi-MW ranges), which are extensively utilized in wind power plants, full-scale medium voltage (MV) multi-level (ML) voltage source converters (VSCs) are being more preferably employed nowadays for interfacing these wind turbines with electricity grids. For these VSCs, high power density is required due to limited turbine nacelle space. Also, high reliability is required since maintenance cost of these remotely located wind turbines is quite high and these turbines operate under harsh operating conditions. In order to select a high power density and reliability VSC solution for wind turbines, first, the VSC topology and the switch technology to be employed should be specified such that the highest possible power density and reliability are to be attained. Then, this qualitative approach should be complemented with the power density and reliability

  2. The Statistical Fermi Paradox

    Science.gov (United States)

    Maccone, C.

    In this paper is provided the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy may be shown to be inversely proportional to the cubic root of NHab. This distance is denoted by new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
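The CLT argument behind the SEH is easy to check numerically; a sketch with ten arbitrary positive factors (hypothetical distributions chosen for illustration, not Dole's actual astrobiological factors):

```python
import numpy as np

rng = np.random.default_rng(3)

# Ten independent positive factors with deliberately different shapes;
# the log of their product is a sum of independent terms, so the CLT
# drives the product toward a log-normal distribution.
n = 100_000
factors = ([rng.uniform(0.1, 1.0, n) for _ in range(5)]
           + [rng.lognormal(0.0, 0.5, n) for _ in range(5)])
prod = np.prod(factors, axis=0)

def skewness(y):
    y = y - y.mean()
    return float((y ** 3).mean() / (y ** 2).mean() ** 1.5)

# The product itself is strongly right-skewed, while its logarithm is
# far closer to symmetric, as expected of an (approximately) log-normal
# variable.
skew_prod, skew_log = skewness(prod), skewness(np.log(prod))
```

With only ten factors the convergence is approximate (a residual skew remains in the log), which is consistent with the Lyapunov/Lindeberg forms of the CLT applying in the limit of many factors.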

  3. Particles with small violations of Fermi or Bose statistics

    International Nuclear Information System (INIS)

    Greenberg, O.W.

    1991-01-01

    I discuss the statistics of ''quons'' (pronounced to rhyme with muons), particles whose annihilation and creation operators obey the q-deformed commutation relation (the quon algebra or q-mutator) which interpolates between fermions and bosons. Topics discussed include representations of the quon algebra, proof of the TCP theorem, violation of the usual locality properties, and experimental constraints on violations of the Pauli exclusion principle (i.e., Fermi statistics) and of Bose statistics
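
    The q-deformed commutation relation referred to in this abstract is Greenberg's q-mutator, which interpolates continuously between Fermi and Bose statistics:

```latex
% Quon algebra: q = -1 recovers fermions, q = +1 recovers bosons
a_k\, a_l^{\dagger} \;-\; q\, a_l^{\dagger} a_k \;=\; \delta_{kl},
\qquad -1 \le q \le 1 .
```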

  4. Parity dependence of the nuclear level density at high excitation

    International Nuclear Information System (INIS)

    Rao, B.V.; Agrawal, H.M.

    1995-01-01

    The basic underlying assumption ρ(l+1, J) = ρ(l, J) in the level density function ρ(U, J, π) has been checked on the basis of high-quality data available on individual resonance parameters (E_0, Γ_n, J^π) for s- and p-wave neutrons, in contrast to the earlier analysis where information about p-wave resonance parameters was meagre. The missing-level estimator based on the partial integration over a Porter-Thomas distribution of neutron reduced widths and the Dyson-Mehta Δ_3 statistic for the level spacing have been used to ascertain that the s- and p-wave resonance level spacings D(0) and D(1) are not in error because of spurious and missing levels. The present work does not validate the tacit assumption ρ(l+1, J) = ρ(l, J) and confirms that the level density depends upon parity at high excitation. The possible implications of the parity dependence of the level density on the results of statistical model calculations of nuclear reaction cross sections, as well as on pre-compound emission, have been emphasized. (orig.)

  5. Direct estimation of functionals of density operators by local operations and classical communication

    International Nuclear Information System (INIS)

    Alves, Carolina Moura; Horodecki, Pawel; Oi, Daniel K. L.; Kwek, L. C.; Ekert, Artur K.

    2003-01-01

    We present a method of direct estimation of important properties of a shared bipartite quantum state, within the ''distant laboratories'' paradigm, using only local operations and classical communication. We apply this procedure to spectrum estimation of shared states, and locally implementable structural physical approximations to incompletely positive maps. This procedure can also be applied to the estimation of channel capacity and measures of entanglement

  6. High performance discharges near the operational limit in HT-7

    International Nuclear Information System (INIS)

    Li Jiangang; Wan Baonian; Luo Jiarong; Gao Xiang; Zhao Yanping; Kuang Guangli; Zhang Xiaodong; Yang Yu; Yi Bao; Bojiang Ding; Jikang Xie; Yuanxi Wan

    2001-01-01

    Efforts have been made on the HT-7 tokamak to extend the stable operation boundaries. Extensive RF boronization and siliconization have been used and a wider operational Hugill diagram has been obtained. The transient density reached 1.3 times the Greenwald density limit in ohmic discharges. A stationary high performance discharge with q_a = 2.1 has been obtained after siliconization. Confinement improvement was obtained as a result of the significant reduction of the electron thermal diffusivity χ_e in the outer region of the plasma. An improved confinement phase was also observed with LHCD in the density range of 70-120% of the Greenwald density limit. Off-axis LH wave power deposition was attributed to the weakly hollow current density profile. Code simulations and measurements showed good agreement with the off-axis LH wave deposition. Supersonic molecular beam injection has been successfully used to achieve stable high density operation in the region of the Greenwald density limit. (author)

  7. Dark matter and gas density profiles - a consequence of entropy bifurcation

    International Nuclear Information System (INIS)

    Leubner, M. P.

    2006-01-01

    The radial profiles of dark matter and hot plasma density distributions of relaxed galaxies and clusters were hitherto commonly fitted by empirical functions. On the other hand, the fundamental concept of non-extensive statistics accounts for long-range interactions and correlations present in gravitationally coupled ensembles and plasmas. We provide a theoretical link between non-extensive statistics and large-scale astrophysical structures and show that the underlying tandem character of the entropy results in a bifurcation of the density distribution. A kinetic dark matter branch and a thermodynamic gas branch emerge as a natural consequence within the theory, controlled by one single parameter measuring physically the degree of correlations in the system. The theoretically derived density profiles are shown to represent accurately the characteristics of both DM and hot plasma distributions, as observed or generated in N-body and hydro-simulations. The significant advantage over empirical fitting functions is provided by the physical content of the non-extensive approach; it is therefore proposed to model observed density profiles of astrophysical structures within the fundamental context of entropy generalization, accounting for nonlocality and long-range interactions in gravitationally coupled systems.

  8. The actual current density of gas-evolving electrodes—Notes on the bubble coverage

    International Nuclear Information System (INIS)

    Vogt, H.

    2012-01-01

    All investigations of electrochemical reactors with gas-evolving electrodes must take account of the fact that the actual current density controlling cell operation commonly differs substantially from the nominal current density used for practical purposes. The two quantities are interrelated by the fractional bubble coverage. This parameter is shown to be affected by a large number of operational quantities. However, the available relationships for the bubble coverage take account only of the nominal current density. A further essential shortcoming is their inconsistency with reality at very large values of the bubble coverage, which are relevant to operating conditions leading to anode effects. An improved relationship applicable to the total range is proposed.

  9. Tobacco Products Production and Operations Reports

    Data.gov (United States)

    Department of the Treasury — Monthly statistical reports on tobacco products production and operations. Data for Tobacco Statistical Release is derived directly from the Report – Manufacturer of...

  10. Density limit experiments on FTU

    International Nuclear Information System (INIS)

    Pucella, G.; Tudisco, O.; Apicella, M.L.; Apruzzese, G.; Artaserse, G.; Belli, F.; Boncagni, L.; Botrugno, A.; Buratti, P.; Calabrò, G.; Castaldo, C.; Cianfarani, C.; Cocilovo, V.; Dimatteo, L.; Esposito, B.; Frigione, D.; Gabellieri, L.; Giovannozzi, E.; Bin, W.; Granucci, G.

    2013-01-01

    One of the main problems in tokamak fusion devices concerns the capability to operate at a high plasma density, which is observed to be limited by the appearance of catastrophic events causing loss of plasma confinement. The commonly used empirical scaling law for the density limit is the Greenwald limit, predicting that the maximum achievable line-averaged density along a central chord depends only on the average plasma current density. However, the Greenwald density limit has been exceeded in tokamak experiments in the case of peaked density profiles, indicating that the edge density is the real parameter responsible for the density limit. Recently, it has been shown on the Frascati Tokamak Upgrade (FTU) that the Greenwald density limit is exceeded in gas-fuelled discharges with a high value of the edge safety factor. In order to understand this behaviour, dedicated density limit experiments were performed on FTU, in which the high density domain was explored in a wide range of values of plasma current (I_p = 500–900 kA) and toroidal magnetic field (B_T = 4–8 T). These experiments confirm the edge nature of the density limit, as a Greenwald-like scaling holds for the maximum achievable line-averaged density along a peripheral chord passing at r/a ≃ 4/5. On the other hand, the maximum achievable line-averaged density along a central chord does not depend on the average plasma current density and essentially depends on the toroidal magnetic field only. This behaviour is explained in terms of density profile peaking in the high density domain, with a peaking factor at the disruption depending on the edge safety factor. The possibility that the MARFE (multifaceted asymmetric radiation from the edge) phenomenon is the cause of the peaking has been considered, with the MARFE believed to form a channel for the penetration of the neutral particles into deeper layers of the plasma. Finally, the magnetohydrodynamic (MHD) analysis has shown that also the central line
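
    For reference, the Greenwald scaling mentioned above has a simple closed form, n_GW = I_p / (π a²), giving the limit in units of 10²⁰ m⁻³ with I_p in MA and minor radius a in metres. A minimal sketch follows; the numerical inputs are illustrative values in the current range quoted in the abstract, not FTU measurements.

```python
import math

def greenwald_density_limit(plasma_current_MA: float, minor_radius_m: float) -> float:
    """Greenwald line-averaged density limit, in units of 1e20 m^-3.

    Standard empirical scaling n_GW = I_p / (pi * a^2), with the plasma
    current I_p in MA and the minor radius a in metres.
    """
    return plasma_current_MA / (math.pi * minor_radius_m ** 2)

# e.g. a 0.9 MA discharge in a machine with a 0.30 m minor radius
n_gw = greenwald_density_limit(0.9, 0.30)
```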

  11. Multivariate statistical monitoring as applied to clean-in-place (CIP) and steam-in-place (SIP) operations in biopharmaceutical manufacturing.

    Science.gov (United States)

    Roy, Kevin; Undey, Cenk; Mistretta, Thomas; Naugle, Gregory; Sodhi, Manbir

    2014-01-01

    Multivariate statistical process monitoring (MSPM) is becoming increasingly utilized to further enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add to delays and corrective actions may require additional setup times. Moreover, this conventional approach does not take interactive effects of process variables into account and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations. © 2014 American Institute of Chemical Engineers.

  12. Jacobson generators, Fock representations and statistics of sl(n + 1)

    International Nuclear Information System (INIS)

    Palev, T.D.; Jeugt, J. van der

    2000-10-01

    The properties of A-statistics, related to the class of simple Lie algebras sl(n + 1), n ∈ Z_+ (Palev, T.D.: Preprint JINR E17-10550 (1977); hep-th/9705032), are further investigated. The description of each sl(n + 1) is carried out via generators and their relations (see eq. (2.5)), first introduced by Jacobson. The related Fock spaces W_p, p ∈ N, are finite-dimensional irreducible sl(n + 1)-modules. The Pauli principle of the underlying statistics is formulated. In addition the paper contains the following new results: (a) the A-statistics are interpreted as exclusion statistics; (b) within each W_p, operators B(p)_1^±, ..., B(p)_n^±, proportional to the Jacobson generators, are introduced. It is proved that in an appropriate topology (Definition 2) lim_{p→∞} B(p)_i^± = B_i^±, where B_i^± are Bose creation and annihilation operators; (c) it is shown that the local statistics of the degenerated hard-core Bose models and of the related Heisenberg spin models is p = 1 A-statistics. (author)

  13. Statistical analysis of modal parameters of a suspension bridge based on Bayesian spectral density approach and SHM data

    Science.gov (United States)

    Li, Zhijun; Feng, Maria Q.; Luo, Longxi; Feng, Dongming; Xu, Xiuli

    2018-01-01

    Uncertainty in modal parameter estimation appears to a significant extent in civil-engineering structural health monitoring (SHM) practice, due to environmental influences and modeling errors. Reasonable methodologies are needed for processing this uncertainty. Bayesian inference can provide a promising and feasible identification solution for the purpose of SHM. However, there has been relatively little research on the application of the Bayesian spectral method to modal identification using SHM data sets. To extract modal parameters from large data sets collected by an SHM system, the Bayesian spectral density algorithm was applied to address the uncertainty of mode extraction from the output-only response of a long-span suspension bridge. The posterior most probable values of modal parameters and their uncertainties were estimated through Bayesian inference. A long-term variation and statistical analysis was performed using the sensor data sets collected from the SHM system of the suspension bridge over a one-year period. The t location-scale distribution was shown to be a better candidate function for the frequencies of lower modes. On the other hand, the Burr distribution provided the best fit to the higher modes, which are sensitive to temperature. In addition, wind-induced variation of modal parameters was also investigated. It was observed that both the damping ratios and modal forces increased during periods of typhoon excitation. Meanwhile, the modal damping ratios exhibit significant correlation with the spectral intensities of the corresponding modal forces.

  14. Statistical inference and Aristotle's Rhetoric.

    Science.gov (United States)

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  15. Stochastic transport models for mixing in variable-density turbulence

    Science.gov (United States)

    Bakosi, J.; Ristorcelli, J. R.

    2011-11-01

    In variable-density (VD) turbulent mixing, where very-different-density materials coexist, the density fluctuations can be an order of magnitude larger than their mean. Density fluctuations are non-negligible in the inertia terms of the Navier-Stokes equation which has both quadratic and cubic nonlinearities. Very different mixing rates of different materials give rise to large differential accelerations and some fundamentally new physics that is not seen in constant-density turbulence. In VD flows material mixing is active in a sense far stronger than that applied in the Boussinesq approximation of buoyantly-driven flows: the mass fraction fluctuations are coupled to each other and to the fluid momentum. Statistical modeling of VD mixing requires accounting for basic constraints that are not important in the small-density-fluctuation passive-scalar-mixing approximation: the unit-sum of mass fractions, bounded sample space, and the highly skewed nature of the probability densities become essential. We derive a transport equation for the joint probability of mass fractions, equivalent to a system of stochastic differential equations, that is consistent with VD mixing in multi-component turbulence and consistently reduces to passive scalar mixing in constant-density flows.
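
    The unit-sum and bounded-sample-space constraints emphasized in this abstract can be illustrated with a toy sampler. This sketch is an assumption chosen for illustration, not the paper's stochastic model: it draws mass fractions from a Dirichlet distribution (built from normalized gamma variates), which satisfies both constraints by construction.

```python
import random

# Toy mass-fraction sampler: a Dirichlet draw always lies on the unit
# simplex, so the fractions are bounded in [0, 1] and sum to one.
# The three "materials" and the concentration parameters are illustrative.
random.seed(0)

def dirichlet_sample(alphas):
    """One Dirichlet draw via normalized gamma variates."""
    gammas = [random.gammavariate(a, 1.0) for a in alphas]
    total = sum(gammas)
    return [g / total for g in gammas]

samples = [dirichlet_sample([0.5, 0.5, 2.0]) for _ in range(1_000)]
unit_sum_ok = all(abs(sum(s) - 1.0) < 1e-12 for s in samples)
bounded_ok = all(0.0 <= y <= 1.0 for s in samples for y in s)
```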

  16. The choice of statistical methods for comparisons of dosimetric data in radiotherapy

    International Nuclear Information System (INIS)

    Chaikh, Abdulhamid; Giraud, Jean-Yves; Perrin, Emmanuel; Bresciani, Jean-Pierre; Balosso, Jacques

    2014-01-01

    Novel irradiation techniques are continuously introduced in radiotherapy to optimize the accuracy, the security and the clinical outcome of treatments. These changes could raise the question of discontinuity in dosimetric presentation and the subsequent need for practice adjustments in case of significant modifications. This study proposes a comprehensive approach to compare different techniques and tests whether their respective dose calculation algorithms give rise to statistically significant differences in the treatment doses for the patient. Statistical investigation principles are presented in the framework of a clinical example based on 62 fields of radiotherapy for lung cancer. The delivered doses in monitor units were calculated using three different dose calculation methods: the reference method calculates the dose without tissue density corrections using the Pencil Beam Convolution (PBC) algorithm, whereas the new methods calculate the dose with tissue density corrections in 1D and 3D using the Modified Batho (MB) method and the Equivalent Tissue-Air Ratio (ETAR) method, respectively. The normality of the data and the homogeneity of variance between groups were tested using the Shapiro-Wilk and Levene tests, respectively; then non-parametric statistical tests were performed. Specifically, the dose means estimated by the different calculation methods were compared using Friedman's test and the Wilcoxon signed-rank test. In addition, the correlation between the doses calculated by the three methods was assessed using Spearman's rank and Kendall's rank tests. Friedman's test showed a significant effect of the calculation method on the delivered dose for lung cancer patients (p < 0.001). The density correction methods yielded lower doses compared to PBC, on average (−5 ± 4.4 SD) for MB and (−4.7 ± 5 SD) for ETAR. Post-hoc Wilcoxon signed-rank tests of paired comparisons indicated that the delivered dose was significantly reduced using density
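
    Friedman's test named above compares k related samples via within-block ranks. A minimal pure-Python sketch of the test statistic follows; the example "doses" are made-up numbers for illustration, not the study's dosimetric data.

```python
def friedman_statistic(blocks):
    """Friedman chi-square statistic for k related samples.

    `blocks` is a list of n blocks (e.g. treatment fields), each a sequence
    of k measurements (e.g. doses from k calculation methods). Ties receive
    average ranks.
    """
    n, k = len(blocks), len(blocks[0])
    rank_sums = [0.0] * k
    for block in blocks:
        order = sorted(range(k), key=lambda j: block[j])
        ranks = [0.0] * k
        i = 0
        while i < k:  # assign average ranks over runs of tied values
            j = i
            while j + 1 < k and block[order[j + 1]] == block[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for m in range(i, j + 1):
                ranks[order[m]] = avg
            i = j + 1
        for j in range(k):
            rank_sums[j] += ranks[j]
    # chi^2_F = 12 / (n k (k+1)) * sum(R_j^2) - 3 n (k+1)
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * n * (k + 1)

# three hypothetical "methods" measured on four "fields";
# method 0 is consistently highest, so the statistic is large
doses = [(100.0, 95.0, 96.0), (102.0, 97.0, 98.0),
         (99.0, 94.0, 95.0), (101.0, 96.0, 97.0)]
chi2 = friedman_statistic(doses)
```

    For real analyses one would use a vetted implementation (e.g. `scipy.stats.friedmanchisquare`) rather than this sketch.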

  17. Quantum Statistical Entropy of Five-Dimensional Black Hole

    Institute of Scientific and Technical Information of China (English)

    ZHAO Ren; WU Yue-Qin; ZHANG Sheng-Li

    2006-01-01

    The generalized uncertainty relation is introduced to calculate the quantum statistical entropy of a black hole. By using the new equation of state density motivated by the generalized uncertainty relation, we discuss the entropies of the Bose field and the Fermi field on the background of the five-dimensional spacetime. In our calculation, we need not introduce a cutoff, and there is no divergent logarithmic term as in the original brick-wall method. It is obtained that the quantum statistical entropy corresponding to the black hole horizon is proportional to the area of the horizon. Further it is shown that the entropy of the black hole is the entropy of the quantum states on the surface of the horizon. The black hole's entropy is an intrinsic property of the black hole, and the entropy is a quantum effect. This helps one further understand the quantum statistical entropy.
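
    A commonly used one-parameter form of the generalized uncertainty relation invoked here is the following (the deformation parameter λ and its normalization are a conventional choice and may differ from the paper's):

```latex
% Generalized uncertainty principle with minimal length \hbar\sqrt{\lambda}
\Delta x\, \Delta p \;\ge\; \frac{\hbar}{2}\Bigl[\,1 + \lambda\,(\Delta p)^2\,\Bigr],
\qquad \Delta x_{\min} = \hbar\sqrt{\lambda}.
```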

  19. Density measurements of microsecond-conduction-time POS plasmas

    International Nuclear Information System (INIS)

    Hinshelwood, D.; Goodrich, P.J.; Weber, B.V.; Commisso, R.J.; Grossmann, J.M.; Kellogg, J.C.

    1993-01-01

    Measurements of the electron density in a coaxial microsecond-conduction-time plasma opening switch during switch operation are described. Current conduction is observed to cause a radial redistribution of the switch plasma. A local reduction in axial line density of more than an order of magnitude occurs by the time opening begins. This reduction, and the scaling of conduction current with plasma density, indicate that current conduction in this experiment is limited by hydrodynamic effects. It is hypothesized that the density reduction allows the switch to open by an erosion mechanism. Initial numerical modeling efforts have reproduced the principal observed results. A model that accurately predicts the conduction current is presented

  20. Phase statistics in non-Gaussian scattering

    International Nuclear Information System (INIS)

    Watson, Stephen M; Jakeman, Eric; Ridley, Kevin D

    2006-01-01

    Amplitude weighting can improve the accuracy of frequency measurements in signals corrupted by multiplicative speckle noise. When the speckle field constitutes a circular complex Gaussian process, the optimal function of amplitude weighting is provided by the field intensity, corresponding to the intensity-weighted phase derivative statistic. In this paper, we investigate the phase derivative and intensity-weighted phase derivative returned from a two-dimensional random walk, which constitutes a generic scattering model capable of producing both Gaussian and non-Gaussian fluctuations. Analytical results are developed for the correlation properties of the intensity-weighted phase derivative, as well as limiting probability densities of the scattered field. Numerical simulation is used to generate further probability densities and determine optimal weighting criteria from non-Gaussian fields. The results are relevant to frequency retrieval in radiation scattered from random media
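
    The two-dimensional random walk model described in this abstract can be sampled directly. In the many-step (Gaussian) limit the resultant field becomes circular complex Gaussian, and the intensity contrast E[I²]/E[I]² approaches the fully developed speckle value of 2. The step count and sample size below are illustrative assumptions, not parameters from the paper.

```python
import cmath
import math
import random

# Generic scattering model: the field is a 2-D random walk of N
# unit-amplitude phasors with independent uniform phases.
random.seed(7)

def random_walk_field(n_steps):
    """Resultant complex field of n_steps unit phasors with random phases."""
    return sum(cmath.exp(1j * random.uniform(0.0, 2.0 * math.pi))
               for _ in range(n_steps))

intensities = [abs(random_walk_field(100)) ** 2 for _ in range(20_000)]
mean_I = sum(intensities) / len(intensities)
mean_I2 = sum(x * x for x in intensities) / len(intensities)
contrast = mean_I2 / mean_I ** 2  # approaches 2 for Gaussian speckle
```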

  1. Tube problems: worldwide statistics reviewed

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    EPRI's Steam Generator Strategic Management Project issues an annual report on the progress being made in tackling steam generator problems worldwide, containing a wealth of detailed statistics on the status of operating units and degradation mechanisms encountered. A few highlights are presented from the latest report, issued in October 1993, which covers the period to 31 December 1992. (Author)

  2. Highly Robust Statistical Methods in Medical Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  3. Introduction to mathematical statistical physics

    CERN Document Server

    Minlos, R A

    1999-01-01

    This book presents a mathematically rigorous approach to the main ideas and phenomena of statistical physics. The introduction addresses the physical motivation, focussing on the basic concept of modern statistical physics, that is the notion of Gibbsian random fields. Properties of Gibbsian fields are analyzed in two ranges of physical parameters: "regular" (corresponding to high-temperature and low-density regimes) where no phase transition is exhibited, and "singular" (low temperature regimes) where such transitions occur. Next, a detailed approach to the analysis of the phenomena of phase transitions of the first kind, the Pirogov-Sinai theory, is presented. The author discusses this theory in a general way and illustrates it with the example of a lattice gas with three types of particles. The conclusion gives a brief review of recent developments arising from this theory. The volume is written for the beginner, yet advanced students will benefit from it as well. The book will serve nicely as a supplement...

  4. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  5. Theory of open quantum systems with bath of electrons and phonons and spins: many-dissipaton density matrixes approach.

    Science.gov (United States)

    Yan, YiJing

    2014-02-07

    This work establishes a strongly correlated system-and-bath dynamics theory, the many-dissipaton density operators formalism. It puts forward a quasi-particle picture for environmental influences. This picture unifies the physical descriptions and algebraic treatments of three distinct classes of quantum environments, the electron bath, phonon bath, and two-level spin or exciton bath, as they participate in quantum dissipation processes. Dynamical variables for theoretical description are no longer just the reduced density matrix for the system, but remarkably also those for the quasi-particles of the bath. The present theoretical formalism offers efficient and accurate means for the study of steady-state (nonequilibrium and equilibrium) and real-time dynamical properties of both systems and hybridizing environments. It further provides universal evaluations, exact in principle, of various correlation functions, including even those of environmental degrees of freedom in coupling with systems. Induced environmental dynamics could be reflected directly in experimentally measurable quantities, such as Fano resonances and quantum transport current shot noise statistics.

  6. Statistical representation of a spray as a point process

    International Nuclear Information System (INIS)

    Subramaniam, S.

    2000-01-01

    The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf), relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed. (c) 2000 American Institute of Physics

  7. Breeding bird density does not drive vocal individuality

    Directory of Open Access Journals (Sweden)

    Daniel T. BLUMSTEIN, Douglas R. MCCLAIN, Carrie DE JESUS, Gustavo ALARCÓN-NIETO

    2012-10-01

    Many species produce individually specific vocalizations, and sociality is a hypothesized driver of such individuality. Previous studies of how social variation influenced individuality focused on colonial or non-colonial avian species, and on how social group size influenced individuality in sciurid rodents. Since sociality is an important driver of individuality, we expected that bird species that defend nesting territories in higher-density neighborhoods should have more individually distinctive calls than those that defend nesting territories in lower-density neighborhoods. We used Beecher's information statistic to quantify individuality, and we examined the relationship between bird density (calculated with point counts) and vocal individuality for seven species of passerines. We found non-significant relationships between breeding bird density and vocal individuality, whether regressions were fitted on species values or on phylogenetically independent contrast values. From these results, we infer that while individuality may be explained by social factors, breeding bird density is unlikely to be generally important in driving the evolution of individually specific vocalizations [Current Zoology 58 (5): 765–772, 2012].

  8. Angular momentum dependence of the nuclear level density parameter

    International Nuclear Information System (INIS)

    Aggarwal, Mamta; Kailas, S.

    2010-01-01

    The dependence of the nuclear level density parameter on angular momentum and temperature is investigated in a theoretical framework using the statistical theory of hot rotating nuclei. Structural effects are incorporated by including shell corrections, shape, and deformation. Nuclei around Z ≈ 50 with excitation energies in the range of 30 to 40 MeV are considered. The calculations are in good agreement with the experimentally deduced inverse level density parameter values, especially for 109In, 113Sb, 122Te, 123I, and 127Cs.
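    For orientation, the level density parameter a enters the standard Fermi-gas (Bethe) level density, and the inverse parameter k fixes it through a = A / k. The formula below is the textbook form, not taken from this paper, and the mass number and k value are illustrative assumptions.

```python
import math

def bethe_level_density(U, a):
    """Fermi-gas (Bethe) level density:
    rho(U) = sqrt(pi) / (12 a^(1/4) U^(5/4)) * exp(2 sqrt(a U)),
    with U the excitation energy in MeV and a in MeV^-1."""
    return (math.sqrt(math.pi) / (12.0 * a ** 0.25 * U ** 1.25)
            * math.exp(2.0 * math.sqrt(a * U)))

# The inverse level density parameter k (MeV) fixes a through a = A / k.
A = 113          # mass number, e.g. 113Sb (illustrative)
k = 9.0          # assumed inverse level density parameter in MeV
a = A / k
rho_30 = bethe_level_density(30.0, a)
rho_40 = bethe_level_density(40.0, a)
```

    The exponential factor dominates, so the level density grows steeply with excitation energy and falls when k increases (i.e., when a decreases).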

  9. Path integrals for electronic densities, reactivity indices, and localization functions in quantum systems.

    Science.gov (United States)

    Putz, Mihai V

    2009-11-10

    The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended to many-electronic systems through the density functional closure relationship. The use of the path integral formalism for electronic density prescription presents several advantages: it assures an inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles the Schrödinger equation; and allows a quantum statistical description of the system through partition function computation. In this framework, four levels of path integral formalism are presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix and/or the canonical density are rigorously defined and presented. The practical specializations for quantum free and harmonic motions, the statistical high- and low-temperature limits, the smearing justification of Bohr's quantum stability postulate with the paradigmatic hydrogen atom excursion, the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as the Markovian generalizations of Becke-Edgecombe electronic localization functions - all advocate for the reliability of the PI formalism of quantum mechanics as a versatile tool, suited for analytical and/or computational modeling of a variety of fundamental physical and chemical reactivity concepts characterizing (density-driven) many-electronic systems.

  10. Path Integrals for Electronic Densities, Reactivity Indices, and Localization Functions in Quantum Systems

    Directory of Open Access Journals (Sweden)

    Mihai V. Putz

    2009-11-01

    Full Text Available The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended to many-electronic systems through the density functional closure relationship. The use of the path integral formalism for electronic density prescription presents several advantages: it assures an inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles the Schrödinger equation; and allows a quantum statistical description of the system through partition function computation. In this framework, four levels of path integral formalism are presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix and/or the canonical density are rigorously defined and presented. The practical specializations for quantum free and harmonic motions, the statistical high- and low-temperature limits, the smearing justification of Bohr’s quantum stability postulate with the paradigmatic hydrogen atom excursion, the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as the Markovian generalizations of Becke-Edgecombe electronic localization functions - all advocate for the reliability of the PI formalism of quantum mechanics as a versatile tool, suited for analytical and/or computational modeling of a variety of fundamental physical and chemical reactivity concepts characterizing (density-driven) many-electronic systems.
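    To make the Feynman imaginary-time construction mentioned in the two records above concrete, here is a minimal numerical sketch (not from the paper): the canonical partition function of a 1D harmonic oscillator obtained by chaining Trotter short-time propagators on a position grid and closing the path cyclically, compared with the exact result. Units hbar = m = omega = 1; the grid, box, and Trotter number are illustrative choices.

```python
import numpy as np

beta, M = 1.0, 64                  # inverse temperature, Trotter slices
tau = beta / M                     # imaginary-time step
x = np.linspace(-6.0, 6.0, 400)    # position grid
dx = x[1] - x[0]
V = 0.5 * x ** 2                   # harmonic potential

# Symmetrized short-time propagator
# K(x, x') = (2 pi tau)^(-1/2) exp(-(x - x')^2 / (2 tau) - tau (V(x) + V(x')) / 2)
X, Xp = np.meshgrid(x, x, indexing="ij")
K = np.exp(-(X - Xp) ** 2 / (2.0 * tau)
           - tau * (V[:, None] + V[None, :]) / 2.0) / np.sqrt(2.0 * np.pi * tau)

# Cyclic (traced) path integral: Z = Tr[(K dx)^M]
Z = np.trace(np.linalg.matrix_power(K * dx, M))
Z_exact = 1.0 / (2.0 * np.sinh(beta / 2.0))  # exact harmonic-oscillator Z
```

    With these settings the discretized trace agrees with the exact partition function to well below one percent, illustrating how the chained short-time kernels reproduce the canonical density.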

  11. Statistical relation between particle contaminations in ultra pure water and defects generated by process tools

    NARCIS (Netherlands)

    Wali, F.; Knotter, D. Martin; Wortelboer, Ronald; Mud, Auke

    2007-01-01

    Ultra pure water supplied inside the fab is used in different tools at different stages of processing. Particle data measured in the ultra pure water were compared with the defect density on wafers processed in these tools, and a statistical relation was found.

  12. Post-operative pain control after tonsillectomy: dexamethasone vs tramadol.

    Science.gov (United States)

    Topal, Kubra; Aktan, Bulent; Sakat, Muhammed Sedat; Kilic, Korhan; Gozeler, Mustafa Sitki

    2017-06-01

    Tramadol was found to be more effective than dexamethasone in post-operative pain control, with longer-lasting relief of pain. This study aimed to compare the effects of pre-operative local injections of tramadol and dexamethasone on post-operative pain, nausea and vomiting in patients who underwent tonsillectomy. Sixty patients between 3 and 13 years of age who were scheduled for tonsillectomy were included in the study and divided into three groups. Group 1 was the control group; patients in Group 2 received a 0.3 mg/kg dexamethasone injection and those in Group 3 a 0.1 mg/kg tramadol injection into the peritonsillar space just before the operation. Patients were evaluated for nausea, vomiting, and pain. When the control and dexamethasone groups were compared, there were statistically significant differences in pain scores at 15 and 30 min post-operatively, but not at the later time points. When the control and tramadol groups were compared, there was a statistically significant difference in pain scores at all intervals. When the tramadol and dexamethasone groups were compared, there was no statistically significant difference in pain scores at 15 and 30 min or at 1 and 2 h post-operatively, whereas there was a statistically significant difference at 6 and 24 h.

  13. Operational limits and disruptions in ITER

    Energy Technology Data Exchange (ETDEWEB)

    Tsunematsu, T; Mizoguchi, T; Yoshino, R [Japan Atomic Energy Research Inst., Tokyo (Japan); Borrass, K; Engelmann, F; Pacher, G; Pacher, H [Max-Planck-Institut fuer Plasmaphysik, Garching (Germany, F.R.). NET Design Team; Cohen, S; Post, D [Princeton Univ., NJ (USA). Plasma Physics Lab.; Hogan, J; Uckan, N A [Oak Ridge National Lab., TN (USA); Krasheninnikov, S; Mukhovatov, V; Parail, V

    1990-12-15

    Detailed knowledge of the operational limits for beta, q and the plasma density will be required for successful and flexible operation of ITER. In this paper, the present data base and guidelines on operational limits and disruptions in the ITER design are presented. 10 refs., 1 fig.

  14. Atomistic mechanisms of ReRAM cell operation and reliability

    Science.gov (United States)

    Pandey, Sumeet C.

    2018-01-01

    We present results from first-principles-based modeling that captures functionally important physical phenomena critical to cell materials selection, operation, and reliability for resistance-switching memory technologies. An atomic-scale description of retention, the low- and high-resistance states (RS), and the sources of intrinsic cell-level variability in ReRAM is discussed. Based on results obtained from density functional theory, non-equilibrium Green's function, molecular dynamics, and kinetic Monte Carlo simulations, we report the role of variable-charge vacancy defects and metal impurities in determining the RS, the LRS stability, and electron conduction in these RS. Although the statistical electrical characteristics of oxygen-vacancy (Ox-ReRAM) and conductive-bridging RAM (M-ReRAM) cells are notably different, similar underlying electrochemical phenomena describe retention and the formation/dissolution of the RS.

  15. Traffic density determination and its applications using smartphone

    Directory of Open Access Journals (Sweden)

    Al-Sayed Ahmed Al-Sobky

    2016-03-01

    Full Text Available The smartphone is progressively becoming a dominant platform for many transportation applications. This paper introduces a new application that uses smartphones to measure traffic density and speed. The proposed system consists of two smartphones and two cars, with an observer counting the vehicles between the two cars. This count is combined with tracking data to give a “measured” density and “measured” speed. The travel speed and manual traffic counts were used to derive a “calculated” density. The measured density was validated against the calculated one, and a statistical t-test confirmed that the mean difference between the two densities is not significant at the 5% level. Calculated flow rates were also comparable to actual counts, with an average error of 8.2%. The proposed system was then applied to measure density on the 6th of October Elevated Road in Egypt, and the level of service was determined accordingly for 15 road sections studied on this road. Furthermore, actual speed-density data were fitted using an exponential model with an R2 of 0.85. The advantages of the proposed system qualify it for potential applications in developing countries where available resources limit the installation of more costly systems. The application of the proposed system is limited to daytime, uninterrupted flow conditions, and traffic streams with a low percentage of heavy vehicles.
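    The measured quantities described above can be sketched in a few lines. The counting convention (one probe car included, so the spacing holds count + 1 vehicles) and the numbers below are assumptions for illustration; the paper's exact formulas may differ.

```python
def measured_density(count_between, spacing_km):
    """Vehicles per km between two probe cars; the observer's count plus
    one probe car is assumed to occupy the measured spacing."""
    return (count_between + 1) / spacing_km

def space_mean_speed(distance_km, travel_time_h):
    """Speed of the probe pair over a tracked stretch."""
    return distance_km / travel_time_h

def flow_rate(density_veh_per_km, speed_km_per_h):
    """Fundamental relation of traffic flow: q = k * v."""
    return density_veh_per_km * speed_km_per_h

# Example: 17 vehicles counted over a 0.5 km spacing, 2 km covered in 2 min
k = measured_density(17, 0.5)            # 36 veh/km
v = space_mean_speed(2.0, 2.0 / 60.0)    # 60 km/h
q = flow_rate(k, v)                      # 2160 veh/h
```

    The derived flow rate q is what the paper compares against actual manual counts.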

  16. Automated breast tissue density assessment using high order regional texture descriptors in mammography

    Science.gov (United States)

    Law, Yan Nei; Lieng, Monica Keiko; Li, Jingmei; Khoo, David Aik-Aun

    2014-03-01

    Breast cancer is the most common cancer and second leading cause of cancer death among women in the US. The relative survival rate is lower among women with a more advanced stage at diagnosis. Early detection through screening is vital. Mammography is the most widely used and only proven screening method for reliably and effectively detecting abnormal breast tissues. In particular, mammographic density is one of the strongest breast cancer risk factors, after age and gender, and can be used to assess the future risk of disease before individuals become symptomatic. A reliable method for automatic density assessment would be beneficial and could assist radiologists in the evaluation of mammograms. To address this problem, we propose a density classification method which uses statistical features from different parts of the breast. Our method is composed of three parts: breast region identification, feature extraction and building ensemble classifiers for density assessment. It explores the potential of the features extracted from second and higher order statistical information for mammographic density classification. We further investigate the registration of bilateral pairs and time-series of mammograms. The experimental results on 322 mammograms demonstrate that (1) a classifier using features from dense regions has higher discriminative power than a classifier using only features from the whole breast region; (2) these high-order features can be effectively combined to boost the classification accuracy; (3) a classifier using these statistical features from dense regions achieves 75% accuracy, which is a significant improvement from 70% accuracy obtained by the existing approaches.
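    Second-order statistical information of the kind this classifier relies on is typically captured by gray-level co-occurrence matrices (GLCMs). The following is a minimal, self-contained sketch of a GLCM and two classic Haralick-style descriptors (contrast and energy); it is illustrative and not the authors' feature set.

```python
def glcm(img, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for the offset (dx, dy).
    img is a 2D list of integer gray levels in [0, levels)."""
    h, w = len(img), len(img[0])
    P = [[0.0] * levels for _ in range(levels)]
    pairs = 0
    for y in range(h):
        for x in range(w):
            yy, xx = y + dy, x + dx
            if 0 <= yy < h and 0 <= xx < w:
                P[img[y][x]][img[yy][xx]] += 1.0
                pairs += 1
    return [[v / pairs for v in row] for row in P]

def contrast(P):
    """Intensity-difference weighted sum: 0 for a perfectly uniform region."""
    n = len(P)
    return sum((i - j) ** 2 * P[i][j] for i in range(n) for j in range(n))

def energy(P):
    """Sum of squared entries: 1 when a single gray-level pair repeats."""
    return sum(v * v for row in P for v in row)
```

    In a density-classification pipeline, such descriptors would be computed per region (e.g., over the segmented dense tissue) and fed to the ensemble classifiers.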

  17. Statistical characterization of tensile strengths for a nuclear-type core graphite

    International Nuclear Information System (INIS)

    Kennedy, C.R.; Eatherly, W.P.

    1986-09-01

    A data set of tensile strengths comprising over 1200 experimental points has been analyzed statistically in conformance with the observed phenomenon of background and disparate flaws. The data are consistent with a bimodal normal distribution. If corrections are made for the dependence of strength on density, the background mode is Weibull. It is proposed that the disparate mode can be represented by a combination of binomial and order statistics. The resultant bimodal model shows a strong dependence on stressed volume.
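    The bimodal normal description can be sketched as a two-component mixture; the weights, means, and standard deviations below are illustrative placeholders, not fitted values from the graphite data set.

```python
import math

def normal_pdf(x, mu, sigma):
    """Standard normal density with mean mu and standard deviation sigma."""
    return (math.exp(-0.5 * ((x - mu) / sigma) ** 2)
            / (sigma * math.sqrt(2.0 * math.pi)))

def bimodal_strength_pdf(x, w_bg=0.8, mu_bg=20.0, s_bg=2.0,
                         mu_dis=12.0, s_dis=1.5):
    """Mixture of a background-flaw mode and a weaker disparate-flaw mode
    (strengths in MPa; all parameters are hypothetical)."""
    return (w_bg * normal_pdf(x, mu_bg, s_bg)
            + (1.0 - w_bg) * normal_pdf(x, mu_dis, s_dis))
```

    A fit to real strength data would estimate the weight and mode parameters, e.g., by maximum likelihood with an EM iteration, before replacing the background component with a Weibull form.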

  18. Self-similar density turbulence in the TCV tokamak scrape-off layer

    International Nuclear Information System (INIS)

    Graves, J P; Horacek, J; Pitts, R A; Hopcraft, K I

    2005-01-01

    Plasma fluctuations in the scrape-off layer (SOL) of the TCV tokamak exhibit statistical properties which are universal across a broad range of discharge conditions. Electron density fluctuations, from just inside the magnetic separatrix to the plasma-wall interface, are described well by a gamma distributed random variable. The density fluctuations exhibit clear evidence of self-similarity in the far SOL, such that the corresponding probability density functions collapse upon renormalization solely by the mean particle density. This constitutes a demonstration that the amplitude of the density fluctuations is simply proportional to the mean density and is consistent with the further observation that the radial particle flux fluctuations scale solely with the mean density over two orders of magnitude. Such findings indicate that it may be possible to improve the prediction of transport in the critical plasma-wall interaction region of future large scale tokamaks. (letter to the editor)
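    The reported collapse can be mimicked with synthetic data: if the fluctuations are gamma distributed with a fixed shape parameter and a scale proportional to the mean density, then samples rescaled by their mean share a single distribution. A small stdlib-only sketch (the shape parameter and mean densities are illustrative assumptions):

```python
import random

def rescaled_gamma_samples(shape, mean, n, rng):
    """Gamma-distributed fluctuations with the given mean, rescaled by that
    mean. Self-similarity implies the rescaled distribution depends on the
    shape parameter only, not on the mean density."""
    scale = mean / shape
    return [rng.gammavariate(shape, scale) / mean for _ in range(n)]

rng = random.Random(7)
low = rescaled_gamma_samples(2.0, 1e18, 20000, rng)    # low mean density
high = rescaled_gamma_samples(2.0, 1e20, 20000, rng)   # 100x higher mean
```

    Both rescaled samples have unit mean and variance 1/shape, so their histograms overlay: the amplitude of the fluctuations is simply proportional to the mean, as observed in the TCV far SOL.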

  19. Response of the ionospheric electron density to different types of seismic events

    Directory of Open Access Journals (Sweden)

    Y. He

    2011-08-01

    Full Text Available The electron density data recorded by the Langmuir Probe Instrument (ISL, Instrument Sonde de Langmuir) onboard the DEMETER satellite have been collected for nearly 4 yr (2006–2009) to perform a statistical analysis. During this time, more than 7000 earthquakes with a magnitude larger than or equal to 5.0 occurred all over the world. For the statistical studies, all these events were divided into various categories on the basis of the seismic information, including Southern or Northern Hemisphere earthquakes, inland or sea earthquakes, earthquakes at different magnitude levels, earthquakes at different depth levels, isolated events, and all events. To distinguish pre-earthquake anomalies from possible ionospheric anomalies related to geomagnetic activity, the data were filtered with the Kp index. The statistical results clearly show that the electron density increases close to the epicentres in both the Northern and the Southern Hemisphere, but the position of the anomaly is slightly shifted to the north in the Northern Hemisphere and to the south in the Southern Hemisphere. The electron density related to both inland and sea earthquakes presents an anomaly approximately close to the epicentres, but the anomaly for sea earthquakes is more significant than for inland earthquakes. The intensity of the anomalies is enhanced when the magnitude increases and is reduced when the depth increases. A similar anomaly can also be seen in the statistical results concerning the isolated earthquakes. All these statistical results can help to better understand the preparation process of earthquakes and their influence up to ionospheric levels.

  20. Density functional theory for polymeric systems in 2D

    International Nuclear Information System (INIS)

    Słyk, Edyta; Bryk, Paweł; Roth, Roland

    2016-01-01

    We propose a density functional theory for polymeric fluids in two dimensions. The approach is based on Wertheim's first-order thermodynamic perturbation theory (TPT) and closely follows the density functional theory for polymers proposed by Yu and Wu (2002 J. Chem. Phys. 117 2368). As a simple application we evaluate the density profiles of tangent hard-disk polymers at hard walls. The theoretical predictions are compared against the results of Monte Carlo simulations. We find that for short chain lengths the theoretical density profiles are in excellent agreement with the Monte Carlo data. The agreement is less satisfactory for longer chains. The performance of the theory can be improved by recasting the approach in the self-consistent field theory formalism. When the self-avoiding chain statistics is used, the theory yields a marked improvement in the low density limit. Further improvements for long chains could be reached by going beyond the first order of TPT. (paper)