WorldWideScience

Sample records for statistical density operators

  1. Density operators in quantum mechanics

    International Nuclear Information System (INIS)

    Burzynski, A.

    1979-01-01

    A brief discussion and summary of the density operator formalism as it occurs in modern physics (quantum optics, quantum statistical physics, quantum theory of radiation) is presented. In particular, we emphasize the projection operator method, the application of spectral theorems, and the superoperator formalism in operator Hilbert spaces (of Hilbert-Schmidt type). The paper includes an appendix on direct sums and direct products of spaces and operators, and on problems of reducibility for operator classes using projection operators. (author)

  2. Operation statistics of KEKB

    International Nuclear Information System (INIS)

    Kawasumi, Takeshi; Funakoshi, Yoshihiro

    2008-01-01

    The KEKB accelerator has been operated since December 1998. We achieved the design peak luminosity of 10.00/nb/s; the present record is 17.12/nb/s. Detailed data on KEKB operation are important for evaluating KEKB performance and for suggesting directions for performance enhancement. To estimate the accelerator availability, we have classified all KEKB machine time into the following seven categories: (1) Physics Run, (2) Machine Study, (3) Machine Tuning, (4) Beam Tuning, (5) Trouble, (6) Maintenance, (7) Others. In this paper we report the operation statistics of the KEKB accelerator. (author)

  3. Statistical theory of electron densities

    International Nuclear Information System (INIS)

    Pratt, L.R.; Hoffman, G.G.; Harris, R.A.

    1988-01-01

    An optimized Thomas-Fermi theory is proposed which retains the simplicity of the original theory and is a suitable reference theory for Monte Carlo density functional treatments of condensed materials. The key ingredient of the optimized theory is a neighborhood sampled potential which contains effects of the inhomogeneities in the one-electron potential. In contrast to the traditional Thomas-Fermi approach, the optimized theory predicts a finite electron density in the vicinity of a nucleus. Consideration of the example of an ideal electron gas subject to a central Coulomb field indicates that implementation of the approach is straightforward. The optimized theory is found to fail completely when a classically forbidden region is approached. However, these circumstances are not of primary interest for calculations of interatomic forces. It is shown how the energy functional of the density may be constructed by integration of a generalized Hellmann-Feynman relation. This generalized Hellmann-Feynman relation proves to be equivalent to the variational principle of density functional quantum mechanics, and, therefore, the present density theory can be viewed as a variational consequence of the constructed energy functional.

  4. Statistical density modification using local pattern matching

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2003-01-01

    Statistical density modification can make use of local patterns of density found in protein structures to improve crystallographic phases. A method for improving crystallographic phases is presented that is based on the preferential occurrence of certain local patterns of electron density in macromolecular electron-density maps. The method focuses on the relationship between the value of electron density at a point in the map and the pattern of density surrounding this point. Patterns of density that can be superimposed by rotation about the central point are considered equivalent. Standard templates are created from experimental or model electron-density maps by clustering and averaging local patterns of electron density. The clustering is based on correlation coefficients after rotation to maximize the correlation. Experimental or model maps are also used to create histograms relating the value of electron density at the central point to the correlation coefficient of the density surrounding this point with each member of the set of standard patterns. These histograms are then used to estimate the electron density at each point in a new experimental electron-density map, using the pattern of electron density at points surrounding that point and the correlation coefficient of this density to each of the set of standard templates, again after rotation to maximize the correlation. The method is strengthened by excluding any information from the point in question from both the templates and the local pattern of density in the calculation. A function based on the origin of the Patterson function is used to remove information about the electron density at the point in question from nearby electron density. This allows an estimation of the electron density at each point in a map, using only information from other points in the process. The resulting estimates of electron density are shown to have errors that are nearly independent of the errors in the original map.
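    The core ideas of the record above — match the local pattern (with the centre excluded) against standard templates by correlation, then read the centre estimate off the matched template — can be illustrated with a drastically simplified, hypothetical 1D analogue. This sketch omits the rotational search and the histograms entirely: it uses two hand-made templates and a single stored centre value, so it is an illustration of the principle, not the published method.

```python
import random

def pearson(a, b):
    """Correlation coefficient between two equal-length patterns."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va > 0 and vb > 0 else 0.0

# hand-made 'standard templates': the pattern of the four neighbouring
# points (centre point excluded, as in the method) paired with a single
# typical centre value standing in for the full histograms
TEMPLATES = [([0.3, 0.8, 0.8, 0.3], 1.0),
             ([-0.3, -0.8, -0.8, -0.3], -1.0)]

def estimate_centre(neighbours):
    """Estimate the omitted centre density from the surrounding pattern
    via the best-correlated standard template."""
    best = max(TEMPLATES, key=lambda t: pearson(neighbours, t[0]))
    return best[1]

# synthetic 'map': a random string of peak/trough motifs plus noise
rng = random.Random(2)
signs = [rng.choice((1, -1)) for _ in range(200)]
motif = [0.3, 0.8, 1.0, 0.8, 0.3]
noisy = [s * v + rng.gauss(0.0, 0.2) for s in signs for v in motif]

hits = 0
for m, s in enumerate(signs):
    c = 5 * m + 2                          # index of this motif's centre
    neigh = noisy[c - 2:c] + noisy[c + 1:c + 3]
    hits += estimate_centre(neigh) == s * 1.0
print(hits / len(signs))                   # fraction of centres recovered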

  5. Statistical density of nuclear excited states

    Directory of Open Access Journals (Sweden)

    V. M. Kolomietz

    2015-10-01

    A semi-classical approximation is applied to the calculation of single-particle and statistical level densities in excited nuclei. Landau's conception of quasi-particles with nucleon effective mass m* < m is used. The approach provides a correct description of the continuum contribution to the level density for realistic finite-depth potentials. It is shown that the continuum states do not significantly affect the thermodynamic calculations at sufficiently small temperatures T ≤ 1 MeV but strongly reduce the results for the excitation energy at high temperatures. Using a standard Woods-Saxon potential and a nucleon effective mass m* = 0.7m, the A-dependence of the statistical level density parameter K was evaluated, in good qualitative agreement with experimental data.

  6. Quantum Statistical Operator and Classically Chaotic Hamiltonian ...

    African Journals Online (AJOL)

    Quantum Statistical Operator and Classically Chaotic Hamiltonian System. ... Journal of the Nigerian Association of Mathematical Physics ... In a Hamiltonian system von Neumann Statistical Operator is used to tease out the quantum consequence of (classical) chaos engendered by the nonlinear coupling of system to its ...

  7. Projection operator techniques in nonequilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Grabert, H.

    1982-01-01

    This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics, the Fokker-Planck and master equation approaches are described, together with response theory. Then, as applications, the damped harmonic oscillator, simple fluids, and spin relaxation are considered. (HSI)

  8. Density by Moduli and Lacunary Statistical Convergence

    Directory of Open Access Journals (Sweden)

    Vinod K. Bhardwaj

    2016-01-01

    We introduce and study a new concept of f-lacunary statistical convergence, where f is an unbounded modulus. It is shown that, under certain conditions on a modulus f, the concepts of lacunary strong convergence with respect to a modulus f and f-lacunary statistical convergence are equivalent on bounded sequences. We further characterize those θ for which S_θ^f = S^f, where S_θ^f and S^f denote the sets of all f-lacunary statistically convergent sequences and f-statistically convergent sequences, respectively. A general description of the inclusion between two arbitrary lacunary methods of f-statistical convergence is given. Finally, we give an S_θ^f-analogue of the Cauchy criterion for convergence, and a Tauberian theorem for S_θ^f-convergence is also proved.
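    The notion of f-density underlying these convergence concepts can be checked numerically. In the sketch below (an illustration, not code from the paper), the set of perfect squares has ordinary natural density 0, while with the unbounded modulus f(x) = log(1 + x) its f-density estimate approaches 1/2 — showing that density by moduli is genuinely more restrictive than natural density.

```python
import math

def f_density_estimate(indicator, n, f):
    """Finite-n estimate of the f-density of an index set A:
    f(#{k <= n : k in A}) / f(n), for an unbounded modulus f."""
    count = sum(1 for k in range(1, n + 1) if indicator(k))
    return f(count) / f(n)

def is_square(k):
    return math.isqrt(k) ** 2 == k

for n in (10**2, 10**4, 10**6):
    natural = f_density_estimate(is_square, n, lambda x: x)
    log_mod = f_density_estimate(is_square, n, lambda x: math.log(1 + x))
    print(n, natural, log_mod)   # natural density -> 0, log-modulus -> 1/2
```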

  9. High density operation in pulsator

    International Nuclear Information System (INIS)

    Klueber, O.; Cannici, B.; Engelhardt, W.; Gernhardt, J.; Glock, E.; Karger, F.; Lisitano, G.; Mayer, H.M.; Meisel, D.; Morandi, P.

    1976-03-01

    This report summarizes the results of experiments at high electron densities (>10^14 cm^-3) which have been achieved by pulsed gas inflow during the discharge. At these densities a regime is established which is characterized by β_p > 1, n_i ≈ n_e, T_i ≈ T_e and τ_E ∝ n_e. Thus the toroidal magnetic field contributes considerably to the plasma confinement and the ions constitute almost half of the plasma pressure. Furthermore, the confinement is appreciably improved and the plasma becomes impermeable to hot neutrals. (orig.)

  10. 14 CFR Section 19 - Uniform Classification of Operating Statistics

    Science.gov (United States)

    2010-01-01

    ... Statistics Section 19 Section 19 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION... AIR CARRIERS Operating Statistics Classifications Section 19 Uniform Classification of Operating Statistics ...

  11. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis

  12. Planar-channeling spatial density under statistical equilibrium

    International Nuclear Information System (INIS)

    Ellison, J.A.; Picraux, S.T.

    1978-01-01

    The phase-space density for planar channeled particles has been derived for the continuum model under statistical equilibrium. This is used to obtain the particle spatial probability density as a function of incident angle. The spatial density is shown to depend on only two parameters, a normalized incident angle and a normalized planar spacing. This normalization is used to obtain, by numerical calculation, a set of universal curves for the spatial density and also for the channeled-particle wavelength as a function of amplitude. Using these universal curves, the statistical-equilibrium spatial density and the channeled-particle wavelength can be easily obtained for any case for which the continuum model can be applied. Also, a new one-parameter analytic approximation to the spatial density is developed. This parabolic approximation is shown to give excellent agreement with the exact calculations

  13. Statistics of peaks in cosmological nonlinear density fields

    International Nuclear Information System (INIS)

    Suginohara, Tatsushi; Suto, Yasushi.

    1990-06-01

    Distribution of the high-density peaks in the universe is examined using N-body simulations. Nonlinear evolution of the underlying density field significantly changes the statistical properties of the peaks, compared with the analytic results valid for the random Gaussian field. In particular, the abundances and correlations of the initial density peaks are discussed in the context of biased galaxy formation theory. (author)

  14. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    International Nuclear Information System (INIS)

    Dai, Wu-Sheng; Xie, Mi

    2013-01-01

    In this paper, we give a general discussion of the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ^{-1}(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose-Einstein and Fermi-Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, which are determined by the deformation parameter q. This shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. Highlights: • A general discussion of calculating the statistical distribution from relations of creation, annihilation, and number operators. • A systematic study of the statistical distributions corresponding to various q-deformation schemes. • An argument that many results on q-deformation distributions in the literature are inaccurate or incomplete.
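    The Gentile distribution named in this abstract is easy to evaluate directly: for maximum occupation n, the mean occupation is a finite grand-canonical average over the allowed occupations 0..n. A minimal sketch (with x = β(ε − μ) assumed positive for the Bose-Einstein check); n = 1 reproduces Fermi-Dirac and large n approaches Bose-Einstein, as the abstract states:

```python
import math

def gentile_occupation(x, n):
    """Mean occupation number in Gentile statistics with maximum
    occupation n, where x = beta * (epsilon - mu) > 0.  Computed as a
    direct grand-canonical average over the allowed occupations 0..n."""
    weights = [math.exp(-k * x) for k in range(n + 1)]
    return sum(k * w for k, w in enumerate(weights)) / sum(weights)

x = 0.5
print(gentile_occupation(x, 1), 1.0 / (math.exp(x) + 1.0))    # Fermi-Dirac
print(gentile_occupation(x, 500), 1.0 / (math.exp(x) - 1.0))  # ~Bose-Einstein
```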

  15. Statistical operation of nuclear power plants

    International Nuclear Information System (INIS)

    Gauzit, Maurice; Wilmart, Yves

    1976-01-01

    A comparison of the statistical operating results of nuclear power stations as reported in the literature shows that the values given for availability and the load factor often differ considerably from each other. This may be due to different definitions of these terms, or even to poor translation from one language into another. A critical analysis of these terms is proposed, as well as the choice of a parameter giving a quantitative idea of the actual quality of the operation obtained. The second section gives, on a homogeneous basis and from the results supplied by 83 nuclear power stations now in operation, a statistical analysis of their operating results: in particular, the two light-water lines during 1975, as well as the evolution with age of the units and the starting conditions of the units during their first two operating years. The test values thus obtained are also compared to those taken a priori as hypotheses in some economic studies.

  16. Projected evolution superoperators and the density operator

    International Nuclear Information System (INIS)

    Turner, R.E.; Dahler, J.S.; Snider, R.F.

    1982-01-01

    The projection operator method of Zwanzig and Feshbach is used to construct the time dependent density operator associated with a binary scattering event. The formula developed to describe this time dependence involves time-ordered cosine and sine projected evolution (memory) superoperators. Both Schroedinger and interaction picture results are presented. The former is used to demonstrate the equivalence of the time dependent solution of the von Neumann equation and the more familiar frequency dependent Laplace transform solution. For two particular classes of projection superoperators projected density operators are shown to be equivalent to projected wave functions. Except for these two special cases, no projected wave function analogs of projected density operators exist. Along with the decoupled-motions approximation, projected interaction picture density operators are applied to inelastic scattering events. Simple illustrations are provided of how this formalism is related to previously established results for two-state processes, namely, the theory of resonant transfer events, the first order Magnus approximation, and the Landau-Zener theory

  17. Experimental investigation of statistical density function of decaying radioactive sources

    International Nuclear Information System (INIS)

    Salma, I.; Zemplen-Papp, E.

    1991-01-01

    The validity of the Poisson and the modified Poisson statistical density functions for observing k events in a short time interval is investigated experimentally in radioactive decay detection for various measuring times. The experiments to measure radioactive decay were performed with 89mY using a multichannel analyzer. According to the results, Poisson statistics adequately describe the counting experiment for short measuring times. (author) 13 refs.; 4 figs
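    The Poisson character of such counting experiments can be mimicked with a toy simulation (illustrative only, not the authors' setup): for a steady source with exponential waiting times between decays, the counts in fixed intervals should have variance equal to their mean, which is the basic Poisson signature a counting experiment tests.

```python
import random

def interval_counts(rate=8.0, t_interval=1.0, n_intervals=5000, seed=7):
    """Counts of decays per fixed time interval for a steady source,
    modelled as a Poisson process with exponential waiting times."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_intervals):
        t, k = 0.0, 0
        while True:
            t += rng.expovariate(rate)
            if t > t_interval:
                break
            k += 1
        counts.append(k)
    return counts

counts = interval_counts()
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(mean, var)   # for Poisson statistics the two nearly coincide
```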

  18. Local Finite Density Theory, Statistical Blocking and Color Superconductivity

    OpenAIRE

    Ying, S.

    2000-01-01

    The motivation for the development of a local finite density theory is discussed. One of the problems related to an instability in the baryon number fluctuation of the chiral symmetry breaking phase of the quark system in the local theory is shown to exist. Such an instability problem is removed by taking into account statistical blocking effects for the quark propagator, which depend on a macroscopic statistical blocking parameter ε. This new framework is then applied to...

  19. Nonequilibrium statistical operator in hot-electron transport theory

    International Nuclear Information System (INIS)

    Xing, D.Y.; Liu, M.

    1991-09-01

    The Nonequilibrium Statistical Operator method developed by Zubarev is generalized and applied to the study of hot-electron transport in semiconductors. The steady-state balance equations for momentum and energy are derived to the lowest order in the electron-lattice coupling. We show that the derived balance equations are exactly the same as those obtained by Lei and Ting. This equivalence stems from the fact that to the linear order in the electron-lattice coupling, two statistical density matrices have identical effect when they are used to calculate the average value of a dynamical operator. The application to the steady-state and transient hot-electron transport in multivalley semiconductors is also discussed. (author). 28 refs, 1 fig

  20. Wind power statistics and an evaluation of wind energy density

    Energy Technology Data Exchange (ETDEWEB)

    Jamil, M.; Parsa, S.; Majidi, M. [Materials and Energy Research Centre, Tehran (Iran, Islamic Republic of)

    1995-11-01

    In this paper the statistical data of fifty days' wind speed measurements at the MERC solar site are used to find the wind energy density and other wind characteristics with the help of the Weibull probability distribution function. It is emphasized that the Weibull and Rayleigh probability functions are useful tools for wind energy density estimation but are not quite appropriate for properly fitting actual wind data with low mean speeds and short-time records. One has to use either the actual wind data (histogram) or look for a better fit with other models of the probability function. (Author)
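    For a Weibull speed distribution with shape k and scale c, the mean wind power density has the closed form E = ½ρc³Γ(1 + 3/k). A minimal sketch (the values of ρ, k and c are illustrative, not the MERC data):

```python
import math

def weibull_energy_density(k, c, rho=1.225):
    """Mean wind power density (W/m^2) for Weibull-distributed speeds
    with shape k and scale c (m/s): 0.5 * rho * c**3 * Gamma(1 + 3/k).
    rho is the air density in kg/m^3."""
    return 0.5 * rho * c**3 * math.gamma(1.0 + 3.0 / k)

# Rayleigh winds are the k = 2 special case
print(weibull_energy_density(2.0, 6.0))
```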

  1. Statistical mechanics of low-density parity-check codes

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 2268502 (Japan); Saad, David [Neural Computing Research Group, Aston University, Birmingham B4 7ET (United Kingdom)

    2004-02-13

    We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)
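    The decoding side of parity-check codes can be illustrated with a toy example. The sketch below uses the tiny [7,4] Hamming code rather than a genuine sparse LDPC ensemble, but the decoding step is the same Gallager-style bit-flipping rule the LDPC literature builds on: flip the bit that sits in the largest number of unsatisfied parity checks.

```python
# parity-check matrix H for the [7,4] Hamming code: column j is the
# binary representation of j + 1
H = [[(c >> r) & 1 for c in range(1, 8)] for r in range(3)]

def syndrome(H, word):
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def bit_flip_decode(H, word, max_iters=10):
    """Gallager-style bit flipping: while any parity check fails, flip
    the bit participating in the most unsatisfied checks."""
    word = list(word)
    for _ in range(max_iters):
        s = syndrome(H, word)
        if not any(s):
            break
        unsat = [sum(si * row[j] for si, row in zip(s, H))
                 for j in range(len(word))]
        worst = max(range(len(word)), key=lambda j: unsat[j])
        word[worst] ^= 1
    return word

received = [0] * 7        # the all-zero word is always a codeword
received[4] ^= 1          # introduce a single bit error
print(bit_flip_decode(H, received))
```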

  2. Statistical mechanics of low-density parity-check codes

    International Nuclear Information System (INIS)

    Kabashima, Yoshiyuki; Saad, David

    2004-01-01

    We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multi-spin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems. (topical review)

  3. Statistical study of density fluctuations in the tore supra tokamak

    International Nuclear Information System (INIS)

    Devynck, P.; Fenzi, C.; Garbet, X.; Laviron, C.

    1998-03-01

    It is believed that radial anomalous transport in tokamaks is caused by plasma turbulence. Using an infrared laser scattering technique on the Tore Supra tokamak, statistical properties of the density fluctuations are studied as a function of scale in ohmic as well as additional heating regimes using lower hybrid or ion cyclotron frequencies. The probability distributions are compared to a Gaussian in order to estimate the role of intermittency, which is found to be negligible. The temporal behaviour of the three-dimensional spectrum is thoroughly discussed; its multifractal character is reflected in the singularity spectrum. The autocorrelation coefficients are also examined, as well as their long-time incoherence and statistical independence. We also put forward the existence of fluctuation transfer between two distinct but close wavenumbers. A rather clearer image is thus obtained of the way energy is transferred through the turbulent scales. (author)

  4. Nonequilibrium statistical Zubarev's operator and Green's functions for an inhomogeneous electron gas

    Directory of Open Access Journals (Sweden)

    P.Kostrobii

    2006-01-01

    Nonequilibrium properties of an inhomogeneous electron gas are studied using the method of the nonequilibrium statistical operator of D.N. Zubarev. Generalized transport equations for the mean values of inhomogeneous operators of the electron number density, momentum density, and total energy density for weakly and strongly nonequilibrium states are obtained. We derive a chain of equations for the Green's functions, which connects commutative time-dependent Green's functions "density-density", "momentum-momentum", and "enthalpy-enthalpy" with reduced Green's functions of the generalized transport coefficients and with Green's functions for higher-order memory kernels in the case of a weakly nonequilibrium, spatially inhomogeneous electron gas.

  5. Optimizing refiner operation with statistical modelling

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, G [Noranda Research Centre, Pointe Claire, PQ (Canada)

    1997-02-01

    The impact of refining conditions on the energy efficiency of the process and on the handsheet quality of a chemi-mechanical pulp was studied as part of a series of pilot-scale refining trials. Statistical models of refiner performance were constructed from these results and a non-linear optimization of process conditions was conducted. Optimization results indicated that increasing the ratio of specific energy applied in the first stage led to a reduction of some 15 per cent in the total energy requirement. The strategy can also be used to obtain significant increases in pulp quality for a given energy input. 20 refs., 6 tabs.

  6. Statistical inference of level densities from resolved resonance parameters

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1983-08-01

    Level densities are most directly obtained by counting the resonances observed in the resolved resonance range. Even in the best measurements, however, weak levels are invariably missed, so that one has to estimate their number and add it to the raw count. The main categories of missing-level estimators are discussed in the present review, viz. (I) ladder methods, including those based on the theory of Hamiltonian matrix ensembles (Dyson-Mehta statistics); (II) methods based on comparison with artificial cross-section curves (Monte Carlo simulation, Garrison's autocorrelation method); (III) methods exploiting the observed neutron width distribution by means of Bayesian or more approximate procedures such as maximum-likelihood, least-squares or moment methods, with various recipes for the treatment of detection thresholds and resolution effects. The language of mathematical statistics is employed to clarify the basis of, and the relationships between, the various techniques. Recent progress in the treatment of resolution effects, detection thresholds and p-wave admixture is described. (orig.)
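    Category (III) can be sketched concretely: if reduced neutron widths follow the Porter-Thomas distribution (chi-squared with one degree of freedom), the fraction of levels lost below a detection threshold t (in units of the mean width) is erf(√(t/2)), and the raw count can be corrected accordingly. A hypothetical illustration on a synthetic ladder, assuming the mean width is known rather than estimated:

```python
import math
import random

def corrected_count(n_observed, threshold_over_mean):
    """Correct a resonance count for levels missed below a detection
    threshold, assuming Porter-Thomas (chi-squared, one degree of
    freedom) reduced neutron widths.  threshold_over_mean is the
    threshold width divided by the average width."""
    p_missed = math.erf(math.sqrt(threshold_over_mean / 2.0))
    return n_observed / (1.0 - p_missed)

# synthetic ladder: true mean width 1, detection threshold 0.1
rng = random.Random(11)
true_n = 20000
widths = [rng.gauss(0.0, 1.0) ** 2 for _ in range(true_n)]
n_obs = sum(1 for w in widths if w > 0.1)
print(n_obs, round(corrected_count(n_obs, 0.1)))   # raw vs corrected count
```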

  7. Completely contained and remotely operated digital density meter

    International Nuclear Information System (INIS)

    Goergen, C.R.

    1979-10-01

    A completely contained and remotely operated density determination system having unique features was designed, fabricated, and installed at the Savannah River Plant. The system, based on a Mettler calculating digital density meter, provides more precise and accurate results than the falling drop technique for measuring densities. The system is fast, simple, easy to operate, and has demonstrated both reliability and durability

  8. Statistical mechanics of high-density bond percolation

    Science.gov (United States)

    Timonin, P. N.

    2018-05-01

    High-density (HD) percolation describes the percolation of specific κ-clusters, which are compact sets of sites each connected to at least κ nearest filled sites. It takes place in the classical patterns of independently distributed sites or bonds in which the ordinary percolation transition also exists. Hence, the study of the series of κ-type HD percolations amounts to a description of the structure of classical clusters, for which κ-clusters constitute κ-cores nested one into another. Such data are needed for the description of a number of physical, biological, and information properties of complex systems on random lattices, graphs, and networks, ranging from the magnetic properties of semiconductor alloys to anomalies in supercooled water and clustering in biological and social networks. Here we present a statistical mechanics approach to studying HD bond percolation on an arbitrary graph. It is shown that the generating function for the κ-clusters' size distribution can be obtained from the partition function of a specific q-state Potts-Ising model in the q → 1 limit. Using this approach we find exact κ-clusters' size distributions for the Bethe lattice and the Erdos-Renyi graph. The application of the method to Euclidean lattices is also discussed.
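    The κ-core construction at the heart of HD percolation can be computed by iterative pruning: repeatedly delete vertices with fewer than κ remaining neighbours until none are left. A small stand-alone sketch on an Erdos-Renyi graph (the graph size and mean degree are illustrative, not taken from the paper); the resulting cores are nested one into another, as the abstract describes:

```python
import random
from collections import defaultdict

def k_core(adj, kappa):
    """Iteratively strip vertices with fewer than kappa remaining
    neighbours; the survivors form the kappa-core."""
    deg = {v: len(ns) for v, ns in adj.items()}
    queue = [v for v, d in deg.items() if d < kappa]
    removed = set()
    while queue:
        v = queue.pop()
        if v in removed:
            continue
        removed.add(v)
        for u in adj[v]:
            if u not in removed:
                deg[u] -= 1
                if deg[u] < kappa:
                    queue.append(u)
    return {v for v in adj if v not in removed}

def erdos_renyi(n, mean_degree, seed=3):
    rng = random.Random(seed)
    p = mean_degree / (n - 1)
    adj = defaultdict(set)
    for i in range(n):
        adj[i]  # make sure isolated vertices appear in the adjacency map
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

g = erdos_renyi(2000, 6.0)
for kappa in (2, 3, 4):
    print(kappa, len(k_core(g, kappa)))   # nested, shrinking cores
```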

  9. The statistics of maxima in primordial density perturbations

    International Nuclear Information System (INIS)

    Peacock, J.A.; Heavens, A.F.

    1985-01-01

    An investigation has been made of the hypothesis that protogalaxies/protoclusters form at the sites of maxima in a primordial field of normally distributed density perturbations. Using a mixture of analytic and numerical techniques, the properties of the maxima have been studied. The results provide a natural mechanism for biased galaxy formation in which galaxies do not necessarily follow the large-scale density. Methods for obtaining the true autocorrelation function of the density field and implications for Microwave Background studies are discussed. (author)

  10. Infinite statistics and the SU(1, 1) phase operator

    International Nuclear Information System (INIS)

    Gerry, Christopher C

    2005-01-01

    A few years ago, Agarwal (1991 Phys. Rev. A 44 8398) showed that the Susskind-Glogower phase operators, expressible in terms of Bose operators, provide a realization of the algebra for particles obeying infinite statistics. In this paper we show that the SU(1, 1) phase operators, constructed in terms of the elements of the su(1, 1) Lie algebra, also provide a realization of the algebra for infinite statistics. There are many realizations of the su(1, 1) algebra in terms of single- or multimode Bose operators, three of which are discussed along with their corresponding phase states. The Susskind-Glogower phase operator is a special case of the SU(1, 1) phase operator associated with the Holstein-Primakoff realization of su(1, 1). (letter to the editor)

  11. Wigner Function of Density Operator for Negative Binomial Distribution

    International Nuclear Information System (INIS)

    Xu Xinglei; Li Hongqi

    2008-01-01

    By using the technique of integration within an ordered product (IWOP) of operators, we derive the Wigner function of the density operator for the negative binomial distribution of the radiation field in the mixed-state case. We then derive the Wigner function of the squeezed number state, which yields the negative binomial distribution, by virtue of the entangled state representation and the entangled Wigner operator.

  12. Nonequilibrium Statistical Operator Method and Generalized Kinetic Equations

    Science.gov (United States)

    Kuzemsky, A. L.

    2018-01-01

    We consider some principal problems of nonequilibrium statistical thermodynamics in the framework of the Zubarev nonequilibrium statistical operator approach. We present a brief comparative analysis of some approaches to describing irreversible processes based on the concept of nonequilibrium Gibbs ensembles and their applicability to describing nonequilibrium processes. We discuss the derivation of generalized kinetic equations for a system in a heat bath. We obtain and analyze a damped Schrödinger-type equation for a dynamical system in a heat bath. We study the dynamical behavior of a particle in a medium taking the dissipation effects into account. We consider the scattering problem for neutrons in a nonequilibrium medium and derive a generalized Van Hove formula. We show that the nonequilibrium statistical operator method is an effective, convenient tool for describing irreversible processes in condensed matter.

  13. High density operation on the HT-7 superconducting tokamak

    International Nuclear Information System (INIS)

    Xiang Gao

    2000-01-01

    The structure of the operation region has been studied in the HT-7 superconducting tokamak, and progress on the extension of the HT-7 ohmic discharge operation region is reported. A density corresponding to 1.2 times the Greenwald limit was achieved after RF boronization. The density limit appears to be connected to the impurity content and the edge parameters, so the best results are obtained with very clean plasmas and peaked electron density profiles. The peaking factors of the electron density profiles for different currents and line-averaged densities were observed. The density behaviour and the fuelling efficiency for gas puffing (20-30%), pellet injection (70-80%) and molecular beam injection (40-50%) were studied. The core crash sawteeth and MHD behaviour induced by an injected pellet were observed, and these events correlated with changes of the current profile and reversed magnetic shear. The MARFE phenomena on HT-7 are summarized. The best correlation has been found between the total input ohmic power and the product of the edge line-averaged density and Z_eff. HT-7 could easily be operated MARFE-free in the high-density region using RF boronization. (author)

  14. Statistical analysis of first period of operation of FTU Tokamak

    International Nuclear Information System (INIS)

    Crisanti, F.; Apruzzese, G.; Frigione, D.; Kroegler, H.; Lovisetto, L.; Mazzitelli, G.; Podda, S.

    1996-09-01

    On the FTU Tokamak, plasma physics operations started on 20/4/90. The first plasma had a plasma current Ip = 0.75 MA for about a second. The experimental phase lasted until 7/7/94, when a long shut-down began for installing the toroidal limiter on the inner side of the vacuum vessel. In these four years of operation, plasma experiments were successfully exploited, e.g. experiments of single and multiple pellet injection; full current drive up to Ip = 300 kA was obtained by using waves at the Lower Hybrid frequency; and the analysis of ohmic plasma parameters with different materials (from low-Z silicon to high-Z tungsten) as the plasma facing element was performed. In this work a statistical analysis of the full period of operation is presented. Moreover, a comparison with statistical data from other Tokamaks is attempted

  15. Zubarev's Nonequilibrium Statistical Operator Method in the Generalized Statistics of Multiparticle Systems

    Science.gov (United States)

    Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.

    2018-01-01

    We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.

  16. Operation and control of high density tokamak reactors

    International Nuclear Information System (INIS)

    Attenberger, S.E.; McAlees, D.G.

    1976-01-01

    The incentive for high density operation of a tokamak reactor is discussed. The plasma size required to attain ignition is determined. Ignition is found to be possible in a relatively small system provided other design criteria are met. These criteria are described and the technology developments and operating procedures required by them are outlined. The parameters for such a system and its dynamic behavior during the operating cycle are also discussed

  17. Securing co-operation from persons supplying statistical data

    Science.gov (United States)

    Aubenque, M. J.; Blaikley, R. M.; Harris, F. Fraser; Lal, R. B.; Neurdenburg, M. G.; Hernández, R. de Shelly

    1954-01-01

    Securing the co-operation of persons supplying information required for medical statistics is essentially a problem in human relations, and an understanding of the motivations, attitudes, and behaviour of the respondents is necessary. Before any new statistical survey is undertaken, it is suggested by Aubenque and Harris that a preliminary review be made so that the maximum use is made of existing information. Care should also be taken not to burden respondents with an overloaded questionnaire. Aubenque and Harris recommend simplified reporting. Complete population coverage is not necessary. Neurdenburg suggests that the co-operation and support of such organizations as medical associations and social security boards are important and that propaganda should be directed specifically to the groups whose co-operation is sought. Informal personal contacts are valuable and desirable, according to Blaikley, but may have adverse effects if the right kind of approach is not made. Financial payments as an incentive in securing co-operation are opposed by Neurdenburg, who proposes that only postage-free envelopes or similar small favours be granted. Blaikley and Harris, on the other hand, express the view that financial incentives may do much to gain the support of those required to furnish data; there are, however, other incentives, and full use should be made of the natural inclinations of respondents. Compulsion may be necessary in certain instances, but administrative rather than statutory measures should be adopted. Penalties, according to Aubenque, should be inflicted only when justified by imperative health requirements. The results of surveys should be made available as soon as possible to those who co-operated, and Aubenque and Harris point out that they should also be of practical value to the suppliers of the information. Greater co-operation can be secured from medical persons who have an understanding of the statistical principles involved; Aubenque and

  18. A torque-measuring micromotor provides operator independent measurements marking four different density areas in maxillae.

    Science.gov (United States)

    Di Stefano, Danilo Alessio; Arosio, Paolo; Piattelli, Adriano; Perrotti, Vittoria; Iezzi, Giovanna

    2015-02-01

    Bone density at the implant placement site is a key factor in obtaining primary stability of the fixture, which, in turn, is a prognostic factor for osseointegration and long-term success of an implant-supported rehabilitation. Recently, an implant motor with a bone density measurement probe has been introduced. The aim of the present study was to test the objectiveness of the bone densities registered by the implant motor regardless of the operator performing the measurements. A total of 3704 bone density measurements, performed by means of the implant motor, were registered by 39 operators at different implant sites during routine activity. Bone density measurements were grouped according to their distribution across the jaws. Specifically, four different areas were distinguished: a pre-antral (between teeth from the first right maxillary premolar to the first left maxillary premolar) and a sub-antral (more distal) zone in the maxilla, and an interforaminal (between and including teeth from the first left mandibular premolar to the first right mandibular premolar) and a retroforaminal (more distal) zone in the mandible. A statistical comparison was performed to check the inter-operator variability of the collected data. The device produced consistent and operator-independent bone density values at each tooth position, showing a reliable bone-density measurement. The implant motor proved to be a helpful tool to properly plan implant placement and loading irrespective of the operator using it.
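    An inter-operator comparison of the kind described in this record can be sketched with a one-way ANOVA, one common test of whether readings differ systematically by operator. This is a hedged illustration, not the study's actual analysis; the operator groups, sample sizes, and density values below are entirely made up.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical bone-density readings (arbitrary units) from three operators
# measuring the same anatomical zone; all values are illustrative only.
readings_by_operator = [
    rng.normal(loc=45.0, scale=6.0, size=30),  # operator A
    rng.normal(loc=45.0, scale=6.0, size=30),  # operator B
    rng.normal(loc=45.0, scale=6.0, size=30),  # operator C
]

# One-way ANOVA: a large p-value is consistent with operator-independent readings.
f_stat, p_value = stats.f_oneway(*readings_by_operator)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```

A significant p-value would instead indicate that at least one operator's readings differ, i.e. the measurement is operator-dependent.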

  19. Spectral density and a family of Dirac operators

    International Nuclear Information System (INIS)

    Niemi, A.J.

    1985-01-01

    The spectral density for a class of Dirac operators is investigated by relating its even and odd parts to the Riemann zeta-function and to the eta-invariant of Atiyah, Patodi and Singer. Asymptotic expansions are studied, and a 'hidden' supersymmetry is revealed and used to relate the Dirac operator to a supersymmetric quantum mechanics. A general method for the computation of the odd spectral density is developed, and various applications are discussed. In particular, the connection to the fermion number and a relation between the odd spectral density and some ratios of Jost functions and relative phase shifts are pointed out. Chiral symmetry breaking is investigated using methods analogous to those applied in the investigation of the fermion number, and related to supersymmetry breaking in the corresponding quantum mechanical model. (orig.)

  20. Statistical factors affecting the success of nuclear operations

    International Nuclear Information System (INIS)

    Sunder, S.; Stephenson, J.R.; Hochman, D.

    1999-01-01

    In this article, the authors present a statistical analysis to determine the operational, financial, technical, and managerial factors that most significantly affect the success of nuclear operations. The study analyzes data for over 70 nuclear plants and 40 operating companies over a period of five years in order to draw conclusions that they hope will be of interest to utility companies and public utility commissions as they seek ways to improve rates of success in nuclear operations. Some of these conclusions will not be surprising--for example, that older plants have heavier maintenance requirements--but others are less intuitive. For instance, the observation that operators of fewer plants have lower costs suggests that any experience-curve benefits associated with managing multiple nuclear facilities are overshadowed by the logistic problems of multiple facilities. After presenting a brief history of nuclear power in America, the authors outline the motivations of the study and the methodology of their analysis. They end the article with the results of the study and discuss some of the managerial implications of these findings

  1. Particle-hole state densities for statistical multi-step compound reactions

    International Nuclear Information System (INIS)

    Oblozinsky, P.

    1986-01-01

    An analytical relation is derived for the density of particle-hole bound states applying the equidistant-spacing approximation and the Darwin-Fowler statistical method. The Pauli exclusion principle as well as the finite depth of the potential well are taken into account. The set of densities needed for calculations of multi-step compound reactions is completed by deriving the densities of accessible final states for escape and damping. (orig.)

  2. Nuclear Level Densities for Modeling Nuclear Reactions: An Efficient Approach Using Statistical Spectroscopy

    International Nuclear Information System (INIS)

    Calvin W. Johnson

    2005-01-01

    The general goal of the project is to develop and implement computer codes and input files to compute nuclear densities of states. Such densities are important input into calculations of statistical neutron capture and are difficult to access experimentally. In particular, we focus on calculating densities for nuclides in the mass range A ∼ 50-100. We use statistical spectroscopy, a moments method based upon a microscopic framework, the interacting shell model. Second year goals and milestones: Develop two or three competing interactions (based upon surface-delta, Gogny, and NN-scattering) suitable for application to nuclei up to A = 100. Begin calculations for nuclides with A = 50-70

  3. Density limit in FTU tokamak during Ohmic operation

    International Nuclear Information System (INIS)

    Frigione, D.; Pieroni, L.

    1993-01-01

    The understanding of the physical mechanisms that regulate the density limit in a Tokamak is very important in view of a future fusion reactor. On one hand, density enters as a factor in the figure of merit needed to achieve a burning plasma; on the other hand, a high edge density is a prerequisite for avoiding excessive erosion of the first wall and for limiting the impurity influx into the hot plasma core. Furthermore, a reactor should work in a safe zone of the operating parameters in order to avoid disruptive instabilities. The density limit problem has been tackled since the 1970s, but so far a unique physical picture has not emerged. In the last few years, owing to the availability of better diagnostics, especially for the plasma edge, the use of pellet injectors to fuel the plasma, and the experience gained on many different Tokamaks, a consensus has been reached on the edge density as the real parameter responsible for the density limit. There are still two main mechanisms invoked to explain this limit: one refers to the power balance between the heat conducted and/or convected across the plasma radius and the power lost by impurity line radiation at the edge. When the latter overcomes the former, shrinking of the current channel occurs, which leads to instabilities due to tearing modes (usually the m/n=2/1) and then to disruption. The other explanation, at present valid for divertor machines, is based on the particle and energy balance in the scrape-off layer (SOL). The limit in the edge density is then associated with the thermal collapse of the divertor plasma. In this work we describe the experiments on the density limit in FTU with Ohmic heating, explain why we also believe that the limit is on the edge density, and discuss its relation to a simple model based on the SOL power balance valid for a limiter Tokamak. (author) 7 refs., 4 figs

  4. Operation and control of high density tokamak reactors

    International Nuclear Information System (INIS)

    Attenberger, S.E.; McAlees, D.G.

    1976-01-01

    The incentive for high density operation of a tokamak reactor was discussed. It is found that high density permits ignition in a relatively small, moderately elongated plasma with a moderate magnetic field strength. Under these conditions, neutron wall loadings of approximately 4 MW/m² must be tolerated. The sensitivity analysis with respect to impurity effects shows that impurity control will most likely be necessary to achieve the desired plasma conditions. The charge-exchange sputtered impurities are found to have an important effect, so that maintaining a low neutral density in the plasma is critical. If it is assumed that neutral beams will be used to heat the plasma to ignition, high energy injection is required (approximately 250 keV) when heating is accomplished at full density. A scenario is outlined where the ignition temperature is established at low density and then the fueling rate is increased to attain ignition. This approach may permit beams with energies being developed for use in TFTR to be successfully used to heat a high density device of the type described here to ignition

  5. Common approximations for density operators may lead to imaginary entropy

    International Nuclear Information System (INIS)

    Lendi, K.; Amaral Junior, M.R. do

    1983-01-01

    The meaning and validity of usual second order approximations for density operators are illustrated with the help of a simple exactly soluble two-level model in which all relevant quantities can easily be controlled. This leads to exact upper bound error estimates which help to select more precisely permissible correlation times as frequently introduced if stochastic potentials are present. A final consideration of information entropy reveals clearly the limitations of this kind of approximation procedures. (Author) [pt

  6. Statistics for demodulation RFI in inverting operational amplifier circuits

    Science.gov (United States)

    Sutu, Y.-H.; Whalen, J. J.

    An investigation was conducted with the objective of determining statistical variations of RFI demodulation responses in operational amplifier (op amp) circuits. Attention is given to the experimental procedures employed, a three-stage op amp LED experiment, NCAP (Nonlinear Circuit Analysis Program) simulations of demodulation RFI in 741 op amps, and a comparison of RFI in four op amp types. Three major recommendations for future investigations are presented on the basis of the obtained results. One concerns additional measurements of demodulation RFI in inverting amplifiers; another suggests the employment of an automatic measurement system. It is also proposed to conduct additional NCAP simulations in which parasitic effects are accounted for more thoroughly.

  7. Probability density cloud as a geometrical tool to describe statistics of scattered light.

    Science.gov (United States)

    Yaitskova, Natalia

    2017-04-01

    First-order statistics of scattered light is described using the representation of the probability density cloud, which visualizes a two-dimensional distribution for complex amplitude. The geometric parameters of the cloud are studied in detail and are connected to the statistical properties of phase. The moment-generating function for intensity is obtained in a closed form through these parameters. An example of exponentially modified normal distribution is provided to illustrate the functioning of this geometrical approach.

  8. Statistical algorithm for automated signature analysis of power spectral density data

    International Nuclear Information System (INIS)

    Piety, K.R.

    1977-01-01

    A statistical algorithm has been developed and implemented on a minicomputer system for on-line, surveillance applications. Power spectral density (PSD) measurements on process signals are the performance signatures that characterize the ''health'' of the monitored equipment. Statistical methods provide a quantitative basis for automating the detection of anomalous conditions. The surveillance algorithm has been tested on signals from neutron sensors, proximeter probes, and accelerometers to determine its potential for monitoring nuclear reactors and rotating machinery
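    A minimal sketch of such a PSD-signature surveillance scheme is given below, assuming Welch PSD estimates and a simple sigma-band threshold over a baseline of "healthy" records. The sampling rate, segment length, threshold, and signal model are illustrative assumptions, not the algorithm of the record itself.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs = 1000.0  # sampling rate in Hz (illustrative)

def psd_signature(x):
    """Welch estimate of the power spectral density of one record."""
    freqs, pxx = signal.welch(x, fs=fs, nperseg=256)
    return freqs, pxx

# Build a baseline signature from records taken during normal operation.
baseline = np.array([psd_signature(rng.normal(size=4096))[1] for _ in range(20)])
mean_psd, std_psd = baseline.mean(axis=0), baseline.std(axis=0)

def is_anomalous(x, n_sigma=5.0):
    # Flag the record if its PSD leaves the baseline band in any frequency bin.
    _, pxx = psd_signature(x)
    return bool(np.any(np.abs(pxx - mean_psd) > n_sigma * std_psd))

print(is_anomalous(rng.normal(size=4096)))             # ordinary noise record
tone = np.sin(2 * np.pi * 100 * np.arange(4096) / fs)
print(is_anomalous(rng.normal(size=4096) + 5 * tone))  # strong 100 Hz component
```

A record whose spectrum grows a narrowband component (e.g. a new machinery resonance) is flagged, while records statistically consistent with the baseline are not.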

  9. Density scaling and quasiuniversality of flow-event statistics for athermal plastic flows

    DEFF Research Database (Denmark)

    Lerner, Edan; Bailey, Nicholas; Dyre, J. C.

    2014-01-01

    Athermal steady-state plastic flows were simulated for the Kob-Andersen binary Lennard-Jones system and its repulsive version, in which the sign of the attractive terms is changed to a plus. Properties evaluated include the distributions of energy drops, stress drops, and strain intervals between the flow events. We show that simulations at a single density, in conjunction with an equilibrium-liquid simulation at the same density, allow one to predict the plastic flow-event statistics at other densities. This is done by applying the recently established "hidden scale invariance" of simple liquids.

  10. Operation of a semiconductor opening switch at ultrahigh current densities

    International Nuclear Information System (INIS)

    Lyubutin, S. K.; Rukin, S. N.; Slovikovsky, B. G.; Tsyranov, S. N.

    2012-01-01

    The operation of a semiconductor opening switch (SOS diode) at cutoff current densities of tens of kA/cm² is studied. In experiments, the maximum reverse current density reached 43 kA/cm² for ∼40 ns. Experimental data on SOS diodes with a p⁺-p-n-n⁺ structure and a p-n junction depth from 145 to 180 μm are presented. The dynamics of electron-hole plasma in the diode at the pumping and current cutoff stages is studied by numerical simulation methods. It is shown that current cutoff is associated with the formation of an electric field region in a thin (∼45 μm) layer of the structure's heavily doped p-region, in which the acceptor concentration exceeds 10¹⁶ cm⁻³, and that the current cutoff process depends weakly on the p-n junction depth.

  11. Statistical properties of kinetic and total energy densities in reverberant spaces

    DEFF Research Database (Denmark)

    Jacobsen, Finn; Molares, Alfonso Rodriguez

    2010-01-01

    Many acoustical measurements, e.g., measurement of sound power and transmission loss, rely on determining the total sound energy in a reverberation room. The total energy is usually approximated by measuring the mean-square pressure (i.e., the potential energy density) at a number of discrete positions. The idea of measuring the total energy density instead of the potential energy density, on the assumption that the former quantity varies less with position than the latter, goes back to the 1930s. However, the phenomenon was not analyzed until the late 1970s, and then only for the region of high frequencies. With the advent of a three-dimensional particle velocity transducer, it has become somewhat easier to measure total rather than only potential energy density in a sound field. This paper examines the ensemble statistics of kinetic and total sound energy densities in reverberant enclosures theoretically.

  12. Lorentz-covariant reduced-density-operator theory for relativistic-quantum-information processing

    International Nuclear Information System (INIS)

    Ahn, Doyeol; Lee, Hyuk-jae; Hwang, Sung Woo

    2003-01-01

    In this paper, we derive a Lorentz-covariant quantum Liouville equation for the density operator which describes relativistic-quantum-information processing, starting from the Tomonaga-Schwinger equation, and an exact formal solution for the reduced density operator is obtained using the projection operator technique and the functional calculus. When all the members of the family of hypersurfaces become flat hyperplanes, it is shown that our results agree with those of the nonrelativistic case, which is valid only in some specified reference frame. To show that our formulation can be applied to practical problems, we derive the polarization of the vacuum in quantum electrodynamics up to second order. The formulation presented in this work is general and could be applied to related fields such as quantum electrodynamics and relativistic statistical mechanics

  13. Definition and density operator for unpolarized fermion state

    International Nuclear Information System (INIS)

    Prakash, H.

    1981-04-01

    The unpolarized state of fermions is defined as one which does not change under rotations in the spin space. It is shown that, for a fermion field with a specified value of the particle momentum, the density operator is of the form ρ = (1-2a-b)|0,0⟩⟨0,0| + a(|1,0⟩⟨1,0| + |0,1⟩⟨0,1|) + b|1,1⟩⟨1,1|, where |n₁, n₂⟩ is the occupation number state having occupancies n₁ and n₂ in the two spin modes, and a and b are positive quantities which are less than one and satisfy 1-2a-b ≥ 0. (author)

  14. Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function

    Science.gov (United States)

    Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.

    2011-01-01

    In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.

  15. A look at the links between drainage density and flood statistics

    Directory of Open Access Journals (Sweden)

    A. Montanari

    2009-07-01

    We investigate the links between the drainage density of a river basin and selected flood statistics, namely, the mean, standard deviation, coefficient of variation and coefficient of skewness of annual maximum series of peak flows. The investigation is carried out through a three-stage analysis. First, a numerical simulation is performed by using a spatially distributed hydrological model in order to highlight how flood statistics change with varying drainage density. Second, a conceptual hydrological model is used in order to analytically derive the dependence of flood statistics on drainage density. Third, real world data from 44 watersheds located in northern Italy were analysed. The three-level analysis seems to suggest that a critical value of the drainage density exists for which a minimum is attained in both the coefficient of variation and the absolute value of the skewness coefficient. Such minima in the flood statistics correspond to a minimum of the flood quantile for a given exceedance probability (i.e., recurrence interval). Therefore, the results of this study may provide useful indications for flood risk assessment in ungauged basins.
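    The four flood statistics examined in this record can be computed directly from an annual-maximum series; the series below is invented for illustration and is not data from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical annual maximum peak flows (m^3/s) for one gauged basin.
annual_maxima = np.array([120., 95., 210., 150., 180., 130., 300., 110., 160., 140.])

mean_q = annual_maxima.mean()
std_q = annual_maxima.std(ddof=1)             # sample standard deviation
cv = std_q / mean_q                           # coefficient of variation
skew = stats.skew(annual_maxima, bias=False)  # coefficient of skewness

print(f"mean={mean_q:.1f}, std={std_q:.1f}, CV={cv:.2f}, skew={skew:.2f}")
```

In a regional analysis these four numbers would be computed per basin and then related to each basin's drainage density.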

  16. Exact statistical results for binary mixing and reaction in variable density turbulence

    Science.gov (United States)

    Ristorcelli, J. R.

    2017-02-01

    We report a number of rigorous statistical results on binary active scalar mixing in variable density turbulence. The study is motivated by mixing between pure fluids with very different densities and whose density intensity is of order unity. Our primary focus is the derivation of exact mathematical results for mixing in variable density turbulence, and we point out the potential fields of application of the results. A binary one-step reaction is invoked to derive a metric to assess the state of mixing. The mean reaction rate in variable density turbulent mixing can be expressed, in closed form, using the first order Favre mean variables and the Reynolds averaged density variance, ⟨ρ²⟩. We show that the normalized density variance ⟨ρ²⟩ reflects the reduction of the reaction due to mixing and is a mix metric. The result is mathematically rigorous. The result is the variable density analog of the normalized mass fraction variance ⟨c²⟩ used in constant density turbulent mixing. As a consequence, we demonstrate that use of the analogous normalized Favre variance of the mass fraction, c̃″², as a mix metric is not theoretically justified in variable density turbulence. We additionally derive expressions relating various second order moments of the mass fraction, specific volume, and density fields. The central role of the density-specific volume covariance ⟨ρv⟩ is highlighted; it is a key quantity with considerable dynamical significance linking various second order statistics. For laboratory experiments, we have developed exact relations between the Reynolds scalar variance ⟨c²⟩, its Favre analog c̃″², and various second moments including ⟨ρv⟩. For moment closure models that evolve ⟨ρv⟩ and not ⟨ρ²⟩, we provide a novel expression for ⟨ρ²⟩ in terms of a rational function of ⟨ρv⟩ that avoids recourse to Taylor series methods (which do not converge for large density differences).
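    As a toy illustration of a variance-based mix metric for a binary mixture, the sketch below normalizes the density variance by its value in the fully segregated state, where the field only takes the two pure-fluid densities. This particular normalization is our assumption for illustration; it is not necessarily the paper's exact definition.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative two-fluid mixture: rho1 and rho2 are the pure-fluid densities.
rho1, rho2 = 1.0, 5.0

def normalized_density_variance(rho):
    """Density variance normalized by its segregated-state maximum.

    For a binary mixture the variance is largest when the field is fully
    segregated (only the values rho1 and rho2 occur); mixing drives it to 0.
    """
    rho_bar = rho.mean()
    var = ((rho - rho_bar) ** 2).mean()
    var_max = (rho2 - rho_bar) * (rho_bar - rho1)  # segregated-state variance
    return var / var_max

# Fully segregated field (random pockets of each pure fluid) -> metric ~ 1.
segregated = np.where(rng.random(100000) < 0.5, rho1, rho2)
# Fully mixed field (uniform density) -> metric = 0.
mixed = np.full(100000, 3.0)

print(normalized_density_variance(segregated))  # ~1 (unmixed)
print(normalized_density_variance(mixed))       # 0 (fully mixed)
```

The metric thus decreases from about 1 to 0 as mixing proceeds, which is the qualitative behavior a mix metric must have.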

  17. The role of statistics in operations research: Some personal reflections

    African Journals Online (AJOL)

    Statistics has a very important role to play in Operations Research (OR), yet many ...

  18. A STATISTICAL STUDY OF THE MASS AND DENSITY STRUCTURE OF INFRARED DARK CLOUDS

    International Nuclear Information System (INIS)

    Peretto, N.; Fuller, G. A.

    2010-01-01

    How and when the mass distribution of stars in the Galaxy is set is one of the main issues of modern astronomy. Here, we present a statistical study of mass and density distributions of infrared dark clouds (IRDCs) and fragments within them. These regions are pristine molecular gas structures and progenitors of stars and so provide insights into the initial conditions of star formation. This study makes use of an IRDC catalog, the largest sample of IRDC column density maps to date, containing a total of ∼11,000 IRDCs with column densities exceeding N(H₂) = 1×10²² cm⁻² and over 50,000 single-peaked IRDC fragments. The large number of objects constitutes an important strength of this study, allowing a detailed analysis of the completeness of the sample and thus statistically robust conclusions. Using a statistical approach to assigning distances to clouds, the mass and density distributions of the clouds and the fragments within them are constructed. The mass distributions show a steepening of the slope when switching from IRDCs to fragments, in agreement with previous results for similar structures. IRDCs and fragments are divided into unbound/bound objects by assuming Larson's relation and calculating their virial parameter. IRDCs are mostly gravitationally bound, while a significant fraction of the fragments are not. The density distribution of gravitationally unbound fragments shows a steep characteristic slope, ΔN/Δlog(n) ∝ n^(−4.0±0.5), rather independent of the range of fragment mass. However, the incompleteness limit at a number density of ∼10³ cm⁻³ does not allow us to exclude a potential lognormal density distribution. In contrast, gravitationally bound fragments show a characteristic density peak at n ≅ 10⁴ cm⁻³, but the shape of the density distributions changes with the range of fragment masses. An explanation for this could be the differential dynamical evolution of the fragment density with respect to their mass as more massive

  19. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    Science.gov (United States)

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

    Recent studies pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data according to several contexts, like fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters must be estimated from small samples. This small sample size induces errors in the estimated HOS parameters, hindering real-time and precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to the small-sample-size effect and more accurate in sEMG PDF shape screening applications.
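    The small-sample effect on HOS parameters that motivates this record can be seen in a short Monte Carlo experiment: the spread of the sample skewness of log-normal data shrinks as the sample grows. The distribution parameters and trial counts below are illustrative choices, not those of the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def skewness_spread(sample_size, n_trials=2000):
    """Std of the sample skewness of log-normal data over many trials."""
    estimates = [
        stats.skew(rng.lognormal(mean=0.0, sigma=0.5, size=sample_size))
        for _ in range(n_trials)
    ]
    return float(np.std(estimates))

# The estimator's variability at small n is the error the abstract says
# hinders real-time PDF shape monitoring with HOS parameters.
for n in (50, 500, 5000):
    print(n, round(skewness_spread(n), 3))
```

The monotone decrease of the spread with sample size illustrates why estimators that remain stable at small n (such as the proposed functional statistics) are attractive here.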

  20. Matrix product density operators: Renormalization fixed points and boundary theories

    Energy Technology Data Exchange (ETDEWEB)

    Cirac, J.I. [Max-Planck-Institut für Quantenoptik, Hans-Kopfermann-Str. 1, D-85748 Garching (Germany); Pérez-García, D., E-mail: dperezga@ucm.es [Departamento de Análisis Matemático, Universidad Complutense de Madrid, Plaza de Ciencias 3, 28040 Madrid (Spain); ICMAT, Nicolas Cabrera, Campus de Cantoblanco, 28049 Madrid (Spain); Schuch, N. [Max-Planck-Institut für Quantenoptik, Hans-Kopfermann-Str. 1, D-85748 Garching (Germany); Verstraete, F. [Department of Physics and Astronomy, Ghent University (Belgium); Vienna Center for Quantum Technology, University of Vienna (Austria)

    2017-03-15

    We consider the tensors generating matrix product states and density operators in a spin chain. For pure states, we revise the renormalization procedure introduced in (Verstraete et al., 2005) and characterize the tensors corresponding to the fixed points. We relate them to the states possessing zero correlation length, saturation of the area law, as well as to those which generate ground states of local and commuting Hamiltonians. For mixed states, we introduce the concept of renormalization fixed points and characterize the corresponding tensors. We also relate them to concepts like finite correlation length, saturation of the area law, as well as to those which generate Gibbs states of local and commuting Hamiltonians. One of the main results of this work is that the resulting fixed points can be associated to the boundary theories of two-dimensional topological states, through the bulk-boundary correspondence introduced in (Cirac et al., 2011).

  1. Global quantum discord and matrix product density operators

    Science.gov (United States)

    Huang, Hai-Lin; Cheng, Hong-Guang; Guo, Xiao; Zhang, Duo; Wu, Yuyin; Xu, Jian; Sun, Zhao-Yu

    2018-06-01

    In a previous study, we have proposed a procedure to study global quantum discord in 1D chains whose ground states are described by matrix product states [Z.-Y. Sun et al., Ann. Phys. 359, 115 (2015)]. In this paper, we show that with a very simple generalization, the procedure can be used to investigate quantum mixed states described by matrix product density operators, such as quantum chains at finite temperatures and 1D subchains in high-dimensional lattices. As an example, we study the global discord in the ground state of a 2D transverse-field Ising lattice, and pay our attention to the scaling behavior of global discord in 1D sub-chains of the lattice. We find that, for any strength of the magnetic field, global discord always shows a linear scaling behavior as the increase of the length of the sub-chains. In addition, global discord and the so-called "discord density" can be used to indicate the quantum phase transition in the model. Furthermore, based upon our numerical results, we make some reliable predictions about the scaling of global discord defined on the n × n sub-squares in the lattice.

  2. Structure and representation of correlation functions and the density matrix for a statistical wave field in optics

    International Nuclear Information System (INIS)

    Sudarshan, E.C.G.; Mukunda, N.

    1978-03-01

    A systematic structure analysis of the correlation functions of statistical quantum optics is carried out. From a suitably defined auxiliary two-point function, the excited modes in the wave field are identified. The relative simplicity of the higher order correlation functions emerges as a by-product, and the conditions under which they are pure are derived. These results depend in a crucial manner on the notion of coherence indices and of unimodular coherence indices. A new class of approximate expressions for the density operator of a statistical wave field is worked out, based on discrete characteristic sets. These are even more economical than the diagonal coherent state representations. The analysis also brings out some of the subtleties of quantum theory. Certain implications for the physics of light beams are cited. 28 references

  3. Statistical measurement of power spectrum density of large aperture optical component

    International Nuclear Information System (INIS)

    Xu Jiancheng; Xu Qiao; Chai Liqun

    2010-01-01

    According to the requirements of ICF, a method based on statistical theory has been proposed to measure the power spectrum density (PSD) of large aperture optical components. The method breaks the large-aperture wavefront into small regions, and obtains the PSD of the large-aperture wavefront by weighted averaging of the PSDs of the regions, where the weight factor is each region's area. Simulation and experiment demonstrate the effectiveness of the proposed method. They also show that the PSDs of the large-aperture wavefront obtained by the statistical method and by the sub-aperture stitching method agree well when the number of small regions is no less than 8 x 8. The statistical method is not sensitive to translation stage errors and environment instabilities, thus it is appropriate for PSD measurement during the process of optical fabrication. (authors)
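The weighted-averaging step of the statistical method can be sketched in a few lines. The sketch below works on equal-length 1-D wavefront profiles for brevity and uses a plain periodogram per region; the actual method operates on 2-D sub-aperture wavefronts, and all names here are illustrative, not from the paper.

```python
import numpy as np

def region_psd(wavefront, dx):
    """Periodogram PSD estimate of one sub-region (1-D profile, spacing dx)."""
    n = wavefront.size
    spectrum = np.fft.rfft(wavefront - wavefront.mean())
    return (np.abs(spectrum) ** 2) * dx / n

def stitched_psd(regions, dx):
    """Area-weighted average of the sub-region PSDs, the core of the
    statistical method: the weight of each region is its area (here its
    sample count, since the profiles are 1-D and equal-length)."""
    psds = np.array([region_psd(r, dx) for r in regions])
    weights = np.array([r.size for r in regions], dtype=float)
    weights /= weights.sum()
    return weights @ psds
```

With 9 regions of 64 samples each, the output is one averaged PSD over the 33 rfft frequency bins.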

  4. Automated Material Accounting Statistics System at Rockwell Hanford Operations

    International Nuclear Information System (INIS)

    Eggers, R.F.; Giese, E.W.; Kodman, G.P.

    1986-01-01

    The Automated Material Accounting Statistics System (AMASS) was developed under the sponsorship of the U.S. Nuclear Regulatory Commission. The AMASS was developed when it was realized that classical methods of error propagation, based only on measured quantities, did not properly control the false alarm rate, and that errors other than measurement errors affect inventory differences. The classical assumptions that (1) the mean value of the inventory difference (ID) for a particular nuclear material processing facility is zero, and (2) the variance of the inventory difference is due only to errors in measured quantities, are overly simplistic. The AMASS provides a valuable statistical tool for estimating the true mean value and variance of the ID data produced by a particular material balance area. In addition, it provides statistical methods of testing both individual and cumulative sums of IDs, taking into account the estimated mean value and total observed variance of the ID
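The shift away from the classical zero-mean assumption can be illustrated with a toy test: estimate the mean and total observed variance from the ID series itself, then standardize the latest individual ID against the estimated mean, and the cumulative sum against zero using the observed variance. This only illustrates the idea; it is not the AMASS algorithm.

```python
import numpy as np

def id_statistics(ids):
    """Toy tests on a series of inventory differences (IDs), using the
    estimated mean and the total observed variance rather than assuming a
    zero-mean, measurement-error-only model. Returns the estimated mean and
    variance, a z-score for the latest ID, and one for the cumulative sum."""
    ids = np.asarray(ids, dtype=float)
    mean_hat = ids.mean()
    var_hat = ids.var(ddof=1)                       # total observed variance
    z_single = (ids[-1] - mean_hat) / np.sqrt(var_hat)
    z_cusum = ids.sum() / np.sqrt(len(ids) * var_hat)
    return mean_hat, var_hat, z_single, z_cusum
```

A large z_cusum flags a cumulative loss that the observed period-to-period scatter cannot explain.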

  5. Operation Statistics of the CERN Accelerators Complex for 2003

    CERN Document Server

    CERN. Geneva; Baird, S A; Rey, A; Steerenberg, R; CERN. Geneva. AB Department

    2004-01-01

    This report gives an overview of the performance of the different accelerators (Linacs, PS Booster, PS, AD and SPS) of the CERN Accelerator Complex for 2003. It includes scheduled activities, beam availabilities, beam intensities and an analysis of faults and breakdowns by system and by beam. More information is available via the OP Statistics Tool: http://eLogbook.web.cern.ch/eLogbook/statistics.php and on the SPS home page: http://ab-div-op-sps.web.cern.ch/ab-div-op-sps/SPSss.html

  6. A statistical analysis of the elastic distortion and dislocation density fields in deformed crystals

    KAUST Repository

    Mohamed, Mamdouh S.

    2015-05-18

    The statistical properties of the elastic distortion fields of dislocations in deforming crystals are investigated using the method of discrete dislocation dynamics to simulate dislocation structures and dislocation density evolution under tensile loading. Probability distribution functions (PDF) and pair correlation functions (PCF) of the simulated internal elastic strains and lattice rotations are generated for tensile strain levels up to 0.85%. The PDFs of simulated lattice rotation are compared with sub-micrometer resolution three-dimensional X-ray microscopy measurements of rotation magnitudes and deformation length scales in 1.0% and 2.3% compression strained Cu single crystals to explore the linkage between experiment and the theoretical analysis. The statistical properties of the deformation simulations are analyzed through determinations of the Nye and Kröner dislocation density tensors. The significance of the magnitudes and the length scales of the elastic strain and the rotation parts of dislocation density tensors are demonstrated, and their relevance to understanding the fundamental aspects of deformation is discussed.

  7. A statistical analysis of the elastic distortion and dislocation density fields in deformed crystals

    KAUST Repository

    Mohamed, Mamdouh S.; Larson, Ben C.; Tischler, Jon Z.; El-Azab, Anter

    2015-01-01

    The statistical properties of the elastic distortion fields of dislocations in deforming crystals are investigated using the method of discrete dislocation dynamics to simulate dislocation structures and dislocation density evolution under tensile loading. Probability distribution functions (PDF) and pair correlation functions (PCF) of the simulated internal elastic strains and lattice rotations are generated for tensile strain levels up to 0.85%. The PDFs of simulated lattice rotation are compared with sub-micrometer resolution three-dimensional X-ray microscopy measurements of rotation magnitudes and deformation length scales in 1.0% and 2.3% compression strained Cu single crystals to explore the linkage between experiment and the theoretical analysis. The statistical properties of the deformation simulations are analyzed through determinations of the Nye and Kröner dislocation density tensors. The significance of the magnitudes and the length scales of the elastic strain and the rotation parts of dislocation density tensors are demonstrated, and their relevance to understanding the fundamental aspects of deformation is discussed.

  8. Time Evolution of the Wigner Operator as a Quasi-density Operator in Amplitude Dissipative Channel

    Science.gov (United States)

    Yu, Zhisong; Ren, Guihua; Yu, Ziyang; Wei, Chenhuinan; Fan, Hongyi

    2018-06-01

    For developing quantum mechanics theory in phase space, we explore how the Wigner operator Δ(α, α*) ≡ (1/π) :e^{-2(α* - a†)(α - a)}:, when viewed as a quasi-density operator corresponding to the Wigner quasiprobability distribution, evolves in an amplitude damping channel with damping constant κ. We derive that it evolves into (1/(T+1)) :exp[-(2/(T+1)) (α* e^{-κt} - a†)(α e^{-κt} - a)]:, where T ≡ 1 - e^{-2κt}. This in turn helps to obtain directly the final state ρ(t) out of the dissipative channel from the classical function corresponding to the initial ρ(0). Throughout the work, the method of integration within ordered product (IWOP) of operators is employed.
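Two limiting cases make the quoted result plausible (a consistency check added here, not part of the abstract): at t = 0 the evolved operator reduces to the initial Wigner operator, and as t → ∞ it relaxes to a multiple of the vacuum projector, as expected of a damping channel.

```latex
% t = 0:  T = 1 - e^{0} = 0, so the evolved operator is
\Delta(\alpha,\alpha^{*};0)
  \;\propto\; {:}\,e^{-2(\alpha^{*}-a^{\dagger})(\alpha-a)}\,{:}
  \qquad\text{(the initial Wigner operator).}
% t -> infinity:  e^{-\kappa t} \to 0 and T \to 1, so
\Delta(\alpha,\alpha^{*};t) \;\longrightarrow\;
  \tfrac{1}{2}\,{:}\,e^{-a^{\dagger}a}\,{:}
  \;=\; \tfrac{1}{2}\,|0\rangle\langle 0|,
% i.e. every quasi-density operator decays toward the vacuum projector,
% using the normally ordered identity |0><0| = :exp(-a^dagger a):.
```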

  9. Nuclear Level Densities for Modeling Nuclear Reactions: An Efficient Approach Using Statistical Spectroscopy: Annual Scientific Report July 2004

    International Nuclear Information System (INIS)

    Calvin W. Johnson

    2004-01-01

    The general goal of the project is to develop and implement computer codes and input files to compute nuclear densities of states. Such densities are important input into calculations of statistical neutron capture, and are difficult to access experimentally. In particular, we will focus on calculating densities for nuclides in the mass range A ≈ 50-100. We use statistical spectroscopy, a moments method based upon a microscopic framework, the interacting shell model. In this report we present our progress for the past year

  10. High density internal transport barriers for burning plasma operation

    Energy Technology Data Exchange (ETDEWEB)

    Ridolfini, V Pericoli [Associazione EURATOM-ENEA sulla Fusione, CR Frascati, Rome (Italy); Barbato, E [Associazione EURATOM-ENEA sulla Fusione, CR Frascati, Rome (Italy); Buratti, P [Associazione EURATOM-ENEA sulla Fusione, CR Frascati, Rome (Italy)] (and others)

    2005-12-15

    A tokamak plasma with internal transport barriers (ITBs) is the best candidate for a steady ITER operation, since the high energy confinement allows working at plasma currents (I{sub p}) lower than the reference scenario. To build and sustain an ITB at the ITER high density ({>=}10{sup 20} m{sup -3}) and largely dominant electron (e{sup -}) heating is not trivial in most existing tokamaks. FTU can instead meet both requests, thanks to its radiofrequency heating systems, lower hybrid (LH, up to 1.9 MW) and electron cyclotron (EC up to 1.2 MW). By the combined use of them, ITBs are obtained up to peak densities n{sub e0} > 1.3 x 10{sup 20} m{sup -3}, with central e{sup -} temperatures T{sub e0} {approx} 5.5 keV, and are sustained for as long as the heating pulse is applied (>35 confinement times, {tau}{sub E}). At n{sub e0} {approx} 0.8 x 10{sup 20} m{sup -3} T{sub e0} can be larger than 11 keV. Almost full current drive (CD) and an overall good steadiness is attained within about one {tau}{sub E}, 20 times faster than the ohmic current relaxation time. The ITB extends over a central region with an almost flat or slightly reversed q profile and q{sub min} {approx} 1.3 that is fully sustained by off-axis lower hybrid current drive. Consequent to this is the beneficial good alignment of the bootstrap current, generated by the ITB large pressure gradients, with the LH driven current. Reflectometry shows a clear change in the turbulence close to the ITB radius, consistent with the reduced e{sup -} transport. Ions (i{sup +}) are significantly heated via collisions, but thermal equilibrium with electrons cannot be attained since the e{sup -}-i{sup +} equipartition time is always 4-5 times longer than {tau}{sub E}. No degradation of the overall ion transport, rather a reduction of the i{sup +} heat diffusivity, is observed inside the ITB. The global confinement has been improved up to 1.6 times over the scaling predictions. 
The ITB radius can be controlled by adjusting the

  11. Use of a mixture statistical model in studying malaria vectors density.

    Directory of Open Access Journals (Sweden)

    Olayidé Boussari

    Full Text Available Vector control is a major step in the process of malaria control and elimination. This requires vector counts and appropriate statistical analyses of these counts. However, vector counts are often overdispersed. A non-parametric mixture of Poisson model (NPMP) is proposed to allow for overdispersion and better describe vector distribution. Mosquito collections using Human Landing Catches, as well as collection of environmental and climatic data, were carried out from January to December 2009 in 28 villages in Southern Benin. A NPMP regression model with "village" as a random effect is used to test statistical correlations between malaria vector density and environmental and climatic factors. Furthermore, the villages were ranked using the latent classes derived from the NPMP model. Based on this classification of the villages, the impacts of four vector control strategies implemented in the villages were compared. Vector counts were highly variable and overdispersed, with an important proportion of zeros (75%). The NPMP model predicted the observed values well and showed that: (i) proximity to a freshwater body, market gardening, and high levels of rain were associated with high vector density; (ii) water conveyance, cattle breeding, and vegetation index were associated with low vector density. The 28 villages could then be ranked according to the mean vector number as estimated by the random part of the model after adjustment on all covariates. The NPMP model made it possible to describe the distribution of the vector across the study area. The villages were ranked according to the mean vector density after taking into account the most important covariates. This study demonstrates the necessity and possibility of adapting methods of vector counting and sampling to each setting.
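The core idea of modelling overdispersed, zero-heavy counts with a mixture of Poissons can be sketched with a plain EM fit of a k-component parametric Poisson mixture; this is a simplified stand-in for the paper's non-parametric NPMP model, which additionally includes village random effects and covariates. All parameter choices below are illustrative.

```python
import numpy as np
from math import lgamma

def poisson_mixture_em(counts, k=2, iters=100, seed=0):
    """EM for a k-component Poisson mixture (simplified stand-in for the
    NPMP model described above). Returns mixing weights and component means."""
    counts = np.asarray(counts, dtype=float)
    lgam = np.array([lgamma(c + 1.0) for c in counts])   # log(c!)
    rng = np.random.default_rng(seed)
    lam = rng.uniform(0.5, 1.5, size=k) * (counts.mean() + 1e-9)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[n, j] proportional to w_j * Poisson(c_n; lam_j)
        log_r = np.log(w) + counts[:, None] * np.log(lam) - lam - lgam[:, None]
        r = np.exp(log_r - log_r.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights and component means
        w = r.mean(axis=0)
        lam = np.maximum((r * counts[:, None]).sum(axis=0) / r.sum(axis=0), 1e-9)
    return w, lam
```

On counts drawn from two well-separated Poisson components (e.g. mostly-zero sites plus a few high-density sites), the fitted component means recover the two regimes.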

  12. Justification of the density functional method in classical and quantum statistical mechanics

    International Nuclear Information System (INIS)

    Dinariev, O.Yu.

    2000-01-01

    The relation between the phenomenological description of a multi-component mixture, based on an entropy functional with terms quadratic in the gradients of component density and temperature, on the one hand, and the description in the framework of classical and quantum statistical mechanics, on the other hand, was investigated. Explicit expressions for the entropy functional in the classical and quantum theory were derived. A quadratic approximation for the case of minor disturbances of the uniform state was then calculated. In this approximation the terms quadratic in the gradients were singled out, which permits calculation of the relevant phenomenological coefficients from first principles [ru

  13. Enhancement opportunities in operating room utilization; with a statistical appendix

    NARCIS (Netherlands)

    van Veen-Berkx, Elizabeth; Elkhuizen, Sylvia G.; van Logten, Sanne; Buhre, Wolfgang F.; Kalkman, Cor J.; Gooszen, Hein G.; Kazemier, Geert; Balm, Ron; Cornelisse, Diederich C. C.; Ackermans, Hub J.; Stolker, Robert Jan; Bezstarosti, Jeanne; Pelger, Rob C. M.; Schaad, Roald R.; Krooneman-Smits, Irmgard; Meyer, Peter; van Dijk-Jager, Mirjam; Broecheler, Simon A. W.; Kroese, A. Christiaan; Kanters, Jeffrey; Krabbendam, Johannes J.; Hans, Erwin W.; Veerman, Derk P.; Aij, Kjeld H.

    2015-01-01

    Background: The purpose of this study was to assess the direct and indirect relationships between first-case tardiness (or "late start"), turnover time, underused operating room (OR) time, and raw utilization, as well as to determine which indicator had the most negative impact on OR utilization to

  14. Enhancement opportunities in operating room utilization; with a statistical appendix

    NARCIS (Netherlands)

    Veen-Berkx, E. van; Elkhuizen, S.G.; Logten, S. van; Buhre, W.F.; Kalkman, C.J.; Gooszen, H.G.; Kazemier, G.

    2015-01-01

    BACKGROUND: The purpose of this study was to assess the direct and indirect relationships between first-case tardiness (or "late start"), turnover time, underused operating room (OR) time, and raw utilization, as well as to determine which indicator had the most negative impact on OR utilization to

  15. Statistical learning: a powerful mechanism that operates by mere exposure.

    Science.gov (United States)

    Aslin, Richard N

    2017-01-01

    How do infants learn so rapidly and with little apparent effort? In 1996, Saffran, Aslin, and Newport reported that 8-month-old human infants could learn the underlying temporal structure of a stream of speech syllables after only 2 min of passive listening. This demonstration of what was called statistical learning, involving no instruction, reinforcement, or feedback, led to dozens of confirmations of this powerful mechanism of implicit learning in a variety of modalities, domains, and species. These findings reveal that infants are not nearly as dependent on explicit forms of instruction as we might have assumed from studies of learning in which children or adults are taught facts such as math or problem-solving skills. Instead, at least in some domains, infants soak up the information around them by mere exposure. Learning and development in these domains thus appear to occur automatically and with little active involvement by an instructor (parent or teacher). The details of this statistical learning mechanism are discussed, including how exposure to specific types of information can, under some circumstances, generalize to never-before-observed information, thereby enabling transfer of learning. WIREs Cogn Sci 2017, 8:e1373. doi: 10.1002/wcs.1373 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
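The statistic at the heart of the Saffran et al. paradigm is the transitional probability between adjacent syllables: high within a word, low across word boundaries. A minimal sketch (the syllable stream below is invented for illustration):

```python
from collections import Counter

def transitional_probabilities(syllables):
    """P(next | current) over adjacent syllable pairs in a stream --
    the regularity that lets a learner locate word boundaries."""
    pairs = Counter(zip(syllables, syllables[1:]))
    firsts = Counter(syllables[:-1])
    return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}
```

In a stream built from the pseudo-words "tupiro" and "golabu", the within-word transition tu→pi has probability 1.0, while the boundary transition ro→go is lower because "ro" can be followed by either word.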

  16. Practical guidance for statistical analysis of operational event data

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies

  17. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps.

    Science.gov (United States)

    Pittiglio, Claudia; Khomenko, Sergei; Beltran-Alcrudo, Daniel

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine scale environmental variables. 1) First a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e. subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled at 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each different step using input as well as independent data. 
We discuss advantages and limits of the method and its
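The trend-plus-residual decomposition at the core of the downscaling can be sketched compactly. In this sketch, a least-squares regression on one environmental covariate supplies the trend and inverse-distance weighting stands in for area-to-point kriging of the residuals; all data shapes and names are hypothetical.

```python
import numpy as np

def downscale(unit_density, unit_xy, unit_env, fine_xy, fine_env):
    """Sketch of the paper's downscaling steps:
    1) regress unit-level densities on an environmental covariate (trend),
    2) interpolate the unit residuals to the fine grid (inverse-distance
       weighting here stands in for area-to-point kriging),
    3) add trend and residual components at the fine resolution."""
    X = np.column_stack([np.ones(len(unit_env)), unit_env])
    beta, *_ = np.linalg.lstsq(X, unit_density, rcond=None)
    resid = unit_density - X @ beta
    # trend component at the fine cells
    Xf = np.column_stack([np.ones(len(fine_env)), fine_env])
    trend_f = Xf @ beta
    # residual component: inverse-distance interpolation of unit residuals
    d = np.linalg.norm(fine_xy[:, None, :] - unit_xy[None, :, :], axis=2)
    w = 1.0 / (d + 1e-9) ** 2
    w /= w.sum(axis=1, keepdims=True)
    return trend_f + w @ resid
```

When the unit densities follow the covariate exactly, the residuals vanish and the fine-scale estimate reproduces the trend alone.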

  18. Refilling process in the plasmasphere: a 3-D statistical characterization based on Cluster density observations

    Directory of Open Access Journals (Sweden)

    G. Lointier

    2013-02-01

    Full Text Available The Cluster mission offers an excellent opportunity to investigate the evolution of the plasma population in a large part of the inner magnetosphere, explored near its orbit's perigee, over a complete solar cycle. The WHISPER sounder, on board each satellite of the mission, is particularly suitable to study the electron density in this region, between 0.2 and 80 cm−3. Compiling WHISPER observations during 1339 perigee passes distributed over more than three years of the Cluster mission, we present first results of a statistical analysis dedicated to the study of the electron density morphology and dynamics along and across magnetic field lines between L = 2 and L = 10. In this study, we examine a specific topic: the refilling of the plasmasphere and trough regions during extended periods of quiet magnetic conditions. To do so, we survey the evolution of the ap index during the days preceding each perigee crossing and sort out electron density profiles along the orbit according to three classes, namely after respectively less than 2 days, between 2 and 4 days, and more than 4 days of quiet magnetic conditions (ap ≤ 15 nT following an active episode (ap > 15 nT. This leads to three independent data subsets. Comparisons between density distributions in the 3-D plasmasphere and trough regions at the three stages of quiet magnetosphere provide novel views about the distribution of matter inside the inner magnetosphere during several days of low activity. Clear signatures of a refilling process inside an expanding plasmasphere in formation are noted. A plasmapause-like boundary, at L ~ 6 for all MLT sectors, is formed after 3 to 4 days and expands somewhat further after that. In the outer part of the plasmasphere (L ~ 8, latitudinal profiles of median density values vary essentially according to the MLT sector considered rather than according to the refilling duration. The shape of these density profiles indicates that magnetic flux tubes are not

  19. Dynamic Statistical Models for Pyroclastic Density Current Generation at Soufrière Hills Volcano

    Science.gov (United States)

    Wolpert, Robert L.; Spiller, Elaine T.; Calder, Eliza S.

    2018-05-01

    To mitigate volcanic hazards from pyroclastic density currents, volcanologists generate hazard maps that provide long-term forecasts of areas of potential impact. Several recent efforts in the field develop new statistical methods for application of flow models to generate fully probabilistic hazard maps that both account for, and quantify, uncertainty. However a limitation to the use of most statistical hazard models, and a key source of uncertainty within them, is the time-averaged nature of the datasets by which the volcanic activity is statistically characterized. Where the level, or directionality, of volcanic activity frequently changes, e.g. during protracted eruptive episodes, or at volcanoes that are classified as persistently active, it is not appropriate to make short term forecasts based on longer time-averaged metrics of the activity. Thus, here we build, fit and explore dynamic statistical models for the generation of pyroclastic density current from Soufrière Hills Volcano (SHV) on Montserrat including their respective collapse direction and flow volumes based on 1996-2008 flow datasets. The development of this approach allows for short-term behavioral changes to be taken into account in probabilistic volcanic hazard assessments. We show that collapses from the SHV lava dome follow a clear pattern, and that a series of smaller flows in a given direction often culminate in a larger collapse and thereafter directionality of the flows change. Such models enable short term forecasting (weeks to months) that can reflect evolving conditions such as dome and crater morphology changes and non-stationary eruptive behavior such as extrusion rate variations. For example, the probability of inundation of the Belham Valley in the first 180 days of a forecast period is about twice as high for lava domes facing Northwest toward that valley as it is for domes pointing East toward the Tar River Valley. As rich multi-parametric volcano monitoring dataset become

  20. Using features of local densities, statistics and HMM toolkit (HTK) for offline Arabic handwriting text recognition

    Directory of Open Access Journals (Sweden)

    El Moubtahij Hicham

    2017-12-01

    Full Text Available This paper presents an analytical approach to an offline handwritten Arabic text recognition system. It is based on the Hidden Markov Model Toolkit (HTK), without explicit segmentation. The first phase is preprocessing, where the data are introduced into the system after quality enhancements. Then, a set of features (local densities and statistical features) is extracted using the technique of sliding windows. Subsequently, the resulting feature vectors are injected into the Hidden Markov Model Toolkit (HTK). The simple database "Arabic-Numbers" and IFN/ENIT are used to evaluate the performance of this system. Keywords: Hidden Markov Model (HMM) Toolkit (HTK), Sliding windows
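The sliding-window feature extraction can be sketched as follows: a fixed-width window slides across the binary text image and each window yields the ink density of a few horizontal bands, producing one feature vector per frame for the HMM. Window, step and cell sizes below are our own assumptions, not the paper's values.

```python
import numpy as np

def sliding_density_features(img, win=8, step=4, cells=4):
    """Local-density features over a sliding window, in the spirit of the
    HTK-based system above. `img` is a binary text image (ink = 1). Each
    window is cut into `cells` horizontal bands, and each band contributes
    its mean ink density, giving one feature vector per frame."""
    h, w = img.shape
    frames = []
    for x in range(w - win, -1, -step):       # Arabic is written right to left
        window = img[:, x:x + win]
        bands = np.array_split(window, cells, axis=0)
        frames.append([b.mean() for b in bands])
    return np.array(frames)
```

An 8x16 image with win=8, step=4 yields three frames of four band densities each, all in [0, 1].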

  1. The intensity detection of single-photon detectors based on photon counting probability density statistics

    International Nuclear Information System (INIS)

    Zhang Zijing; Song Jie; Zhao Yuan; Wu Long

    2017-01-01

    Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates of fixed width and count the triggered number of sampling gates, from which the photon counting probability can be obtained to estimate the echo signal intensity. In this paper, we not only count the number of triggered sampling gates, but also record the triggered time positions of the photon counting pulses. The photon counting probability density distribution is obtained through statistics over a series of triggered time positions. Then the minimum variance unbiased estimation (MVUE) method is used to estimate the echo signal intensity. Compared with conventional methods, this method improves the estimation accuracy of the echo signal intensity owing to the acquisition of more detection information. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of the echo signal intensity is discussed and a high accuracy intensity image is acquired under low-light level environments. (paper)
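The conventional gate-counting estimator the paper improves upon can be written down directly: under Poisson photon arrivals, a gate triggers with probability p = 1 − e^{−λ}, so the fraction of triggered gates inverts to an intensity estimate. This is a simplified sketch assuming unit detection efficiency and no dark counts or dead time; the paper's refinement additionally uses the triggered time positions.

```python
from math import log1p

def intensity_from_counts(k, n_gates):
    """Mean photon number per gate from the fraction of triggered gates,
    assuming Poisson arrivals: P(trigger) = 1 - exp(-lambda), hence
    lambda = -ln(1 - k/N). Efficiency, dark counts and dead time ignored."""
    p = k / n_gates
    if p >= 1.0:
        raise ValueError("all gates triggered; intensity not identifiable")
    return -log1p(-p)
```

For example, 865 triggers out of 1000 gates correspond to a mean photon number of about 2 per gate.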

  2. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities.

    Science.gov (United States)

    Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell

    2017-01-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops.

  3. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities

    Directory of Open Access Journals (Sweden)

    Khalifa M. Al-Kindi

    2017-08-01

    Full Text Available In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops.

  4. Strong density of a class of simple operators

    International Nuclear Information System (INIS)

    Somasundaram, S.; Mohammad, N.

    1991-08-01

    An algebra of simple operators has been shown to be strongly dense in the algebra of all bounded linear operators on function spaces of a compact (not necessarily abelian) group. Further, it is proved that the same result is also true for L²(G) if G is a locally compact (not necessarily compact) abelian group. (author). 6 refs

  5. Marine Traffic Density Over Port Klang, Malaysia Using Statistical Analysis of AIS Data: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Masnawi MUSTAFFA

    2016-12-01

    Full Text Available Port Klang, Malaysia, is the 13th busiest port in the world; its capacity is expected to meet demand until 2018. It is one of the busiest ports in the world and the busiest in Malaysia. Although statistics published by the Port Klang Authority show that many ships use this port, these figures cover only ships entering Port Klang. No study has yet investigated how dense the traffic is around Port Klang, Malaysia, and the surrounding sea, including the Strait of Malacca. This paper investigates traffic density over Port Klang, Malaysia, and its surrounding sea using statistical analysis of AIS data. As a preliminary study, AIS data were collected for only 7 days to represent weekly traffic. As a result, hourly and daily numbers of vessels, vessel classifications and sizes, and traffic paths are plotted.
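The basic statistic behind such AIS-based density maps is a 2-D histogram of position reports over a spatial grid. A minimal sketch (coordinates and grid edges below are hypothetical, and one report per vessel per time step is assumed):

```python
import numpy as np

def traffic_density(lons, lats, lon_edges, lat_edges):
    """Count AIS position reports per grid cell -- the raw density grid
    from which hourly/daily vessel-density maps are plotted."""
    grid, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
    return grid
```

Summing the grid over time windows (hourly, daily) gives the traffic statistics described above; normalising by cell area would give a density in vessels per unit area.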

  6. Theoretical remarks on the statistics of three discriminants in Piety's automated signature analysis of PSD [Power Spectral Density] data

    International Nuclear Information System (INIS)

    Behringer, K.; Spiekerman, G.

    1984-01-01

    Piety (1977) proposed an automated signature analysis of power spectral density data. Eight statistical decision discriminants are introduced. For nearly all the discriminants, improved confidence statements can be made. The statistical characteristics of the last three discriminants, which are applications of non-parametric tests, are considered. (author)

  7. On nonequilibrium many-body systems. 1: The nonequilibrium statistical operator method

    International Nuclear Information System (INIS)

    Algarte, A.C.S.; Vasconcellos, A.R.; Luzzi, R.; Sampaio, A.J.C.

    1985-01-01

    The theoretical aspects involved in the treatment of many-body systems strongly departing from equilibrium are discussed. The nonequilibrium statistical operator (NSO) method is considered in detail. Using Jaynes' maximum entropy formalism complemented with an ad hoc hypothesis, a nonequilibrium statistical operator is obtained. This approach introduces irreversibility from the outset, and we recover statistical operators like those of Green-Mori and Zubarev as particular cases. The connection with generalized thermodynamics and the construction of nonlinear transport equations are briefly described. (Author) [pt

  8. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses, and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population, are explained.
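
The normal approximation to the binomial mentioned above can be checked numerically; n, p and k below are arbitrary illustrative values.

```python
# Exact binomial tail probability P(X <= k) versus the normal approximation
# with continuity correction, for X ~ Binomial(n, p).
import math

def binom_cdf(k, n, p):
    """Exact P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

n, p, k = 100, 0.5, 55
exact = binom_cdf(k, n, p)
# Continuity correction: treat the discrete count k as k + 0.5 under the curve.
approx = normal_cdf((k + 0.5 - n * p) / math.sqrt(n * p * (1 - p)))
print(round(exact, 4), round(approx, 4))
```

With n this large and p = 0.5, the two values agree to several decimal places, which is why the approximation earns a chapter in introductory texts.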

  9. Breakdown of the Siegert theorem and the many-body charge density operators

    International Nuclear Information System (INIS)

    Hyuga, H.; Ohtsubo, H.

    1978-01-01

    The exchange charge density operator is studied in the two-boson exchange model with consistent treatment of the exchange current and nuclear wave functions. A non-vanishing exchange charge density operator even in the static limit, which leads to the breakdown of the Siegert theorem, is found. (Auth.)

  10. High density internal transport barriers for burning plasma operation

    International Nuclear Information System (INIS)

    Pericoli Ridolfini, V.

    2005-01-01

    One of the proposed ITER scenarios foresees the creation and sustainment of an internal transport barrier (ITB) in order to improve the confinement properties of the hot core plasma. The most stringent requirements are: the ITB must be sustained with electron heating only, with no or very small external momentum source; the strong collisional coupling at the envisaged density (line average > 1.0 × 10^20 m^-3) must not prevent the barrier's existence; and the bootstrap current created by the large induced gradients must have a radial profile consistent with that required for barrier creation and sustainment. On all these items, the studies carried out in FTU in the same density range (n_e0 ≈ 1.5 × 10^20 m^-3) provide encouraging prospects. With pure electron heating and current drive (LH+ECH), steady electron barriers are generated and maintained with central electron temperature > 5.0 keV. Almost full current-drive conditions are established, with a bootstrap current close to 25% of the total, well aligned with the current driven by the LH waves and responsible for building the barrier. A clear change in the density fluctuations close to the ITB radius, observed by reflectometry, indicates stabilization of turbulence, consistent with the drop of the thermal electron diffusivity inside the ITB to very low values (χ_e < 0.5 m^2/s, estimated by transport analysis). The 10-fold increase in neutron rate testifies to significant collisional ion heating, even though ΔT_i0/T_i0 usually does not exceed 40%, because the electron-ion equipartition time, always 4-5 times longer than the energy confinement time, does not allow thermal equilibrium with the electrons to be attained. The ion thermal diffusivity inside the barrier must be lowered to the neoclassical level to account for the observed T_i(r) profiles, clearly indicating at least non-degraded ion transport. The global confinement in turn improves to 1.6 times above the FTU L-scaling. The ITB radius can be controlled by varying the LH power deposition profile.

  11. Density operator description of geometric phenomena in the ray space

    Indian Academy of Sciences (India)

    set of generators for the related 2-sphere ray subspace (Ь2), highlighting the physical operations performable ... generators, we propose a single-query quantum search algorithm to extract a desired ray exactly from a ... The first observation [22] of noncyclic amplitudes and phases was made in a neutron interference ...

  12. Experimental study of high density foods for the Space Operations Center

    Science.gov (United States)

    Ahmed, S. M.

    1981-01-01

    The experimental study of high density foods for the Space Operations Center is described. A sensory evaluation of the high density foods was conducted first to test the acceptability of the products. A shelf-life study of the high density foods was also conducted for three different time lengths at three different temperatures. The nutritional analysis of the high density foods is at present incomplete.

  13. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  14. HI column density distribution function at z=0 : Connection to damped Ly alpha statistics

    NARCIS (Netherlands)

    Zwaan, Martin; Verheijen, MAW; Briggs, FH

    We present a measurement of the HI column density distribution function f(N-HI) at the present epoch for column densities > 10(20) cm(-2). These high column densities compare to those measured in damped Ly alpha lines seen in absorption against background quasars. Although observationally rare, it

  15. Connection between perturbation theory, projection-operator techniques, and statistical linearization for nonlinear systems

    International Nuclear Information System (INIS)

    Budgor, A.B.; West, B.J.

    1978-01-01

    We employ the equivalence between Zwanzig's projection-operator formalism and perturbation theory to demonstrate that the approximate-solution technique of statistical linearization for nonlinear stochastic differential equations corresponds to the lowest-order β truncation in both the consolidated perturbation expansions and in the ''mass operator'' of a renormalized Green's function equation. Other consolidated equations can be obtained by selectively modifying this mass operator. We particularize the results of this paper to the Duffing anharmonic oscillator equation
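
A minimal sketch of statistical linearization for the Duffing anharmonic oscillator named above. The coefficients are illustrative; the Gaussian-closure identity E[x^4] = 3σ^4 supplies the equivalent stiffness, and the resulting self-consistency equation is solved by fixed-point iteration.

```python
# Statistical linearization of the Duffing oscillator
#   x'' + c x' + k x + eps x**3 = w(t),  w = white noise with spectral density S0.
# Gaussian closure replaces eps*x**3 by 3*eps*sigma2*x, giving an equivalent
# stiffness k_e = k + 3*eps*sigma2; the stationary variance of the linearized
# system, sigma2 = pi*S0/(c*k_e), is then iterated to self-consistency.
import math

c, k, eps, S0 = 0.2, 1.0, 0.5, 0.05   # illustrative parameters

sigma2 = math.pi * S0 / (c * k)        # start from the linear (eps = 0) variance
for _ in range(100):
    k_e = k + 3.0 * eps * sigma2       # equivalent linear stiffness
    sigma2_new = math.pi * S0 / (c * k_e)
    if abs(sigma2_new - sigma2) < 1e-12:
        break
    sigma2 = sigma2_new

print(round(sigma2, 4), round(k_e, 4))
```

The converged variance is smaller than the linear-oscillator value, reflecting the hardening cubic term; this is the "lowest-order truncation" the abstract identifies with statistical linearization.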

  16. Computer-aided assessment of breast density: comparison of supervised deep learning and feature-based statistical learning

    Science.gov (United States)

    Li, Songfeng; Wei, Jun; Chan, Heang-Ping; Helvie, Mark A.; Roubidoux, Marilyn A.; Lu, Yao; Zhou, Chuan; Hadjiiski, Lubomir M.; Samala, Ravi K.

    2018-01-01

    Breast density is one of the most significant factors associated with cancer risk. In this study, our purpose was to develop a supervised deep learning approach for automated estimation of percentage density (PD) on digital mammograms (DMs). The input 'for processing' DMs were first log-transformed, enhanced by a multi-resolution preprocessing scheme, and subsampled to a pixel size of 800 µm × 800 µm from 100 µm × 100 µm. A deep convolutional neural network (DCNN) was trained to estimate a probability map of breast density (PMD) using a domain adaptation resampling method. The PD was estimated as the ratio of the dense area to the breast area based on the PMD. The DCNN approach was compared to a feature-based statistical learning approach, in which gray level, texture and morphological features were extracted and a least absolute shrinkage and selection operator (LASSO) was used to combine the features into a feature-based PMD. With approval of the Institutional Review Board, we retrospectively collected a training set of 478 DMs and an independent test set of 183 DMs from patient files in our institution. Two experienced Mammography Quality Standards Act radiologists interactively segmented PD as the reference standard. Ten-fold cross-validation was used for model selection and evaluation with the training set. With cross-validation, DCNN obtained a Dice's coefficient (DC) of 0.79 ± 0.13 and Pearson's correlation (r) of 0.97, whereas feature-based learning obtained DC = 0.72 ± 0.18 and r = 0.85. For the independent test set, DCNN achieved DC = 0.76 ± 0.09 and r = 0.94, while feature-based learning achieved DC = 0.62 ± 0.21 and r = 0.75. Our DCNN approach was significantly better and more robust than the feature-based learning approach for automated PD estimation on DMs, demonstrating its potential use for automated density reporting.
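
The two evaluation metrics reported above, Dice's coefficient for segmentation overlap and Pearson's correlation for PD agreement, can be sketched on toy data; the masks and PD values below are synthetic stand-ins, not the study's data.

```python
# Dice's coefficient on binary dense-pixel masks, and Pearson's r on
# percentage-density values. All inputs here are synthetic illustrations.
import numpy as np

def dice(a, b):
    """Dice's coefficient: 2|A∩B| / (|A| + |B|) for binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

rng = np.random.default_rng(1)
auto = rng.random((64, 64)) > 0.5   # stand-in automated dense-pixel mask
ref = auto.copy()
ref[:8] = ~ref[:8]                  # perturb a few rows to mimic disagreement
print(round(dice(auto, ref), 3))

pd_auto = np.array([12.5, 30.1, 55.0, 71.2, 40.3])  # illustrative PD values (%)
pd_ref = np.array([14.0, 28.5, 58.2, 69.0, 42.1])
r = np.corrcoef(pd_auto, pd_ref)[0, 1]
print(round(r, 3))
```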

  18. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supplies and total consumption of electricity (GWh); Energy imports by country of origin in January-June 2003; Energy exports by recipient country in January-June 2003; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes, precautionary stock fees and oil pollution fees

  19. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions from the use of fossil fuels; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in 2000; Energy exports by recipient country in 2000; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  20. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in January-March 2000; Energy exports by recipient country in January-March 2000; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  1. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in January-June 1999; Energy exports by recipient country in January-June 1999; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  2. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supply and total consumption of electricity (GWh); Energy imports by country of origin in January-June 2003; Energy exports by recipient country in January-June 2003; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  3. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption; Carbon dioxide emissions from fossil fuel use; Coal consumption; Consumption of natural gas; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices in heat production; Fuel prices in electricity production; Price of electricity by type of consumer; Average monthly spot prices at the Nord Pool power exchange; Total energy consumption by source and CO2 emissions; Supplies and total consumption of electricity (GWh); Energy imports by country of origin in January-March 2004; Energy exports by recipient country in January-March 2004; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Price of natural gas by type of consumer; Price of electricity by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Excise taxes, precautionary stock fees and oil pollution fees

  4. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption; Changes in the volume of GNP and electricity; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Consumer prices of principal oil products; Fuel prices for heat production; Fuel prices for electricity production; Carbon dioxide emissions; Total energy consumption by source and CO2 emissions; Electricity supply; Energy imports by country of origin in January-June 2000; Energy exports by recipient country in January-June 2000; Consumer prices of liquid fuels; Consumer prices of hard coal, natural gas and indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and Energy taxes and precautionary stock fees on oil products

  5. Spectra of random operators with absolutely continuous integrated density of states

    Energy Technology Data Exchange (ETDEWEB)

    Rio, Rafael del, E-mail: delrio@iimas.unam.mx, E-mail: delriomagia@gmail.com [Departamento de Fisica Matematica, Instituto de Investigaciones en Matemáticas Aplicadas y en Sistemas, Universidad Nacional Autónoma de México, C.P. 04510, México D.F. (Mexico)

    2014-04-15

    The structure of the spectrum of random operators is studied. It is shown that if the density of states measure of some subsets of the spectrum is zero, then these subsets are empty. In particular, it follows that if the integrated density of states is absolutely continuous, then the singular spectrum of ergodic operators is either empty or of positive measure. Our results apply to Anderson and alloy type models, perturbed Landau Hamiltonians, almost periodic potentials, and models which are not ergodic.

  7. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    Science.gov (United States)

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
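
A minimal sketch of the "traditional" calibration the paper compares against: fit a straight line of measured signal versus log density on standards, then invert it for unknown samples. The numbers are synthetic; the Bayesian mixed-model approach described above goes well beyond this.

```python
# Classical (traditional) statistical calibration: a straight-line fit of
# assay signal against log10 pathogen density, inverted for unknowns.
# All numbers are synthetic illustrations, not data from the study.
import numpy as np

log10_density = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # standards, log10 density
signal = np.array([5.1, 9.8, 15.2, 19.9, 25.3])        # illustrative assay readouts

slope, intercept = np.polyfit(log10_density, signal, 1)  # calibration line

def estimate_density(y):
    """Invert the fitted line: measured signal -> pathogen density."""
    return 10 ** ((y - intercept) / slope)

print(round(estimate_density(12.0), 1))
```

The weakness the paper targets is visible here: a single line pools all assays, so inter-assay variation feeds directly into the inverted density estimate.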

  8. Probability density function shape sensitivity in the statistical modeling of turbulent particle dispersion

    Science.gov (United States)

    Litchford, Ron J.; Jeng, San-Mou

    1992-01-01

    The performance of a recently introduced statistical transport model for turbulent particle dispersion is studied here for rigid particles injected into a round turbulent jet. Both uniform and isosceles triangle pdfs are used. The statistical sensitivity to parcel pdf shape is demonstrated.

  9. Summary of Key Operating Statistics: Data Collected from the 2009 Annual Institutional Report

    Science.gov (United States)

    Accrediting Council for Independent Colleges and Schools, 2010

    2010-01-01

    The Accrediting Council for Independent Colleges and Schools (ACICS) provides the Summary of Key Operating Statistics (KOS) as an annual review of the performance and key measurements of the more than 800 private post-secondary institutions we accredit. This edition of the KOS contains information based on the 2009 Annual Institutional Reports…

  10. Density by moduli and Wijsman lacunary statistical convergence of sequences of sets

    Directory of Open Access Journals (Sweden)

    Vinod K Bhardwaj

    2017-01-01

    The main object of this paper is to introduce and study a new concept of f-Wijsman lacunary statistical convergence of sequences of sets, where f is an unbounded modulus. The definition of Wijsman lacunary strong convergence of sequences of sets is extended to Wijsman lacunary strong convergence with respect to a modulus, and it is shown that, under certain conditions on a modulus f, the concepts of Wijsman lacunary strong convergence with respect to a modulus f and f-Wijsman lacunary statistical convergence are equivalent on bounded sequences. We further characterize those θ for which $\mathit{WS}_{\theta}^{f} = \mathit{WS}^{f}$, where $\mathit{WS}_{\theta}^{f}$ and $\mathit{WS}^{f}$ denote the sets of all f-Wijsman lacunary statistically convergent sequences and f-Wijsman statistically convergent sequences, respectively.

  11. Statistical process control: separating signal from noise in emergency department operations.

    Science.gov (United States)

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
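
The individuals/moving-range chart recommended above can be sketched as follows; the door-to-doctor times are invented illustrative data, and 2.66 is the standard XmR constant converting the mean moving range into natural process limits.

```python
# Individuals/moving-range (XmR) control chart: the moving range between
# consecutive points estimates common-cause variation, and 2.66 * mean(MR)
# sets the control limits around the process mean.
import numpy as np

door_to_doc = np.array([32, 41, 28, 35, 39, 30, 44, 33, 37, 95, 36, 31])  # minutes

mr = np.abs(np.diff(door_to_doc))      # moving ranges |x[i] - x[i-1]|
center = door_to_doc.mean()
ucl = center + 2.66 * mr.mean()        # upper control limit
lcl = center - 2.66 * mr.mean()        # lower control limit

signals = np.where((door_to_doc > ucl) | (door_to_doc < lcl))[0]
print(signals)  # indices flagged as special-cause variation (the 95 min outlier)
```

Points inside the limits are common-cause noise and call for process-level work, not reaction; points outside are the signals SPC is designed to separate out.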

  12. Feedback control of plasma density and heating power for steady state operation in LHD

    Energy Technology Data Exchange (ETDEWEB)

    Kamio, Shuji, E-mail: kamio@nifs.ac.jp; Kasahara, Hiroshi; Seki, Tetsuo; Saito, Kenji; Seki, Ryosuke; Nomura, Goro; Mutoh, Takashi

    2015-12-15

    Highlights: • We upgraded a control system for steady state operation in LHD. • The system comprises a gas fueling system and an ICRF power control system. • An automatic power boost system is also attached for stable operation. • As a result, we achieved a long pulse of up to 48 min at an electron density of more than 1 × 10^19 m^-3. - Abstract: For steady state operation, a feedback control system for plasma density and heating power was developed in the Large Helical Device (LHD). In order to achieve a record long-pulse discharge, stable plasma density and heating power are needed. The system comprises radio frequency (RF) heating power control, interlocks, gas fueling, automatic RF phase control, ion cyclotron range of frequencies (ICRF) antenna position control, and a graphical user interface (GUI). Using the density control system, the electron density was controlled to the target density, and using the RF heating power control system, the RF power injection could be kept stable. As a result of using this system, we achieved a long pulse of up to 48 min at an electron density of more than 1 × 10^19 m^-3. Further, the ICRF hardware experienced no critical accidents during the 17th LHD experiment campaign in 2013.
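
A hedged sketch of density feedback in the same spirit: a PI controller drives a crude first-order particle-balance model toward a target density. The gains, units and plant model are assumptions for illustration, not the LHD implementation.

```python
# Toy density feedback loop: PI gas-puff command against a first-order
# particle balance dn/dt = source - n/tau. All parameters are illustrative.
target = 1.2          # target line-averaged density (1e19 m^-3)
n = 0.3               # initial density
tau, dt = 0.5, 0.01   # effective particle confinement time (s), control step (s)
kp, ki = 2.0, 4.0     # assumed PI gains
integral = 0.0

for _ in range(2000):                             # 20 s of simulated operation
    err = target - n
    integral += err * dt
    puff = max(0.0, kp * err + ki * integral)     # fuelling cannot be negative
    n += dt * (puff - n / tau)                    # Euler step of the particle balance

print(round(n, 2))  # settles at the target density
```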

  13. Surface of Maximums of AR(2) Process Spectral Densities and its Application in Time Series Statistics

    Directory of Open Access Journals (Sweden)

    Alexander V. Ivanov

    2017-09-01

    Conclusions. The obtained formula for the surface of maxima of noise spectral densities makes it possible to determine for which values of the AR(2) process characteristic polynomial coefficients one can expect a greater rate of convergence to zero of the probabilities of large deviations of the considered estimates.
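
The AR(2) spectral density underlying this study can be evaluated directly to locate its maximum; the coefficients below are illustrative values giving complex characteristic roots, hence a pseudo-cyclical spectrum with an interior peak.

```python
# Spectral density of a stationary AR(2) process x_t = a1*x_{t-1} + a2*x_{t-2} + e_t:
#   f(lam) = sigma2 / (2*pi*|1 - a1*exp(-i*lam) - a2*exp(-2i*lam)|^2),
# evaluated on a grid over [0, pi] to locate its maximum.
import numpy as np

a1, a2, sigma2 = 0.5, -0.8, 1.0            # illustrative stationary coefficients
lam = np.linspace(0.0, np.pi, 10001)
denom = np.abs(1.0 - a1 * np.exp(-1j * lam) - a2 * np.exp(-2j * lam))**2
f = sigma2 / (2.0 * np.pi * denom)

lam_max = lam[np.argmax(f)]
print(round(lam_max, 3), round(f.max(), 3))
```

For these coefficients the peak sits at cos λ* = -a1(1-a2)/(4a2), the stationary point obtained by minimizing the denominator in cos λ.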

  14. On the statistical interpretation of quantum mechanics: evolution of the density matrix

    International Nuclear Information System (INIS)

    Benzecri, J.P.

    1986-01-01

    Without attempting to identify ontological interpretation with a mathematical structure, we reduce philosophical speculation to five theses. In the discussion of these, a central role is devoted to the mathematical problem of the evolution of the density matrix. This article relates to the first three of these five theses.
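
The evolution of the density matrix at issue here can be illustrated for a two-level system, where the von Neumann equation iℏ dρ/dt = [H, ρ] is solved exactly by unitary conjugation; the Hamiltonian and times are arbitrary illustrative choices (ℏ = 1).

```python
# Density-matrix evolution rho(t) = U rho(0) U^dagger for a two-level system
# with H = (omega/2) sigma_x; since sigma_x^2 = I, exp(-iHt) has a closed form.
import numpy as np

omega, t = 1.0, 0.7
sx = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

rho0 = np.array([[1, 0], [0, 0]], dtype=complex)   # pure state |0><0|

U = np.cos(omega * t / 2) * I2 - 1j * np.sin(omega * t / 2) * sx
rho_t = U @ rho0 @ U.conj().T

print(round(rho_t[1, 1].real, 4))       # excited-state population sin^2(omega*t/2)
print(round(np.trace(rho_t).real, 4))   # trace preserved under unitary evolution
```

Unitary evolution never depolarizes ρ; the depolarization discussed in the companion record below arises only when one component of a composite system is traced out.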

  15. Detecting reduced bone mineral density from dental radiographs using statistical shape models

    NARCIS (Netherlands)

    Allen, P.D.; Graham, J.; Farnell, D.J.J.; Harrison, E.J.; Jacobs, R.; Nicopoulou-Karyianni, K.; Lindh, C.; van der Stelt, P.F.; Horner, K.; Devlin, H.

    2007-01-01

    We describe a novel method of estimating reduced bone mineral density (BMD) from dental panoramic tomograms (DPTs), which show the entire mandible. Careful expert width measurement of the inferior mandibular cortex has been shown to be predictive of BMD in hip and spine osteopenia and osteoporosis.

  16. On the statistical interpretation of quantum mechanics: evolution of the density matrix

    International Nuclear Information System (INIS)

    Benzecri, J.-P.

    1986-01-01

    Using two classical examples (the Young slit experiment and coherent and incoherent crystal diffraction of neutrons) we show in a general framework, that for a system viewed as consisting of two components, depolarisation of the density matrix by one of these can result from the application of the Schroedinger equation to the global system [fr

  17. Cluster observations of near-Earth magnetospheric lobe plasma densities – a statistical study

    Directory of Open Access Journals (Sweden)

    K. R. Svenes

    2008-09-01

    The Cluster mission has enabled a study of the near-Earth magnetospheric lobes throughout the waning part of solar cycle 23. During the first seven years of the mission the satellites crossed this region of space regularly from about July to October. We have obtained new and more accurate plasma densities in this region based on spacecraft potential measurements from the EFW instrument. The plasma density measurements are found by converting the potential measurements using a functional relationship between these two parameters. Our observations have shown that throughout this period a full two thirds of the measurements were contained in the range 0.007–0.092 cm−3 irrespective of solar wind conditions or geomagnetic activity. In fact, the most probable density encountered was 0.047 cm−3, staying roughly constant throughout the entire observation period. The plasma population in this region seems to reflect an equilibrium situation in which the density is independent of the solar wind condition or geomagnetic activity. However, the high density tail of the population (ne>0.2 cm−3) seemed to decrease with the waning solar cycle. This points to a source region influenced by the diminishing solar UV/EUV intensity. Noting that the quiet time polar wind has just such a development and that it is magnetically coupled to the lobes, it seems reasonable to assume that this is a prominent source for the lobe plasma.

  18. Direct estimation of functionals of density operators by local operations and classical communication

    International Nuclear Information System (INIS)

    Alves, Carolina Moura; Horodecki, Pawel; Oi, Daniel K. L.; Kwek, L. C.; Ekert, Artur K.

    2003-01-01

    We present a method of direct estimation of important properties of a shared bipartite quantum state, within the ''distant laboratories'' paradigm, using only local operations and classical communication. We apply this procedure to spectrum estimation of shared states, and locally implementable structural physical approximations to incompletely positive maps. This procedure can also be applied to the estimation of channel capacity and measures of entanglement

  19. On the relation between the statistical γ-decay and the level density in 162Dy

    International Nuclear Information System (INIS)

    Henden, L.; Bergholt, L.; Guttormsen, M.; Rekstad, J.; Tveter, T.S.

    1994-12-01

    The level density of low-spin states (0-10ħ) in 162Dy has been determined from the ground state up to approximately 6 MeV of excitation energy. Levels in the excitation region up to 8 MeV were populated by means of the 163Dy(3He,α) reaction, and the first-generation γ-rays in the decay of these states have been isolated. The energy distribution of the first-generation γ-rays provides a new source of information about the nuclear level density over a wide energy region. A broad peak is observed in the first-generation spectra, and the authors suggest an interpretation in terms of enhanced M1 transitions between different high-j Nilsson orbitals. 30 refs., 9 figs., 2 tabs

  20. Treatment of automotive industry oily wastewater by electrocoagulation: statistical optimization of the operational parameters.

    Science.gov (United States)

    GilPavas, Edison; Molina-Tirado, Kevin; Gómez-García, Miguel Angel

    2009-01-01

    An electrocoagulation process was used for the treatment of oily wastewater generated from an automotive industry in Medellín (Colombia). An electrochemical cell consisting of four parallel electrodes (Fe and Al) in bipolar configuration was implemented. A multifactorial experimental design was used for evaluating the influence of several parameters, including the type and arrangement of electrodes, pH, and current density. Oil and grease removal was defined as the response variable for the statistical analysis. Additionally, the BOD(5), COD, and TOC were monitored during the treatment process. According to the results, at the optimum parameter values (current density = 4.3 mA/cm(2), distance between electrodes = 1.5 cm, Fe as anode, and pH = 12) it was possible to reach ca. 95% oil and grease removal, and COD removal and mineralization of 87.4% and 70.6%, respectively. A final biodegradability (BOD(5)/COD) of 0.54 was reached.

  1. Application of the Thomas-Fermi statistical model to the thermodynamics of high density matter

    International Nuclear Information System (INIS)

    Martin, R.

    1977-01-01

    The Thomas-Fermi statistical model is used, from the N-body point of view, in order to obtain systematic corrections to the Thomas-Fermi equation. Approximate calculation methods are found from an analytic study of the Thomas-Fermi equation at nonzero temperature. The Thomas-Fermi equation is solved with the code ''Golem'' written in Fortran V (Univac). It also provides the thermodynamical quantities and a new method to calculate several isothermal tables. (author) [es

  2. Application of the Thomas-Fermi statistical model to the thermodynamics of high density matter

    International Nuclear Information System (INIS)

    Martin, R.

    1977-01-01

    The Thomas-Fermi statistical model is used, from the N-body point of view, in order to obtain systematic corrections to the Thomas-Fermi equation. Approximate calculation methods are found from an analytic study of the Thomas-Fermi equation at nonzero temperature. The Thomas-Fermi equation is solved with the code GOLEM written in FORTRAN V (UNIVAC). It also provides the thermodynamical quantities and a new method to calculate several isothermal tables. (Author) 24 refs

  3. Remotely operable compact instruments for measuring atmospheric CO2 and CH4 column densities at surface monitoring sites

    Directory of Open Access Journals (Sweden)

    I. Morino

    2010-08-01

    Remotely operable compact instruments for measuring atmospheric CO2 and CH4 column densities were developed in two independent systems: one utilizing a grating-based desktop optical spectrum analyzer (OSA) with sufficient resolution to resolve the rotational lines of CO2 and CH4 in the regions of 1565–1585 and 1674–1682 nm, respectively; the other an application of an optical fiber Fabry-Perot interferometer (FFPI) to obtain the CO2 column density. Direct sunlight was collimated via a small telescope installed on a portable sun tracker and then transmitted through an optical fiber into the OSA or the FFPI for optical analysis. The near infrared spectra of the OSA were retrieved by a least squares spectral fitting algorithm. The CO2 and CH4 column densities deduced were in excellent agreement with those measured by a high-resolution Fourier transform spectrometer. The rovibronic lines in the wavelength region of 1570–1575 nm were analyzed by the FFPI. The I0 and I values in the Beer-Lambert law equation used to obtain the CO2 column density were deduced by modulating the temperature of the FFPI, which yielded the column CO2 with a statistical error of less than 0.2% for a six-hour measurement.
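
The Beer-Lambert retrieval step mentioned above reduces, in its simplest single-line form, to inverting I = I0·exp(−σN). A schematic sketch with hypothetical numbers; the cross-section value and intensities below are illustrative assumptions, not values from the paper:

```python
import math

def column_density(i0, i, cross_section_cm2):
    """Beer-Lambert: I = I0*exp(-sigma*N)  =>  N = ln(I0/I)/sigma.
    Single-wavelength simplification; real retrievals fit full line shapes."""
    return math.log(i0 / i) / cross_section_cm2

# Hypothetical illustration values (NOT from the paper):
sigma = 1.8e-23                  # cm^2, assumed effective CO2 line cross-section
N = column_density(1.00, 0.62, sigma)
print(N)                         # molecules per cm^2 along the line of sight
```

In practice I0 and I are not observed separately; the paper's temperature-modulation trick for the FFPI is one way to recover their ratio.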

  4. Effects of tillage operations and plant density on leaf spot disease ...

    African Journals Online (AJOL)

    Two seasons of experiments conducted in 2002 and 2003 revealed that tillage operations significantly influenced leaf spot disease severity, percentage lodging (3.14; 2.08) and grain yield (3.02; 3.84) in 2002 and 2003, respectively. Plant density also had a significant effect on leaf spot disease severity and percentage lodging ...

  5. Stochastic optimal control as non-equilibrium statistical mechanics: calculus of variations over density and current

    Science.gov (United States)

    Chernyak, Vladimir Y.; Chertkov, Michael; Bierkens, Joris; Kappen, Hilbert J.

    2014-01-01

    In stochastic optimal control (SOC) one minimizes the average cost-to-go, that consists of the cost-of-control (amount of efforts), cost-of-space (where one wants the system to be) and the target cost (where one wants the system to arrive), for a system participating in forced and controlled Langevin dynamics. We extend the SOC problem by introducing an additional cost-of-dynamics, characterized by a vector potential. We propose derivation of the generalized gauge-invariant Hamilton-Jacobi-Bellman equation as a variation over density and current, suggest hydrodynamic interpretation and discuss examples, e.g., ergodic control of a particle-within-a-circle, illustrating non-equilibrium space-time complexity.

  6. Image Denoising via Bayesian Estimation of Statistical Parameter Using Generalized Gamma Density Prior in Gaussian Noise Model

    Science.gov (United States)

    Kittisuwan, Pichid

    2015-03-01

    The application of image processing in industry has shown remarkable success over the last decade, for example, in security and telecommunication systems. The denoising of natural images corrupted by Gaussian noise is a classical problem in image processing, and image denoising is an indispensable step during image processing. This paper is concerned with dual-tree complex wavelet-based image denoising using Bayesian techniques. One of the cruxes of Bayesian image denoising algorithms is to estimate the statistical parameters of the image. Here, we employ maximum a posteriori (MAP) estimation of the local observed variance, with a generalized Gamma density prior for the local observed variance and a Laplacian or Gaussian distribution for the noisy wavelet coefficients. Our selection of prior distribution is motivated by the efficient and flexible properties of the generalized Gamma density. The experimental results show that the proposed method yields good denoising results.
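
As a rough illustration of the MAP idea only (not the authors' generalized-Gamma formulation): assume zero-mean Gaussian wavelet coefficients with unknown local signal variance s, place a plain Gamma prior on s, find the posterior mode by grid search, and apply Wiener-style shrinkage. All parameter values here are made up for the sketch:

```python
import math
import random

def map_local_variance(y_window, noise_var, alpha=2.0, beta=1.0, grid=None):
    """MAP estimate of the local signal variance s for coefficients
    y_i ~ N(0, s + noise_var) under a Gamma(alpha, beta) prior on s.
    Plain grid search over candidate variances (schematic, not optimized)."""
    if grid is None:
        grid = [0.01 * k for k in range(1, 1001)]
    def log_post(s):
        total = s + noise_var
        loglik = sum(-0.5 * math.log(2 * math.pi * total) - yi * yi / (2 * total)
                     for yi in y_window)
        logprior = (alpha - 1) * math.log(s) - beta * s   # unnormalized Gamma prior
        return loglik + logprior
    return max(grid, key=log_post)

def wiener_shrink(y, signal_var, noise_var):
    """Shrink a noisy coefficient toward zero by the Wiener factor."""
    return signal_var / (signal_var + noise_var) * y

random.seed(0)
true_s, noise_var = 4.0, 1.0          # hypothetical local signal and noise variances
window = [random.gauss(0.0, math.sqrt(true_s + noise_var)) for _ in range(200)]
s_hat = map_local_variance(window, noise_var)
denoised = [wiener_shrink(y, s_hat, noise_var) for y in window]
print(s_hat)
```

The estimated variance lands near the true value, and every shrunken coefficient has smaller magnitude than its noisy input, which is the qualitative behaviour such denoisers rely on.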

  7. Whole brain analysis of postmortem density changes of grey and white matter on computed tomography by statistical parametric mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nishiyama, Yuichi; Mori, Hiroshi; Katsube, Takashi; Kitagaki, Hajime [Shimane University Faculty of Medicine, Department of Radiology, Izumo-shi, Shimane (Japan); Kanayama, Hidekazu; Tada, Keiji; Yamamoto, Yasushi [Shimane University Hospital, Department of Radiology, Izumo-shi, Shimane (Japan); Takeshita, Haruo [Shimane University Faculty of Medicine, Department of Legal Medicine, Izumo-shi, Shimane (Japan); Kawakami, Kazunori [Fujifilm RI Pharma, Co., Ltd., Tokyo (Japan)

    2017-06-15

    This study examined the usefulness of statistical parametric mapping (SPM) for investigating postmortem changes on brain computed tomography (CT). This retrospective study included 128 patients (23 - 100 years old) without cerebral abnormalities who underwent unenhanced brain CT before and after death. The antemortem CT (AMCT) scans and postmortem CT (PMCT) scans were spatially normalized using our original brain CT template, and postmortem changes of CT values (in Hounsfield units; HU) were analysed by the SPM technique. Compared with AMCT scans, 58.6 % and 98.4 % of PMCT scans showed loss of the cerebral sulci and an unclear grey matter (GM)-white matter (WM) interface, respectively. SPM analysis revealed a significant decrease in cortical GM density within 70 min after death on PMCT scans, suggesting cytotoxic brain oedema. Furthermore, there was a significant increase in the density of the WM, lenticular nucleus and thalamus more than 120 min after death. The SPM technique demonstrated typical postmortem changes on brain CT scans, and revealed that the unclear GM-WM interface on early PMCT scans is caused by a rapid decrease in cortical GM density combined with a delayed increase in WM density. SPM may be useful for assessment of whole brain postmortem changes. (orig.)

  8. Whole brain analysis of postmortem density changes of grey and white matter on computed tomography by statistical parametric mapping

    International Nuclear Information System (INIS)

    Nishiyama, Yuichi; Mori, Hiroshi; Katsube, Takashi; Kitagaki, Hajime; Kanayama, Hidekazu; Tada, Keiji; Yamamoto, Yasushi; Takeshita, Haruo; Kawakami, Kazunori

    2017-01-01

    This study examined the usefulness of statistical parametric mapping (SPM) for investigating postmortem changes on brain computed tomography (CT). This retrospective study included 128 patients (23 - 100 years old) without cerebral abnormalities who underwent unenhanced brain CT before and after death. The antemortem CT (AMCT) scans and postmortem CT (PMCT) scans were spatially normalized using our original brain CT template, and postmortem changes of CT values (in Hounsfield units; HU) were analysed by the SPM technique. Compared with AMCT scans, 58.6 % and 98.4 % of PMCT scans showed loss of the cerebral sulci and an unclear grey matter (GM)-white matter (WM) interface, respectively. SPM analysis revealed a significant decrease in cortical GM density within 70 min after death on PMCT scans, suggesting cytotoxic brain oedema. Furthermore, there was a significant increase in the density of the WM, lenticular nucleus and thalamus more than 120 min after death. The SPM technique demonstrated typical postmortem changes on brain CT scans, and revealed that the unclear GM-WM interface on early PMCT scans is caused by a rapid decrease in cortical GM density combined with a delayed increase in WM density. SPM may be useful for assessment of whole brain postmortem changes. (orig.)

  9. Microstructure characterisation of solid oxide electrolysis cells operated at high current density

    DEFF Research Database (Denmark)

    Bowen, Jacob R.; Bentzen, Janet Jonna; Chen, Ming

    High temperature solid oxide cells can be operated either as fuel cells or electrolysis cells for efficient power generation or production of hydrogen from steam or synthesis gas (H2 + CO) from steam and CO2, respectively. When operated under harsh conditions, they often exhibit microstructural degradation of cell components in relation to the loss of electrochemical performance specific to the mode of operation. Thus descriptive microstructure characterization methods are required in combination with electrochemical characterization methods to decipher degradation mechanisms. In the present work ... quantified using the mean linear intercept method as a function of current density and correlated to increases in serial resistance. The above structural changes are then compared in terms of electrode degradation observed during the co-electrolysis of steam and CO2 at current densities up to -1.5 A cm-2...

  10. Convergence of statistical moments of particle density time series in scrape-off layer plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Kube, R., E-mail: ralph.kube@uit.no; Garcia, O. E. [Department of Physics and Technology, UiT - The Arctic University of Norway, N-9037 Tromsø (Norway)

    2015-01-15

    Particle density fluctuations in the scrape-off layer of magnetically confined plasmas, as measured by gas-puff imaging or Langmuir probes, are modeled as the realization of a stochastic process in which a superposition of pulses with a fixed shape, an exponential distribution of waiting times, and amplitudes represents the radial motion of blob-like structures. With an analytic formulation of the process at hand, we derive expressions for the mean squared error on estimators of sample mean and sample variance as a function of sample length, sampling frequency, and the parameters of the stochastic process. Employing that the probability distribution function of a particularly relevant stochastic process is given by the gamma distribution, we derive estimators for sample skewness and kurtosis and expressions for the mean squared error on these estimators. Numerically generated synthetic time series are used to verify the proposed estimators, the sample length dependency of their mean squared errors, and their performance. We find that estimators for sample skewness and kurtosis based on the gamma distribution are more precise and more accurate than common estimators based on the method of moments.
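
The gamma-based estimators discussed here follow from the closed-form moments of the gamma distribution: with shape parameter k, the skewness is 2/√k and the excess kurtosis is 6/k, and k itself can be estimated from the sample mean and variance (k = mean²/variance). A sketch on synthetic data; the shape and scale values are illustrative, not taken from the paper:

```python
import math
import random
import statistics

def gamma_based_skew_kurt(samples):
    """Skewness and excess kurtosis assuming gamma-distributed data:
    fit the shape k from mean and variance (k = mean^2/var), then
    use S = 2/sqrt(k) and K_excess = 6/k."""
    m = statistics.fmean(samples)
    v = statistics.variance(samples)
    k = m * m / v
    return 2.0 / math.sqrt(k), 6.0 / k

def moment_based_skew(samples):
    """Common method-of-moments skewness estimator, for comparison."""
    m = statistics.fmean(samples)
    s = statistics.stdev(samples)
    n = len(samples)
    return sum((x - m) ** 3 for x in samples) / (n * s ** 3)

random.seed(1)
shape, scale = 4.0, 1.5          # illustrative pulse-amplitude parameters
series = [random.gammavariate(shape, scale) for _ in range(20000)]
s_gamma, k_gamma = gamma_based_skew_kurt(series)
print(s_gamma, 2 / math.sqrt(shape))   # gamma-based estimate vs true skewness
```

Because the gamma-based route only needs the first two sample moments, it avoids the large variance of third- and fourth-moment sums, which is the intuition behind the paper's precision result.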

  11. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.

    Science.gov (United States)

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages of and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most of the literature in the area). The analysis allows one to identify outliers in the empirical datasets and objectively determine whether there is a need for data trimming and at which points it should be done.
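
The ex-Gaussian density has a standard closed form as the convolution of a Gaussian N(μ, σ²) with an exponential of mean τ, and samples are generated simply as the sum of the two components. A minimal sketch (not using ExGUtils itself; the reaction-time parameters are illustrative):

```python
import math
import random

def exgauss_pdf(x, mu, sigma, tau):
    """Ex-Gaussian density: convolution of N(mu, sigma^2) with an
    exponential of mean tau (standard closed form via erfc)."""
    lam = 1.0 / tau
    arg = lam / 2.0 * (2.0 * mu + lam * sigma ** 2 - 2.0 * x)
    z = (mu + lam * sigma ** 2 - x) / (math.sqrt(2.0) * sigma)
    return lam / 2.0 * math.exp(arg) * math.erfc(z)

def exgauss_sample(mu, sigma, tau, n, rng=random):
    """Draw samples as Gaussian + independent exponential components."""
    return [rng.gauss(mu, sigma) + rng.expovariate(1.0 / tau) for _ in range(n)]

random.seed(2)
mu, sigma, tau = 300.0, 30.0, 100.0   # illustrative reaction-time parameters (ms)
data = exgauss_sample(mu, sigma, tau, 50000)
mean = sum(data) / len(data)
print(mean, mu + tau)                 # ex-Gaussian mean is mu + tau
```

The long exponential tail (governed by τ) is what makes this distribution a good model for the right-skewed shape of empirical reaction-time histograms.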

  12. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density

    Directory of Open Access Journals (Sweden)

    Carmen Moret-Tatay

    2018-05-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages of and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most of the literature in the area). The analysis allows one to identify outliers in the empirical datasets and objectively determine whether there is a need for data trimming and at which points it should be done.

  13. Convergence of statistical moments of particle density time series in scrape-off layer plasmas

    International Nuclear Information System (INIS)

    Kube, R.; Garcia, O. E.

    2015-01-01

    Particle density fluctuations in the scrape-off layer of magnetically confined plasmas, as measured by gas-puff imaging or Langmuir probes, are modeled as the realization of a stochastic process in which a superposition of pulses with a fixed shape, an exponential distribution of waiting times, and amplitudes represents the radial motion of blob-like structures. With an analytic formulation of the process at hand, we derive expressions for the mean squared error on estimators of sample mean and sample variance as a function of sample length, sampling frequency, and the parameters of the stochastic process. Employing that the probability distribution function of a particularly relevant stochastic process is given by the gamma distribution, we derive estimators for sample skewness and kurtosis and expressions for the mean squared error on these estimators. Numerically generated synthetic time series are used to verify the proposed estimators, the sample length dependency of their mean squared errors, and their performance. We find that estimators for sample skewness and kurtosis based on the gamma distribution are more precise and more accurate than common estimators based on the method of moments

  14. Improvement of the environmental and operational characteristics of vehicles through decreasing the motor fuel density.

    Science.gov (United States)

    Magaril, Elena

    2016-04-01

    The environmental and operational characteristics of motor transport, one of the main consumers of motor fuel and a source of toxic emissions, soot, and greenhouse gases, are determined to a large extent by the fuel quality, which is characterized by many parameters. Fuel density is one of these parameters, and it can serve as an indicator of fuel quality. It has been theoretically substantiated that an increased density of motor fuel has a negative impact on both the environmental and operational characteristics of motor transport. The use of fuels with a high density leads to increased carbonization within the engine, adversely affecting vehicle performance and increasing environmental pollution. A program of technological measures targeted at reducing the density of the fuel used was proposed. It includes changing the ratio of refining capacities and the boiling temperature ranges of gasoline and diesel fuel, introducing fuel additives, and adding butanes to the gasoline. An environmental tax has been developed which allows oil refineries to have a direct impact on the production of fuels with improved environmental performance, taking into account the need to minimize the density of the fuel within a given category of quality.

  15. Electrical Conductivity of Charged Particle Systems and Zubarev's Nonequilibrium Statistical Operator Method

    Science.gov (United States)

    Röpke, G.

    2018-01-01

    One of the fundamental problems in physics that are not yet rigorously solved is the statistical mechanics of nonequilibrium processes. An important contribution to describing irreversible behavior starting from reversible Hamiltonian dynamics was given by D. N. Zubarev, who invented the method of the nonequilibrium statistical operator. We discuss this approach, in particular, the extended von Neumann equation, and as an example consider the electrical conductivity of a system of charged particles. We consider the selection of the set of relevant observables. We show the relation between kinetic theory and linear response theory. Using thermodynamic Green's functions, we present a systematic treatment of correlation functions, but the convergence needs investigation. We compare different expressions for the conductivity and list open questions.

  16. Structural characterization and condition for measurement statistics preservation of a unital quantum operation

    International Nuclear Information System (INIS)

    Lee, Kai-Yan; Fung, Chi-Hang Fred; Chau, H F

    2013-01-01

    We investigate the necessary and sufficient condition for a convex cone of positive semidefinite operators to be fixed by a unital quantum operation ϕ acting on finite-dimensional quantum states. By reducing this problem to the problem of simultaneous diagonalization of the Kraus operators associated with ϕ, we can completely characterize the kinds of quantum states that are fixed by ϕ. Our work has several applications. It gives a simple proof of the structural characterization of a unital quantum operation that acts on finite-dimensional quantum states—a result not explicitly mentioned in earlier studies. It also provides a necessary and sufficient condition for determining what kind of measurement statistics is preserved by a unital quantum operation. Finally, our result clarifies and extends the work of Størmer by giving a proof of a reduction theorem on the unassisted and entanglement-assisted classical capacities, coherent information, and minimal output Renyi entropy of a unital channel acting on a finite-dimensional quantum state. (paper)
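
A small numerical illustration of the fixed-point statement (a generic textbook example, not taken from the paper): for a phase-damping qubit channel the Kraus operators commute and are simultaneously diagonal, so the channel is unital, and its fixed states are exactly the diagonal ones while off-diagonal terms are contracted:

```python
import math

def matmul(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def dagger(A):
    """Conjugate transpose of a 2x2 matrix."""
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def apply_channel(kraus, rho):
    """Phi(rho) = sum_k K_k rho K_k^dagger."""
    out = [[0j, 0j], [0j, 0j]]
    for K in kraus:
        out = add(out, matmul(matmul(K, rho), dagger(K)))
    return out

# Phase-damping channel: K0 = sqrt(p)*I, K1 = sqrt(1-p)*Z.
# The Kraus operators are simultaneously diagonal, so by the
# diagonalization argument the fixed states are the diagonal ones.
p = 0.75
s, t = math.sqrt(p), math.sqrt(1 - p)
kraus = [[[s, 0], [0, s]], [[t, 0], [0, -t]]]

identity = [[1, 0], [0, 1]]
out_id = apply_channel(kraus, identity)     # unital: identity is preserved
rho_plus = [[0.5, 0.5], [0.5, 0.5]]         # |+><+|, not diagonal
out_rho = apply_channel(kraus, rho_plus)    # off-diagonals scaled by 2p-1
print(out_id, out_rho)
```

Measurement statistics in the computational basis (the diagonal entries) are preserved for every input here, while coherences decay, matching the paper's characterization of which statistics a unital map can keep fixed.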

  17. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    Science.gov (United States)

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after a stabilized process. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. There are some measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
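
The individuals control chart underlying this kind of phase I analysis can be sketched in a few lines: the center line is the sample mean and the control limits sit at ±2.66 times the average moving range. The operative times below are hypothetical, not the study's data:

```python
import statistics

def individuals_chart_limits(times):
    """Shewhart individuals (X-mR) chart: center line at the mean,
    control limits at mean +/- 2.66 * average moving range."""
    moving_ranges = [abs(b - a) for a, b in zip(times, times[1:])]
    mr_bar = statistics.fmean(moving_ranges)
    center = statistics.fmean(times)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical operative times in minutes (illustrative only):
op_times = [62, 58, 65, 70, 61, 59, 66, 63, 60, 64]
lcl, cl, ucl = individuals_chart_limits(op_times)
out_of_control = [t for t in op_times if not lcl <= t <= ucl]
print(lcl, cl, ucl, out_of_control)
```

Points outside the limits would be excluded and the limits recomputed, which is the "stabilization" step the abstract emphasizes before any between-surgeon comparison.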

  18. Current density distribution mapping in PEM fuel cells as an instrument for operational measurements

    Energy Technology Data Exchange (ETDEWEB)

    Geske, M.; Heuer, M.; Heideck, G.; Styczynski, Z. A. [Otto-von-Guericke University Magdeburg, Chair Electric Power Networks and Renewable Energy Sources, Magdeburg (Germany)

    2010-07-01

    A newly developed measurement system for current density distribution mapping has enabled a new approach for operational measurements in proton exchange membrane fuel cells (PEMFC). Taking into account previously constructed measurement systems, a method based on a multilayer printed circuit board was chosen for the development of the new system. This type of system consists of a sensor, a special electronic device and the control and visualization PC. For the acquisition of the current density distribution values, a sensor device was designed and installed within a multilayer printed circuit board with integrated shunt resistors. Varying shunt values can be taken into consideration with a newly developed and evaluated calibration method. The sensor device was integrated in a PEM fuel cell stack to prove the functionality of the whole measurement system. A software application was implemented to visualize and save the measurement values. Its functionality was verified by operational measurements within a PEMFC system. Measurement accuracy and possible negative reactions of the sensor device during PEMFC operation are discussed in detail in this paper. The developed system enables operational measurements for different operating phases of PEM fuel cells. Additionally, this can be seen as a basis for new opportunities of optimization for fuel cell design and operation modes. (author)

  19. Current Density Distribution Mapping in PEM Fuel Cells as An Instrument for Operational Measurements

    Directory of Open Access Journals (Sweden)

    Martin Geske

    2010-04-01

    A newly developed measurement system for current density distribution mapping has enabled a new approach for operational measurements in proton exchange membrane fuel cells (PEMFC). Taking into account previously constructed measurement systems, a method based on a multilayer printed circuit board was chosen for the development of the new system. This type of system consists of a sensor, a special electronic device and the control and visualization PC. For the acquisition of the current density distribution values, a sensor device was designed and installed within a multilayer printed circuit board with integrated shunt resistors. Varying shunt values can be taken into consideration with a newly developed and evaluated calibration method. The sensor device was integrated in a PEM fuel cell stack to prove the functionality of the whole measurement system. A software application was implemented to visualize and save the measurement values. Its functionality was verified by operational measurements within a PEMFC system. Measurement accuracy and possible negative reactions of the sensor device during PEMFC operation are discussed in detail in this paper. The developed system enables operational measurements for different operating phases of PEM fuel cells. Additionally, this can be seen as a basis for new opportunities of optimization for fuel cell design and operation modes.

  20. Effect of low density H-mode operation on edge and divertor plasma parameters

    International Nuclear Information System (INIS)

    Maingi, R.; Mioduszewski, P.K.; Cuthbertson, J.W.

    1994-07-01

    We present a study of the impact of H-mode operation at low density on divertor plasma parameters on the DIII-D tokamak. The line-average density in H-mode was scanned by variation of the particle exhaust rate, using the recently installed divertor cryo-condensation pump. The maximum decrease (50%) in line-average electron density was accompanied by a factor of 2 increase in the edge electron temperature, and 10% and 20% reductions in the measured core and divertor radiated power, respectively. The measured total power to the inboard divertor target increased by a factor of 3, with the major contribution coming from a factor of 5 increase in the peak heat flux very close to the inner strike point. The measured increase in power at the inboard divertor target was approximately equal to the measured decrease in core and divertor radiation

  1. The Canopy Graph and Level Statistics for Random Operators on Trees

    International Nuclear Information System (INIS)

    Aizenman, Michael; Warzel, Simone

    2006-01-01

    For operators with homogeneous disorder, it is generally expected that there is a relation between the spectral characteristics of a random operator in the infinite setup and the distribution of the energy gaps in its finite volume versions, in corresponding energy ranges. Whereas pure point spectrum of the infinite operator goes along with Poisson level statistics, it is expected that purely absolutely continuous spectrum would be associated with gap distributions resembling the corresponding random matrix ensemble. We prove that on regular rooted trees, which exhibit both spectral types, the eigenstate point process always has a Poissonian limit. However, we also find that this does not contradict the picture described above if that is carefully interpreted, as the relevant limit of finite trees is not the infinite homogeneous tree graph but rather a single-ended 'canopy graph.' For this tree graph, the random Schroedinger operator is proven here to have only pure-point spectrum at any strength of the disorder. For more general single-ended trees it is shown that the spectrum is always singular: pure point, possibly with a singular continuous component, which is proven to occur in some cases

  2. Statistical quantization of GUT models and phase diagrams of W condensation for the Universe with finite fermion density

    International Nuclear Information System (INIS)

    Kalashnikov, O.K.; Razumov, L.V.; Perez Rojas, H.

    1990-01-01

    The problems of statistical quantization for grand-unified-theory models are studied using as an example the Weinberg-Salam model with finite fermion density under the conditions of neutral and electric charge conservation. The relativistic R_γ gauge with an arbitrary parameter is used, and the one-loop effective potential together with its extremum equations is found. We demonstrate (and this is our main result) that the thermodynamic potential obtained from the effective one, after the mass shell for ξ is used, remains gauge dependent if all temperature ranges (not only the leading high-temperature terms) are considered. The contradiction detected within the calculational scheme is eliminated after the model studied is redefined with the aid of terms which are proportional to the 'non-Abelian' chemical potential and vanish identically when the unitary gauge is fixed. The phase diagrams of W condensation are established and all their peculiarities are displayed. We find, for a universe with zero neutral charge density, that the W condensate occurs at any small fermion density ρ, appearing first near the point of symmetry restoration. For all ρ≠0 this condensate exists only in a finite-temperature domain and evaporates completely or partially as T goes to zero.

  3. Matrix product operators, matrix product states, and ab initio density matrix renormalization group algorithms

    Science.gov (United States)

    Chan, Garnet Kin-Lic; Keselman, Anna; Nakatani, Naoki; Li, Zhendong; White, Steven R.

    2016-07-01

    Current descriptions of the ab initio density matrix renormalization group (DMRG) algorithm use two superficially different languages: an older language of the renormalization group and renormalized operators, and a more recent language of matrix product states and matrix product operators. The same algorithm can appear dramatically different when written in the two different vocabularies. In this work, we carefully describe the translation between the two languages in several contexts. First, we describe how to efficiently implement the ab initio DMRG sweep using a matrix product operator based code, and the equivalence to the original renormalized operator implementation. Next we describe how to implement the general matrix product operator/matrix product state algebra within a pure renormalized operator-based DMRG code. Finally, we discuss two improvements of the ab initio DMRG sweep algorithm motivated by matrix product operator language: Hamiltonian compression, and a sum over operators representation that allows for perfect computational parallelism. The connections and correspondences described here serve to link the future developments with the past and are important in the efficient implementation of continuing advances in ab initio DMRG and related algorithms.
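    The MPO/MPS algebra described above can be made concrete in a few lines of NumPy. The sketch below is purely illustrative (random tensors, not an ab initio Hamiltonian): it applies an MPO to an MPS site by site, with bond dimensions multiplying, and checks the result against a brute-force dense contraction.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    L, d, Dm, Dw = 4, 2, 3, 2   # sites, physical dim, MPS and MPO bond dims

    # MPS tensors A[i] of shape (Dl, d, Dr); boundary bond dimensions are 1.
    dims = [1] + [Dm] * (L - 1) + [1]
    A = [rng.normal(size=(dims[i], d, dims[i + 1])) for i in range(L)]

    # MPO tensors W[i] of shape (Dl, d_out, d_in, Dr).
    wdims = [1] + [Dw] * (L - 1) + [1]
    W = [rng.normal(size=(wdims[i], d, d, wdims[i + 1])) for i in range(L)]

    def mps_to_dense(ts):
        """Contract an MPS into a dense state vector."""
        v = ts[0]
        for t in ts[1:]:
            v = np.tensordot(v, t, axes=([-1], [0]))
        return v.reshape(-1)

    def mpo_to_dense(ts):
        """Contract an MPO into a dense operator matrix."""
        m = ts[0]
        for t in ts[1:]:
            m = np.tensordot(m, t, axes=([-1], [0]))
        m = m.squeeze(axis=(0, -1))        # indices (o1, i1, ..., oL, iL)
        out = tuple(range(0, 2 * L, 2))
        inn = tuple(range(1, 2 * L, 2))
        return m.transpose(out + inn).reshape(d ** L, d ** L)

    def apply_mpo(W, A):
        """Apply an MPO to an MPS site by site; bond dimensions multiply."""
        out = []
        for w, a in zip(W, A):
            t = np.tensordot(w, a, axes=([2], [1]))   # (wl, o, wr, al, ar)
            t = t.transpose(0, 3, 1, 2, 4)            # (wl, al, o, wr, ar)
            s = t.shape
            out.append(t.reshape(s[0] * s[1], s[2], s[3] * s[4]))
        return out

    # The site-by-site product must equal the dense operator-vector product.
    lhs = mps_to_dense(apply_mpo(W, A))
    rhs = mpo_to_dense(W) @ mps_to_dense(A)
    ```

    The bond-dimension growth visible in `apply_mpo` (Dw·Dm) is exactly what Hamiltonian compression, mentioned in the abstract, aims to keep under control.
    
    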

  4. Electron density and temperature in NIO1 RF source operated in oxygen and argon

    Science.gov (United States)

    Barbisan, M.; Zaniol, B.; Cavenago, M.; Pasqualotto, R.; Serianni, G.; Zanini, M.

    2017-08-01

    The NIO1 experiment, built and operated at Consorzio RFX, hosts an RF negative ion source from which it is possible to produce a beam of up to 130 mA of H- ions, accelerated to 60 kV. For the preliminary tests of the extraction system the source has been operated in oxygen, whose high electronegativity makes it possible to reach useful levels of extracted beam current. The efficiency of negative ion extraction is strongly influenced by the electron density and temperature close to the Plasma Grid, i.e. the grid of the acceleration system which faces the source. To support the tests, these parameters have been measured by means of Optical Emission Spectroscopy. This technique involved the use of an oxygen-argon mixture to produce the plasma in the source. The intensities of specific Ar I and Ar II lines were measured along lines of sight close to the Plasma Grid and interpreted with the ADAS package to obtain the desired information. This work describes the diagnostic hardware, the analysis method and the measured values of electron density and temperature as a function of the main source parameters (RF power, pressure, bias voltage and magnetic filter field). The main results show that both electron density and electron temperature increase with RF power, and both decrease with increasing magnetic filter field. Variations of source pressure and plasma grid bias voltage appear to affect only electron temperature and electron density, respectively.

  5. SEGMENTATION AND CLASSIFICATION OF CERVICAL CYTOLOGY IMAGES USING MORPHOLOGICAL AND STATISTICAL OPERATIONS

    Directory of Open Access Journals (Sweden)

    S Anantha Sivaprakasam

    2017-02-01

    Full Text Available Cervical cancer, a disease in which malignant (cancer) cells form in the tissues of the cervix, is the fourth leading cause of cancer death in the female community worldwide. Cervical cancer can be prevented and/or cured if it is diagnosed in the pre-cancerous lesion stage or earlier. A common physical examination technique widely used in screening is the Papanicolaou test (Pap test), which is used to detect abnormality of the cell. Due to the intricacy of the cell nature, automating this procedure is still a herculean task for the pathologist. This paper addresses these challenges with a simple and novel method to segment and classify cervical cells automatically. The primary step of this procedure is pre-processing, in which de-noising, de-correlation and segregation of colour components are carried out. Then two new techniques put forward in this paper, Morphological and Statistical Edge-based segmentation and Morphological and Statistical Region-based segmentation, are applied to each component of the image to segment the nuclei from the cervical image. Finally, all segmented colour components are combined to produce the final segmentation result. After extracting the nuclei, morphological features are extracted from them. Both techniques outperformed standard segmentation techniques, and Morphological and Statistical Edge-based segmentation outperformed Morphological and Statistical Region-based segmentation. Finally, the nuclei are classified based on the morphological values, and the segmentation accuracy is echoed in the classification accuracy. The overall segmentation accuracy is 97%.
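    As a toy illustration of the morphological step only (not the authors' algorithm): thresholding a synthetic grayscale patch with dark nuclei, then cleaning the mask with a binary opening before labeling the connected components. All data here are fabricated for the example.

    ```python
    import numpy as np
    from scipy import ndimage

    # Hypothetical grayscale cytology patch: dark nuclei on a bright background.
    rng = np.random.default_rng(0)
    img = np.full((64, 64), 200.0) + rng.normal(0, 5, (64, 64))
    img[10:20, 10:20] = 60      # synthetic nucleus 1
    img[40:52, 35:47] = 70      # synthetic nucleus 2

    # Threshold, then remove small artifacts with a morphological opening.
    mask = img < 130
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    labels, n_nuclei = ndimage.label(mask)
    ```

    The opening (erosion followed by dilation) is what makes the subsequent labeling robust to isolated noise pixels.
    
    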

  6. Statistics of AUV's Missions for Operational Ocean Observation at the South Brazilian Bight.

    Science.gov (United States)

    dos Santos, F. A.; São Tiago, P. M.; Oliveira, A. L. S. C.; Barmak, R. B.; Miranda, T. C.; Guerra, L. A. A.

    2016-02-01

    The high costs and logistics limitations of ship-based data collection represent an obstacle for persistent in-situ data collection. Satellite-operated Autonomous Underwater Vehicles (AUVs), or gliders as these AUVs are generally known in the scientific community, are presented as an inexpensive and reliable alternative for long-term, real-time ocean monitoring of important parameters such as temperature, salinity, water quality and acoustics. This work focuses on the performance statistics and the reliability for continuous operation of a fleet of seven gliders navigating in Santos Basin, Brazil, since March 2013. The gliders' performance was evaluated by the number of standby days versus the number of operating days, the number of missions interrupted by (1) equipment failure, (2) weather or (3) accident versus the number of successful missions, and the amount and quality of data collected. From the start of operations in March 2013 to the preparation of this work (July 2015), a total of 16 glider missions were accomplished, with gliders operating during 728 of the 729 days elapsed. Of this total, 11 missions were successful, 3 missions were interrupted due to equipment failure and 2 gliders were lost. Most of the identified issues were observed in communication with the glider (when recovery was necessary) or the optode sensors (when remote settings solved the problem). The average duration of a successful mission was 103 days, while interrupted ones ended on average in 7 days. The longest mission lasted 139 days, performing 859 continuous profiles and covering a distance of 2734 km. Together, the two projects performed 6856 dives, providing an average of 9.5 profiles per day, or one profile every 2.5 hours, during 2 consecutive years.

  7. Density functional representation of quantum chemistry. II. Local quantum field theories of molecular matter in terms of the charge density operator do not work

    International Nuclear Information System (INIS)

    Primas, H.; Schleicher, M.

    1975-01-01

    A comprehensive review is given of the attempts to rephrase molecular quantum mechanics in terms of the particle density operator and the current density or phase density operator. All pertinent investigations which have come to our attention suffer from severe mathematical inconsistencies and are not adequate for the few-body problem of quantum chemistry. The origin of the failure of these attempts is investigated, and it is shown that a realization of a local quantum field theory of molecular matter in terms of observables would presuppose the solution of many highly nontrivial mathematical problems.

  8. Density fluctuation measurements via reflectometry on DIII-D during L- and H-mode operation

    International Nuclear Information System (INIS)

    Doyle, E.J.; Lehecka, T.; Luhmann, N.C. Jr.; Peebles, W.A.; Philipona, R.

    1990-01-01

    The unique ability of reflectometers to provide radial density fluctuation measurements with high spatial resolution (of the order of centimeters or less) is ideally suited to the study of the edge plasma modifications associated with H-mode operation. Consequently, attention has been focused on the study of these phenomena, since an improved understanding of the physics of H-mode plasmas is essential if a predictive capability for machine performance is to be developed. In addition, DIII-D is ideally suited for such studies, since it is a major device noted for its robust H-mode operation and excellent basic plasma profile diagnostics. The reflectometer system normally used for fluctuation studies is an O-mode homodyne system utilizing 7 discrete channels spanning 15-75 GHz, with corresponding critical densities of 2.8x10^18 to 7x10^19 m^-3. The Gunn diode sources in this system are only narrowly tunable in frequency, so the critical densities are essentially fixed. An X-mode system, utilizing a frequency-tunable BWO source, has also been used to obtain fluctuation data and, in particular, to 'fill in the gaps' between the discrete O-mode channels. (author) 12 refs., 5 figs
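    The quoted critical densities follow from the O-mode cutoff condition, at which the wave frequency equals the local plasma frequency: n_c = eps0 * m_e * (2*pi*f)^2 / e^2. A quick numerical check against the 15-75 GHz channel range:

    ```python
    import math

    def o_mode_critical_density(f_hz):
        """O-mode cutoff (critical) density n_c = eps0 * m_e * (2*pi*f)^2 / e^2."""
        eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
        m_e = 9.1093837015e-31    # electron mass, kg
        e = 1.602176634e-19       # elementary charge, C
        return eps0 * m_e * (2 * math.pi * f_hz) ** 2 / e ** 2

    n_low = o_mode_critical_density(15e9)    # ~2.8e18 m^-3
    n_high = o_mode_critical_density(75e9)   # ~7.0e19 m^-3
    ```

    The quadratic frequency dependence explains why a 5x frequency span covers a 25x span in accessible density.
    
    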

  9. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical value of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and, ultimately, reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and is therefore valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as radar track data, into the analysis.

  10. Origin of structure: statistical characterization of the primordial density fluctuations and the collapse of the wave function

    Energy Technology Data Exchange (ETDEWEB)

    León, Gabriel [Departamento de Física, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Ciudad Universitaria - Pab. I, Buenos Aires 1428 (Argentina); Sudarsky, Daniel, E-mail: gleon@df.uba.ar, E-mail: sudarsky@nucleares.unam.mx [Instituto de Ciencias Nucleares, Universidad Nacional Autónoma de México, México D.F. 04510, México (Mexico)

    2015-06-01

    The statistical properties of the primordial density perturbations have been considered in the past decade as a powerful probe of the physical processes taking place in the early universe. Within the inflationary paradigm, the properties of the bispectrum are one of the keys that serve to discriminate among competing scenarios concerning the details of the origin of cosmological perturbations. However, all of these scenarios, based on the conventional approach to the so-called 'quantum-to-classical transition' during inflation, lack the ability to point out the precise physical mechanism responsible for generating the inhomogeneity and anisotropy of our universe starting from an exactly homogeneous and isotropic vacuum state associated with the early inflationary regime. In past works, we have shown that proposals involving a spontaneous dynamical reduction of the quantum state provide plausible explanations for the birth of said primordial inhomogeneities and anisotropies. In the present manuscript we show that, when considered within the context of such proposals, the characterization of the spectrum and bispectrum turns out to be quite different from that found in the traditional approach; in particular, some of the statistical features must be treated in a different way, leading to rather different conclusions.

  11. Advanced calibration, adjustment, and operation of a density and sound speed analyzer

    International Nuclear Information System (INIS)

    Fortin, Tara J.; Laesecke, Arno; Freund, Malte; Outcalt, Stephanie

    2013-01-01

    Highlights: ► Detail important considerations for reference quality measurements of thermophysical property data with benchtop instruments. ► Density and speed of sound of isooctane and speed of sound of toluene at (278 K to 343 K) and atmospheric pressure. ► Experimental data compared to available literature data and equations of state. - Abstract: Benchtop measurement systems have emerged as powerful tools in the ongoing quest for thermophysical property data. We demonstrate that these instruments can yield results of high quality if operated in an informed manner. The importance of sample purity, reproducibility over repeatability, expanded calibration and adjustment protocols, and rigorous uncertainty estimates are emphasized. We report measurement results at ambient atmospheric pressure and temperatures from 343 K to 278 K, including expanded uncertainty estimates, for the density and speed of sound of isooctane and for the speed of sound of toluene. These data are useful for validating the performance of such instruments.

  12. Degradation of Solid Oxide Electrolysis Cells Operated at High Current Densities

    DEFF Research Database (Denmark)

    Tao, Youkun; Ebbesen, Sune Dalgaard; Mogensen, Mogens Bjerg

    2014-01-01

    In this work the durability of solid oxide cells for co-electrolysis of steam and carbon dioxide (45% H2O + 45% CO2 + 10% H2) at high current densities was investigated. The tested cells are Ni-YSZ electrode supported, with a YSZ electrolyte and either a LSM-YSZ or LSCF-CGO oxygen electrode. A current density of -1.5 or -2.0 A/cm2 was applied to the cell, with gas conversions of 45% and 60%, respectively. The cells were operated for a period of up to 700 hours. The electrochemical analysis revealed significant performance degradation for the ohmic process and oxygen ion interfacial transfer...

  13. [Development of a software standardizing optical density with operation settings related to several limitations].

    Science.gov (United States)

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop software that standardizes optical density, normalizing the procedures and results of standardization, in order to effectively solve several problems generated during the standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to solve the problems that one might encounter during standardization. Matlab GUI was used as the development tool. The software was tested with results from the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (Windows XP/Win 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effectiveness of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including reliability of the standard curve, the applicable scope of samples and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is professional software based on the I-STOD method. It can be easily operated and can effectively standardize the test results of indirect ELISA.
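    The abstract does not disclose the internals of I-STOD, but optical-density standardization for indirect ELISA generally rests on a fitted standard curve. A generic sketch, using a four-parameter logistic fit with entirely hypothetical calibrator data (not the I-STOD algorithm itself):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, a, b, c, d):
        """Four-parameter logistic: a = response at zero, d = upper asymptote,
        c = inflection concentration, b = slope factor."""
        return d + (a - d) / (1.0 + (x / c) ** b)

    # Hypothetical calibrator concentrations and measured optical densities
    conc = np.array([0.5, 1, 2, 4, 8, 16, 32])
    od = np.array([0.08, 0.15, 0.30, 0.55, 0.95, 1.40, 1.75])

    popt, _ = curve_fit(four_pl, conc, od, p0=[0.05, 1.0, 5.0, 2.0], maxfev=10000)

    def od_to_conc(y, a, b, c, d):
        """Invert the fitted curve to read a sample concentration from its OD."""
        return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

    sample_conc = od_to_conc(0.70, *popt)   # between the 4 and 8 calibrators
    ```

    Inverting a fitted curve like this is only valid for OD values strictly between the two asymptotes, which is one reason the "applicable scope of samples" mentioned above matters.
    
    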

  14. Relating N2O emissions during biological nitrogen removal with operating conditions using multivariate statistical techniques.

    Science.gov (United States)

    Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E

    2018-04-26

    Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using Binary Segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed in sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding to NO3-N concentrations of less than 1 mg/L in the upstream plug-flow reactor (middle of the oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights into the effect of operating conditions on N2O emissions in each sub-period, and can be integrated into N2O emissions data processing at wastewater treatment plants.
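    Spearman's rank correlation, used above for the dependency analysis, captures monotone rather than strictly linear association, which suits sensor variables related through nonlinear process kinetics. A small synthetic example (hypothetical data, not the plant's measurements):

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    # Hypothetical hourly series: nitrite concentration (mg/L) and N2O flux,
    # related through a monotone but nonlinear dependence plus noise.
    no2 = rng.uniform(0.1, 2.0, 500)
    n2o = 0.4 * no2 ** 2 + rng.normal(0, 0.05, 500)

    rho, p = spearmanr(no2, n2o)   # rank correlation near 1 despite nonlinearity
    ```

    Pearson correlation would understate this quadratic dependence; the rank statistic does not.
    
    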

  15. On a decomposition theorem for density operators of a pure quantum state

    International Nuclear Information System (INIS)

    Giannoni, M.J.

    1979-03-01

    Conditions are investigated for the existence of a decomposition of a hermitian projector ρ into two hermitian, time-reversal-invariant operators ρ0 and χ in the form ρ = e^(iχ) ρ0 e^(-iχ). Sufficient conditions are given, and an explicit construction of a decomposition is performed when they are fulfilled. A stronger theorem of existence and uniqueness is studied. All the proofs are valid for any p-body reduced density operator of a pure state of a system of bosons as well as fermions. The decomposition studied in this work has already been used in nuclear physics, and may be of interest in other fields of physics.

  16. Statistics of the hubble diagram. II. The form of the luminosity function and density variations with application to quasars

    International Nuclear Information System (INIS)

    Turner, E.L.

    1979-01-01

    New techniques for deriving a luminosity function (LF) and a spatial density distribution ρ(r) from magnitude-redshift data are presented. These techniques do not require iterative improvement of an initially guessed solution or the adoption of arbitrary analytic forms; instead, they provide explicit numerical estimates of the LF and ρ(r). Thus, sources of systematic uncertainty are eliminated, at the cost of an increase in statistical noise. As in Paper I of this series, it is necessary to assume that the LF does not vary in functional form. An internal test of this assumption is described. These techniques are illustrated by application to a sample of 3CR and 4C quasars. The radio luminosity function is found to be a steep power law with no features. The optical luminosity function is found to be a shallow power law cut off roughly exponentially above a characteristic luminosity L*_opt(Z) corresponding roughly to M_B = -22 - 6 log(1+Z). The comoving density evolution is not well fitted by any simple function of 1+Z [e.g., (1+Z)^6 errs by factors as large as ~5 at some redshifts] but is well represented by an exponential of look-back time. Specific analytic fits and numerical tabulations are given for each of these functions. The constant-LF-form assumption is found to be a reasonable first approximation for the quasars. Other possible applications of the new methods to problems in extragalactic and stellar astronomy are suggested.

  17. Durability of Low Platinum Fuel Cells Operating at High Power Density

    Energy Technology Data Exchange (ETDEWEB)

    Polevaya, Olga [Nuvera Fuel Cells Inc.; Blanchet, Scott [Nuvera Fuel Cells Inc.; Ahluwalia, Rajesh [Argonne National Lab; Borup, Rod [Los-Alamos National Lab; Mukundan, Rangachary [Los-Alamos National Lab

    2014-03-19

    Understanding and improving the durability of cost-competitive fuel cell stacks is imperative to successful deployment of the technology. Stacks will need to operate well beyond today's state-of-the-art rated power density with very low platinum loading in order to achieve the cost targets set forth by DOE ($15/kW) and ultimately be competitive with incumbent technologies. An accelerated cost-reduction path presented by Nuvera focused on substantially increasing power density to address non-PGM material costs as well as platinum. The study developed a practical understanding of the degradation mechanisms impacting the durability of fuel cells with low platinum loading (≤0.2 mg/cm2) operating at high power density (≥1.0 W/cm2), and worked out approaches for improving the durability of low-loaded, high-power stack designs. Of specific interest is the impact of combining low platinum loading with high power density operation, as this offers the best chance of achieving long-term cost targets. A design-of-experiments approach was utilized to reveal and quantify the sensitivity of durability-critical material properties to high current density at two levels of platinum loading (the more conventional 0.45 mgPt/cm2 and the much lower 0.2 mgPt/cm2) across several cell architectures. We studied the relevance of selected component accelerated stress tests (ASTs) to fuel cell operation in power-producing mode. New stress tests (NSTs) were designed to investigate the sensitivity to the addition of electrical current to the ASTs, along with combined humidity and load cycles, and, eventually, to relate to the combined city/highway drive cycle. Changes in the cathode electrochemical surface area (ECSA) and the average oxygen partial pressure in the catalyst layer with aging under AST and NST protocols were compared based on the number of completed cycles. Studies showed elevated sensitivity of Pt growth to the potential limits and the initial particle size distribution. The ECSA loss...

  18. Extension of electron cyclotron heating at ASDEX Upgrade with respect to high density operation

    Directory of Open Access Journals (Sweden)

    Schubert Martin

    2017-01-01

    Full Text Available The ASDEX Upgrade electron cyclotron resonance heating system operates at 105 GHz and 140 GHz with flexible launching geometry and polarization. In 2016, four gyrotrons with 10 s pulse length and output power close to 1 MW per unit were available. The system is presently being extended to eight similar units in total. High heating power and high plasma density operation will be part of the future ASDEX Upgrade experiment program. For the electron cyclotron resonance heating, an O-2 mode scheme is proposed, which is compatible with the expected high plasma densities. It may, however, suffer from incomplete single-pass absorption. The situation can be improved significantly by installing holographic mirrors on the inner column, which allow for a second pass of the unabsorbed fraction of the millimetre-wave beam. Since the beam path in the plasma is subject to refraction, the beam position on the holographic mirror has to be controlled. Thermocouples built into the mirror surface are used for this purpose. As a protective measure, the tiles of the heat shield on the inner column were modified in order to increase the shielding against unabsorbed millimetre-wave power.

  19. A new electron density model of the plasmasphere for operational applications and services

    Science.gov (United States)

    Jakowski, Norbert; Hoque, Mohammed Mainul

    2018-03-01

    The Earth's plasmasphere contributes essentially to total electron content (TEC) measurements from ground or satellite platforms. Furthermore, as an integral part of space weather, associated plasmaspheric phenomena must be addressed in conjunction with ionosphere weather monitoring by operational space weather services. To support space weather services and the mitigation of propagation errors in Global Navigation Satellite System (GNSS) applications, we have developed the empirical Neustrelitz plasmasphere model (NPSM). The model consists of an upper, L-shell-dependent part and a lower, altitude-dependent part, both described by specific exponential decays. Here the McIlwain parameter L defines the geomagnetic field lines in a centered-dipole model of the geomagnetic field. The coefficients of the developed approaches are successfully fitted to numerous electron density data derived from dual-frequency GPS measurements on board the CHAMP satellite mission from 2000 to 2005. The data are utilized for fitting up to the L shell L = 3, because a previous validation has shown good agreement with IMAGE/RPI measurements up to this value. Using the solar radio flux index F10.7 as the only external parameter, the operation of the model is robust and, with 40 coefficients, fast and sufficiently accurate to be used as a background model for estimating TEC or electron density profiles in near-real-time GNSS applications and services. In addition, the model approach is sensitive to ionospheric coupling resulting in anomalies such as the Nighttime Winter Anomaly and the related Mid-Summer Nighttime Anomaly, and even shows a slight compression of the dayside plasmasphere due to solar wind pressure. Modelled electron density and TEC values agree with estimates reported in the literature in similar cases.

  20. Weighted A-statistical convergence for sequences of positive linear operators.

    Science.gov (United States)

    Mohiuddine, S A; Alotaibi, Abdullah; Hazarika, Bipan

    2014-01-01

    We introduce the notion of weighted A-statistical convergence of a sequence, where A represents a nonnegative regular matrix. We also prove the Korovkin approximation theorem by using the notion of weighted A-statistical convergence. Further, we give a rate of weighted A-statistical convergence and apply the classical Bernstein polynomial to construct an illustrative example in support of our result.
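    The classical Bernstein operators mentioned above are easy to state concretely. A minimal sketch of B_n(f), illustrating only the plain uniform convergence on [0, 1] (the weighted A-statistical machinery of the paper is not reproduced here):

    ```python
    import math

    def bernstein(f, n, x):
        """Evaluate the n-th Bernstein polynomial of f at x in [0, 1]:
        B_n(f)(x) = sum_{k=0}^{n} f(k/n) * C(n, k) * x^k * (1-x)^(n-k)."""
        return sum(
            f(k / n) * math.comb(n, k) * x ** k * (1 - x) ** (n - k)
            for k in range(n + 1)
        )

    # B_n reproduces affine functions exactly, and B_n(t^2) = x^2 + x(1-x)/n,
    # so the Korovkin test functions 1, t, t^2 converge uniformly to themselves.
    lin = bernstein(lambda t: t, 50, 0.37)          # exactly 0.37
    quad = bernstein(lambda t: t * t, 200, 0.3)     # 0.09 + 0.3*0.7/200
    ```

    Korovkin's theorem upgrades convergence on these three test functions to convergence for every continuous f, which is exactly the statement the paper transfers to the weighted A-statistical setting.
    
    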

  1. Statistical analysis of modal parameters of a suspension bridge based on Bayesian spectral density approach and SHM data

    Science.gov (United States)

    Li, Zhijun; Feng, Maria Q.; Luo, Longxi; Feng, Dongming; Xu, Xiuli

    2018-01-01

    Uncertainty in modal parameter estimation appears to a significant extent in structural health monitoring (SHM) practice in civil engineering, due to environmental influences and modeling errors. Reasonable methodologies are needed for processing this uncertainty. Bayesian inference can provide a promising and feasible identification solution for the purposes of SHM. However, there has been relatively little research on the application of the Bayesian spectral method to modal identification using SHM data sets. To extract modal parameters from the large data sets collected by an SHM system, the Bayesian spectral density algorithm was applied to address the uncertainty of mode extraction from the output-only response of a long-span suspension bridge. The posterior most probable values of modal parameters and their uncertainties were estimated through Bayesian inference. A long-term variation and statistical analysis was performed using the sensor data sets collected from the SHM system of the suspension bridge over a one-year period. The t location-scale distribution was shown to be a better candidate function for the frequencies of lower modes. On the other hand, the Burr distribution provided the best fit for the higher modes, which are sensitive to temperature. In addition, wind-induced variation of modal parameters was also investigated. It was observed that both the damping ratios and modal forces increased during periods of typhoon excitation. Meanwhile, the modal damping ratios exhibit significant correlation with the spectral intensities of the corresponding modal forces.
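    The t location-scale family referred to above is simply Student's t shifted by a location parameter and stretched by a scale parameter; in scipy it is `stats.t` with `loc` and `scale`. A sketch of fitting it to hypothetical daily estimates of a natural frequency (fabricated data, not the bridge's):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Hypothetical daily estimates of a lower-mode natural frequency (Hz),
    # heavier-tailed than a normal distribution would allow.
    freqs = 0.32 + 0.004 * rng.standard_t(df=4, size=2000)

    # Maximum-likelihood fit of the t location-scale family
    df, loc, scale = stats.t.fit(freqs)
    ```

    The fitted `df` (tail heaviness) is what distinguishes this family from a Gaussian fit; small `df` signals the occasional outlying frequency estimates that motivate its use for lower modes.
    
    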

  2. Image quality improvements using adaptive statistical iterative reconstruction for evaluating chronic myocardial infarction using iodine density images with spectral CT.

    Science.gov (United States)

    Kishimoto, Junichi; Ohta, Yasutoshi; Kitao, Shinichiro; Watanabe, Tomomi; Ogawa, Toshihide

    2018-04-01

    Single-source dual-energy CT (ssDECT) allows the reconstruction of iodine density images (IDIs) by projection-based computation. We hypothesized that adding adaptive statistical iterative reconstruction (ASiR) could improve image quality. The aim of our study was to evaluate the effect, and determine the optimal blend percentage, of ASiR for IDIs of myocardial late iodine enhancement (LIE) in the evaluation of chronic myocardial infarction using ssDECT. A total of 28 patients underwent cardiac LIE using an ssDECT scanner. IDIs with ASiR contributions between 0 and 100% in 10% increments were reconstructed. The signal-to-noise ratio (SNR) of remote myocardium and the contrast-to-noise ratio (CNR) of infarcted myocardium were measured. Transmural extent of infarction was graded using a 5-point scale. The SNR, CNR, and transmural extent were assessed for each ASiR contribution ratio, with the transmural extents compared against MRI as a reference standard. Compared to 0% ASiR, the use of 20-100% ASiR resulted in a significant reduction of image noise. Reconstruction with 100% ASiR showed the highest improvement in SNR (229%), and ASiR above 80% showed the highest rate (73.7%) of accurate transmural extent classification. In conclusion, an ASiR intensity of 80-100% in IDIs can improve image quality without changes in signal and maximizes the accuracy of transmural extent classification in infarcted myocardium.

  3. Impact of a high density GPS network on the operational forecast

    Directory of Open Access Journals (Sweden)

    C. Faccani

    2005-01-01

    Full Text Available Global Positioning System Zenith Total Delay (GPS ZTD) can provide information about the water vapour in the atmosphere. Its assimilation into the analysis used to initialize a model can then improve the weather forecast, giving the right amount of moisture and reducing the model spin-up. In the last year, a high-density GPS network has been created in the Basilicata region (southern Italy) by the Italian Space Agency in the framework of a national project named MAGIC2, the Italian follow-on of the EC project MAGIC. Daily operational data assimilation experiments have been performed since December 2003. The results show that the assimilation of GPS ZTD improves the forecast, especially during the transition from winter to spring, even though a relatively modest model resolution (9 km) is used.

  4. Weighted A-Statistical Convergence for Sequences of Positive Linear Operators

    Directory of Open Access Journals (Sweden)

    S. A. Mohiuddine

    2014-01-01

    Full Text Available We introduce the notion of weighted A-statistical convergence of a sequence, where A represents a nonnegative regular matrix. We also prove the Korovkin approximation theorem by using the notion of weighted A-statistical convergence. Further, we give a rate of weighted A-statistical convergence and apply the classical Bernstein polynomials to construct an illustrative example in support of our result.
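
The classical Bernstein polynomials used in the abstract's illustrative example are easy to state concretely. A minimal pure-Python sketch, not tied to the paper's weighted A-statistical setting; it only illustrates the plain Korovkin-type convergence that Bernstein operators provide for continuous functions on [0, 1]:

```python
import math

def bernstein(f, n, x):
    """Evaluate the degree-n Bernstein polynomial of f at x in [0, 1]:
    B_n(f)(x) = sum_k f(k/n) * C(n, k) * x^k * (1-x)^(n-k)."""
    return sum(
        f(k / n) * math.comb(n, k) * x**k * (1 - x)**(n - k)
        for k in range(n + 1)
    )

# Korovkin's theorem guarantees uniform convergence for continuous f.
# For f(t) = t^2 the exact value is x^2 + x(1-x)/n, so at x = 0.5, n = 200
# the approximation is 0.25 + 0.25/200 = 0.25125.
approx = bernstein(lambda t: t * t, 200, 0.5)
```

The positivity and linearity of these operators are exactly the properties the Korovkin theorem exploits, so checking convergence on the test functions 1, t, t² suffices.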

  5. Global statistics of liquid water content and effective number density of water clouds over ocean derived from combined CALIPSO and MODIS measurements

    Science.gov (United States)

    Hu, Y.; Vaughan, M.; McClain, C.; Behrenfeld, M.; Maring, H.; Anderson, D.; Sun-Mack, S.; Flittner, D.; Huang, J.; Wielicki, B.; Minnis, P.; Weimer, C.; Trepte, C.; Kuehn, R.

    2007-03-01

    This study presents an empirical relation that links layer integrated depolarization ratios, the extinction coefficients, and effective radii of water clouds, based on Monte Carlo simulations of CALIPSO lidar observations. Combined with cloud effective radius retrieved from MODIS, cloud liquid water content and effective number density of water clouds are estimated from CALIPSO lidar depolarization measurements in this study. Global statistics of the cloud liquid water content and effective number density are presented.

  6. Chapter 7: High-Density H-Mode Operation in ASDEX Upgrade

    International Nuclear Information System (INIS)

    Stober, Joerg Karl; Lang, Peter Thomas; Mertens, Vitus

    2003-01-01

    Recent results are reported on the maximum achievable H-mode density and on the behavior of the pedestal density and central density peaking as this limit is approached. The maximum achievable H-mode density roughly scales as the Greenwald density, though a dependence on the toroidal field Bt is clearly observed. In contrast to the stiff temperature profiles, the density profiles seem to allow more shape variation; in particular, with high-field-side pellet injection, strongly peaked profiles with good confinement have been achieved. Spontaneous density peaking at high densities is also observed in ASDEX Upgrade, which is related to the generally observed large time constants for density profile equilibration. The equilibrated density profile shapes depend strongly on the heat-flux profile, in the sense that central heating leads to significantly flatter profiles

  7. Adjustments of the TaD electron density reconstruction model with GNSS-TEC parameters for operational application purposes

    Directory of Open Access Journals (Sweden)

    Belehaki Anna

    2012-12-01

    Full Text Available Validation results on the latest version of the TaD model (TaDv2) show realistic reconstruction of the electron density profiles (EDPs), with an average error of 3 TECU, similar to the error obtained from GNSS-TEC calculated parameters. The work presented here aims to further improve the accuracy of the TaD topside reconstruction by adjusting the TEC parameter calculated from the TaD model with the TEC parameter calculated from RINEX files provided by GNSS receivers co-located with the Digisondes. The performance of the new version is tested during a storm period, demonstrating further improvements with respect to the previous version. Statistical comparison of modeled and observed TEC confirms the validity of the proposed adjustment. A significant benefit of the proposed upgrade is that it facilitates the real-time implementation of TaD. The model needs a reliable measure of the scale height at the peak height, which is supposed to be provided by the Digisondes. Often, however, the automatic scaling software fails to correctly calculate the scale height at the peak, Hm, due to interference in the received signal; consequently, the model-estimated topside scale height is wrongly calculated, leading to unrealistic results for the modeled EDP. The proposed TEC adjustment forces the model to correctly reproduce the topside scale height despite inaccurate values of Hm. This adjustment is very important for the application of TaD in an operational environment.

  8. A statistical model for estimation of fish density including correlation in size, space, time and between species from research survey data

    DEFF Research Database (Denmark)

    Nielsen, J. Rasmus; Kristensen, Kasper; Lewy, Peter

    2014-01-01

    Trawl survey data with high spatial and seasonal coverage were analysed using a variant of the Log Gaussian Cox Process (LGCP) statistical model to estimate unbiased relative fish densities. The model estimates correlations between observations according to time, space, and fish size and includes...

  9. Educating America's Workforce: Summary of Key Operating Statistics. Data collected from the 2007 and 2008 Annual Institutional Reports

    Science.gov (United States)

    Accrediting Council for Independent Colleges and Schools, 2009

    2009-01-01

    This special edition of the Key Operating Statistics (KOS) contains information based on the 2007 and 2008 Annual Institutional Reports (AIR) submitted by ACICS-accredited institutions. The AIR is submitted on September 15 each year by ACICS-accredited institutions. It reflects activity during a reporting year that begins on July 1 and concludes…

  10. Statistical summary of commercial jet aircraft accidents : worldwide operations, 1959-2009

    Science.gov (United States)

    2010-07-01

    The accident statistics presented in this summary are confined to worldwide commercial jet airplanes that are heavier than 60,000 pounds maximum gross weight. Within that set of airplanes, two groups are excluded: 1) Airplanes manufactured in...

  11. A statistical study of high coronal densities from X-ray line-ratios of Mg XI

    Science.gov (United States)

    Linford, G. A.; Lemen, J. R.; Strong, K. T.

    1991-01-01

    An X-ray line-ratio density diagnostic was applied to 50 Mg XI spectra of flaring active regions on the Sun recorded by the Flat Crystal Spectrometer on SMM. The plasma density is derived from R, the flux ratio of the forbidden to intercombination lines of the He-like ion Mg XI. The R ratio for Mg XI is only density sensitive when the electron density exceeds a critical value (about 10 to the 12th/cu cm), the low-density limit (LDL). This theoretical value of the low-density limit is uncertain, as it depends on complex atomic theory, and reported coronal densities above 10 to the 12th/cu cm are uncommon. In this study, the distribution of R ratio values about the LDL is estimated, empirical values are derived for the 1st and 2nd moments of this distribution from the 50 Mg XI spectra, and from these derived parameters the percentage of observations indicating densities above the limit is obtained.
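
The moment analysis described in the abstract reduces to simple sample statistics over the measured R ratios. A hedged pure-Python sketch: the R values and the LDL threshold below are illustrative placeholders, not the SMM/FCS measurements; for He-like ions the R ratio decreases as the density rises above the LDL, so small R flags high density:

```python
def r_ratio_summary(r_values, r_ldl):
    """Sample mean (1st moment), variance (2nd central moment), and the
    fraction of Mg XI R ratios falling below an assumed low-density-limit
    value r_ldl, i.e. the fraction indicating densities above the LDL."""
    n = len(r_values)
    mean = sum(r_values) / n
    var = sum((r - mean) ** 2 for r in r_values) / n
    frac_high_density = sum(1 for r in r_values if r < r_ldl) / n
    return mean, var, frac_high_density

# illustrative numbers only -- not the 50 observed spectra
mean, var, frac = r_ratio_summary([2.6, 2.8, 2.5, 2.9, 2.4, 2.7], 2.55)
```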

  12. Statistical analysis of first period of operation of FTU Tokamak; Analisi statistica del primo periodo di operazioni del Tokamak FTU

    Energy Technology Data Exchange (ETDEWEB)

    Crisanti, F; Apruzzese, G; Frigione, D; Kroegler, H; Lovisetto, L; Mazzitelli, G; Podda, S [ENEA, Centro Ricerche Frascati, Rome (Italy). Dip. Energia

    1996-09-01

    On the FTU Tokamak, plasma physics operations started on 20/4/90. The first plasma had a plasma current Ip=0.75 MA for about a second. The experimental phase lasted until 7/7/94, when a long shutdown began for installing the toroidal limiter on the inner side of the vacuum vessel. In these four years of operations, plasma experiments were successfully carried out, e.g. experiments with single and multiple pellet injection; full current drive up to Ip=300 kA was obtained by using waves at the Lower Hybrid frequency; and ohmic plasma parameters were analysed with different materials (from low-Z silicon to high-Z tungsten) as the plasma-facing element. In this work a statistical analysis of the full period of operation is presented, and a comparison with statistical data from other Tokamaks is attempted.

  13. Communication: satisfying fermionic statistics in the modeling of open time-dependent quantum systems with one-electron reduced density matrices.

    Science.gov (United States)

    Head-Marsden, Kade; Mazziotti, David A

    2015-02-07

    For an open, time-dependent quantum system, Lindblad derived the most general modification of the quantum Liouville equation in the Markovian approximation that models environmental effects while preserving the non-negativity of the system's density matrix. While Lindblad's modification is correct for N-electron density matrices, solution of the Liouville equation with a Lindblad operator causes the one-electron reduced density matrix (1-RDM) to violate the Pauli exclusion principle. Consequently, after a short time, the 1-RDM is not representable by an ensemble N-electron density matrix (not ensemble N-representable). In this communication, we derive the necessary and sufficient constraints on the Lindbladian matrix within the Lindblad operator to ensure that the 1-RDM remains N-representable for all time. The theory is illustrated by considering the relaxation of an excitation in several molecules F2, N2, CO, and BeH2 subject to environmental noise.

  14. Operation of ADITYA Thomson scattering system: measurement of temperature and density

    International Nuclear Information System (INIS)

    Thomas, Jinto; Pillai, Vishal; Singh, Neha; Patel, Kiran; Lingeshwari, G.; Hingrajiya, Zalak; Kumar, Ajai

    2015-01-01

    The ADITYA Thomson scattering (TS) system is a single-point measurement system operated with a 10 J ruby laser and a 1 m grating spectrometer. Multi-slit optical fibers are arranged at the image plane of the spectrometer so that each fiber slit collects a 2 nm band of the scattered spectrum. Each slit of the fiber bundle is coupled to a high-gain photomultiplier tube (PMT). A standard white-light source is used to calibrate the optical fiber transmission, and the laser light itself is used to calibrate the relative gain of the PMTs. Rayleigh scattering has been performed for the absolute calibration of the TS system. The temperature of the ADITYA plasma has been calculated using the conventional method of estimation, i.e. from the slope of the logarithmic intensity versus the square of the wavelength shift Δλ. It has been observed that the core temperature of the ADITYA Tokamak plasma is in the range of 300 to 600 eV for different plasma shots, with a density of 2-3 × 10^13 /cc. The time evolution of the plasma discharge has been studied by firing the laser at different times of the discharge, assuming the shots are identical. In some of the discharges, the velocity distribution appears to be non-Maxwellian. (author)
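
The conventional slope method mentioned above fits ln(intensity) against Δλ² and takes the electron temperature from the inverse of the slope. A hedged pure-Python sketch; the calibration constant, channel wavelengths, and units below are illustrative placeholders, not the ADITYA calibration:

```python
import math

def electron_temperature(dlam, intensities, c_cal):
    """Least-squares fit of ln(I) against (dlam)^2; for a Maxwellian
    spectrum ln(I) is linear in (dlam)^2 with slope -c_cal/Te, so
    Te = -c_cal / slope.  c_cal is a hypothetical calibration constant
    (scattering geometry, laser wavelength)."""
    x = [d * d for d in dlam]
    y = [math.log(i) for i in intensities]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    return -c_cal / slope

# synthetic channel intensities for Te = 400 (illustrative units, c_cal = 1)
dlam = [2.0, 4.0, 6.0, 8.0]
inten = [math.exp(-d * d / 400.0) for d in dlam]
te = electron_temperature(dlam, inten, 1.0)
```

Because the synthetic data are exactly Maxwellian, the fit recovers the input temperature; on real PMT channels, departures from linearity in this plot are what flag a non-Maxwellian distribution.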

  15. Teaching Statistics from the Operating Table: Minimally Invasive and Maximally Educational

    Science.gov (United States)

    Nowacki, Amy S.

    2015-01-01

    Statistics courses that focus on data analysis in isolation, discounting the scientific inquiry process, may not motivate students to learn the subject. By involving students in other steps of the inquiry process, such as generating hypotheses and data, students may become more interested and vested in the analysis step. Additionally, such an…

  16. Exploration of one-dimensional plasma current density profile for K-DEMO steady-state operation

    Energy Technology Data Exchange (ETDEWEB)

    Kang, J.S. [Seoul National University, Seoul 151-742 (Korea, Republic of); Jung, L. [National Fusion Research Institute, Daejeon (Korea, Republic of); Byun, C.-S.; Na, D.H.; Na, Y.-S. [Seoul National University, Seoul 151-742 (Korea, Republic of); Hwang, Y.S., E-mail: yhwang@snu.ac.kr [Seoul National University, Seoul 151-742 (Korea, Republic of)

    2016-11-01

    Highlights: • One-dimensional current density and its optimization for the K-DEMO are explored. • The plasma current density profile is calculated with an integrated simulation code. • The impact of self- and external-heating profiles is considered self-consistently. • A current density profile is identified as a reference by minimizing the heating power. - Abstract: A concept study for the Korean demonstration fusion reactor (K-DEMO) is in progress, and basic design parameters have been proposed, targeting high-magnetic-field operation with an ITER-sized machine. High-field operation is a favorable approach for enlarging relative plasma performance without increasing the normalized beta or the plasma current. The exploration of the one-dimensional current density profile and its optimization process for K-DEMO steady-state operation are reported in this paper. Numerical analysis is conducted with an integrated plasma simulation code package incorporating a transport code with equilibrium and current drive modules. Operation regimes are addressed with a zero-dimensional system analysis. The one-dimensional plasma current density profile is calculated based on equilibrium, bootstrap current, and thermal transport analyses. The impact of the self- and external-heating profiles on these parameters is considered self-consistently, with thermal power balance and 100% non-inductive current drive as the main constraints during the whole exploration procedure. Current and pressure profiles are identified as a reference steady-state profile by minimizing the external heating power at the desired fusion power.

  17. Diffusion-Based Density-Equalizing Maps: an Interdisciplinary Approach to Visualizing Homicide Rates and Other Georeferenced Statistical Data

    Science.gov (United States)

    Mazzitello, Karina I.; Candia, Julián

    2012-12-01

    In every country, public and private agencies allocate extensive funding to collect large-scale statistical data, which in turn are studied and analyzed in order to determine local, regional, national, and international policies regarding all aspects relevant to the welfare of society. One important aspect of that process is the visualization of statistical data with embedded geographical information, which most often relies on archaic methods such as maps colored according to graded scales. In this work, we apply nonstandard visualization techniques based on physical principles. We illustrate the method with recent statistics on homicide rates in Brazil and their correlation to other publicly available data. This physics-based approach provides a novel tool that can be used by interdisciplinary teams investigating statistics and model projections in a variety of fields such as economics and gross domestic product research, public health and epidemiology, sociodemographics, political science, business and marketing, and many others.

  18. A performance analysis for MHD power cycles operating at maximum power density

    International Nuclear Information System (INIS)

    Sahin, Bahri; Kodal, Ali; Yavuz, Hasbi

    1996-01-01

    An analysis of the thermal efficiency of a magnetohydrodynamic (MHD) power cycle at maximum power density has been carried out for a constant-velocity-type MHD generator. The irreversibilities in the compressor and the MHD generator are taken into account. The results obtained from the power density analysis were compared with those of the maximum power analysis. It is shown that, by using the power density criterion, the MHD cycle efficiency can be increased effectively. (author)

  19. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    Full Text Available The article is devoted to the statistical analysis of computer-based testing results for the evaluation of students' educational achievements. The issues are relevant because computer-based testing in Russian universities has become an important method for evaluating educational achievements and the quality of the modern educational process. The use of modern methods and programs for the statistical analysis of computer-based testing results and for assessing the quality of developed tests is a pressing problem for every university teacher. The article shows how the authors solve this problem using their own program, “StatInfo”. For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database and generating queries, reports, lists, and matrices of answers for statistical analysis of test item quality. The methodology, experience, and some results of its use by university teachers are described in the article. Related topics, such as test development, models, algorithms, technologies, and software for large-scale computer-based testing, have been discussed by the authors in previous publications listed in the reference list.

  20. Energy-density field approach for low- and medium-frequency vibroacoustic analysis of complex structures using a statistical computational model

    Science.gov (United States)

    Kassem, M.; Soize, C.; Gagliardini, L.

    2009-06-01

    In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.

  1. A multivariate statistical methodology for detection of degradation and failure trends using nuclear power plant operational data

    International Nuclear Information System (INIS)

    Samanta, P.K.; Teichmann, T.

    1990-01-01

    In this paper, a multivariate statistical method is presented and demonstrated as a means for analyzing nuclear power plant transients (or events) and safety system performance for detection of malfunctions and degradations within the course of the event based on operational data. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients (due to lack of easily accessible operational data). Such an approach, once fully developed, can be used to detect failure trends and patterns and so can lead to prevention of conditions with serious safety implications

  2. Wigner Function:from Ensemble Average of Density Operator to Its One Matrix Element in Entangled Pure States

    Institute of Scientific and Technical Information of China (English)

    FAN Hong-Yi

    2002-01-01

    We show that the Wigner function W = Tr(Δρ) (an ensemble average of the density operator ρ, where Δ is the Wigner operator) can be expressed as a matrix element of ρ in entangled pure states. In doing so, converting from quantum master equations to time-evolution equations for the Wigner functions becomes direct and concise. The entangled states are defined in an enlarged Fock space with a fictitious degree of freedom.

  3. Statistical analysis of operating efficiency and failures of a medical linear accelerator for ten years

    International Nuclear Information System (INIS)

    Ju, Sang Gyu; Huh, Seung Jae; Han, Young Yih

    2005-01-01

    To improve the management of a medical linear accelerator, the records of operational failures of a Varian CL2100C over a ten-year period were retrospectively analyzed. The failures were classified according to the functional subunits involved, with each class rated into one of three levels depending on the operational conditions. The relationships between the failure rate and the working ratio, and between the failure rate and the outside temperature, were investigated. In addition, the average lifetimes of the main parts and the operating efficiency over the last 4 years were analyzed. Among the recorded failures (587 in total), the most frequent were in parts related to the collimation system, including the monitor chamber, which accounted for 20% of all failures. With regard to operational conditions, 2nd-level failures, which temporarily interrupted treatments, were the most frequent. Third-level failures, which interrupted treatment for more than several hours, were mostly caused by the accelerating subunit. The number of failures increased with the number of treatments and the operating time. The average lifetimes of the klystron and thyratron became shorter as the working ratio increased, and were 42% and 83% of the expected values, respectively. The operating efficiency was maintained at 95% or higher, though this value slightly decreased over the period. There was no significant correlation between the number of failures and the outside temperature. Maintaining detailed records of equipment problems and failures over a long period can provide good knowledge of equipment function as well as the capability to predict future failures. More rigorous equipment maintenance is required for old medical linear accelerators to avoid serious failures in advance and to improve the quality of patient treatment.

  4. Statistical analysis on failure-to-open/close probability of motor-operated valve in sodium system

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1998-08-01

    The objective of this work is to develop basic data for examining the efficiency of preventive maintenance and actuation tests from the standpoint of failure probability. The work consists of a statistical trend analysis of valve failure probability in the failure-to-open/close mode against time since installation and time since the last open/close action, based on field data of operating and failure experience. Both time-dependent and time-independent terms were considered in the failure probability. The linear aging model was modified and applied to the time-dependent part: in this model there are two terms, with failure rates proportional to time since installation and to time since the last open/close demand. Because of their sufficient statistical population, motor-operated valves (MOVs) in sodium systems were selected for analysis from the CORDS database, which contains operating and failure data of components in fast reactors and sodium test facilities. From these data, the functional parameters were statistically estimated to quantify the valve failure probability in the failure-to-open/close mode, with consideration of uncertainty. (J.P.N.)
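
The modified linear aging model described above combines a time-independent term with aging terms proportional to time since installation and to time since the last demand. A minimal sketch; the function name, parameter names, and values are illustrative, and the CORDS-estimated parameters are not reproduced here:

```python
def mov_failure_probability(p0, a, b, t_install, t_demand):
    """Failure-to-open/close probability per demand for a motor-operated
    valve: a time-independent term p0 plus aging terms proportional to
    time since installation (a * t_install) and to time since the last
    open/close demand (b * t_demand). Illustrative parameterization."""
    return p0 + a * t_install + b * t_demand

# e.g. 10 years since installation, 0.5 years since the last demand
p = mov_failure_probability(p0=1e-4, a=2e-6, b=1e-5,
                            t_install=10.0, t_demand=0.5)
```

Fitting a, b, and p0 to field data then separates genuine aging (growth with t_install) from demand-interval effects (growth with t_demand), which is what makes the model useful for judging actuation-test intervals.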

  5. The density of states for almost periodic Schroedinger operators and the frequency module: a counter-example

    International Nuclear Information System (INIS)

    Bellissard, J.

    1981-07-01

    We exhibit an example of a one-dimensional discrete Schroedinger operator with an almost periodic potential for which the steps of the density of states do not belong to the frequency module. This example is suggested by the K-theory

  6. Variable density management in riparian reserves: lessons learned from an operational study in managed forests of western Oregon, USA.

    Science.gov (United States)

    Samuel Chan; Paul Anderson; John Cissel; Larry Lateen; Charley Thompson

    2004-01-01

    A large-scale operational study has been undertaken to investigate variable density management in conjunction with riparian buffers as a means to accelerate development of late-seral habitat, facilitate rare species management, and maintain riparian functions in 40-70 year-old headwater forests in western Oregon, USA. Upland variable retention treatments include...

  7. Global statistics of liquid water content and effective number density of water clouds over ocean derived from combined CALIPSO and MODIS measurements

    OpenAIRE

    Y. Hu; M. Vaughan; C. McClain; M. Behrenfeld; H. Maring; D. Anderson; S. Sun-Mack; D. Flittner; J. Huang; B. Wielicki; P. Minnis; C. Weimer; C. Trepte; R. Kuehn

    2007-01-01

    International audience; This study presents an empirical relation that links layer integrated depolarization ratios, the extinction coefficients, and effective radii of water clouds, based on Monte Carlo simulations of CALIPSO lidar observations. Combined with cloud effective radius retrieved from MODIS, cloud liquid water content and effective number density of water clouds are estimated from CALIPSO lidar depolarization measurements in this study. Global statistics of the cloud liquid water...

  8. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
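
The Kolmogorov-Smirnov statistic and Kuiper's variant discussed above measure discrepancies between the empirical and the specified cumulative distribution functions. A minimal pure-Python sketch of these two classical statistics (not the authors' complementary tests):

```python
import random

def ks_and_kuiper(draws, cdf):
    """One-sample Kolmogorov-Smirnov statistic D = max(D+, D-) and
    Kuiper statistic V = D+ + D- for i.i.d. draws against a specified
    CDF, where D+/D- are the largest deviations of the empirical CDF
    above/below the specified one."""
    xs = sorted(draws)
    n = len(xs)
    d_plus = max((k + 1) / n - cdf(x) for k, x in enumerate(xs))
    d_minus = max(cdf(x) - k / n for k, x in enumerate(xs))
    return max(d_plus, d_minus), d_plus + d_minus

# draws really taken from the specified distribution give small statistics
random.seed(0)
sample = [random.random() for _ in range(2000)]
d, v = ks_and_kuiper(sample, lambda x: x)  # uniform(0, 1) CDF
```

Both statistics depend only on the CDF, which is exactly the smoothing the abstract criticizes: dips of the density inside a region of otherwise high probability barely move the CDF, motivating the density-based tests of the paper.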

  9. Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods

    Science.gov (United States)

    Stolzer, Alan J.; Halford, Carl

    2007-01-01

    In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general, the data mining methods were more effective in predicting fuel consumption: Classification and Regression Tree methods reported correlation coefficients of .91 to .92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about .99. These data mining models show great promise for further examination of large FOQA databases for operational and safety improvements.

  10. Incident analysis, data gathering and use of statistics for operational purposes

    International Nuclear Information System (INIS)

    Girault, B.

    1990-01-01

    The Nuclear and Fossil Generation Division of Electricite de France has developed a database for operational purposes. Operational means that the initial analyses, and the directions adopted at later stages, are essentially oriented towards experience feedback. Consequently, requirements of precision, coherence and efficiency characterize the causal analysis, which is applicable to numerous events, by numerous users, over a long period. This reliance on many analysts applying common methods over a long period of time ensures the quality of the final results of the database. The use of the results is illustrated by a study of safety-related incidents, which resulted in a number of specific remedies that were applied in the French power plants.

  11. Operational experience with nuclear power plants - outage statistics, causes and effects

    International Nuclear Information System (INIS)

    Kutsch, W.

    1980-01-01

    Whether operating experience is good or bad is not simply a question of subjective impression. Availability, reliability, environmental influence, safety and economy have a significance that cannot be fully expressed in figures. The extent to which a result may be called good or bad can be judged by comparing it with the projected expected values, or with the results of other plants at home or abroad. (orig.)

  12. Operational benefits and challenges of the use of fingerprint statistical models: a field study.

    Science.gov (United States)

    Neumann, Cedric; Mateos-Garcia, Ismael; Langenburg, Glenn; Kostroski, Jennifer; Skerrett, James E; Koolen, Martin

    2011-10-10

    Research projects aimed at proposing fingerprint statistical models based on the likelihood ratio framework have shown that low quality finger impressions left on crime scenes may have significant evidential value. These impressions are currently either not recovered, considered to be of no value when first analyzed by fingerprint examiners, or lead to inconclusive results when compared to control prints. There are growing concerns within the fingerprint community that recovering and examining these low quality impressions will result in a significant increase of the workload of fingerprint units and ultimately of the number of backlogged cases. This study was designed to measure the number of impressions currently not recovered or not considered for examination, and to assess the usefulness of these impressions in terms of the number of additional detections that would result from their examination. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  13. Edge operational space for high density/high confinement ELMY H-modes in JET

    International Nuclear Information System (INIS)

    Sartori, R.; Saibene, G.; Loarte, A.

    2002-01-01

    This paper discusses how the proximity to the L-H threshold affects the confinement of ELMy H-modes at high density. The largest reduction in confinement at high density is observed at the transition from the Type I to the Type III ELMy regime. At medium plasma triangularity, δ≅0.3 (where δ is the average triangularity at the separatrix), JET experiments show that by increasing the margin above the L-H threshold power and maintaining the edge temperature above the critical temperature for the transition to Type III ELMs, it is possible to avoid the degradation of the pedestal pressure with density, normally observed at lower power. As a result, the range of achievable densities (both in the core and in the pedestal) is increased. At high power above the L-H threshold power the core density was equal to the Greenwald limit with H97≅0.9. There is evidence that a mixed regime of Type I and Type II ELMs has been obtained at this intermediate triangularity, possibly as a result of this increase in density. At higher triangularity, δ≅0.5, the power required to achieve similar results is lower. (author)

  14. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    Science.gov (United States)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand
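
    The DES technique described above can be illustrated with a minimal, dependency-free sketch: a fixed pool of launch pads serves a queue of vehicles whose processing times are random, and the simulation clock jumps from one event to the next. All names and numbers below are hypothetical, not Constellation or KSC data.

```python
import heapq
import random

def simulate_launch_flow(n_vehicles, mean_process_days, n_pads, seed=0):
    """Minimal discrete event simulation: the system state changes only
    at discrete event times, and task durations are random variables."""
    rng = random.Random(seed)
    clock = 0.0
    free_pads = n_pads
    waiting = n_vehicles
    finished = 0
    events = []  # min-heap of (completion_time, tag)

    while finished < n_vehicles:
        # occupy every free pad with a waiting vehicle
        while free_pads > 0 and waiting > 0:
            free_pads -= 1
            waiting -= 1
            duration = rng.expovariate(1.0 / mean_process_days)
            heapq.heappush(events, (clock + duration, "pad_released"))
        # advance the clock to the next completion event
        clock, _ = heapq.heappop(events)
        free_pads += 1
        finished += 1

    return clock  # campaign makespan in days

makespan = simulate_launch_flow(n_vehicles=10, mean_process_days=30.0, n_pads=2)
```

    Because the generator is seeded, runs are reproducible; in a study one would sweep the resource counts (here, pads) and compare throughput across scenarios, which is the trade-study role DES plays in the abstract.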

  15. Occupational radiation dose statistics from light-water power reactors operating in Western Europe

    International Nuclear Information System (INIS)

    Brookes, I.R.; Eng, T.

    1987-01-01

    Since the early days of nuclear power, collective and individual doses for people engaged in the maintenance and operation of nuclear power plants have been published by regulatory authorities. In 1979 a small working party whose members were drawn from Member States operating light-water reactors (LWRs) in the European Community was convened. The working party decided that only by collection of data under a unified scheme would it ever be possible to properly compare plant performance and for this reason a questionnaire was drawn up which attempted to elicit the maximum of information with the minimum inconvenience to the plant staff. Another decision made by the working party was to broaden the data base from 'European Community LWRs' to 'West European LWRs' to try to take advantage of the considerable experience being built up in Sweden, in Finland and in Switzerland. All the data available to the Commission up to the end of 1984 are presented and commented on. The deductions are not exhaustive but are believed to represent the limits of what could sensibly be done with the data available. Results are presented separately for BWR and PWR but no other subdivision, say by country or maker, is made. Where interpretation can be enhanced by graphical presentation, this is done. In general, doses for each job category are expressed in various ways to reveal and afford comparisons

  16. Impact of connection density on regional cost differences for network operators in the Netherlands

    International Nuclear Information System (INIS)

    2009-04-01

    The Dutch Office of Energy Regulation ('Energiekamer') has an obligation to investigate the extent to which the electricity and gas distribution businesses (DNOs) in the Netherlands face different structural environments that result in regional cost differences which, in turn, could justify tariff differences. On the basis of previous studies, Energiekamer has identified 'water crossings' and 'local taxes' as allowable regional differences. To account for them, Energiekamer has introduced an adjustment to the regulated revenues formula in order to guarantee a level-playing field to the Dutch DNOs. In addition to these factors, it has been claimed that connection density may have an impact on distribution costs and that, therefore, regulated revenues should be adjusted to compensate for regional differences in connection density between DNOs. However, so far, the research in this field has been unable to identify a sufficiently robust relationship between cost and connection density to support this claim. In order to address this issue, Energiekamer has asked Frontier Economics and Consentec to further investigate the relationship between connection density and distribution costs in the Netherlands. Therefore, our analysis has aimed at determining whether, and to what extent, connection density in the Netherlands is a significant driver of the costs of electricity and gas distribution networks. The following three questions are answered: (1) Is connection density a significant cost driver in electricity and gas networks in the Netherlands?; (2) If so, which functional form (e.g. U-shaped) does this relationship have in the Netherlands?; (3) Finally, based on the evidence collected, is the influence of connection density sufficiently well-determined to be considered a regional difference in the Dutch regulatory framework?

  17. A Primer on Receiver Operating Characteristic Analysis and Diagnostic Efficiency Statistics for Pediatric Psychology: We Are Ready to ROC

    Science.gov (United States)

    2014-01-01

    Objective To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Method Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Results Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing raw scores of 30 or higher had a diagnostic likelihood ratio of 7.4. Conclusions This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses. PMID:23965298
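
    The two diagnostic efficiency statistics at the heart of the study, the area under the ROC curve and the diagnostic likelihood ratio at a cutoff, have simple empirical forms. A minimal sketch with made-up scores (not the CBCL data):

```python
def roc_auc(scores_pos, scores_neg):
    """Empirical AUC via the Mann-Whitney formulation: the probability
    that a random positive case scores above a random negative case."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

def positive_dlr(scores_pos, scores_neg, cutoff):
    """Diagnostic likelihood ratio of a positive test at `cutoff`:
    sensitivity / (1 - specificity)."""
    sens = sum(s >= cutoff for s in scores_pos) / len(scores_pos)
    fpr = sum(s >= cutoff for s in scores_neg) / len(scores_neg)
    return sens / fpr if fpr > 0 else float("inf")
```

    A likelihood ratio well above 1 at a given cutoff is what justifies using that cutoff to rule a diagnosis in, which is how the 7.4 figure above would be interpreted clinically.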

  18. Statistical mechanics of Roskilde liquids: configurational adiabats, specific heat contours, and density dependence of the scaling exponent.

    Science.gov (United States)

    Bailey, Nicholas P; Bøhling, Lasse; Veldhorst, Arno A; Schrøder, Thomas B; Dyre, Jeppe C

    2013-11-14

    We derive exact results for the rate of change of thermodynamic quantities, in particular, the configurational specific heat at constant volume, CV, along configurational adiabats (curves of constant excess entropy Sex). Such curves are designated isomorphs for so-called Roskilde liquids, in view of the invariance of various structural and dynamical quantities along them. The slope of the isomorphs in a double logarithmic representation of the density-temperature phase diagram, γ, can be interpreted as one third of an effective inverse power-law potential exponent. We show that in liquids where γ increases (decreases) with density, the contours of CV have smaller (larger) slope than configurational adiabats. We clarify also the connection between γ and the pair potential. A fluctuation formula for the slope of the CV-contours is derived. The theoretical results are supported with data from computer simulations of two systems, the Lennard-Jones fluid, and the Girifalco fluid. The sign of dγ∕dρ is thus a third key parameter in characterizing Roskilde liquids, after γ and the virial-potential energy correlation coefficient R. To go beyond isomorph theory we compare invariance of a dynamical quantity, the self-diffusion coefficient, along adiabats and CV-contours, finding it more invariant along adiabats.
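
    For reference, the density-scaling exponent and correlation coefficient discussed here are, in the standard notation of the isomorph literature (stated from the general theory, not extracted from this paper):

```latex
\gamma \;=\; \left(\frac{\partial \ln T}{\partial \ln \rho}\right)_{S_{\mathrm{ex}}}
\;=\; \frac{\langle \Delta U\,\Delta W\rangle}{\langle (\Delta U)^{2}\rangle},
\qquad
R \;=\; \frac{\langle \Delta U\,\Delta W\rangle}
{\sqrt{\langle (\Delta U)^{2}\rangle\,\langle (\Delta W)^{2}\rangle}}
```

    where U is the potential energy, W the virial, and the fluctuations are evaluated at fixed density and temperature; configurational adiabats are the curves of constant S_ex whose slope in the log-log density-temperature diagram is γ.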

  19. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  20. Value of information to improve daily operations in high-density logistics

    NARCIS (Netherlands)

    Viet, Nguyen Quoc; Behdani, Behzad; Bloemhof, Jacqueline

    2018-01-01

    Agro-food logistics is increasingly challenged to ensure that a wide variety of high-quality products are always available at retail stores. This paper discusses high-density logistics issues caused by more frequent and smaller orders from retailers. Through a case study of the distribution process

  1. Characteristics of PEMFC operating at high current density with low external humidification

    International Nuclear Information System (INIS)

    Fan, Linhao; Zhang, Guobin; Jiao, Kui

    2017-01-01

    Highlights: • PEMFC with low humidity and high current density is studied by numerical simulation. • At high current density, water production lowers external humidification requirement. • A steady anode circulation status without external humidification is demonstrated. • The corresponding detailed internal water transfer path in the PEMFC is illustrated. • Counter-flow is superior to co-flow at low anode external humidification. - Abstract: A three-dimensional multiphase numerical model of a proton exchange membrane fuel cell (PEMFC) is developed to study the fuel cell performance and water transport properties with low external humidification. The results show that sufficient external humidification is necessary to prevent polymer electrolyte dehydration at low current density, while at high current density the water produced in the cathode catalyst layer (CL) is enough to humidify the polymer electrolyte without external humidification, by flowing back and forth between the anode and cathode across the membrane. Furthermore, a steady anode circulation status without external humidification is demonstrated in this study, for which the detailed internal water transfer path is also illustrated. Additionally, it is found that the water balance under the counter-flow arrangement is superior to that under co-flow at low anode external humidification.

  2. Calibration Of A Nucleonic Density Gauge For Molasses Brix Control In Vacuum Pan Operation

    International Nuclear Information System (INIS)

    Griffith, J.M.; Cuesta, J.; Laria, J.; Desdin, L.F.

    1999-01-01

    In order to establish a strict control of the molasses to be fed to the vacuum pan station during industrial evaluations of this facility in the next season, the calibration of a prototype nucleonic density gauge, constructed in close collaboration between CEADEN and ICINAZ, has been performed. Some preliminary results of this complementary task of the project are described

  3. Calibration of a nucleonic density gauge for molasses brix control in vacuum pan operation

    International Nuclear Information System (INIS)

    Griffith, J.M.; Laria, J.; Desdin, L.F; Cuesta, J.

    1999-01-01

    In order to establish a strict control of the molasses to be fed to the vacuum pan station during industrial evaluations of this facility in the next season, the calibration of a prototype nucleonic density gauge, constructed in close collaboration between CEADEN and ICINAZ, has been performed. Some preliminary results of this complementary task of the project are described

  4. An Improved Model for Operational Specification of the Electron Density Structure up to Geosynchronous Heights

    Science.gov (United States)

    2010-07-01

    http://www.iono.noa.gr/ElectronDensity/EDProfile.php The web service has been developed with the following open source tools: a) PHP, for the... MySQL for the database, which was based on the enhancement of the DIAS database. Below we present some screen shots to demonstrate the functionality

  5. Operational limits of high density H-modes in ASDEX Upgrade

    International Nuclear Information System (INIS)

    Mertens, V.; Borrass, K.; Kaufmann, M.; Lang, P.T.; Lang, R.; Mueller, H.W.; Neuhauser, J.; Schneider, R.; Schweinzer, J.; Suttrop, W.

    2001-01-01

    Systematic investigations of H-mode density limit (H→L-mode back transition) plasmas with gas fuelling, and alternatively with additional pellet injection from the magnetic high-field side (HFS), are being performed in the new closed divertor configuration DV-II. The resulting database, covering a wide range of the externally controllable plasma parameters I_p, B_t and P_heat, confirms that the H-mode threshold power exceeds the generally accepted prediction P_heat(L→H) ∝ B̄_t dramatically when one approaches Greenwald densities. Additionally, in contrast to the Greenwald scaling, a moderate B_t-dependence of the H-mode density limit is found. The limit is observed to coincide with divertor detachment and a strong increase of the edge thermal transport, which has, however, no detrimental effect on the global τ_E. The pellet injection scheme from the magnetic high-field side (HFS), developed recently on ASDEX Upgrade, leads to fast particle drifts which are, contrary to the standard injection from the low-field side, directed into the plasma core. This markedly improves the pellet particle fuelling efficiency. The responsible physical mechanism, the diamagnetic particle drift of the pellet ablatant, was successfully verified recently. Other increased particle losses, on respectively different time scales after the ablation process, however, still persist. Generally, a clear gain in achievable density and plasma stored energy is achieved with stationary HFS pellet injection compared to gas-puffing. (author)

  6. A Statistical Test of the Relationship Between Chorus Wave Activation and Anisotropy of Electron Phase Space Density

    Directory of Open Access Journals (Sweden)

    Dong-Hee Lee

    2014-12-01

    Whistler mode chorus wave is considered to play a critical role in accelerating and precipitating the electrons in the outer radiation belt. In this paper we test a conventional scenario of triggering chorus using THEMIS satellite observations of waves and particles. Specifically, we test if the chorus onset is consistent with development of anisotropy in the electron phase space density (PSD). After analyzing electron PSD for 73 chorus events, we find that, for ~80 % of them, their onsets are indeed associated with development of the positive anisotropy in PSD where the pitch angle distribution of electron velocity peaks at 90 degrees. This PSD anisotropy is prominent mainly at the electron energy range of ≤ ~20 keV. Interestingly, we further find that there is sometimes a time delay among energies in the increases of the anisotropy: A development of the positive anisotropy occurs earlier by several minutes for lower energy than for an adjacent higher energy.

  7. Statistical Design of an Adaptive Synthetic X̄ Control Chart with Run Rule on Service and Management Operation

    Directory of Open Access Journals (Sweden)

    Shucheng Yu

    2016-01-01

    An improved synthetic X̄ control chart based on a hybrid adaptive scheme and a run rule scheme is introduced to enhance the statistical performance of the traditional synthetic X̄ control chart for service and management operations. The proposed hybrid adaptive schemes consider both variable sampling interval and variable sample size schemes. The properties of the proposed chart are obtained using a Markov chain approach. An extensive set of numerical results is presented to test the effectiveness of the proposed model in detecting small and moderate shifts in the process mean. The results show that the proposed chart is quicker than the standard synthetic X̄ chart and the CUSUM chart in detecting small and moderate shifts in the process of service and management operation.
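
    The Markov chain approach mentioned above treats a chart signal as absorption and computes the average run length (ARL) from the transient-state transition matrix Q as the first entry of (I − Q)⁻¹·1. A dependency-free sketch; the example transition probabilities are invented for illustration, not taken from the paper:

```python
def arl_from_transient_matrix(Q):
    """Average run length of a control chart modeled as an absorbing
    Markov chain: solve (I - Q) x = 1 and return x[0], the expected
    number of samples before a signal when starting in state 0."""
    n = len(Q)
    # augmented matrix for Gauss-Jordan elimination, RHS = vector of ones
    A = [[(1.0 if i == j else 0.0) - Q[i][j] for j in range(n)] + [1.0]
         for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(n):
            if r != col:
                f = A[r][col] / A[col][col]
                for c in range(col, n + 1):
                    A[r][c] -= f * A[col][c]
    return A[0][n] / A[0][0]

# one in-control state, signal probability 0.01 per sample -> ARL = 1/0.01
arl_simple = arl_from_transient_matrix([[0.99]])

# toy run rule: signal on two consecutive points in the warning zone
# (safe-zone probability 0.95, warning-zone probability 0.05 per sample)
arl_run_rule = arl_from_transient_matrix([[0.95, 0.05], [0.95, 0.0]])
```

    The single-state case reproduces the closed form ARL = 1/p; the two-state case encodes a "two consecutive warnings" run rule of the kind combined with the synthetic chart in the paper.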

  8. The Abdominal Aortic Aneurysm Statistically Corrected Operative Risk Evaluation (AAA SCORE) for predicting mortality after open and endovascular interventions.

    Science.gov (United States)

    Ambler, Graeme K; Gohel, Manjit S; Mitchell, David C; Loftus, Ian M; Boyle, Jonathan R

    2015-01-01

    Accurate adjustment of surgical outcome data for risk is vital in an era of surgeon-level reporting. Current risk prediction models for abdominal aortic aneurysm (AAA) repair are suboptimal. We aimed to develop a reliable risk model for in-hospital mortality after intervention for AAA, using rigorous contemporary statistical techniques to handle missing data. Using data collected during a 15-month period in the United Kingdom National Vascular Database, we applied multiple imputation methodology together with stepwise model selection to generate preoperative and perioperative models of in-hospital mortality after AAA repair, using two thirds of the available data. Model performance was then assessed on the remaining third of the data by receiver operating characteristic curve analysis and compared with existing risk prediction models. Model calibration was assessed by Hosmer-Lemeshow analysis. A total of 8088 AAA repair operations were recorded in the National Vascular Database during the study period, of which 5870 (72.6%) were elective procedures. Both preoperative and perioperative models showed excellent discrimination, with areas under the receiver operating characteristic curve of .89 and .92, respectively. This was significantly better than any of the existing models (area under the receiver operating characteristic curve for best comparator model, .84 and .88; P < .05 for both). The resulting AAA SCORE models predict in-hospital mortality after AAA repair. These models were carefully developed with rigorous statistical methodology and significantly outperform existing methods for both elective cases and overall AAA mortality. These models will be invaluable for both preoperative patient counseling and accurate risk adjustment of published outcome data.

  9. Effects of heat and water transport on the performance of polymer electrolyte membrane fuel cell under high current density operation

    International Nuclear Information System (INIS)

    Tabuchi, Yuichiro; Shiomi, Takeshi; Aoki, Osamu; Kubo, Norio; Shinohara, Kazuhiko

    2010-01-01

    Key challenges to the acceptance of polymer electrolyte membrane fuel cells (PEMFCs) for automobiles are cost reduction and improvement in power density for compactness. To meet these challenges, further improvement in fuel cell performance is required. In particular, under higher current density operation, water and heat transport in PEMFCs have considerable effects on the cell performance. In this study, the impact of heat and water transport on the cell performance under high current density was investigated by experimental evaluation of liquid water distribution and numerical validation. Liquid water distribution in the MEA between rib and channel areas is evaluated by neutron radiography. In order to neglect the effect of liquid water in gas channels and of reactant species concentration distribution in the flow direction, a differential cell was used in this study. Experimental results suggested that liquid water under the channel changed dramatically with rib/channel width. From the numerical study, it is found that the change of liquid water distribution was significantly affected by the temperature distribution in the MEA between rib and channel areas. In addition, not only heat transport but also water transport through the membrane significantly affected the cell performance under high current density operation.

  10. Statistical flaw strength distributions for glass fibres: Correlation between bundle test and AFM-derived flaw size density functions

    International Nuclear Information System (INIS)

    Foray, G.; Descamps-Mandine, A.; R’Mili, M.; Lamon, J.

    2012-01-01

    The present paper investigates glass fibre flaw size distributions. Two commercial fibre grades (HP and HD) mainly used in cement-based composite reinforcement were studied. Glass fibre fractography is a difficult and time consuming exercise, and thus is seldom carried out. An approach based on tensile tests on multifilament bundles and examination of the fibre surface by atomic force microscopy (AFM) was used. Bundles of more than 500 single filaments each were tested. Thus a statistically significant database of failure data was built up for the HP and HD glass fibres. Gaussian flaw distributions were derived from the filament tensile strength data or extracted from the AFM images. The two distributions were compared. Defect sizes computed from raw AFM images agreed reasonably well with those derived from tensile strength data. Finally, the pertinence of a Gaussian distribution was discussed. The alternative Pareto distribution provided a fair approximation when dealing with AFM flaw size.
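
    The bridge between an AFM-measured flaw depth and a filament failure stress is typically the Griffith/Irwin fracture relation, and the bundle-test strengths are summarized by a fitted Gaussian. A sketch with assumed, illustrative values for the glass fracture toughness K_Ic and crack-shape factor Y (not the paper's calibration):

```python
import math

def gaussian_fit(samples):
    """Maximum-likelihood Gaussian parameters (mean, std) for a set of
    filament tensile strengths."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return mean, math.sqrt(var)

def strength_from_flaw_size(a_m, k_ic=0.75e6, y=1.12):
    """Failure stress (Pa) for a surface flaw of depth a_m (metres) from
    sigma = K_Ic / (Y * sqrt(pi * a)); K_Ic and Y are assumed values."""
    return k_ic / (y * math.sqrt(math.pi * a_m))
```

    Quadrupling the flaw depth halves the predicted strength, which is the kind of consistency check used when comparing the AFM-derived and tensile-test-derived distributions.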

  11. Effective pile-up density as a measure of the experimental data quality for High-Luminosity LHC operational scenarios.

    CERN Document Server

    Medina Medrano, Luis Eduardo; Arduini, Gianluigi; Napsuciale, Mauro

    2018-01-01

    The High-Luminosity LHC (HL-LHC) experiments will operate at an unprecedented level of event pile-up from proton-proton collisions at 14 TeV center-of-mass energy. In this paper we study the performance of the baseline and a series of alternative scenarios in terms of the delivered integrated luminosity and its quality (pile-up density). A new figure-of-merit is introduced, the effective pile-up density, a concept that reflects the expected detector efficiency in the reconstruction of event vertices for a given operational scenario, acting as a link between the machine and experimental sides. Alternative scenarios have been proposed either to improve the baseline performance, or to provide operational schemes in the case of particular limitations. Simulations of the evolution of optimum fills with the latest set of parameters of the HL-LHC are performed with β*-levelling, and results are discussed in terms of both the integrated luminosity and the effective pile-up density. The crab kissing scheme, a propose...

  12. Statistical Modeling for the Effect of Rotor Speed, Yarn Twist and Linear Density on Production and Quality Characteristics of Rotor Spun Yarn

    Directory of Open Access Journals (Sweden)

    Farooq Ahmed Arain

    2012-01-01

    The aim of this study was to develop a statistical model for the effect of RS (Rotor Speed), YT (Yarn Twist) and YLD (Yarn Linear Density) on production and quality characteristics of rotor spun yarn. Cotton yarns of 30, 35 and 40 tex were produced on a rotor spinning machine at different rotor speeds (i.e. 70000, 80000, 90000 and 100000 rpm) and with different twist levels (i.e. 450, 500, 550, 600 and 700 tpm). Yarn production (g/hr) and quality characteristics were determined for all the experiments. Based on the results, models were developed using response surface regression in the MINITAB 16 statistical tool. The developed models not only characterize the intricate relationships among the factors but may also be used to predict the yarn production and quality characteristics at any level of the factors within the range of experimental values.
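
    Response surface regression of the kind described fits a second-order polynomial in the factors by ordinary least squares. A dependency-free sketch with two factors and synthetic data (not the rotor-spinning measurements):

```python
def fit_response_surface(X, y):
    """OLS fit of y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
    via the normal equations, solved by Gauss-Jordan elimination."""
    rows = [[1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2] for x1, x2 in X]
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    M = [Ai + [ci] for Ai, ci in zip(A, c)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(k):
            if r != col:
                f = M[r][col] / M[col][col]
                for cc in range(col, k + 1):
                    M[r][cc] -= f * M[col][cc]
    return [M[i][k] / M[i][i] for i in range(k)]

# synthetic check: data generated from known coefficients are recovered
true_b = [1.0, 2.0, 3.0, 0.5, -0.5, 1.0]
X = [(x1, x2) for x1 in (0.0, 1.0, 2.0) for x2 in (0.0, 1.0, 2.0)]
y = [true_b[0] + true_b[1] * x1 + true_b[2] * x2
     + true_b[3] * x1 * x1 + true_b[4] * x2 * x2 + true_b[5] * x1 * x2
     for x1, x2 in X]
fitted_b = fit_response_surface(X, y)
```

    In practice a package such as MINITAB adds significance tests and lack-of-fit diagnostics on top of exactly this least-squares core; with a third factor the design matrix simply gains the corresponding linear, quadratic and interaction columns.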

  13. VALIDATION OF SPRING OPERATED PRESSURE RELIEF VALVE TIME TO FAILURE AND THE IMPORTANCE OF STATISTICALLY SUPPORTED MAINTENANCE INTERVALS

    Energy Technology Data Exchange (ETDEWEB)

    Gross, R; Stephen Harris, S

    2009-02-18

    The Savannah River Site operates a Relief Valve Repair Shop certified by the National Board of Pressure Vessel Inspectors to NB-23, The National Board Inspection Code. Local maintenance forces perform inspection, testing, and repair of approximately 1200 spring-operated relief valves (SORV) each year as the valves are cycled in from the field. The Site now has over 7000 certified test records in the Computerized Maintenance Management System (CMMS); a summary of that data is presented in this paper. In previous papers, several statistical techniques were used to investigate failure on demand and failure rates, including a quantal response method for predicting the failure probability as a function of time in service. The non-conservative failure mode for SORV is commonly termed 'stuck shut', industry-defined as the valve opening at greater than or equal to 1.5 times the cold set pressure. Actual time to failure is typically not known, only that failure occurred some time since the last proof test (censored data). This paper attempts to validate the assumptions underlying the statistical lifetime prediction results using Monte Carlo simulation. It employs an aging model for lift pressure as a function of set pressure, valve manufacturer, and a time-related aging effect. This paper attempts to answer two questions: (1) what is the predicted failure rate over the chosen maintenance/inspection interval; and (2) do we understand aging sufficiently well to estimate risk when basing proof test intervals on proof test results?
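
    The censoring structure described, failures observed only at scheduled proof tests, is straightforward to reproduce by Monte Carlo. A sketch with invented MTTF and interval values (not SRS valve data): lifetimes are drawn from a memoryless exponential model, and the observed per-test failure fraction is compared with the analytic interval failure probability.

```python
import math
import random

def simulate_proof_tests(n_valves, true_mttf, interval, n_cycles, seed=0):
    """Monte Carlo model of interval-censored proof-test data: each
    valve's time to 'stuck shut' is exponential, but failure is only
    observed at the next scheduled proof test.  Returns the observed
    fraction of proof tests that find a failed valve."""
    rng = random.Random(seed)
    failures = tests = 0
    for _ in range(n_valves):
        for _ in range(n_cycles):
            life = rng.expovariate(1.0 / true_mttf)
            tests += 1
            if life < interval:
                failures += 1   # found stuck at the proof test
                break           # valve repaired; its clock restarts
    return failures / tests

p_hat = simulate_proof_tests(20000, true_mttf=5.0, interval=1.0, n_cycles=5)
p_true = 1.0 - math.exp(-1.0 / 5.0)  # analytic per-interval failure prob.
```

    Under the exponential model the observed fraction converges to p_true; repeating the exercise with an aging (e.g. Weibull) lifetime model is one way to check whether proof-test results alone can distinguish aging from constant-rate failure.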

  14. Operational specification and forecasting advances for Dst, LEO thermospheric densities, and aviation radiation dose and dose rate

    Science.gov (United States)

    Tobiska, W. Kent

    Space weather’s effects upon the near-Earth environment are due to dynamic changes in the energy transfer processes from the Sun’s photons, particles, and fields. Of the space environment domains that are affected by space weather, the magnetosphere, thermosphere, and even troposphere are key regions that are affected. Space Environment Technologies (SET) has developed and is producing innovative space weather applications. Key operational systems for providing timely information about the effects of space weather on these domains are SET’s Magnetosphere Alert and Prediction System (MAPS), LEO Alert and Prediction System (LAPS), and Automated Radiation Measurements for Aviation Safety (ARMAS) system. MAPS provides a forecast Dst index out to 6 days through the data-driven, redundant data stream Anemomilos algorithm. Anemomilos uses observational proxies for the magnitude, location, and velocity of solar ejecta events. This forecast index is used by satellite operations to characterize upcoming geomagnetic storms, for example. In addition, an ENLIL/Rice Dst prediction out to several days has also been developed and will be described. LAPS is the SET fully redundant operational system providing recent history, current epoch, and forecast solar and geomagnetic indices for use in operational versions of the JB2008 thermospheric density model. The thermospheric densities produced by that system, driven by the LAPS data, are forecast to 72-hours to provide the global mass densities for satellite operators. ARMAS is a project that has successfully demonstrated the operation of a micro dosimeter on aircraft to capture the real-time radiation environment due to Galactic Cosmic Rays and Solar Energetic Particles. The dose and dose-rates are captured on aircraft, downlinked in real-time via the Iridium satellites, processed on the ground, incorporated into the most recent NAIRAS global radiation climatology data runs, and made available to end users via the web and

  15. Stability of the superconductive operating mode in high current-density devices

    International Nuclear Information System (INIS)

    Wipf, S.L.

    1979-01-01

    The superconductive operating mode represents a thermal equilibrium that can tolerate a certain amount of disturbance before it is lost. The basin of attraction (BOA), in many ways equivalent to a potential well, is a measure of the size of disturbance needed to lift the device from the superconductive into a resistive operating mode. The BOA for a simple geometry is calculated and discussed. Experimental results are reported, showing how the concept is used to gain information on the disturbances occurring in a superconducting device

  16. Compatibility of advanced tokamak plasma with high density and high radiation loss operation in JT-60U

    International Nuclear Information System (INIS)

    Takenaga, H.; Asakura, N.; Kubo, H.; Higashijima, S.; Konoshima, S.; Nakano, T.; Oyama, N.; Ide, S.; Fujita, T.; Takizuka, T.; Kamada, Y.; Miura, Y.; Porter, G.D.; Rognlien, T.D.; Rensink, M.E.

    2005-01-01

    Compatibility of advanced tokamak plasmas with high density and high radiation loss has been investigated in both reversed shear (RS) plasmas and high β_p H-mode plasmas with a weak positive shear on JT-60U. In the RS plasmas, the operation regime is extended to high density above the Greenwald density (n_GW) with high confinement (HH_y2 > 1) and high radiation loss fraction (f_rad > 0.9) by tailoring the internal transport barriers (ITBs). High confinement of HH_y2 = 1.2 is sustained even with 80% radiation from the main plasma enhanced by accumulated metal impurity. The divertor radiation is enhanced by Ne seeding and the ratio of the divertor radiation to the total radiation is increased from 20% without seeding to 40% with Ne seeding. In the high β_p H-mode plasmas, high confinement (HH_y2 = 0.96) is maintained at high density (n̄_e/n_GW = 0.92) with high radiation loss fraction (f_rad ∼ 1) by utilizing high-field-side pellets and Ar injection. The high n̄_e/n_GW is obtained due to the formation of a clear density ITB. Strong core-edge parameter linkage is observed, as in the case without Ar injection. In this linkage, the pedestal β_p, defined as β_p,ped = p_ped/(B_p²/2μ₀) where p_ped is the plasma pressure at the pedestal top, is enhanced with the total β_p. The radiation profile in the main plasma is peaked due to Ar accumulation inside the ITB and the measured central radiation is ascribed to Ar. The impurity transport analyses indicate that Ar accumulation by a factor of 2 more than the electrons, as observed in the high β_p H-mode plasma, is acceptable even with a peaked density profile in a fusion reactor with impurity seeding. (author)

  17. Harvest operations for density management: planning requirements, production, costs, stand damage, and recommendations

    Science.gov (United States)

    Loren D. Kellogg; Stephen J. Pilkerton

    2013-01-01

    Since the early 1990s, several studies have been undertaken to determine the planning requirements, productivity, costs, and residual stand damage of harvest operations in thinning treatments designed to promote development of complex forest structure in order to enhance ecological functioning and biological diversity. These studies include the Oregon State...

  18. CH-54 Operational Statistics

    Science.gov (United States)

    1976-02-01

  19. Safeguarding subcriticality during loading and shuffling operations in the higher density of the RSG-GAS's silicide core

    International Nuclear Information System (INIS)

    Sembiring, T.M.; Kuntoro, I.

    2003-01-01

    The core conversion program of the RSG-GAS reactor is to convert the all-oxide core to an all-silicide core. The silicide equilibrium core with a fuel meat density of 3.55 gU cm⁻³ is an optimal core for the RSG-GAS reactor and can significantly increase the operation cycle length from 25 to 32 full power days. Nevertheless, the subcriticality of the shutdown core and the shutdown margin are lower than those of the oxide core. Therefore, the deviation of the subcriticality condition in the higher-density silicide core caused by fuel loading and shuffling errors should be reanalysed. The objective of this work is to analyse the sufficiency of the subcriticality of the shutdown core under the worst conditions caused by an error during loading and shuffling operations. The calculations were carried out using the 2-dimensional multigroup neutron diffusion code Batan-FUEL. For the fuel handling error, the calculated results showed that the subcriticality of the shutdown higher-density silicide equilibrium core of RSG-GAS can be maintained. Therefore, all fuel management steps fixed in the present reactor operation manual can be applied in the higher-density silicide equilibrium core of the RSG-GAS reactor. (author)

  20. Effects of watershed densities of animal feeding operations on nutrient concentrations and estrogenic activity in agricultural streams.

    Science.gov (United States)

    Ciparis, Serena; Iwanowicz, Luke R; Voshell, J Reese

    2012-01-01

    Application of manures from animal feeding operations (AFOs) as fertilizer on agricultural land can introduce nutrients and hormones (e.g. estrogens) to streams. A landscape-scale study was conducted in the Shenandoah River watershed (Virginia, USA) in order to assess the relationship between densities of AFOs in watersheds of agricultural streams and in-stream nutrient concentrations and estrogenic activity. The effect of wastewater treatment plants (WWTPs) on nutrients and estrogenic activity was also evaluated. During periods of high and low flow, dissolved inorganic nitrogen (DIN) and orthophosphate (PO4-P) concentrations were analyzed and estrogens/estrogenic compounds were extracted and quantified as 17β-estradiol equivalents (E2Eq) using a bioluminescent yeast estrogen screen. Estrogenic activity was measurable in the majority of collected samples, and 20% had E2Eq concentrations >1 ng/L. Relatively high concentrations of DIN (>1000 μg/L) were also frequently detected. During all sampling periods, there were strong relationships between watershed densities of AFOs and in-stream concentrations of DIN (R² = 0.56-0.81) and E2Eq (R² = 0.39-0.75). Relationships between watershed densities of AFOs and PO4-P were weaker, but were also significant (R² = 0.27-0.57). When combined with the effect of watershed AFO density, streams receiving WWTP effluent had higher concentrations of PO4-P than streams without WWTP discharges, and PO4-P was the only analyte with a consistent relationship to WWTPs. The results of this study suggest that as the watershed density of AFOs increases, there is a proportional increase in the potential for nonpoint source pollution of agricultural streams and their receiving waters by nutrients, particularly DIN, and compounds that can cause endocrine disruption in aquatic organisms. Copyright © 2011 Elsevier B.V. All rights reserved.
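
    The reported relationships are coefficients of determination from regressions of in-stream concentrations on watershed AFO density. A minimal sketch of computing such an R², using synthetic numbers rather than the study's measurements:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1.0 - np.sum(residuals**2) / np.sum((y - y.mean())**2)

# Synthetic data: AFO density per watershed vs. DIN concentration (ug/L)
afo_density = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
din = np.array([200.0, 600.0, 1100.0, 1300.0, 2100.0, 2300.0, 3000.0])

print(round(r_squared(afo_density, din), 3))
```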

  2. The influence of changes in the VVER-1000 fuel assembly shape during operation on the power density distribution

    Energy Technology Data Exchange (ETDEWEB)

    Shishkov, L. K., E-mail: Shishkov-LK@nrcki.ru; Gorodkov, S. S.; Mikailov, E. F.; Sukhino-Homenko, E. A.; Sumarokova, A. S., E-mail: Sumarokova-AS@nrcki.ru [National Research Center Kurchatov Institute (Russian Federation)

    2016-12-15

    A new approach to calculating the sensitivity coefficients of the fuel pin power to deviations in gap sizes between fuel assemblies of the VVER-1000 reactor during its operation is proposed. It is shown that the calculations by the MCU code should be performed for a full-size model of the core in order to take the mutual interference of the gap influences into account. In order to reduce the conservatism of calculations, the coolant density and coolant temperature feedbacks should be taken into account, as well as the fuel burnup.

  3. Demonstration of statistical approaches to identify component's ageing by operational data analysis-A case study for the ageing PSA network

    International Nuclear Information System (INIS)

    Rodionov, Andrei; Atwood, Corwin L.; Kirchsteiger, Christian; Patrik, Milan

    2008-01-01

    The paper presents some results of a case study on 'Demonstration of statistical approaches to identify the component's ageing by operational data analysis', which was carried out within the framework of the EC JRC Ageing PSA Network. Several techniques (visual evaluation, nonparametric and parametric hypothesis tests) were proposed and applied in order to demonstrate the capacity, advantages and limitations of statistical approaches to identify component ageing from operational data. Engineering considerations are outside the scope of the present study.
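
    The abstract does not name the specific hypothesis tests; one standard nonparametric option for detecting an ageing (increasing failure rate) trend in component event data is the Laplace test, sketched here under that assumption:

```python
import math

def laplace_trend_statistic(event_times, T):
    """Laplace trend test for a point process observed on (0, T].
    Under a constant failure rate, u is approximately standard normal;
    a large positive u indicates an increasing (ageing) trend."""
    n = len(event_times)
    mean_time = sum(event_times) / n
    return (mean_time - T / 2.0) / (T * math.sqrt(1.0 / (12.0 * n)))

# Failure times clustered late in the observation window -> positive u
times = [4.0, 6.5, 7.2, 8.1, 8.8, 9.5]
u = laplace_trend_statistic(times, T=10.0)
print(round(u, 2))  # ~1.99, significant at the 5% one-sided level
```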

  4. The spectral density of the QCD Dirac operator and patterns of chiral symmetry breaking

    International Nuclear Information System (INIS)

    Toublan, D.; Verbaarschot, J.J.M.

    1999-01-01

    We study the spectrum of the QCD Dirac operator for two colors with fermions in the fundamental representation and for two or more colors with adjoint fermions. For N_f flavors, the chiral flavor symmetry of these theories is spontaneously broken according to SU(2N_f) → Sp(2N_f) and SU(N_f) → O(N_f), respectively, rather than the symmetry breaking pattern SU(N_f) × SU(N_f) → SU(N_f) for QCD with three or more colors and fundamental fermions. In this paper we study the Dirac spectrum for the first two symmetry breaking patterns. Following previous work for the third case, we find the Dirac spectrum in the domain λ ≪ Λ_QCD by means of partially quenched chiral perturbation theory. In particular, this result allows us to calculate the slope of the Dirac spectrum at λ = 0. We also show that for λ ≲ 1/(L²Λ_QCD) (with L the linear size of the system) the Dirac spectrum is given by a chiral Random Matrix Theory with the symmetries of the Dirac operator

  5. The european flood alert system EFAS – Part 2: Statistical skill assessment of probabilistic and deterministic operational forecasts

    Directory of Open Access Journals (Sweden)

    J. C. Bartholmes

    2009-02-01

    Since 2005 the European Flood Alert System (EFAS) has been producing probabilistic hydrological forecasts in pre-operational mode at the Joint Research Centre (JRC) of the European Commission. EFAS aims at increasing preparedness for floods in trans-national European river basins by providing medium-range deterministic and probabilistic flood forecasting information, from 3 to 10 days in advance, to national hydro-meteorological services.

    This paper is Part 2 of a study presenting the development and skill assessment of EFAS. In Part 1, the scientific approach adopted in the development of the system has been presented, as well as its basic principles and forecast products. In the present article, two years of existing operational EFAS forecasts are statistically assessed and the skill of EFAS forecasts is analysed with several skill scores. The analysis is based on the comparison of threshold exceedances between proxy-observed and forecasted discharges. Skill is assessed both with and without taking into account the persistence of the forecasted signal during consecutive forecasts.

    Skill assessment approaches are mostly adopted from meteorology and the analysis also compares probabilistic and deterministic aspects of EFAS. Furthermore, the utility of different skill scores is discussed and their strengths and shortcomings illustrated. The analysis shows the benefit of incorporating past forecasts in the probability analysis, for medium-range forecasts, which effectively increases the skill of the forecasts.
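
    Of the skill scores adopted from meteorology, the Brier score is among the simplest for probabilistic forecasts of a binary event such as threshold exceedance; a sketch with invented forecast values (not EFAS data):

```python
import numpy as np

def brier_score(prob_forecasts, outcomes):
    """Mean squared difference between forecast exceedance probabilities
    and observed binary outcomes: 0 is perfect, 1 is worst."""
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    return float(np.mean((p - o) ** 2))

# Forecast probability that discharge exceeds the alert threshold,
# and whether the proxy-observed discharge actually exceeded it
p = [0.9, 0.7, 0.2, 0.1, 0.6]
obs = [1, 1, 0, 0, 1]
print(brier_score(p, obs))
```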

  6. Density functional theory calculations of H/D isotope effects on polymer electrolyte membrane fuel cell operations

    Energy Technology Data Exchange (ETDEWEB)

    Yanase, Satoshi; Oi, Takao [Sophia Univ., Tokyo (Japan). Faculty of Science and Technology

    2015-10-01

    To elucidate hydrogen isotope effects observed between fuel and exhaust hydrogen gases during polymer electrolyte membrane fuel cell operations, H-to-D reduced partition function ratios (RPFRs) for the hydrogen species in the Pt catalyst phase of the anode and the electrolyte membrane phase of the fuel cell were evaluated by density functional theory calculations on model species of the two phases. The evaluation yielded 3.2365 as the value of the equilibrium constant of the hydrogen isotope exchange reaction between the two phases at 39 °C, which was close to the experimentally estimated value of 3.46-3.99 at the same temperature. It was indicated that H⁺ ions on the Pt catalyst surface of the anode and H species in the electrolyte membrane phase were isotopically in equilibrium with one another during fuel cell operations.
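
    The equilibrium constant of the H/D exchange between the two phases follows from the ratio of their reduced partition function ratios. The RPFR numbers below are placeholders for illustration only, not the paper's DFT-computed values:

```python
# Equilibrium constant of an isotope exchange reaction between two phases,
# K = RPFR(phase A) / RPFR(phase B). Placeholder RPFR values only.
rpfr_membrane = 10.0   # (s/s')f for H species in the electrolyte membrane
rpfr_catalyst = 4.0    # (s/s')f for H+ on the Pt catalyst surface
K = rpfr_membrane / rpfr_catalyst
print(K)  # K > 1: deuterium concentrates in the phase with larger RPFR
```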

  7. High beta tokamak operation in DIII-D limited at low density/collisionality by resistive tearing modes

    International Nuclear Information System (INIS)

    La Haye, R.J.; Lao, L.L.; Strait, E.J.; Taylor, T.S.

    1997-01-01

    The maximum operational high beta in single-null divertor (SND) long pulse tokamak discharges in the DIII-D tokamak with a cross-sectional shape similar to the proposed International Thermonuclear Experimental Reactor (ITER) device is found to be limited by the onset of resistive instabilities that have the characteristics of neoclassically destabilized tearing modes. There is a soft limit due to the onset of an m/n=3/2 rotating tearing mode that saturates at low amplitude and a hard limit at slightly higher beta due to the onset of an m/n=2/1 rotating tearing mode that grows, slows down and locks. By operating at higher density and thus collisionality, the practical beta limit due to resistive tearing modes approaches the ideal magnetohydrodynamic (MHD) limit. (author). 15 refs, 4 figs

  8. Determination of absolute Ba densities during dimming operation of fluorescent lamps by laser-induced fluorescence measurements

    International Nuclear Information System (INIS)

    Hadrath, S; Beck, M; Garner, R C; Lieder, G; Ehlbeck, J

    2007-01-01

    Investigations of fluorescent lamps (FLs) are often focused on the electrodes, since the lifetime of the lamps is typically limited by the electrode lifetime and durability. During steady-state operation, the work-function-lowering emitter material, in particular barium, is lost. Greater barium losses occur under dimming conditions, in which reduced discharge currents lead to increased cathode falls, the result of the otherwise diminished heating of the electrode by the bombarding plasma ions. In this work the barium density near the electrodes of FLs operating in high-frequency dimming mode is investigated using the high-sensitivity method of laser-induced fluorescence. From these measurements we infer barium loss for a range of discharge currents and auxiliary coil heating currents. We show that the Ba loss can very easily be reduced by moderate auxiliary coil heating

  9. Statistics of strain rates and surface density function in a flame-resolved high-fidelity simulation of a turbulent premixed bluff body burner

    Science.gov (United States)

    Sandeep, Anurag; Proch, Fabian; Kempf, Andreas M.; Chakraborty, Nilanjan

    2018-06-01

    The statistical behavior of the surface density function (SDF, the magnitude of the reaction progress variable gradient) and the strain rates, which govern the evolution of the SDF, have been analyzed using a three-dimensional flame-resolved simulation database of a turbulent lean premixed methane-air flame in a bluff-body configuration. It has been found that the turbulence intensity increases with the distance from the burner, changing the flame curvature distribution and increasing the probability of negative curvature in the downstream direction. The curvature dependences of the dilatation rate ∇·u⃗ and the displacement speed S_d give rise to variations of these quantities in the axial direction. These variations affect the nature of the alignment between the progress variable gradient and the local principal strain rates, which in turn affects the mean flame normal strain rate, which assumes positive values close to the burner but increasingly becomes negative as the effect of turbulence increases with the axial distance from the burner exit. The axial distance dependences of the curvature and displacement speed also induce a considerable variation in the mean value of the curvature stretch. The axial distance dependences of the dilatation rate and flame normal strain rate govern the behavior of the flame tangential strain rate, and its mean value increases in the downstream direction. The current analysis indicates that the statistical behaviors of different strain rates and displacement speed and their curvature dependences need to be included in the modeling of flame surface density and scalar dissipation rate in order to accurately capture their local behaviors.
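
    The surface density function is just the magnitude of the progress-variable gradient, |∇c|; a one-dimensional sketch with an assumed tanh flame profile (not the DNS data):

```python
import numpy as np

# Progress variable c on a 1-D grid: tanh profile mimicking a flame front
x = np.linspace(-1.0, 1.0, 401)
delta = 0.1                              # assumed flame thickness scale
c = 0.5 * (1.0 + np.tanh(x / delta))

# Surface density function: magnitude of the progress-variable gradient
sdf = np.abs(np.gradient(c, x))

# |grad c| peaks at the flame front (c ~ 0.5) and vanishes far from it
i_max = int(np.argmax(sdf))
print(round(float(c[i_max]), 2))
```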

  10. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of extreme response and failure probability of structures subjected to ultimate design loads is essential for structural design of wind turbines according to the new standard IEC61400-1. This task is focused on in the present paper by virtue of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-megawatt wind turbines, for illustrative purposes, in case of given mean wind speeds and turbulence levels is investigated through the scheme of extreme value distribution instead of any other approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...
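
    PDEM itself propagates the full probability density, but the target quantity — a short-term rare failure probability from the extreme-value distribution of the response — can be illustrated with plain Monte Carlo under an assumed distribution and threshold:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normalized 10-min extreme responses of a wind turbine
# component, assumed Gumbel-distributed for illustration only
extremes = rng.gumbel(loc=1.0, scale=0.15, size=200_000)

threshold = 2.0                # assumed normalized design resistance
p_fail = float(np.mean(extremes > threshold))
print(p_fail)                  # analytic value ~1.3e-3 for these numbers
```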

  11. Implication of nonintegral occupation number and Fermi-Dirac statistics in the local-spin-density approximation applied to finite systems

    International Nuclear Information System (INIS)

    Dhar, S.

    1989-01-01

    In electronic-structure calculations for finite systems using the local-spin-density (LSD) approximation, it is assumed that the eigenvalues of the Kohn-Sham equation should obey Fermi-Dirac (FD) statistics. In order to comply with this assumption for some of the transition-metal atoms, a nonintegral occupation number is used which also minimizes the total energy. It is shown here that for finite systems it is not necessary that the eigenvalues of the Kohn-Sham equation obey FD statistics. It is also shown that the Kohn-Sham exchange potential used in all LSD models is correct only for integer occupation number. With a noninteger occupation number the LSD exchange potential will be smaller than that given by the Kohn-Sham potential. Ab initio self-consistent spin-polarized calculations have been performed numerically for the total energy of an iron atom. It is found that the ground state belongs to the 3d⁶4s² configuration. The ionization potentials of all the Feⁿ⁺ ions are reported and are in agreement with experiment

  12. Statistically Enhanced Model of In Situ Oil Sands Extraction Operations: An Evaluation of Variability in Greenhouse Gas Emissions.

    Science.gov (United States)

    Orellana, Andrea; Laurenzi, Ian J; MacLean, Heather L; Bergerson, Joule A

    2018-02-06

    Greenhouse gas (GHG) emissions associated with extraction of bitumen from oil sands can vary from project to project and over time. However, the nature and magnitude of this variability have yet to be incorporated into life cycle studies. We present a statistically enhanced life cycle based model (GHOST-SE) for assessing variability of GHG emissions associated with the extraction of bitumen using in situ techniques in Alberta, Canada. It employs publicly available, company-reported operating data, facilitating assessment of inter- and intraproject variability as well as the time evolution of GHG emissions from commercial in situ oil sands projects. We estimate the median GHG emissions associated with bitumen production via cyclic steam stimulation (CSS) to be 77 kg CO₂eq/bbl bitumen (80% CI: 61-109 kg CO₂eq/bbl), and via steam assisted gravity drainage (SAGD) to be 68 kg CO₂eq/bbl bitumen (80% CI: 49-102 kg CO₂eq/bbl). We also show that the median emissions intensity of Alberta's CSS and SAGD projects have been relatively stable from 2000 to 2013, despite greater than 6-fold growth in production. Variability between projects is the single largest source of variability (driven in part by reservoir characteristics) but intraproject variability (e.g., startups, interruptions), is also important and must be considered in order to inform research or policy priorities.
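
    The reported medians with 80% confidence intervals are percentile summaries over company-reported operating data; a sketch using synthetic project intensities (not the study's data):

```python
import numpy as np

def median_with_80pct_interval(samples):
    """Median plus the 10th-90th percentile (80%) interval."""
    s = np.asarray(samples, dtype=float)
    return (float(np.median(s)),
            float(np.percentile(s, 10)),
            float(np.percentile(s, 90)))

# Synthetic project-level emission intensities, kg CO2eq/bbl bitumen
intensities = [55, 61, 64, 68, 70, 73, 77, 84, 95, 109]
med, lo, hi = median_with_80pct_interval(intensities)
print(med, lo, hi)
```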

  13. NOx, Soot, and Fuel Consumption Predictions under Transient Operating Cycle for Common Rail High Power Density Diesel Engines

    Directory of Open Access Journals (Sweden)

    N. H. Walke

    2016-01-01

    Diesel engines are presently facing the challenge of controlling NOx and soot emissions on transient cycles, to meet stricter emission norms and to control emissions during field operations. Development of a simulation tool for NOx and soot emissions prediction on transient operating cycles has become the most important objective, as it can significantly reduce the experimentation time and cost required for tuning these emissions. Hence, in this work, a 0D comprehensive predictive model has been formulated with the selection and coupling of appropriate combustion and emissions models to engine cycle models. The selected combustion and emissions models are further modified to improve their prediction accuracy in the full operating zone. Responses of the combustion and emissions models have been validated for load and "start of injection" changes. Model-predicted transient fuel consumption, air handling system parameters, and NOx and soot emissions are in good agreement with measured data on a turbocharged high power density common rail engine for the "nonroad transient cycle" (NRTC). It can be concluded that 0D models can be used for prediction of transient emissions on modern engines. How the formulated approach can be extended to transient emissions prediction for other applications and fuels is also discussed.

  14. Quantum information density scaling and qubit operation time constraints of CMOS silicon-based quantum computer architectures

    Science.gov (United States)

    Rotta, Davide; Sebastiano, Fabio; Charbon, Edoardo; Prati, Enrico

    2017-06-01

    range of a silicon complementary metal-oxide-semiconductor quantum processor to be within 1 and 100 GHz. Such constraint limits the feasibility of fault-tolerant quantum information processing with complementary metal-oxide-semiconductor technology only to the most advanced nodes. The compatibility with classical complementary metal-oxide-semiconductor control circuitry is discussed, focusing on the cryogenic complementary metal-oxide-semiconductor operation required to bring the classical controller as close as possible to the quantum processor and to enable interfacing thousands of qubits on the same chip via time-division, frequency-division, and space-division multiplexing. The operation time range prospected for cryogenic control electronics is found to be compatible with the operation time expected for qubits. By combining the forecast of the development of scaled technology nodes with operation time and classical circuitry constraints, we derive a maximum quantum information density for logical qubits of 2.8 and 4 Mqb/cm² for the 10- and 7-nm technology nodes, respectively, for the Steane code. The density is one and two orders of magnitude less for surface codes and for concatenated codes, respectively. Such values provide a benchmark for the development of fault-tolerant quantum algorithms by circuital quantum information based on silicon platforms and a guideline for other technologies in general.

  15. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co...
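
    The equilibrium carrier statistics the book develops rest on the Fermi-Dirac occupation factor; a minimal numeric sketch:

```python
import math

def fermi_dirac(E_eV, mu_eV, T_K):
    """Fermi-Dirac occupation probability of a level at energy E."""
    k_B = 8.617333262e-5           # Boltzmann constant in eV/K
    return 1.0 / (1.0 + math.exp((E_eV - mu_eV) / (k_B * T_K)))

# At the chemical potential the occupancy is exactly 1/2
print(fermi_dirac(0.5, 0.5, 300.0))
# A level 0.2 eV above mu is sparsely occupied at room temperature
print(round(fermi_dirac(0.7, 0.5, 300.0), 5))
```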

  16. A statistical model for estimation of fish density including correlation in size, space, time and between species from research survey data.

    Directory of Open Access Journals (Sweden)

    J Rasmus Nielsen

    Trawl survey data with high spatial and seasonal coverage were analysed using a variant of the Log Gaussian Cox Process (LGCP) statistical model to estimate unbiased relative fish densities. The model estimates correlations between observations according to time, space, and fish size and includes zero observations and over-dispersion. The model utilises the fact that the correlation between numbers of fish caught increases when the distance in space and time between the fish decreases, and that the correlation between size groups in a haul increases when the difference in size decreases. Here the model is extended in two ways. Instead of assuming a natural-scale size correlation, the model is further developed to allow for a transformed length scale. Furthermore, in the present application, the spatial- and size-dependent correlation between species was included. For cod (Gadus morhua) and whiting (Merlangius merlangus), a common structured size correlation was fitted, and a separable structure between the time and space-size correlation was found for each species, whereas more complex structures were required to describe the correlation between species (and space-size). The within-species time correlation is strong, whereas the correlations between the species are weaker over time but strong within the year.
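
    The separable space-time-size correlation can be sketched as a product of decaying factors; the exponential form and the decorrelation ranges below are illustrative assumptions, not the fitted LGCP structure:

```python
import math

def correlation(dist_km, dt_days, dsize_cm,
                range_km=50.0, range_days=30.0, range_cm=5.0):
    """Separable exponential correlation in space, time and fish size."""
    return (math.exp(-dist_km / range_km)
            * math.exp(-dt_days / range_days)
            * math.exp(-dsize_cm / range_cm))

near = correlation(dist_km=1.0, dt_days=1.0, dsize_cm=0.5)
far = correlation(dist_km=100.0, dt_days=60.0, dsize_cm=10.0)
print(round(near, 3), round(far, 6))  # closer pairs are more correlated
```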

  17. Quantum-statistical kinetic equations

    International Nuclear Information System (INIS)

    Loss, D.; Schoeller, H.

    1989-01-01

    Considering a homogeneous normal quantum fluid consisting of identical interacting fermions or bosons, the authors derive an exact quantum-statistical generalized kinetic equation with a collision operator given as an explicit cluster series where exchange effects are included through renormalized Liouville operators. This new result is obtained by applying a recently developed superoperator formalism (Liouville operators, cluster expansions, symmetrized projectors, P_q-rule, etc.) to nonequilibrium systems described by a density operator ρ(t) which obeys the von Neumann equation. By means of this formalism a factorization theorem is proven (essential for obtaining closed equations), and partial resummations (leading to renormalized quantities) are performed. As an illustrative application, the quantum-statistical versions (including exchange effects due to Fermi-Dirac or Bose-Einstein statistics) of the homogeneous Boltzmann (binary collisions) and Choh-Uhlenbeck (triple collisions) equations are derived

  18. Successful operation of continuous reactors at short retention times results in high-density, fast-rate Dehalococcoides dechlorinating cultures.

    Science.gov (United States)

    Delgado, Anca G; Fajardo-Williams, Devyn; Popat, Sudeep C; Torres, César I; Krajmalnik-Brown, Rosa

    2014-03-01

    The discovery of Dehalococcoides mccartyi reducing perchloroethene and trichloroethene (TCE) to ethene was a key landmark for bioremediation applications at contaminated sites. D. mccartyi-containing cultures are typically grown in batch-fed reactors. On the other hand, continuous cultivation of these microorganisms has been described only at long hydraulic retention times (HRTs). We report the cultivation of a representative D. mccartyi-containing culture in continuous stirred-tank reactors (CSTRs) at a short, 3-d HRT, using TCE as the electron acceptor. We successfully operated 3-d HRT CSTRs for up to 120 days and observed sustained dechlorination of TCE at influent concentrations of 1 and 2 mM TCE to ≥97% ethene, coupled to the production of 10¹² D. mccartyi cells L⁻¹ of culture. These outcomes were possible in part by using a medium with low bicarbonate concentrations (5 mM) to minimize the excessive proliferation of microorganisms that use bicarbonate as an electron acceptor and compete with D. mccartyi for H₂. The maximum conversion rates for the CSTR-produced culture were 0.13 ± 0.016, 0.06 ± 0.018, and 0.02 ± 0.007 mmol Cl⁻ L⁻¹ h⁻¹, respectively, for TCE, cis-dichloroethene, and vinyl chloride. The CSTR operation described here provides the fastest laboratory cultivation rate of high-cell-density Dehalococcoides cultures reported in the literature to date. This cultivation method provides a fundamental scientific platform for potential future operations of such a system at larger scales.
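
    Why the short retention time matters follows from the standard chemostat balance: at steady state the dilution rate D = 1/HRT must not exceed the culture's maximum growth rate, or the organisms wash out. A sketch with an assumed growth rate (not a measured value):

```python
# Chemostat washout criterion: persistence requires mu_max >= D = 1/HRT
hrt_days = 3.0
D = 1.0 / hrt_days            # dilution rate, d^-1
mu_max = 0.4                  # assumed max growth rate, d^-1 (placeholder)
print(round(D, 3), mu_max > D)   # culture persists if True
```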

  19. Concentration, size, and density of total suspended particulates at the air exhaust of concentrated animal feeding operations.

    Science.gov (United States)

    Yang, Xufei; Lee, Jongmin; Zhang, Yuanhui; Wang, Xinlei; Yang, Liangcheng

    2015-08-01

    Total suspended particulate (TSP) samples were seasonally collected at the air exhaust of 15 commercial concentrated animal feeding operations (CAFOs; including swine finishing, swine farrowing, swine gestation, laying hen, and tom turkey) in the U.S. Midwest. The measured TSP concentrations ranged from 0.38 ± 0.04 mg m⁻³ (swine gestation in summer) to 10.9 ± 3.9 mg m⁻³ (tom turkey in winter) and were significantly affected by animal species, housing facility type, feeder type (dry or wet), and season. The average particle size of collected TSP samples in terms of mass median equivalent spherical diameter ranged from 14.8 ± 0.5 µm (swine finishing in winter) to 30.5 ± 2.0 µm (tom turkey in summer) and showed a significant seasonal effect. This finding affirmed that particulate matter (PM) released from CAFOs contains a significant portion of large particles. The measured particle size distribution (PSD) and the density of deposited particles (on average 1.65 ± 0.13 g cm⁻³) were used to estimate the mass fractions of PM10 and PM2.5 (PM ≤ 10 and ≤ 2.5 μm, respectively) in the collected TSP. The results showed that the PM10 fractions ranged from 12.7 ± 5.1% (tom turkey) to 21.1 ± 3.2% (swine finishing), whereas the PM2.5 fractions ranged from 3.4 ± 1.9% (tom turkey) to 5.7 ± 3.2% (swine finishing) and were smaller than 9.0% at all visited CAFOs. This study applied a filter-based method for PSD measurement and deposited particles as a surrogate to estimate the TSP's particle density. The limitations, along with the assumptions adopted during the calculation of PM mass fractions, must be recognized when comparing the findings to other studies.
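
    The step from a measured PSD and particle density to PM10/PM2.5 mass fractions can be sketched by assuming a lognormal mass-weighted size distribution (the study used filter-based measurements; the MMD and GSD here are merely illustrative):

```python
import math

def mass_fraction_below(d_cut_um, mmd_um, gsd):
    """Cumulative mass fraction below a cutoff diameter for a lognormal
    mass-weighted particle size distribution."""
    z = math.log(d_cut_um / mmd_um) / math.log(gsd)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative parameters in the range reported for CAFO dust
mmd, gsd = 20.0, 2.5        # mass median diameter (um), geometric std dev
pm10 = mass_fraction_below(10.0, mmd, gsd)
pm25 = mass_fraction_below(2.5, mmd, gsd)
print(round(pm10, 3), round(pm25, 3))
```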

  20. Wind energy statistics

    International Nuclear Information System (INIS)

    Holttinen, H.; Tammelin, B.; Hyvoenen, R.

    1997-01-01

    The recording, analysis and publishing of statistics on wind energy production has been reorganized in cooperation between VTT Energy, the Finnish Meteorological Institute (FMI Energy) and the Finnish Wind Energy Association (STY), supported by the Ministry of Trade and Industry (KTM). VTT Energy has developed a database that contains both monthly data and information on the wind turbines, sites and operators involved. The monthly production figures together with component failure statistics are collected from the operators by VTT Energy, which produces the final wind energy statistics to be published in Tuulensilmae and reported to energy statistics in Finland and abroad (Statistics Finland, Eurostat, IEA). To be able to compare the annual and monthly wind energy potential with the average wind energy climate, a production index is adopted. The index gives the expected wind energy production at various areas in Finland, calculated using real wind speed observations, air density and a power curve for a typical 500 kW wind turbine. FMI Energy has produced the average figures for four weather stations using data from 1985-1996, and produces the monthly figures. (orig.)
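
    A production index of the kind described — expected output from the observed wind climate and a typical 500 kW power curve — can be sketched as a histogram-weighted sum. The power curve and hour counts below are invented:

```python
def expected_production_kwh(speed_bins_ms, hours_per_bin, power_curve_kw):
    """Expected energy (kWh) from observed hours in each wind-speed bin
    and a turbine power curve given as kW per integer wind speed."""
    return sum(power_curve_kw.get(int(v), 0) * h
               for v, h in zip(speed_bins_ms, hours_per_bin))

# Hypothetical 500 kW turbine power curve, kW by wind-speed bin (m/s)
curve = {4: 25, 6: 95, 8: 230, 10: 390, 12: 480, 14: 500}
speeds = [4, 6, 8, 10, 12, 14]
hours = [120, 160, 140, 90, 40, 10]     # observed hours in each bin
print(expected_production_kwh(speeds, hours, curve))  # 109700 kWh
```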

  1. Gamow-Jordan vectors and non-reducible density operators from higher-order S-matrix poles

    International Nuclear Information System (INIS)

    Bohm, A.; Loewe, M.; Maxson, S.; Patuleanu, P.; Puentmann, C.; Gadella, M.

    1997-01-01

    In analogy to Gamow vectors that are obtained from first-order resonance poles of the S-matrix, one can also define higher-order Gamow vectors which are derived from higher-order poles of the S-matrix. An S-matrix pole of r-th order at z_R = E_R - iΓ/2 leads to r generalized eigenvectors of order k = 0, 1, ..., r-1, which are also Jordan vectors of degree (k+1) with generalized eigenvalue (E_R - iΓ/2). The Gamow-Jordan vectors are elements of a generalized complex eigenvector expansion, whose form suggests the definition of a state operator (density matrix) for the microphysical decaying state of this higher-order pole. This microphysical state is a mixture of non-reducible components. In spite of the fact that the k-th order Gamow-Jordan vectors have the polynomial time dependence which one always associates with higher-order poles, the microphysical state obeys a purely exponential decay law. copyright 1997 American Institute of Physics

  2. Program for analysis and evaluation of operational data in nuclear power plants - statistics of operational incidents from 1984 to 1985 at Angra-1

    International Nuclear Information System (INIS)

    Soares, H.V.; Ambros, P.C.; Araujo, J.B.

    1987-01-01

    The reports of CNEN for the years 1984-1985 regarding reactor incidents are presented. These reports are divided into five classes: reactor shutdown, operation of the core emergency cooling system, power plant operation at limit conditions, radioactive contamination and overexposure. (A.C.A.S.)

  3. Quantum mechanics as applied mathematical statistics

    International Nuclear Information System (INIS)

    Skala, L.; Cizek, J.; Kapsa, V.

    2011-01-01

    The basic mathematical apparatus of quantum mechanics (the wave function, probability density, probability density current, coordinate and momentum operators, the corresponding commutation relation, the Schroedinger equation, kinetic energy, uncertainty relations and the continuity equation) is discussed from the point of view of mathematical statistics. It is shown that the basic structure of quantum mechanics can be understood as a generalization of classical mechanics in which the statistical character of the results of measuring the coordinate and momentum is taken into account and the most important general properties of statistical theories are correctly respected.

  4. Managers’ perspectives: practical experience and challenges associated with variable-density operations and uneven-aged management

    Science.gov (United States)

    Kurtis E. Steele

    2013-01-01

    Variable-density thinning has received a lot of public attention in recent years and has subsequently become standard language in most of the Willamette National Forest’s timber management projects. Many techniques have been tried, with varying on-the-ground successes. To accomplish variable-density thinning, the McKenzie River Ranger District currently uses...

  5. Local and global properties of eigenfunctions and one-electron densities of Coulombic Schrödinger operators

    DEFF Research Database (Denmark)

    Fournais, Søren; Hoffmann-Ostenhof, Maria; Hoffmann-Ostenhof, Thomas

    2008-01-01

    We review recent results by the authors on the regularity of molecular eigenfunctions ψ and their corresponding one-electron densities ρ, as well as of the spherically averaged one-electron atomic density ρ. Furthermore, we prove an exponentially decreasing lower bound for ρ in the case when...

  6. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  7. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
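
The random phasor sums mentioned above have a simple numerical illustration: the sum of many unit phasors with independent uniform phases tends to a circular complex Gaussian, so its amplitude is Rayleigh distributed. A minimal standard-library simulation with illustrative parameters:

```python
import cmath
import math
import random

random.seed(0)
N, TRIALS = 500, 2000

amplitudes = []
for _ in range(TRIALS):
    # Resultant of N unit-amplitude phasors with independent uniform phases
    total = sum(cmath.exp(1j * random.uniform(0, 2 * math.pi)) for _ in range(N))
    amplitudes.append(abs(total))

mean_amp = sum(amplitudes) / TRIALS
# Central-limit prediction: the resultant is a circular complex Gaussian,
# so its amplitude is Rayleigh distributed with mean sqrt(pi * N) / 2
print(mean_amp, math.sqrt(math.pi * N) / 2)
```

The empirical mean amplitude lands close to the Rayleigh prediction, which is the statistical origin of fully developed speckle.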

  8. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  9. Incorporation of statistical distribution of particle properties in chemical reactor design and operation: the cooled tubular reactor

    NARCIS (Netherlands)

    Wijngaarden, R.J.; Westerterp, K.R.

    1992-01-01

    Pellet heat and mass transfer coefficients inside packed beds do not have definite deterministic values, but are stochastic quantities with a certain distribution. Here, a method is presented to incorporate the stochastic distribution of pellet properties in reactor design and operation models. The

  10. Quantum statistical properties of orthonormalized eigenstates of the operator (âf(n̂))^k

    International Nuclear Information System (INIS)

    Wang Jisuo; Jian Feng; Liu Tangkun

    2002-01-01

    The completeness of the k orthonormalized eigenstates of the operator (âf(n̂))^k (k≥3) is investigated. We introduce a new kind of higher-order squeezing and an antibunching. The properties of the Mth-order squeezing and the antibunching effect of the k states are studied. The results show that these states may form a complete Hilbert space, and that the Mth-order (M=(n+1/2)k; n=0,1,...) squeezing effects exist in all of the k states when k is even. There is an antibunching effect in all of the states. An alternative method for constructing the k states is proposed, and the result shows that all of them can be generated by linear superposition of the time-dependent nonlinear coherent states at different instants. (author)

  11. Photocatalytic degradation of malathion using Zn2+-doped TiO2 nanoparticles: statistical analysis and optimization of operating parameters

    Science.gov (United States)

    Nasseri, Simin; Omidvar Borna, Mohammad; Esrafili, Ali; Rezaei Kalantary, Roshanak; Kakavandi, Babak; Sillanpää, Mika; Asadi, Anvar

    2018-02-01

    A Zn2+-doped TiO2 photocatalyst is successfully synthesized by a facile photodeposition method and used in the photocatalytic degradation of the organophosphorus pesticide malathion. The obtained photocatalysts are characterized in detail by X-ray diffraction (XRD), Brunauer-Emmett-Teller (BET) analysis, field emission scanning electron microscopy (FESEM) and transmission electron microscopy (TEM). XRD results confirm the formation of the anatase and rutile phases for the Zn2+-doped TiO2 nanoparticles, with crystallite sizes of 12.9 nm. The Zn2+-doped TiO2 synthesized with 3.0 wt% Zn doping at 200 °C exhibited the best photocatalytic activity. Sixty sets of experiments were conducted using response surface methodology (RSM) by adjusting five operating parameters: initial malathion concentration, catalyst dose, pH and reaction time, each at five levels, plus the presence or absence of UV light. The analysis revealed that all considered parameters are significant in the degradation process in their linear terms. The optimum values of the variables were found to be 177.59 mg/L, 0.99 g/L, 10.99 and 81.04 min for initial malathion concentration, catalyst dose, pH and reaction time, respectively, under UV irradiation (UV ON). Under the optimized conditions, the experimental values of degradation and mineralization were 98 and 74%, respectively. Moreover, the effects of competing anions and H2O2 on the photocatalytic process were also investigated.

  12. A statistical study of coronal densities from X-ray line ratios of helium-like ions - Ne IX and Mg XI

    Science.gov (United States)

    Linford, G. A.; Lemen, J. R.; Strong, K. T.

    1988-01-01

    Since the repair of the Solar Maximum Mission (SMM) spacecraft, the Flat Crystal Spectrometer (FCS) has recorded many high temperature spectra of helium-like ions under a wide variety of coronal conditions including active regions, long duration events, compact events, and double flares. The plasma density and temperature are derived from the ratios R and G, where R = f/i, G = (f + i)/r, and r, f, and i denote the resonance, forbidden, and intercombination line fluxes. A new method for obtaining the density and temperature for events observed with the FCS aboard SMM is presented. The results for these events are presented and compared to earlier results, and the method is evaluated based on these comparisons.
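
The diagnostic ratios defined above are straightforward to compute from measured line fluxes; a minimal sketch with illustrative (not observed) flux values:

```python
def line_ratios(r, f, i):
    """Density- and temperature-sensitive ratios for He-like ions:
    R = f/i (density) and G = (f + i)/r (temperature), where r, f, i
    are the resonance, forbidden and intercombination line fluxes."""
    return f / i, (f + i) / r

# Illustrative fluxes in arbitrary units, not FCS measurements
R, G = line_ratios(r=10.0, f=6.0, i=2.0)
print(R, G)
```

In practice R and G are then compared against atomic-physics calculations of their density and temperature dependence to infer the plasma conditions.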

  13. Evaluation of High Density Air Traffic Operations with Automation for Separation Assurance, Weather Avoidance and Schedule Conformance

    Science.gov (United States)

    Prevot, Thomas; Mercer, Joey S.; Martin, Lynne Hazel; Homola, Jeffrey R.; Cabrall, Christopher D.; Brasil, Connie L.

    2011-01-01

    In this paper we discuss the development and evaluation of our prototype technologies and procedures for far-term air traffic control operations with automation for separation assurance, weather avoidance and schedule conformance. Controller-in-the-loop simulations in the Airspace Operations Laboratory at the NASA Ames Research Center in 2010 have shown very promising results. We found the operations to provide high airspace throughput, excellent efficiency and schedule conformance. The simulation also highlighted areas for improvements: Short-term conflict situations sometimes resulted in separation violations, particularly for transitioning aircraft in complex traffic flows. The combination of heavy metering and growing weather resulted in an increased number of aircraft penetrating convective weather cells. To address these shortcomings technologies and procedures have been improved and the operations are being re-evaluated with the same scenarios. In this paper we will first describe the concept and technologies for automating separation assurance, weather avoidance, and schedule conformance. Second, the results from the 2010 simulation will be reviewed. We report human-systems integration aspects, safety and efficiency results as well as airspace throughput, workload, and operational acceptability. Next, improvements will be discussed that were made to address identified shortcomings. We conclude that, with further refinements, air traffic control operations with ground-based automated separation assurance can routinely provide currently unachievable levels of traffic throughput in the en route airspace.

  14. Quantum formalism for classical statistics

    Science.gov (United States)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  15. Statistical Analysis of Model Data for Operational Space Launch Weather Support at Kennedy Space Center and Cape Canaveral Air Force Station

    Science.gov (United States)

    Bauman, William H., III

    2010-01-01

    The 12-km resolution North American Mesoscale (NAM) model (MesoNAM) is used by the 45th Weather Squadron (45 WS) Launch Weather Officers at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) to support space launch weather operations. The 45 WS tasked the Applied Meteorology Unit (AMU) to conduct an objective statistics-based analysis of MesoNAM output compared to wind tower mesonet observations and then develop an operational tool to display the results. The National Centers for Environmental Prediction began running the current version of the MesoNAM in mid-August 2006. The period of record for the dataset was 1 September 2006 - 31 January 2010. The AMU evaluated MesoNAM hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12 and 18 UTC. The MesoNAM forecast winds, temperature and dew point were compared to the observed values of these parameters from the sensors in the KSC/CCAFS wind tower network. The data sets were stratified by model initialization time, month and onshore/offshore flow for each wind tower. Statistics computed included bias (mean difference), standard deviation of the bias, root mean square error (RMSE) and a hypothesis test for bias = 0. Twelve wind towers located in close proximity to key launch complexes were used for the statistical analysis, with the sensors on the towers positioned at varying heights including 6 ft, 30 ft, 54 ft, 60 ft, 90 ft, 162 ft, 204 ft and 230 ft depending on the launch vehicle and associated weather launch commit criteria being evaluated. These twelve wind towers support activities for the Space Shuttle (launch and landing), Delta IV, Atlas V and Falcon 9 launch vehicles. For all twelve towers, the results indicate a diurnal signal in the bias of temperature (T) and a weaker but discernable diurnal signal in the bias of dewpoint temperature (T(sub d)) in the MesoNAM forecasts. Also, the standard deviation of the bias and RMSE of T, T(sub d), wind speed and wind
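
The verification statistics listed above (bias, standard deviation of the bias, RMSE, and a test for bias = 0) can be sketched for a single forecast/observation series; the numbers below are toy values, not AMU data:

```python
import math

def verification_stats(forecast, observed):
    """Bias (mean forecast-minus-observed error), standard deviation of
    the error, RMSE, and a one-sample t statistic for H0: bias = 0."""
    n = len(forecast)
    errors = [f - o for f, o in zip(forecast, observed)]
    bias = sum(errors) / n
    var = sum((e - bias) ** 2 for e in errors) / (n - 1)
    sd = math.sqrt(var)
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    t = bias / (sd / math.sqrt(n)) if sd > 0 else float("inf")
    return bias, sd, rmse, t

# Toy example: 2 m temperature forecasts vs. tower observations (deg C)
fcst = [25.1, 26.0, 24.8, 27.2, 25.5]
obs = [24.9, 25.4, 24.9, 26.8, 25.1]
print(verification_stats(fcst, obs))
```

The t statistic would then be compared against a Student-t critical value for n-1 degrees of freedom to decide whether the bias differs significantly from zero.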

  16. Relationship Between Column-Density and Surface Mixing Ratio: Statistical Analysis of O3 and NO2 Data from the July 2011 Maryland DISCOVER-AQ Mission

    Science.gov (United States)

    Flynn, Clare; Pickering, Kenneth E.; Crawford, James H.; Lamsal, Lok; Krotkov, Nickolay; Herman, Jay; Weinheimer, Andrew; Chen, Gao; Liu, Xiong; Szykman, James

    2014-01-01

    To investigate the ability of column (or partial column) information to represent surface air quality, results of linear regression analyses between surface mixing ratio data and column abundances for O3 and NO2 are presented for the July 2011 Maryland deployment of the DISCOVER-AQ mission. Data collected by the P-3B aircraft, ground-based Pandora spectrometers, Aura/OMI satellite instrument, and simulations for July 2011 from the CMAQ air quality model during this deployment provide a large and varied data set, allowing this problem to be approached from multiple perspectives. O3 columns typically exhibited a statistically significant and high degree of correlation with surface data (R(sup 2) > 0.64) in the P- 3B data set, a moderate degree of correlation (0.16 analysis.
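
A regression analysis of this kind reduces to ordinary least squares between column and surface values, with R² as the figure of merit. A minimal sketch (illustrative data, not DISCOVER-AQ values):

```python
def linear_fit_r2(x, y):
    """Ordinary least-squares fit y = intercept + slope * x,
    returning (slope, intercept, R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Illustrative column/surface pairs (arbitrary units, not mission data)
slope, intercept, r2 = linear_fit_r2([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.2, 7.8])
print(slope, intercept, r2)
```

An R² above about 0.64 in this framing corresponds to the "high degree of correlation" threshold quoted in the abstract.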

  17. Long-range corrected density functional theory with accelerated Hartree-Fock exchange integration using a two-Gaussian operator [LC-ωPBE(2Gau)].

    Science.gov (United States)

    Song, Jong-Won; Hirao, Kimihiko

    2015-10-14

    Since the advent of the hybrid functional in 1993, it has become a main quantum chemical tool for the calculation of energies and properties of molecular systems. Following the introduction of the long-range corrected hybrid scheme for density functional theory a decade later, the applicability of hybrid functionals has been further amplified due to the resulting improved performance for orbital energies, excitation energies, non-linear optical properties, barrier heights, and so on. Nevertheless, the high cost associated with the evaluation of Hartree-Fock (HF) exchange integrals remains a bottleneck for the broader and more active application of hybrid functionals to large molecular and periodic systems. Here, we propose a very simple yet efficient method for the computation of the long-range corrected hybrid scheme. It uses a modified two-Gaussian attenuating operator instead of the error function for the long-range HF exchange integral. As a result, the two-Gaussian HF operator, which mimics the shape of the error-function operator, reduces computational time dramatically (e.g., about 14 times acceleration in a C diamond calculation using periodic boundary conditions) and enables lower scaling with system size, while maintaining the improved features of long-range corrected density functional theory.

  18. Statistical analysis of fluorimeter operation

    International Nuclear Information System (INIS)

    Cutlip, L.B.

    1991-01-01

    Acceptance criteria for uranium check standards used to verify fluorimeter calibration have been developed. This work was done in response to Tiger Team finding QA/BMP-5, item 4. Data used as input to these calculations are retained in the Tiger Team closeout file, located in the Technical Service Division manager's office. 3 refs., 4 tabs

  19. Local-scale modelling of density-driven flow for the phases of repository operation and post-closure at Beberg

    International Nuclear Information System (INIS)

    Jaquet, O.; Siegel, P.

    2004-09-01

    A hydrogeological model was developed for Beberg with the aim of evaluating the impact of a repository (for the operational and post-closure phases) while accounting for the effects of density-driven flow. Two embedded scales were taken into account for this modelling study: a local scale, at which the granitic medium was considered as a continuum, and a repository scale, where the medium is fractured and therefore was regarded as discrete. The following step-wise approach was established to model density-driven flow at both the repository and local scales: (a) modelling fracture networks at the repository scale, (b) upscaling the hydraulic properties to a continuum at the local scale and (c) modelling density-driven flow to evaluate the repository impact at the local scale. The results demonstrate the strong impact of the repository on the flow field during the phase of operation. The distribution of the salt concentration is affected by a large upconing effect with increased relative concentration and by the presence of fracture zones carrying freshwater from the surface. The concentrations obtained for the reference case, expressed as percentages of the maximum (prescribed) value in the model, are as follows: ca 30% for the phase of desaturation, and ca 20% for the resaturation phase. For the reference case, the impact of repository operations is no longer visible after a resaturation period of about 20 years after repository closure; under resaturation conditions, evidence of the operational phase has already disappeared in terms of the observed hydraulic and concentration fields. Sensitivity calculations have proven the importance of explicitly discretising repository tunnels when assessing resaturation time and maximum concentration values. Furthermore, the definition of a fixed potential as boundary condition along the model's top surface is likely to provide underestimated values for the maximum concentration and overestimated flow rates in the

  20. Level densities

    International Nuclear Information System (INIS)

    Ignatyuk, A.V.

    1998-01-01

    For any applications of the statistical theory of nuclear reactions it is very important to obtain the parameters of the level density description from reliable experimental data. The cumulative numbers of low-lying levels and the average spacings between neutron resonances are usually used as such data. The level density parameters fitted to such data are compiled in the RIPL Starter File for the three models most frequently used in practical calculations: i) For the Gilbert-Cameron model the parameters of the Beijing group, based on a rather recent compilation of the neutron resonance and low-lying level densities and included in the beijing-gc.dat file, are chosen as recommended. As alternative versions the parameters provided by other groups are given in the files: jaeri-gc.dat, bombay-gc.dat, obninsk-gc.dat. Additionally the iljinov-gc.dat and mengoni-gc.dat files include sets of the level density parameters that take into account the damping of shell effects at high energies. ii) For the back-shifted Fermi gas model the beijing-bs.dat file is selected as the recommended one. Alternative parameters of the Obninsk group are given in the obninsk-bs.dat file and those of Bombay in bombay-bs.dat. iii) For the generalized superfluid model the Obninsk group parameters included in the obninsk-bcs.dat file are chosen as recommended ones and the beijing-bcs.dat file is included as an alternative set of parameters. iv) For the microscopic approach to the level densities the files are: obninsk-micro.for - FORTRAN 77 source for the microscopical statistical level density code developed in Obninsk by Ignatyuk and coworkers, moller-levels.gz - Moeller single-particle level and ground state deformation data base, moller-levels.for - retrieval code for the Moeller single-particle level scheme. (author)

  1. Characteristics of fatal abusive head trauma among children in the USA: 2003-2007: an application of the CDC operational case definition to national vital statistics data.

    Science.gov (United States)

    Parks, Sharyn E; Kegler, Scott R; Annest, Joseph L; Mercy, James A

    2012-06-01

    In March of 2008, an expert panel was convened at the Centers for Disease Control and Prevention to develop code-based case definitions for abusive head trauma (AHT) in children under 5 years of age based on the International Classification of Diseases, 10th Revision (ICD-10) nature and cause of injury codes. This study presents the operational case definition and applies it to US death data. National Center for Health Statistics National Vital Statistics System data on multiple cause-of-death from 2003 to 2007 were examined. Inspection of records with at least one ICD-10 injury/disease code and at least one ICD-10 cause code from the AHT case definition resulted in the identification of 780 fatal AHT cases, with 699 classified as definite/presumptive AHT and 81 classified as probable AHT. The fatal AHT rate was highest among the youngest children. The operational case definition can help to identify population subgroups at higher risk for AHT defined by year and month of death, age, sex and race/ethnicity. This type of definition may be useful for various epidemiological applications including research and surveillance. These activities can in turn inform further development of prevention activities, including educating parents about the dangers of shaking and strategies for managing infant crying.

  2. A NOVEL APPROACH TO SUPPORT MAJORITY VOTING IN SPATIAL GROUP MCDM USING DENSITY INDUCED OWA OPERATOR FOR SEISMIC VULNERABILITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    M. Moradi

    2014-10-01

    Being one of the most frightening disasters, earthquakes frequently cause huge damages to buildings, facilities and human beings. Although the prediction of characteristics of an earthquake seems to be impossible, its loss and damage is predictable in advance. Seismic loss estimation models tend to evaluate the extent to which the urban areas are vulnerable to earthquakes. Many factors contribute to the vulnerability of urban areas against earthquakes including age and height of buildings, the quality of the materials, the density of population and the location of flammable facilities. Therefore, seismic vulnerability assessment is a multi-criteria problem. A number of multi criteria decision making models have been proposed based on a single expert. The main objective of this paper is to propose a model which facilitates group multi criteria decision making based on the concept of majority voting. The main idea of majority voting is providing a computational tool to measure the degree to which different experts support each other’s opinions and make a decision regarding this measure. The applicability of this model is examined in Tehran metropolitan area which is located in a seismically active region. The results indicate that neglecting the experts which get lower degrees of support from others enables the decision makers to avoid the extreme strategies. Moreover, a computational method is proposed to calculate the degree of optimism in the experts’ opinions.

  3. a Novel Approach to Support Majority Voting in Spatial Group Mcdm Using Density Induced Owa Operator for Seismic Vulnerability Assessment

    Science.gov (United States)

    Moradi, M.; Delavar, M. R.; Moshiri, B.; Khamespanah, F.

    2014-10-01

    Being one of the most frightening disasters, earthquakes frequently cause huge damages to buildings, facilities and human beings. Although the prediction of characteristics of an earthquake seems to be impossible, its loss and damage is predictable in advance. Seismic loss estimation models tend to evaluate the extent to which the urban areas are vulnerable to earthquakes. Many factors contribute to the vulnerability of urban areas against earthquakes including age and height of buildings, the quality of the materials, the density of population and the location of flammable facilities. Therefore, seismic vulnerability assessment is a multi-criteria problem. A number of multi criteria decision making models have been proposed based on a single expert. The main objective of this paper is to propose a model which facilitates group multi criteria decision making based on the concept of majority voting. The main idea of majority voting is providing a computational tool to measure the degree to which different experts support each other's opinions and make a decision regarding this measure. The applicability of this model is examined in Tehran metropolitan area which is located in a seismically active region. The results indicate that neglecting the experts which get lower degrees of support from others enables the decision makers to avoid the extreme strategies. Moreover, a computational method is proposed to calculate the degree of optimism in the experts' opinions.
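
The ordered weighted averaging (OWA) aggregation underlying this approach weights the sorted expert scores rather than particular experts, which is how majority-leaning behavior is encoded. A minimal plain-OWA sketch (the paper's density-induced weighting is not reproduced); the scores and weights are hypothetical:

```python
def owa(values, weights):
    """Ordered weighted averaging: the weights apply to the values
    sorted in descending order, not to particular experts."""
    assert len(values) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# Vulnerability scores for one urban block from four experts (hypothetical)
scores = [0.9, 0.4, 0.6, 0.5]

# Majority-leaning weights emphasise the middle of the ordered scores,
# discounting the most extreme (highest and lowest) opinions
print(owa(scores, [0.1, 0.4, 0.4, 0.1]))
```

Shifting the weight mass toward the top of the ordered list makes the operator more optimistic; toward the bottom, more pessimistic, which is one way to quantify the degree of optimism mentioned in the abstract.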

  4. Can pre-operative contrast-enhanced dynamic MR imaging for prostate cancer predict microvessel density in prostatectomy specimens?

    Energy Technology Data Exchange (ETDEWEB)

    Schlemmer, Heinz-Peter [Department of Oncological Diagnostics and Therapy, German Cancer Research Center, University Hospital Mannheim, Ruprecht Karls University, Heidelberg (Germany); Department of Diagnostic Radiology, University Hospital Tuebingen, Hoppe-Seyler-Strasse 3, 72076, Tuebingen (Germany); Merkle, Jonas; Kaick, Gerhard van [Department of Oncological Diagnostics and Therapy, German Cancer Research Center, University Hospital Mannheim, Ruprecht Karls University, Heidelberg (Germany); Grobholz, Rainer [Department of Pathology, University Hospital Mannheim, Ruprecht Karls University, Heidelberg (Germany); Jaeger, Tim; Michel, Maurice Stephan [Department of Urology, University Hospital Mannheim, Ruprecht Karls University, Heidelberg (Germany); Werner, Axel; Rabe, Jan [Department of Diagnostic Radiology, University Hospital Mannheim, Ruprecht Karls University, Heidelberg (Germany)

    2004-02-01

    The aim of this study was to correlate quantitative dynamic contrast-enhanced MRI (DCE MRI) parameters with microvessel density (MVD) in prostate carcinoma. Twenty-eight patients with biopsy-proven prostate carcinoma were examined by endorectal MRI including multiplanar T2- and T1-weighted spin-echo and dynamic T1-weighted turbo-FLASH MRI during and after intravenous Gd-DTPA administration. Microvessels were stained on surgical specimens using a CD31 monoclonal antibody. The MVD was quantified in hot spots by counting (MVC) and determining the area fraction by morphometry (MVAF). The DCE MRI data were analyzed using an open pharmacokinetic two-compartment model. In corresponding anatomic locations the time shift (Δt) between the beginning of signal enhancement of cancer and adjacent normal prostatic tissue, the degree of contrast enhancement and the contrast exchange rate constant (k₂₁) were calculated. The MVC and MVAF were elevated in carcinoma (p<0.001 and p=0.002, respectively) and correlated to k₂₁ (r=0.62, p<0.001 and r=0.80, p<0.001, respectively). k₂₁ values of carcinoma were significantly higher compared with normal peripheral but not central zone tissue. Δt was longer in high-grade compared with low-grade tumors (p=0.025). The DCE MRI can provide important information about individual MVD in prostate cancer, which may be helpful for guiding biopsy and assessing individual prognosis. (orig.)

  5. Statistical modeling of Earth's plasmasphere

    Science.gov (United States)

    Veibell, Victoir

    The behavior of plasma near Earth's geosynchronous orbit is of vital importance to both satellite operators and magnetosphere modelers because it also has a significant influence on energy transport, ion composition, and induced currents. The system is highly complex in both time and space, making the forecasting of extreme space weather events difficult. This dissertation examines the behavior and statistical properties of plasma mass density near geosynchronous orbit by using both linear and nonlinear models, as well as epoch analyses, in an attempt to better understand the physical processes that precipitates and drives its variations. It is shown that while equatorial mass density does vary significantly on an hourly timescale when a drop in the disturbance time scale index ( Dst) was observed, it does not vary significantly between the day of a Dst event onset and the day immediately following. It is also shown that increases in equatorial mass density were not, on average, preceded or followed by any significant change in the examined solar wind or geomagnetic variables, including Dst, despite prior results that considered a few selected events and found a notable influence. It is verified that equatorial mass density and and solar activity via the F10.7 index have a strong correlation, which is stronger over longer timescales such as 27 days than it is over an hourly timescale. It is then shown that this connection seems to affect the behavior of equatorial mass density most during periods of strong solar activity leading to large mass density reactions to Dst drops for high values of F10.7. It is also shown that equatorial mass density behaves differently before and after events based on the value of F10.7 at the onset of an equatorial mass density event or a Dst event, and that a southward interplanetary magnetic field at onset leads to slowed mass density growth after event onset. 
These behavioral differences provide insight into how solar and geomagnetic
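The timescale dependence of the F10.7-density correlation described above can be illustrated with a small sketch. The series below are synthetic stand-ins for hourly F10.7 and equatorial mass density, not the dissertation's data; the point is only that averaging to the 27-day solar-rotation scale strengthens a correlation that short-timescale noise dilutes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hourly series for one year sharing a slow common variation.
n_hours = 24 * 365
trend = np.sin(np.linspace(0, 8 * np.pi, n_hours))       # shared slow variation
f107 = 120 + 30 * trend + rng.normal(0, 10, n_hours)     # solar flux proxy
density = 50 + 15 * trend + rng.normal(0, 20, n_hours)   # mass density proxy

def corr(x, y):
    return np.corrcoef(x, y)[0, 1]

# Hourly correlation is diluted by short-timescale noise ...
r_hourly = corr(f107, density)

# ... while averaging over 27-day (solar rotation) windows recovers the trend.
window = 24 * 27
n_win = n_hours // window
f107_27d = f107[:n_win * window].reshape(n_win, window).mean(axis=1)
dens_27d = density[:n_win * window].reshape(n_win, window).mean(axis=1)
r_27d = corr(f107_27d, dens_27d)

print(f"hourly r = {r_hourly:.2f}, 27-day r = {r_27d:.2f}")
```

With these noise levels the 27-day correlation comes out substantially higher than the hourly one, mirroring the qualitative finding in the abstract.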

  6. Multivariate statistical monitoring as applied to clean-in-place (CIP) and steam-in-place (SIP) operations in biopharmaceutical manufacturing.

    Science.gov (United States)

    Roy, Kevin; Undey, Cenk; Mistretta, Thomas; Naugle, Gregory; Sodhi, Manbir

    2014-01-01

    Multivariate statistical process monitoring (MSPM) is becoming increasingly utilized to further enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add to delays and corrective actions may require additional setup times. Moreover, this conventional approach does not take interactive effects of process variables into account and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations. © 2014 American Institute of Chemical Engineers.
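A minimal sketch of the kind of real-time MSPM described here, using PCA with Hotelling's T² and Q (SPE) statistics. The cycle data, variable count, and fault magnitude are illustrative assumptions, not plant data; industrial implementations would also set control limits from the reference distribution.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical CIP cycle summaries: 200 historical good cycles described by
# 5 highly correlated process variables (temperatures, flows, conductivity).
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 5))
X = latent @ loadings + 0.1 * rng.normal(size=(200, 5))

# Reference model: mean-center, then PCA via SVD, retaining k components.
mu = X.mean(axis=0)
Xc = X - mu
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
P = Vt[:k].T                             # loadings (5 x k)
score_var = (s[:k] ** 2) / (len(X) - 1)  # variance of each score

def monitor(x):
    """Hotelling's T^2 (score space) and Q/SPE (residual space) statistics."""
    xc = x - mu
    t = xc @ P
    t2 = float(np.sum(t ** 2 / score_var))
    resid = xc - t @ P.T
    return t2, float(resid @ resid)

t2_ok, q_ok = monitor(X[0])           # an in-model historical cycle
fault = mu.copy()
fault[2] += 20.0                      # simulated sensor excursion in one variable
t2_bad, q_bad = monitor(fault)
print(t2_ok + q_ok, t2_bad + q_bad)   # the faulty cycle flags clearly
```

Monitoring both T² and Q is the standard way to catch faults whether they appear inside or outside the correlation structure captured by the model.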

  7. Statistics of the thyroid pathology operated in the ORL-CCC service of the Hospital Rafael Angel Calderon Guardia, between 2008-2012

    International Nuclear Information System (INIS)

    Ramos Castro, Paula

    2013-01-01

The statistics of thyroid surgery in the Servicio de Otorrinolaringologia y Cirugia de Cabeza y Cuello of the Hospital Rafael Angel Calderon Guardia were described for the period from January 1, 2008 to December 31, 2012, compiling 200 cases of operated patients. Given the increase in the diagnosis of these diseases in Costa Rican hospitals and the lack of an instrument to evaluate therapeutic action, an evaluation of the surgical management of thyroid pathology was needed. Through a data collection table and a systematic review of the patients' records, the results were presented according to the following inclusion criteria: patients older than 12 years; diagnosis of benign or malignant thyroid pathology requiring surgical resolution; a fine-needle aspiration biopsy (FNAB) report in the medical record; presentation of the case in the interdisciplinary thyroid session; surgery or surgeries performed between January 2008 and December 2012; and a written report of the surgery and a defined biopsy in the medical record. The exclusion criteria were: patients without an FNAB report, operative sheet, and/or defined biopsy in the medical record, and patients whose medical record could not be located in the hospital to collect the information. (author) [es

  8. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

This book consists of 15 chapters covering: the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, statistical dynamics of independent particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and nonideal lattice models, imperfect gas theory of liquids, the theory of solutions, statistical thermodynamics of interfaces, statistical thermodynamics of high-molecular (polymer) systems, and quantum statistics

  9. Quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C.

    2010-01-01

Quantum mechanics can emerge from classical statistics. A typical quantum system describes an isolated subsystem of a classical statistical ensemble with infinitely many classical states. The state of this subsystem can be characterized by only a few probabilistic observables. Their expectation values define a density matrix if they obey a 'purity constraint'. Then all the usual laws of quantum mechanics follow, including Heisenberg's uncertainty relation, entanglement and a violation of Bell's inequalities. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. Born's rule for quantum mechanical probabilities follows from the probability concept for a classical statistical ensemble. In particular, we show how the non-commuting properties of quantum operators are associated with the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem. As an illustration, we discuss a classical statistical implementation of a quantum computer.
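The purity constraint can be made concrete for a two-level subsystem, where three expectation values r = (r_x, r_y, r_z) define a density matrix ρ = (1 + r·σ)/2. The sketch below uses only standard density-matrix algebra (not the paper's classical-ensemble construction): |r| = 1 yields a pure state with ρ² = ρ, while |r| < 1 gives a mixed state with Tr(ρ²) = (1 + |r|²)/2 < 1.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def density_matrix(r):
    """Density matrix of a two-level subsystem built from the expectation
    values r = (r_x, r_y, r_z) of three probabilistic observables."""
    r = np.asarray(r, dtype=float)
    return 0.5 * (np.eye(2) + r[0] * sx + r[1] * sy + r[2] * sz)

# Observables satisfying the purity constraint |r| = 1 give a pure state ...
rho_pure = density_matrix([0.6, 0.0, 0.8])
# ... while |r| < 1 gives a mixed state.
rho_mixed = density_matrix([0.3, 0.0, 0.4])

print(np.trace(rho_pure @ rho_pure).real)    # 1.0
print(np.trace(rho_mixed @ rho_mixed).real)  # 0.625 = (1 + 0.25) / 2
```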

  10. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research

  11. Cancer Statistics

    Science.gov (United States)

    ... What Is Cancer? Cancer Statistics Cancer Disparities Cancer Statistics Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  12. Few-particle quantum dynamics–comparing nonequilibrium Green functions with the generalized Kadanoff–Baym ansatz to density operator theory

    International Nuclear Information System (INIS)

    Hermanns, S; Bonitz, M; Balzer, K

    2013-01-01

The nonequilibrium description of quantum systems requires, for more than two or three particles, a reduced description to be numerically tractable. Two possible approaches are based on either reduced density matrices or nonequilibrium Green functions (NEGF). Both concepts are formulated in terms of hierarchies of coupled equations: the Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY) hierarchy for the reduced density operators and the Martin-Schwinger (MS) hierarchy for the Green functions, respectively. In both cases similar approximations are introduced to decouple the hierarchy, yet many questions regarding the correspondence of the two approaches remain open. Here we analyze this correspondence by studying the generalized Kadanoff-Baym ansatz (GKBA) that reduces the NEGF to a single-time theory. Starting from the BBGKY hierarchy we present the approximations that are necessary to recover the GKBA result, both with Hartree-Fock propagators (HF-GKBA) and with propagators in the second Born approximation. To test the quality of the HF-GKBA, we study the dynamics of a 4-electron Hubbard nanocluster starting from a strong nonequilibrium initial state and compare with exact results and with the Wang-Cassing approximation to the BBGKY hierarchy presented recently by Akbari et al. [1].

  13. The SENSE-Isomorphism Theoretical Image Voxel Estimation (SENSE-ITIVE) Model for Reconstruction and Observing Statistical Properties of Reconstruction Operators

    Science.gov (United States)

    Bruce, Iain P.; Karaman, M. Muge; Rowe, Daniel B.

    2012-01-01

The acquisition of sub-sampled data from an array of receiver coils has become a common means of reducing data acquisition time in MRI. Of the various techniques used in parallel MRI, SENSitivity Encoding (SENSE) is one of the most common, making use of a complex-valued weighted least squares estimation to unfold the aliased images. It was recently shown in Bruce et al. [Magn. Reson. Imag. 29(2011):1267–1287] that when the SENSE model is represented in terms of a real-valued isomorphism, it assumes a skew-symmetric covariance between receiver coils, as well as an identity covariance structure between voxels. In this manuscript, we show that not only is the skew-symmetric coil covariance unlike that of real data, but the estimated covariance structure between voxels over a time series of experimental data is not an identity matrix. As such, a new model, entitled SENSE-ITIVE, is described with both revised coil and voxel covariance structures. Both the SENSE and SENSE-ITIVE models are represented in terms of real-valued isomorphisms, allowing a statistical analysis of the reconstructed voxel means, variances, and correlations that result from the different coil and voxel covariance structures used in the reconstruction process. It is shown through both theoretical and experimental illustrations that the misspecification of the coil and voxel covariance structures in the SENSE model results in a lower standard deviation in each voxel of the reconstructed images, and thus an artificial increase in SNR, compared to the standard deviation and SNR of the SENSE-ITIVE model, where both the coil and voxel covariances are appropriately accounted for. It is also shown that there are differences in the correlations induced by the reconstruction operations of both models, and consequently there are differences in the correlations estimated throughout the course of reconstructed time series. These differences in correlations could result in meaningful
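The complex-valued weighted least squares unfolding at the heart of SENSE can be sketched as follows. The coil sensitivities here are random synthetic values, and the coil noise covariance is taken as the identity for simplicity; the SENSE-ITIVE argument is precisely that such covariance assumptions matter in practice.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: 4 receiver coils, reduction factor R = 2, so each
# aliased voxel is a sensitivity-weighted sum of 2 true voxels.
n_coils, R = 4, 2
S = rng.normal(size=(n_coils, R)) + 1j * rng.normal(size=(n_coils, R))  # sensitivities
v_true = np.array([1.0 + 0.5j, -0.3 + 2.0j])                            # true voxel values

Psi = np.eye(n_coils)          # coil noise covariance (identity, for simplicity)
a = S @ v_true                 # aliased measurement seen by each coil (noiseless)

# Weighted least squares unfolding: v = (S^H Psi^-1 S)^-1 S^H Psi^-1 a
W = np.linalg.inv(Psi)
v_hat = np.linalg.solve(S.conj().T @ W @ S, S.conj().T @ W @ a)

print(np.allclose(v_hat, v_true))  # noiseless case recovers the voxels exactly
```

Swapping in a non-identity Psi (and, in the SENSE-ITIVE spirit, a non-identity voxel covariance) changes the weighting and hence the noise statistics of the reconstructed voxels.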

  14. Statistical separability and the impossibility of the superluminal quantum communication

    International Nuclear Information System (INIS)

    Zhang Qiren

    2004-01-01

The authors analyse the relation and the difference between the quantum correlation of two points in space and the communication between them. The statistical separability of two points in space is defined and proven. From this statistical separability, the authors prove that superluminal quantum communication between different points is impossible. To emphasize the compatibility between quantum theory and relativity, the authors write the von Neumann equation for density operator evolution in multi-time form. (author)
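The impossibility argument rests on the fact that local statistics are insensitive to operations performed elsewhere: in density-operator language, the reduced state of one subsystem is unchanged by any unitary applied to the other. A small numerical check using the standard quantum formalism (not the paper's multi-time equations) for an entangled two-qubit state:

```python
import numpy as np

rng = np.random.default_rng(3)

def reduced_A(rho, dA=2, dB=2):
    """Reduced density operator of subsystem A (trace out B)."""
    return np.trace(rho.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

# A maximally entangled two-qubit state (Bell state).
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Any local operation on B: here a random unitary acting as I (x) U_B.
H = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
U_B, _ = np.linalg.qr(H)
U = np.kron(np.eye(2), U_B)
rho_after = U @ rho @ U.conj().T

# A's local statistics are untouched: no superluminal signal can be encoded.
print(np.allclose(reduced_A(rho), reduced_A(rho_after)))  # True
```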

  15. Statistical prediction of seasonal discharge in Central Asia for water resources management: development of a generic (pre-)operational modeling tool

    Science.gov (United States)

    Apel, Heiko; Baimaganbetov, Azamat; Kalashnikova, Olga; Gavrilenko, Nadejda; Abdykerimova, Zharkinay; Agalhanova, Marina; Gerlitz, Lars; Unger-Shayesteh, Katy; Vorogushyn, Sergiy; Gafurov, Abror

    2017-04-01

    The semi-arid regions of Central Asia crucially depend on the water resources supplied by the mountainous areas of the Tien-Shan and Pamirs. During the summer months the snow and glacier melt dominated river discharge originating in the mountains provides the main water resource available for agricultural production, but also for storage in reservoirs for energy generation during the winter months. Thus a reliable seasonal forecast of the water resources is crucial for a sustainable management and planning of water resources. In fact, seasonal forecasts are mandatory tasks of all national hydro-meteorological services in the region. In order to support the operational seasonal forecast procedures of hydromet services, this study aims at the development of a generic tool for deriving statistical forecast models of seasonal river discharge. The generic model is kept as simple as possible in order to be driven by available hydrological and meteorological data, and be applicable for all catchments with their often limited data availability in the region. As snowmelt dominates summer runoff, the main meteorological predictors for the forecast models are monthly values of winter precipitation and temperature as recorded by climatological stations in the catchments. These data sets are accompanied by snow cover predictors derived from the operational ModSnow tool, which provides cloud free snow cover data for the selected catchments based on MODIS satellite images. In addition to the meteorological data antecedent streamflow is used as a predictor variable. This basic predictor set was further extended by multi-monthly means of the individual predictors, as well as composites of the predictors. Forecast models are derived based on these predictors as linear combinations of up to 3 or 4 predictors. 
A user selectable number of best models according to pre-defined performance criteria is extracted automatically by the developed model fitting algorithm, which includes a test
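The model-screening step described above, building linear combinations of up to three or four predictors and ranking them by leave-one-out performance, can be sketched as follows. The predictor records and coefficients are synthetic stand-ins; the actual tool's predictor set and performance criteria may differ.

```python
import itertools
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical records for one catchment: 20 years of candidate predictors
# (winter precipitation, winter temperature, snow cover, antecedent discharge)
# and the observed seasonal discharge.
n_years = 20
X = rng.normal(size=(n_years, 4))
names = ["precip", "temp", "snow", "q_antecedent"]
y = 2.0 * X[:, 0] + 1.0 * X[:, 2] + rng.normal(0, 0.5, n_years)

def loocv_rmse(cols):
    """Leave-one-out RMSE of a linear model on the given predictor columns."""
    errs = []
    for i in range(n_years):
        train = np.delete(np.arange(n_years), i)
        A = np.column_stack([np.ones(len(train)), X[train][:, cols]])
        coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        a_i = np.concatenate([[1.0], X[i, cols]])
        errs.append(y[i] - a_i @ coef)
    return float(np.sqrt(np.mean(np.square(errs))))

# Screen all linear combinations of up to 3 predictors, rank by robustness.
results = []
for k in (1, 2, 3):
    for cols in itertools.combinations(range(4), k):
        results.append((loocv_rmse(list(cols)), [names[c] for c in cols]))
results.sort()
print(results[0])  # best model by leave-one-out RMSE
```

Ranking by cross-validated rather than in-sample error is what makes the automatic extraction of "best models" robust against overfitting with so few years of data.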

  16. Statistical symmetries in physics

    International Nuclear Information System (INIS)

    Green, H.S.; Adelaide Univ., SA

    1994-01-01

Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of ggl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs

  17. Usage Statistics

    Science.gov (United States)

    ... this page: https://medlineplus.gov/usestatistics.html MedlinePlus Statistics To use the sharing features on this page, ... By Quarter View image full size Quarterly User Statistics Quarter Page Views Unique Visitors Oct-Dec-98 ...

  18. Physics of increased edge electron temperature and density turbulence during ELM-free QH-mode operation on DIII-D

    Science.gov (United States)

    Sung, C.; Rhodes, T. L.; Staebler, G. M.; Yan, Z.; McKee, G. R.; Smith, S. P.; Osborne, T. H.; Peebles, W. A.

    2018-05-01

For the first time, we report increased edge electron temperature and density turbulence levels (T˜e and n˜e) in Edge Localized Mode free Quiescent H-mode (ELM-free QH-mode) plasmas as compared to the ELMing time period. ELMs can severely damage plasma-facing components in fusion devices through their large transient energy transport, making ELM-free operation a highly sought-after goal. The QH-mode is a candidate for this goal, as it is ELM-free for times limited only by hardware constraints. It is found that the driving gradients decrease during the QH-mode compared to the ELMing phase; however, a significant decrease in the ExB shearing rate is also observed, and taken together these changes are consistent with the increased turbulence. These results are significant because the prediction and control of ELM-free H-mode regimes are crucial for the operation of future fusion devices such as ITER. The changes in the linear growth rates calculated by CGYRO [Candy et al., J. Comput. Phys. 324, 73 (2016)] and the measured ExB shearing rate between the ELMing and QH-mode phases are qualitatively consistent with these turbulence changes. A comparison with 3D-field ELM-suppressed H-mode finds a similar increase in T˜e and n˜e over the ELMing phase, but with a distinctly different origin: in the 3D-field ELM-suppressed H-mode it is the increased driving gradients, rather than changes in the ExB shearing rate, that drive the turbulence. Linear gyrokinetic calculation results are nevertheless generally consistent with the increased turbulence in both ELM-controlled discharges.

  19. Frog Statistics

    Science.gov (United States)

Web access statistics (wwwstats output) for the Whole Frog Project and Virtual Frog Dissection pages; the counts exclude duplicate or extraneous accesses and under-represent the total bytes requested.

  20. Statistical forecast of seasonal discharge in Central Asia using observational records: development of a generic linear modelling tool for operational water resource management

    Directory of Open Access Journals (Sweden)

    H. Apel

    2018-04-01

    Full Text Available The semi-arid regions of Central Asia crucially depend on the water resources supplied by the mountainous areas of the Tien Shan and Pamir and Altai mountains. During the summer months the snow-melt- and glacier-melt-dominated river discharge originating in the mountains provides the main water resource available for agricultural production, but also for storage in reservoirs for energy generation during the winter months. Thus a reliable seasonal forecast of the water resources is crucial for sustainable management and planning of water resources. In fact, seasonal forecasts are mandatory tasks of all national hydro-meteorological services in the region. In order to support the operational seasonal forecast procedures of hydro-meteorological services, this study aims to develop a generic tool for deriving statistical forecast models of seasonal river discharge based solely on observational records. The generic model structure is kept as simple as possible in order to be driven by meteorological and hydrological data readily available at the hydro-meteorological services, and to be applicable for all catchments in the region. As snow melt dominates summer runoff, the main meteorological predictors for the forecast models are monthly values of winter precipitation and temperature, satellite-based snow cover data, and antecedent discharge. This basic predictor set was further extended by multi-monthly means of the individual predictors, as well as composites of the predictors. Forecast models are derived based on these predictors as linear combinations of up to four predictors. A user-selectable number of the best models is extracted automatically by the developed model fitting algorithm, which includes a test for robustness by a leave-one-out cross-validation. Based on the cross-validation the predictive uncertainty was quantified for every prediction model. Forecasts of the mean seasonal discharge of the period April to September are derived
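The quantification of predictive uncertainty from the cross-validation can be sketched as an empirical interval built from leave-one-out residuals. The data below are synthetic stand-ins for one catchment's predictor and observed April-September discharge; the actual tool's uncertainty criteria may differ.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stand-in: 20 years of a single composite predictor and the
# observed mean seasonal discharge.
n = 20
x = rng.normal(size=n)
y = 3.0 * x + 10.0 + rng.normal(0, 1.0, n)

# Leave-one-out residuals of the chosen linear prediction model.
resid = []
for i in range(n):
    keep = np.delete(np.arange(n), i)
    coef = np.polyfit(x[keep], y[keep], 1)
    resid.append(y[i] - np.polyval(coef, x[i]))
resid = np.array(resid)

# Predictive uncertainty: an empirical 80% interval from the LOO residuals,
# attached to the forecast for a new predictor value.
coef_all = np.polyfit(x, y, 1)
x_new = 0.5
forecast = np.polyval(coef_all, x_new)
lo, hi = forecast + np.quantile(resid, [0.1, 0.9])
print(f"forecast {forecast:.1f}, 80% interval [{lo:.1f}, {hi:.1f}]")
```

Because the residuals come from held-out years, the interval reflects genuine out-of-sample forecast error rather than in-sample fit.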

  1. Statistical forecast of seasonal discharge in Central Asia using observational records: development of a generic linear modelling tool for operational water resource management

    Science.gov (United States)

    Apel, Heiko; Abdykerimova, Zharkinay; Agalhanova, Marina; Baimaganbetov, Azamat; Gavrilenko, Nadejda; Gerlitz, Lars; Kalashnikova, Olga; Unger-Shayesteh, Katy; Vorogushyn, Sergiy; Gafurov, Abror

    2018-04-01

    The semi-arid regions of Central Asia crucially depend on the water resources supplied by the mountainous areas of the Tien Shan and Pamir and Altai mountains. During the summer months the snow-melt- and glacier-melt-dominated river discharge originating in the mountains provides the main water resource available for agricultural production, but also for storage in reservoirs for energy generation during the winter months. Thus a reliable seasonal forecast of the water resources is crucial for sustainable management and planning of water resources. In fact, seasonal forecasts are mandatory tasks of all national hydro-meteorological services in the region. In order to support the operational seasonal forecast procedures of hydro-meteorological services, this study aims to develop a generic tool for deriving statistical forecast models of seasonal river discharge based solely on observational records. The generic model structure is kept as simple as possible in order to be driven by meteorological and hydrological data readily available at the hydro-meteorological services, and to be applicable for all catchments in the region. As snow melt dominates summer runoff, the main meteorological predictors for the forecast models are monthly values of winter precipitation and temperature, satellite-based snow cover data, and antecedent discharge. This basic predictor set was further extended by multi-monthly means of the individual predictors, as well as composites of the predictors. Forecast models are derived based on these predictors as linear combinations of up to four predictors. A user-selectable number of the best models is extracted automatically by the developed model fitting algorithm, which includes a test for robustness by a leave-one-out cross-validation. Based on the cross-validation the predictive uncertainty was quantified for every prediction model. Forecasts of the mean seasonal discharge of the period April to September are derived every month from

  2. Statistical mechanics of anyons

    International Nuclear Information System (INIS)

    Arovas, D.P.

    1985-01-01

We study the statistical mechanics of a two-dimensional gas of free anyons - particles which interpolate between Bose-Einstein and Fermi-Dirac character. Thermodynamic quantities are discussed in the low-density regime. In particular, the second virial coefficient is evaluated by two different methods and is found to exhibit a simple, periodic, but nonanalytic behavior as a function of the statistics-determining parameter. (orig.)
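For reference, the second virial coefficient from this line of work is usually quoted in the closed form B2 = -(λ²/4)(1 - 4δ + 2δ²), with δ the statistics-determining parameter (δ = 0 bosons, δ = 1 fermions) and λ the thermal wavelength; the mod-2 periodic continuation coded below is an assumption about conventions made for this sketch.

```python
import numpy as np

def b2_anyon(delta, lam=1.0):
    """Second virial coefficient of the free anyon gas in the commonly
    quoted form B2 = -(lam^2/4) * (1 - 4 d + 2 d^2), with d = |delta|
    reduced mod 2; d = 0 is bosonic, d = 1 fermionic. The periodic
    reduction is this sketch's assumption about conventions."""
    d = np.abs(delta) % 2.0
    return -(lam ** 2 / 4.0) * (1.0 - 4.0 * d + 2.0 * d ** 2)

print(b2_anyon(0.0), b2_anyon(1.0))    # bosonic -0.25, fermionic +0.25
# Periodic but nonanalytic: the slope jumps at the bosonic points d = 0, 2, ...
print(b2_anyon(0.0) == b2_anyon(2.0))  # True
```

The formula smoothly interpolates between the ideal Bose and Fermi values while remaining periodic, with cusps at the bosonic points, matching the "simple, periodic, but nonanalytic" behavior described in the abstract.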

  3. Statistical Analysis and Time Series Modeling of Air Traffic Operations Data From Flight Service Stations and Terminal Radar Approach Control Facilities : Two Case Studies

    Science.gov (United States)

    1981-10-01

    Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...

  4. Statistical Engineering in Air Traffic Management Research

    Science.gov (United States)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  5. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  6. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  7. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
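A harmonic Poisson process is straightforward to simulate: on an interval [a, b] the number of points is Poisson with mean c·ln(b/a), and given that count the points are i.i.d. log-uniform on [a, b]. The scale invariance highlighted in the abstract then shows up as a mean count that depends only on the ratio b/a, not on where the interval sits. This is an illustrative sketch, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

def harmonic_poisson(a, b, c=1.0):
    """Sample one realization of a harmonic Poisson process (intensity c/x)
    restricted to [a, b]: Poisson count with mean c*ln(b/a), then i.i.d.
    log-uniform positions via inverse-CDF sampling of the density ~ 1/x."""
    n = rng.poisson(c * np.log(b / a))
    u = rng.random(n)
    return a * (b / a) ** u

# Scale invariance: the mean count on [a, 10a] is c*ln(10) regardless of a.
counts_low = [len(harmonic_poisson(1.0, 10.0, c=5.0)) for _ in range(2000)]
counts_high = [len(harmonic_poisson(100.0, 1000.0, c=5.0)) for _ in range(2000)]
print(np.mean(counts_low), np.mean(counts_high))  # both near 5*ln(10)
```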

  8. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  9. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  10. Histoplasmosis Statistics

    Science.gov (United States)

    ... Testing Treatment & Outcomes Health Professionals Statistics More Resources Candidiasis Candida infections of the mouth, throat, and esophagus Vaginal candidiasis Invasive candidiasis Definition Symptoms Risk & Prevention Sources Diagnosis ...

  11. Thermodynamics for Fractal Statistics

    OpenAIRE

    da Cruz, Wellington

    1998-01-01

We consider the thermodynamic properties of an anyon gas, taking into account the fractal statistics obtained by us recently. This approach describes the anyonic excitations in terms of equivalence classes labeled by a fractal parameter or Hausdorff dimension $h$. An exact equation of state is obtained in the high-temperature and low-temperature limits for gases with a constant density of states.

  12. Statistics of Local Extremes

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Bierbooms, W.; Hansen, Kurt Schaldemose

    2003-01-01

A theoretical expression for the probability density function associated with local extremes of a stochastic process is presented. The expression is based on the lower four statistical moments and a bandwidth parameter. The theoretical expression is subsequently verified by comparison with simulated

  13. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  14. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  15. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  16. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  17. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
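The measures of location and spread mentioned above can be illustrated with Python's standard library; the sample values below are invented for illustration:

```python
import statistics

# A small, invented sample of measurements.
data = [4.0, 8.0, 6.0, 5.0, 7.0]

# Measures of location.
mean = statistics.mean(data)
median = statistics.median(data)

# Measures of spread: sample variance (divides by n - 1) and
# its square root, the sample standard deviation.
variance = statistics.variance(data)
sd = statistics.stdev(data)

print(mean, median, variance, round(sd, 3))
```

For this sample the mean and median coincide at 6.0; real data are rarely so symmetric, which is why both measures are reported.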

  18. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  19. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  20. Design of durability test protocol for vehicular fuel cell systems operated in power-follow mode based on statistical results of on-road data

    Science.gov (United States)

    Xu, Liangfei; Reimer, Uwe; Li, Jianqiu; Huang, Haiyan; Hu, Zunyan; Jiang, Hongliang; Janßen, Holger; Ouyang, Minggao; Lehnert, Werner

    2018-02-01

    City buses using polymer electrolyte membrane (PEM) fuel cells are considered to be the most likely fuel cell vehicles to be commercialized in China. The technical specifications of the fuel cell systems (FCSs) these buses are equipped with will differ based on the powertrain configurations and vehicle control strategies, but can generally be classified into the power-follow and soft-run modes. Each mode imposes different levels of electrochemical stress on the fuel cells. Evaluating the aging behavior of fuel cell stacks under the conditions encountered in fuel cell buses requires new durability test protocols based on statistical results obtained during actual driving tests. In this study, we propose a systematic design method for fuel cell durability test protocols that correspond to the power-follow mode based on three parameters for different fuel cell load ranges. The powertrain configurations and control strategy are described herein, followed by a presentation of the statistical data for the duty cycles of FCSs in one city bus in the demonstration project. Assessment protocols are presented based on the statistical results using mathematical optimization methods, and are compared to existing protocols with respect to common factors, such as time at open circuit voltage and root-mean-square power.
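As a rough illustration of the duty-cycle statistics such a protocol design draws on, the sketch below computes the fraction of time at open circuit and the root-mean-square power from a power trace. The trace, the 1 Hz sampling rate, and the convention that zero net power means open circuit voltage are assumptions for illustration, not values from the study:

```python
import math

# Hypothetical 1 Hz power trace (kW); zero-power samples are taken
# as time spent at open circuit voltage (OCV).
power = [0.0, 12.0, 30.0, 0.0, 45.0, 30.0, 0.0, 12.0]
dt = 1.0  # sampling period, seconds

# Fraction of cycle time spent at OCV.
time_at_ocv = sum(dt for p in power if p == 0.0)
ocv_fraction = time_at_ocv / (len(power) * dt)

# Root-mean-square power over the cycle.
rms_power = math.sqrt(sum(p * p for p in power) / len(power))

print(ocv_fraction, round(rms_power, 2))
```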

  1. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
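One simple statistic of the kind described is the Shannon entropy of the pixel-value histogram; the abstract does not say which statistics were used, so this is only an illustrative candidate:

```python
import math
from collections import Counter

def histogram_entropy(pixels):
    """Shannon entropy (bits) of a flat list of pixel values; one
    candidate statistic for ranking images by information content."""
    n = len(pixels)
    counts = Counter(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A flat image carries no histogram information; a varied one scores higher.
flat = [128] * 64
varied = list(range(64))
print(histogram_entropy(flat), histogram_entropy(varied))
```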

  2. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  3. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  4. WPRDC Statistics

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.

  5. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics which until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  6. Gonorrhea Statistics

    Science.gov (United States)


  7. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  8. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, the physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).

  9. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  10. Low Density Supersonic Decelerators

    Data.gov (United States)

    National Aeronautics and Space Administration — The Low-Density Supersonic Decelerator project will demonstrate the use of inflatable structures and advanced parachutes that operate at supersonic speeds to more...

  11. E. coli inactivation by visible light irradiation using a Fe–Cd/TiO2 photocatalyst: Statistical analysis and optimization of operating parameters

    NARCIS (Netherlands)

    Feilizadeh, Mehrzad; Mul, Guido; Vossoughi, M.

    2015-01-01

    In this study, the antibacterial effect of a Fe and Cd co-doped TiO2 (Fe–Cd/TiO2) visible light sensitive photocatalyst was optimized by varying operating parameters and using a response surface methodology to evaluate the experimental data. Twenty sets of disinfection experiments were conducted by

  12. A generalized operational formula based on total electronic densities to obtain 3D pictures of the dual descriptor to reveal nucleophilic and electrophilic sites accurately on closed-shell molecules.

    Science.gov (United States)

    Martínez-Araya, Jorge I

    2016-09-30

    By means of conceptual density functional theory, the so-called dual descriptor (DD) has been adapted for use in any closed-shell molecule that presents degeneracy in its frontier molecular orbitals. The latter is of paramount importance because a correct description of local reactivity allows prediction of the most favorable sites on a molecule for nucleophilic or electrophilic attack; on the contrary, an incomplete description of local reactivity might have serious consequences, particularly for experimental chemists who need insight into the reactivity of chemical reagents before using them in a synthesis to obtain a new compound. In the present work, the old approach based only on the electronic densities of frontier molecular orbitals is replaced by a more accurate procedure that uses total electronic densities, thus keeping consistency with the essential principle of DFT, in which the electronic density, not the molecular orbitals, is the fundamental variable. As a result, the DD is able to properly describe local reactivities solely in terms of total electronic densities. To test the proposed operational formula, 12 very common molecules were selected, as the original definition of the DD was not able to describe their local reactivities properly. The ethylene molecule was additionally used to test the capability of the proposed operational formula to reveal correct local reactivity even in the absence of degeneracy in the frontier molecular orbitals. © 2016 Wiley Periodicals, Inc.

  13. Emergence of quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C

    2009-01-01

    The conceptual setting of quantum mechanics is subject to an ongoing debate from its beginnings until now. The consequences of the apparent differences between quantum statistics and classical statistics range from philosophical interpretations to practical issues such as quantum computing. In this note we demonstrate how quantum mechanics can emerge from classical statistical systems. We discuss conditions and circumstances for this to happen. Quantum systems describe isolated subsystems of classical statistical systems with infinitely many states. While infinitely many classical observables 'measure' properties of the subsystem and its environment, the state of the subsystem can be characterized by the expectation values of only a few probabilistic observables. They define a density matrix, and all the usual laws of quantum mechanics follow. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. In particular, we show how the non-commuting properties of quantum operators are associated with the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem.

  14. [Operative treatment of diabetics with vascular complications : Secondary data analysis of diagnosis-related groups statistics from 2005 to 2014 in Germany].

    Science.gov (United States)

    Olm, M; Kühnl, A; Knipfer, E; Salvermoser, M; Eckstein, H-H; Zimmermann, A

    2018-03-27

    In Germany approximately 40,000 amputations per year are performed on patients with diabetes mellitus, often with accompanying vascular complications. The aim of this study was to present the various degrees of severity of the vascular complications and the temporal changes in the treatment options for diabetics with vascular complications in Germany. The microdata of the diagnosis-related groups (DRG) statistics of the Federal Statistical Office were analyzed over the period from 2005 to 2014. All cases were included in which the main or secondary diagnosis of diabetes mellitus with concurrent vascular complications (diabetic angiopathy and peripheral arterial disease) was coded. The median age of the 1,811,422 cases was 73 years and 62% were male. While the total number of amputations remained stable over time, there was a 41% reduction in knee-preserving and a 31% reduction in non-knee-preserving major amputations, with an 18% increase in minor amputations. Revascularizations increased by 33%, from 36 procedures in 2005 to 48 procedures per 100,000 inhabitants. This increase was evident in endovascular therapy alone, which rose by 78%. Due to the significant increase in endovascular revascularization measures, there was a significant increase in the proportion of diabetes patients with vascular pathologies in whom revascularization was carried out. As a result, improved limb preservation was achieved despite consistently high overall amputation rates, owing to the shift towards minor amputations.

  15. Multi-cluster processor operating only select number of clusters during each phase based on program statistic monitored at predetermined intervals

    Science.gov (United States)

    Balasubramonian, Rajeev [Sandy, UT; Dwarkadas, Sandhya [Rochester, NY; Albonesi, David [Ithaca, NY

    2009-02-10

    In a processor having multiple clusters which operate in parallel, the number of clusters in use can be varied dynamically. At the start of each program phase, the configuration option for an interval is run to determine the optimal configuration, which is used until the next phase change is detected. The optimum instruction interval is determined by starting with a minimum interval and doubling it until a low stability factor is reached.
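The interval search described above can be sketched as follows; the function names, interval bounds, and stability threshold are illustrative assumptions, not values from the patent:

```python
def choose_interval(stability, min_interval=1000, max_interval=1_000_000,
                    threshold=0.05):
    """Start from the minimum instruction interval and double it until
    the measured stability factor falls below the threshold.  All names
    and the threshold value are illustrative, not from the patent."""
    interval = min_interval
    while interval < max_interval and stability(interval) >= threshold:
        interval *= 2
    return interval

# Hypothetical stability measurement that decays with interval length:
# longer intervals smooth out program-phase noise.
print(choose_interval(lambda n: 100.0 / n))
```

The max_interval guard keeps the search bounded even if the stability factor never drops below the threshold.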

  16. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical Mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination systems of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly and step by step. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena (thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  17. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  18. Correlation Analysis of Rainstorm Runoff and Density Current in a Canyon-Shaped Source Water Reservoir: Implications for Reservoir Optimal Operation

    Directory of Open Access Journals (Sweden)

    Yang Li

    2018-04-01

    Full Text Available Extreme weather has recently become frequent. Heavy rainfall forms storm runoff, which is usually very turbid and contains a high concentration of organic matter, thereby affecting water quality when it enters reservoirs. The large canyon-shaped Heihe Reservoir is the most important raw water source for the city of Xi’an. During the flood season, storm runoff flows into the reservoir as a density current. We determined the relationships among inflow peak discharge (Q), suspended sediment concentration, inflow water temperature, and undercurrent water density. The relationship between Q and inflow suspended sediment concentration (CS0) could be described by the equation CS0 = 0.3899 × e^(0.0025Q); that between CS0 and the suspended sediment concentration at the entrance of the main reservoir area S1 (CS1) was determined using CS1 = 0.0346 × e^(0.2335·CS0); and air temperature (Ta) and inflow water temperature (Tw), based on the meteorological data, were related as follows: Tw = 0.7718 × Ta + 1.0979. We then calculated the density of the undercurrent layer. Compared to the vertical water density distribution at S1 before rainfall, the undercurrent elevation was determined based on the principle of equivalent density inflow. Based on our results, we propose schemes for optimizing water intake selection and flood discharge during the flood season.
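The three fitted relations reported in the abstract can be evaluated directly; the example discharge and air temperature below are illustrative values, not taken from the study:

```python
import math

def inflow_sediment(q):
    """CS0 = 0.3899 * exp(0.0025 * Q): inflow suspended sediment
    concentration as a function of inflow peak discharge Q."""
    return 0.3899 * math.exp(0.0025 * q)

def sediment_at_s1(cs0):
    """CS1 = 0.0346 * exp(0.2335 * CS0): suspended sediment concentration
    at the entrance of the main reservoir area S1."""
    return 0.0346 * math.exp(0.2335 * cs0)

def inflow_temperature(ta):
    """Tw = 0.7718 * Ta + 1.0979: inflow water temperature from air temperature."""
    return 0.7718 * ta + 1.0979

# Illustrative storm event: peak discharge of 400 (in the study's flow
# units) and an air temperature of 20 degrees.
cs0 = inflow_sediment(400.0)
print(round(cs0, 3), round(sediment_at_s1(cs0), 4), round(inflow_temperature(20.0), 2))
```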

  19. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  20. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  1. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  2. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  3. Solution of the statistical bootstrap with Bose statistics

    International Nuclear Information System (INIS)

    Engels, J.; Fabricius, K.; Schilling, K.

    1977-01-01

    A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density

  4. Statistical analysis of disruptions in JET

    International Nuclear Information System (INIS)

    De Vries, P.C.; Johnson, M.F.; Segui, I.

    2009-01-01

    The disruption rate (the percentage of discharges that disrupt) in JET was found to drop steadily over the years. Recent campaigns (2005-2007) show a yearly averaged disruption rate of only 6%, while from 1991 to 1995 this was often higher than 20%. Besides the disruption rate, the so-called disruptivity, or the likelihood of a disruption depending on the plasma parameters, has been determined. The disruptivity of plasmas was found to be significantly higher close to the three main operational boundaries for tokamaks: the low-q, high-density and β-limits. The frequency at which JET operated close to the density limit increased sixfold over the last decade; however, only a small reduction in disruptivity was found. Similarly, the disruptivity close to the low-q and β-limits was found to be unchanged. The most significant reduction in disruptivity was found far from the operational boundaries, leading to the conclusion that the improved disruption rate is due to a better technical capability of operating JET, rather than safer operation close to the physics limits. The statistics showed that a simple protection system was able to mitigate the forces of a large fraction of disruptions, although it has at present proved more difficult to ameliorate the heat flux.

  5. Nuclear level density

    International Nuclear Information System (INIS)

    Cardoso Junior, J.L.

    1982-10-01

    Experimental data show that the number of nuclear states increases rapidly with increasing excitation energy. The properties of highly excited nuclei are important for many nuclear reactions, mainly those that proceed via compound-nucleus processes. In this case, it is sufficient to know the statistical properties of the nuclear levels, the first of which is the nuclear level density function. Several theoretical models describing the level density are presented. Statistical mechanics and quantum mechanics formalisms, as well as semi-empirical results, are analysed and discussed. (Author) [pt

  6. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  7. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more.The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.

  8. Symmetry and statistics

    International Nuclear Information System (INIS)

    French, J.B.

    1974-01-01

    The concepts of statistical behavior and symmetry are presented from the point of view of many body spectroscopy. Remarks are made on methods for the evaluation of moments, particularly widths, for the purpose of giving a feeling for the types of mathematical structures encountered. Applications involving ground state energies, spectra, and level densities are discussed. The extent to which Hamiltonian eigenstates belong to irreducible representations is mentioned. (4 figures, 1 table) (U.S.)

  9. Road density

    Data.gov (United States)

    U.S. Environmental Protection Agency — Road density is generally highly correlated with amount of developed land cover. High road densities usually indicate high levels of ecological disturbance. More...

  10. Ocean Drilling Program: Web Site Access Statistics

    Science.gov (United States)

    Access statistics for the ODP/TAMU Science Operator's Ocean Drilling Program web site. Includes an overview, statistics for JOIDES members, and statistics for the Janus database.

  11. Operating Characteristics of Statistical Methods for Detecting Gene-by-Measured Environment Interaction in the Presence of Gene-Environment Correlation under Violations of Distributional Assumptions.

    Science.gov (United States)

    Van Hulle, Carol A; Rathouz, Paul J

    2015-02-01

    Accurately identifying interactions between genetic vulnerabilities and environmental factors is of critical importance for genetic research on health and behavior. In the previous work of Van Hulle et al. (Behavior Genetics, Vol. 43, 2013, pp. 71-84), we explored the operating characteristics for a set of biometric (e.g., twin) models of Rathouz et al. (Behavior Genetics, Vol. 38, 2008, pp. 301-315), for testing gene-by-measured environment interaction (GxM) in the presence of gene-by-measured environment correlation (rGM) where data followed the assumed distributional structure. Here we explore the effects that violating distributional assumptions have on the operating characteristics of these same models even when structural model assumptions are correct. We simulated N = 2,000 replicates of n = 1,000 twin pairs under a number of conditions. Non-normality was imposed on either the putative moderator or on the ultimate outcome by ordinalizing or censoring the data. We examined the empirical Type I error rates and compared Bayesian information criterion (BIC) values. In general, non-normality in the putative moderator had little impact on the Type I error rates or BIC comparisons. In contrast, non-normality in the outcome was often mistaken for or masked GxM, especially when the outcome data were censored.

  12. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
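The sensitivity, specificity, accuracy, and likelihood-ratio concepts discussed can all be computed from a 2x2 contingency table; the counts below are invented for illustration:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Basic test-efficacy statistics from a 2x2 contingency table of
    true/false positives (tp, fp) and false/true negatives (fn, tn)."""
    sensitivity = tp / (tp + fn)              # true-positive rate
    specificity = tn / (tn + fp)              # true-negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio
    return sensitivity, specificity, accuracy, lr_pos, lr_neg

# Invented counts: 90 diseased patients test positive, 10 test negative;
# 80 healthy patients test negative, 20 test positive.
sens, spec, acc, lr_pos, lr_neg = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
print(sens, spec, acc, round(lr_pos, 2), round(lr_neg, 3))
```

A positive likelihood ratio well above 1 (here 4.5) indicates that a positive result substantially raises the post-test odds of disease.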

  13. Escherichia coli bacteria density in relation to turbidity, streamflow characteristics, and season in the Chattahoochee River near Atlanta, Georgia, October 2000 through September 2008—Description, statistical analysis, and predictive modeling

    Science.gov (United States)

    Lawrence, Stephen J.

    2012-01-01

    Water-based recreation—such as rafting, canoeing, and fishing—is popular among visitors to the Chattahoochee River National Recreation Area (CRNRA) in north Georgia. The CRNRA is a 48-mile reach of the Chattahoochee River upstream from Atlanta, Georgia, managed by the National Park Service (NPS). Historically, high densities of fecal-indicator bacteria have been documented in the Chattahoochee River and its tributaries at levels that commonly exceeded Georgia water-quality standards. In October 2000, the NPS partnered with the U.S. Geological Survey (USGS), State and local agencies, and non-governmental organizations to monitor Escherichia coli bacteria (E. coli) density and develop a system to alert river users when E. coli densities exceeded the U.S. Environmental Protection Agency (USEPA) single-sample beach criterion of 235 colonies (most probable number) per 100 milliliters (MPN/100 mL) of water. This program, called BacteriALERT, monitors E. coli density, turbidity, and water temperature at two sites on the Chattahoochee River upstream from Atlanta, Georgia. This report summarizes E. coli bacteria density and turbidity values in water samples collected between 2000 and 2008 as part of the BacteriALERT program; describes the relations between E. coli density and turbidity, streamflow characteristics, and season; and describes the regression analyses used to develop predictive models that estimate E. coli density in real time at both sampling sites.
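The report's fitted models are not reproduced here, but predictive models of this kind commonly regress log-transformed bacteria density on log-transformed turbidity. A hedged sketch on synthetic data (all coefficients and the helper name `predict_mpn` are invented):

```python
import numpy as np

# Illustrative only: synthetic turbidity/E. coli data with an assumed
# log-log relationship, fit by ordinary least squares.
rng = np.random.default_rng(1)
turbidity = rng.uniform(2, 200, 100)                      # NTU
log_ecoli = 1.0 + 0.8 * np.log10(turbidity) + rng.normal(0, 0.2, 100)

X = np.column_stack([np.ones(100), np.log10(turbidity)])  # design matrix
b, *_ = np.linalg.lstsq(X, log_ecoli, rcond=None)         # [intercept, slope]

def predict_mpn(ntu):
    """Estimated E. coli density (MPN/100 mL) at a given turbidity."""
    return 10 ** (b[0] + b[1] * np.log10(ntu))
```

Fitting in log space keeps the multiplicative error structure typical of bacteria counts, and back-transforming gives a real-time density estimate from a turbidity reading.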

  14. Three-dimensional Monte Carlo simulations of W7-X plasma transport: density control and particle balance in steady-state operations

    International Nuclear Information System (INIS)

    Sharma, D.; Feng, Y.; Sardei, F.; Reiter, D.

    2005-01-01

    This paper presents self-consistent three-dimensional (3D) plasma transport simulations in the boundary of stellarator W7-X obtained with the Monte Carlo code EMC3-EIRENE for three typical island divertor configurations. The chosen 3D grid consists of relatively simple nested finite toroidal surfaces defined on a toroidal field period and covering the whole edge topology, which includes closed surfaces, islands and ergodic regions. Local grid refinements account for the required high resolution in the divertor region. The distribution of plasma density and temperature in the divertor region, as well as the power deposition profiles on the divertor plates, are shown to strongly depend on the island geometry, i.e. on the position and size of the dominant island chain. Configurations with strike-point positions closer to the gap of the divertor chamber generally favour the neutral compression in the divertor chamber and hence the pumping efficiency. The ratio of pumping to recycling fluxes is found to be roughly independent of the separatrix density and is thus a figure of merit for the quality of the configuration and of the divertor system in terms of density control. Lower limits for the achievable separatrix density, which determine the particle exhaust capabilities in stationary conditions, are compared for the three W7-X configurations

  15. Development of a Density Sensor for In-Line Real-Time Process Control and Monitoring of Slurries during Radioactive Waste Retrieval and Transport Operations at DOE Sites

    International Nuclear Information System (INIS)

    Bamberger, Judith A.; Greenwood, Margaret S.

    2000-01-01

    A density sensor (densimeter) to monitor and control slurries in-line and in real time during radioactive waste retrieval and transport, and to detect conditions leading to degraded transport and line plugging, is described. Benefits over baseline grab sampling and off-line analysis include: early detection and prevention of pipeline plugging, real-time density throughout the transfer process, elimination of grab sampling and off-line analysis, and reduced worker radiation exposure. The sensor is small and robust and could be retrofitted into existing pump pit manifolds and transfer lines. The probe uses ultrasonic signal reflection at the fluid-pipe wall interface to quantify density. Its features include a non-intrusive sensing surface mounted flush with the pipeline wall, a compact design, and performance that is unaffected by entrained air or by electromagnetic noise from nearby pumps and other equipment. Components were tested for chemical and radiation resistance, and the spool piece was pressure tested in accordance with ASME Process Piping Code B31.3 and approved by the Hanford Site Flammable Gas Equipment Advisory Board for installation. During pipeline tests, the sensor predicted density within ±2% in both vertical and horizontal orientations. The densimeter is installed in the modified process manifold in the prefabricated pump pit at the Hanford tank SY-101 site. In FY-2002 the density sensor performance will be evaluated during transfers of both water and waste through the pipeline. A separate project developed an ultrasonic sensor that (1) can be attached permanently to a pipeline wall, possibly as a spool piece inserted into the line, or (2) can clamp onto an existing pipeline wall and be moved to another location. This method is attractive for radioactive fluids transport applications because the sensors could be applied to existing equipment without the need to penetrate the pipe pressure boundary or to open the system to install new equipment

  16. Air Carrier Traffic Statistics.

    Science.gov (United States)

    2013-11-01

    This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...

  17. Air Carrier Traffic Statistics.

    Science.gov (United States)

    2012-07-01

    This report contains airline operating statistics for large certificated air carriers based on data reported to U.S. Department of Transportation (DOT) by carriers that hold a certificate issued under Section 401 of the Federal Aviation Act of 1958 a...

  18. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  19. Beyond quantum microcanonical statistics

    International Nuclear Information System (INIS)

    Fresch, Barbara; Moro, Giorgio J.

    2011-01-01

    Descriptions of molecular systems usually refer to two distinct theoretical frameworks. On the one hand the quantum pure state, i.e., the wavefunction, of an isolated system is determined to calculate molecular properties and their time evolution according to the unitary Schroedinger equation. On the other hand a mixed state, i.e., a statistical density matrix, is the standard formalism to account for thermal equilibrium, as postulated in the microcanonical quantum statistics. In the present paper an alternative treatment relying on a statistical analysis of the possible wavefunctions of an isolated system is presented. In analogy with the classical ergodic theory, the time evolution of the wavefunction determines the probability distribution in the phase space pertaining to an isolated system. However, this alone cannot account for a well defined thermodynamical description of the system in the macroscopic limit, unless a suitable probability distribution for the quantum constants of motion is introduced. We present a workable formalism assuring the emergence of typical values of thermodynamic functions, such as the internal energy and the entropy, in the large size limit of the system. This allows the identification of macroscopic properties independently of the specific realization of the quantum state. A description of material systems in agreement with equilibrium thermodynamics is then derived without constraints on the physical constituents and interactions of the system. Furthermore, the canonical statistics is recovered in all generality for the reduced density matrix of a subsystem.

  20. Statistical utilitarianism

    OpenAIRE

    Pivato, Marcus

    2013-01-01

    We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...

  1. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  2. Advanced Approaches to Greatly Reduce Hydrogen Gas Crossover Losses in PEM Electrolyzers Operating at High Pressures and Low Current Densities, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ElectroChem proposes technology advances in its unique PEM IFF water electrolyzer design to meet the NASA requirement for an electrolyzer that will operate very...

  3. Dynamic statistical information theory

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In recent years we extended Shannon static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density and the nonlinear evolution equations of Boltzmann dynamic entropy density and dynamic information density, which describe respectively the evolution law of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of dynamic entropy densities is caused by their drift, diffusion and production in state variable space inside the systems and coordinate space in the transmission processes; and that the time rate of change of dynamic information densities originates from their drift, diffusion and dissipation in state variable space inside the systems and coordinate space in the transmission processes. Entropy and information have thus been combined with the state of the systems and its law of motion. Furthermore we presented the formulas of the two kinds of entropy production rates and information dissipation rates, and the expressions of the two kinds of drift information flows and diffusion information flows. We proved that the two kinds of information dissipation rates (or the decrease rates of the total information) are equal to their corresponding entropy production rates (or the increase rates of the total entropy) in the same dynamic system. We obtained the formulas of two kinds of dynamic mutual information and dynamic channel

  4. An operational perspective of challenging statistical dogma while establishing a modern, secure distributed data management and imaging transport system: the Pediatric Brain Tumor Consortium phase I experience.

    Science.gov (United States)

    Onar, Arzu; Ramamurthy, Uma; Wallace, Dana; Boyett, James M

    2009-04-01

    The Pediatric Brain Tumor Consortium (PBTC) is a multidisciplinary cooperative research organization devoted to the study of correlative tumor biology and new therapies for primary central nervous system (CNS) tumors of childhood. The PBTC was created in 1999 to conduct early-phase studies in a rapid fashion in order to provide sound scientific foundation for the Children's Oncology Group to conduct definitive trials. The Operations and Biostatistics Center (OBC) of the PBTC is responsible for centrally administering study design and trial development, study conduct and monitoring, data collection and management as well as various regulatory and compliance processes. The phase I designs utilized for the consortium trials have accommodated challenges unique to pediatric trials such as body surface area (BSA)-based dosing in the absence of pediatric formulations of oral agents. Further during the past decade, the OBC has developed and implemented a state-of-the-art secure and efficient internet-based paperless distributed data management system. Additional web-based systems are also in place for tracking and distributing correlative study data as well as neuroimaging files. These systems enable effective communications among the members of the consortium and facilitate the conduct and timely reporting of multi-institutional early-phase clinical trials.

  5. Bit Grooming: statistically accurate precision-preserving quantization with compression, evaluated in the netCDF Operators (NCO, v4.4.8+)

    Science.gov (United States)

    Zender, Charles S.

    2016-09-01

    Geoscientific models and measurements generate false precision (scientifically meaningless data bits) that wastes storage space. False precision can mislead (by implying noise is signal) and be scientifically pointless, especially for measurements. By contrast, lossy compression can be both economical (save space) and heuristic (clarify data limitations) without compromising the scientific integrity of data. Data quantization can thus be appropriate regardless of whether space limitations are a concern. We introduce, implement, and characterize a new lossy compression scheme suitable for IEEE floating-point data. Our new Bit Grooming algorithm alternately shaves (to zero) and sets (to one) the least significant bits of consecutive values to preserve a desired precision. This is a symmetric, two-sided variant of an algorithm sometimes called Bit Shaving that quantizes values solely by zeroing bits. Our variation eliminates the artificial low bias produced by always zeroing bits, and makes Bit Grooming more suitable for arrays and multi-dimensional fields whose mean statistics are important. Bit Grooming relies on standard lossless compression to achieve the actual reduction in storage space, so we tested Bit Grooming by applying the DEFLATE compression algorithm to bit-groomed and full-precision climate data stored in netCDF3, netCDF4, HDF4, and HDF5 formats. Bit Grooming reduces the storage space required by initially uncompressed and compressed climate data by 25-80 and 5-65 %, respectively, for single-precision values (the most common case for climate data) quantized to retain 1-5 decimal digits of precision. The potential reduction is greater for double-precision datasets. When used aggressively (i.e., preserving only 1-2 digits), Bit Grooming produces storage reductions comparable to other quantization techniques such as Linear Packing. 
Unlike Linear Packing, whose guaranteed precision rapidly degrades within the relatively narrow dynamic range of values that
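The shave/set alternation at the heart of Bit Grooming can be sketched in a few lines. This is our own simplified re-implementation for float32 data, not the NCO code, and `keep_bits` counts retained mantissa bits rather than the decimal digits the NCO interface exposes:

```python
import numpy as np

# Simplified Bit Grooming sketch: alternately shave (zero) and set (one)
# the least significant mantissa bits of consecutive float32 values.
def bit_groom(values, keep_bits):
    a = np.ascontiguousarray(values, dtype=np.float32)
    bits = a.view(np.uint32).copy()
    drop = 23 - keep_bits                          # float32 has 23 mantissa bits
    shave = np.uint32((0xFFFFFFFF << drop) & 0xFFFFFFFF)
    ones = np.uint32((1 << drop) - 1)
    bits[0::2] &= shave                            # even indices: zero low bits
    bits[1::2] |= ones                             # odd indices: one low bits
    return bits.view(np.float32)
```

Alternating between shaving to zero and setting to one is what removes the systematic low bias of one-sided Bit Shaving, so mean statistics of groomed arrays survive quantization.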

  6. Statistical Physics and Light-Front Quantization

    Energy Technology Data Exchange (ETDEWEB)

    Raufeisen, J

    2004-08-12

    Light-front quantization has important advantages for describing relativistic statistical systems, particularly systems for which boost invariance is essential, such as the fireball created in a heavy ion collision. In this paper the authors develop light-front field theory at finite temperature and density with special attention to quantum chromodynamics. They construct the most general form of the statistical operator allowed by the Poincare algebra and show that there are no zero-mode related problems when describing phase transitions. They then demonstrate a direct connection between densities in light-front thermal field theory and the parton distributions measured in hard scattering experiments. The approach thus generalizes the concept of a parton distribution to finite temperature. In light-front quantization, the gauge-invariant Green's functions of a quark in a medium can be defined in terms of just 2-component spinors and have a much simpler spinor structure than the equal-time fermion propagator. From the Green's function, the authors introduce the new concept of a light-front density matrix, whose matrix elements are related to forward and to off-diagonal parton distributions. Furthermore, they explain how thermodynamic quantities can be calculated in discretized light-cone quantization, which is applicable at high chemical potential and is not plagued by the fermion-doubling problems.

  7. Energy statistics

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production

  8. Lung density

    DEFF Research Database (Denmark)

    Garnett, E S; Webber, C E; Coates, G

    1977-01-01

    The density of a defined volume of the human lung can be measured in vivo by a new noninvasive technique. A beam of gamma-rays is directed at the lung and, by measuring the scattered gamma-rays, lung density is calculated. The density in the lower lobe of the right lung in normal man during quiet breathing in the sitting position ranged from 0.25 to 0.37 g.cm-3. Subnormal values were found in patients with emphysema. In patients with pulmonary congestion and edema, lung density values ranged from 0.33 to 0.93 g.cm-3. The lung density measurement correlated well with the findings in chest radiographs, but the lung density values were more sensitive indices. This was particularly evident in serial observations of individual patients.

  9. Applied Statistics.

    Science.gov (United States)

    1980-03-18

    of Operations Research Society, 25, 1977, 493-505. Limit probabilities in a multi-type critical age-dependent branching process, Howard Weiner ... 11, 9/23/77. Scandinavian Actuarial Journal, 1978, 211-224. On the distribution of the greatest common divisor, Persi Diaconis & Paul Erdos. Technical Report No. 20, 7/12/78. Journal of the Royal Statistical Society, Series B, 40, 1978, 64-70. Toward characterizing Boolean transformations, Alan

  10. Lidar measurements of plume statistics

    DEFF Research Database (Denmark)

    Ejsing Jørgensen, Hans; Mikkelsen, T.

    1993-01-01

    of measured crosswind concentration profiles, the following statistics were obtained: 1) mean profile, 2) root mean square profile, 3) fluctuation intensities, and 4) intermittency factors. Furthermore, some experimentally determined probability density functions (pdf's) of the fluctuations are presented. All the measured statistics are referred to a fixed and a 'moving' frame of reference, the latter being defined as a frame of reference from which the (low frequency) plume meander is removed. Finally, the measured statistics are compared with statistics on concentration fluctuations obtained with a simple puff
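The fixed-frame versus moving-frame distinction can be illustrated on synthetic profiles; the data, notation, and intermittency threshold below are our own, not the lidar campaign's:

```python
import numpy as np

# Hedged sketch: per-position plume statistics from an ensemble of
# synthetic instantaneous crosswind concentration profiles c[i, y].
rng = np.random.default_rng(2)
y = np.linspace(-50, 50, 101)                 # crosswind coordinate (m)
centers = rng.normal(0, 10, 500)              # low-frequency plume meander
c = np.exp(-0.5 * ((y - centers[:, None]) / 8.0) ** 2)  # instantaneous profiles

# Fixed frame of reference
mean_profile = c.mean(axis=0)
rms_profile = c.std(axis=0)
fluct_intensity = rms_profile / np.maximum(mean_profile, 1e-12)
intermittency = (c > 0.05).mean(axis=0)       # fraction of samples above threshold

# 'Moving' frame: re-center each profile to remove the plume meander
c_moving = np.array([np.interp(y, y - ctr, prof)
                     for ctr, prof in zip(centers, c)])
```

The moving-frame mean profile comes out narrower and higher-peaked than the fixed-frame mean, because meander no longer smears the ensemble average.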

  11. A reciprocal of Coleman's theorem and the quantum statistics of systems with spontaneous symmetry breaking

    International Nuclear Information System (INIS)

    Chaichian, M.; Montonen, C.; Perez Rojas, H.

    1991-01-01

    The completely different conservation properties of charges associated to unbroken and broken symmetries are discussed. The impossibility of establishing a conservation law for nondegenerate Hilbert space representations in the broken case leads to a reciprocal of Coleman's theorem. The quantum statistical implication is that these charges cannot be introduced as conserved operators in the density matrix. (orig.)

  12. Density functional theory

    International Nuclear Information System (INIS)

    Das, M.P.

    1984-07-01

    The state of the art of the density functional formalism (DFT) is reviewed. The theory is quantum statistical in nature; its simplest version is the well-known Thomas-Fermi theory. The DFT is a powerful formalism in which one can treat the effect of interactions in inhomogeneous systems. After some introductory material, the DFT is outlined from the two basic theorems, and various generalizations of the theorems appropriate to several physical situations are pointed out. Next, various approximations to the density functionals are presented and some practical schemes, discussed; the approximations include an electron gas of almost constant density and an electron gas of slowly varying density. Then applications of DFT in various diverse areas of physics (atomic systems, plasmas, liquids, nuclear matter) are mentioned, and its strengths and weaknesses are pointed out. In conclusion, more recent developments of DFT are indicated

  13. Functional statistics and related fields

    CERN Document Server

    Bongiorno, Enea; Cao, Ricardo; Vieu, Philippe

    2017-01-01

    This volume collects the latest methodological and applied contributions on functional, high-dimensional and other complex data, related statistical models and tools, as well as on operator-based statistics. It contains selected and refereed contributions presented at the Fourth International Workshop on Functional and Operatorial Statistics (IWFOS 2017), held in A Coruña, Spain, from 15 to 17 June 2017. The series of IWFOS workshops was initiated by the Working Group on Functional and Operatorial Statistics at the University of Toulouse in 2008. Since then, many of the major advances in functional statistics and related fields have been periodically presented and discussed at the IWFOS workshops.

  14. Density estimation by maximum quantum entropy

    International Nuclear Information System (INIS)

    Silver, R.N.; Wallstrom, T.; Martz, H.F.

    1993-01-01

    A new Bayesian method for non-parametric density estimation is proposed, based on a mathematical analogy to quantum statistical physics. The mathematical procedure is related to maximum entropy methods for inverse problems and image reconstruction. The information divergence enforces global smoothing toward default models, convexity, positivity, extensivity and normalization. The novel feature is the replacement of classical entropy by quantum entropy, so that local smoothing is enforced by constraints on differential operators. The linear response of the estimate is proportional to the covariance. The hyperparameters are estimated by type-II maximum likelihood (evidence). The method is demonstrated on textbook data sets

  15. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.

  16. FAA statistical handbook of aviation

    Science.gov (United States)

    1994-01-01

    This report presents statistical information pertaining to the Federal Aviation Administration, the National Airspace System, Airports, Airport Activity, U.S. Civil Air Carrier Fleet, U.S. Civil Air Carrier Operating Data, Airmen, General Aviation Ai...

  17. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
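As a minimal illustration of the linear least squares machinery the review covers (matrix notation, residual variance, and the variance-covariance matrix), here is a sketch on synthetic straight-line data; all names and numbers are ours, not the article's:

```python
import numpy as np

# Linear least squares in matrix notation on synthetic data
# with true intercept 2.0, true slope 0.5, Gaussian noise sd 0.3.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 0.5 * x + rng.normal(0.0, 0.3, x.size)

X = np.column_stack([np.ones_like(x), x])        # design matrix
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # [intercept, slope]
resid = y - X @ beta
s2 = resid @ resid / (x.size - X.shape[1])       # residual variance estimate
cov = s2 * np.linalg.inv(X.T @ X)                # variance-covariance matrix
```

The square roots of cov's diagonal are the parameter standard errors, and the off-diagonal element carries the parameter correlation needed for correct propagation of error, the role the abstract assigns to the variance-covariance matrix in experiment design.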

  18. Image Filtering with Boolean and Statistical Operators.

    Science.gov (United States)

    1983-12-01

    [OCR fragment of the report's FORTRAN listing: COMPLEX array declarations AMAT(256,4), BMAT(256,4), CMAT(256,4); calls to IOF, OPEN, CHECK(IER), RDBLK, and WRBLK; and a loop computing CMAT(J,K) from AMAT(J,K) and BMAT(J,K).]

  19. Application of the Thomas-Fermi statistical model to the thermodynamics of high density matter; Aplicacion del modelo estadistico de Thomas-Fermi a la termodinamica de medios ultradensos

    Energy Technology Data Exchange (ETDEWEB)

    Martin, R

    1977-07-01

    The Thomas-Fermi statistical model is used, from the N-body point of view, in order to obtain systematic corrections to the Thomas-Fermi equation. Approximate calculation methods are derived from an analytic study of the Thomas-Fermi equation for nonzero temperature. The Thomas-Fermi equation is solved with the code GOLEM, written in FORTRAN V (UNIVAC), which also provides the thermodynamical quantities and a new method to calculate several isothermal tables. (Author) 24 refs.

  20. Radiation transport in statistically inhomogeneous rocks

    International Nuclear Information System (INIS)

    Lukhminskij, B.E.

    1975-01-01

    A study has been made of radiation transfer in statistically inhomogeneous rocks. Account has been taken of the statistical character of rock composition through randomization of density. Formulas are summarized for sigma-distribution, homogeneous density, the Simpson and Cauchy distributions. Consideration is given to the statistics of mean square ranges in a medium, simulated by the jump Markov random function. A quantitative criterion of rock heterogeneity is proposed

  1. Low Bone Density

    Science.gov (United States)

    Low bone density is when your bone density ... people with normal bone density. Detecting Low Bone Density. A bone density test will determine whether you ...

  2. Unique Microstructural Changes in the Brain Associated with Urological Chronic Pelvic Pain Syndrome (UCPPS) Revealed by Diffusion Tensor MRI, Super-Resolution Track Density Imaging, and Statistical Parameter Mapping: A MAPP Network Neuroimaging Study.

    Directory of Open Access Journals (Sweden)

    Davis Woodworth

    Studies have suggested chronic pain syndromes are associated with neural reorganization in specific regions associated with perception, processing, and integration of pain. Urological chronic pelvic pain syndrome (UCPPS) represents a collection of pain syndromes characterized by pelvic pain, namely Chronic Prostatitis/Chronic Pelvic Pain Syndrome (CP/CPPS) and Interstitial Cystitis/Painful Bladder Syndrome (IC/PBS), that are both poorly understood in their pathophysiology, and treated ineffectively. We hypothesized patients with UCPPS may have microstructural differences in the brain compared with healthy control subjects (HCs), as well as patients with irritable bowel syndrome (IBS), a common gastrointestinal pain disorder. In the current study we performed population-based voxel-wise DTI and super-resolution track density imaging (TDI) in a large, two-center sample of phenotyped patients from the multicenter cohort with UCPPS (N = 45), IBS (N = 39), and HCs (N = 56) as part of the MAPP Research Network. Compared with HCs, UCPPS patients had lower fractional anisotropy (FA), lower generalized anisotropy (GA), lower track density, and higher mean diffusivity (MD) in brain regions commonly associated with perception and integration of pain information. Results also showed significant differences in specific anatomical regions in UCPPS patients when compared with IBS patients, consistent with microstructural alterations specific to UCPPS. While IBS patients showed clear sex related differences in FA, MD, GA, and track density consistent with previous reports, few such differences were observed in UCPPS patients. Heat maps illustrating the correlation between specific regions of interest and various pain and urinary symptom scores showed clustering of significant associations along the cortico-basal ganglia-thalamic-cortical loop associated with pain integration, modulation, and perception. Together, results suggest patients with UCPPS have extensive

  3. Infrared thermography for wood density estimation

    Science.gov (United States)

    López, Gamaliel; Basterra, Luis-Alfonso; Acuña, Luis

    2018-03-01

    Infrared thermography (IRT) is becoming a commonly used technique to non-destructively inspect and evaluate wood structures. Based on the radiation emitted by all objects, this technique enables the remote, non-contact visualization of surface temperature using a thermographic device. The process of transforming radiant energy into temperature depends on many parameters, and interpreting the results is usually complicated. However, some works have analyzed the operation of IRT and expanded its applications, as found in the latest literature. This work analyzes the effect of density on the thermodynamic behavior of timber as determined by IRT. The cooling of various wood samples has been registered, and a statistical procedure that enables one to quantitatively estimate the density of timber has been designed. This procedure represents a new method to physically characterize this material.
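
    A regression procedure of the kind described can be sketched as follows. This is a hypothetical illustration on synthetic data, not the paper's actual method: it assumes Newtonian cooling, T(t) = T_amb + dT0·exp(-t/τ), with a cooling time constant τ that grows linearly with density; every number below is made up for the sketch.

    ```python
    import numpy as np

    # Hypothetical model: denser wood has more thermal mass, so its surface
    # cools more slowly. Estimate tau for each sample from its registered
    # cooling curve, then calibrate a linear map from tau to density.
    rng = np.random.default_rng(0)
    T_amb, dT0 = 20.0, 15.0
    densities = np.array([350.0, 450.0, 550.0, 650.0, 750.0])  # kg/m^3
    true_tau = 60.0 + 0.1 * densities  # assumed linear tau-density relation (s)

    t = np.linspace(0.0, 300.0, 61)
    taus = []
    for tau in true_tau:
        # synthetic cooling curve with small measurement noise
        T = T_amb + dT0 * np.exp(-t / tau) + rng.normal(0.0, 0.02, t.size)
        # log-linearize: log(T - T_amb) = log(dT0) - t / tau
        slope, _ = np.polyfit(t, np.log(T - T_amb), 1)
        taus.append(-1.0 / slope)
    taus = np.array(taus)

    # Calibrate density as a linear function of the fitted time constant.
    a, b = np.polyfit(taus, densities, 1)
    rho_hat = a * taus + b
    print(np.max(np.abs(rho_hat - densities)))  # residual of the calibration
    ```

    On real thermograms the emissivity and ambient corrections mentioned in the abstract would enter before this fitting step.
    
    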

  4. DAMPING OF ELECTRON DENSITY STRUCTURES AND IMPLICATIONS FOR INTERSTELLAR SCINTILLATION

    International Nuclear Information System (INIS)

    Smith, K. W.; Terry, P. W.

    2011-01-01

    The forms of electron density structures in kinetic Alfven wave (KAW) turbulence are studied in connection with scintillation. The focus is on small scales L ∼ 10⁸-10¹⁰ cm where the KAW regime is active in the interstellar medium, principally within turbulent H II regions. Scales at 10 times the ion gyroradius and smaller are inferred to dominate scintillation in the theory of Boldyrev et al. From numerical solutions of a decaying KAW turbulence model, structure morphology reveals two types of localized structures, filaments and sheets, and shows that they arise in different regimes of resistive and diffusive damping. Minimal resistive damping yields localized current filaments that form out of Gaussian-distributed initial conditions. When resistive damping is large relative to diffusive damping, sheet-like structures form. In the filamentary regime, each filament is associated with a non-localized magnetic and density structure, circularly symmetric in cross section. Density and magnetic fields have Gaussian statistics (as inferred from Gaussian-valued kurtosis) while density gradients are strongly non-Gaussian, more so than current. This enhancement of non-Gaussian statistics in a derivative field is expected since gradient operations enhance small-scale fluctuations. The enhancement of density gradient kurtosis over current kurtosis is not obvious, yet it suggests that modest density fluctuations may yield large scintillation events during pulsar signal propagation. In the sheet regime the same statistical observations hold, despite the absence of localized filamentary structures. Probability density functions are constructed from statistical ensembles in both regimes, showing clear formation of long, highly non-Gaussian tails.

  5. Density in Liquids.

    Science.gov (United States)

    Nesin, Gert; Barrow, Lloyd H.

    1984-01-01

    Describes a fourth-grade unit on density which introduces a concept useful in the study of chemistry and procedures appropriate to the chemistry laboratory. The hands-on activities, which use simple equipment and household substances, are at the level of thinking Piaget describes as concrete operational. (BC)

  6. Statistical decay of giant resonances

    International Nuclear Information System (INIS)

    Dias, H.; Teruya, N.; Wolynec, E.

    1986-01-01

    Statistical calculations to predict the neutron spectrum resulting from the decay of Giant Resonances are discussed. The dependence of the results on the optical potential parametrization and on the level density of the residual nucleus is assessed. A Hauser-Feshbach calculation is performed for the decay of the monopole giant resonance in ²⁰⁸Pb using the experimental levels of ²⁰⁷Pb from a recent compilation. The calculated statistical decay is in excellent agreement with recent experimental data, showing that the decay of this resonance is dominantly statistical, as predicted by continuum RPA calculations. (Author) [pt

  7. Statistical decay of giant resonances

    International Nuclear Information System (INIS)

    Dias, H.; Teruya, N.; Wolynec, E.

    1986-02-01

    Statistical calculations to predict the neutron spectrum resulting from the decay of Giant Resonances are discussed. The dependence of the results on the optical potential parametrization and on the level density of the residual nucleus is assessed. A Hauser-Feshbach calculation is performed for the decay of the monopole giant resonance in ²⁰⁸Pb using the experimental levels of ²⁰⁷Pb from a recent compilation. The calculated statistical decay is in excellent agreement with recent experimental data, showing that the decay of this resonance is dominantly statistical, as predicted by continuum RPA calculations. (Author) [pt

  8. Statistical aspects of nuclear structure

    International Nuclear Information System (INIS)

    Parikh, J.C.

    1977-01-01

    The statistical properties of energy levels and a statistical approach to transition strengths are discussed in relation to nuclear structure studies at high excitation energies. It is shown that the calculations can be extended to the ground state domain also. The discussion is based on the study of random matrix theory of level density and level spacings, using the Gaussian Orthogonal Ensemble (GOE) concept. The short range and long range correlations are also studied statistically. The polynomial expansion method is used to obtain excitation strengths. (A.K.)
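
    The GOE level-spacing statistics referred to above are conventionally summarized by the Wigner surmise, a standard random-matrix result quoted here for context (it is not taken from the paper itself). For spacings normalized by their mean, s = S/⟨S⟩, the GOE nearest-neighbour spacing distribution is well approximated by

    ```latex
    P_{\mathrm{GOE}}(s) \simeq \frac{\pi s}{2}\, e^{-\pi s^{2}/4},
    \qquad s = S/\langle S \rangle ,
    ```

    which exhibits linear level repulsion, P(s) ∝ s as s → 0, in contrast to the Poisson result P(s) = e^{-s} for uncorrelated levels.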

  9. Statistical summary 1990-91

    International Nuclear Information System (INIS)

    1991-01-01

    The information contained in this statistical summary leaflet summarizes in bar charts or pie charts Nuclear Electric's performance in 1990-91 in the areas of finance, plant and plant operations, safety, commercial operations and manpower. It is intended that the information will provide a basis for comparison in future years. The leaflet also includes a summary of Nuclear Electric's environmental policy statement. (UK)

  10. The Statistical Fermi Paradox

    Science.gov (United States)

    Maccone, C.

    In this paper is provided the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy may be shown to be inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in
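
    The CLT argument behind the SEH can be sketched numerically: a product of many independent positive factors tends to a log-normal law, because its logarithm is a sum of independent terms. The ten uniform factors below are arbitrary stand-ins for illustration, not Dole's actual astrobiological factors.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_samples, n_factors = 200_000, 10
    # ten independent positive factors (arbitrary stand-in distributions)
    factors = rng.uniform(0.001, 1.0, size=(n_samples, n_factors))
    product = factors.prod(axis=1)

    def skew(x):
        # sample skewness: third central moment over variance^(3/2)
        x = x - x.mean()
        return (x**3).mean() / (x**2).mean() ** 1.5

    # The product is strongly skewed, but its log is far less skewed,
    # approaching Gaussian by the CLT -- so the product is roughly log-normal.
    print(skew(product), skew(np.log(product)))
    ```

    With more factors (or less skewed ones), the log-product's skewness shrinks further, which is the sense in which NHab becomes log-normal.
    
    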

  11. Statistical implications in Monte Carlo depletions - 051

    International Nuclear Information System (INIS)

    Zhiwen, Xu; Rhodes, J.; Smith, K.

    2010-01-01

    As a result of steady advances in computer power, continuous-energy Monte Carlo depletion analysis is attracting considerable attention for reactor burnup calculations. The typical Monte Carlo analysis is set up as a combination of a Monte Carlo neutron transport solver and a fuel burnup solver. Note that the burnup solver is a deterministic module. The statistical errors in Monte Carlo solutions are introduced into nuclide number densities and propagated along fuel burnup. This paper addresses the statistical implications in Monte Carlo depletions, including both statistical bias and statistical variations in depleted fuel number densities. The deterministic Studsvik lattice physics code, CASMO-5, is modified to model the Monte Carlo depletion. The statistical bias in depleted number densities is found to be negligible compared to its statistical variations, which, in turn, demonstrates the correctness of the Monte Carlo depletion method. Meanwhile, the statistical variation in number densities generally increases with burnup. Several possible ways of reducing the statistical errors are discussed: 1) to increase the number of individual Monte Carlo histories; 2) to increase the number of time steps; 3) to run additional independent Monte Carlo depletion cases. Finally, a new Monte Carlo depletion methodology, called the batch depletion method, is proposed, which consists of performing a set of independent Monte Carlo depletions and is thus capable of estimating the overall statistical errors including both the local statistical error and the propagated statistical error. (authors)
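
    The batch-depletion idea can be sketched with a toy model: run several independent "cases", let random flux noise perturb every burnup step (mocking the Monte Carlo transport error), and estimate the overall statistical error of the depleted number density from the spread between batches. The decay model and all parameters below are stand-ins, not an actual transport/burnup solver.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_batches, n_steps = 20, 30
    sigma_phi = 0.01           # relative statistical noise of the MC flux solution
    depletion_per_step = 0.02  # deterministic burnup per step

    N = np.full(n_batches, 1.0)  # initial nuclide number density (arbitrary units)
    for _ in range(n_steps):
        # each batch sees an independently perturbed flux; the error
        # accumulates (propagates) along the depletion chain
        flux_noise = rng.normal(0.0, sigma_phi, n_batches)
        N *= np.exp(-depletion_per_step * (1.0 + flux_noise))

    # batch statistics: overall error estimate includes both the local and
    # the propagated statistical error
    mean_N = N.mean()
    std_err = N.std(ddof=1) / np.sqrt(n_batches)
    print(mean_N, std_err)
    ```

    Because each batch is a complete, independent depletion, the inter-batch spread automatically captures error propagated through earlier burnup steps, which a single-run variance estimate misses.
    
    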

  12. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Hansen, Kurt Schaldemose

    2004-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge on wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of high-sampled full-scale time series measurements...... are consistent, given the inevitable uncertainties associated with the model as well as with the extreme value data analysis. Keywords: Statistical model, extreme wind conditions, statistical analysis, turbulence, wind loading, wind shear, wind turbines.

  13. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  14. Fission level densities

    International Nuclear Information System (INIS)

    Maslov, V.M.

    1998-01-01

    Fission level densities (or fissioning nucleus level densities at fission saddle deformations) are required for statistical model calculations of actinide fission cross sections. The Back-shifted Fermi-Gas Model, Constant Temperature Model and Generalized Superfluid Model (GSM) are widely used for the description of level densities at stable deformations. These models provide approximately identical level density descriptions at excitations close to the neutron binding energy. It is at low excitation energies that they are discrepant, while this energy region is crucial for fission cross section calculations. A drawback of the back-shifted Fermi gas model and traditional constant temperature model approaches is that it is difficult to include pair correlations, collective effects and shell effects in a consistent way. Pair, shell and collective properties of the nucleus do not reduce just to the renormalization of the level density parameter a, but influence the energy dependence of level densities. These effects turn out to be important because they seem to depend upon the deformation of either the equilibrium or the saddle point. These effects are easily introduced within the GSM approach. Fission barriers are another key ingredient involved in the fission cross section calculations. Fission level density and barrier parameters are strongly interdependent. This is the reason for including fission barrier parameters along with the fission level densities in the Starter File. The recommended file is maslov.dat - fission barrier parameters. The recent version of actinide fission barrier data obtained in Obninsk (obninsk.dat) should only be considered as a guide for selection of initial parameters. These data are included in the Starter File, together with the fission barrier parameters recommended by CNDC (beijing.dat), for completeness. (author)

  15. Positive-Operator Valued Measure (POVM) Quantization

    Directory of Open Access Journals (Sweden)

    Jean Pierre Gazeau

    2014-12-01

    We present a general formalism for giving a measure space paired with a separable Hilbert space a quantum version based on a normalized positive operator-valued measure. The latter are built from families of density operators labeled by points of the measure space. We especially focus on various probabilistic aspects of these constructions. Simple or more elaborate examples illustrate the procedure: circle, two-sphere, plane and half-plane. Links with Positive-Operator Valued Measure (POVM) quantum measurement and quantum statistical inference are sketched.

  16. From statistical mechanics out of equilibrium to transport equations

    International Nuclear Information System (INIS)

    Balian, R.

    1995-01-01

    These lecture notes give a synthetic view of the foundations of non-equilibrium statistical mechanics. The purpose is to establish the transport equations satisfied by the relevant variables, starting from the microscopic dynamics. The Liouville representation is introduced, and a projection associates with any density operator, for a given choice of relevant observables, a reduced density operator. An exact integro-differential equation for the relevant variables is thereby derived. A short-memory approximation then yields the transport equations. A relevant entropy which characterizes the coarseness of the description is associated with each level of description. As an illustration, the classical gas, with its three levels of description and with the Chapman-Enskog method, is discussed. (author). 3 figs., 5 refs
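
    In this projection framework the reduced density operator takes the standard generalized-canonical (maximum-entropy) form; this is the textbook expression associated with Balian's approach, quoted for orientation rather than from the notes themselves. With Lagrange multipliers λᵢ conjugate to the relevant observables Âᵢ,

    ```latex
    \hat{\rho}_{\mathrm{R}}
    = \frac{1}{Z}\,\exp\!\Big(-\sum_i \lambda_i \hat{A}_i\Big),
    \qquad
    Z = \operatorname{Tr}\exp\!\Big(-\sum_i \lambda_i \hat{A}_i\Big),
    \qquad
    \operatorname{Tr}\big(\hat{\rho}_{\mathrm{R}}\,\hat{A}_i\big)
    = \langle \hat{A}_i \rangle ,
    ```

    and the relevant entropy of the level of description is S_R = -Tr(ρ_R ln ρ_R), which is maximal among all density operators reproducing the same expectation values ⟨Âᵢ⟩.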

  17. Density limit in ASDEX discharges with peaked density profiles

    International Nuclear Information System (INIS)

    Staebler, A.; Niedermeyer, H.; Loch, R.; Mertens, V.; Mueller, E.R.; Soeldner, F.X.; Wagner, F.

    1989-01-01

    Results concerning the density limit in OH and NI-heated ASDEX discharges with the usually observed broad density profiles have been reported earlier: In ohmic discharges with high q_a (q-cylindrical is used throughout this paper) the Murakami parameter (n_e R/B_t) is a good scaling parameter. At high densities edge cooling is observed, causing the plasma to shrink until an m=2 instability terminates the discharge. When approaching q_a = 2 the density limit is no longer proportional to I_p; a minimum exists in n_e,max(q_a) at q_a ∼ 2.15. With NI heating the density limit increases less than proportionally to the heating power; the behaviour during the pre-disruptive phase is rather similar to that of OH discharges. There are specific operating regimes on ASDEX leading to discharges with strongly peaked density profiles: the improved ohmic confinement regime, counter neutral injection, and multipellet injection. These regimes are characterized by enhanced energy and particle confinement. The operational limit in density for these discharges is therefore of great interest, bearing in mind furthermore that high central densities are favourable for achieving high fusion yields. In addition, further insight into the mechanisms of the density limit observed in tokamaks may be obtained by comparing plasmas with rather different density profiles at their maximum attainable densities. 7 refs., 2 figs

  18. Statistics and finance an introduction

    CERN Document Server

    Ruppert, David

    2004-01-01

    This textbook emphasizes the applications of statistics and probability to finance. Students are assumed to have had a prior course in statistics, but no background in finance or economics. The basics of probability and statistics are reviewed and more advanced topics in statistics, such as regression, ARMA and GARCH models, the bootstrap, and nonparametric regression using splines, are introduced as needed. The book covers the classical methods of finance such as portfolio theory, CAPM, and the Black-Scholes formula, and it introduces the somewhat newer area of behavioral finance. Applications and use of MATLAB and SAS software are stressed. The book will serve as a text in courses aimed at advanced undergraduates and masters students in statistics, engineering, and applied mathematics as well as quantitatively oriented MBA students. Those in the finance industry wishing to know more statistics could also use it for self-study. David Ruppert is the Andrew Schultz, Jr. Professor of Engineering, School of Oper...

  19. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...

  20. CRISS power spectral density

    International Nuclear Information System (INIS)

    Vaeth, W.

    1979-04-01

    The correlation of signal components at different frequencies like higher harmonics cannot be detected by a normal power spectral density measurement, since this technique correlates only components at the same frequency. This paper describes a special method for measuring the correlation of two signal components at different frequencies: the CRISS power spectral density. From this new function in frequency analysis, the correlation of two components can be determined quantitatively either they stem from one signal or from two diverse signals. The principle of the method, suitable for the higher harmonics of a signal as well as for any other frequency combinations is shown for the digital frequency analysis technique. Two examples of CRISS power spectral densities demonstrates the operation of the new method. (orig.) [de
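
    A simplified stand-in for this idea (not the exact CRISS definition from the paper) can be sketched as follows: split a signal into segments, Fourier-transform each, and correlate the spectral magnitudes of two different frequency bins across segments. Here a fundamental and its second harmonic share a random amplitude, so their bins are correlated, while an independent tone at a third frequency is not; all signal parameters are invented for the sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    fs, n_seg, seg_len = 1024, 200, 1024
    f0 = 64  # fundamental, chosen to fall exactly on an FFT bin
    t = np.arange(seg_len) / fs

    mag = {f0: [], 2 * f0: [], 3 * f0: []}
    for _ in range(n_seg):
        a = 1.0 + 0.5 * rng.standard_normal()       # shared random amplitude
        x = a * (np.sin(2 * np.pi * f0 * t) + np.sin(2 * np.pi * 2 * f0 * t))
        # independent component at 3*f0 with its own random amplitude/phase
        x += (1.0 + 0.5 * rng.standard_normal()) \
             * np.sin(2 * np.pi * 3 * f0 * t + rng.uniform(0, 2 * np.pi))
        x += 0.1 * rng.standard_normal(seg_len)     # broadband noise
        X = np.fft.rfft(x)
        for f in mag:
            mag[f].append(abs(X[f * seg_len // fs]))

    corr = np.corrcoef([mag[f0], mag[2 * f0], mag[3 * f0]])
    print(corr[0, 1], corr[0, 2])
    ```

    An ordinary power spectral density of the same signal would show three peaks but say nothing about which of them fluctuate together; the cross-frequency correlation does.
    
    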

  1. MQSA National Statistics

    Science.gov (United States)

    ... Standards Act and Program MQSA Insights MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...

  2. State Transportation Statistics 2014

    Science.gov (United States)

    2014-12-15

    The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...

  3. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  4. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Exact analytical results for the ideal gas show that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
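
    For reference, the Renyi entropy underlying this comparison, and its q → 1 limit recovering the Boltzmann-Gibbs form (obtained by l'Hôpital's rule applied to the 0/0 expression at q = 1):

    ```latex
    S^{R}_{q} = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q},
    \qquad
    \lim_{q \to 1} S^{R}_{q}
    = \lim_{q \to 1} \frac{-\sum_i p_i^{\,q} \ln p_i}{\sum_i p_i^{\,q}}
    = -\sum_i p_i \ln p_i = S_{\mathrm{BG}} .
    ```

    The microcanonical case is immediate: for W equiprobable states, p_i = 1/W gives S^R_q = ln W for every q, identical to the Boltzmann-Gibbs result.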

  5. Sampling, Probability Models and Statistical Reasoning Statistical

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...

  6. Statistical Model of Extreme Shear

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose; Larsen, Gunner Chr.

    2005-01-01

    In order to continue cost-optimisation of modern large wind turbines, it is important to continuously increase the knowledge of wind field parameters relevant to design loads. This paper presents a general statistical model that offers site-specific prediction of the probability density function...... by a model that, on a statistically consistent basis, describes the most likely spatial shape of an extreme wind shear event. Predictions from the model have been compared with results from an extreme value data analysis, based on a large number of full-scale measurements recorded with a high sampling rate...

  7. Statistical assessment of numerous Monte Carlo tallies

    International Nuclear Information System (INIS)

    Kiedrowski, Brian C.; Solomon, Clell J.

    2011-01-01

    Four tests are developed to assess the statistical reliability of collections of tallies that number in the thousands or more. To this end, the relative-variance density function is developed and its moments are studied using simplified, non-transport models. The statistical tests are performed upon the results of MCNP calculations of three different transport test problems and appear to show that the tests are appropriate indicators of global statistical quality. (author)
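
    A simplified, non-transport illustration of screening many tallies at once: treat each "tally" as a sample mean over histories, estimate its relative variance as var/(n·mean²), and check the collection against the usual relative-error rule of thumb (relative error below 0.1). The lognormal score model and the threshold are assumptions for this sketch, not the paper's four tests.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_tallies, n_histories = 5_000, 400

    # per-history scores; the lognormal spread makes convergence uneven
    scores = rng.lognormal(mean=0.0, sigma=1.0, size=(n_tallies, n_histories))
    means = scores.mean(axis=1)

    # relative variance of each tally's mean estimator: var / (n * mean^2)
    rel_var = scores.var(axis=1, ddof=1) / (n_histories * means**2)
    rel_err = np.sqrt(rel_var)

    # fraction of tallies passing the common relative-error criterion
    frac_reliable = np.mean(rel_err < 0.1)
    print(frac_reliable)
    ```

    With thousands of tallies, the empirical distribution of `rel_err` (or of `rel_var`) across the collection is itself a useful diagnostic, which is the spirit of the relative-variance density function studied in the paper.
    
    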

  8. Introduction to Statistics - eNotes

    DEFF Research Database (Denmark)

    Brockhoff, Per B.; Møller, Jan Kloppenborg; Andersen, Elisabeth Wreford

    2015-01-01

    Online textbook used in the introductory statistics courses at DTU. It provides a basic introduction to applied statistics for engineers. The necessary elements from probability theory are introduced (stochastic variable, density and distribution function, mean and variance, etc.) and thereafter...... the most basic statistical analysis methods are presented: Confidence band, hypothesis testing, simulation, simple and multiple regression, ANOVA and analysis of contingency tables. Examples with the software R are included for all presented theory and methods....

  9. Statistics of polarization speckle: theory versus experiment

    DEFF Research Database (Denmark)

    Wang, Wei; Hanson, Steen Grüner; Takeda, Mitsuo

    2010-01-01

    In this paper, we reviewed our recent work on the statistical properties of polarization speckle, described by stochastic Stokes parameters fluctuating in space. Based on the Gaussian assumption for the random electric field components and polar-interferometer, we investigated theoretically...... and experimentally the statistics of Stokes parameters of polarization speckle, including probability density function of Stokes parameters with the spatial degree of polarization, autocorrelation of Stokes vector and statistics of spatial derivatives for Stokes parameters....
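
    Under the Gaussian assumption stated in the abstract, the Stokes-parameter statistics can be sketched numerically: draw circular complex Gaussian field components Ex and Ey at many points, form the ensemble-averaged Stokes parameters, and compute the degree of polarization. This is a generic illustration of the definitions, not the paper's polar-interferometer analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 500_000

    def degree_of_polarization(Ex, Ey):
        # ensemble-averaged Stokes parameters
        S0 = np.mean(abs(Ex)**2 + abs(Ey)**2)
        S1 = np.mean(abs(Ex)**2 - abs(Ey)**2)
        S2 = np.mean(2 * (Ex * Ey.conj()).real)
        S3 = np.mean(2 * (Ex * Ey.conj()).imag)
        return np.sqrt(S1**2 + S2**2 + S3**2) / S0

    def ccg(n):
        # circular complex Gaussian samples, unit mean intensity
        return (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

    # independent equal-strength components: unpolarized speckle, P ~ 0
    Ex, Ey = ccg(n), ccg(n)
    print(degree_of_polarization(Ex, Ey))
    # fully correlated components: fully polarized, P = 1
    print(degree_of_polarization(Ex, Ex))
    ```

    Partial correlation between Ex and Ey interpolates between these limits, which is what the spatial degree of polarization of polarization speckle quantifies.
    
    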

  10. Tube problems: worldwide statistics reviewed

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    EPRI's Steam Generator Strategic Management Project issues an annual report on the progress being made in tackling steam generator problems worldwide, containing a wealth of detailed statistics on the status of operating units and degradation mechanisms encountered. A few highlights are presented from the latest report, issued in October 1993, which covers the period to 31 December 1992. (Author)

  11. Beginning R The Statistical Programming Language

    CERN Document Server

    Gardener, Mark

    2012-01-01

    Conquer the complexities of this open source statistical language R is fast becoming the de facto standard for statistical computing and analysis in science, business, engineering, and related fields. This book examines this complex language using simple statistical examples, showing how R operates in a user-friendly context. Both students and workers in fields that require extensive statistical analysis will find this book helpful as they learn to use R for simple summary statistics, hypothesis testing, creating graphs, regression, and much more. It covers formula notation, complex statistics

  12. Statistical Moments in Variable Density Incompressible Mixing Flows

    Science.gov (United States)

    2015-08-28

    The algorithm uses an approximate projection method, with the interface modeled via the Immersed Boundary Method (IBM).

  13. Semiclassical statistical mechanics

    International Nuclear Information System (INIS)

    Stratt, R.M.

    1979-04-01

    On the basis of an approach devised by Miller, a formalism is developed which allows the nonperturbative incorporation of quantum effects into equilibrium classical statistical mechanics. The resulting expressions bear a close similarity to classical phase space integrals and, therefore, are easily molded into forms suitable for examining a wide variety of problems. As a demonstration of this, three such problems are briefly considered: the simple harmonic oscillator, the vibrational state distribution of HCl, and the density-independent radial distribution function of ⁴He. A more detailed study is then made of two more general applications involving the statistical mechanics of nonanalytic potentials and of fluids. The former, which is a particularly difficult problem for perturbative schemes, is treated with only limited success by restricting phase space and by adding an effective potential. The problem of fluids, however, is readily found to yield to a semiclassical pairwise interaction approximation, which in turn permits any classical many-body model to be expressed in a convenient form. The remainder of the discussion concentrates on some ramifications of having a phase space version of quantum mechanics. To test the breadth of the formulation, the task of constructing quantal ensemble averages of phase space functions is undertaken, and in the process several limitations of the formalism are revealed. A rather different approach is also pursued. The concept of quantum mechanical ergodicity is examined through the use of numerically evaluated eigenstates of the Barbanis potential, and the existence of this quantal ergodicity - normally associated with classical phase space - is verified. 21 figures, 4 tables

  14. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  15. State Transportation Statistics 2010

    Science.gov (United States)

    2011-09-14

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...

  16. State Transportation Statistics 2012

    Science.gov (United States)

    2013-08-15

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...

  17. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    ... Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 03/ ... primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  18. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...

  19. State Transportation Statistics 2011

    Science.gov (United States)

    2012-08-08

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...

  20. Neuroendocrine Tumor: Statistics

    Science.gov (United States)

Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 01/ ... the body. It is important to remember that statistics on the survival rates for people with a ...

  1. State Transportation Statistics 2013

    Science.gov (United States)

    2014-09-19

The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...

  2. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  3. Statistical analysis of JET disruptions

    International Nuclear Information System (INIS)

    Tanga, A.; Johnson, M.F.

    1991-07-01

In the operation of JET and of any tokamak many discharges are terminated by a major disruption. The disruptive termination of a discharge is usually an unwanted event which may cause damage to the structure of the vessel. In a reactor, disruptions are potentially a very serious problem, hence the importance of studying them and devising methods to avoid them. Statistical information has been collected about the disruptions which have occurred at JET over a long span of operations. The analysis is focused on the operational aspects of the disruptions rather than on the underlying physics. (Author)

  4. Statistical modelling with quantile functions

    CERN Document Server

    Gilchrist, Warren

    2000-01-01

Galton used quantiles more than a hundred years ago in describing data. Tukey and Parzen used them in the 60s and 70s in describing populations. Since then, the authors of many papers, both theoretical and practical, have used various aspects of quantiles in their work. Until now, however, no one had put all the ideas together to form what turns out to be a general approach to statistics. Statistical Modelling with Quantile Functions does just that. It systematically examines the entire process of statistical modelling, starting with using the quantile function to define continuous distributions. The author shows that by using this approach, it becomes possible to develop complex distributional models from simple components. A modelling kit can be developed that applies to the whole model - deterministic and stochastic components - and this kit operates by adding, multiplying, and transforming distributions rather than data. Statistical Modelling with Quantile Functions adds a new dimension to the practice of stati...
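The add-and-transform modelling kit can be hinted at with a toy sketch (illustrative only; `q_exponential` and `q_sum` are my own names, and the exponential case merely stands in for the book's general machinery):

```python
import math
import random

def q_exponential(p, lam=1.0):
    """Quantile function (inverse CDF) of the exponential distribution."""
    return -math.log(1.0 - p) / lam

def q_sum(p):
    """Gilchrist-style construction: adding two quantile functions defines
    the quantile function of a new distribution (toy example)."""
    return q_exponential(p, 2.0) + q_exponential(p, 0.5)

# Sampling from the new model: push uniform random numbers through Q.
random.seed(0)
sample = [q_sum(random.random()) for _ in range(5)]
```

Because quantile functions are added rather than data, the new model's sampler and percentiles come for free from the components.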

  5. Density Distribution Sunflower Plots

    Directory of Open Access Journals (Sweden)

    William D. Dupont

    2003-01-01

Full Text Available Density distribution sunflower plots are used to display high-density bivariate data. They are useful for data where a conventional scatter plot is difficult to read due to overstriking of the plot symbol. The x-y plane is subdivided into a lattice of regular hexagonal bins of width w specified by the user. The user also specifies the values of l, d, and k that affect the plot as follows. Individual observations are plotted when there are fewer than l observations per bin, as in a conventional scatter plot. Each bin with from l to d observations contains a light sunflower. Other bins contain a dark sunflower. In a light sunflower each petal represents one observation. In a dark sunflower, each petal represents k observations. (A dark sunflower with p petals represents between pk - k/2 and pk + k/2 observations.) The user can control the sizes and colors of the sunflowers. By selecting appropriate colors and sizes for the light and dark sunflowers, plots can be obtained that give both the overall sense of the data density distribution as well as the number of data points in any given region. The use of this graphic is illustrated with data from the Framingham Heart Study. A documented Stata program, called sunflower, is available to draw these graphs. It can be downloaded from the Statistical Software Components archive at http://ideas.repec.org/c/boc/bocode/s430201.html . (Journal of Statistical Software 2003; 8 (3): 1-5. Posted at http://www.jstatsoft.org/index.php?vol=8 .)
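The light/dark thresholds described above can be sketched as a small classification helper (an illustrative sketch, not the Stata `sunflower` program itself; hexagonal binning and drawing are omitted, and the function name is my own):

```python
def classify_bin(count, l, d, k):
    """Classify one bin by its observation count, following the sunflower
    rules: raw points below l, a light sunflower (one petal per
    observation) from l to d, a dark sunflower (one petal per k
    observations, rounded) above d. Returns (kind, petals)."""
    if count < l:
        return ("individual", count)       # plot the raw points
    if count <= d:
        return ("light", count)            # one petal per observation
    return ("dark", round(count / k))      # one petal per k observations

# A dark sunflower with p petals thus represents roughly
# p*k - k/2 to p*k + k/2 observations.
```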

  6. Statistical analysis of field data for aircraft warranties

    Science.gov (United States)

    Lakey, Mary J.

Air Force and Navy maintenance data collection systems were researched to determine their scientific applicability to the warranty process. New and unique algorithms were developed to extract failure distributions which were then used to characterize how selected families of equipment typically fail. Families of similar equipment were identified in terms of function, technology and failure patterns. Statistical analyses and applications such as goodness-of-fit tests, maximum likelihood estimation and derivation of confidence intervals for the probability density function parameters were applied to characterize the distributions and their failure patterns. Statistical and reliability theory, with relevance to equipment design and operational failures, was also a determining factor in characterizing the failure patterns of the equipment families. Inferences about the families with relevance to warranty needs were then made.

  7. Discrete density of states

    International Nuclear Information System (INIS)

    Aydin, Alhun; Sisman, Altug

    2016-01-01

    By considering the quantum-mechanically minimum allowable energy interval, we exactly count number of states (NOS) and introduce discrete density of states (DOS) concept for a particle in a box for various dimensions. Expressions for bounded and unbounded continua are analytically recovered from discrete ones. Even though substantial fluctuations prevail in discrete DOS, they're almost completely flattened out after summation or integration operation. It's seen that relative errors of analytical expressions of bounded/unbounded continua rapidly decrease for high NOS values (weak confinement or high energy conditions), while the proposed analytical expressions based on Weyl's conjecture always preserve their lower error characteristic. - Highlights: • Discrete density of states considering minimum energy difference is proposed. • Analytical DOS and NOS formulas based on Weyl conjecture are given. • Discrete DOS and NOS functions are examined for various dimensions. • Relative errors of analytical formulas are much better than the conventional ones.
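The counting idea can be illustrated in reduced units for a 1-D box (a sketch under simplified assumptions, not the paper's exact formulas; energies are taken as eps_n = n^2):

```python
import math

# Particle in a 1-D box, reduced units: eps_n = n^2 for n = 1, 2, 3, ...

def nos_exact(eps):
    """Exact number of states (NOS) with energy <= eps: the count of
    integers n with n^2 <= eps."""
    return math.floor(math.sqrt(eps))

def nos_continuum(eps):
    """Unbounded-continuum approximation to the staircase above."""
    return math.sqrt(eps)

def nos_weyl(eps):
    """Weyl-type form with a half-state correction; on average it tracks
    the discrete staircase more closely than the bare continuum."""
    return math.sqrt(eps) - 0.5

# The relative error of the continuum formula shrinks at high energy
# (weak confinement), mirroring the trend described in the abstract.
err_low = abs(nos_continuum(10.0) - nos_exact(10.0)) / nos_exact(10.0)
err_high = abs(nos_continuum(1000.0) - nos_exact(1000.0)) / nos_exact(1000.0)
```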

  8. Discrete density of states

    Energy Technology Data Exchange (ETDEWEB)

    Aydin, Alhun; Sisman, Altug, E-mail: sismanal@itu.edu.tr

    2016-03-22

    By considering the quantum-mechanically minimum allowable energy interval, we exactly count number of states (NOS) and introduce discrete density of states (DOS) concept for a particle in a box for various dimensions. Expressions for bounded and unbounded continua are analytically recovered from discrete ones. Even though substantial fluctuations prevail in discrete DOS, they're almost completely flattened out after summation or integration operation. It's seen that relative errors of analytical expressions of bounded/unbounded continua rapidly decrease for high NOS values (weak confinement or high energy conditions), while the proposed analytical expressions based on Weyl's conjecture always preserve their lower error characteristic. - Highlights: • Discrete density of states considering minimum energy difference is proposed. • Analytical DOS and NOS formulas based on Weyl conjecture are given. • Discrete DOS and NOS functions are examined for various dimensions. • Relative errors of analytical formulas are much better than the conventional ones.

  9. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  10. Quantum statistics of dense gases and nonideal plasmas

    CERN Document Server

    Ebeling, Werner; Filinov, Vladimir

    2017-01-01

The aim of this book is the pedagogical exploration of the basic principles of quantum-statistical thermodynamics as applied to various states of matter – ranging from rare gases to astrophysical matter with high-energy density. The reader will learn in this work that thermodynamics and quantum statistics are still the concepts on which even the most advanced research operates - despite a flood of modern concepts, classical entities like temperature, pressure, energy and entropy are shown to remain fundamental. The physics of gases, plasmas and high-energy density matter is still a growing field and even though solids and liquids dominate our daily life, more than 99 percent of the visible Universe is in the state of gases and plasmas and the overwhelming part of matter exists at extreme conditions connected with very large energy densities, such as in the interior of stars. This text, combining material from lectures and advanced seminars given by the authors over many decades, is a must-have intr...

  11. Statistical learning across development: Flexible yet constrained

    Directory of Open Access Journals (Sweden)

    Lauren eKrogh

    2013-01-01

    Full Text Available Much research in the past two decades has documented infants’ and adults' ability to extract statistical regularities from auditory input. Importantly, recent research has extended these findings to the visual domain, demonstrating learners' sensitivity to statistical patterns within visual arrays and sequences of shapes. In this review we discuss both auditory and visual statistical learning to elucidate both the generality of and constraints on statistical learning. The review first outlines the major findings of the statistical learning literature with infants, followed by discussion of statistical learning across domains, modalities, and development. The second part of this review considers constraints on statistical learning. The discussion focuses on two categories of constraint: constraints on the types of input over which statistical learning operates and constraints based on the state of the learner. The review concludes with a discussion of possible mechanisms underlying statistical learning.

  12. A Statistical Study of Eiscat Electron and Ion Temperature Measurements In The E-region

    Science.gov (United States)

    Hussey, G.; Haldoupis, C.; Schlegel, K.; Bösinger, T.

Motivated by the large EISCAT data base, which covers over 15 years of common programme operation, and previous statistical work with EISCAT data (e.g., C. Haldoupis, K. Schlegel, and G. Hussey, Auroral E-region electron density gradients measured with EISCAT, Ann. Geophysicae, 18, 1172-1181, 2000), a detailed statistical analysis of electron and ion EISCAT temperature measurements has been undertaken. This study was specifically concerned with the statistical dependence of heating events on other ambient parameters such as the electric field and electron density. The results showed previously reported dependences, such as the electron temperature being directly correlated with the ambient electric field and inversely related to the electron density. However, these correlations were found to be also dependent upon altitude. There was also evidence of the so-called "Schlegel effect" (K. Schlegel, Reduced effective recombination coefficient in the disturbed polar E-region, J. Atmos. Terr. Phys., 44, 183-185, 1982); that is, the heated electron gas leads to increases in electron density through a reduction in the recombination rate. This paper will present the statistical heating results and attempt to offer physical explanations and interpretations of the findings.

  13. The dynamics of variable-density turbulence

    International Nuclear Information System (INIS)

    Sandoval, D.L.

    1995-11-01

The dynamics of variable-density turbulent fluids are studied by direct numerical simulation. The flow is incompressible, so that acoustic waves are decoupled from the problem and density is not a thermodynamic variable. Changes in density occur due to molecular mixing. The velocity field is, in general, divergent. A pseudo-spectral numerical technique is used to solve the equations of motion. Three-dimensional simulations are performed using a grid of 128^3 points. Two types of problems are studied: (1) the decay of isotropic, variable-density turbulence, and (2) buoyancy-generated turbulence in a fluid with large density fluctuations. In the case of isotropic, variable-density turbulence, the overall statistical decay behavior, for the cases studied, is relatively unaffected by the presence of density variations when the initial density and velocity fields are statistically independent. The results for this case are in quantitative agreement with previous numerical and laboratory results. In this case, the initial density field has a bimodal probability density function (pdf) which evolves in time towards a Gaussian distribution. The pdf of the density field is symmetric about its mean value throughout its evolution. If the initial velocity and density fields are statistically dependent, however, the decay process is significantly affected by the density fluctuations. For the case of buoyancy-generated turbulence, variable-density departures from the Boussinesq approximation are studied. The results of the buoyancy-generated turbulence are compared with variable-density model predictions. Both a one-point (engineering) model and a two-point (spectral) model are tested against the numerical data. Some deficiencies in these variable-density models are discussed and modifications are suggested

  14. Statistical theory of dynamo

    Science.gov (United States)

    Kim, E.; Newton, A. P.

    2012-04-01

One major problem in dynamo theory is the multi-scale nature of the MHD turbulence, which requires statistical theory in terms of probability distribution functions. In this contribution, we present the statistical theory of magnetic fields in a simplified mean field α-Ω dynamo model by varying the statistical property of alpha, including marginal stability and intermittency, and then utilize observational data of solar activity to fine-tune the mean field dynamo model. Specifically, we first present a comprehensive investigation into the effect of the stochastic parameters in a simplified α-Ω dynamo model. Through considering the manifold of marginal stability (the region of parameter space where the mean growth rate is zero), we show that stochastic fluctuations are conducive to dynamo action. Furthermore, by considering the cases of fluctuating alpha that are periodic and Gaussian coloured random noise with identical characteristic time-scales and fluctuating amplitudes, we show that the transition to dynamo is significantly facilitated for stochastic alpha with random noise. Furthermore, we show that probability density functions (PDFs) of the growth-rate, magnetic field and magnetic energy can provide a wealth of useful information regarding the dynamo behaviour/intermittency. Finally, the precise statistical property of the dynamo, such as temporal correlation and fluctuating amplitude, is found to be dependent on the distribution of the fluctuations of the stochastic parameters. We then use observations of solar activity to constrain parameters relating to the α effect in stochastic α-Ω nonlinear dynamo models. This is achieved through performing a comprehensive statistical comparison by computing PDFs of solar activity from observations and from our simulation of the mean field dynamo model. The observational data that are used are the time history of solar activity inferred from C14 data in the past 11000 years on a long time scale and direct observations of the sun spot

  15. Non-linearity consideration when analyzing reactor noise statistical characteristics. [BWR

    Energy Technology Data Exchange (ETDEWEB)

    Kebadze, B V; Adamovski, L A

    1975-06-01

Statistical characteristics of boiling water reactor noise in the vicinity of the stability threshold are studied. The reactor is considered as a non-linear system affected by random perturbations. To solve the non-linear problem the principle of statistical linearization is used. It is shown that the half-width of the resonance peak in the neutron power noise spectral density, as well as the reciprocal of the noise dispersion, which are used in predicting the stable operation threshold, are different from zero both within and beyond the stability boundary, the determination of which was based on linear criteria.
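The principle of statistical linearization mentioned above replaces a nonlinearity by an equivalent gain; a minimal Monte Carlo sketch (illustrative, with an assumed cubic nonlinearity, not the authors' reactor model):

```python
import random
import statistics

# Statistical linearization replaces a zero-mean nonlinearity f(x) by an
# equivalent gain k = E[x f(x)] / E[x^2] for Gaussian input.
# For f(x) = x^3 and x ~ N(0, sigma^2), theory gives k = 3 * sigma^2.
random.seed(1)
sigma = 0.7
xs = [random.gauss(0.0, sigma) for _ in range(200_000)]

def f(x):
    return x ** 3   # example nonlinearity

k = statistics.fmean(x * f(x) for x in xs) / statistics.fmean(x * x for x in xs)
```

The Monte Carlo estimate of k should land close to the theoretical value 3 * sigma^2 = 1.47.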

  16. Statistics in Schools

    Science.gov (United States)

The Statistics in Schools program educates students about the value and everyday use of statistics, providing resources for teaching and learning with real-life data, including standards-aligned, classroom-ready activities.

  17. Transport Statistics - Transport - UNECE

    Science.gov (United States)

UNECE Transport Statistics: part of the UNECE Transport area of work, including the Working Party on Transport Statistics (WP.6), its terms of reference, and its meetings and events.

  18. Density limit experiments on FTU

    International Nuclear Information System (INIS)

    Pucella, G.; Tudisco, O.; Apicella, M.L.; Apruzzese, G.; Artaserse, G.; Belli, F.; Boncagni, L.; Botrugno, A.; Buratti, P.; Calabrò, G.; Castaldo, C.; Cianfarani, C.; Cocilovo, V.; Dimatteo, L.; Esposito, B.; Frigione, D.; Gabellieri, L.; Giovannozzi, E.; Bin, W.; Granucci, G.

    2013-01-01

One of the main problems in tokamak fusion devices concerns the capability to operate at a high plasma density, which is observed to be limited by the appearance of catastrophic events causing loss of plasma confinement. The commonly used empirical scaling law for the density limit is the Greenwald limit, predicting that the maximum achievable line-averaged density along a central chord depends only on the average plasma current density. However, the Greenwald density limit has been exceeded in tokamak experiments in the case of peaked density profiles, indicating that the edge density is the real parameter responsible for the density limit. Recently, it has been shown on the Frascati Tokamak Upgrade (FTU) that the Greenwald density limit is exceeded in gas-fuelled discharges with a high value of the edge safety factor. In order to understand this behaviour, dedicated density limit experiments were performed on FTU, in which the high density domain was explored in a wide range of values of plasma current (I_p = 500–900 kA) and toroidal magnetic field (B_T = 4–8 T). These experiments confirm the edge nature of the density limit, as a Greenwald-like scaling holds for the maximum achievable line-averaged density along a peripheral chord passing at r/a ≃ 4/5. On the other hand, the maximum achievable line-averaged density along a central chord does not depend on the average plasma current density and essentially depends on the toroidal magnetic field only. This behaviour is explained in terms of density profile peaking in the high density domain, with a peaking factor at the disruption depending on the edge safety factor. The possibility that the MARFE (multifaceted asymmetric radiation from the edge) phenomenon is the cause of the peaking has been considered, with the MARFE believed to form a channel for the penetration of the neutral particles into deeper layers of the plasma. Finally, the magnetohydrodynamic (MHD) analysis has shown that also the central line

  19. Parity and the spin-statistics connection

    Indian Academy of Sciences (India)

    A simple demonstration of the spin-statistics connection for general causal fields is obtained by using the parity operation to exchange spatial coordinates in the scalar product of a locally commuting field operator, evaluated at position x, with the same field operator evaluated at -x, at equal times.

  20. The statistical analysis of anisotropies

    International Nuclear Information System (INIS)

    Webster, A.

    1977-01-01

One of the many uses to which a radio survey may be put is an analysis of the distribution of the radio sources on the celestial sphere to find out whether they are bunched into clusters or lie in preferred regions of space. There are many methods of testing for clustering in point processes and since they are not all equally good this contribution is presented as a brief guide to what seems to be the best of them. The radio sources certainly do not show very strong clustering and may well be entirely unclustered, so if a statistical method is to be useful it must be both powerful and flexible. A statistic is powerful in this context if it can efficiently distinguish a weakly clustered distribution of sources from an unclustered one, and it is flexible if it can be applied in a way which avoids mistaking defects in the survey for true peculiarities in the distribution of sources. The paper divides clustering statistics into two classes: number density statistics and log N/log S statistics. (Auth.)
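A simple number-density statistic of the kind discussed is the counts-in-cells index of dispersion; the sketch below (my own illustration, not the paper's procedure) simulates an unclustered sky and checks that the statistic sits near its Poisson value:

```python
import random

# Counts-in-cells test: the index of dispersion (variance / mean of source
# counts in equal sky cells) is ~1 for an unclustered Poisson distribution
# of sources and rises above 1 when sources cluster.
random.seed(2)
n_cells, n_sources = 1000, 5000
counts = [0] * n_cells
for _ in range(n_sources):
    counts[random.randrange(n_cells)] += 1   # unclustered placement

mean = sum(counts) / n_cells
var = sum((c - mean) ** 2 for c in counts) / n_cells
dispersion = var / mean   # expected to be close to 1 here
```

A weakly clustered sample would push the dispersion only slightly above 1, which is why a powerful statistic matters.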

  1. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics

  2. Mammography density estimation with automated volumetric breast density measurement

    International Nuclear Information System (INIS)

    Ko, Su Yeon; Kim, Eun Kyung; Kim, Min Jung; Moon, Hee Jung

    2014-01-01

To compare automated volumetric breast density measurement (VBDM) with radiologists' evaluations based on the Breast Imaging Reporting and Data System (BI-RADS), and to identify the factors associated with technical failure of VBDM. In this study, 1129 women aged 19-82 years who underwent mammography from December 2011 to January 2012 were included. Breast density evaluations by radiologists based on BI-RADS and by VBDM (Volpara Version 1.5.1) were compared. The agreement in interpreting breast density between radiologists and VBDM was determined based on four density grades (D1, D2, D3, and D4) and a binary classification of fatty (D1-2) vs. dense (D3-4) breast using kappa statistics. The association between technical failure of VBDM and patient age, total breast volume, fibroglandular tissue volume, history of partial mastectomy, the frequency of mass > 3 cm, and breast density was analyzed. The agreement between breast density evaluations by radiologists and VBDM was fair (k value = 0.26) when the four density grades (D1/D2/D3/D4) were used and moderate (k value = 0.47) for the binary classification (D1-2/D3-4). Twenty-seven women (2.4%) showed failure of VBDM. Small total breast volume, history of partial mastectomy, and high breast density were significantly associated with technical failure of VBDM (p < 0.001 to 0.015). There is fair or moderate agreement in breast density evaluation between radiologists and VBDM. Technical failure of VBDM may be related to small total breast volume, a history of partial mastectomy, and high breast density.
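The kappa agreement measure used in the abstract can be computed directly from an agreement table; a minimal sketch of generic Cohen's kappa (the table values below are invented, not the study's data):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square agreement table (rows: rater A's
    categories, columns: rater B's). Generic formula."""
    n = len(table)
    total = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(n)) / total
    p_exp = sum(
        (sum(table[i]) / total) * (sum(row[i] for row in table) / total)
        for i in range(n)
    )
    return (p_obs - p_exp) / (1.0 - p_exp)

# Example binary (fatty vs. dense) table: 85% raw agreement, but kappa is
# 0.625 once chance agreement is removed.
kappa = cohens_kappa([[20, 5], [10, 65]])
```

Kappa discounts the agreement expected by chance, which is why it can be "fair" or "moderate" even when raw agreement looks high.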

  3. Statistical mechanics of violent relaxation

    International Nuclear Information System (INIS)

    Shu, F.H.

    1978-01-01

We reexamine the foundations of Lynden-Bell's statistical mechanical discussion of violent relaxation in collisionless stellar systems. We argue that Lynden-Bell's formulation in terms of a continuum description introduces unnecessary complications, and we consider a more conventional formulation in terms of particles. We then find the exclusion principle discovered by Lynden-Bell to be quantitatively important only at phase densities where two-body encounters are no longer negligible. Since the dynamical basis for the exclusion principle vanishes in such cases anyway, Lynden-Bell statistics always reduces in practice to Maxwell-Boltzmann statistics when applied to stellar systems. Lynden-Bell also found the equilibrium distribution function generally to be a sum of Maxwellians with velocity dispersions dependent on the phase density at star formation. We show that this difficulty vanishes in the particulate description for an encounterless stellar system as long as stars of different masses are initially well mixed in phase space. Our methods also demonstrate the equivalence between Gibbs's formalism which uses the microcanonical ensemble and Boltzmann's formalism which uses a coarse-grained continuum description. In addition, we clarify the concept of irreversible behavior on a macroscopic scale for an encounterless stellar system. Finally, we comment on the use of unusual macroscopic constraints to simulate the effects of incomplete relaxation

  4. Statistical inference and Aristotle's Rhetoric.

    Science.gov (United States)

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  5. National Statistical Commission and Indian Official Statistics

    Indian Academy of Sciences (India)

    Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS) University of Hyderabad Campus Central University Post Office, Prof. C. R. Rao Road Hyderabad 500 046, AP, India.

  6. The relation between herbivore density and relative resource ...

    African Journals Online (AJOL)

    The relation between kudu density and the relative density of habitat patches in each landscape was significant, with exponential models producing more significant statistics than linear models. Regressions of resource density against animal density are useful to understand 'carrying capacity' for wild herbivores, and ...

  7. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

The fun and easy way to get down to business with statistics. Stymied by statistics? No fear - this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou

  8. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores

  9. Density structures inside the plasmasphere: Cluster observations

    DEFF Research Database (Denmark)

    Darrouzet, F.; Decreau, P.M.E.; De Keyser, J.

    2004-01-01

The electron density profiles derived from the EFW and WHISPER instruments on board the four Cluster spacecraft reveal density structures inside the plasmasphere and at its outer boundary, the plasmapause. We have conducted a statistical study to characterize these density structures. We focus on the plasmasphere crossing on 11 April 2002, during which Cluster observed several density irregularities inside the plasmasphere, as well as a plasmaspheric plume. We derive the density gradient vectors from simultaneous density measurements by the four spacecraft. We also determine the normal velocity of the boundaries of the plume and of the irregularities from the time delays between those boundaries in the four individual density profiles, assuming they are planar. These new observations yield novel insights about the occurrence of density irregularities, their geometry and their dynamics. These in...

  10. 517 DWELLING DENSITY VARIABILITY ACROSS GOVERNMENT ...

    African Journals Online (AJOL)

    Osondu

    confidence level, apartment type had no significant effect on dwelling density in ... words: dwelling density, home spaces, housing units, multifamily apartments ... spaces for work, Obateru (2005) defined .... of Statistics Year Book, 2008; Seeling et al., ... stress. The bedroom and habitable room indicators show similar trend.

  11. Fast response densitometer for measuring liquid density

    Science.gov (United States)

    1972-01-01

    Densitometer was developed which produces linear voltage proportional to changes in density of flowing liquid hydrogen. Unit has fast response time and good system stability, statistical variation, and thermal equilibrium. System accuracy is 2 percent of total density span. Basic design may be altered to include measurement of other flowing materials.

  12. Robust statistical reconstruction for charged particle tomography

    Science.gov (United States)

    Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W

    2013-10-08

    Systems and methods for charged particle detection, including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data. The methods determine the probability distribution of charged particle scattering using a statistical multiple scattering model and determine a substantially maximum likelihood estimate of the object volume scattering density using an expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles or cargo. The method can be implemented using a computer program which is executable on a computer.
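The ML/EM idea can be illustrated on a generic Poisson measurement model. This is a toy sketch of the multiplicative EM update, not the patent's multiple-scattering model; the system matrix and data are made up:

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """Maximum-likelihood EM for a Poisson model y ~ Poisson(A @ x).

    Multiplicative update: x <- x * (A.T @ (y / (A @ x))) / (A.T @ 1),
    which keeps the estimate non-negative at every iteration."""
    x = np.ones(A.shape[1])            # positive starting estimate
    norm = A.T @ np.ones(A.shape[0])   # sensitivity (column sums)
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)  # guard against division by 0
        x *= (A.T @ ratio) / norm
    return x

# Toy "tomography": three detector rays viewing two voxels
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([2.0, 5.0])
y = A @ x_true                          # noise-free data for illustration
x_hat = mlem(A, y)
```

With consistent, noise-free data the iteration converges to the true densities; with Poisson noise it converges to the maximum-likelihood estimate.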

  13. Spectral statistics of chaotic many-body systems

    International Nuclear Information System (INIS)

    Dubertrand, Rémy; Müller, Sebastian

    2016-01-01

    We derive a trace formula that expresses the level density of chaotic many-body systems as a smooth term plus a sum over contributions associated to solutions of the nonlinear Schrödinger (or Gross–Pitaevskii) equation. Our formula applies to bosonic systems with discretised positions, such as the Bose–Hubbard model, in the semiclassical limit as well as in the limit where the number of particles is taken to infinity. We use the trace formula to investigate the spectral statistics of these systems, by studying interference between solutions of the nonlinear Schrödinger equation. We show that in the limits taken the statistics of fully chaotic many-particle systems becomes universal and agrees with predictions from the Wigner–Dyson ensembles of random matrix theory. The conditions for Wigner–Dyson statistics involve a gap in the spectrum of the Frobenius–Perron operator, leaving the possibility of different statistics for systems with weaker chaotic properties. (paper)
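The Wigner–Dyson statistics referred to above can be probed numerically: the nearest-neighbour spacing distribution of a Gaussian Orthogonal Ensemble matrix follows the Wigner surmise. A sketch with a crude unfolding (taking only the middle of the spectrum, where the semicircle density is roughly flat — an approximation, not a rigorous unfolding):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 800
# Gaussian Orthogonal Ensemble: real symmetric matrix with Gaussian entries
M = rng.normal(size=(N, N))
H = (M + M.T) / 2
eig = np.linalg.eigvalsh(H)            # returned in ascending order

# Keep the middle half of the spectrum and normalise the mean spacing to 1
mid = eig[N // 4: 3 * N // 4]
s = np.diff(mid)
s /= s.mean()

# The Wigner surmise P(s) = (pi/2) s exp(-pi s^2 / 4) has mean 1 and
# variance 4/pi - 1 (about 0.27), well below the Poisson value of 1.
spacing_variance = s.var()
```

A Poisson (uncorrelated) spectrum would give variance near 1; the much smaller value here reflects level repulsion.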

  14. Introduction to mathematical statistical physics

    CERN Document Server

    Minlos, R A

    1999-01-01

    This book presents a mathematically rigorous approach to the main ideas and phenomena of statistical physics. The introduction addresses the physical motivation, focussing on the basic concept of modern statistical physics, that is the notion of Gibbsian random fields. Properties of Gibbsian fields are analyzed in two ranges of physical parameters: "regular" (corresponding to high-temperature and low-density regimes) where no phase transition is exhibited, and "singular" (low temperature regimes) where such transitions occur. Next, a detailed approach to the analysis of the phenomena of phase transitions of the first kind, the Pirogov-Sinai theory, is presented. The author discusses this theory in a general way and illustrates it with the example of a lattice gas with three types of particles. The conclusion gives a brief review of recent developments arising from this theory. The volume is written for the beginner, yet advanced students will benefit from it as well. The book will serve nicely as a supplement...

  15. Multivariate density estimation theory, practice, and visualization

    CERN Document Server

    Scott, David W

    2015-01-01

    David W. Scott, PhD, is Noah Harding Professor in the Department of Statistics at Rice University. The author of over 100 published articles, papers, and book chapters, Dr. Scott is also Fellow of the American Statistical Association (ASA) and the Institute of Mathematical Statistics. He is recipient of the ASA Founder's Award and the Army Wilks Award. His research interests include computational statistics, data visualization, and density estimation. Dr. Scott is also Coeditor of Wiley Interdisciplinary Reviews: Computational Statistics and previous Editor of the Journal of Computational and

  16. Statistical correlations in an ideal gas of particles obeying fractional exclusion statistics.

    Science.gov (United States)

    Pellegrino, F M D; Angilella, G G N; March, N H; Pucci, R

    2007-12-01

    After a brief discussion of the concepts of fractional exchange and fractional exclusion statistics, we report partly analytical and partly numerical results on thermodynamic properties of assemblies of particles obeying fractional exclusion statistics. The effect of dimensionality is one focal point, the ratio μ/(k_B T) of chemical potential to thermal energy being obtained numerically as a function of a scaled particle density. Pair correlation functions are also presented as a function of the statistical parameter, with Friedel oscillations developing close to the fermion limit, for sufficiently large density.
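The mean occupation number for fractional exclusion statistics interpolates between the Bose–Einstein and Fermi–Dirac distributions. A sketch using Wu's implicit equation for the Haldane statistics parameter g (solved here by simple bisection; this is a standard textbook form, not a result quoted from the paper):

```python
import math

def occupation(x, g, tol=1e-12):
    """Mean occupation n for exclusion statistics parameter g, with
    x = (eps - mu) / (k_B T), from Wu's equation
        w**g * (1 + w)**(1 - g) = exp(x),   n = 1 / (w + g).
    g = 1 gives fermions, g = 0 gives bosons (x > 0 required there)."""
    target = math.exp(x)
    lo, hi = 1e-15, 1.0
    while hi**g * (1 + hi)**(1 - g) < target:   # grow the bracket
        hi *= 2
    while hi - lo > tol * max(1.0, hi):         # bisect on w > 0
        mid = (lo + hi) / 2
        if mid**g * (1 + mid)**(1 - g) < target:
            lo = mid
        else:
            hi = mid
    w = (lo + hi) / 2
    return 1.0 / (w + g)
```

At x = 1 this reproduces 1/(e + 1) for fermions and 1/(e - 1) for bosons, with intermediate g (e.g. semions, g = 1/2) falling in between.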

  17. Recreational Boating Statistics 2012

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  18. Recreational Boating Statistics 2013

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  19. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  20. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  1. Recreational Boating Statistics 2011

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  2. Uterine Cancer Statistics

    Science.gov (United States)

    ... Doing AMIGAS Stay Informed Cancer Home Uterine Cancer Statistics Language: English (US) Español (Spanish) Recommend on Facebook ... the most commonly diagnosed gynecologic cancer. U.S. Cancer Statistics Data Visualizations Tool The Data Visualizations tool makes ...

  3. Tuberculosis Data and Statistics

    Science.gov (United States)

    ... Advisory Groups Federal TB Task Force Data and Statistics Language: English (US) Español (Spanish) Recommend on Facebook ... Set) Mortality and Morbidity Weekly Reports Data and Statistics Decrease in Reported Tuberculosis Cases MMWR 2010; 59 ( ...

  4. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...

  5. National Transportation Statistics 2008

    Science.gov (United States)

    2009-01-08

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  6. Mental Illness Statistics

    Science.gov (United States)

    ... News & Events About Us Home > Health Information Share Statistics Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics Topics Mental Illness Any Anxiety Disorder ...

  7. School Violence: Data & Statistics

    Science.gov (United States)

    ... Social Media Publications Injury Center School Violence: Data & Statistics Recommend on Facebook Tweet Share Compartir The first ... Vehicle Safety Traumatic Brain Injury Injury Response Data & Statistics (WISQARS) Funded Programs Press Room Social Media Publications ...

  8. Caregiver Statistics: Demographics

    Science.gov (United States)

    ... You are here Home Selected Long-Term Care Statistics Order this publication Printer-friendly version What is ... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...

  9. Aortic Aneurysm Statistics

    Science.gov (United States)

    ... Summary Coverdell Program 2012-2015 State Summaries Data & Statistics Fact Sheets Heart Disease and Stroke Fact Sheets ... Roadmap for State Planning Other Data Resources Other Statistic Resources Grantee Information Cross-Program Information Online Tools ...

  10. Alcohol Facts and Statistics

    Science.gov (United States)

    ... Standard Drink? Drinking Levels Defined Alcohol Facts and Statistics Print version Alcohol Use in the United States: ... 1238–1245, 2004. PMID: 15010446 National Center for Statistics and Analysis. 2014 Crash Data Key Findings (Traffic ...

  11. National Transportation Statistics 2009

    Science.gov (United States)

    2010-01-21

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  12. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  13. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  14. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    .... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  15. Aplicação de métodos estatísticos na otimização da densidade de empacotamento de distribuições de pós de alumina Optimization of the packing density of alumina powder distributions using statistical techniques

    Directory of Open Access Journals (Sweden)

    A. P. Silva

    2004-12-01

    related statistical techniques (software Statistica), the particle size distribution that maximises the packing density was obtained in both cases and, by comparison with theoretical particle size distributions, the validity of Alfred's theoretical model for perfect spheres was demonstrated. These results clearly show that the harmful effect of the non-spherical shape of real particles can, in fact, be compensated by the optimization of the overall particle size distribution.
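Alfred's model referred to above prescribes a cumulative particle size distribution for dense packing. A sketch of that curve; the distribution modulus q ≈ 0.37 and the size limits below are illustrative assumptions, not values from the paper:

```python
def alfred_cpft(d, d_min, d_max, q=0.37):
    """Cumulative percent finer than (CPFT) for the Alfred packing model:
        CPFT = 100 * (D**q - Dmin**q) / (Dmax**q - Dmin**q).
    q ~ 0.37 is the modulus commonly quoted for maximum packing density
    (an assumption here)."""
    return 100.0 * (d**q - d_min**q) / (d_max**q - d_min**q)

# Illustrative 0.1-100 micron alumina distribution
sizes = [0.1, 1.0, 10.0, 100.0]
curve = [alfred_cpft(d, 0.1, 100.0) for d in sizes]
```

The curve runs from 0% at the smallest particle size to 100% at the largest, increasing monotonically in between.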

  16. Bounded Densities and Their Derivatives

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, V.

    2009-01-01

    This paper describes how one can compute interval-valued statistical measures given limited information about the underlying distribution. The particular focus is on a bounded derivative of a probability density function and its combination with other available statistical evidence for computing quantities of interest. To be able to utilise the evidence about the derivative it is suggested to adapt the 'conventional' problem statement to variational calculus and the way to do so is demonstrated. A number of examples are given throughout the paper.

  17. [Pro Familia statistics for 1974].

    Science.gov (United States)

    1975-09-01

    Statistics for 1974 for the West German family planning organization Pro Familia are reported. 56 offices are now operating, and 23,726 clients were seen. Men were seen more frequently than previously. 10,000 telephone calls were also handled. 16-25 year olds were increasingly represented in the clientele, as were unmarried persons of all ages. 1,242 patients were referred to physicians or clinics for clinical diagnosis.

  18. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  19. Interactive statistics with ILLMO

    NARCIS (Netherlands)

    Martens, J.B.O.S.

    2014-01-01

    Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured

  20. Ethics in Statistics

    Science.gov (United States)

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  1. Youth Sports Safety Statistics

    Science.gov (United States)

    ... 6):794-799. 31 American Heart Association. CPR statistics. www.heart.org/HEARTORG/CPRAndECC/WhatisCPR/CPRFactsandStats/CPRpercent20Statistics_ ... Mental Health Services Administration, Center for Behavioral Health Statistics and Quality. (January 10, 2013). The DAWN Report: ...

  2. Probing NWP model deficiencies by statistical postprocessing

    DEFF Research Database (Denmark)

    Rosgaard, Martin Haubjerg; Nielsen, Henrik Aalborg; Nielsen, Torben S.

    2016-01-01

    The objective in this article is twofold. On one hand, a Model Output Statistics (MOS) framework for improved wind speed forecast accuracy is described and evaluated. On the other hand, the approach explored identifies unintuitive explanatory value from a diagnostic variable in an operational ... Based on the statistical model candidates inferred from the data, the lifted index NWP model diagnostic is consistently found among the NWP model predictors of the best performing statistical models across sites.

  3. Statistical test of reproducibility and operator variance in thin-section modal analysis of textures and phenocrysts in the Topopah Spring member, drill hole USW VH-2, Crater Flat, Nye County, Nevada

    International Nuclear Information System (INIS)

    Moore, L.M.; Byers, F.M. Jr.; Broxton, D.E.

    1989-06-01

    A thin-section operator-variance test was given to the two junior authors, petrographers, by the senior author, a statistician, using 16 thin sections cut from core plugs drilled by the US Geological Survey from drill hole USW VH-2 standard (HCQ) drill core. The thin sections are samples of Topopah Spring devitrified rhyolite tuff from four textural zones, in ascending order: (1) lower nonlithophysal, (2) lower lithophysal, (3) middle nonlithophysal, and (4) upper lithophysal. Drill hole USW VH-2 is near the center of Crater Flat, about 6 miles WSW of the Yucca Mountain Exploration Block. The original thin-section labels were opaqued out with removable enamel and renumbered with alpha-numeric labels. The slides were then given to the petrographer operators for quantitative thin-section modal (point-count) analysis of cryptocrystalline, spherulitic, granophyric, and void textures, as well as phenocryst minerals. Between-operator variance was tested by giving the two petrographers the same slide, and within-operator variance was tested by giving the same operator the same slide to count in a second test set, administered at least three months after the first set. Both operators were unaware that they were receiving the same slide to recount. 14 figs., 6 tabs

  4. Basic statistics with Microsoft Excel: a review.

    Science.gov (United States)

    Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto

    2017-06-01

    The scientific world is enriched daily with new knowledge, due to new technologies and continuous discoveries. The mathematical functions explain the statistical concepts, particularly mean, median and mode, along with frequency and frequency distribution, associated with histograms and graphical representations, and determine elaborative processes on the basis of spreadsheet operations. The aim of the study is to highlight the mathematical basis of the statistical models that regulate the operation of spreadsheets in Microsoft Excel.
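The spreadsheet concepts named above (mean, median, mode, frequency distribution) have direct analogues outside Excel; a sketch using Python's standard library, with made-up data:

```python
import statistics
from collections import Counter

data = [2, 3, 3, 5, 7, 8, 3, 9, 5]

mean = statistics.mean(data)      # Excel: =AVERAGE(range)  -> 5
median = statistics.median(data)  # Excel: =MEDIAN(range)   -> 5
mode = statistics.mode(data)      # Excel: =MODE(range)     -> 3

# Frequency distribution (cf. Excel's =FREQUENCY / pivot counts),
# the basis for a histogram of the data
freq = Counter(data)              # {3: 3, 5: 2, 2: 1, 7: 1, 8: 1, 9: 1}
```

The frequency table is exactly what a histogram plots: each distinct value (or bin) against its count.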

  5. East African Journal of Statistics: Editorial Policies

    African Journals Online (AJOL)

    Focus and Scope. EAJOSTA publishes the latest findings in applied and theoretical statistics. The journal also accepts papers in operations research, financial mathematics and actuarial sciences, all considered as part of applied statistics. Articles must deal with original research, which has not been accepted for publication ...

  6. Functional integral approach to classical statistical dynamics

    International Nuclear Information System (INIS)

    Jensen, R.V.

    1980-04-01

    A functional integral method is developed for the statistical solution of nonlinear stochastic differential equations which arise in classical dynamics. The functional integral approach provides a very natural and elegant derivation of the statistical dynamical equations that have been derived using the operator formalism of Martin, Siggia, and Rose

  7. A New Statistical Tool: Scalar Score Function

    Czech Academy of Sciences Publication Activity Database

    Fabián, Zdeněk

    2011-01-01

    Roč. 2, - (2011), s. 109-116 ISSN 1934-7332 R&D Projects: GA ČR GA205/09/1079 Institutional research plan: CEZ:AV0Z10300504 Keywords : statistics * inference function * data characteristics * point estimates * heavy tails Subject RIV: BB - Applied Statistics, Operational Research

  8. Clinical Decision Support: Statistical Hopes and Challenges

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Zvárová, Jana

    2016-01-01

    Roč. 4, č. 1 (2016), s. 30-34 ISSN 1805-8698 Grant - others: Nadační fond na podporu vědy (CZ) Neuron Institutional support: RVO:67985807 Keywords : decision support * data mining * multivariate statistics * psychiatry * information based medicine Subject RIV: BB - Applied Statistics, Operational Research

  9. The statistics of galaxies: beyond correlation functions

    International Nuclear Information System (INIS)

    Lachieze-Rey, M.

    1988-01-01

    I mention some normalization problems encountered when estimating the 2-point correlation functions in samples of galaxies of different average densities. I present some aspects of the void probability function as a statistical indicator, free of such normalization problems. Finally I suggest a new statistical approach to give a synthetic account of those aspects of the galaxy distribution that a conventional method is unable to characterize
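The void probability function can be estimated by Monte Carlo: place random test spheres and count the fraction containing no galaxies. A sketch on a synthetic unclustered point set in the unit cube (edge effects are ignored in this illustration):

```python
import numpy as np

def void_probability(points, r, n_probe=2000, rng=None):
    """Monte Carlo estimate of the void probability function P0(r):
    the chance that a randomly placed sphere of radius r in the unit
    cube contains no points. Boundary effects are not corrected here."""
    rng = rng if rng is not None else np.random.default_rng(0)
    centres = rng.random((n_probe, 3))
    # squared distance from every probe centre to every point
    d2 = ((centres[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    nearest = np.sqrt(d2.min(axis=1))      # nearest-point distance per probe
    return (nearest > r).mean()

rng = np.random.default_rng(1)
pts = rng.random((200, 3))                 # a Poisson (unclustered) sample
p0 = void_probability(pts, r=0.05)
```

For a Poisson sample, P0(r) = exp(-n V(r)); clustering raises P0 above that baseline, which is what makes it a useful indicator.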

  10. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  11. Statistics of spatially integrated speckle intensity difference

    DEFF Research Database (Denmark)

    Hanson, Steen Grüner; Yura, Harold

    2009-01-01

    We consider the statistics of the spatially integrated speckle intensity difference obtained from two separated finite collecting apertures. For fully developed speckle, closed-form analytic solutions for both the probability density function and the cumulative distribution function are derived here for both arbitrary values of the mean number of speckles contained within an aperture and the degree of coherence of the optical field. Additionally, closed-form expressions are obtained for the corresponding nth statistical moments.
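The difference statistics can be explored by simulation. A common model for spatially integrated speckle intensity is a gamma variate whose shape parameter is the mean number of speckles in the aperture; that model is an assumption used here for illustration, not the paper's closed-form result:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 5.0          # mean number of speckles within each aperture (assumed)
mean_I = 1.0     # mean integrated intensity
n = 200_000

# Integrated intensity from each aperture: gamma with shape M, mean mean_I
I1 = rng.gamma(shape=M, scale=mean_I / M, size=n)
I2 = rng.gamma(shape=M, scale=mean_I / M, size=n)
delta = I1 - I2                   # intensity difference: zero-mean, symmetric

var_pred = 2 * mean_I**2 / M      # variance of the difference of two
                                  # independent gamma variates
```

The simulated difference has zero mean by symmetry and variance 2⟨I⟩²/M, shrinking as more speckles are averaged within each aperture.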

  12. Tobacco Products Production and Operations Reports

    Data.gov (United States)

    Department of the Treasury — Monthly statistical reports on tobacco products production and operations. Data for Tobacco Statistical Release is derived directly from the Report – Manufacturer of...

  13. Statistics for Research

    CERN Document Server

    Dowdy, Shirley; Chilko, Daniel

    2011-01-01

    Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f

  14. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  15. Statistics & probaility for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  16. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  17. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  18. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  19. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

  20. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  1. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever increasing ocean of information, everybody will agree to that. We build sophisticated strategies to govern this information: design data models, develop infrastructures for data sharing, building tool for data analysis. Statistical datasets curated by National

  2. Deep convolutional neural network for mammographic density segmentation

    Science.gov (United States)

    Wei, Jun; Li, Songfeng; Chan, Heang-Ping; Helvie, Mark A.; Roubidoux, Marilyn A.; Lu, Yao; Zhou, Chuan; Hadjiiski, Lubomir; Samala, Ravi K.

    2018-02-01

    Breast density is one of the most significant factors for cancer risk. In this study, we proposed a supervised deep learning approach for automated estimation of percentage density (PD) on digital mammography (DM). The deep convolutional neural network (DCNN) was trained to estimate a probability map of breast density (PMD). PD was calculated as the ratio of the dense area to the breast area based on the probability of each pixel belonging to dense region or fatty region at a decision threshold of 0.5. The DCNN estimate was compared to a feature-based statistical learning approach, in which gray level, texture and morphological features were extracted from each ROI and the least absolute shrinkage and selection operator (LASSO) was used to select and combine the useful features to generate the PMD. The reference PD of each image was provided by two experienced MQSA radiologists. With IRB approval, we retrospectively collected 347 DMs from patient files at our institution. The 10-fold cross-validation results showed a strong correlation r=0.96 between the DCNN estimation and interactive segmentation by radiologists while that of the feature-based statistical learning approach vs radiologists' segmentation had a correlation r=0.78. The difference between the segmentation by DCNN and by radiologists was significantly smaller than that between the feature-based learning approach and radiologists (p < 0.05). The DCNN approach has the potential to replace radiologists' interactive thresholding in PD estimation on DMs.
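The PD definition quoted above (dense pixels over breast pixels at a 0.5 decision threshold on the probability map) can be sketched directly; the probability map and breast mask below are tiny synthetic stand-ins:

```python
import numpy as np

def percent_density(pmd, breast_mask, threshold=0.5):
    """Percentage density from a probability map of density (PMD):
    dense pixels are those inside the breast whose probability of being
    dense meets the decision threshold (0.5 in the study)."""
    dense = (pmd >= threshold) & breast_mask
    return 100.0 * dense.sum() / breast_mask.sum()

# Synthetic 4x4 example: 12 breast pixels, 3 of them dense
pmd = np.array([[0.9, 0.8, 0.2, 0.0],
                [0.7, 0.3, 0.1, 0.0],
                [0.2, 0.1, 0.1, 0.0],
                [0.1, 0.1, 0.1, 0.0]])
mask = pmd > 0.0          # treat the zero column as background
pd = percent_density(pmd, mask)   # 3 dense pixels of 12 -> 25.0 %
```

In practice the mask would come from breast segmentation and the map from the trained network; the ratio itself is this one-liner.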

  3. What Exactly is a Parton Density ?

    International Nuclear Information System (INIS)

    Collins, J.C.

    2003-01-01

    I give an account of the definitions of parton densities, both the conventional ones, integrated over parton transverse momentum, and unintegrated transverse-momentum-dependent densities. The aim is to get a precise and correct definition of a parton density as the target expectation value of a suitable quantum mechanical operator, so that a clear connection to non-perturbative QCD is provided. Starting from the intuitive ideas in the parton model that predate QCD, we will see how the simplest operator definitions suffer from divergences. Corrections to the definition are needed to eliminate the divergences. An improved definition of unintegrated parton densities is proposed. (author)
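    As a concrete illustration of "parton density as a target expectation value of an operator": the conventional integrated quark density is often written, up to conventions for normalization, gauge links, and the divergence-removing corrections that are the subject of this record, as

    \[
      f_{q/p}(x) \;=\; \int \frac{dw^-}{4\pi}\, e^{-i x p^+ w^-}\,
      \langle p \,|\, \bar{\psi}(0, w^-, \mathbf{0}_T)\, \gamma^+\, \psi(0) \,|\, p \rangle ,
    \]

    where the divergences of this simplest operator definition are what motivate the improved definitions the paper proposes.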

  4. Density dependent hadron field theory

    International Nuclear Information System (INIS)

    Fuchs, C.; Lenske, H.; Wolter, H.H.

    1995-01-01

    A fully covariant approach to a density dependent hadron field theory is presented. The relation between in-medium NN interactions and field-theoretical meson-nucleon vertices is discussed. The medium dependence of nuclear interactions is described by a functional dependence of the meson-nucleon vertices on the baryon field operators. As a consequence, the Euler-Lagrange equations lead to baryon rearrangement self-energies which are not obtained when only a parametric dependence of the vertices on the density is assumed. It is shown that the approach is energy-momentum conserving and thermodynamically consistent. Solutions of the field equations are studied in the mean-field approximation. Descriptions of the medium dependence in terms of the baryon scalar and vector density are investigated. Applications to infinite nuclear matter and finite nuclei are discussed. Density dependent coupling constants obtained from Dirac-Brueckner calculations with the Bonn NN potentials are used. Results from Hartree calculations for energy spectra, binding energies, and charge density distributions of 16O, 40,48Ca, and 208Pb are presented. Comparisons to data strongly support the importance of rearrangement in a relativistic density dependent field theory. Most striking is the simultaneous improvement of charge radii, charge densities, and binding energies. The results indicate the appearance of a new ''Coester line'' in the nuclear matter equation of state.
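    Schematically, and only as a hedged sketch (signs, index structure, and the precise density variable depend on the chosen parametrization): if the vertices Γ_σ and Γ_ω are functionals of the baryon density operator ρ̂ rather than of a c-number density, varying the Lagrangian produces, beyond the usual self-energies, a rearrangement contribution of the form

    \[
      \Sigma^{(r)} \;\sim\; \frac{\partial \Gamma_\sigma}{\partial \hat\rho}\,
      \bar{\psi}\psi\,\sigma \;+\; \frac{\partial \Gamma_\omega}{\partial \hat\rho}\,
      \bar{\psi}\gamma^\nu\psi\,\omega_\nu ,
    \]

    which vanishes identically when the vertices depend on the density only parametrically, consistent with the statement in the abstract.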

  5. Level densities in nuclear physics

    International Nuclear Information System (INIS)

    Beckerman, M.

    1978-01-01

    In the independent-particle model nucleons move independently in a central potential. There is a well-defined set of single-particle orbitals, each nucleon occupies one of these orbitals subject to Fermi statistics, and the total energy of the nucleus is equal to the sum of the energies of the individual nucleons. The basic question is the range of validity of this Fermi gas description and, in particular, the roles of the residual interactions and collective modes. A detailed examination of experimental level densities in light-mass systems is given to provide some insight into these questions. Level densities over the first 10 MeV or so in excitation energy, as deduced from neutron and proton resonance data and from spectra of low-lying bound levels, are discussed. To exhibit some of the salient features of these data, comparisons to independent-particle (shell) model calculations are presented. Shell structure is predicted to manifest itself through discontinuities in the single-particle level density at the Fermi energy and through variations in the occupancy of the valence orbitals. These predictions are examined through combinatorial calculations performed with the Grover [Phys. Rev. 157, 832 (1967); 185, 1303 (1969)] odometer method. Before the discussion of the experimental results, statistical mechanical level densities for spherical nuclei are reviewed. After consideration of deformed nuclei, the conclusions resulting from this work are drawn. 7 figures, 3 tables
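    The statistical mechanical level densities reviewed in such work are typically of the Bethe Fermi-gas form; for a single-component system, neglecting the spin cutoff, the textbook expression (stated here for orientation, not taken from this report) is

    \[
      \rho(E) \;=\; \frac{\sqrt{\pi}}{12}\,
      \frac{e^{2\sqrt{aE}}}{a^{1/4}\, E^{5/4}} ,
      \qquad
      a = \frac{\pi^2}{6}\, g(\varepsilon_F) ,
    \]

    where g(ε_F) is the single-particle level density at the Fermi energy, so the exponential growth with √E is controlled by the level-density parameter a.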

  6. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

    This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.

  7. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  8. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  9. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly used statistical methods in economics. Using examples based on contemporary economic issues and readily available data, it not only explains the mechanics of the various methods but also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  10. Mineral industry statistics 1975

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite and other minerals quarried in France in 1975. Accident statistics are also included. Production statistics are presented for the Overseas Departments and Territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)

  11. Lectures on statistical mechanics

    CERN Document Server

    Bowler, M G

    1982-01-01

    Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent

  12. Introduction to Statistics

    Directory of Open Access Journals (Sweden)

    Mirjam Nielen

    2017-01-01

    Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk gives an overview of basic statistical principles and focuses on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.

  13. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

    The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

  14. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  15. Statistics in a Nutshell

    CERN Document Server

    Boslaugh, Sarah

    2008-01-01

    Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat

  16. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  17. Annual Statistical Supplement, 2002

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  18. Annual Statistical Supplement, 2010

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  19. Annual Statistical Supplement, 2007

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  20. Annual Statistical Supplement, 2001

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...