WorldWideScience

Sample records for statistical analysis electronic

  1. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    International Nuclear Information System (INIS)

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.; Sales, Brian C.; Sefat, Athena S.

    2014-01-01

    Atomic-level spatial variability of the electronic structure in the Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near-neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of calculated densities of states for chemically inhomogeneous FeTe1-xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of scanning tunneling microscopy data, including separation of atomic identities, proximity, and local configuration effects, and is universally applicable to chemically and electronically inhomogeneous surfaces.
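
    The workflow described above (multivariate decomposition of a spectroscopic image cube followed by clustering) can be illustrated with a short, hedged sketch; the data cube, component count, and cluster count below are hypothetical stand-ins, not the authors' parameters.

      # Illustrative sketch only: PCA + k-means on a synthetic CITS-like cube.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      cube = rng.normal(size=(64, 64, 128))       # hypothetical (x, y, energy) data cube
      spectra = cube.reshape(-1, cube.shape[-1])  # one tunneling spectrum per pixel

      scores = PCA(n_components=4).fit_transform(spectra)  # multivariate decomposition
      labels = KMeans(n_clusters=3, n_init=10).fit_predict(scores)
      cluster_map = labels.reshape(cube.shape[:2])  # spatial map of electronic behavior
      print(np.bincount(labels))                    # pixels per cluster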

  2. Multivariate statistical analysis of electron energy-loss spectroscopy in anisotropic materials

    International Nuclear Information System (INIS)

    Hu Xuerang; Sun Yuekui; Yuan Jun

    2008-01-01

    Recently, an expression has been developed to take into account the complex dependence of the fine structure in core-level electron energy-loss spectroscopy (EELS) of anisotropic materials on specimen orientation and spectral collection conditions [Y. Sun, J. Yuan, Phys. Rev. B 71 (2005) 125109]. One application of this expression is the development of a phenomenological theory of magic-angle electron energy-loss spectroscopy (MAEELS), which can be used to extract the isotropically averaged spectral information for materials with arbitrary anisotropy. Here we use this expression to extract not only the isotropically averaged spectral information, but also the anisotropic spectral components, without the restriction of MAEELS. The application is based on a multivariate statistical analysis of core-level EELS for anisotropic materials. To demonstrate the applicability of this approach, we have conducted a study on a set of carbon K-edge spectra of a multi-wall carbon nanotube (MWCNT) acquired with the energy-loss spectroscopic profiling (ELSP) technique, and successfully extracted both the averaged and dichroic spectral components of the wrapped graphite-like sheets. Our result shows that this can be a practical alternative to MAEELS for the study of the electronic structure of anisotropic materials, in particular for nanostructures made of layered materials.

  3. Thermodynamics, Gibbs Method and Statistical Physics of Electron Gases

    CERN Document Server

    Askerov, Bahram M

    2010-01-01

    This book deals with theoretical thermodynamics and the statistical physics of electron and particle gases. While treating the laws of thermodynamics from both classical and quantum theoretical viewpoints, it posits that the basis of the statistical theory of the macroscopic properties of a system is the microcanonical distribution of isolated systems, from which all canonical distributions stem. To calculate the free energy, the Gibbs method is applied to ideal and non-ideal gases, and also to a crystalline solid. Considerable attention is paid to the Fermi-Dirac and Bose-Einstein quantum statistics and their application to different quantum gases, and the electron gas in both metals and semiconductors is considered in a nonequilibrium state. A separate chapter treats the statistical theory of the thermodynamic properties of an electron gas in a quantizing magnetic field.

  4. Ion induced electron emission statistics under Agm- cluster bombardment of Ag

    Science.gov (United States)

    Breuers, A.; Penning, R.; Wucher, A.

    2018-05-01

    The electron emission from a polycrystalline silver surface under bombardment with Agm- cluster ions (m = 1, 2, 3) is investigated in terms of ion-induced kinetic excitation. The electron yield γ is determined directly by a current measurement method on the one hand, and implicitly by analysis of the electron emission statistics on the other. Successful measurements of the electron emission spectra provide a deeper understanding of the ion-induced kinetic electron emission process, with particular emphasis on the effect of the projectile cluster size on the yield as well as on the emission statistics. The results allow a quantitative comparison with computer simulations performed for silver atoms and clusters impinging onto a silver surface.

  5. Statistical analysis of the electronic crosstalk correction in Terra MODIS Band 27

    Science.gov (United States)

    Madhavan, Sriharsha; Sun, Junqiang; Xiong, Xiaoxiong; Wenny, Brian N.; Wu, Aisheng

    2014-10-01

    The first MODerate-resolution Imaging Spectroradiometer (MODIS), also known as the Proto-Flight Model (PFM), is on board the Terra spacecraft and has completed 14 years of on-orbit flight as of December 18, 2013. MODIS remotely senses the Earth in 36 spectral bands with a wavelength range from 0.4 μm to 14.4 μm. The 36 bands can be subdivided into two groups based on their spectral responsivity: Reflective Solar Bands (RSBs) and Thermal Emissive Bands (TEBs). Band 27, centered at 6.77 μm, is a TEB used to study the global water vapor distribution. It was found recently that this band has been severely affected by electronic crosstalk. The electronic crosstalk magnitude, its on-orbit change, and its calibration impact have been well characterized in our previous studies through the use of regularly scheduled lunar observations. Further, the crosstalk correction was implemented in Earth view (EV) images and the resulting improvements were quantified. However, improvements remained desirable on several fronts. Firstly, the effectiveness of the correction needed to be analyzed spatially and radiometrically over a number of scenes. Also, the temporal aspect of the correction had to be investigated in a rigorous manner. In order to address these issues, a one-orbit analysis was performed on the Level 1A (L1A) scene granules over a ten-year period from 2003 through 2012. Results have been quantified statistically and show a significant reduction of image striping, as well as removal of leaked signal features from the neighboring bands. Statistical analysis was performed by analyzing histograms of the one-orbit granules at the scene and detector level before and after correction. The comprehensive analysis and results reported in this paper will be very helpful to the scientific community in understanding the impacts of crosstalk correction on various scenes and could potentially be applied for future improvements of band 27 calibration and, therefore, its water vapor retrieval.
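
    As a rough illustration of the histogram-based striping check described above, the hedged sketch below builds per-detector histograms before and after a correction; the granule dimensions, detector count, and striping model are invented stand-ins, not MODIS L1A specifics.

      # Illustrative sketch only: compare per-detector histograms of a granule
      # before and after a (hypothetical) crosstalk correction.
      import numpy as np

      def detector_histograms(granule, n_detectors=10, bins=128):
          # rows cycle through detectors: detector d owns rows d, d+10, d+20, ...
          return np.array([np.histogram(granule[d::n_detectors],
                                        bins=bins, range=(0.0, 1.1))[0]
                           for d in range(n_detectors)], dtype=float)

      rng = np.random.default_rng(1)
      truth = rng.random((1000, 1354))                        # stand-in radiances
      offsets = np.linspace(0.0, 0.05, 10)[np.arange(1000) % 10][:, None]
      before = truth + offsets                                # striped granule
      after = truth                                           # "corrected" granule

      def spread(h):
          return h.std(axis=0).sum()   # detector-to-detector histogram spread

      print(spread(detector_histograms(before)), spread(detector_histograms(after)))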

  6. A Statistical Analysis of Langmuir Wave-Electron Correlations Observed by the CHARM II Auroral Sounding Rocket

    Science.gov (United States)

    Dombrowski, M. P.; Labelle, J. W.; Kletzing, C.; Bounds, S. R.; Kaeppler, S. R.

    2014-12-01

    Langmuir-mode electron plasma waves are frequently observed by spacecraft in active plasma environments such as the ionosphere. Ionospheric Langmuir waves may be excited by the bump-on-tail instability generated by impinging beams of electrons traveling parallel to the background magnetic field (B). The Correlation of High-frequencies and Auroral Roar Measurement (CHARM II) sounding rocket was launched into a substorm at 9:49 UT on 17 February 2010, from the Poker Flat Research Range in Alaska. The primary instruments included the University of Iowa Wave-Particle Correlator (WPC), the Dartmouth High-Frequency Experiment (HFE), several charged particle detectors, low-frequency wave instruments, and a magnetometer. The HFE is a receiver system which effectively yields continuous (100% duty cycle) electric-field waveform measurements from 100 kHz to 5 MHz, and which had its detection axis aligned nominally parallel to B. The HFE output was fed on-payload to the WPC, which uses a phase-locked loop to track the incoming wave frequency with the most power, then sorts incoming electrons at eight energy levels into sixteen wave-phase bins. CHARM II encountered several regions of strong Langmuir wave activity throughout its 15-minute flight, and the WPC showed wave-lock and statistically significant particle correlation distributions during several time periods. We show results of an in-depth analysis of the CHARM II WPC data for the entire flight, including statistical analysis of correlations, which show evidence of direct interaction with the Langmuir waves, indicating (at various times) trapping of particles and both driving and damping of Langmuir waves by particles. In particular, the sign of the gradient in particle flux appears to correlate with the phase relation between the electrons and the wave field, with possible implications for the wave physics.
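
    To make the wave-phase binning concrete, here is a hedged sketch of the correlator idea; the wave frequency and arrival times below are synthetic assumptions, not CHARM II flight data (only the sixteen-bin choice comes from the abstract).

      # Sketch under stated assumptions: sort electron arrival times into
      # sixteen wave-phase bins and test the binned counts for phase bunching.
      import numpy as np
      from scipy.stats import chisquare

      rng = np.random.default_rng(2)
      f_wave = 5.0e6                                # assumed Langmuir frequency [Hz]
      t = np.sort(rng.uniform(0.0, 1e-3, 5000))     # synthetic electron arrivals [s]
      phase = (2 * np.pi * f_wave * t) % (2 * np.pi)

      counts, _ = np.histogram(phase, bins=16, range=(0.0, 2 * np.pi))
      stat, p = chisquare(counts)   # uniform phases -> large p; bunching -> small p
      print(counts.min(), counts.max(), round(p, 3))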

  7. Statistical analysis of oxides particles in ODS ferritic steel using advanced electron microscopy

    International Nuclear Information System (INIS)

    Unifantowicz, P.; Schäublin, R.; Hébert, C.; Płociński, T.; Lucas, G.; Baluc, N.

    2012-01-01

    In this work a combination of advanced transmission electron microscopy and spectroscopy techniques enabled a statistically significant analysis of various types of oxide particles a few nanometers in size in an Fe–14Cr–2W–0.3Ti–0.3Y2O3 ferritic steel. These methods include scanning TEM with EDS, and EFTEM coupled with EELS. In addition, principal component analysis was applied to the chemical maps obtained by EFTEM, which drastically improved the signal-to-noise ratio. Three types of particles were identified in a size range from 2 to 300 nm, namely Cr–Ti–O, Y–O, and Y–Ti–O particles, with average sizes of 33, 16, and 8 nm, respectively. The Cr–Ti–O particles contain Y- and Ti-enriched zones, which were not observed previously. The EFTEM analysis showed that the titanium addition leads to the formation of Y–Ti–O nano-particles, which constitute 84% of the oxides, but also to the precipitation of larger Cr–Ti–O particles. The presence of a small amount of Y–O particles indicated an insufficient amount of Ti available for reaction during mechanical alloying or consolidation.
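
    The PCA-based noise reduction mentioned above can be sketched as follows; the map stack and component count are synthetic assumptions rather than the study's EFTEM data.

      # Minimal sketch: reconstruct a stack of noisy elemental maps from its
      # first few principal components to suppress uncorrelated noise.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(3)
      maps = rng.poisson(5.0, size=(20, 128, 128)).astype(float)  # hypothetical maps
      X = maps.reshape(len(maps), -1)                             # one map per row

      pca = PCA(n_components=3).fit(X)
      denoised = pca.inverse_transform(pca.transform(X)).reshape(maps.shape)
      print(pca.explained_variance_ratio_.round(3))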

  8. A Statistical Study of EISCAT Electron and Ion Temperature Measurements in the E-region

    Science.gov (United States)

    Hussey, G.; Haldoupis, C.; Schlegel, K.; Bösinger, T.

    Motivated by the large EISCAT data base, which covers over 15 years of common programme operation, and previous statistical work with EISCAT data (e.g., C. Haldoupis, K. Schlegel, and G. Hussey, Auroral E-region electron density gradients measured with EISCAT, Ann. Geophysicae, 18, 1172-1181, 2000), a detailed statistical analysis of electron and ion EISCAT temperature measurements has been undertaken. This study was specifically concerned with the statistical dependence of heating events on other ambient parameters such as the electric field and electron density. The results showed previously reported dependences, such as the electron temperature being directly correlated with the ambient electric field and inversely related to the electron density. However, these correlations were found to be also dependent upon altitude. There was also evidence of the so-called "Schlegel effect" (K. Schlegel, Reduced effective recombination coefficient in the disturbed polar E-region, J. Atmos. Terr. Phys., 44, 183-185, 1982); that is, the heated electron gas leads to increases in electron density through a reduction in the recombination rate. This paper will present the statistical heating results and attempt to offer physical explanations and interpretations of the findings.

  9. Statistics of Electron Avalanches and Streamers

    Directory of Open Access Journals (Sweden)

    T. Ficker

    2007-01-01

    Full Text Available We have studied the severe systematic deviations of populations of electron avalanches from the Furry distribution, which has been held to be the statistical law corresponding to them, and a possible explanation has been sought. A new theoretical concept based on fractal avalanche multiplication has been proposed and is shown to be a convenient candidate for explaining these deviations from Furry statistics.
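
    For orientation, the sketch below evaluates the textbook form of the Furry (geometric) avalanche-size law, which is the reference distribution the paper tests against; this is not the authors' fractal model, and the mean avalanche size chosen is arbitrary.

      # Furry law for avalanche size n with mean nbar (standard textbook form):
      #   P(n) = (1/nbar) * (1 - 1/nbar)**(n - 1),  n = 1, 2, 3, ...
      import numpy as np

      nbar = 50.0                       # assumed mean avalanche size
      n = np.arange(1, 2000)
      furry = (1.0 / nbar) * (1.0 - 1.0 / nbar) ** (n - 1)
      print(furry.sum().round(6), (n * furry).sum().round(2))  # ~1 and ~nbar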

  10. Nonequilibrium statistical operator in hot-electron transport theory

    International Nuclear Information System (INIS)

    Xing, D.Y.; Liu, M.

    1991-09-01

    The Nonequilibrium Statistical Operator method developed by Zubarev is generalized and applied to the study of hot-electron transport in semiconductors. The steady-state balance equations for momentum and energy are derived to the lowest order in the electron-lattice coupling. We show that the derived balance equations are exactly the same as those obtained by Lei and Ting. This equivalence stems from the fact that, to linear order in the electron-lattice coupling, the two statistical density matrices have identical effects when they are used to calculate the average value of a dynamical operator. The application to steady-state and transient hot-electron transport in multivalley semiconductors is also discussed.

  11. Statistics Education Research in Malaysia and the Philippines: A Comparative Analysis

    Science.gov (United States)

    Reston, Enriqueta; Krishnan, Saras; Idris, Noraini

    2014-01-01

    This paper presents a comparative analysis of statistics education research in Malaysia and the Philippines by modes of dissemination, research areas, and trends. An electronic search for published research papers in the area of statistics education from 2000-2012 yielded 20 for Malaysia and 19 for the Philippines. Analysis of these papers showed…

  12. Statistics of excitations in the electron glass model

    Science.gov (United States)

    Palassini, Matteo

    2011-03-01

    We study the statistics of elementary excitations in the classical electron glass model of localized electrons interacting via the unscreened Coulomb interaction in the presence of disorder. We reconsider the long-standing puzzle of the exponential suppression of the single-particle density of states near the Fermi level, by measuring accurately the density of states of charged and electron-hole pair excitations via finite temperature Monte Carlo simulation and zero-temperature relaxation. We also investigate the statistics of large charge rearrangements after a perturbation of the system, which may shed some light on the slow relaxation and glassy phenomena recently observed in a variety of Anderson insulators. In collaboration with Martin Goethe.

  13. Electronic Repository of Russian Historical Statistics

    NARCIS (Netherlands)

    Tykhonov, Vyacheslav; Kessler, Gijs; Markevich, Andrei; de Vries, Jerry

    2014-01-01

    The Electronic Repository for Russian Historical Statistics brings together data extracted from various published and unpublished sources in one place. Its principal focus is Russian economic and social history of the last three centuries (18th-21st). The repository caters to the needs of the research community.

  14. Introductory statistical mechanics for electron storage rings

    International Nuclear Information System (INIS)

    Jowett, J.M.

    1986-07-01

    These lectures introduce the beam dynamics of electron-positron storage rings, with particular emphasis on the effects due to synchrotron radiation. They differ from most other introductions in their systematic use of the physical principles and mathematical techniques of the non-equilibrium statistical mechanics of fluctuating dynamical systems. A self-contained exposition of the necessary topics from this field is included. Throughout the development, a Hamiltonian description of the effects of the externally applied fields is maintained, in order to preserve the links with other lectures on beam dynamics and to show clearly the extent to which electron dynamics is non-Hamiltonian. The statistical mechanical framework is extended to a discussion of the conceptual foundations of the treatment of collective effects through the Vlasov equation.

  15. Statistical analysis of suprathermal electron drivers at 67P/Churyumov-Gerasimenko

    Science.gov (United States)

    Broiles, Thomas W.; Burch, J. L.; Chae, K.; Clark, G.; Cravens, T. E.; Eriksson, A.; Fuselier, S. A.; Frahm, R. A.; Gasc, S.; Goldstein, R.; Henri, P.; Koenders, C.; Livadiotis, G.; Mandt, K. E.; Mokashi, P.; Nemeth, Z.; Odelstad, E.; Rubin, M.; Samara, M.

    2016-11-01

    We use observations from the Ion and Electron Sensor (IES) on board the Rosetta spacecraft to study the relationship between the cometary suprathermal electrons and the drivers that affect their density and temperature. We fit the IES electron observations with the summation of two kappa distributions, which we characterize as a dense and warm population (~10 cm^-3 and ~16 eV) and a rarefied and hot population (~0.01 cm^-3 and ~43 eV). The parameters of our fitting technique determine the populations' density, temperature, and invariant kappa index. We focus our analysis on the warm population to determine its origin by comparing the density and temperature with the neutral density and magnetic field strength. We find that the warm electron population is actually two separate sub-populations: electron distributions with temperatures above 8.6 eV and electron distributions with temperatures below 8.6 eV. The two sub-populations have different relationships between their density and temperature. Moreover, the two sub-populations are affected by different drivers. The hotter sub-population temperature is strongly correlated with neutral density, while the cooler sub-population is unaffected by neutral density and is only weakly correlated with magnetic field strength. We suggest that the population with temperatures above 8.6 eV is being heated by lower hybrid waves driven by counterstreaming solar wind protons and newly formed cometary ions created in localized, dense neutral streams. To the best of our knowledge, this represents the first observations of cometary electrons heated through wave-particle interactions.
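
    A hedged sketch of the two-population fit described above: the kappa form below is a simplified 1-D shape (the full kappa distribution carries additional normalization factors), and the energy grid, noise model, and starting values are invented, not IES calibration details.

      # Sketch under stated assumptions: fit the sum of two kappa-like
      # populations to a synthetic electron energy spectrum.
      import numpy as np
      from scipy.optimize import curve_fit

      def kappa(E, n, T, k):
          # simplified kappa-type shape: amplitude n, temperature T, index k
          return n * (1.0 + E / (k * T)) ** (-(k + 1.0))

      def two_kappa(E, n1, T1, k1, n2, T2, k2):
          return kappa(E, n1, T1, k1) + kappa(E, n2, T2, k2)

      E = np.geomspace(1.0, 200.0, 60)                        # energy grid [eV]
      truth = two_kappa(E, 10.0, 16.0, 3.0, 0.01, 43.0, 4.0)  # warm + hot
      rng = np.random.default_rng(4)
      data = truth * rng.lognormal(0.0, 0.05, E.size)

      p0 = [5.0, 10.0, 2.5, 0.05, 30.0, 3.5]
      popt, _ = curve_fit(two_kappa, E, data, p0=p0,
                          bounds=(1e-6, np.inf), max_nfev=20000)
      print(popt.round(3))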

  16. Statistical Exploration of Electronic Structure of Molecules from Quantum Monte-Carlo Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Prabhat, Mr; Zubarev, Dmitry; Lester, Jr., William A.

    2010-12-22

    In this report, we present results from analysis of Quantum Monte Carlo (QMC) simulation data with the goal of determining the internal structure of the 3N-dimensional phase space of an N-electron molecule. We are interested in mining the simulation data for patterns that might be indicative of bond rearrangement as molecules change electronic states. We examined simulation output that tracks the positions of two coupled electrons in the singlet and triplet states of an H2 molecule. The electrons trace out a trajectory, which was analyzed with a number of statistical techniques. This project was intended to address the following scientific questions: (1) Do high-dimensional phase spaces characterizing the electronic structure of molecules tend to cluster in any natural way? Do we see a change in clustering patterns as we explore different electronic states of the same molecule? (2) Since it is hard to understand the high-dimensional space of trajectories, can we project these trajectories to a lower-dimensional subspace to gain a better understanding of patterns? (3) Do trajectories inherently lie in a lower-dimensional manifold? Can we recover that manifold? After extensive statistical analysis, we are now in a better position to respond to these questions. (1) We definitely see clustering patterns, and differences between the H2 and H2tri datasets. These are revealed by the pamk method in a fairly reliable manner and can potentially be used to distinguish bonded and non-bonded systems and get insight into the nature of bonding. (2) Projecting to a lower-dimensional subspace (~4-5) using PCA or Kernel PCA reveals interesting patterns in the distribution of scalar values, which can be related to the existing descriptors of electronic structure of molecules. Also, these results can be immediately used to develop robust tools for analysis of noisy data obtained during QMC simulations. (3) All dimensionality reduction and estimation techniques that we tried seem to…

  17. Statistical data analysis using SAS: intermediate statistical methods

    CERN Document Server

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce the theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end-of-chapter exercises, downloadable SAS code and data sets, and advanced material suitable…

  18. Statistical Analysis of the Exchange Rate of Bitcoin.

    Science.gov (United States)

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate.
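
    The model-comparison idea in this abstract can be sketched in a few lines; the data below are synthetic log-returns and only three candidate distributions are shown (SciPy's genhyperbolic could be ranked the same way), so this illustrates the procedure rather than reproducing the study.

      # Minimal sketch: fit candidate distributions to log-returns and rank
      # them by maximized log-likelihood.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      log_returns = stats.t.rvs(df=3, scale=0.02, size=2000, random_state=rng)

      for dist in (stats.norm, stats.t, stats.laplace):
          params = dist.fit(log_returns)          # maximum-likelihood fit
          ll = dist.logpdf(log_returns, *params).sum()
          print(f"{dist.name:8s} log-likelihood = {ll:.1f}")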

  19. Statistical Feature Extraction and Recognition of Beverages Using Electronic Tongue

    Directory of Open Access Journals (Sweden)

    P. C. PANCHARIYA

    2010-01-01

    Full Text Available This paper describes an approach for the extraction of features from data generated by an electronic tongue based on large-amplitude pulse voltammetry. In this approach, statistical features of the meaningful selected variables from current response signals are extracted and used for recognition of beverage samples. The proposed feature extraction approach not only reduces the computational complexity but also reduces the computation time and the data storage requirements for the development of an E-tongue for field applications. With the reduced information, a probabilistic neural network (PNN) was trained for qualitative analysis of different beverages. Before the qualitative analysis of the beverages, the methodology was tested on the basic artificial taste solutions, i.e., sweet, sour, salt, bitter, and umami. The proposed procedure was compared with the more conventional and linear feature extraction technique employing principal component analysis combined with PNN. Using the extracted feature vectors, highly correct classification by PNN was achieved for eight types of juices and six types of soft drinks. The results indicated that the electronic tongue based on large-amplitude pulse voltammetry with reduced features was capable of discriminating not only basic artificial taste solutions but also various sorts of the same type of natural beverages (fruit juices, vegetable juices, soft drinks, etc.).
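
    A hedged sketch of the feature-extraction step: the voltammograms and class labels below are synthetic, and a nearest-neighbour classifier stands in for the probabilistic neural network, which has no standard scikit-learn implementation.

      # Illustrative sketch: simple statistical features from current-response
      # signals, followed by a stand-in classifier.
      import numpy as np
      from scipy import stats
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(6)
      signals = rng.random((40, 1000))      # 40 synthetic voltammetric responses
      labels = rng.integers(0, 4, 40)       # 4 hypothetical beverage classes

      def features(sig):
          # mean, spread, and shape statistics of one current response
          return [sig.mean(), sig.std(), stats.skew(sig), stats.kurtosis(sig)]

      X = np.array([features(s) for s in signals])
      clf = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
      print(clf.score(X, labels))           # training accuracy of the stand-in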

  20. Statistical analysis of fragmentation patterns of electron ionization mass spectra of enolized-trimethylsilylated anabolic androgenic steroids

    Science.gov (United States)

    Fragkaki, A. G.; Angelis, Y. S.; Tsantili-Kakoulidou, A.; Koupparis, M.; Georgakopoulos, C.

    2009-08-01

    Anabolic androgenic steroids (AAS) are included in the List of prohibited substances of the World Anti-Doping Agency (WADA) as substances abused to enhance athletic performance. Gas chromatography coupled to mass spectrometry (GC-MS) plays an important role in doping control analyses, identifying AAS as their enolized-trimethylsilyl (TMS) derivatives using the electron ionization (EI) mode. This paper explores the suitability of complementary GC-MS mass spectra and statistical analysis (principal component analysis, PCA, and partial least squares-discriminant analysis, PLS-DA) to differentiate AAS as a function of their structural and conformational features, expressed by their fragment ions. The results obtained showed that the application of PCA yielded a classification among the AAS molecules which became more apparent after applying PLS-DA to the dataset. The application of PLS-DA yielded a clear separation among the AAS molecules, which were thus classified as: 1-ene-3-keto, 3-hydroxyl with saturated A-ring, 1-ene-3-hydroxyl, 4-ene-3-keto, 1,4-diene-3-keto, and 3-keto with saturated A-ring anabolic steroids. This paper also presents structurally diagnostic fragment ions and dissociation routes providing evidence for the presence of unknown AAS or chemically modified molecules known as designer steroids.
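
    The PCA/PLS-DA pipeline named above can be sketched as follows; the spectra, class labels, and component counts are synthetic assumptions, and the one-hot regression formulation is one common way to run PLS-DA, not necessarily the authors' exact setup.

      # Hedged sketch: PCA overview, then PLS-DA via one-hot PLS regression.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.preprocessing import LabelBinarizer

      rng = np.random.default_rng(7)
      X = rng.random((60, 200))     # 60 spectra x 200 fragment-ion intensities
      y = rng.integers(0, 3, 60)    # 3 hypothetical structural classes

      pca = PCA(n_components=5).fit(X)              # unsupervised overview
      print(pca.explained_variance_ratio_.round(2))

      Y = LabelBinarizer().fit_transform(y)         # one-hot class targets
      pls = PLSRegression(n_components=2).fit(X, Y)
      pred = pls.predict(X).argmax(axis=1)          # class = largest response
      print((pred == y).mean())                     # training accuracy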

  1. Statistical Analysis of the Exchange Rate of Bitcoin.

    Directory of Open Access Journals (Sweden)

    Jeffrey Chu

    Full Text Available Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate.

  2. Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data

    Science.gov (United States)

    Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.

    2017-09-01

    The key microstructural feature providing strength to age-hardenable Al alloys is the dispersion of nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enables the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with that obtained by a conventional method based on manual counting and measurements. By virtue of its objectivity in data analysis, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.

  3. Electron Energy Level Statistics in Graphene Quantum Dots

    NARCIS (Netherlands)

    De Raedt, H.; Katsnelson, M. I.

    2008-01-01

    Motivated by recent experimental observations of size quantization of electron energy levels in graphene quantum dots [7], we investigate the level statistics in the simplest tight-binding model for different dot shapes by computer simulation. The results are in reasonable agreement with the experimental observations.

  4. Beginning statistics with data analysis

    CERN Document Server

    Mosteller, Frederick; Rourke, Robert EK

    2013-01-01

    This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.

  5. Langmuir waveforms at interplanetary shocks: STEREO statistical analysis

    Science.gov (United States)

    Briand, C.

    2016-12-01

    Wave-particle interactions and particle acceleration are the two main processes allowing energy dissipation at non-collisional shocks. Ion acceleration has been studied in depth for many years, also for its central role in shock front reformation. Electron dynamics is also important to shock dynamics through the instabilities electrons can generate, which may in turn impact the ion dynamics. Particle measurements can be efficiently complemented by wave measurements to determine the characteristics of the electron beams and study the turbulence of the medium. Electric waveforms obtained from the S/WAVES instrument of the STEREO mission between 2007 and 2014 are analyzed. Clear signatures of Langmuir waves are observed at 41 interplanetary shocks. These data enable a statistical analysis and allow some characteristics of the electron dynamics to be deduced for different shock sources (SIR or ICME) and types (quasi-perpendicular or quasi-parallel). The conversion process from electrostatic to electromagnetic waves has also been tested in several cases.

  6. Research design and statistical analysis

    CERN Document Server

    Myers, Jerome L; Lorch Jr, Robert F

    2013-01-01

    Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data.  The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations.  Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions.  Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations…

  7. Statistical distribution of hydrogen over three positions in the brucite Mg(OH)2 structure from electron diffractometry data

    International Nuclear Information System (INIS)

    Zhukhlistov, A.A.; Avilov, A.S.; Ferraris, D.; Zvyagin, B.B.; Plotnikov, V.P.

    1997-01-01

    The method of improved automatic electron diffractometry for measuring and recording the intensities of two-dimensionally distributed reflections in texture-type electron diffraction patterns has been used for the analysis of the brucite Mg(OH)2 structure. The experimental accuracy of the measured intensities proved to be sufficient for studying fine structural details of the statistical distribution of hydrogen atoms over three structural positions located around the threefold axis of the brucite structure.

  8. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a textbook on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  9. Electronic Health Record Implementation: A SWOT Analysis.

    Science.gov (United States)

    Shahmoradi, Leila; Darrudi, Alireza; Arji, Goli; Farzaneh Nejad, Ahmadreza

    2017-10-01

    Electronic Health Record (EHR) is one of the most important achievements of information technology in the healthcare domain, and if deployed effectively, it can yield substantial benefits. The aim of this study was a SWOT (strengths, weaknesses, opportunities, and threats) analysis of electronic health record implementation. This is a descriptive, analytical study conducted with the participation of a 90-member workforce from hospitals affiliated with Tehran University of Medical Sciences (TUMS). The data were collected using a self-structured questionnaire and analyzed with SPSS software. Based on the results, the highest priority in the strength analysis was related to timely and quick access to information. However, lack of hardware and infrastructure was the most important weakness. Having the potential to share information between different sectors and access to a variety of health statistics was the most significant opportunity of EHR. Finally, the most substantial threats were the lack of strategic planning in the field of electronic health records, together with physicians' and other clinical staff's resistance to the use of electronic health records. Several organizational, technical, and resource elements contribute to the successful adoption of electronic health records; consideration of these factors is essential for EHR implementation.

  10. Statistical and coherence properties of radiation from X-ray free electron lasers

    International Nuclear Information System (INIS)

    Saldin, E.L.; Schneidmiller, E.A.; Yurkov, M.V.

    2009-12-01

    We describe the statistical and coherence properties of the radiation from X-ray free electron lasers (XFELs). It is shown that XFEL radiation before saturation is described by Gaussian statistics. Particularly important is the case of the optimized X-ray FEL, which is studied in detail. Applying similarity techniques to the results of numerical simulations allowed us to find universal scaling relations for the main characteristics of an X-ray FEL operating in the saturation regime: efficiency, coherence time, and degree of transverse coherence. We find that, with an appropriate normalization, these quantities are functions only of the ratio of the geometrical emittance of the electron beam to the radiation wavelength. Statistical and coherence properties of the higher harmonics of the radiation are highlighted as well.

  11. Statistical and coherence properties of radiation from X-ray free electron lasers

    Energy Technology Data Exchange (ETDEWEB)

    Saldin, E L; Schneidmiller, E A; Yurkov, M V

    2009-12-15

    We describe the statistical and coherence properties of the radiation from X-ray free electron lasers (XFELs). It is shown that XFEL radiation before saturation is described by Gaussian statistics. Particularly important is the case of the optimized X-ray FEL, which is studied in detail. Applying similarity techniques to the results of numerical simulations allowed us to find universal scaling relations for the main characteristics of an X-ray FEL operating in the saturation regime: efficiency, coherence time, and degree of transverse coherence. We find that, with an appropriate normalization, these quantities are functions only of the ratio of the geometrical emittance of the electron beam to the radiation wavelength. Statistical and coherence properties of the higher harmonics of the radiation are highlighted as well.

  12. Statistical Power in Meta-Analysis

    Science.gov (United States)

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  13. Electronic Health Record Implementation: A SWOT Analysis

    Directory of Open Access Journals (Sweden)

    Leila Shahmoradi

    2017-12-01

    Full Text Available Electronic Health Record (EHR) is one of the most important achievements of information technology in the healthcare domain, and if deployed effectively, it can yield substantial benefits. The aim of this study was a SWOT (strengths, weaknesses, opportunities, and threats) analysis of electronic health record implementation. This is a descriptive, analytical study conducted with the participation of a 90-member workforce from hospitals affiliated with Tehran University of Medical Sciences (TUMS). The data were collected using a self-structured questionnaire and analyzed with SPSS software. Based on the results, the highest priority in the strength analysis was related to timely and quick access to information. However, lack of hardware and infrastructure was the most important weakness. Having the potential to share information between different sectors and access to a variety of health statistics was the most significant opportunity of EHR. Finally, the most substantial threats were the lack of strategic planning in the field of electronic health records, together with physicians' and other clinical staff's resistance to the use of electronic health records. Several organizational, technical, and resource elements contribute to the successful adoption of electronic health records; consideration of these factors is essential for EHR implementation.

  14. Rweb: Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Full Text Available Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW). It is based on R, a well-known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW-accessible data sets, so you may run Rweb on your own data. Rweb can provide a four-window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point-and-click modules under development for use in introductory statistics courses.

  15. Regularized Statistical Analysis of Anatomy

    DEFF Research Database (Denmark)

    Sjöstrand, Karl

    2007-01-01

    This thesis presents the application and development of regularized methods for the statistical analysis of anatomical structures. Focus is on structure-function relationships in the human brain, such as the connection between early onset of Alzheimer’s disease and shape changes of the corpus...... and mind. Statistics represents a quintessential part of such investigations as they are preluded by a clinical hypothesis that must be verified based on observed data. The massive amounts of image data produced in each examination pose an important and interesting statistical challenge...... efficient algorithms which make the analysis of large data sets feasible, and gives examples of applications....

  16. Statistical process control for electron beam monitoring.

    Science.gov (United States)

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess statistical process control (SPC) for electron beam monitoring in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised.
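
    To illustrate the SPC quantities mentioned above (control limits and process capability), here is a hedged sketch using an individuals/moving-range chart; the daily outputs, target, and 2% specification limits are synthetic stand-ins for the clinic's records.

      # Sketch under stated assumptions: individuals control limits from the
      # average moving range, plus a process capability ratio.
      import numpy as np

      rng = np.random.default_rng(8)
      dose = 100.0 + rng.normal(0.0, 0.4, 300)   # synthetic daily output [% of target]

      mr = np.abs(np.diff(dose))                 # moving ranges of span 2
      sigma = mr.mean() / 1.128                  # d2 constant for n = 2
      center = dose.mean()
      ucl, lcl = center + 3 * sigma, center - 3 * sigma

      usl, lsl = 102.0, 98.0                     # assumed 2% specification limits
      cp = (usl - lsl) / (6 * sigma)             # process capability ratio
      print(round(lcl, 2), round(ucl, 2), round(cp, 2))
      print(np.sum((dose > ucl) | (dose < lcl)), "out-of-control points")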

  17. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  18. Statistics of counter-streaming solar wind suprathermal electrons at solar minimum: STEREO observations

    Directory of Open Access Journals (Sweden)

    B. Lavraud

    2010-01-01

    Full Text Available Previous work has shown that solar wind suprathermal electrons can display a number of features in terms of their anisotropy. Of importance is the occurrence of counter-streaming electron patterns, i.e., with "beams" both parallel and anti-parallel to the local magnetic field, which is believed to shed light on the heliospheric magnetic field topology. In the present study, we use STEREO data to obtain the statistical properties of counter-streaming suprathermal electrons (CSEs) in the vicinity of corotating interaction regions (CIRs) during the period March–December 2007. Because this period corresponds to a minimum of solar activity, the results are unrelated to the sampling of large-scale coronal mass ejections, which can lead to CSEs owing to their closed magnetic field topology. The present study statistically confirms that CSEs are primarily the result of suprathermal electron leakage from the compressed CIR into the upstream regions, with the combined occurrence of halo depletion at 90° pitch angle. The occurrence rate of CSEs is found to be about 15–20% on average during the period analyzed (depending on the criteria used), but superposed epoch analysis demonstrates that CSEs are preferentially observed both before and after the passage of the stream interface (with peak occurrence rate >35% in the trailing high-speed stream), as well as both inside and outside CIRs. The results quantitatively show that CSEs are common in the solar wind during solar minimum, yet they suggest that such distributions would be much more common if pitch angle scattering were absent. We further argue (1) that the formation of shocks contributes to the occurrence of enhanced counter-streaming sunward-directed fluxes, but does not appear to be a necessary condition, and (2) that the presence of small-scale transients with closed-field topologies likely also contributes to the occurrence of counter-streaming patterns, but only in the slow solar wind prior to CIRs.

  19. ZnO crystals obtained by electrodeposition: Statistical analysis of most important process variables

    International Nuclear Information System (INIS)

    Cembrero, Jesus; Busquets-Mataix, David

    2009-01-01

    In this paper a comparative study, by means of a statistical analysis, of the main process variables affecting ZnO crystal electrodeposition is presented. ZnO crystals were deposited on two different substrates, silicon wafer and indium tin oxide. The control variables were substrate type, electrolyte concentration, temperature, exposition time, and current density. The morphologies of the deposits on the different substrates were observed using scanning electron microscopy. The percentage of substrate area covered by the ZnO deposit was calculated by computational image analysis. The design of the applied experiments was based on a two-level factorial analysis involving a series of 32 experiments and an analysis of variance. Statistical results reveal that the variables exerting a significant influence on the area covered by the ZnO deposit are electrolyte concentration, substrate type, and time of deposition, together with a combined two-factor interaction between temperature and current density. However, morphology is also influenced by the surface roughness of the substrates.
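
    A hedged sketch of the two-level factorial analysis described above: the factor names mirror the abstract, but the 32 synthetic responses and the chosen effect sizes are invented for illustration, not the paper's measurements.

      # Illustrative sketch: 2^5 factorial design with an ANOVA that includes
      # the temperature x current-density interaction.
      import itertools
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      runs = list(itertools.product([-1, 1], repeat=5))   # 32 coded runs
      df = pd.DataFrame(runs, columns=["substrate", "conc", "temp", "time", "current"])

      rng = np.random.default_rng(9)
      df["coverage"] = (50 + 8 * df["conc"] + 5 * df["time"]
                        + 3 * df["temp"] * df["current"]
                        + rng.normal(0, 2, len(df)))      # synthetic % coverage

      model = smf.ols("coverage ~ substrate + conc + temp + time + current"
                      " + temp:current", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))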

  1. Electron Dropout Echoes Induced by Interplanetary Shock: A Statistical Study

    Science.gov (United States)

    Liu, Z.; Zong, Q.; Hao, Y.; Zhou, X.; Ma, X.; Liu, Y.

    2017-12-01

    "Electron dropout echo" as indicated by repeated moderate dropout and recovery signatures of the flux of energetic electron in the out radiation belt region has been investigated systematically. The electron dropout and its echoes are usually found for higher energy (> 300 keV) channels fluxes, whereas the flux enhancements are obvious for lower energy electrons simultaneously after the interplanetary shock arrives at the Earth's geosynchronous orbit. 104 dropout echo events have been found from 215 interplanetary shock events from 1998 to 2007 based on LANL satellite data. In analogy to substorm injections, these 104 events could be naturally divided into two categories: dispersionless (49 events) or dispersive (55 events) according to the energy dispersion of the initial dropout. It is found that locations of dispersionless events are distributed mainly in the duskside magnetosphere. Further, the obtained locations derived from dispersive events with the time-of-flight technique of the initial dropout regions are mainly located at the duskside as well. Statistical studies have shown that the effect of shock normal, interplanetary magnetic field Bz and solar wind dynamic pressure may be insignificant to these electron dropout events. We suggest that the electric field impulse induced by the IP shock produces a more pronounced inward migration of electrons at the dusk side, resulting in the observed dusk-side moderate dropout of electron flux and its consequent echoes.

  2. New Statistical Multiparticle Approach to the Acceleration of Electrons by the Ion Field in Plasmas

    Directory of Open Access Journals (Sweden)

    Eugene Oks

    2010-01-01

    Full Text Available The phenomenon of the acceleration of the (perturbing) electrons by the ion field (AEIF) significantly reduces Stark widths and shifts in plasmas of relatively high densities and/or relatively low temperatures. Our previous analytical calculations of the AEIF were based on the dynamical treatment: the starting point was the ion-microfield-caused changes of the trajectories and velocities of individual perturbing electrons. In the current paper, we employ a statistical approach: the starting point is the electron velocity distribution function modified by the ion microfield. The latter had been calculated by Romanovsky and Ebeling in the multiparticle description of the ion microfield. The result again shows a reduction of the electron Stark broadening. Thus two totally different analytical approaches (dynamical and statistical) agree with each other and therefore disprove the corresponding recent fully numerical simulations by Stambulchik et al. that claimed an increase of the electron Stark broadening.

  3. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source and object-oriented software toolkit for statistical data analysis. Its statistical testing component contains a variety of goodness-of-fit tests, from Chi-squared to Kolmogorov-Smirnov, to less well-known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper, and Tiku. Thanks to the component-based design and the use of standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated in experimental software frameworks. The Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms and the computational features of the Toolkit, and describe the code validation.
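
    For a feel of the tests named above, the sketch below runs SciPy stand-ins on a synthetic sample; the Toolkit itself is a separate package, so these calls only mirror the statistics it implements (Goodman, Fisz-Cramer-von Mises, Kuiper, and Tiku have no direct SciPy equivalents and are omitted).

      # SciPy stand-ins for three of the goodness-of-fit tests listed above.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)
      sample = rng.normal(0.0, 1.0, 500)

      print(stats.kstest(sample, "norm"))         # Kolmogorov-Smirnov
      print(stats.anderson(sample, dist="norm"))  # Anderson-Darling

      observed, edges = np.histogram(sample, bins=10)
      expected = len(sample) * np.diff(stats.norm.cdf(edges))
      expected *= observed.sum() / expected.sum() # renormalize clipped tails
      print(stats.chisquare(observed, expected))  # Chi-squared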

  4. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lies also the conditions...

  5. Statistical considerations on safety analysis

    International Nuclear Information System (INIS)

    Pal, L.; Makai, M.

    2004-01-01

    The authors have investigated the statistical methods applied to the safety analysis of nuclear reactors and arrived at alarming conclusions: a series of calculations with the generally appreciated safety code ATHLET was carried out to ascertain the stability of the results against input uncertainties in a simple experimental situation. Scrutinizing those calculations, we came to the conclusion that the ATHLET results may exhibit chaotic behavior. A further conclusion is that the technological limits are incorrectly set when the output variables are correlated. Another formerly unnoticed conclusion of the previous ATHLET calculations is that certain innocent-looking parameters (like the wall roughness factor, the number of bubbles per unit volume, or the number of droplets per unit volume) can considerably influence such output parameters as water levels. The authors are concerned with the statistical foundation of present-day safety analysis practices and can only hope that their own misjudgment will be dispelled. Until then, the authors suggest applying correct statistical methods in safety analysis even if it makes the analysis more expensive. It would be desirable to continue exploring the role of internal parameters (wall roughness factor, steam-water surface in thermal hydraulics codes, homogenization methods in neutronics codes) in system safety codes and to study their effects on the analysis. In the validation and verification process of a code one carries out a series of computations. The input data are not precisely determined, because measured data have errors and calculated data are often obtained from more or less accurate models. Some users of large codes are content with comparing the nominal output obtained from the nominal input, whereas all possible inputs should be taken into account when judging safety. At the same time, any statement concerning safety must be aleatory, and its merit can be judged only when the probability with which it holds is known.

  6. Nonextensive statistical mechanics approach to electron trapping in degenerate plasmas

    Science.gov (United States)

    Mebrouk, Khireddine; Gougam, Leila Ait; Tribeche, Mouloud

    2016-06-01

    The electron trapping in a weakly nondegenerate plasma is reformulated and re-examined by incorporating the nonextensive entropy prescription. Using the q-deformed Fermi-Dirac distribution function including the quantum as well as the nonextensive statistical effects, we derive a new generalized electron density with a new contribution proportional to the electron temperature T, which may dominate the usual thermal correction (~T^2) at very low temperatures. To make the physics behind the effect of this new contribution more transparent, we analyze the modifications arising in the propagation of ion-acoustic solitary waves. Interestingly, we find that due to the nonextensive correction, our plasma model allows the possibility of existence of quantum ion-acoustic solitons with velocity higher than the Fermi ion-sound velocity. Moreover, as the nonextensive parameter q increases, the critical temperature Tc beyond which coexistence of compressive and rarefactive solitons sets in, is shifted towards higher values.

  7. Statistical shape analysis with applications in R

    CERN Document Server

    Dryden, Ian L

    2016-01-01

    A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis. Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia; investigating protein molecules in bioinformatics; and describing the growth of organisms in biology. This book is a significant update of the highly regarded 'Statistical Shape Analysis' by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include the analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while retaining…

  8. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis (containing hands-on problem sets that can be worked out in MS Excel or ArcGIS), as well as detailed illustrations and numerous case studies. The book enables readers to: identify types of and characterize non-spatial and spatial data; demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results; construct testable hypotheses that require inferential statistical analysis; and process spatial data, extract explanatory variables, and conduct statistical…

  9. Demonstration of fundamental statistics by studying timing of electronics signals in a physics-based laboratory

    Science.gov (United States)

    Beach, Shaun E.; Semkow, Thomas M.; Remling, David J.; Bradt, Clayton J.

    2017-07-01

    We have developed accessible methods to demonstrate fundamental statistics in several phenomena, in the context of teaching electronic signal processing in a physics-based college-level curriculum. A relationship between the exponential time-interval distribution and Poisson counting distribution for a Markov process with constant rate is derived in a novel way and demonstrated using nuclear counting. Negative binomial statistics is demonstrated as a model for overdispersion and justified by the effect of electronic noise in nuclear counting. The statistics of digital packets on a computer network are shown to be compatible with the fractal-point stochastic process leading to a power-law as well as generalized inverse Gaussian density distributions of time intervals between packets.
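
    The exponential-interval/Poisson-count relationship demonstrated in this record is easy to verify numerically. A minimal sketch (not the authors' laboratory code; the rate and window size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
rate, t_total = 5.0, 10_000.0            # counts per second; total counting time (s)

# Constant-rate Markov process: waiting times between events are exponential...
gaps = rng.exponential(1.0 / rate, size=int(rate * t_total * 1.2))
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals < t_total]

# ...so counts in fixed one-second windows follow a Poisson distribution.
counts, _ = np.histogram(arrivals, bins=np.arange(0.0, t_total + 1.0))
print("mean counts per window:", counts.mean())   # ~ rate
print("variance of counts:    ", counts.var())    # ~ rate (Poisson: variance = mean)
```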

  10. Nonequilibrium statistical Zubarev's operator and Green's functions for an inhomogeneous electron gas

    Directory of Open Access Journals (Sweden)

    P.Kostrobii

    2006-01-01

    Nonequilibrium properties of an inhomogeneous electron gas are studied using the method of the nonequilibrium statistical operator by D.N. Zubarev. Generalized transport equations for the mean values of inhomogeneous operators of the electron number density, momentum density, and total energy density for weakly and strongly nonequilibrium states are obtained. We derive a chain of equations for the Green's functions, which connects commutative time-dependent Green's functions "density-density", "momentum-momentum", "enthalpy-enthalpy" with reduced Green's functions of the generalized transport coefficients and with Green's functions for higher order memory kernels in the case of a weakly nonequilibrium spatially inhomogeneous electron gas.

  11. Application of descriptive statistics in analysis of experimental data

    OpenAIRE

    Mirilović Milorad; Pejin Ivana

    2008-01-01

    Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variations in mass phenomena. In fact, statistics comprise a group of methods that are used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistical analysis and inferential statistics. The values which represent the results of an experiment, and which are the subj...

  12. Statistical Analysis of Research Data | Center for Cancer Research

    Science.gov (United States)

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data.  The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.

  13. Statistical analysis with Excel for dummies

    CERN Document Server

    Schmuller, Joseph

    2013-01-01

    Take the mystery out of statistical terms and put Excel to work! If you need to create and interpret statistics in business or classroom settings, this easy-to-use guide is just what you need. It shows you how to use Excel's powerful tools for statistical analysis, even if you've never taken a course in statistics. Learn the meaning of terms like mean and median, margin of error, standard deviation, and permutations, and discover how to interpret the statistics of everyday life. You'll learn to use Excel formulas, charts, PivotTables, and other tools to make sense of everything fro

  14. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    Science.gov (United States)

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  15. Electron teleportation and statistical transmutation in multiterminal Majorana islands

    Science.gov (United States)

    Michaeli, Karen; Landau, L. Aviad; Sela, Eran; Fu, Liang

    2017-11-01

    We study a topological superconductor island with spatially separated Majorana modes coupled to multiple normal-metal leads by single-electron tunneling in the Coulomb blockade regime. We show that low-temperature transport in such a Majorana island is carried by an emergent charge-e boson composed of a Majorana mode and an electronic excitation in leads. This transmutation from Fermi to Bose statistics has remarkable consequences. For noninteracting leads, the system flows to a non-Fermi-liquid fixed point, which is stable against tunnel couplings anisotropy or detuning away from the charge-degeneracy point. As a result, the system exhibits a universal conductance at zero temperature, which is a fraction of the conductance quantum, and low-temperature corrections with a universal power-law exponent. In addition, we consider Majorana islands connected to interacting one-dimensional leads, and find different stable fixed points near and far from the charge-degeneracy point.

  16. Physics colloquium: Single-electron counting in quantum metrology and in statistical mechanics

    CERN Multimedia

    Geneva University

    2011-01-01

    GENEVA UNIVERSITY Ecole de physique Département de physique nucléaire et corpusculaire 24, quai Ernest-Ansermet 1211 Genève 4 Tél.: (022) 379 62 73 Fax: (022) 379 69 92   Monday 17 October 2011, 17:00 - Ecole de Physique, Auditoire Stueckelberg PHYSICS COLLOQUIUM « Single-electron counting in quantum metrology and in statistical mechanics » Prof. Jukka Pekola Low Temperature Laboratory, Aalto University Helsinki, Finland   First I discuss the basics of single-electron tunneling and its potential applications in metrology. My main focus is in developing an accurate source of single-electron current for the realization of the unit ampere. I discuss the principle and the present status of the so-called single-electron turnstile. Investigation of errors in transporting electrons one by one has revealed a wealth of observations on fundamental phenomena in mesoscopic superconductivity, including individual Andreev...

  17. Statistical studies of energetic electrons in the outer radiation belt

    Energy Technology Data Exchange (ETDEWEB)

    Johnstone, A.D.; Rodgers, D.J.; Jones, G.H. E-mail: g.h.jones@ic.ac.uk

    1999-10-01

    The medium electron A (MEA) instrument aboard the CRRES spacecraft provided data on terrestrial radiation belt electrons in the energy range from 153 to 1582 keV during 1990-91. These data have previously been used to produce an empirical model of the radiation belts from L=1.1 to 8.9, ordered according to 17 energy bands, 18 pitch angle bins, and 5 Kp ranges. Empirical models such as this are very valuable, but are prone to statistical fluctuations and gaps in coverage. In this study, in order to smooth the data and make it easier to interpolate within data gaps, the pitch angle distribution at each energy in the model was fitted with a Bessel function. This provided a way to characterize the pitch angle distribution in terms of only two parameters for each energy. It was not possible to model fluxes reliably within the loss cone because of poor statistics. The fitted distributions give an indication of the way in which pitch angle diffusion varies in the outer radiation belts. The two parameters of the Bessel function were found to vary systematically with L value, energy and Kp. Through the fitting of a simple function to these systematic variations, the number of parameters required to describe the model could be reduced drastically.
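
    As an illustration of the two-parameter reduction described above, the sketch below fits a Bessel-function-shaped pitch angle model with SciPy. The functional form, the bin layout and all numbers are hypothetical; the paper's actual model is not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import jv

# Hypothetical two-parameter pitch angle model, peaking at 90 degrees.
def pitch_model(alpha, amplitude, width):
    return amplitude * jv(0, width * (alpha - np.pi / 2.0))

alpha = np.linspace(0.1, np.pi - 0.1, 18)              # 18 pitch angle bins (rad)
rng = np.random.default_rng(1)
flux = pitch_model(alpha, 1.0e4, 1.2) * (1.0 + 0.05 * rng.standard_normal(alpha.size))

popt, pcov = curve_fit(pitch_model, alpha, flux, p0=[1.0e4, 1.0])
print("fitted (amplitude, width):", popt)  # two numbers summarize the distribution
```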

  18. Statistical analysis of dynamic parameters of the core

    International Nuclear Information System (INIS)

    Ionov, V.S.

    2007-01-01

    Transients of various types were investigated for the cores of zero-power critical facilities at RRC KI and at NPPs. The dynamic parameters of the neutron transients were explored with statistical analysis tools. The recorded transients have sufficient duration and comprise several channels for chamber currents and reactivity, as well as channels for technological parameters. From these records the inverse period, reactivity, neutron lifetime, reactivity coefficients and certain reactivity effects are determined, and the measured dynamic parameters are reconstructed as a result of the analysis. The mathematical tools of statistical analysis were used: approximation (A), filtration (F), rejection (R), estimation of descriptive statistics parameters (DSP), correlation characteristics (kk), regression analysis (KP), prognosis (P) and statistical criteria (SC). The calculation procedures were implemented in MATLAB. The sources of methodical and statistical errors are presented: inadequacy of the model, the precision of neutron-physical parameters, features of the registered processes, the mathematical model used in reactivity meters, the processing technique for registered data, etc. Examples of the results of the statistical analysis are given. Problems of the validity of the methods used for the definition and certification of statistical parameters and dynamic characteristics are considered. (Authors)

  19. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Estimating parameters with confidence intervals and testing statistical hypotheses are used in statistical analysis to draw conclusions about a population from a sample. The case study presented in this paper aims to highlight the importance of the sample size used in a study and how it is reflected in the results obtained from confidence intervals and hypothesis tests. While statistical hypothesis testing only gives a "yes" or "no" answer to a question, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and from findings that are "marginally significant" or "almost significant" (p very close to 0.05).
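
    The contrast drawn here between a bare test decision and an interval estimate can be made concrete with a small sample. A sketch with illustrative numbers only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
sample = rng.normal(loc=0.4, scale=1.0, size=15)       # small sample, true mean 0.4

# The hypothesis test alone gives only a yes/no answer about H0: mean = 0.
t_stat, p_value = stats.ttest_1samp(sample, popmean=0.0)

# The confidence interval conveys the estimate *and* its (large) uncertainty.
low, high = stats.t.interval(0.95, df=sample.size - 1,
                             loc=sample.mean(), scale=stats.sem(sample))

print(f"p = {p_value:.3f}")                             # may land near 0.05 for n = 15
print(f"95% CI for the mean: ({low:.2f}, {high:.2f})")  # wide interval: small sample
```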

  20. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis

  1. Statistics and analysis of scientific data

    CERN Document Server

    Bonamente, Massimiliano

    2013-01-01

    Statistics and Analysis of Scientific Data covers the foundations of probability theory and statistics, and a number of numerical and analytical methods that are essential for the present-day analyst of scientific data. Topics covered include probability theory, distribution functions of statistics, fits to two-dimensional datasets and parameter estimation, Monte Carlo methods and Markov chains. Equal attention is paid to the theory and its practical application, and results from classic experiments in various fields are used to illustrate the importance of statistics in the analysis of scientific data. The main pedagogical method is a theory-then-application approach, where emphasis is placed first on a sound understanding of the underlying theory of a topic, which becomes the basis for an efficient and proactive use of the material for practical applications. The level is appropriate for undergraduates and beginning graduate students, and as a reference for the experienced researcher. Basic calculus is us...

  2. Method for statistical data analysis of multivariate observations

    CERN Document Server

    Gnanadesikan, R

    1997-01-01

    A practical guide for multivariate statistical techniques-- now updated and revised In recent years, innovations in computer technology and statistical methodologies have dramatically altered the landscape of multivariate data analysis. This new edition of Methods for Statistical Data Analysis of Multivariate Observations explores current multivariate concepts and techniques while retaining the same practical focus of its predecessor. It integrates methods and data-based interpretations relevant to multivariate analysis in a way that addresses real-world problems arising in many areas of inte

  3. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  4. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  5. Classification, (big) data analysis and statistical learning

    CERN Document Server

    Conversano, Claudio; Vichi, Maurizio

    2018-01-01

    This edited book focuses on the latest developments in classification, statistical learning, data analysis and related areas of data science, including statistical analysis of large datasets, big data analytics, time series clustering, integration of data from different sources, as well as social networks. It covers both methodological aspects as well as applications to a wide range of areas such as economics, marketing, education, social sciences, medicine, environmental sciences and the pharmaceutical industry. In addition, it describes the basic features of the software behind the data analysis results, and provides links to the corresponding codes and data sets where necessary. This book is intended for researchers and practitioners who are interested in the latest developments and applications in the field. The peer-reviewed contributions were presented at the 10th Scientific Meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in Santa Margherita di Pul...

  6. Statistical hot spot analysis of reactor cores

    International Nuclear Information System (INIS)

    Schaefer, H.

    1974-05-01

    This report is an introduction to statistical hot spot analysis. After the definition of the term 'hot spot', a statistical analysis is outlined. The mathematical method is presented; in particular, the formula for the probability of no hot spots in a reactor core is evaluated. The boundary conditions of a statistical hot spot analysis are discussed (technological limits, nominal situation, uncertainties). The application of the hot spot analysis to the linear power of pellets and the temperature rise in cooling channels is demonstrated with respect to the test zone of KNK II. Basic values, such as the probability of no hot spots, hot spot potential, expected hot spot diagram and cumulative distribution function of hot spots, are discussed. It is shown that the risk of hot channels can be dispersed equally over all subassemblies by an adequate choice of the nominal temperature distribution in the core.
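
    For intuition about the "probability of no hot spots" mentioned above: if channel i independently exceeds its limit with probability p_i, then P(no hot spot) is the product of the survival probabilities. A minimal sketch with made-up exceedance probabilities; the report's actual method accounts for uncertainties the sketch ignores.

```python
import numpy as np

# Illustrative per-channel exceedance probabilities (invented numbers), e.g. the
# probability that local pellet power or coolant temperature rise exceeds its limit.
p_exceed = np.array([1.0e-4, 5.0e-5, 2.0e-4, 1.0e-4])

# Independence assumed: P(no hot spot) is the product of survival probabilities.
p_no_hot_spot = np.prod(1.0 - p_exceed)
print(f"P(no hot spot) = {p_no_hot_spot:.6f}")
```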

  7. The statistical analysis of anisotropies

    International Nuclear Information System (INIS)

    Webster, A.

    1977-01-01

    One of the many uses to which a radio survey may be put is an analysis of the distribution of the radio sources on the celestial sphere, to find out whether they are bunched into clusters or lie in preferred regions of space. There are many methods of testing for clustering in point processes, and since they are not all equally good this contribution is presented as a brief guide to what seem to be the best of them. The radio sources certainly do not show very strong clustering and may well be entirely unclustered, so if a statistical method is to be useful it must be both powerful and flexible. A statistic is powerful in this context if it can efficiently distinguish a weakly clustered distribution of sources from an unclustered one, and it is flexible if it can be applied in a way which avoids mistaking defects in the survey for true peculiarities in the distribution of sources. The paper divides clustering statistics into two classes: number density statistics and log N/log S statistics. (Auth.)

  8. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  9. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. The system differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of the paper is an example showing how to use two statistics programs together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  10. Common pitfalls in statistical analysis: "P" values, statistical significance and confidence intervals

    Directory of Open Access Journals (Sweden)

    Priya Ranganathan

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper.

  11. Dynamic and statistical thermodynamic properties of electrons in a thin quantum well in a parallel magnetic field

    International Nuclear Information System (INIS)

    Horing, Norman J Morgenstern; Glasser, M Lawrence; Dong Bing

    2006-01-01

    We carry out a theoretical analysis of quantum well electron dynamics in a parallel magnetic field of arbitrary strength, for a narrow quantum well. An explicit analytical closed-form solution is obtained for the retarded Green's function for Landau-quantized electrons in skipping states of motion between the narrow well walls, effectively involving in-plane translational motion, and hybridized with the zero-field lowest subband energy eigenstate. The dispersion relation for electron eigenstates is examined, and we find a plethora of such discrete Landau-quantized modes coupled to the subband state. In the weak field limit, we determine low magnetic field corrections to the lowest subband state energy associated with close-packing (phase averaging) of the Landau levels in the skipping states. At higher fields the discrete energy levels of the well lie between adjacent Landau levels, but they are not equally spaced, albeit undamped. Furthermore, we also examine the associated thermodynamic Green's function for Landau-quantized electrons in a thin quantum well in a parallel magnetic field and construct the (grand) thermodynamic potential (logarithm of the grand partition function) determining the statistical thermodynamics of the system

  12. Statistics and analysis of scientific data

    CERN Document Server

    Bonamente, Massimiliano

    2017-01-01

    The revised second edition of this textbook provides the reader with a solid foundation in probability theory and statistics as applied to the physical sciences, engineering and related fields. It covers a broad range of numerical and analytical methods that are essential for the correct analysis of scientific data, including probability theory, distribution functions of statistics, fits to two-dimensional data and parameter estimation, Monte Carlo methods and Markov chains. Features new to this edition include: • a discussion of statistical techniques employed in business science, such as multiple regression analysis of multivariate datasets. • a new chapter on the various measures of the mean including logarithmic averages. • new chapters on systematic errors and intrinsic scatter, and on the fitting of data with bivariate errors. • a new case study and additional worked examples. • mathematical derivations and theoretical background material have been appropriately marked, to improve the readabili...

  13. Statistical evaluation of diagnostic performance topics in ROC analysis

    CERN Document Server

    Zou, Kelly H; Bandos, Andriy I; Ohno-Machado, Lucila; Rockette, Howard E

    2016-01-01

    Statistical evaluation of diagnostic performance in general and Receiver Operating Characteristic (ROC) analysis in particular are important for assessing the performance of medical tests and statistical classifiers, as well as for evaluating predictive models or algorithms. This book presents innovative approaches in ROC analysis, which are relevant to a wide variety of applications, including medical imaging, cancer research, epidemiology, and bioinformatics. Statistical Evaluation of Diagnostic Performance: Topics in ROC Analysis covers areas including monotone-transformation techniques in parametric ROC analysis, ROC methods for combined and pooled biomarkers, Bayesian hierarchical transformation models, sequential designs and inferences in the ROC setting, predictive modeling, multireader ROC analysis, and free-response ROC (FROC) methodology. The book is suitable for graduate-level students and researchers in statistics, biostatistics, epidemiology, public health, biomedical engineering, radiology, medi...

  14. Thermo-dynamical contours of electronic-vibrational spectra simulated using the statistical quantum-mechanical methods

    DEFF Research Database (Denmark)

    Pomogaev, Vladimir; Pomogaeva, Anna; Avramov, Pavel

    2011-01-01

    Three polycyclic organic molecules in various solvents focused on thermo-dynamical aspects were theoretically investigated using the recently developed statistical quantum mechanical/classical molecular dynamics method for simulating electronic-vibrational spectra. The absorption bands of estradiol...

  15. Bayesian Inference in Statistical Analysis

    CERN Document Server

    Box, George E P

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Rob

  16. Statistical electron angular correlation coefficients for atoms within the Hohenberg-Kohn-Sham theory

    International Nuclear Information System (INIS)

    Pathak, R.K.

    1985-01-01

    Statistical electron angular correlation coefficients tau are presented for the atoms He (Z = 2) through Si (Z = 14), within the Hohenberg-Kohn-Sham density-functional formalism. These are computed with use of the spectral sum rules obtained from the pseudoexcitation spectrum, employing the recent formulation of the time-dependent Kohn-Sham theory due to Bartolotti. Various approximations to the exchange-correlation energy functional are used, and for first-row atoms a comparison is made with the highly accurate correlation coefficients recently obtained by Thakkar. The present tau values show closer agreement with those of Thakkar as the number of electrons increases.

  17. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  18. Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.

    Science.gov (United States)

    Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V

    2018-04-01

    A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.

  19. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  20. Online Statistical Modeling (Regression Analysis) for Independent Responses

    Science.gov (United States)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of complex relationships in data. Rich varieties of advanced and recent statistical modelling are mostly available in open source software (one of them is R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny), so that the most recent and advanced statistical modelling is readily available, accessible and applicable on the web. We previously made an interface in the form of an e-tutorial for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare, in order to find the most appropriate model for the data.

  1. Application of Ontology Technology in Health Statistic Data Analysis.

    Science.gov (United States)

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

    Research purpose: to establish a health management ontology for the analysis of health statistic data. Proposed methods: this paper established a health management ontology based on the analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and the object properties and data properties were defined to establish the construction of these classes. By ontology instantiation, we can integrate multi-source heterogeneous data and enable administrators to have an overall understanding and analysis of the health statistic data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source and heterogeneous health system management data and enhancement of management efficiency.

  2. Explorations in Statistics: The Analysis of Change

    Science.gov (United States)

    Curran-Everett, Douglas; Williams, Calvin L.

    2015-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This tenth installment of "Explorations in Statistics" explores the analysis of a potential change in some physiological response. As researchers, we often express absolute change as percent change so we can…

  3. An application of random field theory to analysis of electron trapping sites in disordered media

    International Nuclear Information System (INIS)

    Hilczer, M.; Bartczak, W.M.

    1993-01-01

    The potential energy surface in a disordered medium is considered a random field and described using the concepts of the mathematical theory of random fields. The preexisting traps for excess electrons are identified with certain regions of excursion (extreme regions) of the potential field. The theory provides an analytical method of statistical analysis of these regions. Parameters of the cavity-averaged potential field, which are provided by computer simulation of a given medium, serve as input data for the analysis. The statistics of preexisting traps are obtained for liquid methanol as a numerical example of the random field method. 26 refs., 6 figs

  4. Common pitfalls in statistical analysis: “P” values, statistical significance and confidence intervals

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper. PMID:25878958

  5. Statistical electron correlation coefficients for the five lowest states of the heliumlike ions

    International Nuclear Information System (INIS)

    Thakkar, A.J.; Smith, V.H. Jr.

    1981-01-01

    Statistical correlation coefficients were introduced by Kutzelnigg, Del Re, and Berthier to provide overall measures of the difference between the electron pair density and the product of one-electron densities in atoms and molecules. Some properties of these coefficients are discussed, and it is shown that an angular correlation coefficient is experimentally accessible. Radial and angular correlation coefficients are computed from highly accurate wave functions for the 1¹S, 2³S, 2¹S, 2³P, and 2¹P states of the heliumlike ions from He through Mg¹⁰⁺. It is found that positive angular correlation coefficients occur in the 2¹P state of the two-electron positive ions but not in neutral helium. Moreover, the angular correlation coefficients for the 2¹S and 2³S states of the positively charged two-electron ions show that a previously proposed reformulation of Hund's rule is incorrect.

  6. Independent component analysis: A new possibility for analysing series of electron energy loss spectra

    International Nuclear Information System (INIS)

    Bonnet, Noël; Nuzillard, Danielle

    2005-01-01

    A complementary approach is proposed for analysing series of electron energy-loss spectra that can be recorded with the spectrum-line technique, across an interface for instance. This approach, called blind source separation (BSS) or independent component analysis (ICA), complements two existing methods: the spatial difference approach and multivariate statistical analysis. The principle of the technique is presented and illustrations are given through one simulated example and one real example
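
    A minimal sketch of the BSS/ICA idea on a simulated spectrum line, using scikit-learn's FastICA. This is not the authors' implementation; the synthetic edge shapes and mixing profile are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
energy = np.linspace(280.0, 320.0, 400)            # eV, a mock energy-loss axis

# Two synthetic "source" fine structures (Gaussians stand in for edge shapes).
s1 = np.exp(-0.5 * ((energy - 290.0) / 2.0) ** 2)
s2 = np.exp(-0.5 * ((energy - 300.0) / 3.0) ** 2)
sources = np.vstack([s1, s2])                      # shape (2, 400)

# A spectrum line: 50 positions, each a different mixture across an interface.
weights = np.column_stack([np.linspace(1, 0, 50), np.linspace(0, 1, 50)])
spectra = weights @ sources + 0.01 * rng.standard_normal((50, 400))

# Unmix: rows are energy channels, columns are the observed mixtures.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(spectra.T)           # (400, 2): one column per source
print(recovered.shape)
```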

  7. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    This article presents a technique for the statistical analysis of the investment appeal of a region with respect to foreign direct investment. A definition of the technique of statistical analysis is given, the stages of the analysis are described, and the mathematical and statistical tools are reviewed.

  8. Statistical analysis of network data with R

    CERN Document Server

    Kolaczyk, Eric D

    2014-01-01

    Networks have permeated everyday life through everyday realities like the Internet, social networks, and viral marketing. As such, network analysis is an important growth area in the quantitative sciences, with roots in social network analysis going back to the 1930s and graph theory going back centuries. Measurement and analysis are integral components of network research. As a result, statistical methods play a critical role in network analysis. This book is the first of its kind in network research. It can be used as a stand-alone resource in which multiple R packages are used to illustrate how to conduct a wide range of network analyses, from basic manipulation and visualization, to summary and characterization, to modeling of network data. The central package is igraph, which provides extensive capabilities for studying network graphs in R. This text builds on Eric D. Kolaczyk’s book Statistical Analysis of Network Data (Springer, 2009).

  9. Semiclassical analysis, Witten Laplacians, and statistical mechanics

    CERN Document Server

    Helffer, Bernard

    2002-01-01

    This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S

  10. A novel statistic for genome-wide interaction analysis.

    Directory of Open Access Journals (Sweden)

    Xuesen Wu

    2010-09-01

    Although great progress in genome-wide association studies (GWAS) has been made, the significant SNP associations identified by GWAS account for only a few percent of the genetic variance, leading many to question where and how we can find the missing heritability. There is increasing interest in genome-wide interaction analysis as a possible source of finding heritability unexplained by current GWAS. However, the existing statistics for testing interaction have low power for genome-wide interaction analysis. To meet the challenges raised by genome-wide interaction analysis, we have developed a novel statistic for testing interaction between two loci (either linked or unlinked). The null distribution and the type I error rates of the new statistic for testing interaction are validated using simulations. Extensive power studies show that the developed statistic has much higher power to detect interaction than classical logistic regression. The results identified 44 and 211 pairs of SNPs showing significant evidence of interaction with FDR < 0.001 and 0.001 ≤ FDR < 0.05, respectively. Genome-wide interaction analysis is a valuable tool for finding the remaining missing heritability unexplained by current GWAS, and the developed novel statistic is able to search for significant interactions between SNPs across the genome. Real data analysis showed that the results of genome-wide interaction analysis can be replicated in two independent studies.
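
    The classical logistic-regression interaction test that serves as the baseline comparison here can be sketched as follows: a generic two-SNP Wald test with statsmodels on simulated genotypes, not the authors' novel statistic.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
snp1 = rng.integers(0, 3, n)                  # genotypes coded 0/1/2 (invented data)
snp2 = rng.integers(0, 3, n)
logit = -1.0 + 0.1 * snp1 + 0.1 * snp2 + 0.4 * snp1 * snp2   # built-in interaction
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

df = pd.DataFrame({"y": y.astype(int), "snp1": snp1, "snp2": snp2})
fit = smf.logit("y ~ snp1 * snp2", data=df).fit(disp=0)
print(fit.pvalues["snp1:snp2"])               # Wald p-value for the interaction term
```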

  11. A statistical approach to plasma profile analysis

    International Nuclear Information System (INIS)

    Kardaun, O.J.W.F.; McCarthy, P.J.; Lackner, K.; Riedel, K.S.

    1990-05-01

    A general statistical approach to the parameterisation and analysis of tokamak profiles is presented. The modelling of the profile dependence on both the radius and the plasma parameters is discussed, and pertinent, classical as well as robust, methods of estimation are reviewed. Special attention is given to statistical tests for discriminating between the various models, and to the construction of confidence intervals for the parameterised profiles and the associated global quantities. The statistical approach is shown to provide a rigorous approach to the empirical testing of plasma profile invariance. (orig.)

  12. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    Science.gov (United States)

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of study designs used, statistical tests applied, and the use of statistical analysis programmes helps determine the research activity profile and trends in a country. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and the use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. Results of this study indicate cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and use of the statistical software programme SPSS to be the most common study design, inferential statistical analysis, and statistical analysis software programme, respectively. These results echo the previously published assessment of these two journals for the year 2014.

  13. Statistical analysis of brake squeal noise

    Science.gov (United States)

    Oberst, S.; Lai, J. C. S.

    2011-06-01

    Despite substantial research efforts applied to the prediction of brake squeal noise since the early 20th century, the mechanisms behind its generation are still not fully understood. Squealing brakes are of significant concern to the automobile industry, mainly because of the costs associated with warranty claims. In order to remedy the problems inherent in designing quieter brakes and, therefore, to understand the mechanisms, a design of experiments study, using a noise dynamometer, was performed by a brake system manufacturer to determine the influence of geometrical parameters (namely, the number and location of slots) of brake pads on brake squeal noise. The experimental results were evaluated with a noise index and ranked for warm and cold brake stops. These data are analysed here using statistical descriptors based on population distributions, and a correlation analysis, to gain greater insight into the functional dependency between the time-averaged friction coefficient as the input and the peak sound pressure level data as the output quantity. The correlation analysis between the time-averaged friction coefficient and peak sound pressure data is performed by applying a semblance analysis and a joint recurrence quantification analysis. Linear measures are compared with complexity measures (nonlinear) based on statistics from the underlying joint recurrence plots. Results show that linear measures cannot be used to rank the noise performance of the four test pad configurations. On the other hand, the ranking of the noise performance of the test pad configurations based on the noise index agrees with that based on nonlinear measures: the higher the nonlinearity between the time-averaged friction coefficient and peak sound pressure, the worse the squeal. These results highlight the nonlinear character of brake squeal and indicate the potential of using nonlinear statistical analysis tools to analyse disc brake squeal.

  14. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  15. Analysis of room transfer function and reverberant signal statistics

    DEFF Research Database (Denmark)

    Georganti, Eleftheria; Mourjopoulos, John; Jacobsen, Finn

    2008-01-01

    For some time now, statistical analysis has been a valuable tool in analyzing room transfer functions (RTFs). This work examines existing statistical time-frequency models and techniques for RTF analysis (e.g., Schroeder's stochastic model and the standard deviation over frequency bands for the RTF magnitude and phase). RTF fractional octave smoothing, as with 1/3 octave analysis, may lead to RTF simplifications that can be useful for several audio applications, like room compensation, room modeling and auralisation purposes. The aim of this work is to identify the relationship of the optimal response ... and the corresponding ratio of the direct and reverberant signal. In addition, this work examines the statistical quantities for speech and audio signals prior to their reproduction within rooms and when recorded in rooms. Histograms and other statistical distributions are used to compare RTF minima of typical...

  16. Electron bunchlength measurement from analysis of fluctuations in spontaneous emission

    International Nuclear Information System (INIS)

    Catravas, P.; Leemans, W.P.; Wurtele, J.S.; Zolotorev, M.S.; Babzien, M.; Ben-Zvi, I.; Segalov, Z.; Wang, X.; Yakimenko, V.

    1999-01-01

    A statistical analysis of fluctuations in the spontaneous emission of a single bunch of electrons is shown to provide a new bunchlength diagnostic. This concept, originally proposed by Zolotorev and Stupakov [1], is based on the fact that shot noise from a finite bunch has a correlation length defined by the bunchlength, and therefore has a spiky spectrum. Single shot spectra of wiggler spontaneous emission have been measured at 632 nm from 44 MeV single electron bunches of 1 - 5 ps. The scaling of the spectral fluctuations with frequency resolution and the scaling of the spectral intensity distribution with bunchlength are studied. Bunchlength was extracted in a single shot measurement. Agreement was obtained between the experiment and a theoretical model, and with independent time integrated measurements. copyright 1999 American Institute of Physics
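
    The core idea, that shot noise from a finite bunch produces a spiky spectrum whose correlation width is set by the bunch length, can be reproduced in a few lines. A sketch with invented beam numbers, not the experiment's analysis code:

```python
import numpy as np

rng = np.random.default_rng(5)
sigma_t = 2.0e-12                          # 2 ps rms bunch length (invented)
t = rng.normal(0.0, sigma_t, size=3000)    # electron emission times within the bunch

# Single-shot spectral intensity of incoherent emission: |sum_j exp(i w t_j)|^2.
omega = 2.0 * np.pi * np.linspace(4.70e14, 4.78e14, 600)   # around 632 nm
field = np.exp(1j * np.outer(omega, t)).sum(axis=1)
intensity = np.abs(field) ** 2

# Spikes decorrelate over delta_omega ~ 1/sigma_t, so the spike width in a
# single-shot spectrum encodes the bunch length.
print("peak / mean spectral intensity:", intensity.max() / intensity.mean())
```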

  17. Hearing Loss in Children: Data and Statistics

    Science.gov (United States)


  18. Transit safety & security statistics & analysis 2002 annual report (formerly SAMIS)

    Science.gov (United States)

    2004-12-01

    The Transit Safety & Security Statistics & Analysis 2002 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  19. Transit safety & security statistics & analysis 2003 annual report (formerly SAMIS)

    Science.gov (United States)

    2005-12-01

    The Transit Safety & Security Statistics & Analysis 2003 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  20. Correlated electrons and generalized statistics

    International Nuclear Information System (INIS)

    Wang, Q.A.

    2003-01-01

    Several important generalizations of the Fermi-Dirac distribution are compared to numerical and experimental results for correlated electron systems. It is found that the quantum distributions based on the incomplete information hypothesis can be useful for describing this kind of system. We show that the additive incomplete fermion distribution gives a very good description of weakly correlated electrons and that the non-additive one is suitable for very strongly correlated cases. (author)

  1. Statistical Modelling of Wind Profiles - Data Analysis and Modelling

    DEFF Research Database (Denmark)

    Jónsson, Tryggvi; Pinson, Pierre

    The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.

  2. Statistical analysis of long term spatial and temporal trends of ...

    Indian Academy of Sciences (India)

    Statistical analysis of long term spatial and temporal trends of temperature ... Keywords: CGCM3; HadCM3; modified Mann–Kendall test; statistical analysis; Sutlej basin. Water Resources Systems Division, National Institute of Hydrology, Roorkee 247 ...
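
    For reference, the standard (unmodified) Mann–Kendall trend test named in the keywords can be sketched as follows; the modified variant additionally corrects the variance for serial correlation and ties, which is omitted here.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Standard Mann-Kendall trend test (no tie or autocorrelation correction)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # S = sum over all pairs i < j of sign(x_j - x_i)
    s = np.sign(x[None, :] - x[:, None])[np.triu_indices(n, k=1)].sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var_s)  # continuity correction
    return z, 2.0 * norm.sf(abs(z))                           # two-sided p-value

rng = np.random.default_rng(6)
series = 0.03 * np.arange(60) + rng.standard_normal(60)  # warming trend plus noise
print(mann_kendall(series))
```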

  3. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    Science.gov (United States)

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  4. Electronic Circuit Analysis Language (ECAL)

    Science.gov (United States)

    Chenghang, C.

    1983-03-01

    The computer aided design technique is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer aided design or computer aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. Electronic circuit analysis language (ECAL) is a comparatively simple and easy-to-use special-purpose circuit analysis language, which uses the FORTRAN language to carry out the interpretive execution. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.

  5. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution in dependence of data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommen...

  6. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  7. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  8. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  9. Australian Apprentice & Trainee Statistics: Electrical and Electronics Trades, 1995 to 1999. Australian Vocational Education & Training.

    Science.gov (United States)

    National Centre for Vocational Education Research, Leabrook (Australia).

    Statistics regarding Australians participating in apprenticeships and traineeships in the electrical and electronics trades in 1995-1999 were reviewed to provide an indication of where skill shortages may be occurring or will likely occur in relation to the following occupations: electrical engineering associate professional; electronics…

  10. Statistical analysis on extreme wave height

    Digital Repository Service at National Institute of Oceanography (India)

    Teena, N.V.; SanilKumar, V.; Sudheesh, K.; Sajeev, R.

    [Abstract not recovered; the indexed text is extraction residue from the paper's reference list and tables, citing WAFO (2000), a MATLAB toolbox for analysis of random waves and loads, Lund University, Sweden (http://www.maths.lth.se/matstat/wafo/), and Table 1, "Statistical results of data and fitted distribution for cumulative distribution...".]

  11. Statistical Analysis of Zebrafish Locomotor Response.

    Science.gov (United States)

    Liu, Yiwen; Carmer, Robert; Zhang, Gaonan; Venkatraman, Prahatha; Brown, Skye Ashton; Pang, Chi-Pui; Zhang, Mingzhi; Ma, Ping; Leung, Yuk Fai

    2015-01-01

    Zebrafish larvae display rich locomotor behavior upon external stimulation. The movement can be simultaneously tracked from many larvae arranged in multi-well plates. The resulting time-series locomotor data have been used to reveal new insights into neurobiology and pharmacology. However, the data are of large scale, and the corresponding locomotor behavior is affected by multiple factors. These issues pose a statistical challenge for comparing larval activities. To address this challenge, this study has analyzed a visually-driven locomotor behavior named the visual motor response (VMR) using Hotelling's T-squared test. This test is well suited to comparing locomotor profiles over a time period. Different wild-type (WT) strains were compared using the test, which shows that they responded differently to light change at different developmental stages. The performance of this test was evaluated by a power analysis, which shows that the test was sensitive enough to detect differences between experimental groups with sample numbers that were commonly used in various studies. In addition, this study investigated the effects of various factors that might affect the VMR by multivariate analysis of variance (MANOVA). The results indicate that the larval activity was generally affected by stage, light stimulus, their interaction, and location in the plate. Nonetheless, different factors affected larval activity differently over time, as indicated by a dynamical analysis of the activity at each second. Intriguingly, this analysis also shows that biological and technical repeats had negligible effect on larval activity. This finding is consistent with that from Hotelling's T-squared test, and suggests that experimental repeats can be combined to enhance statistical power. Together, these investigations have established a statistical framework for analyzing VMR data, a framework that should be generally applicable to other locomotor data with similar structure.
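
    Illustrative sketch of the two-sample Hotelling's T-squared test applied above to locomotor profiles; SciPy has no built-in version, so the statistic and its exact F conversion are computed directly. The synthetic groups below stand in for per-larva activity profiles and are not the study's data.

        import numpy as np
        from scipy.stats import f

        def hotelling_t2(x, y):
            n1, p = x.shape
            n2, _ = y.shape
            diff = x.mean(axis=0) - y.mean(axis=0)
            # pooled covariance of the two groups
            s_pooled = ((n1 - 1) * np.cov(x, rowvar=False)
                        + (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
            t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(s_pooled, diff)
            # exact conversion of T^2 to an F statistic
            f_stat = t2 * (n1 + n2 - p - 1) / ((n1 + n2 - 2) * p)
            p_value = f.sf(f_stat, p, n1 + n2 - p - 1)
            return t2, p_value

        rng = np.random.default_rng(2)
        group_a = rng.normal(0.0, 1.0, size=(30, 5))  # 30 larvae, 5 time bins
        group_b = rng.normal(0.4, 1.0, size=(30, 5))
        print(hotelling_t2(group_a, group_b))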

  12. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Science.gov (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
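
    Illustrative sketch of the running-statistic idea described above: slide a window along the series, compare its two halves with a Mann-Whitney U test, and normalize U to a Z score. Two assumptions are made for illustration: the half-window comparison scheme, and the standard normal approximation in place of the abstract's (truncated) Monte Carlo normalization.

        import numpy as np
        from scipy.stats import mannwhitneyu

        def running_mw_z(series, window=40):
            half = window // 2
            z_scores = []
            for start in range(len(series) - window + 1):
                a = series[start:start + half]           # first half of the window
                b = series[start + half:start + window]  # second half
                u = mannwhitneyu(a, b, alternative="two-sided").statistic
                mean_u = half * half / 2.0                         # E[U] under H0
                sd_u = np.sqrt(half * half * (window + 1) / 12.0)  # SD[U] under H0
                z_scores.append((u - mean_u) / sd_u)
            return np.array(z_scores)

        rng = np.random.default_rng(3)
        data = np.concatenate([rng.normal(0, 1, 100), rng.normal(1, 1, 100)])
        print(running_mw_z(data).round(2)[::20])  # Z spikes near the shift at t=100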

  13. Sensitivity analysis of ranked data: from order statistics to quantiles

    NARCIS (Netherlands)

    Heidergott, B.F.; Volk-Makarewicz, W.

    2015-01-01

    In this paper we provide the mathematical theory for sensitivity analysis of order statistics of continuous random variables, where the sensitivity is with respect to a distributional parameter. Sensitivity analysis of order statistics over a finite number of observations is discussed before

  14. Electron microscopy and image analysis of the GroEL-like protein and its complexes with glutamine synthetase from pea leaves

    NARCIS (Netherlands)

    Tsuprun, Vladimir L.; Boekema, Egbert J.; Pushkin, Alexander V.; Tagunova, Irina V.

    1992-01-01

    The molecular structure of groEL-like protein from pea leaves has been studied by electron microscopy and image analysis of negatively stained particles. Over 1500 molecular projections were selected and classified by multivariate statistical analysis. It was shown that the molecule consists of 14

  15. Comparative study of macrotexture analysis using X-ray diffraction and electron backscattered diffraction techniques

    International Nuclear Information System (INIS)

    Serna, Marilene Morelli

    2002-01-01

    Macrotexture is one of the main characteristics of metallic materials whose physical properties depend on crystallographic direction. Until the mid-1980s, macrotexture analysis was performed only by X-ray diffraction and neutron diffraction techniques. The possibility of analyzing macrotexture by electron backscatter diffraction in the scanning electron microscope, which allows each orientation measurement to be correlated with its location in the microstructure, was a very welcome tool in the field of materials engineering. In this work the theoretical aspects of the two techniques were studied, and both techniques were used to analyze the macrotexture of 1050 and 3003 aluminum sheets with intensities, measured through the texture index 'J', from 2.00 to 5.00. The results obtained by the two techniques were reasonably similar, bearing in mind that the statistical sampling of the data obtained by electron backscatter diffraction is much smaller than that obtained by X-ray diffraction. (author)

  16. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  17. Correlation analysis of electronic products with myopia in preschool and school aged children

    Directory of Open Access Journals (Sweden)

    Li-Li Sun

    2016-02-01

    AIM: To explore the influence of electronic products on myopia in preschool and school-aged children and the developmental patterns of myopia, in order to formulate reasonable guidelines for healthy eye use and to lay a foundation for prevention and control work. METHODS: This retrospective analysis enrolled 900 children aged 3~12 years from the outpatient department, and individualized archives were established for all of them, recording uncorrected visual acuity, optometry, slit lamp, ophthalmoscopy and strabismus inspection results, as well as eye-use habits concerning TVs, computers, mobile phones, iPads, homework and extra-curricular books. The refractive status of each age group, the use of electronic products in different age groups and their correlation with refractive status were statistically analyzed. RESULTS: The number of preschool children with normal uncorrected visual acuity was greater than that of early school-aged children, and the difference was statistically significant (P<0.05) … (P>0.05); the number of children aged 7~12 (early school-aged children) with myopia was greater than that of children aged 3~6 (preschool children), and the difference was statistically significant (P<0.05). CONCLUSION: For preschool children it is necessary to conduct early screening and health guidance, to establish personalized medical records and to provide one-to-one personalized guidance; it is also necessary to avoid arduous learning tasks with heavy accumulated eye use, in order to resist myopia and control its progression. Reducing the use of electronic products has therefore become a topic worthy of further study.

  18. Statistical learning methods in high-energy and astrophysics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Forschungszentrum Juelich GmbH, Zentrallabor fuer Elektronik, 52425 Juelich (Germany) and Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de; Kiesling, C. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2004-11-21

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  19. Statistical learning methods in high-energy and astrophysics analysis

    International Nuclear Information System (INIS)

    Zimmermann, J.; Kiesling, C.

    2004-01-01

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application

  20. The fuzzy approach to statistical analysis

    NARCIS (Netherlands)

    Coppi, Renato; Gil, Maria A.; Kiers, Henk A. L.

    2006-01-01

    Over the last decades, research studies have been developed in which a coalition of Fuzzy Sets Theory and Statistics has been established for different purposes, namely: (i) to introduce new data analysis problems in which the objective involves either fuzzy relationships or fuzzy terms;

  1. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA means-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
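
    Illustrative sketch of the tests named above (Kolmogorov-Smirnov, Student's t, ANOVA) applied with SciPy; the survey scores are synthetic stand-ins, and pairwise t-tests after a significant ANOVA stand in for the LSD post-hoc procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        dept_a = rng.normal(3.8, 0.6, 40)  # hypothetical safety-culture scores
        dept_b = rng.normal(3.5, 0.6, 35)
        dept_c = rng.normal(3.9, 0.6, 45)

        # normality check: Kolmogorov-Smirnov against a fitted normal
        print(stats.kstest(dept_a, "norm", args=(dept_a.mean(), dept_a.std(ddof=1))))
        # two-group mean comparison: Student's t-test
        print(stats.ttest_ind(dept_a, dept_b))
        # multi-group mean comparison: one-way ANOVA, then a pairwise follow-up
        print(stats.f_oneway(dept_a, dept_b, dept_c))
        print(stats.ttest_ind(dept_b, dept_c))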

  2. Statistical properties of single-mode emission in free-electron lasers

    International Nuclear Information System (INIS)

    Bertolotti, M.; Luks, A.; Perina, J.; Perinova, V.; Sibilia, C.

    1984-01-01

    The authors of this paper discuss the statistical properties of radiation produced in the free-electron laser, in the case of single-mode emission when the system is used as an amplifier with very small gain. The coherent-states technique and the q-c number correspondence are employed, starting from the master equation and obtaining the generalized Fokker-Planck equation for the anti-normal quasidistribution function. Solutions of the Fokker-Planck equation provide the photocounting distribution and its factorial moments. No losses are included. It is shown that, in the short-time approximation, the radiation field exhibits antibunching, and that the photocounting distribution, when suitable conditions on the field intensities are fulfilled, shows a two-peak behavior in the stationary regime, evidencing the existence of bistable states.

  3. Analysis of neutron flux measurement systems using statistical functions

    International Nuclear Information System (INIS)

    Pontes, Eduardo Winston

    1997-01-01

    This work develops an integrated analysis for neutron flux measurement systems using the concepts of cumulants and spectra. Its major contribution is the generalization of Campbell's theorem in the form of spectra in the frequency domain, and its application to the analysis of neutron flux measurement systems. Campbell's theorem, in its generalized form, constitutes an important tool, not only to find the nth-order frequency spectra of the radiation detector, but also in the system analysis. The radiation detector, an ionization chamber for neutrons, is modeled for cylindrical, plane and spherical geometries. The detector current pulses are characterized by a vector of random parameters, and the associated charges, statistical moments and frequency spectra of the resulting current are calculated. A computer program is developed for application of the proposed methodology. In order for the analysis to integrate the associated electronics, the signal processor is studied, considering analog and digital configurations. The analysis is unified by developing the concept of equivalent systems that can be used to describe the cumulants and spectra in analog or digital systems. The noise in the signal processor input stage is analysed in terms of the second-order spectrum. Mathematical expressions are presented for cumulants and spectra up to fourth order, for important cases of filter positioning relative to detector spectra. Unbiased conventional estimators for cumulants are used, and, to evaluate system precision and response time, expressions are developed for their variances. Finally, some possibilities for obtaining neutron radiation flux as a function of cumulants are discussed. In summary, this work proposes some analysis tools which make possible important decisions in the design of better neutron flux measurement systems. (author)
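
    A hedged sketch of the relations behind the generalized Campbell's theorem invoked above, in standard shot-noise notation (assumed here, not taken from the thesis): for detector current pulses h(t) arriving as a Poisson process of rate r, the n-th cumulant of the output and its second-order frequency spectrum read

        % n-th cumulant of the filtered Poisson (shot-noise) current
        \kappa_n = r \int_{-\infty}^{\infty} h^{n}(t) \, \mathrm{d}t ,
        \qquad
        % second-order spectrum via the pulse transfer function
        S(\omega) = r \, \lvert H(\omega) \rvert^{2} ,
        \quad
        H(\omega) = \int_{-\infty}^{\infty} h(t) \, e^{-i \omega t} \, \mathrm{d}t .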

  4. Statistical theory of electron densities

    International Nuclear Information System (INIS)

    Pratt, L.R.; Hoffman, G.G.; Harris, R.A.

    1988-01-01

    An optimized Thomas-Fermi theory is proposed which retains the simplicity of the original theory and is a suitable reference theory for Monte Carlo density functional treatments of condensed materials. The key ingredient of the optimized theory is a neighborhood sampled potential which contains effects of the inhomogeneities in the one-electron potential. In contrast to the traditional Thomas-Fermi approach, the optimized theory predicts a finite electron density in the vicinity of a nucleus. Consideration of the example of an ideal electron gas subject to a central Coulomb field indicates that implementation of the approach is straightforward. The optimized theory is found to fail completely when a classically forbidden region is approached. However, these circumstances are not of primary interest for calculations of interatomic forces. It is shown how the energy functional of the density may be constructed by integration of a generalized Hellmann-Feynman relation. This generalized Hellmann-Feynman relation proves to be equivalent to the variational principle of density functional quantum mechanics, and, therefore, the present density theory can be viewed as a variational consequence of the constructed energy functional.

  5. Foundation of statistical energy analysis in vibroacoustics

    CERN Document Server

    Le Bot, A

    2015-01-01

    This title deals with the statistical theory of sound and vibration. The foundation of statistical energy analysis is presented in great detail. In the modal approach, an introduction to random vibration with application to complex systems having a large number of modes is provided. For the wave approach, the phenomena of propagation, group speed, and energy transport are extensively discussed. Particular emphasis is given to the emergence of diffuse field, the central concept of the theory.

  6. IMPLEMENTATION AND VALIDATION OF STATISTICAL TESTS IN RESEARCH'S SOFTWARE HELPING DATA COLLECTION AND PROTOCOLS ANALYSIS IN SURGERY.

    Science.gov (United States)

    Kuretzki, Carlos Henrique; Campos, Antônio Carlos Ligocki; Malafaia, Osvaldo; Soares, Sandramara Scandelari Kusano de Paula; Tenório, Sérgio Bernardo; Timi, Jorge Rufino Ribas

    2016-03-01

    The use of information technology is often applied in healthcare. With regard to scientific research, SINPE(c) - Integrated Electronic Protocols - was created as a tool to support researchers, offering clinical data standardization. Until then, SINPE(c) lacked statistical tests performed by automatic analysis. The objective was to add to SINPE(c) features for automatic execution of the main statistical methods used in medicine. The study was divided into four topics: checking users' interest in the implementation of the tests; surveying the frequency of their use in healthcare; carrying out the implementation; and validating the results with researchers and their protocols. It was applied to a group of users of this software working on their stricto sensu master's and doctoral theses in one postgraduate program in surgery. To assess the reliability of the statistics, the data obtained automatically by SINPE(c) were compared with those computed manually by a statistician experienced in this type of study. There was interest in the use of automatic statistical tests, with good acceptance. The chi-square, Mann-Whitney, Fisher and Student's t-tests were considered the tests most frequently used by participants in medical studies. These methods were implemented and thereafter approved as expected. The automatic statistical analysis incorporated into SINPE(c) was shown to be reliable and equivalent to the manual analysis, validating its use as a tool for medical research.

  7. A STATISTICAL STUDY OF SOLAR ELECTRON EVENTS OVER ONE SOLAR CYCLE

    International Nuclear Information System (INIS)

    Wang Linghua; Lin, R. P.; Krucker, Säm; Mason, Glenn M.

    2012-01-01

    We survey the statistical properties of 1191 solar electron events observed by the WIND 3DP instrument from 300 keV for a solar cycle (1995 through 2005). After taking into account times of high background, the corrected occurrence frequency of solar electron events versus peak flux exhibits a power-law distribution over three orders of magnitude with exponents between –1.0 and –1.6 for different years, comparable to the frequency distribution of solar proton events, microflares, and coronal mass ejections (CMEs), but significantly flatter than that of soft X-ray (SXR) flares. At 40 keV (2.8 keV), the integrated occurrence rate above ∼0.29 (∼330) cm⁻² s⁻¹ sr⁻¹ keV⁻¹ near 1 AU is ∼1000 year⁻¹ (∼600 year⁻¹) at solar maximum and ∼35 year⁻¹ (∼25 year⁻¹) at solar minimum, about an order of magnitude larger than the observed occurrence rate. We find these events typically extend over ∼45° in longitude, implying the occurrence rate over the whole Sun is ∼10⁴ year⁻¹ near solar maximum. The observed solar electron events have a 98.75% association with type III radio bursts, suggesting all type III bursts may be associated with a solar electron event. They have a close (∼76%) association with the presence of low-energy (∼0.02-2 MeV nucleon⁻¹), ³He-rich (³He/⁴He ≥ 0.01) ion emissions measured by the ACE ULEIS instrument. For these electron events, only ∼35% are associated with a reported GOES SXR flare, but ∼60% appear to be associated with a CME, with ∼50% of these CMEs being narrow. These electrons are often detected down to below 1 keV, indicating a source high in the corona.

  8. Statistical Analysis of Big Data on Pharmacogenomics

    Science.gov (United States)

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrices for understanding correlation structure, inverse covariance matrices for network modeling, large-scale simultaneous tests for selecting significantly differentially expressed genes and proteins and genetic markers for complex diseases, and high-dimensional variable selection for identifying important molecules for understanding molecular mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
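
    Illustrative sketch of one method from this review family: sparse inverse-covariance (network) estimation with the graphical lasso, here via scikit-learn on synthetic expression data rather than a real pharmacogenomic dataset; the regularization value is an arbitrary choice.

        import numpy as np
        from sklearn.covariance import GraphicalLasso

        rng = np.random.default_rng(5)
        n_samples, n_genes = 100, 20
        X = rng.normal(size=(n_samples, n_genes))
        X[:, 1] = X[:, 0] + 0.3 * rng.normal(size=n_samples)  # plant one dependency

        model = GraphicalLasso(alpha=0.2).fit(X)
        precision = model.precision_  # nonzero off-diagonals = network edges
        edges = np.argwhere(np.triu(np.abs(precision) > 1e-6, k=1))
        print("recovered network edges:", edges.tolist())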

  9. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  10. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  11. Robust statistics and geochemical data analysis

    International Nuclear Information System (INIS)

    Di, Z.

    1987-01-01

    Advantages of robust procedures over ordinary least-squares procedures in geochemical data analysis are demonstrated using NURE data from the Hot Springs Quadrangle, South Dakota, USA. Robust principal components analysis with 5% multivariate trimming successfully guarded the analysis against perturbations by outliers and increased the number of interpretable factors. Regression with SINE estimates significantly increased the goodness-of-fit of the regression and improved the correspondence of delineated anomalies with known uranium prospects. Because of the ubiquitous existence of outliers in geochemical data, robust statistical procedures are suggested as routine procedures to replace ordinary least-squares procedures.
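
    Illustrative sketch of the robust-PCA idea: estimate a covariance matrix that discounts multivariate outliers and take its eigenvectors. The minimum covariance determinant estimator below stands in for the paper's 5% multivariate trimming, and the data are synthetic, not NURE measurements.

        import numpy as np
        from sklearn.covariance import MinCovDet

        rng = np.random.default_rng(6)
        clean = rng.multivariate_normal([0, 0, 0], np.diag([3.0, 1.0, 0.5]), 190)
        outliers = rng.uniform(-20, 20, size=(10, 3))  # gross "anomalies"
        X = np.vstack([clean, outliers])

        # covariance estimated from the most concentrated 95% of the data
        robust_cov = MinCovDet(support_fraction=0.95).fit(X).covariance_
        eigvals, eigvecs = np.linalg.eigh(robust_cov)
        print("robust principal axes (columns):\n", eigvecs[:, ::-1].round(2))
        print("robust component variances:", eigvals[::-1].round(2))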

  12. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    International Nuclear Information System (INIS)

    Reed, J.K.

    1999-01-01

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and "expert" data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) Development of a groundwater quality baseline prior to remediation startup, (2) Targeting of constituents for removal from RCRA GWPS, (3) Targeting of constituents for removal from the UIC permit, (4) Targeting of constituents for reduced monitoring, (5) Targeting of monitoring wells not producing representative samples, (6) Reduction in statistical evaluation, and (7) Identification of contamination from other facilities.

  13. Comparative analysis of the terms "electronic commerce" and "electronic business"

    OpenAIRE

    Kavaliauskienė, Virginija; Šarapovas, Tadas

    2002-01-01

    Establishing the terms that clearly and consistently describe the growing and dynamic networked economy is a critical first step toward further analysis and evaluation of electronic commerce and electronic business processes. Electronic commerce is making an impact on the ways that purchasing activities are being conducted. Much of the early literature on this subject was very speculative. However, the growth of e-commerce has enabled more observations to be made of the use of electronic business...

  14. Conjunction analysis and propositional logic in fMRI data analysis using Bayesian statistics.

    Science.gov (United States)

    Rudert, Thomas; Lohmann, Gabriele

    2008-12-01

    To evaluate logical expressions over different effects in data analyses using the general linear model (GLM) and to evaluate logical expressions over different posterior probability maps (PPMs). In functional magnetic resonance imaging (fMRI) data analysis, the GLM was applied to estimate unknown regression parameters. Based on the GLM, Bayesian statistics can be used to determine the probability of conjunction, disjunction, implication, or any other arbitrary logical expression over different effects or contrast. For second-level inferences, PPMs from individual sessions or subjects are utilized. These PPMs can be combined to a logical expression and its probability can be computed. The methods proposed in this article are applied to data from a STROOP experiment and the methods are compared to conjunction analysis approaches for test-statistics. The combination of Bayesian statistics with propositional logic provides a new approach for data analyses in fMRI. Two different methods are introduced for propositional logic: the first for analyses using the GLM and the second for common inferences about different probability maps. The methods introduced extend the idea of conjunction analysis to a full propositional logic and adapt it from test-statistics to Bayesian statistics. The new approaches allow inferences that are not possible with known standard methods in fMRI. (c) 2008 Wiley-Liss, Inc.
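
    Illustrative sketch of evaluating logical expressions over posterior probability maps (PPMs): given voxel-wise posterior probabilities for two effects, conjunction, disjunction and implication follow from the probability calculus. Independence across effects is assumed below purely for illustration; the paper works from the joint posterior of the GLM.

        import numpy as np

        rng = np.random.default_rng(7)
        p1 = rng.uniform(size=(4, 4))  # P(effect A present), per voxel
        p2 = rng.uniform(size=(4, 4))  # P(effect B present), per voxel

        p_and = p1 * p2               # conjunction  A AND B
        p_or = p1 + p2 - p1 * p2      # disjunction  A OR B
        p_implies = 1 - p1 + p1 * p2  # implication  A -> B  (= NOT A OR (A AND B))
        print("voxels with P(A and B) > 0.95:", int((p_and > 0.95).sum()))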

  15. Quantum-mechanical analysis of low-gain free-electron laser oscillators

    Science.gov (United States)

    Fares, H.; Yamada, M.; Chiadroni, E.; Ferrario, M.

    2018-05-01

    In the previous classical theory of low-gain free-electron laser (FEL) oscillators, the electron is described as a point-like particle, a delta function in position space. On the other hand, in the previous quantum treatments, the electron is described as a plane wave with a single momentum state, a delta function in momentum space. In reality, an electron must have statistical uncertainties in the position and momentum domains. Thus, the electron is neither a point-like charge nor a plane wave of a single momentum. In this paper, we rephrase the theory of the low-gain FEL in which the interacting electron is represented quantum mechanically by a plane wave with a finite spreading length (i.e., a wave packet). Using the concepts of reference-frame transformation and statistical quantum mechanics, an expression for the single-pass radiation gain is derived. The spectral broadening of the radiation is expressed in terms of the spreading length of an electron, the relaxation time characterizing the energy spread of electrons, and the interaction time. We introduce a comparison between our results and those obtained in the already-known classical analyses, where good agreement between the two is shown. Alongside this correspondence with the classical results, novel insights into the electron dynamics and the interaction mechanism are presented.

  16. Full counting statistics of level renormalization in electron transport through double quantum dots

    International Nuclear Information System (INIS)

    Luo Junyan; Shen Yu; Cen Gang; He Xiaoling; Wang Changrong; Jiao Hujun

    2011-01-01

    We examine the full counting statistics of electron transport through double quantum dots coupled in series, with particular attention being paid to the unique features originating from level renormalization. It is clearly illustrated that the energy renormalization gives rise to a dynamic charge blockade mechanism, which eventually results in super-Poissonian noise. Coupling of the double dots to an external heat bath leads to dephasing and relaxation mechanisms, which are demonstrated to suppress the noise in a unique way.

  17. Empirical and Statistical Evaluation of the Effectiveness of Four ...

    African Journals Online (AJOL)

    Akorede

    ABSTRACT: Data compression is the process of reducing the size of a file to effectively ... Through the statistical analysis performed using Boxplot and ANOVA and comparison made ...... Automatic Control, Electronics and Computer Science.

  18. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved, using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for

  19. Multivariate Statistical Methods as a Tool of Financial Analysis of Farm Business

    Czech Academy of Sciences Publication Activity Database

    Novák, J.; Sůvová, H.; Vondráček, Jiří

    2002-01-01

    Roč. 48, č. 1 (2002), s. 9-12 ISSN 0139-570X Institutional research plan: AV0Z1030915 Keywords : financial analysis * financial ratios * multivariate statistical methods * correlation analysis * discriminant analysis * cluster analysis Subject RIV: BB - Applied Statistics, Operational Research

  20. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  1. Database for Simulation of Electron Spectra for Surface Analysis (SESSA)

    Science.gov (United States)

    SRD 100 Database for Simulation of Electron Spectra for Surface Analysis (SESSA) (PC database for purchase).   This database has been designed to facilitate quantitative interpretation of Auger-electron and X-ray photoelectron spectra and to improve the accuracy of quantitation in routine analysis. The database contains all physical data needed to perform quantitative interpretation of an electron spectrum for a thin-film specimen of given composition. A simulation module provides an estimate of peak intensities as well as the energy and angular distributions of the emitted electron flux.

  2. Statistical analysis of environmental data

    International Nuclear Information System (INIS)

    Beauchamp, J.J.; Bowman, K.O.; Miller, F.L. Jr.

    1975-10-01

    This report summarizes the analyses of data obtained by the Radiological Hygiene Branch of the Tennessee Valley Authority from samples taken around the Browns Ferry Nuclear Plant located in Northern Alabama. The data collection was begun in 1968 and a wide variety of types of samples have been gathered on a regular basis. The statistical analysis of environmental data involving very low-levels of radioactivity is discussed. Applications of computer calculations for data processing are described

  3. Path integral molecular dynamics for exact quantum statistics of multi-electronic-state systems.

    Science.gov (United States)

    Liu, Xinzijian; Liu, Jian

    2018-03-14

    An exact approach to compute physical properties for general multi-electronic-state (MES) systems in thermal equilibrium is presented. The approach is extended from our recent progress on path integral molecular dynamics (PIMD), Liu et al. [J. Chem. Phys. 145, 024103 (2016)] and Zhang et al. [J. Chem. Phys. 147, 034109 (2017)], for quantum statistical mechanics when a single potential energy surface is involved. We first define an effective potential function that is numerically favorable for MES-PIMD and then derive corresponding estimators in MES-PIMD for evaluating various physical properties. Its application to several representative one-dimensional and multi-dimensional models demonstrates that MES-PIMD in principle offers a practical tool in either of the diabatic and adiabatic representations for studying exact quantum statistics of complex/large MES systems when the Born-Oppenheimer approximation, Condon approximation, and harmonic bath approximation are broken.

  4. Highly Robust Statistical Methods in Medical Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  5. Statistical Power Analysis with Missing Data A Structural Equation Modeling Approach

    CERN Document Server

    Davey, Adam

    2009-01-01

    Statistical power analysis has revolutionized the ways in which we conduct and evaluate research. Similar developments in the statistical analysis of incomplete (missing) data are gaining more widespread applications. This volume brings statistical power and incomplete data together under a common framework, in a way that is readily accessible to those with only an introductory familiarity with structural equation modeling. It answers many practical questions, such as: how missing data affect the statistical power in a study; how much power is likely with different amounts and types …

  6. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2003-01-01

    Statistical analyses are performed for material strength parameters from a large number of specimens of structural timber. Non-parametric statistical analysis and fits have been investigated for the following distribution types: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull... fits to the data available, especially if tail fits are used whereas the Log Normal distribution generally gives a poor fit and larger coefficients of variation, especially if tail fits are used. The implications on the reliability level of typical structural elements and on partial safety factors... for timber are investigated....
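
    Illustrative sketch of the distribution fitting described above: fit Normal, Lognormal and 2-parameter Weibull models to synthetic strength data with SciPy and compare log-likelihoods; a tail fit would simply restrict the sample to its lower quantiles before fitting.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)
        strength = stats.weibull_min.rvs(c=4.0, scale=40.0, size=500, random_state=rng)

        fits = {
            "Normal": (stats.norm, stats.norm.fit(strength)),
            "Lognormal": (stats.lognorm, stats.lognorm.fit(strength, floc=0)),
            "Weibull 2p": (stats.weibull_min, stats.weibull_min.fit(strength, floc=0)),
        }
        for name, (dist, params) in fits.items():
            loglik = np.sum(dist.logpdf(strength, *params))
            print(f"{name:10s} params={np.round(params, 3)} loglik={loglik:.1f}")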

  7. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  8. A Divergence Statistics Extension to VTK for Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
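
    Illustrative sketch of the divergence idea: quantify the discrepancy between an empirical distribution and a theoretical "ideal" one over a shared binning. The Kullback-Leibler divergence is used here as a representative measure; the abstract does not specify which divergence the VTK engine implements.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(9)
        observed = rng.normal(0.3, 1.1, 10_000)  # empirical sample
        bins = np.linspace(-5, 5, 51)

        emp, _ = np.histogram(observed, bins=bins, density=True)
        centers = 0.5 * (bins[:-1] + bins[1:])
        theo = stats.norm.pdf(centers)  # theoretical "ideal": standard normal

        # discrete KL divergence over bins where both densities are positive
        mask = (emp > 0) & (theo > 0)
        width = bins[1] - bins[0]
        kl = np.sum(emp[mask] * np.log(emp[mask] / theo[mask])) * width
        print(f"KL(empirical || theoretical) = {kl:.4f}")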

  9. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    Energy Technology Data Exchange (ETDEWEB)

    Vartanyants, I.A.; Singer, A. [HASYLAB at Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2009-07-15

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)
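
    For reference, a hedged sketch of the Gaussian Schell-model source invoked above, in its standard form (notation assumed, not copied from the report): the cross-spectral density factorizes into a Gaussian intensity profile of rms size \sigma and a Gaussian spectral degree of coherence with coherence length \xi,

        W(x_1, x_2) = \sqrt{S(x_1) \, S(x_2)} \; \mu(x_1 - x_2),
        \qquad
        S(x) = \exp\!\left( -\frac{x^2}{2\sigma^2} \right),
        \quad
        \mu(\Delta x) = \exp\!\left( -\frac{\Delta x^2}{2\xi^2} \right).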

  10. Analysis of coherence properties of 3-rd generation synchrotron sources and free-electron lasers

    International Nuclear Information System (INIS)

    Vartanyants, I.A.; Singer, A.

    2009-07-01

    A general theoretical approach based on the results of statistical optics is used for the analysis of the transverse coherence properties of 3-rd generation synchrotron sources and X-ray free-electron lasers (XFEL). Correlation properties of the wave fields are calculated at different distances from an equivalent Gaussian Schell-model source. This model is used to describe coherence properties of the five meter undulator source at the synchrotron storage ring PETRA III. In the case of XFEL sources the decomposition of the statistical fields into a sum of independently propagating transverse modes is used for the analysis of the coherence properties of these new sources. A detailed calculation is performed for the parameters of the SASE1 undulator at the European XFEL. It is demonstrated that only a few modes contribute significantly to the total radiation field of that source. (orig.)

  11. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview...

  12. On the Statistical Validation of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Rosane Riera Freire

    2007-06-01

    Technical analysis, or charting, aims at visually identifying geometrical patterns in price charts in order to anticipate price "trends". In this paper we revisit the issue of technical analysis validation, which has been tackled in the literature without taking care of (i) the presence of heterogeneity and (ii) statistical dependence in the analyzed data - various agglutinated return time series from distinct financial securities. The main purpose here is to address the first cited problem by suggesting a validation methodology that also "homogenizes" the securities according to the finite-dimensional probability distribution of their return series. The general steps go through the identification of the stochastic processes for the securities returns, the clustering of similar securities and, finally, the identification of the presence, or absence, of informational content obtained from those price patterns. We illustrate the proposed methodology with a real-data exercise including several securities of the global market. Our investigation shows that there is a statistically significant informational content in two out of three common patterns usually found through technical analysis, namely: triangle, rectangle and head and shoulders.

  13. The "Core Concepts Plus" Paradigm for Creating an Electronic Textbook for Introductory Business and Economic Statistics

    Science.gov (United States)

    Haley, M. Ryan

    2013-01-01

    This paper describes a flexible paradigm for creating an electronic "Core Concepts Plus" textbook (CCP-text) for a course in Introductory Business and Economic Statistics (IBES). In general terms, "core concepts" constitute the intersection of IBES course material taught by all IBES professors at the author's university. The…

  14. Data management and statistical analysis for environmental assessment

    International Nuclear Information System (INIS)

    Wendelberger, J.R.; McVittie, T.I.

    1995-01-01

    Data management and statistical analysis for environmental assessment are important issues at the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system that was developed, FIMAD, is described; it provides efficient data storage as well as visualization tools that may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities.

  15. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)
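
    Illustrative sketch of the extreme-value ingredient mentioned above: fit a generalized extreme value (GEV) distribution to simulated per-trial maxima with SciPy. This toy stands in for, and is far simpler than, the licensing-grade NOP treatment of uncertainties.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(12)
        # maximum channel statistic per simulated trial (synthetic stand-in)
        maxima = np.array([rng.normal(size=500).max() for _ in range(200)])

        shape, loc, scale = stats.genextreme.fit(maxima)
        print("GEV fit: shape=%.3f loc=%.3f scale=%.3f" % (shape, loc, scale))
        print("99th-percentile estimate:",
              stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale))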

  16. ANALYSIS OF ORGANIZATIONAL AND TECHNOLOGICAL ENVIRONMENT OF THE EXISTENCE OF ELECTRONIC SERVICES

    Directory of Open Access Journals (Sweden)

    R. Y. Vakulenko

    2016-01-01

    In this paper, based on an analysis of statistical information on the development of the market for electronic services as part of the information services market, as well as the author's understanding of the achievements in this area, the factors behind the dynamic development of information and communication technologies and the benefits of e-business in the Russian Federation are considered, in particular trade and market-development activities for this type of service. In particular, it was found that despite the positive dynamics of the development of the information (electronic) services market in our country, it is still very small compared with that of other states. The main problems of the development of the information services market are shown using the example of e-commerce. The most Internet-intensive activities are those of institutions of higher education, finance and health care.

  17. Diagnosis checking of statistical analysis in RCTs indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-11-01

    Statistical analysis is essential for reporting of the results of randomized controlled trials (RCTs), as well as evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether the assumptions of that analysis are valid. To review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle assumptions of statistical analysis. We reviewed all RCTs published in December 2014 that appeared in journals indexed in PubMed using the Cochrane highly sensitive search strategy. The 2014 impact factors of the journals were used as proxies for their quality. The type of statistical analysis used and whether the assumptions of the analysis were tested were reviewed. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (27·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnosis checking was rarely conducted, with only 20%, 8·6% and 7% checked for generalized linear model, Cox proportional hazard model and multilevel model, respectively. Study characteristics (study type, drug trial, funding sources, journal type and endorsement of CONSORT guidelines) were not associated with the reporting of diagnosis checking. The diagnosis of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines about the reporting of a diagnosis of assumptions. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.

  18. A κ-generalized statistical mechanics approach to income analysis

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
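
    For reference, a hedged sketch of the κ-generalized distribution in its published standard form (parameter names assumed): the complementary CDF of income x is built from the κ-exponential, which interpolates between an exponential bulk and a Pareto power-law tail,

        P_{>}(x) = \exp_{\kappa}\!\left( -\beta x^{\alpha} \right),
        \qquad
        \exp_{\kappa}(u) = \left( \sqrt{1 + \kappa^{2} u^{2}} + \kappa u \right)^{1/\kappa},
        \quad 0 < \kappa < 1,

    with \exp_{\kappa}(u) \to e^{u} as \kappa \to 0 and P_{>}(x) \sim x^{-\alpha/\kappa} for large x.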

  19. A κ-generalized statistical mechanics approach to income analysis

    International Nuclear Information System (INIS)

    Clementi, F; Gallegati, M; Kaniadakis, G

    2009-01-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low–middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful

  20. Normality Tests for Statistical Analysis: A Guide for Non-Statisticians

    Science.gov (United States)

    Ghasemi, Asghar; Zahediasl, Saleh

    2012-01-01

    Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
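
    Illustrative sketch of such normality checks; the commentary works in SPSS, so the SciPy equivalents below (on synthetic data) are an assumption for illustration.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(10)
        sample = rng.normal(loc=5.0, scale=2.0, size=80)

        print(stats.shapiro(sample))  # Shapiro-Wilk test
        print(stats.kstest(sample, "norm",
                           args=(sample.mean(), sample.std(ddof=1))))  # K-S test
        print("skewness:", stats.skew(sample), "kurtosis:", stats.kurtosis(sample))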

  1. Classification of Malaysia aromatic rice using multivariate statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A. [School of Mechatronic Engineering, Universiti Malaysia Perlis, Kampus Pauh Putra, 02600 Arau, Perlis (Malaysia); Omar, O. [Malaysian Agriculture Research and Development Institute (MARDI), Persiaran MARDI-UPM, 43400 Serdang, Selangor (Malaysia)

    2015-05-15

    Aromatic rice (Oryza sativa L.) is considered the best quality premium rice. These varieties are preferred by consumers because of quality criteria such as shape, colour, distinctive aroma and flavour. The price of aromatic rice is higher than that of ordinary rice because of the special growth conditions it requires, for instance a specific climate and soil. Presently, aromatic rice quality is identified using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or human sensory panels. However, the use of human sensory panels has significant drawbacks, such as lengthy training time, proneness to fatigue as the number of samples increases, and inconsistency. The GC-MS analysis techniques, on the other hand, require detailed procedures and lengthy analysis and are quite costly. This paper presents the application of an in-house developed Electronic Nose (e-nose) to classify new aromatic rice varieties. The e-nose is used to classify the variety of aromatic rice based on the odour of the samples, which were taken from several rice varieties. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and K-Nearest Neighbours (KNN), to classify the unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to perform recognition and classification of the unspecified samples. Visual observation of the PCA and LDA plots of the rice shows that the instrument was able to separate the samples into different clusters accordingly. The results of LDA and KNN, with low misclassification error, support the above findings, and we may conclude that the e-nose is successfully applied to the classification of the aromatic rice varieties.
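
    Illustrative sketch of the classification chain the abstract describes (PCA reduction, LDA, KNN with leave-one-out validation), run on synthetic sensor responses standing in for the e-nose data; the component counts are arbitrary choices.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(11)
        n_per_class, n_sensors = 20, 8
        X = np.vstack([rng.normal(mu, 0.5, size=(n_per_class, n_sensors))
                       for mu in (0.0, 1.0, 2.0)])  # three rice "varieties"
        y = np.repeat([0, 1, 2], n_per_class)

        pipe = make_pipeline(PCA(n_components=4),
                             LinearDiscriminantAnalysis(n_components=2),
                             KNeighborsClassifier(n_neighbors=3))
        scores = cross_val_score(pipe, X, y, cv=LeaveOneOut())
        print(f"leave-one-out accuracy: {scores.mean():.3f}")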

  2. Classification of Malaysia aromatic rice using multivariate statistical analysis

    Science.gov (United States)

    Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.

    2015-05-01

    Aromatic rice (Oryza sativa L.) is considered the best-quality premium rice. These varieties are preferred by consumers because of criteria such as shape, colour, distinctive aroma and flavour. The price of aromatic rice is higher than that of ordinary rice because of the special growing conditions it requires, for instance specific climate and soil. Presently, aromatic rice quality is identified using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or human sensory panels. However, human sensory panels have significant drawbacks: they require lengthy training, are prone to fatigue as the number of samples increases, and can be inconsistent. GC-MS analysis, on the other hand, requires detailed procedures and lengthy analysis and is quite costly. This paper presents the application of an in-house developed Electronic Nose (e-nose) to classify new aromatic rice varieties. The e-nose is used to classify the rice varieties based on the odour of the samples. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and K-Nearest Neighbours (KNN), to classify unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to recognize and classify unspecified samples. Visual observation of the PCA and LDA plots shows that the instrument was able to separate the samples into distinct clusters, and the low misclassification error of LDA and KNN supports this finding; we may therefore conclude that the e-nose can successfully classify the aromatic rice varieties.

  3. Classification of Malaysia aromatic rice using multivariate statistical analysis

    International Nuclear Information System (INIS)

    Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.

    2015-01-01

    Aromatic rice (Oryza sativa L.) is considered the best-quality premium rice. These varieties are preferred by consumers because of criteria such as shape, colour, distinctive aroma and flavour. The price of aromatic rice is higher than that of ordinary rice because of the special growing conditions it requires, for instance specific climate and soil. Presently, aromatic rice quality is identified using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or human sensory panels. However, human sensory panels have significant drawbacks: they require lengthy training, are prone to fatigue as the number of samples increases, and can be inconsistent. GC-MS analysis, on the other hand, requires detailed procedures and lengthy analysis and is quite costly. This paper presents the application of an in-house developed Electronic Nose (e-nose) to classify new aromatic rice varieties. The e-nose is used to classify the rice varieties based on the odour of the samples. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and K-Nearest Neighbours (KNN), to classify unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to recognize and classify unspecified samples. Visual observation of the PCA and LDA plots shows that the instrument was able to separate the samples into distinct clusters, and the low misclassification error of LDA and KNN supports this finding; we may therefore conclude that the e-nose can successfully classify the aromatic rice varieties.

  4. Statistical analysis of MMS observations of energetic electron escape observed at/beyond the dayside magnetopause

    Science.gov (United States)

    Cohen, Ian J.; Mauk, Barry H.; Anderson, Brian J.; Westlake, Joseph H.; Sibeck, David G.; Turner, Drew L.; Fennell, Joseph F.; Blake, J. Bern; Jaynes, Allison N.; Leonard, Trevor W.; Baker, Daniel N.; Spence, Harlan E.; Reeves, Geoff D.; Giles, Barbara J.; Strangeway, Robert J.; Torbert, Roy B.; Burch, James L.

    2017-09-01

    Observations from the Energetic Particle Detector (EPD) instrument suite aboard the Magnetospheric Multiscale (MMS) spacecraft show that energetic (greater than tens of keV) magnetospheric particle escape into the magnetosheath occurs commonly across the dayside. This includes the surprisingly frequent observation of magnetospheric electrons in the duskside magnetosheath, an unexpected result given assumptions regarding magnetic drift shadowing. The 238 events identified in the 40 keV electron energy channel during the first MMS dayside season exhibit strongly anisotropic pitch angle distributions, indicating monohemispheric field-aligned streaming away from the magnetopause. A review of the extremely rich literature of energetic electron observations beyond the magnetopause is provided to place these new observations into historical context. Despite the extensive history of such research, these new observations provide a more comprehensive data set that includes unprecedented magnetic local time (MLT) coverage of the dayside equatorial magnetopause/magnetosheath. These data clearly highlight the common escape of energetic electrons along magnetic field lines concluded to have been reconnected across the magnetopause. While these streaming escape events agree with prior studies, which show a strong correlation with geomagnetic activity (suggesting a magnetotail source), and occur most frequently during periods of southward IMF, the high number of duskside events is unexpected and previously unobserved. Although the lowest electron energy channel was the focus of this study, the events reported here exhibit pitch angle anisotropies indicative of streaming up to 200 keV, which could represent the magnetopause loss of >1 MeV electrons from the outer radiation belt.

  5. Statistical analysis of MMS observations of energetic electron escape observed at/beyond the dayside magnetopause

    International Nuclear Information System (INIS)

    Cohen, Ian J.; Mauk, Barry H.; Anderson, Brian J.; Westlake, Joseph H.; Sibeck, David G.

    2017-01-01

    Here, observations from the Energetic Particle Detector (EPD) instrument suite aboard the Magnetospheric Multiscale (MMS) spacecraft show that energetic (greater than tens of keV) magnetospheric particle escape into the magnetosheath occurs commonly across the dayside. This includes the surprisingly frequent observation of magnetospheric electrons in the duskside magnetosheath, an unexpected result given assumptions regarding magnetic drift shadowing. The 238 events identified in the 40 keV electron energy channel during the first MMS dayside season exhibit strongly anisotropic pitch angle distributions, indicating monohemispheric field-aligned streaming away from the magnetopause. A review of the extremely rich literature of energetic electron observations beyond the magnetopause is provided to place these new observations into historical context. Despite the extensive history of such research, these new observations provide a more comprehensive data set that includes unprecedented magnetic local time (MLT) coverage of the dayside equatorial magnetopause/magnetosheath. These data clearly highlight the common escape of energetic electrons along magnetic field lines concluded to have been reconnected across the magnetopause. While these streaming escape events agree with prior studies, which show a strong correlation with geomagnetic activity (suggesting a magnetotail source), and occur most frequently during periods of southward IMF, the high number of duskside events is unexpected and previously unobserved. Although the lowest electron energy channel was the focus of this study, the events reported here exhibit pitch angle anisotropies indicative of streaming up to 200 keV, which could represent the magnetopause loss of >1 MeV electrons from the outer radiation belt.

  6. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    Science.gov (United States)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistical data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various basic statistical topics along with parametric statistical data analysis. The output of the system is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed using the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation and system maintenance. This statistical data analysis application is expected to support statistics lectures and to make it easier for students to understand statistical analysis on mobile devices.

  7. Sources of Safety Data and Statistical Strategies for Design and Analysis: Postmarket Surveillance.

    Science.gov (United States)

    Izem, Rima; Sanchez-Kam, Matilde; Ma, Haijun; Zink, Richard; Zhao, Yueqin

    2018-03-01

    Safety data are continuously evaluated throughout the life cycle of a medical product to accurately assess and characterize the risks associated with the product. The knowledge about a medical product's safety profile continually evolves as safety data accumulate. This paper discusses data sources and analysis considerations for safety signal detection after a medical product is approved for marketing. This manuscript is the second in a series of papers from the American Statistical Association Biopharmaceutical Section Safety Working Group. We share our recommendations for the statistical and graphical methodologies necessary to appropriately analyze, report, and interpret safety outcomes, and we discuss the advantages and disadvantages of safety data obtained from passive postmarketing surveillance systems compared to other sources. Signal detection has traditionally relied on spontaneous reporting databases that have been available worldwide for decades. However, current regulatory guidelines and ease of reporting have increased the size of these databases exponentially over the last few years. With such large databases, data-mining tools using disproportionality analysis and helpful graphics are often used to detect potential signals. Although the data sources have many limitations, analyses of these data have been successful at identifying safety signals postmarketing. Experience analyzing these dynamic data is useful in understanding the potential and limitations of analyses with new data sources such as social media, claims, or electronic medical records data.
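
    As a sketch of the disproportionality analysis mentioned above, the proportional reporting ratio (PRR) can be computed from a 2x2 drug/event contingency table; the report counts below are toy values:

        import numpy as np

        def prr(a, b, c, d):
            """Proportional reporting ratio from a 2x2 drug/event table.

            a: drug & event    b: drug & other events
            c: others & event  d: others & other events
            """
            ratio = (a / (a + b)) / (c / (c + d))
            # Approximate 95% CI on the log scale
            se = np.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
            lo, hi = np.exp(np.log(ratio) + np.array([-1.96, 1.96]) * se)
            return ratio, lo, hi

        print(prr(a=25, b=500, c=100, d=50_000))   # toy report counts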

  8. Statistical analysis of metallicity in spiral galaxies

    Energy Technology Data Exchange (ETDEWEB)

    Galeotti, P [Consiglio Nazionale delle Ricerche, Turin (Italy). Lab. di Cosmo-Geofisica; Turin Univ. (Italy). Ist. di Fisica Generale)

    1981-04-01

    A principal component analysis of metallicity and other integral properties of 33 spiral galaxies is presented; the parameters involved are: morphological type, diameter, luminosity and metallicity. From the statistical analysis it is concluded that the sample has only two significant dimensions, and additional tests, involving different parameters, show similar results. Thus it seems that only type and luminosity are independent variables, the other integral properties of spiral galaxies being correlated with them.

  9. Semi-empirical prediction of moisture build-up in an electronic enclosure using analysis of variance (ANOVA)

    DEFF Research Database (Denmark)

    Shojaee Nasirabadi, Parizad; Conseil, Helene; Mohanty, Sankhya

    2016-01-01

    Electronic systems are exposed to harsh environmental conditions such as high humidity in many applications. Moisture transfer into electronic enclosures and condensation can cause several problems, such as material degradation and corrosion. Therefore, it is important to control the moisture content...... and the relative humidity inside electronic enclosures. In this work, moisture transfer into a typical polycarbonate electronic enclosure with a cylindrical opening is studied. The effects of four influential parameters, namely the initial relative humidity inside the enclosure, the radius and length of the opening...... and the temperature, are studied. A set of experiments is performed based on a fractional factorial design in order to estimate the time constant for moisture transfer into the enclosure by fitting the experimental data to an analytical quasi-steady-state model. According to the statistical analysis, temperature...
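
    A minimal sketch of the ANOVA step under stated assumptions; the run table, factor levels and fitted time constants below are hypothetical stand-ins for the fractional factorial results:

        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        # Hypothetical 2^(4-1) fractional factorial: fitted time constant tau
        df = pd.DataFrame({
            "tau":    [41.0, 38.5, 55.2, 52.9, 43.1, 40.2, 57.8, 54.0],
            "rh0":    [30, 30, 30, 30, 70, 70, 70, 70],   # initial RH (%)
            "radius": [1, 2, 1, 2, 1, 2, 1, 2],           # opening radius (mm)
            "length": [5, 5, 10, 10, 5, 5, 10, 10],       # opening length (mm)
            "temp":   [25, 45, 45, 25, 45, 25, 25, 45],   # temperature (C)
        })

        # Main-effects ANOVA over the four screened factors
        model = ols("tau ~ C(rh0) + C(radius) + C(length) + C(temp)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))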

  10. Print and Electronic Resources: Usage Statistics at Guru Gobind Singh Indraprastha University Library

    Science.gov (United States)

    Kapoor, Kanta

    2010-01-01

    Purpose: The purpose of this paper is to quantify the use of electronic journals in comparison with the print collections in the Guru Gobind Singh Indraprastha University Library. Design/methodology/approach: A detailed analysis was made of the use of lending services, the Xerox facility and usage of electronic journals such as Science Direct,…

  11. Statistical Analysis of Protein Ensembles

    Science.gov (United States)

    Máté, Gabriell; Heermann, Dieter

    2014-04-01

    As 3D protein-configuration data pile up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics that studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the sets of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.
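
    A rough illustration of the barcode comparison, assuming the third-party ripser.py and persim packages and random point clouds in place of real protein configurations:

        import numpy as np
        from ripser import ripser      # assumed third-party: ripser.py
        import persim                  # assumed third-party: persim

        # Random point clouds stand in for two protein configurations
        rng = np.random.default_rng(2)
        conf_a = rng.normal(size=(100, 3))
        conf_b = rng.normal(size=(100, 3)) + 0.5

        # Barcodes (persistence diagrams); index 1 = H1 features (loops)
        dgm_a = ripser(conf_a)["dgms"][1]
        dgm_b = ripser(conf_b)["dgms"][1]

        # Statistical comparison via a distance between feature sets
        print("bottleneck distance:", persim.bottleneck(dgm_a, dgm_b))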

  12. [Electronic poison information management system].

    Science.gov (United States)

    Kabata, Piotr; Waldman, Wojciech; Kaletha, Krystian; Sein Anand, Jacek

    2013-01-01

    We describe the deployment of an electronic toxicological information database in the poison control center of the Pomeranian Center of Toxicology. The system was based on Google Apps technology, by Google Inc., using electronic, web-based forms and data tables. During the first 6 months after system deployment, we used it to archive 1471 poisoning cases, prepare monthly poisoning reports and facilitate statistical analysis of the data. Use of the electronic database made the Poison Center's work much easier.

  13. State analysis of BOP using statistical and heuristic methods

    International Nuclear Information System (INIS)

    Heo, Gyun Young; Chang, Soon Heung

    2003-01-01

    Under the deregulation environment, the performance enhancement of the balance of plant (BOP) in nuclear power plants is being highlighted. To analyze the performance level of the BOP, performance test procedures provided by an authorized institution such as ASME are used. However, plant investigation showed that the requirements of these procedures concerning the reliability and quantity of sensors are difficult to satisfy. As a solution, a state analysis method, an expanded concept of signal validation, was proposed on the basis of statistical and heuristic approaches. The authors recommended a statistical linear regression model, built by analyzing the correlations among BOP parameters, as the reference state analysis method. Its advantages are that its derivation is not heuristic, its model uncertainty can be calculated, and it is easy to apply to an actual plant. The error of the statistical linear regression model is below 3% under normal as well as abnormal system states. Additionally, a neural network model was recommended, since the statistical model cannot be applied to the validation of all of the sensors and is sensitive to outliers, i.e., signals located outside a statistical distribution. Because many sensors in the BOP need to be validated, wavelet analysis (WA) was applied as a pre-processor to reduce the input dimension and to enhance training accuracy. The outlier localization capability of WA enhanced the robustness of the neural network. The trained neural network restored the degraded signals to values within ±3% of the true signals.
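
    A toy sketch of the reference-model idea: regress one signal on correlated BOP parameters and flag departures beyond the 3% band. All signals below are synthetic, not plant data:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(3)
        # Synthetic correlated BOP signals, e.g. two flows predicting power
        X = rng.normal(100.0, 5.0, size=(500, 2))
        y = 0.6 * X[:, 0] + 0.4 * X[:, 1] + rng.normal(0.0, 0.5, 500)

        model = LinearRegression().fit(X, y)     # reference state model
        estimate = model.predict(X)

        # Flag readings departing more than 3% from the model estimate
        rel_err = np.abs(y - estimate) / np.abs(estimate)
        print("fraction flagged:", (rel_err > 0.03).mean())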

  14. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    Studying the content of images is an important topic through which reasonable and accurate analysis of images can be produced. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in daily life, and these crowded media have made image analysis a highlighted research area. In this paper, the implemented system passes through several steps to compute the statistical measures of standard deviation and mean values of both colour and grey images, and the last step of the proposed method compares the results obtained in different cases of the test phase. The statistical parameters are used to characterize the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean value with the implemented system. The major issue addressed in this work is brightness distribution, examined via statistical measures under different types of lighting.
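
    The core statistical measures are one-liners in NumPy; a sketch on a synthetic RGB image:

        import numpy as np

        # Synthetic RGB image standing in for a test image
        rng = np.random.default_rng(4)
        image = rng.integers(0, 256, size=(256, 256, 3)).astype(float)

        grey = image.mean(axis=2)                # simple grey conversion
        print(f"grey mean: {grey.mean():.1f}, grey std: {grey.std():.1f}")

        # Per-channel statistics and an inter-channel correlation
        for i, name in enumerate("RGB"):
            ch = image[..., i]
            print(f"{name}: mean {ch.mean():.1f}, std {ch.std():.1f}")
        r_g = np.corrcoef(image[..., 0].ravel(), image[..., 1].ravel())[0, 1]
        print(f"R-G correlation: {r_g:.3f}")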

  15. Fisher statistics for analysis of diffusion tensor directional information.

    Science.gov (United States)

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample, and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value, giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; in the hilus, however, significant directional differences were detected, demonstrating the utility of Fisher statistics for the statistical comparison of tissue structural orientation.
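
    A minimal sketch of the Fisher descriptive statistics (mean direction and concentration) for a set of unit vectors; the directions below are synthetic stand-ins for per-voxel principal diffusion directions:

        import numpy as np

        def fisher_stats(vectors):
            # Mean direction and concentration for unit vectors on the sphere
            v = np.array(vectors, dtype=float)
            v /= np.linalg.norm(v, axis=1, keepdims=True)
            resultant = v.sum(axis=0)
            R = np.linalg.norm(resultant)        # resultant length
            n = len(v)
            kappa = (n - 1) / (n - R)            # concentration estimate
            return resultant / R, kappa

        # Synthetic stand-ins for ROI direction vectors
        rng = np.random.default_rng(5)
        dirs = np.array([1.0, 0.0, 0.0]) + 0.1 * rng.normal(size=(20, 3))
        mean_dir, kappa = fisher_stats(dirs)
        print("mean direction:", np.round(mean_dir, 3), " kappa:", round(kappa, 1))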

  16. Statistical analysis of RHIC beam position monitors performance

    Science.gov (United States)

    Calaga, R.; Tomás, R.

    2004-04-01

    A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
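
    A toy illustration of the SVD side of such an analysis: a physical betatron mode spreads across many BPMs, while a spatial singular vector concentrated on a single channel flags a faulty monitor. All data below are synthetic, not RHIC data:

        import numpy as np

        rng = np.random.default_rng(6)
        turns, n_bpm = 1024, 160

        # Synthetic turn-by-turn orbit data: one coherent betatron mode
        phase = 2 * np.pi * 0.22 * np.arange(turns)
        data = np.outer(np.sin(phase), rng.normal(1.0, 0.2, size=n_bpm))
        data += 0.01 * rng.normal(size=(turns, n_bpm))   # sensor noise
        data[:, 42] = rng.normal(0.0, 5.0, size=turns)   # one faulty BPM

        # A singular vector concentrated on one channel marks a bad BPM
        _, _, vt = np.linalg.svd(data - data.mean(axis=0), full_matrices=False)
        for k in range(4):
            j = int(np.argmax(np.abs(vt[k])))
            if abs(vt[k, j]) > 0.9:
                print(f"mode {k}: suspect BPM index {j}")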

  17. Statistical analysis of RHIC beam position monitors performance

    Directory of Open Access Journals (Sweden)

    R. Calaga

    2004-04-01

    A detailed statistical analysis of beam position monitor (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.

  18. Statistical analysis of next generation sequencing data

    CERN Document Server

    Nettleton, Dan

    2014-01-01

    Next Generation Sequencing (NGS) is the latest high throughput technology to revolutionize genomic research. NGS generates massive genomic datasets that play a key role in the big data phenomenon that surrounds us today. To extract signals from high-dimensional NGS data and make valid statistical inferences and predictions, novel data analytic and statistical techniques are needed. This book contains 20 chapters written by prominent statisticians working with NGS data. The topics range from basic preprocessing and analysis with NGS data to more complex genomic applications such as copy number variation and isoform expression detection. Research statisticians who want to learn about this growing and exciting area will find this book useful. In addition, many chapters from this book could be included in graduate-level classes in statistical bioinformatics for training future biostatisticians who will be expected to deal with genomic data in basic biomedical research, genomic clinical trials and personalized med...

  19. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  20. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. This skill can be developed through various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and use formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample consisted of 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. Given a minimum passing value of 65 for the course competence standard, the students' mean values fall below the standard. The misconception results indicate which subtopics should be given attention; based on the assessment, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, 3) determining the concept to be used in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  1. Comparative analysis of positive and negative attitudes toward statistics

    Science.gov (United States)

    Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah

    2015-02-01

    Many statistics lecturers and statistics education researchers are interested to know their students' attitudes toward statistics during a statistics course. A positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content. Students who have negative attitudes toward statistics may feel discouraged, especially in group assignments, are at risk of failure, are often highly emotional, and struggle to move forward. Therefore, this study investigates students' attitudes towards learning statistics. Six latent constructs were used to measure students' attitudes toward learning statistics: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validated Survey of Attitudes Towards Statistics (SATS) instrument. The study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP); the respondents were students taking the applied statistics course from different faculties. From the analysis, the questionnaire is found to be acceptable, and the relationships among the constructs have been proposed and investigated. The students show full effort to master the statistics course, find the course enjoyable, are confident in their intellectual capacity, and hold more positive than negative attitudes towards statistics learning. In conclusion, positive attitudes were mostly exhibited in the affect, cognitive competence, value, interest and effort constructs, while negative attitudes were mostly associated with the difficulty construct.

  2. Statistical analysis and digital processing of the Mössbauer spectra

    International Nuclear Information System (INIS)

    Prochazka, Roman; Tucek, Jiri; Mashlan, Miroslav; Pechousek, Jiri; Tucek, Pavel; Marek, Jaroslav

    2010-01-01

    This work is focused on the use of statistical methods and the development of filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in measured spectra are used in many scientific areas. The use of a pure statistical approach in the filtration of accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be considered a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of the counting of non-resonant photons, electronic noise (from the γ-ray detection and discrimination units), and the quality of the velocity system, which can be characterized by velocity nonlinearities. The possibility of a noise-reducing process using a newly designed statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to identify the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the results of a correlation coefficient test. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated. The correlation coefficient test is based on a comparison of the theoretical and experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed with many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio, and the applicability of the method is subject to binding conditions.
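
    A loose sketch of the periodogram-plus-correlation idea on a synthetic spectrum; the doublet shape, threshold and counts are illustrative, not the authors' actual procedure:

        import numpy as np
        from scipy.signal import periodogram
        from scipy.stats import spearmanr

        rng = np.random.default_rng(7)
        n = 1024
        channel = np.arange(n)

        # Synthetic "spectrum": absorption doublet plus Poisson counting noise
        baseline = 1e4 * np.ones(n)
        for centre in (450, 574):
            baseline -= 1500.0 / (1 + ((channel - centre) / 8.0) ** 2)
        counts = rng.poisson(baseline).astype(float)

        # Keep only spectral components that stand out of the periodogram
        spec = np.fft.rfft(counts)
        _, power = periodogram(counts)
        keep = power > 10 * np.median(power)   # crude significance threshold
        keep[0] = True                         # always retain the mean level
        filtered = np.fft.irfft(np.where(keep, spec, 0), n=n)

        # Agreement with the raw data, cf. the Spearman-based test above
        rho, _ = spearmanr(counts, filtered)
        print(f"Spearman rho, raw vs filtered: {rho:.3f}")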

  3. Statistical analysis and digital processing of the Mössbauer spectra

    Science.gov (United States)

    Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri

    2010-02-01

    This work is focused on the use of statistical methods and the development of filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in measured spectra are used in many scientific areas. The use of a pure statistical approach in the filtration of accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be considered a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of the counting of non-resonant photons, electronic noise (from the γ-ray detection and discrimination units), and the quality of the velocity system, which can be characterized by velocity nonlinearities. The possibility of a noise-reducing process using a newly designed statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to identify the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the results of a correlation coefficient test. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated. The correlation coefficient test is based on a comparison of the theoretical and experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed with many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio, and the applicability of the method is subject to binding conditions.

  4. Quantum chaos in multicharged ions and statistical approach to the calculation of electron-ion resonant radiative recombination

    International Nuclear Information System (INIS)

    Gribakin, G.F.; Gribakina, A.A.; Flambaum, V.V.

    1999-01-01

    We show that the spectrum and eigenstates of open-shell multicharged atomic ions near the ionisation threshold are chaotic, as a result of extremely high level densities of multiply excited electron states (10^3 eV^-1 in Au^24+) and strong configuration mixing. This complexity enables one to use statistical methods to analyse the system. We examine the dependence of the orbital occupation numbers and single-particle energies on the excitation energy of the system, and show that the occupation numbers are described by the Fermi-Dirac distribution, and the temperature and chemical potential can be introduced. The Fermi-Dirac temperature is close to the temperature defined through the canonical distribution. Using a statistical approach we estimate the contribution of multielectron resonant states to the radiative capture of low-energy electrons by Au^25+ and demonstrate that this mechanism fully accounts for the 10^2-times enhancement of the recombination over the direct radiative recombination, in agreement with recent experimental observations.
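
    A minimal sketch of the occupation-number picture: given a set of levels, an effective temperature and a fixed electron number, the chemical potential follows from particle-number conservation. The levels and values below are hypothetical, not those of Au^24+:

        import numpy as np
        from scipy.optimize import brentq

        def fermi_dirac(eps, mu, T):
            # Occupation numbers; energies and temperature in the same units
            return 1.0 / (np.exp((eps - mu) / T) + 1.0)

        # Hypothetical single-particle levels (eV) and a fixed electron count
        levels = np.linspace(0.0, 10.0, 40)
        n_electrons = 12
        T = 1.5                                  # effective temperature (eV)

        # Chemical potential fixed by particle-number conservation
        mu = brentq(lambda m: fermi_dirac(levels, m, T).sum() - n_electrons,
                    -50.0, 50.0)
        print(f"chemical potential: {mu:.3f} eV")
        print("lowest occupations:", np.round(fermi_dirac(levels, mu, T)[:6], 3))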

  5. Vapor Pressure Data Analysis and Statistics

    Science.gov (United States)

    2016-12-01

    The three fitted vapor pressure equation constants are typically near 8, 2000, and 200, respectively; the A (or a) value is directly related to vapor pressure and will be greater for high vapor pressure materials. In the least-squares fit, n is the number of data points, Y_i is the natural logarithm of the i-th experimental vapor pressure value, and X_i is the corresponding independent variable. (ECBC-TR-1422, Ann Brozena, Research and Technology Directorate.)

  6. Statistical analysis of planktic foraminifera of the surface Continental ...

    African Journals Online (AJOL)

    Planktic foraminiferal assemblage recorded from selected samples obtained from shallow continental shelf sediments off southwestern Nigeria were subjected to statistical analysis. The Principal Component Analysis (PCA) was used to determine variants of planktic parameters. Values obtained for these parameters were ...

  7. Imaging mass spectrometry statistical analysis.

    Science.gov (United States)

    Jones, Emrys A; Deininger, Sören-Oliver; Hogendoorn, Pancras C W; Deelder, André M; McDonnell, Liam A

    2012-08-30

    Imaging mass spectrometry is increasingly used to identify new candidate biomarkers. This clinical application of imaging mass spectrometry is highly multidisciplinary: expertise in mass spectrometry is necessary to acquire high quality data, histology is required to accurately label the origin of each pixel's mass spectrum, disease biology is necessary to understand the potential meaning of the imaging mass spectrometry results, and statistics is needed to assess the confidence of any findings. Imaging mass spectrometry data analysis is further complicated by the unique nature of the data (within the mass spectrometry field); several of the assumptions implicit in the analysis of LC-MS/profiling datasets are not applicable to imaging. The very large size of imaging datasets and the sparse reporting of many data analysis routines, combined with inadequate training and a lack of accessible reviews, have exacerbated this problem. In this paper we provide an accessible review of the nature of imaging data and the different strategies by which the data may be analyzed. Particular attention is paid to the assumptions of the data analysis routines to ensure that the reader is apprised of their correct usage in imaging mass spectrometry research.

  8. NETWORKS OF NANOPARTICLES IN ORGANIC – INORGANIC COMPOSITES: ALGORITHMIC EXTRACTION AND STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Ralf Thiedmann

    2012-03-01

    The rising global demand for energy and the limited resources in fossil fuels require new technologies in renewable energies such as solar cells. Silicon solar cells offer good efficiency but suffer from high production costs. A promising alternative is polymer solar cells, due to potentially low production costs and high flexibility of the panels. In this paper, the nanostructure of organic-inorganic composites is investigated, which can be used as photoactive layers in hybrid-polymer solar cells. These materials consist of a polymeric (OC1C10-PPV) phase with CdSe nanoparticles embedded therein. On the basis of 3D image data with high spatial resolution, gained by electron tomography, an algorithm is developed to automatically extract the CdSe nanoparticles from grayscale images, where they are assumed to be spheres. The algorithm is based on a modified version of the Hough transform, where a watershed algorithm is used to separate the image data into basins such that each basin contains exactly one nanoparticle. After their extraction, neighboring nanoparticles are connected to form a 3D network that is related to the transport of electrons in polymer solar cells. A detailed statistical analysis of the CdSe network morphology is performed, which allows deeper insight into the hopping percolation pathways of electrons.
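
    A rough sketch of the watershed-based separation step, assuming the third-party scikit-image package and a toy volume with two touching synthetic spheres in place of real tomography data:

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.segmentation import watershed
        from skimage.feature import peak_local_max

        # Toy 3D volume: two overlapping bright spheres stand in for particles
        vol = np.zeros((40, 40, 40))
        zz, yy, xx = np.indices(vol.shape)
        for cz, cy, cx in [(14, 14, 14), (24, 14, 14)]:
            vol[(zz - cz)**2 + (yy - cy)**2 + (xx - cx)**2 <= 36] = 1.0

        binary = vol > 0.5                       # grayscale thresholding step
        dist = ndi.distance_transform_edt(binary)

        # Watershed on the distance map splits the touching spheres apart
        blobs, _ = ndi.label(binary)
        peaks = peak_local_max(dist, min_distance=4, labels=blobs)
        markers = np.zeros(vol.shape, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
        labels = watershed(-dist, markers, mask=binary)
        print("particles found:", labels.max())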

  9. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  10. On the correct implementation of Fermi-Dirac statistics and electron trapping in nonlinear electrostatic plane wave propagation in collisionless plasmas

    Science.gov (United States)

    Schamel, Hans; Eliasson, Bengt

    2016-05-01

    Quantum statistics and electron trapping have a decisive influence on the propagation characteristics of coherent stationary electrostatic waves. The description of these strictly nonlinear structures, which are of electron hole type and violate linear Vlasov theory due to the particle trapping at any excitation amplitude, is obtained by a correct reduction of the three-dimensional Fermi-Dirac distribution function to one dimension and by a proper incorporation of trapping. For small but finite amplitudes, the holes become of cnoidal wave type and the electron density is shown to be described by a ϕ(x)^{1/2} rather than a ϕ(x) expansion, where ϕ(x) is the electrostatic potential. The general coefficients are presented for a degenerate plasma as well as the quantum statistical analogue to these steady-state coherent structures, including the shape of ϕ(x) and the nonlinear dispersion relation, which describes their phase velocity.

  11. A scanning electron microscope study and statistical analysis of adipocyte morphology in lipofilling: comparing the effects of harvesting and purification procedures with 2 different techniques.

    Science.gov (United States)

    Rubino, Corrado; Mazzarello, Vittorio; Faenza, Mario; Montella, Andrea; Santanelli, Fabio; Farace, Francesco

    2015-06-01

    The aim of this study was to evaluate the effects of 2 techniques of fat harvesting and of fat purification on adipocyte morphology in lipofilling, considering that the number of viable healthy adipocytes is important for fat survival in the recipient areas of lipofilling. Fat harvesting was performed in 10 female patients from the flanks, on one side with a 2-mm Coleman cannula and on the other side with a 3-mm Mercedes cannula. Thirty milliliters of fat tissue from each side was collected and divided into three 10 mL syringes: A, B, and C. The fat inside syringe A was left untreated, the fat in syringe B underwent simple sedimentation, and the fat inside syringe C underwent centrifugation at 3000 rpm for 3 minutes. Each fat graft specimen was processed for examination under a low-vacuum scanning electron microscope. The diameter (μm) and number of adipocytes per square millimeter and the number of altered adipocytes per square millimeter were evaluated. Untreated specimens harvested with the 2 different techniques were compared first; then sedimented versus centrifuged specimens harvested with the same technique were compared. Statistical analysis was performed using the Wilcoxon signed rank test. The number of adipocytes per square millimeter was statistically higher in specimens harvested with the 3-mm Mercedes cannula (P = 0.0310). The number of altered cells was statistically higher in centrifuged specimens than in sedimented ones using both methods of fat harvesting (P = 0.0080 with a 2-mm Coleman cannula and P = 0.0050 with a 3-mm Mercedes cannula). Alterations in adipocyte morphology consisted of wrinkling of the membrane, opening of pores with leakage of oily material, reduction of cellular diameter, and total collapse of the cellular membrane. Fat harvesting with a 3-mm cannula yields a higher number of adipocytes, and centrifugation of the harvested fat results in a higher number of morphologically altered cells than sedimentation.
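
    A sketch of the paired non-parametric comparison used here, with hypothetical altered-cell counts standing in for the study's measurements:

        from scipy.stats import wilcoxon

        # Hypothetical altered-cell counts per mm^2 in paired specimens
        sedimented  = [12, 9, 15, 11, 8, 14, 10, 13, 9, 12]
        centrifuged = [18, 15, 22, 17, 14, 21, 16, 20, 15, 19]

        # Paired non-parametric comparison, as in the study design
        stat, p = wilcoxon(sedimented, centrifuged)
        print(f"Wilcoxon statistic = {stat}, p = {p:.4f}")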

  12. Statistical analysis of JET disruptions

    International Nuclear Information System (INIS)

    Tanga, A.; Johnson, M.F.

    1991-07-01

    In the operation of JET, and of any tokamak, many discharges are terminated by a major disruption. The disruptive termination of a discharge is usually an unwanted event which may cause damage to the structure of the vessel. In a reactor, disruptions are potentially a very serious problem, hence the importance of studying them and devising methods to avoid them. Statistical information has been collected about the disruptions which have occurred at JET over a long span of operations. The analysis is focused on the operational aspects of the disruptions rather than on the underlying physics. (Author)

  13. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic

  14. Precession technique and electron diffractometry as new tools for crystal structure analysis and chemical bonding determination

    International Nuclear Information System (INIS)

    Avilov, A.; Kuligin, K.; Nicolopoulos, S.; Nickolskiy, M.; Boulahya, K.; Portillo, J.; Lepeshov, G.; Sobolev, B.; Collette, J.P.; Martin, N.; Robins, A.C.; Fischione, P.

    2007-01-01

    We have developed a new fast electron diffractometer working with high dynamic range and linearity for crystal structure determinations. Electron diffraction (ED) patterns can be scanned serially in front of a Faraday cage detector; the total measurement time for several hundred ED reflections can be tens of seconds having high statistical accuracy for all measured intensities (1-2%). This new tool can be installed to any type of TEM without any column modification and is linked to a specially developed electron beam precession 'Spinning Star' system. Precession of the electron beam (Vincent-Midgley technique) reduces dynamical effects allowing also use of accurate intensities for crystal structure analysis. We describe the technical characteristics of this new tool together with the first experimental results. Accurate measurement of electron diffraction intensities by electron diffractometer opens new possibilities not only for revealing unknown structures, but also for electrostatic potential determination and chemical bonding investigation. As an example, we present detailed atomic bonding information of CaF 2 as revealed for the first time by precise electron diffractometry

  15. Critical analysis of industrial electron accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Korenev, S. E-mail: sergey_korenev@steris.com

    2004-10-01

    The critical analysis of electron linacs for industrial applications (degradation of PTFE, curing of composites, modification of materials, sterilization and others) is considered in this report. Main physical requirements for industrial electron accelerators consist in the variations of beam parameters, such as kinetic energy and beam power. Questions for regulation of these beam parameters are considered. The level of absorbed dose in the irradiated product and throughput determines the main parameters of electron accelerator. The type of ideal electron linac for industrial applications is discussed.

  16. Critical analysis of industrial electron accelerators

    Science.gov (United States)

    Korenev, S.

    2004-09-01

    The critical analysis of electron linacs for industrial applications (degradation of PTFE, curing of composites, modification of materials, sterilization and others) is considered in this report. Main physical requirements for industrial electron accelerators consist in the variations of beam parameters, such as kinetic energy and beam power. Questions for regulation of these beam parameters are considered. The level of absorbed dose in the irradiated product and throughput determines the main parameters of electron accelerator. The type of ideal electron linac for industrial applications is discussed.

  17. Critical analysis of industrial electron accelerators

    International Nuclear Information System (INIS)

    Korenev, S.

    2004-01-01

    The critical analysis of electron linacs for industrial applications (degradation of PTFE, curing of composites, modification of materials, sterilization and others) is considered in this report. Main physical requirements for industrial electron accelerators consist in the variations of beam parameters, such as kinetic energy and beam power. Questions for regulation of these beam parameters are considered. The level of absorbed dose in the irradiated product and throughput determines the main parameters of electron accelerator. The type of ideal electron linac for industrial applications is discussed.

  18. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Patel, Bimal; Heising, C.D.

    1997-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar and R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are within control specification limits and four parameters are not in a state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (Author)
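
    A minimal sketch of Shewhart X-bar and R control limits on synthetic subgrouped data; the monitored parameter and values are hypothetical, not plant measurements:

        import numpy as np

        # Synthetic monitored parameter: 25 subgroups of 4 readings each
        rng = np.random.default_rng(8)
        subgroups = rng.normal(100.0, 2.0, size=(25, 4))

        xbar = subgroups.mean(axis=1)
        ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
        xbar_bar, r_bar = xbar.mean(), ranges.mean()

        # Standard Shewhart constants for subgroup size n = 4
        A2, D3, D4 = 0.729, 0.0, 2.282
        lcl, ucl = xbar_bar - A2 * r_bar, xbar_bar + A2 * r_bar
        print(f"X-bar limits: {lcl:.2f} .. {ucl:.2f}")
        print(f"R limits:     {D3 * r_bar:.2f} .. {D4 * r_bar:.2f}")
        print("out-of-control subgroups:", np.where((xbar < lcl) | (xbar > ucl))[0])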

  19. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, a B/S-structure software design method was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a...

  20. Statistical trend analysis methods for temporal phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Lehtinen, E.; Pulkkinen, U. [VTT Automation, (Finland); Poern, K. [Poern Consulting, Nykoeping (Sweden)

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods. 14 refs, 10 figs.
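
    A small self-contained implementation of one of the non-parametric tests listed above, the Cox-Stuart trend test (scipy >= 1.7 assumed for binomtest; the series below is synthetic):

        import numpy as np
        from scipy.stats import binomtest

        def cox_stuart(x):
            # Cox-Stuart test for a monotonic trend in a sequence
            x = np.asarray(x, dtype=float)
            n = len(x)
            c = (n + 1) // 2                 # middle value dropped if n is odd
            d = x[c:] - x[:n - c]            # pair first half with second half
            d = d[d != 0]                    # discard ties
            # Under H0 (no trend) the signs are Binomial(len(d), 1/2)
            return binomtest(int((d > 0).sum()), len(d), 0.5).pvalue

        rng = np.random.default_rng(9)
        series = 0.3 * np.arange(20) + rng.normal(0.0, 1.0, 20)  # drifting rate
        print("Cox-Stuart p-value:", cox_stuart(series))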

  1. Statistical trend analysis methods for temporal phenomena

    International Nuclear Information System (INIS)

    Lehtinen, E.; Pulkkinen, U.; Poern, K.

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods

  2. Statistical Methods for Single-Particle Electron Cryomicroscopy

    DEFF Research Database (Denmark)

    Jensen, Katrine Hommelhoff

    Electron cryomicroscopy (cryo-EM) is a form of transmission electron microscopy, aimed at reconstructing the 3D structure of a macromolecular complex from a large set of 2D projection images, as they exhibit a very low signal-to-noise ratio (SNR). In the single-particle reconstruction (SPR) probl...

  3. StOCNET : Software for the statistical analysis of social networks

    NARCIS (Netherlands)

    Huisman, M.; van Duijn, M.A.J.

    2003-01-01

    StOCNET3 is an open software system in a Windows environment for the advanced statistical analysis of social networks. It provides a platform to make a number of recently developed and therefore not (yet) standard statistical methods available to a wider audience. A flexible user interface utilizing

  4. AutoBayes: A System for Generating Data Analysis Programs from Statistical Models

    OpenAIRE

    Fischer, Bernd; Schumann, Johann

    2003-01-01

    Data analysis is an important scientific task which is required whenever information needs to be extracted from raw data. Statistical approaches to data analysis, which use methods from probability theory and numerical analysis, are well-founded but difficult to implement: the development of a statistical data analysis program for any given application is time-consuming and requires substantial knowledge and experience in several areas. In this paper, we describe AutoBayes, a program synthesis...

  5. Process variation in electron beam sterilization

    International Nuclear Information System (INIS)

    Beck, Jeffrey A.

    2012-01-01

    The qualification and control of electron beam sterilization can be improved by the application of proven statistical analysis techniques such as Analysis of Variance (ANOVA) and statistical tolerance limits. These statistical techniques can be useful tools in: •locating and quantifying the minimum and maximum absorbed dose in a product; •estimating the expected process maximum dose, given a minimum sterilizing dose; •setting a process minimum dose target, based on an allowance for random measurement and process variation; •determining the dose relationship between a reference dosimeter and the process minimum and maximum doses. This study investigates and demonstrates the application of these tools in qualifying electron beam sterilization and compares the conclusions obtained with those obtained using the practices recommended in the Guide for Process Control in Radiation Sterilization. The study supports the following conclusions for electron beam processes: 1. ANOVA is a more effective tool for evaluating the equivalency of absorbed doses than the methods suggested in the Guide. 2. Process limits computed using statistical tolerance limits reflect actual process variability more accurately than the AAMI method, which applies +/-2 sample standard deviations (s) regardless of sample size. 3. The use of reference dose ratios lends itself to qualification using statistical tolerance limits. The current AAMI-recommended approach may result in an overly optimistic estimate of the reference dose adjustment factor, as it is based on the application of +/-2(s) tolerances regardless of sample size.
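
    A sketch contrasting a proper two-sided normal tolerance interval (Howe's approximation) with the flat +/-2s rule, on synthetic dose data; the dose values below are hypothetical:

        import numpy as np
        from scipy import stats

        def k_two_sided(n, coverage=0.95, confidence=0.95):
            # Howe's approximate two-sided normal tolerance factor
            z = stats.norm.ppf(0.5 + coverage / 2.0)
            chi2 = stats.chi2.ppf(1.0 - confidence, n - 1)
            return z * np.sqrt((n - 1) * (1 + 1.0 / n) / chi2)

        # Synthetic dose-map readings (kGy) at the minimum-dose location
        rng = np.random.default_rng(10)
        doses = rng.normal(27.0, 0.8, size=10)
        m, s = doses.mean(), doses.std(ddof=1)

        k = k_two_sided(len(doses))
        print(f"95/95 tolerance interval: {m - k*s:.2f} .. {m + k*s:.2f} kGy")
        print(f"naive +/-2s interval:     {m - 2*s:.2f} .. {m + 2*s:.2f} kGy")

    For small samples the tolerance factor k is well above 2 (about 3.4 at n = 10), which is why the flat +/-2s rule understates process variability.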

  6. Network similarity and statistical analysis of earthquake seismic data

    OpenAIRE

    Deyasi, Krishanu; Chakraborty, Abhijit; Banerjee, Anirban

    2016-01-01

    We study the structural similarity of earthquake networks constructed from seismic catalogs of different geographical regions. A hierarchical clustering of underlying undirected earthquake networks is shown using Jensen-Shannon divergence in graph spectra. The directed nature of links indicates that each earthquake network is strongly connected, which motivates us to study the directed version statistically. Our statistical analysis of each earthquake region identifies the hub regions. We cal...

  7. Statistical analysis and interpolation of compositional data in materials science.

    Science.gov (United States)

    Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M

    2015-02-09

    Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
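
    A minimal sketch of the kind of specialized tool this record refers to: the centered log-ratio (clr) transform maps simplex-constrained compositions into ordinary Euclidean space, where means and interpolation are well defined. The ternary compositions below are invented; the paper's own data and pipeline are not reproduced here.

```python
# Centered log-ratio transform, a standard tool of compositional data analysis.
import numpy as np

def clr(x):
    """Centered log-ratio transform of compositions (rows sum to 1)."""
    g = np.exp(np.log(x).mean(axis=1, keepdims=True))  # geometric mean per row
    return np.log(x / g)

def clr_inv(z):
    """Inverse clr: back to the simplex via the softmax closure."""
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical atomic concentrations of a ternary system (each row sums to 1)
comps = np.array([[0.60, 0.30, 0.10],
                  [0.50, 0.35, 0.15]])

# Averaging in clr space and closing back yields a valid composition;
# a naive Euclidean average of raw compositions ignores simplex geometry.
print(clr_inv(clr(comps).mean(axis=0, keepdims=True)))
```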

  8. An Application of Multivariate Statistical Analysis for Query-Driven Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Gosink, Luke J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Garth, Christoph [Univ. of California, Davis, CA (United States); Anderson, John C. [Univ. of California, Davis, CA (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Joy, Kenneth I. [Univ. of California, Davis, CA (United States)

    2011-03-01

    Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.
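
    The paper's framework itself is not reproduced here, but the following sketch illustrates the underlying idea of non-parametric distribution estimation for a query's solution space: compare the kernel density estimate of a variable inside the query's solution set with its estimate over the whole dataset. Variable names and data are hypothetical.

```python
# Kernel density estimates inside versus outside a query's solution set.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
temperature = rng.normal(300.0, 25.0, 10_000)                       # hypothetical field
pressure = 1.0 + 0.004 * (temperature - 300.0) + rng.normal(0, 0.05, 10_000)

selected = temperature[pressure > 1.2]       # records satisfying the query

kde_all = gaussian_kde(temperature)
kde_sel = gaussian_kde(selected)

grid = np.linspace(temperature.min(), temperature.max(), 200)
# Where the ratio departs strongly from 1, the variable behaves differently
# inside the query's solution space than in the data at large.
ratio = kde_sel(grid) / kde_all(grid)
print(grid[np.argmax(ratio)])
```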

  9. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu

    2014-01-15

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: • A new method of clinker characterization • Combination of electron probe technique with cluster analysis • Simultaneous assessment of phase abundance, composition and bulk chemistry • Experimental validation performed on industrial clinkers.

  10. Simultaneous assessment of phase chemistry, phase abundance and bulk chemistry with statistical electron probe micro-analyses: Application to cement clinkers

    International Nuclear Information System (INIS)

    Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef

    2014-01-01

    According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: • A new method of clinker characterization • Combination of electron probe technique with cluster analysis • Simultaneous assessment of phase abundance, composition and bulk chemistry • Experimental validation performed on industrial clinkers
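
    A minimal sketch, on simulated spot analyses, of the statistical core of this method: clustering electron-microprobe measurements so that cluster centroids estimate phase chemistry, cluster proportions estimate phase abundance, and the grand mean recovers bulk chemistry. The oxide fractions and the use of plain k-means (rather than the authors' specific clustering procedure) are assumptions for illustration.

```python
# Clustering microprobe spot analyses: centroids ~ phase chemistry,
# cluster sizes ~ phase abundance, grand mean ~ bulk chemistry.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical grid of spot analyses: oxide weight fractions (CaO, SiO2, Al2O3)
alite  = rng.normal([0.71, 0.25, 0.01], 0.01, (300, 3))   # C3S-like spots
belite = rng.normal([0.63, 0.32, 0.02], 0.01, (150, 3))   # C2S-like spots
spots = np.vstack([alite, belite])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(spots)

phase_chemistry = km.cluster_centers_                     # mean composition per phase
phase_abundance = np.bincount(km.labels_) / len(spots)    # fraction of spots per phase
bulk_chemistry = spots.mean(axis=0)                       # single-experiment bulk estimate
print(phase_abundance, bulk_chemistry)
```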

  11. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  12. Explorations in Statistics: The Analysis of Ratios and Normalized Data

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of "Explorations in Statistics" explores the analysis of ratios and normalized--or standardized--data. As researchers, we compute a ratio--a numerator divided by a denominator--to compute a…

  13. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    Science.gov (United States)

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.
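
    The statistics-based half of such an approach can be sketched as fitting a Gaussian mixture to the scattering cross-sections of atomic columns, with each component corresponding to one atom count; the hybrid method of this record additionally constrains the components with image simulations. The data, the BIC model-selection criterion and the library choice below are illustrative assumptions, not the authors' implementation.

```python
# Gaussian-mixture atom counting on simulated column cross-sections.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Hypothetical cross-sections from columns 3, 4 and 5 atoms thick
cs = np.concatenate([rng.normal(3.0, 0.15, 120),
                     rng.normal(4.0, 0.15, 200),
                     rng.normal(5.0, 0.15, 80)]).reshape(-1, 1)

# Select the number of components (distinct thicknesses) by an information criterion
models = [GaussianMixture(k, random_state=0).fit(cs) for k in range(1, 8)]
best = min(models, key=lambda m: m.bic(cs))

counts = best.predict(cs)              # atom-count label for every column
print(best.n_components, np.sort(best.means_.ravel()))
```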

  14. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    Science.gov (United States)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package founded on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). EFEA was validated on a floor-equipped composite cylinder by comparing its vibroacoustic response predictions with Statistical Energy Analysis (SEA) and experimental results. The SEA predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  15. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is

  16. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical analysis and probabilistic risk assessment (PRA) are discussed in terms of the categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)

  17. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
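
    A short sketch of the kriging-style prediction this record describes, here via a Gaussian process with an RBF kernel as a stand-in for an ordinary kriging implementation; the coordinates and thrips counts are invented.

```python
# Spatial prediction with a Gaussian process (kriging-style interpolation).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(10)
xy = rng.uniform(0, 100, (40, 2))                     # sample locations (m)
counts = 5 + 0.05 * xy[:, 0] + rng.normal(0, 1, 40)   # spatially trending counts

gp = GaussianProcessRegressor(kernel=RBF(20.0) + WhiteKernel(1.0),
                              normalize_y=True).fit(xy, counts)

# Prediction with uncertainty at an unsampled location honours the spatial
# correlation that independent-count analyses ignore.
mean, sd = gp.predict(np.array([[50.0, 50.0]]), return_std=True)
print(mean[0], sd[0])
```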

  18. Statistical wind analysis for near-space applications

    Science.gov (United States)

    Roney, Jason A.

    2007-09-01

    Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data, such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
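
    The Weibull calculation described here can be sketched as follows, with synthetic wind speeds standing in for the observational data; the fitted quantiles play the role of the 50%, 95% and 99% design winds.

```python
# Weibull fit and design-wind quantiles on synthetic wind speeds.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
wind = stats.weibull_min.rvs(c=2.0, scale=12.0, size=2000, random_state=rng)  # m/s

# Fit with the location fixed at zero, as is usual for wind speeds
c, loc, scale = stats.weibull_min.fit(wind, floc=0)

for q in (0.50, 0.95, 0.99):
    v = stats.weibull_min.ppf(q, c, loc=loc, scale=scale)
    print(f"{q:.0%} wind: {v:5.1f} m/s")

# Mean + 1 standard deviation typically sits well below the 99% quantile,
# which is the record's point about underestimating operational conditions.
print(f"mean+sd: {wind.mean() + wind.std():5.1f} m/s")
```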

  19. Analysis of photon statistics with Silicon Photomultiplier

    International Nuclear Information System (INIS)

    D'Ascenzo, N.; Saveliev, V.; Wang, L.; Xie, Q.

    2015-01-01

    The Silicon Photomultiplier (SiPM) is a novel silicon-based photodetector, which represents the modern perspective of low photon flux detection. The aim of this paper is to provide an introduction to the statistical analysis methods needed to understand and estimate in a quantitative way the correct features and description of the response of the SiPM to a coherent source of light

  20. Development of statistical analysis code for meteorological data (W-View)

    International Nuclear Information System (INIS)

    Tachibana, Haruo; Sekita, Tsutomu; Yamaguchi, Takenori

    2003-03-01

    A computer code (W-View: Weather View) was developed to analyze meteorological data statistically based on 'the guideline of meteorological statistics for the safety analysis of nuclear power reactor' (Nuclear Safety Commission on January 28, 1982; revised on March 29, 2001). The code gives statistical meteorological data for assessing the public dose in cases of normal operation and severe accident, as required for a nuclear reactor operating license. The code was rewritten from the original version, which ran on a large office computer, to enable a personal computer user to analyze meteorological data simply and conveniently and to produce the statistical tables and figures of meteorology. (author)

  1. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Heising, Carolyn D.

    1998-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar and R-charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are within control specification limits and four parameters are not in a state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (author)
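
    A minimal sketch of the Shewhart X-bar/R computation underlying such monitoring; the subgroup size, the tabulated constants for n = 5, and the simulated RCP parameter readings are illustrative assumptions.

```python
# Shewhart X-bar and R control limits for a monitored process parameter.
import numpy as np

A2, D3, D4 = 0.577, 0.0, 2.114           # standard factors for subgroups of size 5

rng = np.random.default_rng(4)
subgroups = rng.normal(100.0, 2.0, (20, 5))   # 20 subgroups of 5 readings

xbar = subgroups.mean(axis=1)
r = subgroups.max(axis=1) - subgroups.min(axis=1)
xbarbar, rbar = xbar.mean(), r.mean()

ucl_x, lcl_x = xbarbar + A2 * rbar, xbarbar - A2 * rbar
ucl_r, lcl_r = D4 * rbar, D3 * rbar

# Points outside the limits signal an out-of-control condition long before
# the parameter would reach a control-room alarm setpoint.
print(np.where((xbar > ucl_x) | (xbar < lcl_x))[0])
print(np.where((r > ucl_r) | (r < lcl_r))[0])
```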

  2. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    Science.gov (United States)

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…

  3. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independen...

  4. Statistical analysis of thermal conductivity of nanofluid containing ...

    Indian Academy of Sciences (India)

    Thermal conductivity measurements of nanofluids were analysed via two-factor completely randomized design and comparison of data means is carried out with Duncan's multiple-range test. Statistical analysis of experimental data show that temperature and weight fraction have a reasonable impact on the thermal ...

  5. Variance analysis of x-ray CT sinograms in the presence of electronic noise background.

    Science.gov (United States)

    Ma, Jianhua; Liang, Zhengrong; Fan, Yi; Liu, Yan; Huang, Jing; Chen, Wufan; Lu, Hongbing

    2012-07-01

    Low-dose x-ray computed tomography (CT) is clinically desired. Accurate noise modeling is a fundamental issue for low-dose CT image reconstruction via statistics-based sinogram restoration or statistical iterative image reconstruction. In this paper, the authors analyzed the statistical moments of low-dose CT data in the presence of electronic noise background. The authors first studied the statistical moment properties of detected signals in CT transmission domain, where the noise of detected signals is considered as quanta fluctuation upon electronic noise background. Then the authors derived, via the Taylor expansion, a new formula for the mean-variance relationship of the detected signals in CT sinogram domain, wherein the image formation becomes a linear operation between the sinogram data and the unknown image, rather than a nonlinear operation in the CT transmission domain. To get insight into the derived new formula by experiments, an anthropomorphic torso phantom was scanned repeatedly by a commercial CT scanner at five different mAs levels from 100 down to 17. The results demonstrated that the electronic noise background is significant when low-mAs (or low-dose) scan is performed. The influence of the electronic noise background should be considered in low-dose CT imaging.
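
    An illustrative simulation (not the authors' derivation) of the noise model analysed in this record: Poisson quanta on a Gaussian electronic background, log-transformed into the sinogram domain, with the empirical variance compared against a first-order Taylor prediction. All numbers are hypothetical.

```python
# Poisson quanta + Gaussian electronic noise, log-transformed to the sinogram.
import numpy as np

rng = np.random.default_rng(5)
N0 = 2.0e4                      # incident photon number (air scan), hypothetical
Nbar = 50.0                     # mean detected quanta (low-mAs regime)
sigma_e = 10.0                  # electronic noise standard deviation

counts = rng.poisson(Nbar, 200_000) + rng.normal(0.0, sigma_e, 200_000)
counts = np.clip(counts, 1.0, None)        # guard the logarithm

sino = np.log(N0 / counts)                 # line integrals (sinogram domain)

# A first-order Taylor expansion of log(N0/N) about Nbar predicts
# var(p) ~ (Nbar + sigma_e**2) / Nbar**2, i.e. electronic noise inflates
# the variance noticeably once sigma_e**2 becomes comparable to Nbar.
print(sino.var(), (Nbar + sigma_e**2) / Nbar**2)
```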

  6. Longitudinal data analysis a handbook of modern statistical methods

    CERN Document Server

    Fitzmaurice, Garrett; Verbeke, Geert; Molenberghs, Geert

    2008-01-01

    Although many books currently available describe statistical models and methods for analyzing longitudinal data, they do not highlight connections between various research threads in the statistical literature. Responding to this void, Longitudinal Data Analysis provides a clear, comprehensive, and unified overview of state-of-the-art theory and applications. It also focuses on the assorted challenges that arise in analyzing longitudinal data. After discussing historical aspects, leading researchers explore four broad themes: parametric modeling, nonparametric and semiparametric methods, joint

  7. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  8. Bayesian Sensitivity Analysis of Statistical Models with Missing Data.

    Science.gov (United States)

    Zhu, Hongtu; Ibrahim, Joseph G; Tang, Niansheng

    2014-04-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures.

  9. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering.  Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerfu...

  10. Quantitative analysis and IBM SPSS statistics a guide for business and finance

    CERN Document Server

    Aljandali, Abdulkader

    2016-01-01

    This guide is for practicing statisticians and data scientists who use IBM SPSS for statistical analysis of big data in business and finance. This is the first of a two-part guide to SPSS for Windows, introducing data entry into SPSS, along with elementary statistical and graphical methods for summarizing and presenting data. Part I also covers the rudiments of hypothesis testing and business forecasting while Part II will present multivariate statistical methods, more advanced forecasting methods, and multivariate methods. IBM SPSS Statistics offers a powerful set of statistical and information analysis systems that run on a wide variety of personal computers. The software is built around routines that have been developed, tested, and widely used for more than 20 years. As such, IBM SPSS Statistics is extensively used in industry, commerce, banking, local and national governments, and education. Just a small subset of users of the package include the major clearing banks, the BBC, British Gas, British Airway...

  11. What type of statistical model to choose for the analysis of radioimmunoassays

    International Nuclear Information System (INIS)

    Huet, S.

    1984-01-01

    The current techniques used for the statistical analysis of radioimmunoassays are not very satisfactory for either the statistician or the biologist. They are based on an attempt to make the response curve linear in order to avoid complicated computations. The present article shows that this practice has considerable effects (often neglected) on the statistical assumptions which must be formulated. A more rigorous analysis is proposed, applying the four-parameter logistic model. The advantages of this method are: the statistical assumptions formulated are based on observed data, and the model can be applied to almost all radioimmunoassays [fr]
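
    A minimal sketch of fitting the four-parameter logistic model directly to raw calibration data, as the article advocates; the dose-response values and starting parameters are invented for illustration.

```python
# Direct nonlinear fit of the four-parameter logistic (4PL) model.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, d, c, b):
    """Response at dose x: asymptotes a (low dose) and d (high dose), EC50 c, slope b."""
    return d + (a - d) / (1.0 + (x / c) ** b)

dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])       # ng/ml, hypothetical
counts = np.array([9800, 9200, 7600, 5200, 2900, 1500, 900])    # cpm, hypothetical

p0 = [counts.max(), counts.min(), 3.0, 1.0]                     # rough start values
params, cov = curve_fit(four_pl, dose, counts, p0=p0)
print(dict(zip("a d c b".split(), params.round(2))))
```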

  12. Computerized statistical analysis with bootstrap method in nuclear medicine

    International Nuclear Information System (INIS)

    Zoccarato, O.; Sardina, M.; Zatta, G.; De Agostini, A.; Barbesti, S.; Mana, O.; Tarolo, G.L.

    1988-01-01

    Statistical analysis of data samples involves some hypotheses about the features of the data themselves. The accuracy of these hypotheses can influence the results of statistical inference. Among the new methods of computer-aided statistical analysis, the bootstrap method appears to be one of the most powerful, thanks to its ability to reproduce many artificial samples starting from a single original sample, and because it works without hypotheses about the data distribution. The authors applied the bootstrap method to two typical situations of a Nuclear Medicine Department. The determination of the normal range of serum ferritin, as assessed by radioimmunoassay and defined by the mean value ±2 standard deviations, starting from an experimental sample of small size, gives an unacceptable lower limit (plasma ferritin levels below zero). On the contrary, the results obtained by elaborating 5000 bootstrap samples give an interval of values (10.95 ng/ml - 72.87 ng/ml) corresponding to the normal ranges commonly reported. Moreover, the authors applied the bootstrap method in evaluating the possible error associated with the correlation coefficient determined between left ventricular ejection fraction (LVEF) values obtained by first-pass radionuclide angiocardiography with 99m Tc and 195m Au. The results obtained indicate a high degree of statistical correlation and give the range of r² values to be considered acceptable for this type of study
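
    The ferritin example can be sketched as follows; the sample values are invented, and the percentile-based resampling scheme is one of several possible bootstrap estimators of a normal range, so the numbers will not reproduce the interval quoted above.

```python
# Bootstrap estimate of a normal range from a small, skewed sample.
import numpy as np

rng = np.random.default_rng(6)
ferritin = np.array([12., 15., 18., 22., 25., 31., 38., 47., 55., 70.])  # ng/ml

# Parametric mean +/- 2s can produce a physically impossible negative lower limit
m, s = ferritin.mean(), ferritin.std(ddof=1)
print("mean +/- 2s:", m - 2 * s, m + 2 * s)

# 5000 bootstrap resamples; estimate the 2.5th/97.5th percentiles per resample
boots = rng.choice(ferritin, size=(5000, ferritin.size), replace=True)
lows = np.percentile(boots, 2.5, axis=1)
highs = np.percentile(boots, 97.5, axis=1)
print("bootstrap normal range:", lows.mean(), highs.mean())   # stays non-negative
```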

  13. Software for statistical data analysis used in Higgs searches

    International Nuclear Information System (INIS)

    Gumpert, Christian; Moneta, Lorenzo; Cranmer, Kyle; Kreiss, Sven; Verkerke, Wouter

    2014-01-01

    The analysis and interpretation of data collected by the Large Hadron Collider (LHC) requires advanced statistical tools in order to quantify the agreement between observation and theoretical models. RooStats is a project providing a statistical framework for data analysis with the focus on discoveries, confidence intervals and combination of different measurements in both Bayesian and frequentist approaches. It employs the RooFit data modelling language where mathematical concepts such as variables, (probability density) functions and integrals are represented as C++ objects. RooStats and RooFit rely on the persistency technology of the ROOT framework. The usage of a common data format enables the concept of digital publishing of complicated likelihood functions. The statistical tools have been developed in close collaboration with the LHC experiments to ensure their applicability to real-life use cases. Numerous physics results have been produced using the RooStats tools, with the discovery of the Higgs boson by the ATLAS and CMS experiments being certainly the most popular among them. We will discuss tools currently used by LHC experiments to set exclusion limits, to derive confidence intervals and to estimate discovery significances based on frequentist statistics and the asymptotic behaviour of likelihood functions. Furthermore, new developments in RooStats and performance optimisation necessary to cope with complex models depending on more than 1000 variables will be reviewed

  14. PRECISE - pregabalin in addition to usual care: Statistical analysis plan

    NARCIS (Netherlands)

    S. Mathieson (Stephanie); L. Billot (Laurent); C. Maher (Chris); A.J. McLachlan (Andrew J.); J. Latimer (Jane); B.W. Koes (Bart); M.J. Hancock (Mark J.); I. Harris (Ian); R.O. Day (Richard O.); J. Pik (Justin); S. Jan (Stephen); C.-W.C. Lin (Chung-Wei Christine)

    2016-01-01

    textabstractBackground: Sciatica is a severe, disabling condition that lacks high quality evidence for effective treatment strategies. This a priori statistical analysis plan describes the methodology of analysis for the PRECISE study. Methods/design: PRECISE is a prospectively registered, double

  15. The Non-Equilibrium Statistical Distribution Function for Electrons and Holes in Semiconductor Heterostructures in Steady-State Conditions

    Directory of Open Access Journals (Sweden)

    Krzysztof Józwikowska

    2015-06-01

    The main goal of this work is to determine a statistical non-equilibrium distribution function for the electrons and holes in semiconductor heterostructures in steady-state conditions. Based on the postulates of local equilibrium, as well as on the integral form of the weighted Gyarmati’s variational principle in the force representation, using an alternative method, we have derived general expressions, which have the form of the Fermi–Dirac distribution function with four additional components. The physical interpretation of these components has been carried out in this paper. Some numerical results of a non-equilibrium distribution function for an electron in HgCdTe structures are also presented.

  16. The analysis for energy distribution and biological effects of the clusters from electrons in the tissue equivalent material

    International Nuclear Information System (INIS)

    Zhang Wenzhong; Guo Yong; Luo Yisheng; Wang Yong

    2004-01-01

    Objective: To study the energy distribution of the clusters produced by electrons in tissue equivalent material, and to discuss the important aspects of these clusters in inducing biological effects. Methods: Based on the physical mechanisms by which electrons interact with tissue equivalent material, the Monte Carlo (MC) method was used. The electron tracks were simulated in the material event by event (ionization, excitation, elastic scattering, Auger electron emission). The relevant conclusions were drawn from the statistical analysis of these events. Results: The electrons deposit about 30% of their energy in the form of clusters when passing through tissue equivalent material, and most clusters (80%) have an energy of more than 50 eV. The cluster density depends on the cluster diameter and the energy of the electrons, and the deposited energy in a cluster depends on the type and energy of the radiation. Conclusion: The deposited energy in a cluster is the most important factor in inducing all sorts of lesions in DNA molecules in tissue cells

  17. Statistical margin to DNB safety analysis approach for LOFT

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1982-01-01

    A method was developed and used for LOFT thermal safety analysis to estimate the statistical margin to DNB for the hot rod, and to base safety analysis on desired DNB probability limits. This method is an advanced approach using response surface analysis methods, a very efficient experimental design, and a 2nd-order response surface equation with a 2nd-order error propagation analysis to define the MDNBR probability density function. Calculations for limiting transients were used in the response surface analysis thereby including transient interactions and trip uncertainties in the MDNBR probability density

  18. Multivariate statistical analysis of atom probe tomography data

    International Nuclear Information System (INIS)

    Parish, Chad M.; Miller, Michael K.

    2010-01-01

    The application of spectrum imaging multivariate statistical analysis methods, specifically principal component analysis (PCA), to atom probe tomography (APT) data has been investigated. The mathematical method of analysis is described and the results for two example datasets are analyzed and presented. The first dataset is from the analysis of a PM 2000 Fe-Cr-Al-Ti steel containing two different ultrafine precipitate populations. PCA properly describes the matrix and precipitate phases in a simple and intuitive manner. A second APT example is from the analysis of an irradiated reactor pressure vessel steel. Fine, nm-scale Cu-enriched precipitates having a core-shell structure were identified and qualitatively described by PCA. Advantages, disadvantages, and future prospects for implementing these data analysis methodologies for APT datasets, particularly with regard to quantitative analysis, are also discussed.

  19. Development of statistical analysis code for meteorological data (W-View)

    Energy Technology Data Exchange (ETDEWEB)

    Tachibana, Haruo; Sekita, Tsutomu; Yamaguchi, Takenori [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    A computer code (W-View: Weather View) was developed to analyze meteorological data statistically based on 'the guideline of meteorological statistics for the safety analysis of nuclear power reactor' (Nuclear Safety Commission on January 28, 1982; revised on March 29, 2001). The code gives statistical meteorological data for assessing the public dose in cases of normal operation and severe accident, as required for a nuclear reactor operating license. The code was rewritten from the original version, which ran on a large office computer, to enable a personal computer user to analyze meteorological data simply and conveniently and to produce the statistical tables and figures of meteorology. (author)

  20. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    Science.gov (United States)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially to those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal of enhancing the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined, a governing structure was organized, and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  1. Big-data reflection high energy electron diffraction analysis for understanding epitaxial film growth processes.

    Science.gov (United States)

    Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V

    2014-10-28

    Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the data set are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for growth of La(x)Ca(1-x)MnO(3) films grown on etched (001) SrTiO(3) substrates, but is universal. The multivariate methods including principal component analysis and k-means clustering provide insight into the relevant behaviors, the timing and nature of a disordered to ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to use forward prediction methods to potentially allow significantly more control over growth process and hence final film quality.
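
    A minimal sketch of the multivariate pipeline described here, with synthetic frames standing in for a RHEED image sequence: flatten each frame to a vector, reduce with principal component analysis, then cluster the component scores with k-means to locate the timing of a growth-mode change.

```python
# PCA + k-means on a synthetic stand-in for a RHEED video.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
t = np.arange(600)                                  # frame index
frames = rng.normal(0, 0.05, (600, 32 * 32))        # 32x32 frames, flattened
frames[:300] += np.outer(np.sin(0.2 * t[:300]), rng.random(32 * 32))  # oscillatory LBL-like regime
frames[300:] += np.outer(0.5 + 0.001 * t[300:], rng.random(32 * 32))  # drifting 3D-like regime

scores = PCA(n_components=5).fit_transform(frames)  # per-frame weights of eigen-images
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

# The change point in the cluster labels marks the timing of the growth-mode change.
print(np.argmax(labels != labels[0]))
```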

  2. Recent advances in statistical energy analysis

    Science.gov (United States)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary feature of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusion is that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  3. Statistical analysis of tourism destination competitiveness

    Directory of Open Access Journals (Sweden)

    Attilio Gardini

    2013-05-01

    The growing relevance of the tourism industry for modern advanced economies has increased the interest among researchers and policy makers in the statistical analysis of destination competitiveness. In this paper we outline a new model of destination competitiveness based on sound theoretical grounds, and we develop a statistical test of the model on sample data based on Italian tourist destination decisions and choices. Our model focuses on the tourism decision process, which starts from the demand schedule for holidays and ends with the choice of a specific holiday destination. The demand schedule is a function of individual preferences and of destination positioning, while the final decision is a function of the initial demand schedule and the information concerning services for accommodation and recreation in the selected destinations. Moreover, we extend previous studies that focused on image or attributes (such as climate and scenery) by paying more attention to the services for accommodation and recreation in the holiday destinations. We test the proposed model using empirical data collected from a sample of 1,200 Italian tourists interviewed in 2007 (October - December). Data analysis shows that the selection probability for a destination included in the consideration set is not proportional to its share of inclusion, because the share of inclusion is determined by the brand image, while the selection of the actual holiday destination is influenced by the real supply conditions. The analysis of Italian tourists' preferences underlines the existence of a latent demand for foreign holidays, which points to a risk of market share reduction for the Italian tourism system in the global market. We also find a snowball effect which helps the most popular destinations, mainly in the northern Italian regions.

  4. Visual and statistical analysis of 18F-FDG PET in primary progressive aphasia

    International Nuclear Information System (INIS)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge; Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis

    2015-01-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. Ten raters analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA were high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. With cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)

  5. Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-10-01

    The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into consideration aspects of trial design and reporting specific to non-pharmacologic interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document is prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team as specified in the study protocol, and detailed in the study case report form, were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described, with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with a description of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events, is described. The primary, secondary and tertiary outcomes are described along with identification of subgroups to be analysed. A statistical analysis plan for the ARISE study has been developed, and is available in the public domain, prior to the completion of recruitment into the

  6. Tsallis non-extensive statistical mechanics in the ionospheric detrended total electron content during quiet and storm periods

    Science.gov (United States)

    Ogunsua, B. O.; Laoye, J. A.

    2018-05-01

    In this paper, the Tsallis non-extensive q-statistics of ionospheric dynamics were investigated using the total electron content (TEC) obtained from two Global Positioning System (GPS) receiver stations. This investigation was carried out considering geomagnetically quiet and storm periods. The micro-density variation of the ionospheric total electron content was extracted from the TEC data by a method of detrending. The detrended total electron content, which represents the variation in the internal dynamics of the system, was further analysed within non-extensive statistical mechanics using q-Gaussian methods. Our results reveal that for all the analysed data sets a Tsallis Gaussian probability distribution (q-Gaussian) with q > 1 was obtained. It was observed that there is no distinct difference in pattern between the values of q_quiet and q_storm. However, the values of q vary with geophysical conditions and possibly with local dynamics for the two stations. Also observed are the asymmetric pattern of the q-Gaussians and a highly significant correlation between the q-index values obtained at the two GPS receiver stations for the storm periods, compared to the quiet periods. This variation can mostly be attributed to the varying mechanisms resulting in the self-reorganization of the system dynamics during the storm periods. The result shows the existence of long-range correlations for both quiet and storm periods at the two stations.
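
    A sketch of the q-Gaussian fit such studies perform, run here on synthetic heavy-tailed data standing in for detrended TEC; the functional form with q > 1 reduces to an ordinary Gaussian as q approaches 1, and all numerical choices below are illustrative.

```python
# Fitting the Tsallis q-Gaussian to a histogram of heavy-tailed data.
import numpy as np
from scipy.optimize import curve_fit

def q_gaussian(x, a, q, beta):
    """f(x) = a * (1 + (q - 1) * beta * x**2) ** (-1 / (q - 1)), for q > 1."""
    return a * (1.0 + (q - 1.0) * beta * x**2) ** (-1.0 / (q - 1.0))

rng = np.random.default_rng(8)
detrended = rng.standard_t(df=4, size=20_000)      # heavy-tailed stand-in for dTEC

hist, edges = np.histogram(detrended, bins=120, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

(a, q, beta), _ = curve_fit(q_gaussian, centers, hist, p0=[0.4, 1.5, 1.0],
                            bounds=([0, 1.0001, 0], [np.inf, 3.0, np.inf]))
print(f"q = {q:.2f}")                               # > 1 signals heavy tails
```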

  7. Applied statistical methods in agriculture, health and life sciences

    CERN Document Server

    Lawal, Bayo

    2014-01-01

    This textbook teaches crucial statistical methods to answer research questions using a unique range of statistical software programs, including MINITAB and R. This textbook is developed for undergraduate students in agriculture, nursing, biology and biomedical research. Graduate students will also find it to be a useful way to refresh their statistics skills and to reference software options. The unique combination of examples is approached using MINITAB and R for their individual strengths. Subjects covered include among others data description, probability distributions, experimental design, regression analysis, randomized design and biological assay. Unlike other biostatistics textbooks, this text also includes outliers, influential observations in regression and an introduction to survival analysis. Material is taken from the author's extensive teaching and research in Africa, USA and the UK. Sample problems, references and electronic supplementary material accompany each chapter.

  8. Measuring the Success of an Academic Development Programme: A Statistical Analysis

    Science.gov (United States)

    Smith, L. C.

    2009-01-01

    This study uses statistical analysis to estimate the impact of first-year academic development courses in microeconomics, statistics, accountancy, and information systems, offered by the University of Cape Town's Commerce Academic Development Programme, on students' graduation performance relative to that achieved by mainstream students. The data…

  9. Inferential statistics of electron backscatter diffraction data from within individual crystalline grains

    DEFF Research Database (Denmark)

    Bachmann, Florian; Hielscher, Ralf; Jupp, Peter E.

    2010-01-01

    Highly concentrated distributed crystallographic orientation measurements within individual crystalline grains are analysed by means of ordinary statistics neglecting their spatial reference. Since crystallographic orientations are modelled as left cosets of a given subgroup of SO(3), the non-spatial statistical analysis adapts ideas borrowed from the Bingham quaternion distribution on the unit quaternion sphere. Special emphasis is put on the mathematical definition and the numerical determination of a 'mean orientation' characterizing the crystallographic grain, as well as on distinguishing several types of symmetry of the orientation distribution with respect to the mean orientation, like spherical, prolate or oblate symmetry. Applications to simulated as well as to experimental data are presented. All computations have been done with the free and open-source texture toolbox MTEX.

  10. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.

  11. Study of relationship between MUF correlation and detection sensitivity of statistical analysis

    International Nuclear Information System (INIS)

    Tamura, Toshiaki; Ihara, Hitoshi; Yamamoto, Yoichi; Ikawa, Koji

    1989-11-01

    Various kinds of statistical analysis have been proposed for NRTA (Near Real Time Materials Accountancy), which was devised to satisfy the timeliness goal among the IAEA detection goals. Different statistical results can be expected depending on whether rigorous error propagation (with MUF correlation) or simplified error propagation (without MUF correlation) is considered. Therefore, measurement simulation and decision analysis were carried out using a flow simulation of an 800 MTHM/Y model reprocessing plant, and the relationship between MUF correlation and the detection sensitivity and false-alarm behaviour of the statistical analysis was studied. The specific character of materials accountancy for the 800 MTHM/Y model reprocessing plant was grasped through this simulation. It also became clear that MUF correlation decreases not only the false-alarm rate but also the detection probability for protracted loss in the case of the CUMUF test and Page's test applied to NRTA. (author)
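
    Page's test, one of the NRTA tests named above, can be sketched as a one-sided CUSUM on the standardized MUF sequence; the reference value k, threshold h, and simulated protracted loss below are illustrative assumptions.

```python
# One-sided Page (CUSUM) test on a standardized MUF sequence.
import numpy as np

def page_test(muf, k=0.5, h=4.0):
    """Return the first alarm index for a one-sided Page's test, or -1."""
    s = 0.0
    for i, x in enumerate(muf):
        s = max(0.0, s + x - k)          # recursive CUSUM statistic
        if s > h:
            return i
    return -1

rng = np.random.default_rng(9)
muf = rng.normal(0.0, 1.0, 60)           # standardized MUF, no loss
muf[30:] += 0.4                          # small protracted loss from period 30

print(page_test(muf))                    # alarm some periods after the loss starts
```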

  12. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Science.gov (United States)

    2010-05-05

    ...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the... on Documenting Statistical Analysis Programs and Data Files; Availability'' giving interested persons...

  13. Hybrid statistics-simulations based method for atom-counting from ADF STEM images

    Energy Technology Data Exchange (ETDEWEB)

    De Wael, Annelies, E-mail: annelies.dewael@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); De Backer, Annick [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Jones, Lewys; Nellist, Peter D. [Department of Materials, University of Oxford, Parks Road, OX1 3PH Oxford (United Kingdom); Van Aert, Sandra, E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium)

    2017-06-15

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. - Highlights: • A hybrid method for atom-counting from ADF STEM images is introduced. • Image simulations are incorporated into a statistical framework in a reliable manner. • Limits of the existing methods for atom-counting are far exceeded. • Reliable counting results from an experimental low dose image are obtained. • Progress towards reliable quantitative analysis of beam-sensitive materials is made.

  14. Point defect characterization in HAADF-STEM images using multivariate statistical analysis

    International Nuclear Information System (INIS)

    Sarahan, Michael C.; Chi, Miaofang; Masiel, Daniel J.; Browning, Nigel D.

    2011-01-01

    Quantitative analysis of point defects is demonstrated through the use of multivariate statistical analysis. This analysis consists of principal component analysis for dimensional estimation and reduction, followed by independent component analysis to obtain physically meaningful, statistically independent factor images. Results from these analyses are presented in the form of factor images and scores. Factor images show characteristic intensity variations corresponding to physical structure changes, while scores relate how much those variations are present in the original data. The application of this technique is demonstrated on a set of experimental images of dislocation cores along a low-angle tilt grain boundary in strontium titanate. A relationship between chemical composition and lattice strain is highlighted in the analysis results, with picometer-scale shifts in several columns measurable from compositional changes in a separate column. -- Research Highlights: → Multivariate analysis of HAADF-STEM images. → Distinct structural variations among SrTiO 3 dislocation cores. → Picometer atomic column shifts correlated with atomic column population changes.

  15. STATCAT, Statistical Analysis of Parametric and Non-Parametric Data

    International Nuclear Information System (INIS)

    David, Hugh

    1990-01-01

    1 - Description of program or function: A suite of 26 programs designed to facilitate the appropriate statistical analysis and data handling of parametric and non-parametric data, using classical and modern univariate and multivariate methods. 2 - Method of solution: Data is read entry by entry, using a choice of input formats, and the resultant data bank is checked for out-of-range, rare, extreme or missing data. The completed STATCAT data bank can be treated by a variety of descriptive and inferential statistical methods, and modified, using other standard programs as required.

  16. Methods of Analysis of Electronic Money in Banks

    Directory of Open Access Journals (Sweden)

    Melnychenko Oleksandr V.

    2014-03-01

    The article identifies methods of analysis of electronic money, formalises its instruments and offers an integral indicator, which should be calculated by issuing banks and by banks that carry out operations with electronic money issued by other banks. Calculation of the integral indicator would allow complex assessment of activity of the studied bank with electronic money and would allow comparison of parameters of different banks by the aggregate of indicators for the study of the electronic money market, its level of development, etc. The article presents methods which envisage economic analysis of electronic money in banks in the following directions: solvency and liquidity, efficiency of electronic money issue, business activity of the bank and social responsibility. Moreover, the proposed indicators for each of the directions are offered to be taken into account when building integral indicators, with the help of which banks are studied: business activity, profitability, solvency, liquidity and so on.

  17. FADTTS: functional analysis of diffusion tensor tract statistics.

    Science.gov (United States)

    Zhu, Hongtu; Kong, Linglong; Li, Runze; Styner, Martin; Gerig, Guido; Lin, Weili; Gilmore, John H

    2011-06-01

    The aim of this paper is to present a functional analysis of a diffusion tensor tract statistics (FADTTS) pipeline for delineating the association between multiple diffusion properties along major white matter fiber bundles with a set of covariates of interest, such as age, diagnostic status and gender, and the structure of the variability of these white matter tract properties in various diffusion tensor imaging studies. The FADTTS integrates five statistical tools: (i) a multivariate varying coefficient model for allowing the varying coefficient functions in terms of arc length to characterize the varying associations between fiber bundle diffusion properties and a set of covariates, (ii) a weighted least squares estimation of the varying coefficient functions, (iii) a functional principal component analysis to delineate the structure of the variability in fiber bundle diffusion properties, (iv) a global test statistic to test hypotheses of interest, and (v) a simultaneous confidence band to quantify the uncertainty in the estimated coefficient functions. Simulated data are used to evaluate the finite sample performance of FADTTS. We apply FADTTS to investigate the development of white matter diffusivities along the splenium of the corpus callosum tract and the right internal capsule tract in a clinical study of neurodevelopment. FADTTS can be used to facilitate the understanding of normal brain development, the neural bases of neuropsychiatric disorders, and the joint effects of environmental and genetic factors on white matter fiber bundles. The advantages of FADTTS compared with the other existing approaches are that it is capable of modeling the structured inter-subject variability, testing the joint effects, and constructing simultaneous confidence bands. However, FADTTS is not crucial for estimation and reduces to the functional analysis method for the single measure. Copyright © 2011 Elsevier Inc. All rights reserved.

  18. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
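
    The p-chart machinery behind such an analysis is compact; a minimal Python sketch (the periodic counts below are invented, not the study's data) is:

        import numpy as np

        events = np.array([41, 38, 55, 47, 60, 39])        # adverse events per period (hypothetical)
        cases  = np.array([240, 250, 245, 260, 255, 238])  # anesthetics per period (hypothetical)

        p = events / cases                            # observed problem rates
        p_bar = events.sum() / cases.sum()            # centre line of the p-chart
        sigma = np.sqrt(p_bar * (1 - p_bar) / cases)  # binomial standard error per period
        ucl = p_bar + 3 * sigma                       # upper control limit
        lcl = np.clip(p_bar - 3 * sigma, 0, None)     # lower control limit (floored at 0)

        out_of_control = (p > ucl) | (p < lcl)        # points signalling an unstable process
        print(p_bar, out_of_control)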

  19. Effect of the absolute statistic on gene-sampling gene-set analysis methods.

    Science.gov (United States)

    Nam, Dougu

    2017-06-01

    Gene-set enrichment analysis and its modified versions have commonly been used for identifying altered functions or pathways in disease from microarray data. In particular, the simple gene-sampling gene-set analysis methods have been heavily used for datasets with only a few sample replicates. The biggest problem with this approach is the highly inflated false-positive rate. In this paper, the effect of absolute gene statistic on gene-sampling gene-set analysis methods is systematically investigated. Thus far, the absolute gene statistic has merely been regarded as a supplementary method for capturing the bidirectional changes in each gene set. Here, it is shown that incorporating the absolute gene statistic in gene-sampling gene-set analysis substantially reduces the false-positive rate and improves the overall discriminatory ability. Its effect was investigated by power, false-positive rate, and receiver operating curve for a number of simulated and real datasets. The performances of gene-set analysis methods in one-tailed (genome-wide association study) and two-tailed (gene expression data) tests were also compared and discussed.
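
    The core idea, scoring a gene set by the mean absolute gene statistic and calibrating it by gene sampling, fits in a few lines of Python; this sketch uses synthetic t-statistics and an arbitrary set size, neither taken from the paper:

        import numpy as np

        rng = np.random.default_rng(1)
        t = rng.standard_normal(10000)            # per-gene statistics (hypothetical)
        gene_set = rng.choice(10000, 50, replace=False)

        observed = np.abs(t[gene_set]).mean()     # absolute gene statistic for the set
        null = np.array([np.abs(t[rng.choice(10000, 50, replace=False)]).mean()
                         for _ in range(2000)])   # gene-sampling null distribution
        p_value = (1 + np.sum(null >= observed)) / (1 + len(null))
        print(p_value)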

  20. ODM Data Analysis: A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    Science.gov (United States)

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step for presenting results of clinical studies is the declaration of participants' demographic and baseline characteristics as required by FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system in use and import it into statistical software like SAS software or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After a file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are only stored in the application while the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can be used as a starting point for their examination and reporting.

  1. An improved method for statistical analysis of raw accelerator mass spectrometry data

    International Nuclear Information System (INIS)

    Gutjahr, A.; Phillips, F.; Kubik, P.W.; Elmore, D.

    1987-01-01

    Hierarchical statistical analysis is an appropriate method for statistical treatment of raw accelerator mass spectrometry (AMS) data. Using Monte Carlo simulations we show that this method yields more accurate estimates of isotope ratios and analytical uncertainty than the generally used propagation of errors approach. The hierarchical analysis is also useful in design of experiments because it can be used to identify sources of variability. 8 refs., 2 figs
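
    As one concrete (and simplified) reading of a hierarchical treatment, the one-way random-effects sketch below separates between-run and within-run variance of a measured ratio; the run structure and all numbers are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(2)
        # Hypothetical raw ratios: 5 runs of 8 measurement cycles with run-to-run drift.
        runs = [1.0e-12 + rng.normal(0, 3e-14) + rng.normal(0, 5e-14, size=8)
                for _ in range(5)]

        means = np.array([r.mean() for r in runs])
        n = np.array([r.size for r in runs])
        k, N = len(runs), n.sum()

        within = sum(((r - r.mean()) ** 2).sum() for r in runs) / (N - k)  # pooled within-run variance
        grand = np.average(means, weights=n)
        ms_between = (n * (means - grand) ** 2).sum() / (k - 1)
        between = max((ms_between - within) / n.mean(), 0.0)               # run-to-run variance component

        sem = np.sqrt(between / k + within / N)  # uncertainty of the grand-mean ratio
        print(grand, sem)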

  2. Statistical Image Analysis of Tomograms with Application to Fibre Geometry Characterisation

    DEFF Research Database (Denmark)

    Emerson, Monica Jane

    The goal of this thesis is to develop statistical image analysis tools to characterise the micro-structure of complex materials used in energy technologies, with a strong focus on fibre composites. These quantification tools are based on extracting geometrical parameters defining structures from 2D...... with high resolution both in space and time to observe fast micro-structural changes. This thesis demonstrates that statistical image analysis combined with X-ray CT opens up numerous possibilities for understanding the behaviour of fibre composites under real life conditions. Besides enabling...

  3. Nonstationary statistical theory for multipactor

    International Nuclear Information System (INIS)

    Anza, S.; Vicente, C.; Gil, J.; Boria, V. E.; Gimeno, B.; Raboso, D.

    2010-01-01

    This work presents a new and general approach to the real dynamics of the multipactor process: the nonstationary statistical multipactor theory. The nonstationary theory removes the stationarity assumption of the classical theory and, as a consequence, it is able to adequately model electron exponential growth as well as absorption processes, above and below the multipactor breakdown level. In addition, it considers both double-surface and single-surface interactions constituting a full framework for nonresonant polyphase multipactor analysis. This work formulates the new theory and validates it with numerical and experimental results with excellent agreement.

  4. The art of data analysis how to answer almost any question using basic statistics

    CERN Document Server

    Jarman, Kristin H

    2013-01-01

    A friendly and accessible approach to applying statistics in the real world. With an emphasis on critical thinking, The Art of Data Analysis: How to Answer Almost Any Question Using Basic Statistics presents fun and unique examples, guides readers through the entire data collection and analysis process, and introduces basic statistical concepts along the way. Leaving proofs and complicated mathematics behind, the author portrays the more engaging side of statistics and emphasizes its role as a problem-solving tool. In addition, light-hearted case studies

  5. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    Science.gov (United States)

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), usually yield high-dimensional data sets of expression values for hundreds or thousands of proteins which are, however, observed on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to obtain tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, focus is put on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels as well as methods for data preprocessing are covered.

  6. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...... analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...

  7. Energy dispersive X-ray analysis in the electron microscope

    CERN Document Server

    Bell, DC

    2003-01-01

    This book provides an in-depth description of x-ray microanalysis in the electron microscope. It is sufficiently detailed to ensure that novices will understand the nuances of high-quality EDX analysis. Includes information about hardware design as well as the physics of x-ray generation, absorption and detection, and most post-detection data processing. Details on electron optics and electron probe formation allow the novice to make sensible adjustments to the electron microscope in order to set up a system which optimises analysis. It also helps the reader determine which microanalytical me

  8. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Binh T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.

  9. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
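
    One way to picture the regression-based check described in these two records is a linear fit of thermocouple readings against calculated temperatures, with large residuals flagging suspect instruments; a minimal Python sketch with invented data:

        import numpy as np

        rng = np.random.default_rng(8)
        t_calc = rng.uniform(900.0, 1100.0, 200)             # calculated fuel temperatures, degC (hypothetical)
        t_tc = 0.95 * t_calc + 20.0 + rng.normal(0, 5, 200)  # thermocouple readings (hypothetical)
        t_tc[::50] += 60.0                                   # inject a few failing readings

        slope, intercept = np.polyfit(t_calc, t_tc, 1)       # regression model of the readings
        resid = t_tc - (slope * t_calc + intercept)
        suspect = np.abs(resid) > 3 * resid.std()            # 3-sigma residual rule
        print(slope, intercept, np.flatnonzero(suspect))     # indices of suspect thermocouples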

  10. Statistical Challenges of Big Data Analysis in Medicine

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2015-01-01

    Vol. 3, No. 1 (2015), pp. 24-27 ISSN 1805-8698 R&D Projects: GA ČR GA13-23940S Grant - others: CESNET Development Fund (CZ) 494/2013 Institutional support: RVO:67985807 Keywords: big data * variable selection * classification * cluster analysis Subject RIV: BB - Applied Statistics, Operational Research http://www.ijbh.org/ijbh2015-1.pdf

  11. Statistical Analysis of Hypercalcaemia Data related to Transferability

    DEFF Research Database (Denmark)

    Frølich, Anne; Nielsen, Bo Friis

    2005-01-01

    In this report we describe statistical analysis related to a study of hypercalcaemia carried out in the Copenhagen area in the ten-year period from 1984 to 1994. Results from the study have previously been published in a number of papers [3, 4, 5, 6, 7, 8, 9] and in various abstracts and posters...... at conferences during the late eighties and early nineties. In this report we give a more detailed description of many of the analyses and provide some new results, primarily by simultaneous studies of several databases....

  12. Statistical analysis of questionnaires a unified approach based on R and Stata

    CERN Document Server

    Bartolucci, Francesco; Gnaldi, Michela

    2015-01-01

    Statistical Analysis of Questionnaires: A Unified Approach Based on R and Stata presents special statistical methods for analyzing data collected by questionnaires. The book takes an applied approach to testing and measurement tasks, mirroring the growing use of statistical methods and software in education, psychology, sociology, and other fields. It is suitable for graduate students in applied statistics and psychometrics and practitioners in education, health, and marketing.The book covers the foundations of classical test theory (CTT), test reliability, va

  13. Reducing bias in the analysis of counting statistics data

    International Nuclear Information System (INIS)

    Hammersley, A.P.; Antoniadis, A.

    1997-01-01

    In the analysis of counting statistics data it is common practice to estimate the variance of the measured data points as the data points themselves. This practice introduces a bias into the results of further analysis which may be significant, and under certain circumstances lead to false conclusions. In the case of normal weighted least squares fitting this bias is quantified and methods to avoid it are proposed. (orig.)
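
    The bias is easy to reproduce. In the Python sketch below (synthetic Poisson counts), weighting a mean by the inverse of the data turns it into a harmonic mean, which sits measurably below the true rate:

        import numpy as np

        rng = np.random.default_rng(3)
        mu = 25.0
        y = rng.poisson(mu, size=(20000, 10)).astype(float)
        y[y == 0] = 1.0                     # guard against zero weights

        w = 1.0 / y                         # "variance = data" weighting
        biased = (w * y).sum(axis=1) / w.sum(axis=1)  # weighted mean = harmonic mean
        plain = y.mean(axis=1)                        # unweighted mean of the counts

        print(biased.mean(), plain.mean())  # the weighted estimate falls below mu = 25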

  14. Benchmark validation of statistical models: Application to mediation analysis of imagery and memory.

    Science.gov (United States)

    MacKinnon, David P; Valente, Matthew J; Wurpts, Ingrid C

    2018-03-29

    This article describes benchmark validation, an approach to validating a statistical model. According to benchmark validation, a valid model generates estimates and research conclusions consistent with a known substantive effect. Three types of benchmark validation, (a) benchmark value, (b) benchmark estimate, and (c) benchmark effect, are described and illustrated with examples. Benchmark validation methods are especially useful for statistical models with assumptions that are untestable or very difficult to test. Benchmark effect validation methods were applied to evaluate statistical mediation analysis in eight studies using the established effect that increasing mental imagery improves recall of words. Statistical mediation analysis led to conclusions about mediation that were consistent with established theory that increased imagery leads to increased word recall. Benchmark validation based on established substantive theory is discussed as a general way to investigate characteristics of statistical models and a complement to mathematical proof and statistical simulation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  15. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  16. Bayesian statistics applied to neutron activation data for reactor flux spectrum analysis

    International Nuclear Information System (INIS)

    Chiesa, Davide; Previtali, Ezio; Sisti, Monica

    2014-01-01

    Highlights: • Bayesian statistics to analyze the neutron flux spectrum from activation data. • Rigorous statistical approach for accurate evaluation of the neutron flux groups. • Cross section and activation data uncertainties included for the problem solution. • Flexible methodology applied to analyze different nuclear reactor flux spectra. • The results are in good agreement with the MCNP simulations of neutron fluxes. - Abstract: In this paper, we present a statistical method, based on Bayesian statistics, to analyze the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation experiment performed at the TRIGA Mark II reactor of Pavia University (Italy) in four irradiation positions characterized by different neutron spectra. In order to evaluate the neutron flux spectrum, subdivided in energy groups, a system of linear equations, containing the group effective cross sections and the activation rate data, has to be solved. However, since the system's coefficients are experimental data affected by uncertainties, a rigorous statistical approach is fundamental for an accurate evaluation of the neutron flux groups. For this purpose, we applied Bayesian statistical analysis, which allows one to include the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, was used to define the problem statistical model and solve it. The first analysis involved the determination of the thermal, resonance-intermediate and fast flux components and the dependence of the results on the prior distribution choice was investigated to confirm the reliability of the Bayesian analysis. After that, the main resonances of the activation cross sections were analyzed to implement multi-group models with finer energy subdivisions that would allow to determine the
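
    A toy version of this unfolding, with two flux groups, three activation rates and a hand-rolled Metropolis sampler, is sketched below in Python; the cross sections, uncertainties and flat positivity prior are all assumptions of the sketch, not the paper's model:

        import numpy as np

        rng = np.random.default_rng(4)
        sigma = np.array([[5.0, 0.1],   # effective cross sections, arbitrary units
                          [1.0, 2.0],
                          [0.2, 4.0]])
        phi_true = np.array([3.0, 1.0])
        r_err = 0.05 * (sigma @ phi_true)
        r_obs = sigma @ phi_true + r_err * rng.standard_normal(3)  # noisy activation rates

        def log_post(phi):
            if np.any(phi < 0):         # flat prior restricted to non-negative fluxes
                return -np.inf
            resid = (r_obs - sigma @ phi) / r_err
            return -0.5 * np.sum(resid ** 2)

        phi = np.array([1.0, 1.0])
        lp = log_post(phi)
        chain = []
        for _ in range(20000):
            prop = phi + 0.05 * rng.standard_normal(2)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
                phi, lp = prop, lp_prop
            chain.append(phi)
        chain = np.array(chain[5000:])                 # discard burn-in
        print(chain.mean(axis=0), chain.std(axis=0))   # posterior flux-group estimates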

  17. Reactor noise analysis by statistical pattern recognition methods

    International Nuclear Information System (INIS)

    Howington, L.C.; Gonzalez, R.C.

    1976-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis is presented. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, updating, and data compacting capabilities. System design emphasizes control of the false-alarm rate. Its abilities to learn normal patterns, to recognize deviations from these patterns, and to reduce the dimensionality of data with minimum error were evaluated by experiments at the Oak Ridge National Laboratory (ORNL) High-Flux Isotope Reactor. Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the pattern recognition system

  18. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James (Fermilab)

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  19. Application of a statistical thermal design procedure to evaluate the PWR DNBR safety analysis limits

    International Nuclear Information System (INIS)

    Robeyns, J.; Parmentier, F.; Peeters, G.

    2001-01-01

    In the framework of safety analysis for the Belgian nuclear power plants and for the reload compatibility studies, Tractebel Energy Engineering (TEE) has developed, to define a 95/95 DNBR criterion, a statistical thermal design method based on the analytical full statistical approach: the Statistical Thermal Design Procedure (STDP). In that methodology, each DNBR value in the core assemblies is calculated with an adapted CHF (Critical Heat Flux) correlation implemented in the sub-channel code Cobra for core thermal hydraulic analysis. The uncertainties of the correlation are represented by the statistical parameters calculated from an experimental database. The main objective of a sub-channel analysis is to prove that in all class 1 and class 2 situations, the minimum DNBR (Departure from Nucleate Boiling Ratio) remains higher than the Safety Analysis Limit (SAL). The SAL value is calculated from the Statistical Design Limit (SDL) value adjusted with some penalties and deterministic factors. The search for a realistic value of the SDL is the objective of statistical thermal design methods. In this report, we apply a full statistical approach to define the DNBR criterion or SDL (Statistical Design Limit) in strict observance of the design criteria defined in the Standard Review Plan. The same statistical approach is used to define the expected number of rods experiencing DNB. (author)

  20. Analytical and statistical analysis of elemental composition of lichens

    International Nuclear Information System (INIS)

    Calvelo, S.; Baccala, N.; Bubach, D.; Arribere, M.A.; Ribeiro Guevara, S.

    1997-01-01

    The elemental composition of lichens from remote southern South America regions has been studied with analytical and statistical techniques to determine whether the values obtained reflect species, growth forms or habitat characteristics. The enrichment factors are calculated, discriminated by species and collection site, and compared with data available in the literature. The elemental concentrations are standardized and compared for different species. The information was statistically processed; a cluster analysis was performed using the first three principal axes of the PCA, and the three groups formed are presented. Their relationship with the species, collection sites and the lichen growth forms is interpreted. (author)

  1. The Fusion of Financial Analysis and Seismology: Statistical Methods from Financial Market Analysis Applied to Earthquake Data

    Science.gov (United States)

    Ohyanagi, S.; Dileonardo, C.

    2013-12-01

    As a natural phenomenon, earthquake occurrence is difficult to predict. Statistical analysis of earthquake data was performed using candlestick chart and Bollinger Band methods. These statistical methods, commonly used in the financial world to analyze market trends, were tested against earthquake data. Earthquakes above Mw 4.0 located offshore of Sanriku (37.75°N ~ 41.00°N, 143.00°E ~ 144.50°E) from February 1973 to May 2013 were selected for analysis. Two specific patterns in earthquake occurrence were recognized through the analysis. One is a spread of the candlestick prior to the occurrence of events greater than Mw 6.0. A second pattern shows convergence in the Bollinger Band, which implies a positive or negative change in the trend of earthquakes. Both patterns match general models for the buildup and release of strain through the earthquake cycle, and agree with both the characteristics of the candlestick chart and Bollinger Band analysis. These results show there is a high correlation between patterns in earthquake occurrence and trend analysis by these two statistical methods. The results of this study support the appropriateness of applying these financial analysis methods to the analysis of earthquake occurrence.
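
    The Bollinger Band part of such an analysis is a rolling mean with a ±2 standard deviation envelope; the short Python sketch below applies it to a synthetic magnitude sequence (the window length and data are assumptions, not the study's):

        import numpy as np

        rng = np.random.default_rng(5)
        mags = 4.0 + rng.exponential(0.5, size=300)  # stand-in for an Mw >= 4.0 sequence

        w = 20
        kernel = np.ones(w) / w
        mid = np.convolve(mags, kernel, mode="valid")        # rolling mean
        sq = np.convolve(mags ** 2, kernel, mode="valid")
        sd = np.sqrt(np.maximum(sq - mid ** 2, 0.0))         # rolling standard deviation
        upper, lower = mid + 2 * sd, mid - 2 * sd

        bandwidth = (upper - lower) / mid  # band convergence flags a change in trend
        print(bandwidth.min(), bandwidth.max())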

  2. Parametric analysis of the statistical model of the stick-slip process

    Science.gov (United States)

    Lima, Roberta; Sampaio, Rubens

    2017-06-01

    In this paper, a parametric analysis of the statistical model of the response of a dry-friction oscillator is performed. The oscillator is a spring-mass system which moves over a base with a rough surface. Due to this roughness, the mass is subject to a dry-frictional force modeled as a Coulomb friction. The system is stochastically excited by an imposed bang-bang base motion. The base velocity is modeled by a Poisson process for which a probabilistic model is fully specified. The excitation induces in the system stochastic stick-slip oscillations. The system response is composed of a random sequence alternating stick and slip modes. From realizations of the system, a statistical model is constructed for this sequence. In this statistical model, the variables of interest of the sequence are modeled as random variables, as for example the number of time intervals in which stick or slip occur, the instants at which they begin, and their duration. Samples of the system response are computed by integration of the dynamic equation of the system using independent samples of the base motion. Statistics and histograms of the random variables which characterize the stick-slip process are estimated for the generated samples. The objective of the paper is to analyze how these estimated statistics and histograms vary with the system parameters, i.e., to make a parametric analysis of the statistical model of the stick-slip process.

  3. Introduction to applied statistical signal analysis guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this background, the reader will coast through the practical introduction and move on to signal analysis techniques, commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical

  4. Visual and statistical analysis of ¹⁸F-FDG PET in primary progressive aphasia

    Energy Technology Data Exchange (ETDEWEB)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge [Hospital Clinico San Carlos, Department of Neurology, Madrid (Spain); Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis [San Carlos Health Research Institute (IdISSC) Complutense University of Madrid, Department of Nuclear Medicine, Hospital Clinico San Carlos, Madrid (Spain)

    2015-05-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. There were 10 raters who analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA was high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. In cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)

  5. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.

  6. ARL Supplementary Statistics, 2006-2007

    Science.gov (United States)

    Bland, Les, Comp.; Kyrillidou, Martha, Comp.

    2009-01-01

    This report presents statistics on how Association of Research Libraries (ARL) member libraries spend money on electronic resources. This report indicates that 108 ARL libraries purchased 25,006,758 electronic books. In 2006-2007, there was an ARL median of 243,725 acquisitions of electronic books (this includes one institution that purchased…

  7. ARL Supplementary Statistics, 2007-2008

    Science.gov (United States)

    Bland, Les, Comp.; Kyrillidou, Martha, Comp.

    2009-01-01

    This report presents statistics on how Association of Research Libraries (ARL) member libraries spend money on electronic resources. This report indicates that 109 ARL libraries purchased 32,329,187 electronic books. In 2007-2008, there was a median of 28,319 acquisitions of electronic books by ARL libraries (this includes one institution that…

  8. Statistical analysis of extreme values from insurance, finance, hydrology and other fields

    CERN Document Server

    Reiss, Rolf-Dieter

    1997-01-01

    The statistical analysis of extreme data is important for various disciplines, including hydrology, insurance, finance, engineering and environmental sciences. This book provides a self-contained introduction to the parametric modeling, exploratory analysis and statistical inference for extreme values. The entire text of this third edition has been thoroughly updated and rearranged to meet the new requirements. Additional sections and chapters, elaborated on more than 100 pages, are particularly concerned with topics like dependencies, the conditional analysis and the multivariate modeling of extreme data. Parts I–III about the basic extreme value methodology remain largely unchanged, yet notable are, e.g., the new sections about "An Overview of Reduced-Bias Estimation" (co-authored by M.I. Gomes), "The Spectral Decomposition Methodology", and "About Tail Independence" (co-authored by M. Frick), and the new chapter about "Extreme Value Statistics of Dependent Random Variables" (co-authored ...

  9. Power flow as a complement to statistical energy analysis and finite element analysis

    Science.gov (United States)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly. On the other hand, SEA methods can only predict an average level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. This power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to form a verification of the method. This method of structural analysis, using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.
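
    In mobility terms, the quantity at the heart of such a power flow analysis is the time-averaged input power under a harmonic force, P = (1/2)|F|² Re(M), with M the input mobility. A small Python sketch (a single-degree-of-freedom mobility with invented parameters, purely illustrative of the relation):

        import numpy as np

        # Input mobility M(w) = v/F of a damped spring-mass oscillator (assumed system).
        m, k, c = 1.0, 1.0e4, 2.0
        w = np.linspace(1.0, 400.0, 2000)
        M = 1j * w / (k - m * w**2 + 1j * w * c)

        F = 10.0                         # harmonic force amplitude (hypothetical)
        P_in = 0.5 * F**2 * M.real       # time-averaged input power at each frequency
        print(w[np.argmax(P_in)])        # peaks near resonance, sqrt(k/m) = 100 rad/s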

  10. Developments in quantitative electron probe microanalysis

    International Nuclear Information System (INIS)

    Tixier, R.

    1977-01-01

    A study of the range of validity of the formulae for corrections used with massive specimen analysis is made. The method used is original; we have shown that it was possible to use a property of invariability of corrected intensity ratios for standards. This invariance property provides a test for the self-consistency of the theory. The theoretical and experimental conditions required for quantitative electron probe microanalysis of thin transmission electron microscope specimens are examined. The correction formulae for atomic number, absorption and fluorescence effects are calculated. Several examples of experimental results are given, relative to the quantitative analysis of intermetallic precipitates and carbides in steels. Advances in applications of electron probe instruments related to the use of computers and the present development of fully automated instruments are reviewed. The necessary statistics for measurements of X-ray count data are studied. Estimation procedures and tests are developed. These methods are used to perform a statistical check of electron probe microanalysis measurements and to reject rogue values. An estimator of the confidence interval of the apparent concentration is derived. Formulae were also obtained to optimize the counting time in order to obtain the best precision in a minimum amount of time [fr]

  11. EDI Performance Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...

  12. Reliability prediction system based on the failure rate model for electronic components

    International Nuclear Information System (INIS)

    Lee, Seung Woo; Lee, Hwa Ki

    2008-01-01

    Although many methodologies for predicting the reliability of electronic components have been developed, their predictions can be subjective under a particular set of circumstances, and therefore reliability is not easy to quantify. Among the reliability prediction methods are the statistical analysis based method, the similarity analysis method based on an external failure rate database, and the method based on the physics-of-failure model. In this study, we developed a system that predicts the reliability of electronic components using the statistical analysis based method, the approach most easily turned into a system. The failure rate models that were applied are MIL-HDBK-217F N2, PRISM, and Telcordia (Bellcore), and these were compared with the general purpose system in order to validate the effectiveness of the developed system. Being able to predict the reliability of electronic components from the design stage, the system that we have developed is expected to contribute to enhancing the reliability of electronic components.

  13. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  14. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: • The paper discusses the validation of creep rupture models derived from statistical analysis. • It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. • The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. • The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  15. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.

  16. Statistical analysis of solar proton events

    Directory of Open Access Journals (Sweden)

    V. Kurt

    2004-06-01

    A new catalogue of 253 solar proton events (SPEs) with energy >10 MeV and peak intensity >10 protons/(cm² s sr) (pfu) at the Earth's orbit for three complete 11-year solar cycles (1970-2002) is given. A statistical analysis of this data set of SPEs and their associated flares that occurred during this time period is presented. It is outlined that 231 of these proton events are flare related and only 22 of them are not associated with Hα flares. It is also noteworthy that 42 of these events are registered as Ground Level Enhancements (GLEs) in neutron monitors. The longitudinal distribution of the associated flares shows that a great number of these events are connected with west flares. This analysis enables one to understand the long-term dependence of the SPEs and the related flare characteristics on the solar cycle, which is useful for space weather prediction.

  17. STATISTICAL ANALYSIS OF THE HEAVY NEUTRAL ATOMS MEASURED BY IBEX

    International Nuclear Information System (INIS)

    Park, Jeewoo; Kucharek, Harald; Möbius, Eberhard; Galli, André; Livadiotis, George; Fuselier, Steve A.; McComas, David J.

    2015-01-01

    We investigate the directional distribution of heavy neutral atoms in the heliosphere by using heavy neutral maps generated with the IBEX-Lo instrument over three years from 2009 to 2011. The interstellar neutral (ISN) O and Ne gas flow was found in the first-year heavy neutral map at 601 eV and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps to get a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to exclude background from areas where the heavy neutral signal is statistically significant. These methods also allow the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O and Ne gas flow. We call this emission the extended tail. It may be an imprint of the secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.

  18. Analysis of an electron gun for metal fusion

    International Nuclear Information System (INIS)

    Paes, A.C.J.; Galvao, R.M.O.; Boscolo, P.; Passaro, A.

    1987-09-01

    The characteristics of the electron beam of the HK-011600 Δ electron gun for metal fusion at the Divisao de Materiais do Instituto de Pesquisa e Desenvolvimento do CTA (PMR/IPD/CTA) are analyzed. In this analysis, the Pierce gun model and the SLAC computational code for electron optics are used. The electron beam R and Z profiles are obtained in the gun region and in the magnetic lens region. The behaviour of the electron beam in the prism region is also discussed using a simple model. (author) [pt]

  19. Statistics of electron multiplication in multiplier phototube: iterative method

    International Nuclear Information System (INIS)

    Grau Malonda, A.; Ortiz Sanchez, J.F.

    1985-01-01

    An iterative method is applied to study the variation of dynode response in the multiplier phototube. Three different situations are considered, corresponding to the following ways of electron incidence on the first dynode: incidence of exactly one electron, incidence of exactly r electrons, and incidence of an average of r̄ electrons. The responses are given for a number of steps between 1 and 5, and for values of the multiplication factor of 2.1, 2.5, 3 and 5. We also study the variance, the skewness and the excess kurtosis for different multiplication factors. (author)
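
    A Monte Carlo counterpart of the cascade is easy to write down. The Python sketch below pushes one electron through five dynodes with Poisson multiplication (the Poisson secondary-emission model is this sketch's assumption, while the gain and stage count are taken from the ranges quoted above):

        import numpy as np

        rng = np.random.default_rng(6)
        gain, stages, trials = 2.5, 5, 200000

        n = np.ones(trials, dtype=np.int64)  # one electron incident on the first dynode
        for _ in range(stages):
            n = rng.poisson(gain * n)        # each electron releases ~Poisson(gain) secondaries

        mean, var = n.mean(), n.var()
        skew = ((n - mean) ** 3).mean() / var ** 1.5
        kurt_excess = ((n - mean) ** 4).mean() / var ** 2 - 3
        print(mean, var, skew, kurt_excess)  # mean should approach gain**stages (about 97.7)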

  20. Big Data as a Source for Official Statistics

    Directory of Open Access Journals (Sweden)

    Daas Piet J.H.

    2015-06-01

    More and more data are being produced by an increasing number of electronic devices physically surrounding us and on the internet. The large amount of data and the high frequency at which they are produced have resulted in the introduction of the term ‘Big Data’. Because these data reflect many different aspects of our daily lives and because of their abundance and availability, Big Data sources are very interesting from an official statistics point of view. This article discusses the exploration of both opportunities and challenges for official statistics associated with the application of Big Data. Experiences gained with analyses of large amounts of Dutch traffic loop detection records and Dutch social media messages are described to illustrate the topics characteristic of the statistical analysis and use of Big Data.

  1. Construction of Electronics Database for East Asian Countries and Empirical Analysis of International Competitiveness of Japanese Companies (Japanese)

    OpenAIRE

    MOTOHASHI Kazuyuki

    2010-01-01

    The international competitiveness of Japanese electronics firms is fading as firms in East Asian countries such as China, Korea, and Taiwan catch up. In this paper, we have constructed an electronics industry database from 1996 to 2005 for China, Korea, Japan, Taiwan, and the United States. It covers industrial statistics in these countries including trade and overseas production statistics, which makes it possible to control for global production activities of electronics firms. We have also...

  2. Explorations in statistics: the analysis of ratios and normalized data.

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-09-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of Explorations in Statistics explores the analysis of ratios and normalized (or standardized) data. As researchers, we compute a ratio (a numerator divided by a denominator) to compute a proportion for some biological response or to derive some standardized variable. In each situation, we want to control for differences in the denominator when the thing we really care about is the numerator. But there is peril lurking in a ratio: only if the relationship between numerator and denominator is a straight line through the origin will the ratio be meaningful. If not, the ratio will misrepresent the true relationship between numerator and denominator. In contrast, regression techniques (these include analysis of covariance) are versatile: they can accommodate an analysis of the relationship between numerator and denominator when a ratio is useless.
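
    The peril described here can be made concrete in a few lines of Python: with a nonzero intercept, the ratio drifts with the denominator while a regression recovers the underlying relationship (the model and numbers below are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(7)
        x = rng.uniform(1.0, 5.0, 200)                # denominator, e.g. a body size
        y = 2.0 + 0.5 * x + rng.normal(0, 0.1, 200)   # numerator with nonzero intercept

        ratios = y / x
        slope, intercept = np.polyfit(x, y, 1)        # simple linear regression

        print(ratios[x < 2].mean(), ratios[x > 4].mean())  # ratio varies with x: misleading
        print(slope, intercept)                            # regression recovers ~0.5 and ~2.0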

  3. Parametric statistical change point analysis

    CERN Document Server

    Chen, Jie

    2000-01-01

    This work is an in-depth study of the change point problem from a general point of view and a further examination of change point analysis of the most commonly used statistical models. Change point problems are encountered in such disciplines as economics, finance, medicine, psychology, signal processing, and geology, to mention only several. The exposition is clear and systematic, with a great deal of introductory material included. Different models are presented in each chapter, including gamma and exponential models, rarely examined thus far in the literature. Other models covered in detail are the multivariate normal, univariate normal, regression, and discrete models. Extensive examples throughout the text emphasize key concepts, and different methodologies are used, namely the likelihood ratio criterion and the Bayesian and information criterion approaches. A comprehensive bibliography and two indices complete the study.
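
    As a concrete illustration of one methodology the book covers, the likelihood ratio criterion for a single mean shift in a normal sequence with known variance can be sketched as follows (synthetic data; a simplified example, not taken from the book):

        import numpy as np

        rng = np.random.default_rng(2)
        x = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(1.2, 1.0, 40)])
        n = len(x)

        def llr(k):
            # 2*log likelihood ratio for a mean shift after position k
            # (normal model, unit variance).
            m1, m2, m = x[:k].mean(), x[k:].mean(), x.mean()
            return k * (m1 - m) ** 2 + (n - k) * (m2 - m) ** 2

        stats = np.array([llr(k) for k in range(5, n - 5)])
        print("estimated change point:", 5 + stats.argmax())  # near true value 60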

  4. Perceptual and statistical analysis of cardiac phase and amplitude images

    International Nuclear Information System (INIS)

    Houston, A.; Craig, A.

    1991-01-01

    A perceptual experiment was conducted using cardiac phase and amplitude images. Estimates of statistical parameters were derived from the images, and the diagnostic potential of human and statistical decisions was compared. Five methods were used to generate the images from 75 gated cardiac studies, 39 of which were classified as pathological. The images were presented to 12 observers experienced in nuclear medicine. The observers rated the images using a five-category scale based on their confidence that an abnormality was present. Circular and linear statistics were used to analyse phase and amplitude image data, respectively. Estimates of the mean, standard deviation (SD), skewness, kurtosis and the first term of the spatial correlation function were evaluated in the region of the left ventricle. A receiver operating characteristic analysis was performed on both sets of data and the human and statistical decisions compared. For phase images, circular SD was shown to discriminate between normal and abnormal better than experienced observers, but no single statistic discriminated as well as the human observer for amplitude images. (orig.)
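
    The circular statistics used for the phase data can be illustrated with a short sketch (synthetic phase values; the circular SD below follows the standard resultant-length definition, not necessarily the authors' exact estimator):

        import numpy as np

        rng = np.random.default_rng(3)
        phase = rng.vonmises(mu=0.0, kappa=4.0, size=500)   # phases in radians

        z = np.exp(1j * phase).mean()
        R = np.abs(z)                        # mean resultant length
        circ_mean = np.angle(z)
        circ_sd = np.sqrt(-2.0 * np.log(R))  # circular standard deviation

        print(R, circ_mean, circ_sd)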

  5. Statistical and stochastic aspects of the delocalization problem in quantum mechanics

    International Nuclear Information System (INIS)

    Claverie, P.; Diner, S.

    1976-01-01

    The space-time behaviour of electrons in atoms and molecules is reviewed. The wave conception of the electron is criticized and the poverty of the non-reductionist attitude is underlined. Further, the two main interpretations of quantum mechanics are recalled: the Copenhagen and the Statistical Interpretations. The meaning and the successes of the Statistical Interpretation are explained, and it is shown that it does not solve all problems because quantum mechanics is irreducible to a classical statistical theory. The fluctuation of the particle number and its relationship to loge theory, delocalization and correlation is studied. Finally, different stochastic models for microphysics are reviewed. The Markovian Fényes-Nelson process allows an interpretation of the original heuristic considerations of Schroedinger. Non-Markov processes with Schroedinger time evolution are shown to be equivalent to the base state analysis of Feynman, but they are unsatisfactory from a probabilistic point of view. Stochastic electrodynamics is presented as the most satisfactory conception nowadays.

  6. Statistical analysis of the count and profitability of air conditioners.

    Science.gov (United States)

    Rady, El Houssainy A; Mohamed, Salah M; Abd Elmegaly, Alaa A

    2018-08-01

    This article presents a statistical analysis of the number and profitability of air conditioners in an Egyptian company. Whether each categorical variable follows the same distribution across groups was checked using the Kruskal-Wallis test.
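
    A minimal sketch of the Kruskal-Wallis test named above, on hypothetical sales figures for three branches (the data are invented for illustration):

        from scipy import stats

        branch_a = [12, 15, 14, 10, 13]
        branch_b = [22, 25, 19, 24, 21]
        branch_c = [11, 14, 13, 12, 15]

        # Tests whether the samples come from the same distribution.
        h, p = stats.kruskal(branch_a, branch_b, branch_c)
        print(h, p)   # a small p-value indicates the groups differ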

  7. Statistical analysis of subjective preferences for video enhancement

    Science.gov (United States)

    Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli

    2010-02-01

    Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by observers indicating their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Farrell, 1999). Thurstone (1927) scaling is widely used in applied psychology, marketing, food tasting and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels). However, Thurstone scaling does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
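
    A hedged sketch of the binary logistic regression approach to pairwise preferences (a Bradley-Terry-type formulation with synthetic comparisons; the enhancement levels and the coding are illustrative, not the authors' data):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        true_scale = np.array([0.0, 0.8, 1.5])       # latent preference scale
        pairs = [(0, 1), (0, 2), (1, 2)] * 40

        X, y = [], []
        for i, j in pairs:
            row = np.zeros(3)
            row[i], row[j] = 1.0, -1.0               # +1/-1 coding of the pair
            p_win = 1.0 / (1.0 + np.exp(-(true_scale[i] - true_scale[j])))
            X.append(row)
            y.append(rng.random() < p_win)           # True if the first item won

        model = LogisticRegression(fit_intercept=False, C=1e6)
        model.fit(np.array(X), np.array(y))
        print(model.coef_.ravel())   # scale recovered up to an additive constant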

  8. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis.

    Science.gov (United States)

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-07-01

    A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Availability: code is available at https://github.com/aalto-ics-kepaco. Contact: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  9. Statistical analysis and Monte Carlo simulation of growing self-avoiding walks on percolation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Yuxia [Department of Physics, Wuhan University, Wuhan 430072 (China); Sang Jianping [Department of Physics, Wuhan University, Wuhan 430072 (China); Department of Physics, Jianghan University, Wuhan 430056 (China); Zou Xianwu [Department of Physics, Wuhan University, Wuhan 430072 (China)]. E-mail: xwzou@whu.edu.cn; Jin Zhunzhi [Department of Physics, Wuhan University, Wuhan 430072 (China)

    2005-09-26

    The two-dimensional growing self-avoiding walk on percolation was investigated by statistical analysis and Monte Carlo simulation. We obtained the expression of the mean square displacement and effective exponent as functions of time and percolation probability by statistical analysis and made a comparison with simulations. We got a reduced time to scale the motion of walkers in growing self-avoiding walks on regular and percolation lattices.

  10. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  11. α -induced reactions on 115In: Cross section measurements and statistical model analysis

    Science.gov (United States)

    Kiss, G. G.; Szücs, T.; Mohr, P.; Török, Zs.; Huszánk, R.; Gyürky, Gy.; Fülöp, Zs.

    2018-05-01

    Background: α-nucleus optical potentials are basic ingredients of statistical model calculations used in nucleosynthesis simulations. While the nucleon+nucleus optical potential is fairly well known, for the α+nucleus optical potential several different parameter sets exist and large deviations, reaching sometimes even an order of magnitude, are found between the cross section predictions calculated using different parameter sets. Purpose: A measurement of the radiative α-capture and the α-induced reaction cross sections on the nucleus 115In at low energies allows a stringent test of statistical model predictions. Since experimental data are scarce in this mass region, this measurement can be an important input to test the global applicability of α+nucleus optical model potentials and further ingredients of the statistical model. Methods: The reaction cross sections were measured by means of the activation method. The produced activities were determined by off-line detection of the γ rays and characteristic x rays emitted during the electron capture decay of the produced Sb isotopes. The 115In(α,γ)119Sb and 115In(α,n)118mSb reaction cross sections were measured between E_c.m. = 8.83 and 15.58 MeV, and the 115In(α,n)118gSb reaction was studied between E_c.m. = 11.10 and 15.58 MeV. The theoretical analysis was performed within the statistical model. Results: The simultaneous measurement of the (α,γ) and (α,n) cross sections allowed us to determine a best-fit combination of all parameters for the statistical model. The α+nucleus optical potential is identified as the most important input for the statistical model. The best fit is obtained for the new Atomki-V1 potential, and good reproduction of the experimental data is also achieved for the first version of the Demetriou potentials and the simple McFadden-Satchler potential. The nucleon optical potential, the γ-ray strength function, and the level density parametrization are also

  12. A method for statistical steady state thermal analysis of reactor cores

    International Nuclear Information System (INIS)

    Whetton, P.A.

    1981-01-01

    In a previous publication the author presented a method for undertaking statistical steady state thermal analyses of reactor cores. The present paper extends the technique to an assessment of confidence limits for the resulting probability functions which define the probability that a given thermal response value will be exceeded in a reactor core. Establishing such confidence limits is considered an integral part of any statistical thermal analysis and essential if such analyses are to be considered in any regulatory process. In certain applications the use of a best estimate probability function may be justifiable, but it is recognised that a demonstrably conservative probability function is required for any regulatory considerations. (orig.)
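
    The core quantity of such an analysis, the probability that a thermal response exceeds a limit, together with confidence limits on that probability, can be sketched as follows (synthetic Monte Carlo responses; a bootstrap interval stands in for the paper's analytical treatment):

        import numpy as np

        rng = np.random.default_rng(5)
        response = rng.normal(600.0, 15.0, 1000)   # hypothetical responses (K)
        limit = 630.0

        p_hat = (response > limit).mean()

        # Bootstrap confidence limits on the exceedance probability.
        boot = np.array([(rng.choice(response, response.size) > limit).mean()
                         for _ in range(2000)])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(p_hat, lo, hi)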

  13. A statistical test for outlier identification in data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Morteza Khodabin

    2010-09-01

    In the use of peer group data to assess individual, typical or best-practice performance, the effective detection of outliers is critical for achieving useful results. For these 'deterministic' frontier models, statistical theory is now mostly available. This paper deals with the statistical pared-sample method and its capability of detecting outliers in data envelopment analysis. In the presented method, each observation is deleted from the sample once and the resulting linear program is solved, leading to a distribution of efficiency estimates. Based on the resulting distribution, a pared test is designed to identify the potential outlier(s). We illustrate the method on a real data set. The method could be used as a first step, as an exploratory data analysis, before any frontier estimation.

  14. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

    Science.gov (United States)

    Konrad, T. G.; Kropfli, R. A.

    1975-01-01

    Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

  15. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    Science.gov (United States)

    Glascock, M. D.; Neff, H.; Vaughn, K. J.

    2004-06-01

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  16. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    International Nuclear Information System (INIS)

    Glascock, M. D.; Neff, H.; Vaughn, K. J.

    2004-01-01

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  17. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    Energy Technology Data Exchange (ETDEWEB)

    Glascock, M. D.; Neff, H. [University of Missouri, Research Reactor Center (United States); Vaughn, K. J. [Pacific Lutheran University, Department of Anthropology (United States)

    2004-06-15

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  18. Statistical analysis and data management

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship was found between the variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine.

  19. Detecting errors in micro and trace analysis by using statistics

    DEFF Research Database (Denmark)

    Heydorn, K.

    1993-01-01

    By assigning a standard deviation to each step in an analytical method it is possible to predict the standard deviation of each analytical result obtained by this method. If the actual variability of replicate analytical results agrees with the expected, the analytical method is said to be in statistical control. Significant deviations between analytical results from different laboratories reveal the presence of systematic errors, and agreement between different laboratories indicates the absence of systematic errors. This statistical approach, referred to as the analysis of precision, was applied...
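
    A minimal sketch of the analysis of precision described above (invented per-step standard deviations and replicate results; the chi-squared comparison of observed versus predicted variability is a standard formulation):

        import numpy as np
        from scipy import stats

        step_sd = np.array([0.8, 0.5, 1.1])   # per-step SDs, combined in quadrature
        sd_pred = np.sqrt((step_sd ** 2).sum())

        replicates = np.array([102.1, 99.8, 101.5, 100.2, 98.9, 101.0])
        n = replicates.size

        # (n-1)*s^2/sigma^2 ~ chi2(n-1) if the method is in statistical control.
        T = (n - 1) * replicates.var(ddof=1) / sd_pred ** 2
        print(sd_pred, replicates.std(ddof=1), stats.chi2.sf(T, n - 1))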

  20. Statistical 3D damage accumulation model for ion implant simulators

    CERN Document Server

    Hernandez-Mangas, J M; Enriquez, L E; Bailon, L; Barbolla, J; Jaraiz, M

    2003-01-01

    A statistical 3D damage accumulation model, based on the modified Kinchin-Pease formula, for ion implant simulation has been included in our physically based ion implantation code. It has only one fitting parameter for electronic stopping and uses 3D electron density distributions for different types of targets including compound semiconductors. Also, a statistical noise reduction mechanism based on the dose division is used. The model has been adapted to be run under parallel execution in order to speed up the calculation in 3D structures. Sequential ion implantation has been modelled including previous damage profiles. It can also simulate the implantation of molecular and cluster projectiles. Comparisons of simulated doping profiles with experimental SIMS profiles are presented. Also comparisons between simulated amorphization and experimental RBS profiles are shown. An analysis of sequential versus parallel processing is provided.
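
    The modified Kinchin-Pease relation on which the damage model is based can be written as a short function (the standard NRT form; the displacement threshold below is an assumed, material-dependent value, and the paper's statistical 3D accumulation around this formula is not reproduced):

        def kinchin_pease_displacements(damage_energy_ev, e_d_ev=15.0):
            # Modified Kinchin-Pease (NRT) estimate of the number of atoms
            # displaced by a cascade depositing the given damage energy.
            if damage_energy_ev < e_d_ev:
                return 0.0
            if damage_energy_ev < 2.5 * e_d_ev:
                return 1.0
            return 0.8 * damage_energy_ev / (2.0 * e_d_ev)

        print(kinchin_pease_displacements(1000.0))   # ~26.7 displaced atoms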

  1. Statistical 3D damage accumulation model for ion implant simulators

    International Nuclear Information System (INIS)

    Hernandez-Mangas, J.M.; Lazaro, J.; Enriquez, L.; Bailon, L.; Barbolla, J.; Jaraiz, M.

    2003-01-01

    A statistical 3D damage accumulation model, based on the modified Kinchin-Pease formula, for ion implant simulation has been included in our physically based ion implantation code. It has only one fitting parameter for electronic stopping and uses 3D electron density distributions for different types of targets including compound semiconductors. Also, a statistical noise reduction mechanism based on the dose division is used. The model has been adapted to be run under parallel execution in order to speed up the calculation in 3D structures. Sequential ion implantation has been modelled including previous damage profiles. It can also simulate the implantation of molecular and cluster projectiles. Comparisons of simulated doping profiles with experimental SIMS profiles are presented. Also comparisons between simulated amorphization and experimental RBS profiles are shown. An analysis of sequential versus parallel processing is provided

  2. Statistical analysis of the BOIL program in RSYST-III

    International Nuclear Information System (INIS)

    Beck, W.; Hausch, H.J.

    1978-11-01

    The paper describes a statistical analysis in the RSYST-III program system. Using the example of the BOIL program, it is shown how the effects of inaccurate input data on the output data can be discovered. The existing possibilities of data generation, data handling, and data evaluation are outlined. (orig.)

  3. Multivariate statistical analysis of precipitation chemistry in Northwestern Spain

    International Nuclear Information System (INIS)

    Prada-Sanchez, J.M.; Garcia-Jurado, I.; Gonzalez-Manteiga, W.; Fiestras-Janeiro, M.G.; Espada-Rios, M.I.; Lucas-Dominguez, T.

    1993-01-01

    149 samples of rainwater were collected in the proximity of a power station in northwestern Spain at three rainwater monitoring stations. The resulting data are analyzed using multivariate statistical techniques. Firstly, the Principal Component Analysis shows that there are three main sources of pollution in the area (a marine source, a rural source and an acid source). The impact from pollution from these sources on the immediate environment of the stations is studied using Factorial Discriminant Analysis. 8 refs., 7 figs., 11 tabs
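
    A hedged sketch of the first analysis step described above, principal component analysis on standardized ion concentrations (the data here are synthetic placeholders; real rainwater chemistry would replace them):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(6)
        # 149 samples x 6 hypothetical species (e.g. Na+, Cl-, Ca2+, NH4+, SO4, NO3)
        X = rng.lognormal(mean=0.0, sigma=0.5, size=(149, 6))

        pca = PCA(n_components=3).fit(StandardScaler().fit_transform(X))
        print(pca.explained_variance_ratio_)   # share of variance per component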

  4. Multivariate statistical analysis of precipitation chemistry in Northwestern Spain

    Energy Technology Data Exchange (ETDEWEB)

    Prada-Sanchez, J.M.; Garcia-Jurado, I.; Gonzalez-Manteiga, W.; Fiestras-Janeiro, M.G.; Espada-Rios, M.I.; Lucas-Dominguez, T. (University of Santiago, Santiago (Spain). Faculty of Mathematics, Dept. of Statistics and Operations Research)

    1993-07-01

    149 samples of rainwater were collected in the proximity of a power station in northwestern Spain at three rainwater monitoring stations. The resulting data are analyzed using multivariate statistical techniques. Firstly, the Principal Component Analysis shows that there are three main sources of pollution in the area (a marine source, a rural source and an acid source). The impact from pollution from these sources on the immediate environment of the stations is studied using Factorial Discriminant Analysis. 8 refs., 7 figs., 11 tabs.

  5. Determination of electron beam polarization using electron detector in Compton polarimeter with less than 1% statistical and systematic uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Narayan, Amrendra [Mississippi State Univ., Mississippi State, MS (United States)

    2015-05-01

    The Q-weak experiment aims to measure the weak charge of the proton with a precision of 4.2%. The proposed precision on the weak charge required a 2.5% measurement of the parity violating asymmetry in elastic electron-proton scattering. Polarimetry was the largest experimental contribution to this uncertainty, and a new Compton polarimeter was installed in Hall C at Jefferson Lab to make the goal achievable. In this polarimeter the electron beam collides with green laser light in a low-gain Fabry-Perot cavity; the scattered electrons are detected in 4 planes of a novel diamond micro-strip detector while the back-scattered photons are detected in lead tungstate crystals. This diamond micro-strip detector is the first such device to be used as a tracking detector in a nuclear and particle physics experiment. The diamond detectors are read out using custom built electronic modules that include a preamplifier, a pulse shaping amplifier and a discriminator for each detector micro-strip. We use field programmable gate array based general purpose logic modules for event selection and histogramming. Extensive Monte Carlo simulations and data acquisition simulations were performed to estimate the systematic uncertainties. Additionally, the Moller and Compton polarimeters were cross calibrated at low electron beam currents using a series of interleaved measurements. In this dissertation, we describe all the subsystems of the Compton polarimeter with emphasis on the electron detector. We focus on the FPGA based data acquisition system built by the author and the data analysis methods implemented by the author. The simulations of the data acquisition and the polarimeter that helped rigorously establish the systematic uncertainties of the polarimeter are also elaborated, resulting in the first sub-1% measurement of low energy (~1 GeV) electron beam polarization with a Compton electron detector. We have demonstrated that diamond based micro-strip detectors can be used for tracking in a

  6. SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series

    Science.gov (United States)

    Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory

    2018-03-07

    This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
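
    The 7Q10 statistic mentioned above can be approximated empirically as follows (synthetic daily flows; SWToolbox itself fits frequency distributions rather than taking a raw quantile, so this is only a rough sketch):

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(7)
        idx = pd.date_range("1990-01-01", "2019-12-31", freq="D")
        q = pd.Series(np.exp(rng.normal(3.0, 0.6, idx.size)), index=idx)

        q7 = q.rolling(7).mean()                       # 7-day moving average
        annual_min7 = q7.groupby(q7.index.year).min()  # annual 7-day low flows

        # A 10-year recurrence of a low flow = 0.1 non-exceedance probability.
        print("7Q10 (empirical):", np.quantile(annual_min7.dropna(), 0.10))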

  7. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    Science.gov (United States)

    2011-01-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932

  8. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    Science.gov (United States)

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  9. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    Directory of Open Access Journals (Sweden)

    Sergis Antonis

    2011-01-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  10. A standards-based method for compositional analysis by energy dispersive X-ray spectrometry using multivariate statistical analysis: application to multicomponent alloys.

    Science.gov (United States)

    Rathi, Monika; Ahrenkiel, S P; Carapella, J J; Wanlass, M W

    2013-02-01

    Given an unknown multicomponent alloy, and a set of standard compounds or alloys of known composition, can one improve upon popular standards-based methods for energy dispersive X-ray (EDX) spectrometry to quantify the elemental composition of the unknown specimen? A method is presented here for determining elemental composition of alloys using transmission electron microscopy-based EDX with appropriate standards. The method begins with a discrete set of related reference standards of known composition, applies multivariate statistical analysis to those spectra, and evaluates the compositions with a linear matrix algebra method to relate the spectra to elemental composition. By using associated standards, only limited assumptions about the physical origins of the EDX spectra are needed. Spectral absorption corrections can be performed by providing an estimate of the foil thickness of one or more reference standards. The technique was applied to III-V multicomponent alloy thin films: composition and foil thickness were determined for various III-V alloys. The results were then validated by comparing with X-ray diffraction and photoluminescence analysis, demonstrating accuracy of approximately 1% in atomic fraction.
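
    A much-simplified sketch of the standards-based idea (synthetic spectra; the paper's multivariate statistical analysis and absorption corrections are not reproduced): learn a linear map from reference spectra with known compositions, then apply it to an unknown.

        import numpy as np

        rng = np.random.default_rng(8)
        n_std, n_chan = 6, 1024
        profiles = rng.random((3, n_chan))                # pure-element responses
        C_known = rng.dirichlet(np.ones(3), size=n_std)   # standards' compositions
        S = C_known @ profiles + rng.normal(0, 0.01, (n_std, n_chan))

        # Least-squares linear relation: spectrum -> atomic fractions.
        B, *_ = np.linalg.lstsq(S, C_known, rcond=None)

        unknown = np.array([0.5, 0.3, 0.2]) @ profiles    # unknown's spectrum
        print(unknown @ B)                                # ~ [0.5, 0.3, 0.2]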

  11. Modeling of asphalt-rubber rotational viscosity by statistical analysis and neural networks

    Directory of Open Access Journals (Sweden)

    Luciano Pivoto Specht

    2007-03-01

    Knowledge of binder viscosity is of great importance for the handling, mixing and application processes and for asphalt mix compaction in highway surfacing. This paper presents the results of viscosity measurements on asphalt-rubber binders prepared in the laboratory. The binders were prepared varying the rubber content, rubber particle size, and duration and temperature of mixing, all following a statistical design plan. Statistical analysis and artificial neural networks were used to create mathematical models for predicting the binders' viscosity. The comparison between experimental data and results simulated with the generated models showed better performance of the neural network analysis than of the statistical models. The results indicated that the rubber content and duration of mixing have a major influence on the observed viscosity over the considered interval of parameter variation.

  12. A theory of electron baths: One-electron system dynamics

    International Nuclear Information System (INIS)

    McDowell, H.K.

    1992-01-01

    The second-quantized, many-electron, atomic, and molecular Hamiltonian is partitioned both by the identity or labeling of the spin orbitals and by the dynamics of the spin orbitals into a system coupled to a bath. The electron bath is treated by a molecular time scale generalized Langevin equation approach designed to include one-electron dynamics in the system dynamics. The bath is formulated as an equivalent chain of spin orbitals through the introduction of equivalent-chain annihilation and creation operators. Both the dynamics and the quantum grand canonical statistical properties of the electron bath are examined. Two versions for the statistical properties of the bath are pursued. Using a weak bath assumption, a bath statistical average is defined which allows one to achieve a reduced dynamics description of the electron system which is coupled to the electron bath. In a strong bath assumption effective Hamiltonians are obtained which reproduce the dynamics of the bath and which lead to the same results as found in the weak bath assumption. The effective (but exact) Hamiltonian is found to be a one-electron Hamiltonian. A reduced dynamics equation of motion for the system population matrix is derived and found to agree with a previous version. This equation of motion is useful for studying electron transfer in the system when coupled to an electron bath

  13. Common pitfalls in statistical analysis: Odds versus risk

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh; Pramesh, C. S.

    2015-01-01

    In biomedical research, we are often interested in quantifying the relationship between an exposure and an outcome. “Odds” and “Risk” are the most common terms which are used as measures of association between variables. In this article, which is the fourth in the series of common pitfalls in statistical analysis, we explain the meaning of risk and odds and the difference between the two. PMID:26623395
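
    The distinction can be made concrete with a small 2x2 table (invented counts):

        # Hypothetical counts: (outcome present, outcome absent)
        exposed = (30, 70)
        unexposed = (10, 90)

        risk_exposed = exposed[0] / sum(exposed)          # 30/100 = 0.30
        risk_unexposed = unexposed[0] / sum(unexposed)    # 10/100 = 0.10
        odds_exposed = exposed[0] / exposed[1]            # 30/70
        odds_unexposed = unexposed[0] / unexposed[1]      # 10/90

        print("relative risk:", risk_exposed / risk_unexposed)   # 3.0
        print("odds ratio:  ", odds_exposed / odds_unexposed)    # ~3.86, not the same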

  14. A statistically harmonized alignment-classification in image space enables accurate and robust alignment of noisy images in single particle analysis.

    Science.gov (United States)

    Kawata, Masaaki; Sato, Chikara

    2007-06-01

    In determining the three-dimensional (3D) structure of macromolecular assemblies in single particle analysis, a large representative dataset of two-dimensional (2D) average images from a huge number of raw images is a key for high resolution. Because alignments prior to averaging are computationally intensive, currently available multireference alignment (MRA) software does not survey every possible alignment. This leads to misaligned images, creating blurred averages and reducing the quality of the final 3D reconstruction. We present a new method, in which multireference alignment is harmonized with classification (multireference multiple alignment: MRMA). This method enables a statistical comparison of multiple alignment peaks, reflecting the similarities between each raw image and a set of reference images. Among the selected alignment candidates for each raw image, misaligned images are statistically excluded, based on the principle that aligned raw images of similar projections have a dense distribution around the correctly aligned coordinates in image space. This newly developed method was examined for accuracy and speed using model image sets with various signal-to-noise ratios, and with electron microscope images of the Transient Receptor Potential C3 and the sodium channel. In every data set, the newly developed method outperformed conventional methods in robustness against noise and in speed, creating 2D average images of higher quality. This statistically harmonized alignment-classification combination should greatly improve the quality of single particle analysis.

  15. Analysis of Variance with Summary Statistics in Microsoft® Excel®

    Science.gov (United States)

    Larson, David A.; Hsu, Ko-Cheng

    2010-01-01

    Students regularly are asked to solve Single Factor Analysis of Variance problems given only the sample summary statistics (number of observations per category, category means, and corresponding category standard deviations). Most undergraduate students today use Excel for data analysis of this type. However, Excel, like all other statistical…
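
    The computation the article describes can be reproduced outside Excel from the summary statistics alone; a sketch with invented group summaries (n, mean, SD):

        from scipy import stats

        groups = [(12, 5.1, 1.2), (15, 6.3, 1.0), (11, 4.8, 1.4)]

        N = sum(n for n, _, _ in groups)
        k = len(groups)
        grand = sum(n * m for n, m, _ in groups) / N
        ss_between = sum(n * (m - grand) ** 2 for n, m, _ in groups)
        ss_within = sum((n - 1) * s ** 2 for n, _, s in groups)

        F = (ss_between / (k - 1)) / (ss_within / (N - k))
        print(F, stats.f.sf(F, k - 1, N - k))   # F statistic and p-value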

  16. The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-09-01

    The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. This will minimise analytical bias and

  17. Statistical Analysis Of Tank 19F Floor Sample Results

    International Nuclear Information System (INIS)

    Harris, S.

    2010-01-01

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis samples results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by an UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
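
    The UCL95% described above is a standard one-sided upper confidence limit on the mean; a sketch with invented concentrations (the report's actual values are not reproduced):

        import numpy as np
        from scipy import stats

        conc = np.array([0.82, 0.91, 0.78, 0.88, 0.95, 0.80])  # hypothetical results
        n = conc.size

        # UCL95% = mean + t(0.95, n-1) * s / sqrt(n)
        ucl95 = conc.mean() + stats.t.ppf(0.95, n - 1) * conc.std(ddof=1) / np.sqrt(n)
        print(conc.mean(), ucl95)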

  18. Vector-field statistics for the analysis of time varying clinical gait data.

    Science.gov (United States)

    Donnelly, C J; Alexander, C; Pataky, T C; Stannage, K; Reid, S; Robinson, M A

    2017-01-01

    In clinical settings, the time varying analysis of gait data relies heavily on the experience of the individual(s) assessing these biological signals. Though three dimensional kinematics are recognised as time varying waveforms (1D), exploratory statistical analysis of these data are commonly carried out with multiple discrete or 0D dependent variables. In the absence of an a priori 0D hypothesis, clinicians are at risk of making type I and II errors in their analysis of time varying gait signatures in the event statistics are used in concert with preferred subjective clinical assessment methods. The aim of this communication was to determine if vector field waveform statistics were capable of providing quantitative corroboration to practically significant differences in time varying gait signatures as determined by two clinically trained gait experts. The case study was a left hemiplegic Cerebral Palsy (GMFCS I) gait patient following a botulinum toxin (BoNT-A) injection to their left gastrocnemius muscle. When comparing subjective clinical gait assessments between two testers, they were in agreement with each other for 61% of the joint degrees of freedom and phases of motion analysed. For tester 1 and tester 2, they were in agreement with the vector-field analysis for 78% and 53% of the kinematic variables analysed. When the subjective analyses of tester 1 and tester 2 were pooled together and then compared to the vector-field analysis, they were in agreement for 83% of the time varying kinematic variables analysed. These outcomes demonstrate that, in principle, vector-field statistics corroborate what a team of clinical gait experts would classify as practically meaningful pre- versus post time varying kinematic differences. The potential for vector-field statistics to be used as a useful clinical tool for the objective analysis of time varying clinical gait data is established. Future research is recommended to assess the usefulness of vector-field analyses
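
    A numpy-only sketch of a 1D waveform comparison via pointwise t-statistics with a permutation-based critical threshold (this stands in for the SPM-style vector-field analysis implied above; the gait-like waveforms are synthetic):

        import numpy as np

        rng = np.random.default_rng(11)
        t = np.linspace(0, 1, 101)                     # % gait cycle
        pre = np.sin(2 * np.pi * t) + rng.normal(0, 0.3, (10, 101))
        post = np.sin(2 * np.pi * t) + 0.5 * (t > 0.6) + rng.normal(0, 0.3, (10, 101))

        def tstat(a, b):
            # pointwise two-sample t-statistic along the waveform
            na, nb = len(a), len(b)
            sp = np.sqrt(((na - 1) * a.var(0, ddof=1) + (nb - 1) * b.var(0, ddof=1))
                         / (na + nb - 2))
            return (a.mean(0) - b.mean(0)) / (sp * np.sqrt(1 / na + 1 / nb))

        obs = tstat(pre, post)
        both = np.vstack([pre, post])
        maxs = []
        for _ in range(1000):                          # permutation null of max |t|
            perm = rng.permutation(20)
            maxs.append(np.abs(tstat(both[perm[:10]], both[perm[10:]])).max())
        thresh = np.quantile(maxs, 0.95)
        print("suprathreshold nodes:", np.where(np.abs(obs) > thresh)[0])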

  19. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    Science.gov (United States)

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    The aim was to implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistics, time-series analysis and multivariate regression analysis were implemented on top of the database software using SQL and visualization tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connections to the database; and generates export sheets that can be passed directly to R, SAS and SPSS. The information system for air pollution and health impact monitoring thus implements the online statistical analysis function and can provide real-time analysis results to its users.

  20. Introduction to statistics and data analysis with exercises, solutions and applications in R

    CERN Document Server

    Heumann, Christian; Shalabh

    2016-01-01

    This introductory statistics textbook conveys the essential concepts and tools needed to develop and nurture statistical thinking. It presents descriptive, inductive and explorative statistical methods and guides the reader through the process of quantitative data analysis. In the experimental sciences and interdisciplinary research, data analysis has become an integral part of any scientific study. Issues such as judging the credibility of data, analyzing the data, evaluating the reliability of the obtained results and finally drawing the correct and appropriate conclusions from the results are vital. The text is primarily intended for undergraduate students in disciplines like business administration, the social sciences, medicine, politics, macroeconomics, etc. It features a wealth of examples, exercises and solutions with computer code in the statistical programming language R as well as supplementary material that will enable the reader to quickly adapt all methods to their own applications.

  1. Methodology of comparative statistical analysis of Russian industry based on cluster analysis

    Directory of Open Access Journals (Sweden)

    Sergey S. Shishulin

    2017-01-01

    The article is devoted to the possibilities of applying multidimensional statistical analysis to the study of industrial production, on the basis of comparing its growth rates and structure with those of other developed and developing countries of the world. The purpose of this article is to determine the optimal set of statistical methods and to present the results of their application to industrial production data. The data include indicators such as output, gross value added, the number of employed, and other indicators from the system of national accounts and operational business statistics. The objects of observation are the industries of the countries of the Customs Union, the United States, Japan and Europe in 2005-2015. The research tools range from simple transformations and graphical and tabular visualization of data to methods of multivariate statistical analysis. In particular, based on a specialized software package (SPSS), the principal components method, discriminant analysis, hierarchical cluster analysis (Ward's method) and k-means were applied. The application of the principal components method to the initial data makes it possible to substantially and effectively reduce the initial space of industrial production data. For example, in analyzing the structure of industrial production, fifteen industries were reduced to three basic, well-interpreted factors: relatively extractive industries (with a low degree of processing), high-tech industries, and consumer goods (medium-technology sectors). At the same time, a comparison of the results of cluster analysis applied to the initial data and to the data obtained with the principal components method established that clustering industrial production data on the basis of the new factors significantly improves the clustering results. As a result of analyzing the parameters of
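
    A hedged sketch of the dimension-reduction-then-clustering pipeline the article describes (synthetic industry indicators stand in for the national accounts data; SPSS is replaced with Python for illustration):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(9)
        X = rng.random((40, 15))   # 40 country-years x 15 branch indicators

        Z = StandardScaler().fit_transform(X)
        scores = PCA(n_components=3).fit_transform(Z)   # 15 branches -> 3 factors
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
        print(np.bincount(labels))   # cluster sizes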

  2. Data analysis for radiological characterisation: Geostatistical and statistical complementarity

    International Nuclear Information System (INIS)

    Desnoyers, Yvon; Dubot, Didier

    2012-01-01

    Radiological characterisation may cover a large range of evaluation objectives during a decommissioning and dismantling (D and D) project: removal of doubt, delineation of contaminated materials, monitoring of the decontamination work and final survey. At each stage, collecting relevant data to be able to draw the conclusions needed is a considerable challenge. In particular, two radiological characterisation stages require an advanced sampling process and data analysis, namely the initial categorization and optimisation of the materials to be removed and the final survey to demonstrate compliance with clearance levels. On the one hand the latter is widely used and well developed in national guides and norms, using random sampling designs and statistical data analysis. On the other hand a more complex evaluation methodology has to be implemented for the initial radiological characterisation, both for sampling design and for data analysis. The geostatistical framework is an efficient way to satisfy the radiological characterisation requirements, providing a sound decision-making approach for the decommissioning and dismantling of nuclear premises. The relevance of the geostatistical methodology relies on the presence of spatial continuity in the radiological contamination. Geostatistics thus provides reliable methods for activity estimation, uncertainty quantification and risk analysis, leading to a sound classification of radiological waste (surfaces and volumes). This way, the radiological characterization of contaminated premises can be divided into three steps. First, an exhaustive facility analysis provides historical and qualitative information. Then, a systematic (exhaustive or not) surface survey of the contamination is implemented on a regular grid. Finally, in order to assess activity levels and contamination depths, destructive samples are collected at several locations within the premises (based on the surface survey results) and analysed. Combined with
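
    An empirical semivariogram, the basic tool behind the spatial-continuity assumption above, can be sketched with plain numpy (synthetic 1D survey points; a real study would use dedicated geostatistics software and 2D/3D data):

        import numpy as np

        rng = np.random.default_rng(12)
        x = np.sort(rng.uniform(0, 100, 200))            # measurement locations (m)
        z = np.sin(x / 10.0) + rng.normal(0, 0.2, 200)   # activity-like signal

        d = np.abs(x[:, None] - x[None, :])              # pairwise separations
        gamma = []
        for h in np.arange(1, 20):
            pairs = np.abs(d - h) < 0.5                  # pairs separated by ~h
            dz = (z[:, None] - z[None, :])[pairs]
            gamma.append(0.5 * np.mean(dz ** 2))         # semivariance at lag h
        print(np.round(gamma, 3))   # rises with lag, then levels off at the sill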

  3. Multivariate statistical pattern recognition system for reactor noise analysis

    International Nuclear Information System (INIS)

    Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.

    1976-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system

  4. Multivariate statistical pattern recognition system for reactor noise analysis

    International Nuclear Information System (INIS)

    Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.

    1975-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system. 19 references

  5. RESEARCH OF THE DATA BANK OF STATISTICAL ANALYSIS OF THE ADVERTISING MARKET

    Directory of Open Access Journals (Sweden)

    Ekaterina F. Devochkina

    2014-01-01

    The article describes the process of compiling statistical accounts of the Russian advertising market. The author reviews the forms of state statistical accounting used in different years, noting their distinctive features and shortcomings. The article also contains an analysis of alternative sources of numerical information on the Russian advertising market.

  6. Assessment of the surface chemistry of carbon blacks by TGA-MS, XPS and inverse gas chromatography using statistical chemometric analysis

    International Nuclear Information System (INIS)

    Strzemiecka, Beata; Voelkel, Adam; Donate-Robles, Jessica; Martín-Martínez, José Miguel

    2014-01-01

    Highlights: • Carbon blacks with lower specific surface area had basic (electron donor) character due to C=O and C-O groups. • Carbon blacks with higher specific surface area had acidic (electron acceptor) character due to OH groups. • Total surface energy and its dispersive component of the carbon blacks increased by increasing their specific surface area. - Abstract: Four carbon blacks with different specific surface areas and surface chemistries (C32, C71, C159 and C178) were analyzed by transmission electron microscopy (TEM) and nitrogen adsorption isotherms at 77 K. Their surface chemistries were analyzed by X-ray photoelectron spectroscopy (XPS), thermal gravimetric analysis coupled with mass spectrometry (TGA-MS) and inverse gas chromatography (IGC). The carbon blacks contained 2.7–5.8 wt% volatiles corresponding to -OH, C-O, C=O and COO groups. The surface chemistry parameters obtained with the different experimental techniques were inter-related using chemometric statistical analysis tools. The application of this methodology showed that the carbon blacks with lower specific surface area (C32 and C71) had basic (electron donor) character mainly due to C=O and C-O groups, whereas the carbon black with the highest specific surface area (C178) showed acidic (electron acceptor) character due to its high content of OH groups. Moreover, the total surface energy and the dispersive component of the surface energy of the carbon blacks increased with the increase of their specific surface area. In general the specific interactions of the carbon blacks also increased with the increase of their specific surface area, although C71 is an exception due to its higher oxygen content corresponding to C-O groups.

  7. Statistical Analysis for High-Dimensional Data : The Abel Symposium 2014

    CERN Document Server

    Bühlmann, Peter; Glad, Ingrid; Langaas, Mette; Richardson, Sylvia; Vannucci, Marina

    2016-01-01

    This book features research contributions from The Abel Symposium on Statistical Analysis for High Dimensional Data, held in Nyvågar, Lofoten, Norway, in May 2014. The focus of the symposium was on statistical and machine learning methodologies specifically developed for inference in “big data” situations, with particular reference to genomic applications. The contributors, who are among the most prominent researchers on the theory of statistics for high dimensional inference, present new theories and methods, as well as challenging applications and computational solutions. Specific themes include, among others, variable selection and screening, penalised regression, sparsity, thresholding, low dimensional structures, computational challenges, non-convex situations, learning graphical models, sparse covariance and precision matrices, semi- and non-parametric formulations, multiple testing, classification, factor models, clustering, and preselection. Highlighting cutting-edge research and casting light on...

  8. Halo statistics analysis within medium volume cosmological N-body simulation

    Directory of Open Access Journals (Sweden)

    Martinović N.

    2015-01-01

    In this paper we present a halo statistics analysis of a ΛCDM N-body cosmological simulation (from the first halo formation until z = 0). We study the mean major merger rate as a function of time, considering both per-redshift and per-Gyr dependence. For the latter we find that it scales as the well known power law (1 + z)^n, for which we obtain n = 2.4. The halo mass function and halo growth function are derived and compared with both analytical and empirical fits. We analyse halo growth throughout the entire simulation, making it possible to continuously monitor the evolution of halo number density within given mass ranges. The halo formation redshift is studied, exploring the possibility of a new simple preliminary analysis during the simulation run. Visualization of the simulation is portrayed as well. At redshifts z = 0−7 halos from the simulation have good statistics for further analysis, especially in the mass range of 10^11 − 10^14 M_⊙/h. [Project 176021: 'Visible and invisible matter in nearby galaxies: theory and observations']

  9. Statistical analysis of support thickness and particle size effects in HRTEM imaging of metal nanoparticles

    International Nuclear Information System (INIS)

    House, Stephen D.; Bonifacio, Cecile S.; Grieshaber, Ross V.; Li, Long; Zhang, Zhongfan; Ciston, Jim; Stach, Eric A.; Yang, Judith C.

    2016-01-01

    High-resolution transmission electron microscopy (HRTEM) examination of nanoparticles requires their placement on some manner of support – either TEM grid membranes or part of the material itself, as in many heterogeneous catalyst systems – but a systematic quantification of the practical imaging limits of this approach has been lacking. Here we address this issue through a statistical evaluation of how nanoparticle size and substrate thickness affect the ability to resolve structural features of interest in HRTEM images of metallic nanoparticles on common support membranes. The visibility of lattice fringes from crystalline Au nanoparticles on amorphous carbon and silicon supports of varying thickness was investigated with both conventional and aberration-corrected TEM. Over the 1–4 nm nanoparticle size range examined, the probability of successfully resolving lattice fringes differed significantly as a function both of nanoparticle size and support thickness. Statistical analysis was used to formulate guidelines for the selection of supports and to quantify the impact a given support would have on HRTEM imaging of crystalline structure. For nanoparticles ≥1 nm, aberration-correction was found to provide limited benefit for the purpose of visualizing lattice fringes; electron dose is more predictive of lattice fringe visibility than aberration correction. These results confirm that the ability to visualize lattice fringes is ultimately dependent on the signal-to-noise ratio of the HRTEM images, rather than the point-to-point resolving power of the microscope. This study provides a benchmark for HRTEM imaging of crystalline supported metal nanoparticles and is extensible to a wide variety of supports and nanostructures. - Highlights: • The impact of supports on imaging nanoparticle lattice structure is quantified. • Visualization probabilities given particle size and support thickness are estimated. • Aberration-correction provided limited benefit
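
    The paper's own estimator is not reproduced here; the sketch below only illustrates the kind of visibility-probability model the abstract describes, by fitting a logistic regression of fringe visibility on particle size and support thickness. The data, the assumed trend and all coefficients are synthetic.

```python
# Logistic model of P(fringes visible | size, thickness) on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
size_nm = rng.uniform(1.0, 4.0, 300)       # nanoparticle diameter
support_nm = rng.uniform(5.0, 40.0, 300)   # support thickness
# Assumed trend: larger particles and thinner supports -> higher visibility.
logit = -1.0 + 1.5 * size_nm - 0.12 * support_nm
visible = rng.random(300) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(np.column_stack([size_nm, support_nm]), visible)
print("coefficients (size, thickness):", model.coef_[0])
p = model.predict_proba([[2.0, 10.0]])[0, 1]
print(f"P(visible | 2 nm particle, 10 nm support) = {p:.2f}")
```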

  10. Statistical analysis of support thickness and particle size effects in HRTEM imaging of metal nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    House, Stephen D., E-mail: sdh46@pitt.edu [Chemical and Petroleum Engineering, and Physics, University of Pittsburgh, Pittsburgh, PA 15261 (United States); Bonifacio, Cecile S.; Grieshaber, Ross V.; Li, Long; Zhang, Zhongfan [Chemical and Petroleum Engineering, and Physics, University of Pittsburgh, Pittsburgh, PA 15261 (United States); Ciston, Jim [National Center of Electron Microscopy, Molecular Foundry, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stach, Eric A. [Center for Functional Nanomaterials, Brookhaven National Laboratory, Upton, NY 11973 (United States); Yang, Judith C. [Chemical and Petroleum Engineering, and Physics, University of Pittsburgh, Pittsburgh, PA 15261 (United States)

    2016-10-15

    High-resolution transmission electron microscopy (HRTEM) examination of nanoparticles requires their placement on some manner of support – either TEM grid membranes or part of the material itself, as in many heterogeneous catalyst systems – but a systematic quantification of the practical imaging limits of this approach has been lacking. Here we address this issue through a statistical evaluation of how nanoparticle size and substrate thickness affect the ability to resolve structural features of interest in HRTEM images of metallic nanoparticles on common support membranes. The visibility of lattice fringes from crystalline Au nanoparticles on amorphous carbon and silicon supports of varying thickness was investigated with both conventional and aberration-corrected TEM. Over the 1–4 nm nanoparticle size range examined, the probability of successfully resolving lattice fringes differed significantly as a function both of nanoparticle size and support thickness. Statistical analysis was used to formulate guidelines for the selection of supports and to quantify the impact a given support would have on HRTEM imaging of crystalline structure. For nanoparticles ≥1 nm, aberration-correction was found to provide limited benefit for the purpose of visualizing lattice fringes; electron dose is more predictive of lattice fringe visibility than aberration correction. These results confirm that the ability to visualize lattice fringes is ultimately dependent on the signal-to-noise ratio of the HRTEM images, rather than the point-to-point resolving power of the microscope. This study provides a benchmark for HRTEM imaging of crystalline supported metal nanoparticles and is extensible to a wide variety of supports and nanostructures. - Highlights: • The impact of supports on imaging nanoparticle lattice structure is quantified. • Visualization probabilities given particle size and support thickness are estimated. • Aberration-correction provided limited benefit

  11. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), with the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most

  12. Using time series for the statistical monitoring of spectral quality index of electron beams for clinical use

    International Nuclear Information System (INIS)

    Martinez-Luna, R. J.; Vega, J. M. de la; Vilches, M.; Guirado, D.; Zamora, L. I.; Lallena, A. M.

    2011-01-01

    Statistical process control (SPC) techniques are used to keep track of the variable that controls the stability of the spectrum of electron beams from accelerators in clinical use. In this process, applied since 1995, we obtained a high number of false alarms. Our work shows that this unexpected behavior arises from treating the variable of interest as a normal random variable, independent and identically distributed (iid), when in fact the observations of this variable are positively correlated with each other. (Author)
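
    The false-alarm mechanism reported above is easy to reproduce: an individuals control chart whose 3-sigma limits are set from the average moving range (the usual iid recipe) alarms far more often than the nominal 0.27% when the monitored variable follows a positively correlated AR(1) process. A sketch on synthetic data (the AR(1) coefficient is an assumption):

```python
# False-alarm inflation on a Shewhart individuals chart with autocorrelated data.
import numpy as np

rng = np.random.default_rng(42)
n, phi = 100_000, 0.7                 # AR(1) with positive correlation
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

mr_bar = np.abs(np.diff(x)).mean()    # average moving range
sigma_hat = mr_bar / 1.128            # d2 constant for subgroups of size 2
alarm_rate = np.mean(np.abs(x - x.mean()) > 3 * sigma_hat)
print(f"alarm rate: {alarm_rate:.3f} (nominal 0.0027 for iid data)")
```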

  13. Radiation belt seed population and its association with the relativistic electron dynamics: A statistical study

    International Nuclear Information System (INIS)

    Tang, C. L.; Wang, Y. X.; Ni, B.; Zhang, J.-C.

    2017-01-01

    Using the Van Allen Probes data, we study the radiation belt seed population and its association with relativistic electron dynamics during 74 geomagnetic storm events. Based on the flux changes of 1 MeV electrons before and after the storm peak, these storm events are divided into two groups of “non-preconditioned” and “preconditioned”. The statistical study shows that the storm intensity is of significant importance for the distribution of the seed population (336 keV electrons) in the outer radiation belt. However, substorm intensity can also be important to the evolution of the seed population for some geomagnetic storm events. For non-preconditioned storm events, the correlation between the peak fluxes and their L-shell locations of the seed population and relativistic electrons (592 keV, 1.0 MeV, 1.8 MeV, and 2.1 MeV) is consistent with the energy-dependent dynamic processes in the outer radiation belt. For preconditioned storm events, the correlation between the features of the seed population and relativistic electrons is not fully consistent with the energy-dependent processes. It is suggested that the good correlation between the radiation belt seed population and ≤1.0 MeV electrons contributes to the prediction of the evolution of ≤1.0 MeV electrons in the Earth’s outer radiation belt during periods of geomagnetic storms.

  14. Statistical dynamics of transient processes in a gas discharge plasma

    International Nuclear Information System (INIS)

    Smirnov, G.I.; Telegin, G.G.

    1991-01-01

    The properties of a gas discharge plasma to a great extent depend on random processes whose study has recently become particularly important. The present work is concerned with analyzing the statistical phenomena that occur during the prebreakdown stage in a gas discharge. Unlike other studies of breakdown in the discharge gap, in which secondary electron effects and photon processes at the electrodes must be considered, here the authors treat the case of an electrodeless rf discharge or a laser photoresonant plasma. The analysis is based on the balance between the rates of electron generation and recombination in the plasma. The fluctuation kinetics for ionization of atoms in the hot plasma may also play an important role when the electron temperature changes abruptly, as occurs during adiabatic pinching of the plasma or during electron cyclotron heating

  15. Qualitative analysis of barium particles coated in small intestinal mucosa of rabbit by using scanning electron microscopy

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Ha, Hyun Kwon; Lee, Yang Seob; Kim, Jae Kyun; Yoon, Seong Eon; Kim, Jung Hoon; Chung, Dong Jin; Auh, Yong Ho

    1998-01-01

    To qualitatively analyse the barium coating status of the intestinal mucosa, we used scanning electron microscopy to observe barium particles coated on the small intestinal mucosa of rabbit, and we attempted to assess the relationship between electron microscopic findings and radiographic densities. Six different combinations of barium and methylcellulose suspensions were infused into the resected small intestines of 15 rabbits. Barium powder was mixed with water to make 40% and 70% w/v barium suspensions, and mixtures with 0.5% methylcellulose solution were used as a double contrast agent. After the infusion of the barium suspensions, a mammography unit was used to obtain radiographs of the small intestine, and their optical densities were measured by a densitometer. Thereafter, photographs of barium-coated small intestinal mucosa were obtained using a scanning electron microscope (x 8,000), and the number of barium particles per unit area was measured. To compare the relationship between the electron microscopic findings and optical densities, statistical analysis using Spearman correlation was performed. This study shows that by using scanning electron microscopy, barium particles coated on the small intestinal mucosa can be qualitatively analysed. It also shows that the number of small barium particles measured by scanning electron microscopy is related to optical densities. (author). 14 refs., 2 figs
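
    A minimal sketch of the correlation step described above, applying Spearman's rank correlation to six hypothetical (particle count, optical density) pairs:

```python
# Spearman rank correlation between SEM particle counts and optical density.
from scipy.stats import spearmanr

particles_per_area = [120, 185, 230, 310, 405, 520]   # hypothetical counts
optical_density = [0.42, 0.55, 0.61, 0.78, 0.90, 1.05]

rho, p_value = spearmanr(particles_per_area, optical_density)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```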

  16. An 'electronic' extramural course in epidemiology and medical statistics.

    Science.gov (United States)

    Ostbye, T

    1989-03-01

    This article describes an extramural university course in epidemiology and medical statistics taught using a computer conferencing system, microcomputers and data communications. Computer conferencing was shown to be a powerful, yet quite easily mastered, vehicle for distance education. It allows health personnel unable to attend regular classes due to geographical or time constraints to take part in an interactive learning environment at low cost. This overcomes part of the intellectual and social isolation associated with traditional correspondence courses. Teaching of epidemiology and medical statistics is well suited to computer conferencing, even if the asynchronicity of the medium makes discussion of the most complex statistical concepts a little cumbersome. Computer conferencing may also prove to be a useful tool for teaching other medical and health related subjects.

  17. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.

  18. Basic statistics for nuclear electronics

    International Nuclear Information System (INIS)

    Stanchi, L.

    1981-01-01

    The paper presents many equations of general utility in the non-destructive measurement of nuclear materials. Its content is the synthesis of several lectures given in various courses on nuclear electronics and non-destructive measurements. It does not contain original derivations but provides, in a convenient format, formulae which may be useful to experimenters in their daily work. General equations on variances and standard deviations, as well as the probability of single observations, are presented. The most common frequency distributions of events (binomial, normal, poissonian and interval) are described and practical formulae are given. Particular attention is devoted to deadtime effects (even though the cascade of deadtimes is not considered) and to the regularizing effect of scalers. Finally, a short examination of ratemeters and the resolution problem in detection and amplification chains is provided.
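
    As one example of the deadtime formulae such lectures cover, the sketch below applies the standard non-paralyzable correction n = m / (1 - m·τ), which relates the measured rate m to the true rate n for a counter with dead time τ; the numbers are purely illustrative.

```python
# Non-paralyzable dead-time correction for a counting chain.
def true_rate(measured_rate_cps: float, dead_time_s: float) -> float:
    """Return the true event rate given the measured rate and dead time."""
    return measured_rate_cps / (1.0 - measured_rate_cps * dead_time_s)

m = 50_000.0      # measured counts per second (illustrative)
tau = 2e-6        # 2 microseconds dead time (illustrative)
print(f"corrected rate: {true_rate(m, tau):.0f} cps")  # about 55,556 cps
```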

  19. STATISTICAL ANALYSIS OF FLARING LOOPS OBSERVED BY NOBEYAMA RADIOHELIOGRAPH. I. COMPARISON OF LOOPTOP AND FOOTPOINTS

    International Nuclear Information System (INIS)

    Huang Guangli; Nakajima, Hiroshi

    2009-01-01

    Twenty-four events with looplike structures at 17 and 34 GHz are selected from the flare list of Nobeyama Radioheliograph. We obtained the brightness temperatures at 17 and 34 GHz, the polarization degrees at 17 GHz, and the power-law spectral indices at the radio peak time for one looptop (LT) and two footpoints (FPs) of each event. We also calculated the magnetic field strengths and the column depths of nonthermal electrons in the LT and FPs of each event, using the equations modified from the gyrosynchrotron equations by Dulk. The main statistical results from those data are summarized as follows. (1) The spectral indices, the brightness temperatures at 17 and 34 GHz, the polarization degrees at 17 GHz, the calculated magnetic field strengths, and the calculated column densities of nonthermal electrons are always positively correlated between the LT and the two FPs of the selected events. (2) About one-half of the events have the brightest LT at 17 and 34 GHz. (3) The spectral indices in the two FPs are larger (softer) than those in the corresponding LT in most events. (4) The calculated magnetic field strengths in the two FPs are always larger than those in the corresponding LT. (5) Most of the events have the same positive or negative polarization sense in the LT and the two FPs. (6) The brightness temperatures at 17 and 34 GHz in each of the LT and the two FPs statistically decrease with their spectral indices and the calculated magnetic field strengths, but increase with their calculated column densities of nonthermal electrons. Moreover, we discuss the possible causes of these statistical results.

  20. Electron angular distribution at axial channeling

    International Nuclear Information System (INIS)

    Khokonov, A.Kh.; Khokonov, M.Kh.

    1989-01-01

    Angular distributions of ultra-relativistic electrons are calculated under the assumption that statistical equilibrium is present. The analysis is based on numerical solution of a Fokker-Planck type kinetic equation. It is shown that, in contrast to the case of an amorphous medium, multiple scattering during axial channeling of negative particles results in self-focusing of the initial beam particles; owing to this, the number of electrons moving at angles to the chain smaller than the critical channeling angle may increase by several times compared to the initial one.

  1. A statistical approach to evaluate hydrocarbon remediation in the unsaturated zone

    International Nuclear Information System (INIS)

    Hajali, P.; Marshall, T.; Overman, S.

    1991-01-01

    This paper presents an evaluation of the performance and cleanup effectiveness of a vapor extraction system (VES) in extracting chlorinated hydrocarbons and petroleum-based hydrocarbons (mineral spirits) from the unsaturated zone. The statistical analysis of soil concentration data to evaluate the VES remediation success is described. The site is a former electronics refurbishing facility in southern California; soil contamination from organic solvents was found mainly in five areas (Areas A through E) beneath two buildings. The evaluation begins with a brief description of the site background, discusses the statistical approach, and presents conclusions.

  2. Short-run and Current Analysis Model in Statistics

    Directory of Open Access Journals (Sweden)

    Constantin Anghelache

    2006-01-01

    Full Text Available Using the short-run statistic indicators is a compulsory requirement implied in the current analysis. Therefore, a system of EUROSTAT short-run indicators has been set up in this respect, being recommended for utilization by the member-countries. On the basis of these indicators, regular, usually monthly, analyses are carried out in respect of: the production dynamic determination; the evaluation of the short-run investment volume; the development of the turnover; the wage evolution; the employment; the price indexes and the consumer price index (inflation); the volume of exports and imports, the extent to which the imports are covered by the exports, and the trade balance. The EUROSTAT system of indicators of conjuncture is conceived as an open system, so that it can be, at any moment, extended or restricted, allowing indicators to be amended or even removed, depending on the domestic users' requirements as well as on the specific requirements of harmonization and integration. For the short-run analysis, there is also the World Bank system of indicators of conjuncture, which relies on the data sources offered by the World Bank, the World Institute for Resources or other international organizations' statistics. The system comprises indicators of social and economic development and focuses on indicators for the following three fields: human resources, environment and economic performances. At the end of the paper, there is a case study on the situation of Romania, for which we used all these indicators.

  3. Short-run and Current Analysis Model in Statistics

    Directory of Open Access Journals (Sweden)

    Constantin Mitrut

    2006-03-01

    Full Text Available Using the short-run statistic indicators is a compulsory requirement implied in the current analysis. Therefore, a system of EUROSTAT short-run indicators has been set up in this respect, being recommended for utilization by the member-countries. On the basis of these indicators, regular, usually monthly, analyses are carried out in respect of: the production dynamic determination; the evaluation of the short-run investment volume; the development of the turnover; the wage evolution; the employment; the price indexes and the consumer price index (inflation); the volume of exports and imports, the extent to which the imports are covered by the exports, and the trade balance. The EUROSTAT system of indicators of conjuncture is conceived as an open system, so that it can be, at any moment, extended or restricted, allowing indicators to be amended or even removed, depending on the domestic users' requirements as well as on the specific requirements of harmonization and integration. For the short-run analysis, there is also the World Bank system of indicators of conjuncture, which relies on the data sources offered by the World Bank, the World Institute for Resources or other international organizations' statistics. The system comprises indicators of social and economic development and focuses on indicators for the following three fields: human resources, environment and economic performances. At the end of the paper, there is a case study on the situation of Romania, for which we used all these indicators.

  4. Statistical analysis of proteomics, metabolomics, and lipidomics data using mass spectrometry

    CERN Document Server

    Mertens, Bart

    2017-01-01

    This book presents an overview of computational and statistical design and analysis of mass spectrometry-based proteomics, metabolomics, and lipidomics data. This contributed volume provides an introduction to the special aspects of statistical design and analysis with mass spectrometry data for the new omic sciences. The text discusses common aspects of design and analysis between and across all (or most) forms of mass spectrometry, while also providing special examples of application with the most common forms of mass spectrometry. Also covered are applications of computational mass spectrometry not only in clinical study but also in the interpretation of omics data in plant biology studies. Omics research fields are expected to revolutionize biomolecular research by the ability to simultaneously profile many compounds within either patient blood, urine, tissue, or other biological samples. Mass spectrometry is one of the key analytical techniques used in these new omic sciences. Liquid chromatography mass ...

  5. Three-Dimensional Assembly Tolerance Analysis Based on the Jacobian-Torsor Statistical Model

    Directory of Open Access Journals (Sweden)

    Peng Heping

    2017-01-01

    Full Text Available The unified Jacobian-Torsor model has been developed for deterministic (worst-case) tolerance analysis. This paper presents a comprehensive model for performing statistical tolerance analysis by integrating the unified Jacobian-Torsor model and Monte Carlo simulation. In this model, an assembly is sub-divided into surfaces, and the Small Displacements Torsor (SDT) parameters are used to express the relative position between any two surfaces of the assembly. Then, a 3D dimension chain can be created by using a surface graph of the assembly, and the unified Jacobian-Torsor model is developed based on the effect of each functional element on the whole functional requirements of the product. Finally, Monte Carlo simulation is implemented for the statistical tolerance analysis. A numerical example is given to demonstrate the capability of the proposed method in handling three-dimensional assembly tolerance analysis.
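
    The Monte Carlo step can be sketched independently of the Jacobian-Torsor machinery: sample each component dimension from its tolerance distribution and propagate the samples through the assembly response. The example below uses a trivial linear stack with invented nominals and tolerances, purely to illustrate the statistical part of the method.

```python
# Monte Carlo tolerance stack-up on a hypothetical three-part linear assembly.
import numpy as np

rng = np.random.default_rng(7)
N = 100_000
# (nominal length, tolerance band half-width) in mm; 3-sigma taken equal to
# the half-width, a common normal-tolerance assumption.
parts = [(20.0, 0.05), (35.0, 0.08), (15.0, 0.04)]
gap = sum(rng.normal(nom, tol / 3.0, N) for nom, tol in parts) - 69.9

print(f"mean gap = {gap.mean():.3f} mm, std = {gap.std():.3f} mm")
print(f"P(gap < 0) = {(gap < 0).mean():.4f}")
```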

  6. SAS and R data management, statistical analysis, and graphics

    CERN Document Server

    Kleinman, Ken

    2009-01-01

    An All-in-One Resource for Using SAS and R to Carry out Common Tasks. Provides a path between languages that is easier than reading complete documentation. SAS and R: Data Management, Statistical Analysis, and Graphics presents an easy way to learn how to perform an analytical task in both SAS and R, without having to navigate through the extensive, idiosyncratic, and sometimes unwieldy software documentation. The book covers many common tasks, such as data management, descriptive summaries, inferential procedures, regression analysis, and the creation of graphics, along with more complex applications.

  7. Statistical methods for data analysis in particle physics

    CERN Document Server

    AUTHOR|(CDS)2070643

    2015-01-01

    This concise set of course-based notes provides the reader with the main concepts and tools needed to perform statistical analysis of experimental data, in particular in the field of high-energy physics (HEP). First, an introduction to probability theory and basic statistics is given, mainly as a reminder from advanced undergraduate studies, but also with a view to clearly distinguishing the Frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on upper limits, as many applications in HEP concern hypothesis testing, where often the main goal is to provide better and better limits so as to be able to distinguish eventually between competing hypotheses or to rule out some of them altogether. Many worked examples will help newcomers to the field and graduate students to understand the pitfalls in applying theoretical concepts to actual data.

  8. Statistical Analysis of 30 Years Rainfall Data: A Case Study

    Science.gov (United States)

    Arvind, G.; Ashok Kumar, P.; Girish Karthi, S.; Suribabu, C. R.

    2017-07-01

    Rainfall is a prime input for various engineering designs such as hydraulic structures, bridges and culverts, canals, storm water sewers and road drainage systems. A detailed statistical analysis of each region is essential to estimate the relevant input values for the design and analysis of engineering structures and also for crop planning. A rain gauge station located in Trichy district, where agriculture is the prime occupation, is selected for statistical analysis. The daily rainfall data for a period of 30 years are used to understand the normal rainfall, deficit rainfall, excess rainfall and seasonal rainfall of the selected circle headquarters. Further, the various plotting position formulae available are used to evaluate the return period of monthly, seasonal and annual rainfall. This analysis will provide useful information for water resources planners, farmers and urban engineers to assess the availability of water and create storage accordingly. The mean, standard deviation and coefficient of variation of monthly and annual rainfall were calculated to check the rainfall variability. From the calculated results, the rainfall pattern is found to be erratic. The best-fit probability distribution was identified based on the minimum deviation between actual and estimated values. The scientific results and the analysis paved the way to determine the proper onset and withdrawal of the monsoon, which were used for land preparation and sowing.
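
    A minimal sketch of the return-period step mentioned above, using the common Weibull plotting position T = (n + 1)/m for annual maxima ranked in descending order; the rainfall values are hypothetical:

```python
# Return periods from the Weibull plotting position on hypothetical annual maxima.
import numpy as np

annual_max_mm = np.array([112, 95, 140, 88, 123, 150, 101, 97, 135, 118])
ranked = np.sort(annual_max_mm)[::-1]          # descending order
n = ranked.size
for m, value in enumerate(ranked, start=1):
    T = (n + 1) / m                            # return period in years
    print(f"{value:5.0f} mm  ->  T = {T:4.1f} yr")
```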

  9. Security analysis of electronic voting and online banking systems

    OpenAIRE

    Tjøstheim, Thomas

    2007-01-01

    The main focus of this dissertation is on security analysis of electronic voting and online banking systems. Six papers form the basis of the thesis and include the following topics: a model for analysis of voting systems, a case study where we apply the proposed model, a new scheme for remote electronic voting, and three case studies of commercial online banking solutions in Norway.

  10. Statistical strategies to reveal potential vibrational markers for in vivo analysis by confocal Raman spectroscopy

    Science.gov (United States)

    Oliveira Mendes, Thiago de; Pinto, Liliane Pereira; Santos, Laurita dos; Tippavajhala, Vamshi Krishna; Téllez Soto, Claudio Alberto; Martin, Airton Abrahão

    2016-07-01

    The analysis of biological systems by spectroscopic techniques involves the evaluation of hundreds to thousands of variables. Hence, different statistical approaches are used to elucidate regions that discriminate classes of samples and to propose new vibrational markers for explaining various phenomena like disease monitoring, mechanisms of action of drugs, food, and so on. However, the underlying statistical techniques are not always widely discussed in the applied sciences. In this context, this work presents a detailed discussion including the various steps necessary for proper statistical analysis. It includes univariate parametric and nonparametric tests, as well as multivariate unsupervised and supervised approaches. The main objective of this study is to promote proper understanding of the application of various statistical tools in these spectroscopic methods used for the analysis of biological samples. The discussion of these methods is performed on a set of in vivo confocal Raman spectra of human skin whose analysis aims to identify skin aging markers. In the Appendix, a complete routine of data analysis is executed in free software that can be used by the scientific community involved in these studies.

  11. A method for statistical steady state thermal analysis of reactor cores

    International Nuclear Information System (INIS)

    Whetton, P.A.

    1980-01-01

    This paper presents a method for performing a statistical steady state thermal analysis of a reactor core. The technique is only outlined here since detailed thermal equations are dependent on the core geometry. The method has been applied to a pressurised water reactor core and the results are presented for illustration purposes. Random hypothetical cores are generated using the Monte-Carlo method. The technique shows that by splitting the parameters into two types, denoted core-wise and in-core, the Monte Carlo method may be used inexpensively. The idea of using extremal statistics to characterise the low probability events (i.e. the tails of a distribution) is introduced together with a method of forming the final probability distribution. After establishing an acceptable probability of exceeding a thermal design criterion, the final probability distribution may be used to determine the corresponding thermal response value. If statistical and deterministic (i.e. conservative) thermal response values are compared, information on the degree of pessimism in the deterministic method of analysis may be inferred and the restrictive performance limitations imposed by this method relieved. (orig.)

  12. Statistical analysis of the electrocatalytic activity of Pt nanoparticles supported on novel functionalized reduced graphene oxide-chitosan for methanol electrooxidation

    Science.gov (United States)

    Ekrami-Kakhki, Mehri-Saddat; Abbasi, Sedigheh; Farzaneh, Nahid

    2018-01-01

    The purpose of this study is to statistically analyze the anodic current density and peak potential of methanol oxidation at Pt nanoparticles supported on functionalized reduced graphene oxide (RGO), using design of experiments methodology. RGO is functionalized with methyl viologen (MV) and chitosan (CH). The novel Pt/MV-RGO-CH catalyst is successfully prepared and characterized by transmission electron microscopy (TEM) imaging. The electrocatalytic activity of the Pt/MV-RGO-CH catalyst is experimentally evaluated for methanol oxidation. The effects of methanol concentration and scan rate factors are also investigated experimentally and statistically. The effects of these two main factors and their interactions are investigated using the analysis of variance test, Duncan's multiple range test and the response surface method. The results of the analysis of variance show that all the main factors and their interactions have a significant effect on the anodic current density and peak potential of methanol oxidation at α = 0.05. The suggested models, which encompass the significant factors, can predict the variation of the anodic current density and peak potential of methanol oxidation. The results of Duncan's multiple range test confirm that there is a significant difference between the studied levels of the main factors.
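
    The two-factor analysis described above can be sketched as a two-way ANOVA of anodic current density against methanol concentration and scan rate, including their interaction. The factor levels and measurements below are invented placeholders, not the study's data.

```python
# Two-way ANOVA with interaction on invented current-density measurements.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

data = pd.DataFrame({
    "conc": ["0.5M"] * 4 + ["1.0M"] * 4 + ["2.0M"] * 4,   # methanol concentration
    "scan": ["20", "20", "50", "50"] * 3,                 # scan rate (mV/s)
    "j":    [12.1, 12.4, 15.0, 15.3, 18.2, 18.6,
             22.1, 22.5, 25.0, 25.4, 30.2, 30.7],         # mA/cm2, hypothetical
})
model = ols("j ~ C(conc) * C(scan)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and interaction terms
```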

  13. Statistical analysis of first period of operation of FTU Tokamak

    International Nuclear Information System (INIS)

    Crisanti, F.; Apruzzese, G.; Frigione, D.; Kroegler, H.; Lovisetto, L.; Mazzitelli, G.; Podda, S.

    1996-09-01

    On the FTU Tokamak, plasma physics operations started on 20/4/90. The first plasma had a plasma current Ip = 0.75 MA for about a second. The experimental phase lasted until 7/7/94, when a long shut-down began for installing the toroidal limiter on the inner side of the vacuum vessel. In these four years of operation, plasma experiments were successfully exploited, e.g. single and multiple pellet injection experiments; full current drive up to Ip = 300 kA was obtained by using waves at the Lower Hybrid frequency; analysis of ohmic plasma parameters with different materials (from low-Z silicon to high-Z tungsten) as the plasma-facing element was performed. In this work, a statistical analysis of the full period of operation is presented. Moreover, a comparison with statistical data from other Tokamaks is attempted.

  14. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  15. Using R and RStudio for data management, statistical analysis and graphics

    CERN Document Server

    Horton, Nicholas J

    2015-01-01

    This is the second edition of the popular book on using R for statistical analysis and graphics. The authors, who run a popular blog supplementing their books, have focused on adding many new examples to this new edition. These examples are presented primarily in new chapters based on the following themes: simulation, probability, statistics, mathematics/computing, and graphics. The authors have also added many other updates, including a discussion of RStudio-a very popular development environment for R.

  16. Statistical analysis of absorptive laser damage in dielectric thin films

    International Nuclear Information System (INIS)

    Budgor, A.B.; Luria-Budgor, K.F.

    1978-01-01

    The Weibull distribution arises as an example of the theory of extreme events. It is commonly used to fit statistical data arising in the failure analysis of electrical components and in DC breakdown of materials. This distribution is employed to analyze time-to-damage and intensity-to-damage statistics obtained when irradiating thin-film-coated samples of SiO2, ZrO2, and Al2O3 with tightly focused laser beams. The data used are furnished by Milam. The fit to the data is excellent, and least-squares correlation coefficients greater than 0.9 are often obtained.
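
    A minimal sketch of the fitting step, assuming maximum-likelihood estimation of a two-parameter Weibull distribution with scipy; the time-to-damage values are synthetic, not Milam's data:

```python
# Maximum-likelihood Weibull fit to synthetic time-to-damage data.
from scipy.stats import weibull_min

t_damage = weibull_min.rvs(c=1.8, scale=120.0, size=60, random_state=3)

shape, loc, scale = weibull_min.fit(t_damage, floc=0)   # fix location at 0
print(f"Weibull shape = {shape:.2f}, scale = {scale:.1f}")
```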

  17. Center Line Slope Analysis in Two-Dimensional Electronic Spectroscopy

    OpenAIRE

    Šanda, František; Perlík, Václav; Lincoln, Craig N.; Hauer, Jürgen

    2015-01-01

    Center line slope (CLS) analysis in 2D infrared spectroscopy has been extensively used to extract frequency-frequency correlation functions of vibrational transitions. We apply this concept to 2D electronic spectroscopy, where CLS is a measure of electronic gap fluctuations. The two domains, infrared and electronic, possess differences: In the infrared, the frequency fluctuations are classical, often slow and Gaussian. In contrast, electronic spectra are subject to fast spectral diffusion and...

  18. Statistical analysis of failure time in stress corrosion cracking of fuel tube in light water reactor

    International Nuclear Information System (INIS)

    Hirao, Keiichi; Yamane, Toshimi; Minamino, Yoritoshi

    1991-01-01

    This report shows how the stress corrosion cracking life of fuel cladding tubes is evaluated by applying statistical techniques to lives examined by a few testing methods. The statistical distribution of the limiting values of constant-load stress corrosion cracking life, the statistical analysis based on a probabilistic interpretation of constant-load stress corrosion cracking life, and the statistical analysis of stress corrosion cracking life obtained by the slow strain rate test (SSRT) method are described. (K.I.)

  19. Assessment of homogeneity of candidate reference material at the nanogram level and investigation on representativeness of single particle analysis using electron probe X ray microanalysis

    International Nuclear Information System (INIS)

    Ro, Chul-Un; Hoornaert, S.; Grieken, R. van

    2002-01-01

    Particulate samples of a candidate reference material are evaluated for their bottle-to-bottle homogeneity using the electron probe X-ray microanalysis technique. The homogeneity evaluation is done by applying Kolmogorov-Smirnov statistics to the quantitative electron probe X-ray microanalysis data. Because of limitations on analysis time and expense, existing even in computer-controlled electron probe X-ray microanalysis, the number of particles analyzed is much smaller than the number in the sample. Therefore, it is investigated whether this technique provides representative analysis results for the characteristics of the sample, even though only a very small portion of the sample is actually analyzed. Furthermore, the number of particles required for the analysis to ensure a certain level of reproducibility, e.g. 5% relative standard deviation, is determined by application of the Ingamells sampling theory. (author)
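
    The Kolmogorov-Smirnov step can be sketched as a two-sample test comparing per-particle concentrations measured from two bottles; the data below are synthetic stand-ins for the microanalysis results:

```python
# Two-sample Kolmogorov-Smirnov test on synthetic per-particle concentrations.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(11)
bottle_a = rng.normal(5.0, 1.0, 250)   # e.g. wt% of an element, bottle A
bottle_b = rng.normal(5.1, 1.0, 250)   # bottle B

stat, p_value = ks_2samp(bottle_a, bottle_b)
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")
print("no evidence of inhomogeneity" if p_value > 0.05 else "distributions differ")
```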

  20. Discrimination Method of the Volatiles from Fresh Mushrooms by an Electronic Nose Using a Trapping System and Statistical Standardization to Reduce Sensor Value Variation

    Directory of Open Access Journals (Sweden)

    Kouki Fujioka

    2013-11-01

    Full Text Available Electronic noses have the benefit of obtaining smell information in a simple and objective manner; therefore, many applications have been developed for broad analysis areas such as food, drinks, cosmetics, medicine, and agriculture. However, measurement values from electronic noses have a tendency to vary under humidity or alcohol exposure conditions, since several types of sensors in the devices are affected by such variables. Consequently, we show three techniques for reducing the variation of sensor values: (1) using a trapping system to reduce the interfering components; (2) performing statistical standardization (calculation of z-scores); and (3) selecting suitable sensors. With these techniques, we discriminated the volatiles of four types of fresh mushrooms: golden needle (Flammulina velutipes), white mushroom (Agaricus bisporus), shiitake (Lentinus edodes), and eryngii (Pleurotus eryngii), among six fresh mushrooms (hen of the woods (Grifola frondosa) and shimeji (Hypsizygus marmoreus), plus the above mushrooms). Additionally, we succeeded in discriminating white mushroom when comparing only with artificial mushroom flavors, such as champignon flavor and truffle flavor. In conclusion, our techniques will expand the options to reduce variations in sensor values.
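
    Technique (2) is straightforward to sketch: each sensor's raw responses are converted to z-scores so that sensors with different scales and drift become comparable. The 4x3 response matrix below is invented for illustration.

```python
# Per-sensor z-score standardization of an invented electronic-nose matrix.
import numpy as np

# Rows: mushroom samples; columns: electronic-nose sensors on different scales.
responses = np.array([
    [510.0, 0.82, 1300.0],
    [620.0, 0.91, 1410.0],
    [480.0, 0.78, 1275.0],
    [700.0, 1.02, 1520.0],
])
z = (responses - responses.mean(axis=0)) / responses.std(axis=0, ddof=1)
print(np.round(z, 2))   # every column now has mean 0 and unit variance
```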

  1. Implementation and statistical analysis of Metropolis algorithm for SU(3)

    International Nuclear Information System (INIS)

    Katznelson, E.; Nobile, A.

    1984-12-01

    In this paper we study the statistical properties of an implementation of the Metropolis algorithm for SU(3) gauge theory. It is shown that the results have a normal distribution. We demonstrate that in this case error analysis can be carried out in a simple way, and we show that applying it to both the measurement strategy and the output data analysis has an important influence on the performance and reliability of the simulation. (author)
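
    The SU(3) code itself is not available here, but the flavor of the output analysis can be sketched with a scalar Metropolis sampler and a simple batch-means error estimate on a standard normal target:

```python
# Scalar Metropolis sampler with a batch-means statistical error estimate.
import numpy as np

rng = np.random.default_rng(5)

def metropolis(n_steps: int, step: float = 1.0) -> np.ndarray:
    """Sample a standard normal with a symmetric uniform proposal."""
    x, out = 0.0, np.empty(n_steps)
    for i in range(n_steps):
        prop = x + rng.uniform(-step, step)
        # Accept with probability min(1, exp(-(prop^2 - x^2)/2)).
        if rng.random() < np.exp(0.5 * (x * x - prop * prop)):
            x = prop
        out[i] = x
    return out

chain = metropolis(200_000)[10_000:]          # drop burn-in
batches = chain.reshape(-1, 1_000).mean(axis=1)
err = batches.std(ddof=1) / np.sqrt(batches.size)
print(f"<x> = {chain.mean():+.4f} +/- {err:.4f}")
```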

  2. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    EI-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper deals with the analysis of some statistical distributions used in reliability, with the aim of reaching the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC). However, the Exponential distribution is found to be the best fit for modeling the failure rate.

  3. Monitoring the metering performance of an electronic voltage transformer on-line based on cyber-physics correlation analysis

    Science.gov (United States)

    Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang

    2017-10-01

    Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for the key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation running in three-phase symmetry, the principal component analysis method is used to separate the metering deviation caused by primary-side fluctuations from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in the statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates the accurate evaluation of on-line monitoring of the metering performance of an EVT without a standard voltage transformer.

  4. Monitoring the metering performance of an electronic voltage transformer on-line based on cyber-physics correlation analysis

    International Nuclear Information System (INIS)

    Zhang, Zhu; Li, Hongbin; Hu, Chen; Jiao, Yang; Tang, Dengping

    2017-01-01

    Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for the key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation running in three-phase symmetry, the principal component analysis method is used to separate the metering deviation caused by primary-side fluctuations from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in the statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates the accurate evaluation of on-line monitoring of the metering performance of an EVT without a standard voltage transformer. (paper)

  5. Statistical mechanical analysis of LMFBR fuel cladding tubes

    International Nuclear Information System (INIS)

    Poncelet, J.-P.; Pay, A.

    1977-01-01

    The most important design requirement on fuel pin cladding for LMFBRs is its mechanical integrity. Disruptive factors include internal pressure from mixed-oxide fuel fission gas release, thermal stresses and high-temperature creep, neutron-induced differential void-swelling as a source of stress in the cladding, irradiation creep of the stainless steel material, and corrosion by fission products. Under irradiation, these load-restraining mechanisms are accentuated by stainless steel embrittlement and strength alterations. To account for the numerous uncertainties involved in the analysis by theoretical models and computer codes, statistical tools are unavoidably required, i.e. Monte Carlo simulation methods. Thanks to these techniques, uncertainties in nominal characteristics, material properties and environmental conditions can be linked up in a correct way and used for a more accurate conceptual design. First, a thermal creep damage index is set up through a sufficiently sophisticated clad physical analysis including arbitrary time dependence of power and neutron flux as well as effects of sodium temperature, burnup and steel mechanical behavior. Although this strain-limit approach implies a more general but time-consuming model, on the counterpart the net output is improved and, e.g., clad temperature, stress and strain maxima may be easily assessed. A full spectrum of variables is statistically treated to account for their probability distributions. The creep damage probability may be obtained and can contribute to a quantitative fuel failure probability estimation.

  6. A robust statistical method for association-based eQTL analysis.

    Directory of Open Access Journals (Sweden)

    Ning Jiang

    Full Text Available It has been well established that the theoretical kernel of the recently surging genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between a tested genetic marker and a putative locus affecting a disease trait. However, LD analysis is vulnerable to several confounding factors, of which population stratification is the most prominent. Whilst many methods have been proposed to correct for the influence, either through predicting the structure parameters or correcting the inflation in the test statistic due to the stratification, these may not be feasible or may impose further statistical problems in practical implementation. We propose here a novel statistical method to control spurious LD in GWAS from population structure by incorporating a control marker into the test for significance of the genetic association of a polymorphic marker with phenotypic variation of a complex trait. The method avoids the need for structure prediction, which may be infeasible or inadequate in practice, and accounts properly for a varying effect of population stratification on different regions of the genome under study. The utility and statistical properties of the new method were tested through an intensive computer simulation study and an association-based genome-wide mapping of expression quantitative trait loci in genetically divergent human populations. The analyses show that the new method confers improved statistical power for detecting genuine genetic association in subpopulations and effective control of spurious associations stemming from population structure when compared with two other popularly implemented methods in the GWAS literature.

  7. Role of scanning electron microscope (SEM) in metal failure analysis

    International Nuclear Information System (INIS)

    Shaiful Rizam Shamsudin; Hafizal Yazid; Mohd Harun; Siti Selina Abd Hamid; Nadira Kamarudin; Zaiton Selamat; Mohd Shariff Sattar; Muhamad Jalil

    2005-01-01

    The scanning electron microscope (SEM) is a scientific instrument that uses a beam of highly energetic electrons to examine the surface and phase distribution of specimens on a micro scale through live imaging of secondary electron (SE) and back-scattered electron (BSE) images. One of the main activities of the SEM Laboratory at MINT is failure analysis of metal parts and components. The capability of SEM is excellent for determining the root cause of metal failures such as ductile or brittle fracture, stress corrosion, fatigue and other types of failures. Most of our customers requesting failure analysis are from local petrochemical plants, manufacturers of automotive components, pipeline maintenance personnel and engineers involved in the development of metal parts and components. This paper intends to discuss some of the technical concepts in failure analysis associated with SEM. (Author)

  8. Statistical and Machine-Learning Data Mining Techniques for Better Predictive Modeling and Analysis of Big Data

    CERN Document Server

    Ratner, Bruce

    2011-01-01

    The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has

  9. Constitution of an incident database suited to statistical analysis and examples

    International Nuclear Information System (INIS)

    Verpeaux, J.L.

    1990-01-01

    The Nuclear Protection and Safety Institute (IPSN) has set up and is developing an incident database, which is used for the management and analysis of incidents encountered in French PWR plants. IPSN has already carried out several statistical analyses of incidents and safety-significant events, and is improving its database on the basis of the experience gained from these various studies. A description of the analysis method and of the developed database is presented.

  10. A new statistic for the analysis of circular data in gamma-ray astronomy

    Science.gov (United States)

    Protheroe, R. J.

    1985-01-01

    A new statistic is proposed for the analysis of circular data. The statistic is designed specifically for situations where a test of uniformity is required which is powerful against alternatives in which a small fraction of the observations is grouped in a small range of directions, or phases.

  11. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  12. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  13. How electronic dynamics with Pauli exclusion produces Fermi-Dirac statistics.

    Science.gov (United States)

    Nguyen, Triet S; Nanguneri, Ravindra; Parkhill, John

    2015-04-07

    It is important that any dynamics method approaches the correct population distribution at long times. In this paper, we derive a one-body reduced density matrix dynamics for electrons in energetic contact with a bath. We obtain a remarkable equation of motion which shows that in order to reach equilibrium properly, rates of electron transitions depend on the density matrix. Even though the bath drives the electrons towards a Boltzmann distribution, hole blocking factors in our equation of motion cause the electronic populations to relax to a Fermi-Dirac distribution. These factors are an old concept, but we show how they can be derived with a combination of time-dependent perturbation theory and the extended normal ordering of Mukherjee and Kutzelnigg for a general electronic state. The resulting non-equilibrium kinetic equations generalize the usual Redfield theory to many-electron systems, while ensuring that the orbital occupations remain between zero and one. In numerical applications of our equations, we show that relaxation rates of molecules are not constant because of the blocking effect. Other applications to model atomic chains are also presented which highlight the importance of treating both dephasing and relaxation. Finally, we show how the bath localizes the electron density matrix.
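
    The hole-blocking mechanism described above is simple to demonstrate numerically. The sketch below is not the paper's method, only a generic Pauli master equation with detailed-balance bath rates: the (1 - n) blocking factors drive the steady state to a Fermi-Dirac rather than a Boltzmann distribution. All energies, the temperature and the rates are invented.

```python
# Pauli master equation: dn_j/dt = sum_k [w_jk n_k (1-n_j) - w_kj n_j (1-n_k)].
import numpy as np
from scipy.optimize import brentq

E = np.linspace(0.0, 2.0, 8)          # single-particle level energies (a.u.)
kT = 0.25
# w[j, k] = bath-induced rate k -> j; the ratio w[j,k]/w[k,j] = exp(-(E_j-E_k)/kT)
# enforces detailed balance at temperature T.
w = np.exp(-(E[:, None] - E[None, :]) / (2.0 * kT))

n = np.full(E.size, 0.5)              # uniform start; particle number (4) conserved
dt = 0.001
for _ in range(200_000):              # crude Euler integration to steady state
    gain = (1.0 - n) * (w @ n)        # into level j, Pauli-blocked by (1 - n_j)
    loss = n * (w.T @ (1.0 - n))      # out of level j, blocked at the target level
    n += dt * (gain - loss)

# Fermi-Dirac reference at the chemical potential matching N = 4 particles.
mu = brentq(lambda m: (1.0 / (np.exp((E - m) / kT) + 1.0)).sum() - 4.0, -5.0, 5.0)
fd = 1.0 / (np.exp((E - mu) / kT) + 1.0)
print("steady state:", np.round(n, 3))
print("Fermi-Dirac :", np.round(fd, 3))
```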

  14. How electronic dynamics with Pauli exclusion produces Fermi-Dirac statistics

    International Nuclear Information System (INIS)

    Nguyen, Triet S.; Nanguneri, Ravindra; Parkhill, John

    2015-01-01

    It is important that any dynamics method approaches the correct population distribution at long times. In this paper, we derive a one-body reduced density matrix dynamics for electrons in energetic contact with a bath. We obtain a remarkable equation of motion which shows that in order to reach equilibrium properly, rates of electron transitions depend on the density matrix. Even though the bath drives the electrons towards a Boltzmann distribution, hole blocking factors in our equation of motion cause the electronic populations to relax to a Fermi-Dirac distribution. These factors are an old concept, but we show how they can be derived with a combination of time-dependent perturbation theory and the extended normal ordering of Mukherjee and Kutzelnigg for a general electronic state. The resulting non-equilibrium kinetic equations generalize the usual Redfield theory to many-electron systems, while ensuring that the orbital occupations remain between zero and one. In numerical applications of our equations, we show that relaxation rates of molecules are not constant because of the blocking effect. Other applications to model atomic chains are also presented which highlight the importance of treating both dephasing and relaxation. Finally, we show how the bath localizes the electron density matrix

  15. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis...

  16. Dataset on statistical analysis of editorial board composition of Hindawi journals indexed in Emerging sources citation index

    Directory of Open Access Journals (Sweden)

    Hilary I. Okagbue

    2018-04-01

    This data article contains the statistical analysis of the total, percentage and distribution of the editorial board composition of 111 Hindawi journals indexed in the Emerging Sources Citation Index (ESCI) across the continents. The reliability of the data was shown using correlation, goodness-of-fit tests, analysis of variance and statistical variability tests. Keywords: Hindawi, Bibliometrics, Data analysis, ESCI, Random, Smart campus, Web of science, Ranking analytics, Statistics

  17. Statistical analysis of the determinations of the Sun's Galactocentric distance

    Science.gov (United States)

    Malkin, Zinovy

    2013-02-01

    Based on several tens of R0 measurements made during the past two decades, several studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In either case, detailed statistical analyses of data used were not performed. However, a computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.

  18. ANALYSIS OF HOUSEHOLD BEHAVIOUR TO THE COLLECTION OF WASTE ELECTRICAL AND ELECTRONIC EQUIPMENT IN ROMANIA

    Directory of Open Access Journals (Sweden)

    MARIA LOREDANA NICOLESCU

    2015-12-01

    This paper presents an analysis of household behaviour towards the collection of waste electrical and electronic equipment (WEEE) in Romania, based on an econometric multifactorial linear regression model. In the model, the amount of WEEE collected in the counties represents the endogenous variable, while factors such as regional gross domestic product, the number of employees, monthly average net nominal earnings, unemployed persons, retirees, existing housing, education and other non-quantifiable factors with regional influence are the explanatory (exogenous) variables. The period considered for the study is 2010-2012, and statistics are taken and processed at county level. The study is necessary to identify the extent to which those factors influence the collection of WEEE from private households. The results of this study may lead to an improvement of the management of waste electrical and electronic equipment in Romania, being useful for policy makers and stakeholders involved in the system.

  19. Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco

    Science.gov (United States)

    Bounoua, Z.; Mechaqrane, A.

    2018-05-01

    An accurate knowledge of the solar energy reaching the ground is necessary for sizing and optimizing the performance of solar installations. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of check procedures to test the quality of the hourly GHI measurements, and then eliminated the erroneous values, which are generally due to measurement errors or the cosine effect. The statistical analysis shows that the annual mean daily value of GHI is approximately 5 kWh/m²/day. Monthly mean daily values and other parameters are also calculated.

  20. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation

  1. Analysis of spectral data with rare events statistics

    International Nuclear Information System (INIS)

    Ilyushchenko, V.I.; Chernov, N.I.

    1990-01-01

    The case is considered of analyzing experimental data when the results of individual experimental runs cannot be summed due to large systematic errors. A statistical analysis of the hypothesis about persistent peaks in the spectra has been performed by means of the Neyman-Pearson test. The computations demonstrate that the confidence level for the hypothesis about the presence of a persistent peak in the spectrum is proportional to the square root of the number of independent experimental runs, K. 5 refs
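
    A minimal R sketch (illustrative, not the authors' code) of the scaling noted above: combining K independent runs with Stouffer's method yields a combined z-score, and hence a confidence level, that grows with the square root of K. The per-run significance z_single is an assumed placeholder.

        z_single <- 1.0                  # assumed per-run peak significance (z-score)
        K <- 1:25                        # number of independent experimental runs
        z_comb <- z_single * sqrt(K)     # Stouffer: sum(z_i)/sqrt(K) = z_single*sqrt(K)
        conf <- pnorm(z_comb)            # one-sided confidence level
        plot(K, conf, type = "b", xlab = "independent runs K", ylab = "confidence")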

  2. False colour backscatter electron images and their application during electron microprobe analysis of ores and host rocks

    International Nuclear Information System (INIS)

    Cousens, D.R.; French, D.H.; Ramsden, A.R.

    1989-01-01

    The limited contrast range of conventional black and white imaging does not enable full use to be made of the dynamic range of the video signal obtained from a scanning electron microscope or microprobe. The use of false colour substantially increases the information that can be derived from such images, enabling relationships to be displayed that cannot be observed in black and white. This capability is now used extensively in combination with quantitative electron microprobe analysis as a research tool for ore characterization and host-rock studies related to minerals exploration in the CSIRO Division of Exploration Geoscience. Thus the CAMEBAX scanning electron microprobe has been modified to allow digital image acquisition, and software (IMAGE) has been developed which allows false colour backscatter electron (BSE) images to be obtained during the course of routine electron microprobe analysis. 1 fig

  3. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
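
    A minimal R sketch of the UCL95% computation described above, mean + t(0.95, n-1)·s/√n; the six concentration values below are placeholders, not Tank 18F data.

        conc <- c(4.2, 3.9, 4.6, 4.1, 4.4, 4.0)   # placeholder analyte results, not real data
        n <- length(conc)
        ucl95 <- mean(conc) + qt(0.95, df = n - 1) * sd(conc) / sqrt(n)
        ucl95   # one-sided upper 95% confidence limit on the mean concentration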

  4. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    Science.gov (United States)

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, this program has many limitations, including fewer functions that can be used for analysis and a limited number of total cells compared with dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among others. The Megastat add-on program, which will soon be supported by MS Excel 2016, would eliminate some limitations of using statistical formulas within MS Excel.

  5. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper addresses the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness under small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas, either open or closed book, were used to illustrate the real-world performance of this statistic.
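
    A hedged R sketch of one simple way to use the third moment (a moment-matching illustration, not the authors' exact adjustment): match the empirical mean and skewness of a test statistic, here obtained by simulation, to a scaled chi-square a·χ²_d, then read the critical value from the matched distribution.

        set.seed(1)
        tstat <- rchisq(5000, df = 10) * 1.2    # stand-in sample of a test statistic
        m <- mean(tstat)
        g <- mean((tstat - m)^3) / sd(tstat)^3  # empirical skewness
        d <- 8 / g^2                            # chi-square_d has skewness sqrt(8/d)
        a <- m / d                              # scale so the mean matches: E[a*chi2_d] = a*d
        crit <- a * qchisq(0.95, df = d)        # adjusted 5% critical value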

  6. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g

  7. Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2010-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and including a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.

  8. Error analysis of terrestrial laser scanning data by means of spherical statistics and 3D graphs.

    Science.gov (United States)

    Cuartero, Aurora; Armesto, Julia; Rodríguez, Pablo G; Arias, Pedro

    2010-01-01

    This paper presents a complete analysis of the positional errors of terrestrial laser scanning (TLS) data based on spherical statistics and 3D graphs. Spherical statistics are preferred because of the 3D vectorial nature of the spatial error. Error vectors have three metric elements (one modulus and two angles) that were analyzed by spherical statistics. A case study is presented and discussed in detail. Errors were calculated using 53 check points (CP) whose coordinates were measured by a digitizer with submillimetre accuracy. The positional accuracy was analyzed by both the conventional method (modular error analysis) and the proposed method (angular error analysis) using 3D graphics and numerical spherical statistics. Two packages in the R programming language were developed to obtain the graphics automatically. The results indicate that the proposed method is advantageous as it offers a more complete analysis of the positional accuracy, covering the angular error component, the uniformity of the vector distribution, and error isotropy, in addition to the modular error component given by linear statistics.
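
    A minimal R sketch of the spherical decomposition described above: each 3D error vector is split into a modulus and two angles, and the mean resultant length gives a basic check of directional uniformity (isotropy). The simulated check-point errors are placeholders, not the study's data.

        set.seed(2)
        err <- matrix(rnorm(53 * 3, sd = 0.005), ncol = 3)   # 53 CP error vectors [m]
        modulus <- sqrt(rowSums(err^2))                      # metric element
        azimuth <- atan2(err[, 2], err[, 1])                 # angular element 1
        inclination <- acos(err[, 3] / modulus)              # angular element 2
        u <- err / modulus                                   # unit direction vectors
        R_bar <- sqrt(sum(colSums(u)^2)) / nrow(u)           # near 0 => isotropic errors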

  9. Statistics of electron multiplication in a multiplier phototube; Iterative method

    International Nuclear Information System (INIS)

    Ortiz, J. F.; Grau, A.

    1985-01-01

    In the present paper an iterative method is applied to study the variation of the dynode response in a multiplier phototube. Three different situations are considered, corresponding to the following ways electrons can be incident on the first dynode: incidence of exactly one electron, incidence of exactly r electrons, and incidence of an average of r electrons. The responses are given for a number of steps between 1 and 5, and for values of the multiplication factor of 2.1, 2.5, 3 and 5. We also study the variance, the skewness and the excess kurtosis for different multiplication factors. (Author) 11 refs
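
    The iterative method builds the dynode response stage by stage; a hedged Monte Carlo counterpart in R reproduces the same moments. Poisson multiplication at each dynode is an assumption of this sketch, not necessarily of the paper.

        simulate_cascade <- function(n0, gain, stages, trials = 1e4) {
          counts <- replicate(trials, {
            n <- n0
            for (s in seq_len(stages)) n <- sum(rpois(n, gain))  # one multiplication step
            n
          })
          c(mean = mean(counts), variance = var(counts))
        }
        simulate_cascade(n0 = 1, gain = 2.5, stages = 5)   # one electron on the first dynode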

  10. CFAssay: statistical analysis of the colony formation assay

    International Nuclear Information System (INIS)

    Braselmann, Herbert; Michna, Agata; Heß, Julia; Unger, Kristian

    2015-01-01

    The colony formation assay is the gold standard for determining cell reproductive death after treatment with ionizing radiation, applied to different cell lines or in combination with other treatment modalities. The associated linear-quadratic cell survival curves can be calculated with different methods. For easy code exchange and methodological standardisation among collaborating laboratories, a software package, CFAssay for R (R Core Team, R: A Language and Environment for Statistical Computing, 2014), was established to perform thorough statistical analysis of linear-quadratic cell survival curves after treatment with ionizing radiation and of two-way designs of experiments with chemical treatments only. CFAssay offers maximum likelihood and related methods by default; the least squares or weighted least squares method can be optionally chosen. A test for comparison of cell survival curves and an ANOVA test for experimental two-way designs are provided. For the two presented examples the estimated parameters do not differ much between maximum likelihood and least squares. However, the dispersion parameter of the quasi-likelihood method is much more sensitive to statistical variation in the data than the multiple R² coefficient of determination from the least squares method. The dispersion parameter for goodness of fit and different plot functions in CFAssay help to evaluate experimental data quality. As open source software, CFAssay facilitates interlaboratory code sharing between users
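
    A generic R sketch of a maximum-likelihood linear-quadratic fit of the kind CFAssay performs: S(D) = exp(-(αD + βD²)) fitted to colony counts by Poisson regression with the number of plated cells as an offset. This is plain glm, not CFAssay's API, and the dose-response numbers are placeholders.

        dose     <- c(0, 1, 2, 4, 6, 8)                   # dose [Gy], placeholder design
        plated   <- c(100, 100, 200, 400, 800, 1600)      # cells seeded per dish
        colonies <- c(55, 40, 41, 35, 14, 4)              # colonies counted (made-up)
        fit <- glm(colonies ~ dose + I(dose^2) + offset(log(plated)),
                   family = poisson)
        -coef(fit)[c("dose", "I(dose^2)")]                # estimates of alpha and beta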

  11. Procedure for statistical analysis of one-parameter discrepant experimental data

    International Nuclear Information System (INIS)

    Badikov, Sergey A.; Chechev, Valery P.

    2012-01-01

    A new, Mandel–Paule-type procedure for statistical processing of one-parameter discrepant experimental data is described. The procedure enables one to estimate the contribution of unrecognized experimental errors to the total experimental uncertainty, as well as to include it in the analysis. A definition of discrepant experimental data for an arbitrary number of measurements is introduced as an accompanying result. In the case of negligible unrecognized experimental errors, the procedure simply reduces to the calculation of the weighted average and its internal uncertainty. The procedure was applied to the statistical analysis of half-life experimental data; mean half-lives for 20 actinides were calculated and the results were compared to the ENSDF and DDEP evaluations. On the whole, the calculated half-lives are consistent with the ENSDF and DDEP evaluations. However, the uncertainties calculated in this work essentially exceed those of the ENSDF and DDEP evaluations for discrepant experimental data. This effect can be explained by adequately taking into account unrecognized experimental errors.
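
    A hedged R sketch of a Mandel–Paule-type step: the between-measurement (unrecognized-error) variance τ² is increased until the weighted sum of squared residuals equals its degrees of freedom, after which the weighted mean and its uncertainty follow. Details of the published procedure may differ; the half-life numbers are placeholders.

        mp_fit <- function(x, s2) {
          f <- function(t2) {                     # Q(tau2) - (n-1), decreasing in tau2
            w <- 1 / (s2 + t2)
            mu <- sum(w * x) / sum(w)
            sum(w * (x - mu)^2) - (length(x) - 1)
          }
          t2 <- if (f(0) <= 0) 0 else             # data already consistent: tau2 = 0
            uniroot(f, c(0, var(x) + max(s2)), extendInt = "downX")$root
          w <- 1 / (s2 + t2)
          c(tau2 = t2, mean = sum(w * x) / sum(w), unc = sqrt(1 / sum(w)))
        }
        mp_fit(x = c(10.1, 10.5, 9.4), s2 = c(0.02, 0.03, 0.02))  # placeholder data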

  12. Noise removing in encrypted color images by statistical analysis

    Science.gov (United States)

    Islam, N.; Puech, W.

    2012-03-01

    Cryptographic techniques are used to secure confidential data from unauthorized access, but these techniques are very sensitive to noise: a single bit change in encrypted data can have a catastrophic impact on the decrypted data. This paper addresses the problem of removing bit errors in visual data encrypted using the AES algorithm in CBC mode. In order to remove the noise, a method is proposed which is based on the statistical analysis of each block during decryption. The proposed method exploits local statistics of the visual data and the confusion/diffusion properties of the encryption algorithm to remove the errors. Experimental results show that the proposed method can be used at the receiving end as a possible solution for noise removal from visual data in the encrypted domain.

  13. Statistical Analysis of Radio Propagation Channel in Ruins Environment

    Directory of Open Access Journals (Sweden)

    Jiao He

    2015-01-01

    Cellphone-based localization systems for search and rescue in complex high-density ruins have attracted great interest in recent years, and the radio channel characteristics are critical for the design and development of such systems. This paper presents a spatial smoothing estimation via rotational invariance technique (SS-ESPRIT) for radio channel characterization of high-density ruins. The radio propagation at three typical mobile communication bands (0.9, 1.8, and 2 GHz) is investigated in two different scenarios. Channel parameters, such as arrival time, delays, and complex amplitudes, are statistically analyzed. Furthermore, a channel simulator is built based on these statistics. Comparative analysis of the average excess delay and delay spread shows a good agreement between the measurements and the channel modeling results.

  14. Statistical Analysis of Sport Movement Observations: the Case of Orienteering

    Science.gov (United States)

    Amouzandeh, K.; Karimipour, F.

    2017-09-01

    The study of movement observations is becoming more popular in several applications. In particular, analyzing sport movement time series has been considered a demanding area. However, most attempts at analyzing sport movement data have focused on spatial aspects of movement to extract movement characteristics, such as spatial patterns and similarities. This paper proposes statistical analysis of sport movement observations, which refers to analyzing changes in the spatial movement attributes (e.g. distance, altitude and slope) and non-spatial movement attributes (e.g. speed and heart rate) of athletes. As a case study, an example dataset of movement observations acquired during the sport of orienteering is presented and statistically analyzed.

  15. SeDA: A software package for the statistical analysis of the instrument drift

    International Nuclear Information System (INIS)

    Lee, H. J.; Jang, S. C.; Lim, T. J.

    2006-01-01

    The setpoints for safety-related equipment are affected by many sources of uncertainty. ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2] suggested statistical approaches for ensuring that safety-related instrument setpoints are established and maintained within the technical specification limits [3]. However, Jang et al. [4] indicated that the preceding methodologies for setpoint drift analysis might be insufficient to manage a setpoint drift on an instrumentation device and proposed new statistical analysis procedures for the management of a setpoint drift, based on plant-specific as-found/as-left data. Although IHPA (Instrument History Performance Analysis) is a widely known commercial software package for analyzing an instrument setpoint drift, several steps in the new procedure cannot be performed with it because it is based on the statistical approaches suggested in ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2]. In this paper we present a software package (SeDA: Setpoint Drift Analysis) that implements the new methodologies and is easy to use, as it is accompanied by powerful graphical tools. (authors)

  16. Electron microprobe analysis of tantalum-nitride thin films

    International Nuclear Information System (INIS)

    Stoltz, D.L.; Starkey, J.P.

    1979-06-01

    Quantitative chemical analysis of 500- and 2000-angstrom tantalum-nitride films on glass substrates has been accomplished using an electron microprobe X-ray analyzer. In order to achieve this analysis, modifications to the microprobe were necessary. The calibration procedure, the method of analysis, and the quantitative results are discussed

  17. Multivariate meta-analysis: a robust approach based on the theory of U-statistic.

    Science.gov (United States)

    Ma, Yan; Mazumdar, Madhu

    2011-10-30

    Meta-analysis is the methodology for combining findings from similar research studies asking the same question. When the question of interest involves multiple outcomes, multivariate meta-analysis is used to synthesize the outcomes simultaneously, taking into account the correlation between them. Likelihood-based approaches, in particular the restricted maximum likelihood (REML) method, are commonly utilized in this context. REML assumes a multivariate normal distribution for the random-effects model. This assumption is difficult to verify, especially for meta-analyses with a small number of component studies. The use of REML also requires iterative estimation between parameters, needing moderately high computation time, especially when the dimension of the outcomes is large. A multivariate method of moments (MMM) is available and has been shown to perform as well as REML. However, there is a lack of information on the performance of these two methods when the true data distribution is far from normality. In this paper, we propose a new nonparametric and non-iterative method for multivariate meta-analysis on the basis of the theory of the U-statistic and compare the properties of these three procedures under both normal and skewed data through simulation studies. It is shown that the effect of a non-normal data distribution on the REML estimates is marginal and that the estimates from the MMM and U-statistic-based approaches are very similar. Therefore, we conclude that for performing multivariate meta-analysis, the U-statistic estimation procedure is a viable alternative to REML and MMM. Easy implementation of all three methods is illustrated by their application to data from two published meta-analyses from the fields of hip fracture and periodontal disease. We discuss ideas for future research based on the U-statistic for testing the significance of between-study heterogeneity and for extending the work to the meta-regression setting. Copyright © 2011 John Wiley & Sons, Ltd.

  18. Statistical analysis in MSW collection performance assessment.

    Science.gov (United States)

    Teixeira, Carlos Afonso; Avelino, Catarina; Ferreira, Fátima; Bentes, Isabel

    2014-09-01

    The increase in Municipal Solid Waste (MSW) generated over the last years has forced waste managers to pursue more effective collection schemes that are technically viable, environmentally effective and economically sustainable. The assessment of MSW services using performance indicators plays a crucial role in improving service quality. In this work, we focus on the relevance of regular system monitoring as a service assessment tool. In particular, we select and test a core set of MSW collection performance indicators (effective collection distance, effective collection time and effective fuel consumption) that highlights collection system strengths and weaknesses and supports pro-active management decision-making and strategic planning. A statistical analysis was conducted with data collected in the mixed collection system of Oporto Municipality, Portugal, during one year, one week per month. This analysis provides an operational assessment of the collection circuits and supports effective short-term municipal collection strategies at the level of, e.g., collection frequency, timetables, and type of containers. Copyright © 2014 Elsevier Ltd. All rights reserved.
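
    A minimal R sketch of the core indicator set as described, assuming each indicator is normalized by the amount collected; the exact definitions and the circuit figures below are assumptions, not the study's data.

        circuits <- data.frame(
          circuit = c("A", "B"),
          km      = c(35, 42),      # distance driven
          hours   = c(6.5, 7.2),    # collection time
          litres  = c(28, 33),      # fuel used
          tonnes  = c(8.1, 7.4)     # MSW collected
        )
        transform(circuits,
                  dist_eff = km / tonnes,     # effective collection distance [km/t]
                  time_eff = hours / tonnes,  # effective collection time [h/t]
                  fuel_eff = litres / tonnes) # effective fuel consumption [l/t]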

  19. Hydrodynamic Relaxation of an Electron Plasma to a Near-Maximum Entropy State

    International Nuclear Information System (INIS)

    Rodgers, D. J.; Servidio, S.; Matthaeus, W. H.; Mitchell, T. B.; Aziz, T.; Montgomery, D. C.

    2009-01-01

    Dynamical relaxation of a pure electron plasma in a Malmberg-Penning trap is studied, comparing experiments, numerical simulations and statistical theories of weakly dissipative two-dimensional (2D) turbulence. Simulations confirm that the dynamics are approximated well by a 2D hydrodynamic model. Statistical analysis favors a theoretical picture of relaxation to a near-maximum entropy state with constrained energy, circulation, and angular momentum. This provides evidence that 2D electron fluid relaxation in a turbulent regime is governed by principles of maximum entropy.

  20. Implementation of statistical analysis methods for medical physics data

    International Nuclear Information System (INIS)

    Teixeira, Marilia S.; Pinto, Nivia G.P.; Barroso, Regina C.; Oliveira, Luis F.

    2009-01-01

    The objective of biomedical research with radiation of different natures is to contribute to the understanding of the basic physics and biochemistry of biological systems, disease diagnostics and the development of therapeutic techniques. The main benefits are the cure of tumors through therapy, the early detection of diseases through diagnostics, prophylactic irradiation for blood transfusion, etc. Therefore, for a better understanding of the biological interactions occurring after exposure to radiation, optimization of therapeutic procedures and of strategies for the reduction of radioinduced effects is necessary. The applied physics group of the Physics Institute of UERJ has been working on the characterization of biological samples (human tissues, teeth, saliva, soil, plants, sediments, air, water, organic matrices, ceramics, fossil material, among others) using X-ray diffraction and X-ray fluorescence. The application of these techniques to the measurement, analysis and interpretation of the characteristics of biological tissues is attracting considerable interest in medical and environmental physics. All quantitative data analysis must begin with the calculation of descriptive statistics (means and standard deviations) in order to obtain a preliminary notion of what the analysis will reveal. It is well known that the high values of standard deviation found in experimental measurements of biological samples can be attributed to biological factors, due to the specific characteristics of each individual (age, gender, environment, alimentary habits, etc.). The main objective of this work is the development of a program applying specific statistical methods for the optimization of experimental data analysis. Since the specialized programs for this analysis are proprietary, another objective of this work is the implementation of a code which is free and can be shared by other research groups. As the program developed since the

  1. Statistical analysis of AFM topographic images of self-assembled quantum dots

    Energy Technology Data Exchange (ETDEWEB)

    Sevriuk, V. A.; Brunkov, P. N., E-mail: brunkov@mail.ioffe.ru; Shalnev, I. V.; Gutkin, A. A.; Klimko, G. V.; Gronin, S. V.; Sorokin, S. V.; Konnikov, S. G. [Russian Academy of Sciences, Ioffe Physical-Technical Institute (Russian Federation)

    2013-07-15

    To obtain statistical data on quantum-dot sizes, AFM topographic images of the substrate on which the dots under study are grown are analyzed. Due to the non-ideality of the substrate, which contains height differences on the order of the size of the nanoparticles at distances of 1-10 μm, and the insufficient resolution of closely arranged dots due to the finite curvature radius of the AFM probe, automation of the statistical analysis of large dot arrays requires special techniques for processing the topographic images, so as to avoid losing a fraction of the particles during conventional processing. As such a technique, convolution of the initial matrix of the AFM image with a specially selected matrix is used. This makes it possible to determine the position of each nanoparticle and, using the initial matrix, to measure its geometrical parameters. The results of statistical analysis by this method of self-assembled InAs quantum dots formed on the surface of an AlGaAs epitaxial layer are presented. It is shown that their concentration, average size, and the half-width of the height distribution depend strongly on the In flow and the total amount of deposited InAs, which are varied within narrow limits.
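
    A hedged R sketch of the convolution idea described above: cross-correlating the AFM height matrix with a small zero-mean Gaussian template suppresses the slow substrate height variation and highlights dot-like features. The template size, detection threshold, and stand-in image are all assumptions of this sketch.

        set.seed(3)
        img <- matrix(rnorm(128 * 128, sd = 0.05), 128)          # stand-in topograph
        k <- outer(-3:3, -3:3, function(i, j) exp(-(i^2 + j^2) / 4))
        k <- k - mean(k)                        # zero-mean template cancels slow background
        resp <- matrix(0, 128, 128)
        for (i in 4:125) for (j in 4:125)       # valid region of the correlation
          resp[i, j] <- sum(img[(i - 3):(i + 3), (j - 3):(j + 3)] * k)
        peaks <- which(resp > quantile(resp, 0.999), arr.ind = TRUE)  # candidate dots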

  2. Statistical Analysis of Environmental Tritium around Wolsong Site

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ju Youl [FNC Technology Co., Yongin (Korea, Republic of)

    2010-04-15

    To find the relationship among airborne tritium, tritium in rainwater, TFWT (Tissue Free Water Tritium) and TBT (Tissue Bound Tritium), a statistical analysis was conducted based on tritium data measured at KHNP employees' houses around the Wolsong nuclear power plants during the 10 years from 1999 to 2008. The results show that tritium in such media exhibits a strong seasonal and annual periodicity. Tritium concentration in rainwater is observed to be highly correlated with TFWT and is directly transmitted to TFWT without delay. The response of environmental tritium radioactivity around the Wolsong site is analyzed using time-series techniques and non-parametric trend analysis. Tritium in the atmosphere and rainwater is strongly auto-correlated with seasonal and annual periodicity. The TFWT concentration in pine needles is proven to be more sensitive to rainfall than to other weather variables. Non-parametric trend analysis of the TFWT concentration within pine needles shows an increasing trend at the 95% confidence level. This study demonstrates the usefulness of time-series and trend analysis for the interpretation of the relationship between environmental radioactivity and various environmental media.
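
    A minimal R sketch of a non-parametric trend test of the kind used here: Kendall's tau of the TFWT series against time. The series below is simulated, and the study's actual test may be a seasonal Mann-Kendall variant.

        set.seed(4)
        tfwt <- 5 + 0.05 * (1:120) + rnorm(120)              # simulated monthly TFWT series
        cor.test(seq_along(tfwt), tfwt, method = "kendall")  # small p => significant trend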

  3. Statistical Analysis of the Polarimetric Cloud Analysis and Seeding Test (POLCAST) Field Projects

    Science.gov (United States)

    Ekness, Jamie Lynn

    The North Dakota farming industry brings in more than $4.1 billion annually in cash receipts. Unfortunately, agricultural sales vary significantly from year to year, due in large part to weather events such as hail storms and droughts. One method to mitigate drought is to use hygroscopic seeding to increase the precipitation efficiency of clouds. The North Dakota Atmospheric Research Board (NDARB) sponsored the Polarimetric Cloud Analysis and Seeding Test (POLCAST) research project to determine the effectiveness of hygroscopic seeding in North Dakota. The POLCAST field projects obtained airborne and radar observations while conducting randomized cloud seeding. The Thunderstorm Identification Tracking and Nowcasting (TITAN) program is used to analyze radar data (33 usable cases) to determine differences in storm duration, rain rate and total rain amount between seeded and non-seeded clouds. The single ratio of seeded to non-seeded cases is 1.56 (0.28 mm/0.18 mm), a 56% increase in average hourly rainfall during the first 60 minutes after target selection. A seeding effect is indicated, with the lifetime of the storms increasing by 41% between seeded and non-seeded clouds for the first 60 minutes past the seeding decision. A double ratio statistic, comparing the radar-derived rain amount of the last 40 minutes of a case (seed/non-seed) to that of the first 20 minutes (seed/non-seed), is used to account for the natural variability of the cloud system and gives a double ratio of 1.85. The Mann-Whitney test on the double ratio of seeded to non-seeded cases (33 cases) gives a significance (p-value) of 0.063. Bootstrapping analysis of the POLCAST set indicates that 50 cases would provide statistically significant results based on the Mann-Whitney test of the double ratio. All the statistical analyses conducted on the POLCAST data set show that hygroscopic seeding in North Dakota does increase precipitation. While an additional POLCAST field

  4. Electron identification with the ATLAS detector

    CERN Document Server

    Tarna, Grigore; The ATLAS collaboration

    2017-01-01

    Electron identification is a crucial input to many ATLAS physics analyses. The electron identification used in ATLAS for Run 2 is based on a likelihood discriminant to separate isolated electron candidates from candidates originating from photon conversions, hadron misidentification and heavy-flavour decays. In addition, isolation variables are used as further handles to separate signal and background. The efficiencies of the electron identification and isolation cuts are measured in data using tag-and-probe techniques with large-statistics samples of Z->ee and J/psi->ee decays. These measurements, performed with pp collision data recorded in 2016 (2015) at sqrt(s)=13 TeV and corresponding to an integrated luminosity of 33.9 (3.2) fb-1, are presented.

  6. Statistical Analysis of Designed Experiments Theory and Applications

    CERN Document Server

    Tamhane, Ajit C

    2012-01-01

    An indispensable guide to understanding and designing modern experiments. The tools and techniques of Design of Experiments (DOE) allow researchers to successfully collect, analyze, and interpret data across a wide array of disciplines. Statistical Analysis of Designed Experiments provides a modern and balanced treatment of DOE methodology with thorough coverage of the underlying theory and standard designs of experiments, guiding the reader through applications to research in various fields such as engineering, medicine, business, and the social sciences. The book supplies a foundation for the

  7. Multivariate statistical analysis of major and trace element data for ...

    African Journals Online (AJOL)

    Multivariate statistical analysis of major and trace element data for niobium exploration in the peralkaline granites of the anorogenic ring-complex province of Nigeria. PO Ogunleye, EC Ike, I Garba. No abstract available. Journal of Mining and Geology Vol. 40(2) 2004: 107-117.

  8. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    Science.gov (United States)

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and, as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets of brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    Science.gov (United States)

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics for environmental and biological sciences students through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  10. GET electronics samples data analysis

    International Nuclear Information System (INIS)

    Giovinazzo, J.; Goigoux, T.; Anvar, S.; Baron, P.; Blank, B.; Delagnes, E.; Grinyer, G.F.; Pancin, J.; Pedroza, J.L.; Pibernat, J.; Pollacco, E.; Rebii, A.

    2016-01-01

    The General Electronics for TPCs (GET) has been developed to equip a generation of time projection chamber detectors for nuclear physics, and may also be used for a wider range of detector types. The goal of this paper is to propose initial analysis procedures to be applied to raw data samples from the GET system, in order to correct for systematic effects observed in test measurements. We also present a method to estimate the response function of the GET system channels. The response function is required in analyses where the input signal needs to be reconstructed, in terms of its time distribution, from the registered output samples.

  11. Statistics, synergy, and mechanism of multiple photogeneration of excitons in quantum dots: Fundamental and applied aspects

    International Nuclear Information System (INIS)

    Oksengendler, B. L.; Turaeva, N. N.; Uralov, I.; Marasulov, M. B.

    2012-01-01

    The effect of multiple exciton generation is analyzed based on statistical physics, quantum mechanics, and synergetics. The statistical treatment of the multiple exciton generation (MEG) effect is broadened to take into account not only exciton generation but also background excitation. The study of the role of surface states of quantum dots is based on the synergy of self-catalyzed electronic reactions. The analysis of the MEG mechanism is based on the idea of electronic shaking, using the sudden perturbation method of quantum mechanics. All of the above-mentioned results are applied to the problem of calculating the limiting efficiency of converting solar energy into electric energy. (authors)

  12. Statistical analysis of angular correlation measurements

    International Nuclear Information System (INIS)

    Oliveira, R.A.A.M. de.

    1986-01-01

    Obtaining the multipole mixing ratio, δ, of γ transitions in angular correlation measurements is a statistical problem characterized by the small number of angles at which the observation is made and by the limited counting statistics, α. The nonexistence of a sufficient statistic for the estimator of δ is shown. Three different estimators for δ were constructed and their properties of consistency, bias and efficiency were tested. Tests were also performed on experimental results obtained in γ-γ directional correlation measurements. (Author) [pt

  13. Applications of factor analysis to electron and ion beam surface techniques

    International Nuclear Information System (INIS)

    Solomon, J.S.

    1987-01-01

    Factor analysis, a mathematical technique for extracting chemical information from matrices of data, is used to enhance Auger electron spectroscopy (AES), core level electron energy loss spectroscopy (EELS), ion scattering spectroscopy (ISS), and secondary ion mass spectroscopy (SIMS) in studies of interfaces, thin films, and surfaces. Several examples of factor analysis enhancement of chemical bonding variations in thin films and at interfaces studied with AES and SIMS are presented. Factor analysis is also shown to be of great benefit in quantifying electron and ion beam doses required to induce surface damage. Finally, examples are presented of the use of factor analysis to reconstruct elemental profiles when peaks of interest overlap each other during the course of depth profile analysis. (author)

  14. Advanced ion beam analysis of materials using ion-induced fast electron

    Energy Technology Data Exchange (ETDEWEB)

    Kudo, Hiroshi; Tanabe, Atsushi; Ishihara, Toyoyuki [Tsukuba Univ., Ibaraki (Japan); and others

    1997-03-01

    Recent progress in the study of the high-energy shadowing effect using ion-induced electron spectroscopy is reported, with emphasis on the possibility of determining local electronic structure in solids, which has been a difficult problem to approach with other experimental techniques. We demonstrate real-space determination of the covalent-bond electron distribution in a Si crystal. This analysis technique may open a new field of ion beam analysis of solids. (author)

  15. Statistical mechanical analysis of the linear vector channel in digital communication

    International Nuclear Information System (INIS)

    Takeda, Koujin; Hatabu, Atsushi; Kabashima, Yoshiyuki

    2007-01-01

    A statistical mechanical framework to analyze linear vector channel models in digital wireless communication is proposed for a large system. The framework is a generalization of that proposed for code-division multiple-access systems in Takeda et al (2006 Europhys. Lett. 76 1193) and enables the analysis of the system in which the elements of the channel transfer matrix are statistically correlated with each other. The significance of the proposed scheme is demonstrated by assessing the performance of an existing model of multi-input multi-output communication systems

  16. Monte Carlo based statistical power analysis for mediation models: methods and software.

    Science.gov (United States)

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
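
    A compact R sketch of Monte Carlo power estimation for the indirect effect a·b in a simple mediation model. For brevity it uses the Sobel z-test inside the simulation loop rather than the bootstrap test the paper advocates, and it is not the bmem API; sample size and effect values are placeholders.

        power_sim <- function(n, a, b, reps = 500, alpha = 0.05) {
          hits <- replicate(reps, {
            x <- rnorm(n)
            m <- a * x + rnorm(n)
            y <- b * m + rnorm(n)
            ca <- coef(summary(lm(m ~ x)))[2, 1:2]       # estimate and SE of a
            cb <- coef(summary(lm(y ~ m + x)))[2, 1:2]   # estimate and SE of b
            z <- (ca[1] * cb[1]) /
              sqrt(ca[1]^2 * cb[2]^2 + cb[1]^2 * ca[2]^2)  # Sobel statistic
            abs(z) > qnorm(1 - alpha / 2)
          })
          mean(hits)                                     # estimated power
        }
        power_sim(n = 100, a = 0.3, b = 0.3)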

  17. A statistical analysis of the impact of advertising signs on road safety.

    Science.gov (United States)

    Yannis, George; Papadimitriou, Eleonora; Papantoniou, Panagiotis; Voulgari, Chrisoula

    2013-01-01

    This research aims to investigate the impact of advertising signs on road safety. An exhaustive review of the international literature was carried out on the effect of advertising signs on driver behaviour and safety. Moreover, a before-and-after statistical analysis with control groups was applied to several road sites with different characteristics in the Athens metropolitan area, in Greece, in order to investigate the correlation between the placement or removal of advertising signs and the related occurrence of road accidents. Road accident data for the 'before' and 'after' periods on the test sites and the control sites were extracted from the database of the Hellenic Statistical Authority, and the selected 'before' and 'after' periods vary from 2.5 to 6 years. The statistical analysis shows no correlation between road accidents and advertising signs in any of the nine sites examined, as the confidence intervals of the estimated safety effects are non-significant at the 95% confidence level. This can be explained by the fact that, at the examined road sites, drivers are overloaded with information (traffic signs, direction signs, shop labels, pedestrians and other vehicles, etc.), so the additional information load from advertising signs may not further distract them.
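
    A minimal R sketch of the before-and-after-with-control estimate used in such studies: the effect is the ratio of after/before accident counts at a treated site to the same ratio at its control sites, with a Poisson-based log confidence interval. All counts below are placeholders, not the study's data.

        a_t <- 18; b_t <- 21     # after/before counts, site with signs (placeholder)
        a_c <- 40; b_c <- 38     # after/before counts, control sites (placeholder)
        theta <- (a_t / b_t) / (a_c / b_c)           # estimated safety effect
        se <- sqrt(1/a_t + 1/b_t + 1/a_c + 1/b_c)    # SE of log(theta), Poisson counts
        exp(log(theta) + c(-1.96, 1.96) * se)        # 95% CI; covering 1 => non-significant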

  18. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    Science.gov (United States)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review

  19. Meta- and statistical analysis of single-case intervention research data: quantitative gifts and a wish list.

    Science.gov (United States)

    Kratochwill, Thomas R; Levin, Joel R

    2014-04-01

    In this commentary, we add to the spirit of the articles appearing in the special series devoted to meta- and statistical analysis of single-case intervention-design data. Following a brief discussion of historical factors leading to our initial involvement in statistical analysis of such data, we discuss: (a) the value added by including statistical-analysis recommendations in the What Works Clearinghouse Standards for single-case intervention designs; (b) the importance of visual analysis in single-case intervention research, along with the distinctive role that could be played by single-case effect-size measures; and (c) the elevated internal validity and statistical-conclusion validity afforded by the incorporation of various forms of randomization into basic single-case design structures. For the future, we envision more widespread application of quantitative analyses, as critical adjuncts to visual analysis, in both primary single-case intervention research studies and literature reviews in the behavioral, educational, and health sciences. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  20. Statistical analysis of the spatial distribution of galaxies and clusters

    International Nuclear Information System (INIS)

    Cappi, Alberto

    1993-01-01

    This thesis deals with the analysis of the distribution of galaxies and clusters, describing some observational problems and statistical results. The first chapter gives a theoretical introduction, aiming to describe the framework of the formation of structures and tracing the history of the Universe from the Planck time, t_p = 10^-43 sec, with a temperature corresponding to 10^19 GeV, to the present epoch. The most usual statistical tools and models of the galaxy distribution, with their advantages and limitations, are described in chapter two. A study of the main observed properties of galaxy clustering, together with a detailed statistical analysis of the effects of selecting galaxies according to apparent magnitude or diameter, is reported in chapter three. Chapter four delineates some properties of groups of galaxies, explaining the reasons for discrepant results on group distributions. Chapter five is a study of the distribution of galaxy clusters with different statistical tools, like correlations, percolation, the void probability function and counts in cells; the same scaling-invariant behaviour as for galaxies is found. Chapter six describes our finding that rich galaxy clusters, too, belong to the fundamental plane of elliptical galaxies, and gives a discussion of its possible implications. Finally, chapter seven reviews the possibilities offered by multi-slit and multi-fibre spectrographs, and I present some observational work on nearby and distant galaxy clusters. In particular, I show the opportunities offered by ongoing surveys of galaxies coupled with multi-object fibre spectrographs, focusing on the ESO Key Programme "A galaxy redshift survey in the south galactic pole region", in which I collaborate, and on MEFOS, a multi-fibre instrument with automatic positioning. Published papers related to the work described in this thesis are reported in the last appendix. (author) [fr

  1. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    The article is devoted to the statistical analysis of the results of computer-based testing for the evaluation of the educational achievements of students. The issues are relevant because computer-based testing in Russian universities has become an important method for evaluating the educational achievements of students and the quality of the modern educational process. The use of modern methods and programs for the statistical analysis of computer-based testing results and for assessing the quality of the developed tests is a pressing problem for every university teacher. The article shows how the authors solve this problem using their own program, "StatInfo". For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database and generating queries, reports, lists, and matrices of answers for the statistical analysis of test-item quality. The methodology, experience and some results of its usage by university teachers are described in the article. Related topics of test development, models, algorithms, technologies, and software for large-scale computer-based testing have been discussed by the authors in their previous publications, which are presented in the reference list.
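
    A minimal R sketch of two classical statistics such a program computes for test-item quality: item difficulty (proportion correct) and discrimination (corrected item-total correlation). The response matrix is simulated, not StatInfo data.

        set.seed(5)
        answers <- matrix(rbinom(200 * 20, 1, 0.7), nrow = 200)  # 200 students x 20 items
        difficulty <- colMeans(answers)                          # proportion correct per item
        total <- rowSums(answers)
        discrimination <- apply(answers, 2,
                                function(item) cor(item, total - item))  # corrected item-total r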

  2. Statistical analysis of seismicity and hazard estimation for Italy (mixed approach). Statistical parameters of main shocks and aftershocks in the Italian region

    International Nuclear Information System (INIS)

    Molchan, G.M.; Kronrod, T.L.; Dmitrieva, O.E.

    1995-03-01

    The catalog of earthquakes of Italy (1900-1993) is analyzed in the present work. The following problems have been considered: 1) the choice of the operating magnitude, 2) an analysis of data completeness, and 3) grouping (in time and in space). The catalog has been separated into main shocks and aftershocks. Statistical estimations of the seismicity parameters (a, b) are performed for the seismogenic zones defined by GNDT. The non-standard elements of the analysis performed are: (a) statistical estimation and comparison of seismicity parameters under the condition of arbitrary data grouping in magnitude, time and space; (b) use of a non-conventional statistical method for aftershock identification, based on the idea of optimizing the two kinds of errors in the aftershock identification process; (c) use of the aftershock zones to reveal seismically interrelated seismogenic zones. This procedure contributes to the stability of the estimation of the b-value. Refs, 25 figs, tabs

  3. Analysis and classification of ECG-waves and rhythms using circular statistics and vector strength

    Directory of Open Access Journals (Sweden)

    Janßen Jan-Dirk

    2017-09-01

    The most common way to analyse heart rhythm is to calculate the RR interval and the heart rate variability. For further evaluation, descriptive statistics are often used. Here we introduce a new and more natural heart rhythm analysis tool that is based on circular statistics and vector strength. Vector strength is a tool to measure the periodicity or lack of periodicity of a signal. We divide the signal into non-overlapping window segments and project the detected R-waves around the unit circle using the complex exponential function and the median RR interval. In addition, we calculate the vector strength and apply circular statistics as well as an angular histogram on the R-wave vectors. This approach enables an intuitive visualization and analysis of rhythmicity. Our results show that ECG waves and rhythms can be easily visualized, analysed and classified by circular statistics and vector strength.
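
    A minimal R sketch of the projection described above: R-wave times are mapped onto the unit circle using the median RR interval as the period, and the vector strength is the modulus of the mean phasor (1 for perfect periodicity, near 0 for arrhythmic segments). The R-wave times below are illustrative values.

        r_times <- c(0.00, 0.81, 1.63, 2.41, 3.24, 4.05)   # detected R-waves [s], example
        period <- median(diff(r_times))                    # median RR interval
        phase <- 2 * pi * (r_times %% period) / period     # project onto the unit circle
        vs <- Mod(sum(exp(1i * phase))) / length(phase)    # vector strength in [0, 1]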

  4. Consolidity analysis for fully fuzzy functions, matrices, probability and statistics

    Directory of Open Access Journals (Sweden)

    Walaa Ibrahim Gabr

    2015-03-01

    The paper presents a comprehensive review of the know-how for developing systems consolidity theory for modeling, analysis, optimization and design in a fully fuzzy environment. The development of systems consolidity theory included its extension to handle new functions of different dimensionalities, fuzzy analytic geometry, fuzzy vector analysis, functions of fuzzy complex variables, ordinary differentiation of fuzzy functions and partial fractions of fuzzy polynomials. On the other hand, the handling of fuzzy matrices covered determinants of fuzzy matrices, the eigenvalues of fuzzy matrices, and the solution of least-squares fuzzy linear equations. The approach is also shown to be applicable in a systematic way to new fuzzy probabilistic and statistical problems. This included extending conventional probabilistic and statistical analysis to handle fuzzy random data. Applications also covered the consolidity of fuzzy optimization problems. Various numerical examples have demonstrated that the new consolidity concept is highly effective in solving in a compact form the propagation of fuzziness in linear, nonlinear, multivariable and dynamic problems with different types of complexities. Finally, it is demonstrated that the suggested fuzzy mathematics can be easily embedded within normal mathematics by building a special fuzzy function library inside the computational Matlab Toolbox or other similar software languages.

  5. Qualitative and quantitative analysis of the students’ perceptions to the use of 3D electronic models in problem-based learning

    Directory of Open Access Journals (Sweden)

    Hai Ming Wong

    2017-06-01

    The Faculty of Dentistry of the University of Hong Kong has introduced innovative blended problem-based learning (PBL) with the aid of 3D electronic models (e-models) into the Bachelor of Dental Surgery (BDS) curriculum. Statistical results of pre- and post-semester questionnaire surveys illustrated the compatibility of e-models in PBL settings. The students' importance ratings of two objectives, "Complete assigned tasks on time" and "Active listener", and of twenty-two facilitator evaluation items, including critical thinking and group problem-solving skills, increased significantly. The students' PBL preparation behavior and their attention to problem understanding, problem analysis, and learning resource quality were also found to be related to the online support of e-models and their software. Qualitative analysis of open-ended questions with the visual text analytics software "Leximancer" improved the validity of the statistical results. Using e-model functions in treatment planning, problem analysis and giving instructions provided a method of informative communication. It is therefore critical for the faculty to continuously provide facilitator training and quality online e-model resources to the students.

  6. Determination of dominant biogeochemical processes in a contaminated aquifer-wetland system using multivariate statistical analysis

    Science.gov (United States)

    Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A.

    2008-01-01

    Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species enhancing iron and sulfate reduction. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
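
    A minimal sketch of the PCA step described above, assuming a samples-by-analytes matrix; the data here are synthetic, and standardizing before PCA is an assumption for illustration, not a detail taken from the paper:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical peeper chemistry matrix: rows = samples, columns = analytes
rng = np.random.default_rng(1)
X = rng.lognormal(size=(200, 12))

# Standardize first (analytes carry very different units), then extract
# principal components; each row of `scores` is a set of factor scores
# that can be mapped by sampling date and depth
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(X))
print(scores.shape)   # (200, 3)
```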

  7. A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data

    Directory of Open Access Journals (Sweden)

    Maria Vinaixa

    2012-10-01

    Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to those from the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper reports a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel to all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of considering or violating the mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as the determination of sample size, analytical variation, assumptions of normality and homoscedasticity, and correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples.
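
    One hedged illustration of such a parallel univariate workflow (Welch's t-test per feature plus Benjamini-Hochberg FDR control; the matrices and effect sizes are invented, and the paper discusses several tests, not only this one):

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

def rank_features(group_a, group_b, alpha=0.05):
    """Welch's t-test applied in parallel to every feature (column),
    followed by Benjamini-Hochberg correction for multiple testing."""
    _, pvals = stats.ttest_ind(group_a, group_b, axis=0, equal_var=False)
    reject, qvals, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return pvals, qvals, reject

# Hypothetical intensity matrices: rows = samples, columns = features
rng = np.random.default_rng(2)
a = rng.lognormal(size=(10, 500))
b = rng.lognormal(size=(12, 500))
b[:, :20] *= 3.0                      # spike 20 truly altered features
_, _, reject = rank_features(a, b)
print(f"{reject.sum()} features significant after FDR control")
```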

  8. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    In this research, the analysis of statistical variations in a subthreshold MOSFET's high-frequency characteristics, defined in terms of gate capacitance and transition frequency, is presented, and comprehensive analytical models of such variations in terms of their variances are proposed. Major imperfections in the physical-level properties, including random dopant fluctuation and the effects of variations in the MOSFET's manufacturing process, have been taken into account in the proposed analysis and modeling. An up-to-date comprehensive analytical model of statistical variation in the MOSFET's parameters has been used as the basis of the analysis and modeling. The resulting models have been found to be both analytic and comprehensive, as they are precise mathematical expressions in terms of the physical-level variables of the MOSFET. Furthermore, they have been verified at the nanometer level by using 65 nm BSIM4-based benchmarks and have been found to be very accurate, with average errors smaller than 5%. Hence, the performed analysis yields models that are a potential mathematical tool for the statistical and variability-aware analysis and design of subthreshold MOSFET-based VHF circuits, systems and applications.

  9. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  10. STATISTICAL ANALYSIS OF SPORT MOVEMENT OBSERVATIONS: THE CASE OF ORIENTEERING

    Directory of Open Access Journals (Sweden)

    K. Amouzandeh

    2017-09-01

    Study of movement observations is becoming more popular in several applications. In particular, analyzing sport movement time series has been considered a demanding area. However, most attempts at analyzing sport movement data have focused on the spatial aspects of movement to extract movement characteristics, such as spatial patterns and similarities. This paper proposes a statistical analysis of sport movement observations, which refers to analyzing changes in the spatial movement attributes (e.g., distance, altitude and slope) and the non-spatial movement attributes (e.g., speed and heart rate) of athletes. As the case study, an example dataset of movement observations acquired during the sport of orienteering is presented and statistically analyzed.

  11. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of the strength of ceramic and glass materials, a statistical approach is necessary. The strength of ceramics and glass depends on the size distribution of flaws in these materials. The distribution of strength for a ductile material is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramics and glass follows a Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the relation of material strength to failure probability, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. The statistical calculations supporting the strength analysis of a silicon nitride material were done using MATLAB. (author)
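
    A small sketch of the two-parameter Weibull failure model underlying such an analysis (done here in Python rather than MATLAB; the scale and modulus values are hypothetical, not the paper's):

```python
import numpy as np

def weibull_failure_probability(stress, sigma0, m):
    """Two-parameter Weibull model: cumulative probability of failure
    at an applied stress, with scale sigma0 and Weibull modulus m."""
    return 1.0 - np.exp(-(np.asarray(stress) / sigma0) ** m)

# Hypothetical parameters for a brittle ceramic: sigma0 = 800 MPa, m = 10
stress = np.array([400.0, 600.0, 800.0, 1000.0])
pf = weibull_failure_probability(stress, sigma0=800.0, m=10.0)
for s, p in zip(stress, pf):
    print(f"{s:6.0f} MPa   P_f = {p:.4f}   reliability = {1.0 - p:.4f}")
```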

  12. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,

  13. Electrons in Nanostructures

    DEFF Research Database (Denmark)

    Flindt, Christian

    2007-01-01

    in the possibilities offered by the quantum mechanical behavior of electrons when it comes to information processing. This branch of research is also concerned with fundamental questions in physics. Besides an introduction to the above-mentioned subjects, the thesis contains a number of contributions to the fields ... of coherent electron manipulation and the statistical description of electron transport through nanodevices. The physics of the electrons is described with a combination of numerical methods, developed and applied in the thesis, and more analytical approaches, which are also discussed. The thesis ... -based communication. The statistical description of electron transport through nanostructures is based on rate equations, and the primary contribution of the thesis in that respect is the development of a method that allows for the calculation of the distribution of electrons passing through a device. The method...

  14. PROSA: A computer program for statistical analysis of near-real-time-accountancy (NRTA) data

    International Nuclear Information System (INIS)

    Beedgen, R.; Bicking, U.

    1987-04-01

    The computer program PROSA (Program for Statistical Analysis of NRTA Data) is a tool to decide on the basis of statistical considerations if, in a given sequence of materials balance periods, a loss of material might have occurred or not. The evaluation of the material balance data is based on statistical test procedures. In PROSA three truncated sequential tests are applied to a sequence of material balances. The manual describes the statistical background of PROSA and how to use the computer program on an IBM-PC with DOS 3.1. (orig.)

  15. Progress in element analysis on a high-voltage electron microscope

    International Nuclear Information System (INIS)

    Tivol, W.F.; Barnard, D.; Guha, T.

    1985-01-01

    X-ray microprobe (XMA) and electron energy-loss (EELS) spectrometers have been installed on the high-voltage electron microscope (HVEM). The probe size has been measured, and background reduction is in progress for both XMA and EELS, as are improvements in the electron optics for EELS and sensitivity measurements. XMA is currently useful for qualitative analysis and has been used by several investigators from our laboratory and outside laboratories. However, EELS background levels are still too high for meaningful results to be obtained. Standards suitable for biological specimens are being measured, and a library for quantitative analysis is being compiled

  16. Statistical testing and power analysis for brain-wide association study.

    Science.gov (United States)

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, the multiple-correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and the false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study: our approach can identify altered functional connectivities in a major depressive disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Statistical analysis of magnetically soft particles in magnetorheological elastomers

    Science.gov (United States)

    Gundermann, T.; Cremer, P.; Löwen, H.; Menzel, A. M.; Odenbach, S.

    2017-04-01

    The physical properties of magnetorheological elastomers (MRE) are a complex issue and can be influenced and controlled in many ways, e.g. by applying a magnetic field, by external mechanical stimuli, or by an electric potential. In general, the response of MRE materials to these stimuli is crucially dependent on the distribution of the magnetic particles inside the elastomer. Specific knowledge of the interactions between particles or particle clusters is of high relevance for understanding the macroscopic rheological properties and provides an important input for theoretical calculations. In order to gain a better insight into the correlation between the macroscopic effects and microstructure and to generate a database for theoretical analysis, x-ray micro-computed tomography (X-μCT) investigations as a base for a statistical analysis of the particle configurations were carried out. Different MREs with quantities of 2-15 wt% (0.27-2.3 vol%) of iron powder and different allocations of the particles inside the matrix were prepared. The X-μCT results were edited by an image processing software regarding the geometrical properties of the particles with and without the influence of an external magnetic field. Pair correlation functions for the positions of the particles inside the elastomer were calculated to statistically characterize the distributions of the particles in the samples.

  18. Pattern recognition in menstrual bleeding diaries by statistical cluster analysis

    Directory of Open Access Journals (Sweden)

    Wessel Jens

    2009-07-01

    Background: The aim of this paper is to empirically identify a treatment-independent statistical method to describe clinically relevant bleeding patterns, using bleeding diaries from clinical studies on various sex-hormone-containing drugs. Methods: We used the four cluster analysis methods single, average and complete linkage, as well as the method of Ward, for pattern recognition in menstrual bleeding diaries. The optimal number of clusters was determined using the semi-partial R², the cubic clustering criterion, the pseudo-F and the pseudo-t² statistics. Finally, the interpretability of the results from a gynecological point of view was assessed. Results: The method of Ward yielded distinct clusters of the bleeding diaries. The other methods successively chained the observations into one cluster. The optimal number of distinctive bleeding patterns was six. We found two desirable and four undesirable bleeding patterns. Cyclic and non-cyclic bleeding patterns were well separated. Conclusion: Using this cluster analysis with the method of Ward, medications and devices having an impact on bleeding can be easily compared and categorized.
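
    A minimal sketch of Ward clustering on diary-like data (using SciPy; the binary diary encoding, the sample sizes, and the six-cluster cut are assumptions for illustration, not the study's data):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical diaries: one row per woman, bleeding (1) / no bleeding (0)
# per day over a 90-day reference period
rng = np.random.default_rng(3)
diaries = (rng.random((40, 90)) < 0.2).astype(float)

# Ward linkage minimizes the within-cluster variance at every merge,
# which is why it tends to give compact clusters instead of chaining
# observations one by one as single linkage does
Z = linkage(diaries, method="ward")
labels = fcluster(Z, t=6, criterion="maxclust")   # cut the tree at 6 clusters
print(np.bincount(labels)[1:])                    # cluster sizes
```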

  19. Statistical Analysis and validation

    NARCIS (Netherlands)

    Hoefsloot, H.C.J.; Horvatovich, P.; Bischoff, R.

    2013-01-01

    In this chapter guidelines are given for the selection of a few biomarker candidates from a large number of compounds with a relative low number of samples. The main concepts concerning the statistical validation of the search for biomarkers are discussed. These complicated methods and concepts are

  20. Variability analysis of AGN: a review of results using new statistical criteria

    Science.gov (United States)

    Zibecchi, L.; Andruchow, I.; Cellone, S. A.; Romero, G. E.; Combi, J. A.

    We present here a re-analysis of the variability results of a sample of active galactic nuclei (AGN), which have been observed during several sessions with the 2.15 m "Jorge Sahade" telescope (CASLEO), San Juan, Argentina, and whose results are published (Romero et al. 1999, 2000, 2002; Cellone et al. 2000). The motivation for this new analysis is the implementation, during the last years, of improvements in the statistical criteria applied, taking quantitatively into account the incidence of the photometric errors (Cellone et al. 2007). This work is framed as a first step in a comprehensive study of the statistical estimators of AGN variability. This study is motivated by the great diversity of statistical tests that have been proposed to analyze the variability of these objects. Since we note that, in some cases, the variability results for an object depend on the test used, we attempt to make a comparative study of the various tests and analyze, under the given conditions, which of them is the most efficient and reliable.

  1. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

    As libraries and librarians move more towards evidence-based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence-based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi-square, co...

  2. Statistical Compilation of the ICT Sector and Policy Analysis | Page 5 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The project is designed to expand the scope of conventional investigation beyond the telecommunications industry to include other vertically integrated components of the ICT sector such as manufacturing and services. ...

  3. Determination of localized visibility in off-axis electron holography

    International Nuclear Information System (INIS)

    McLeod, Robert A.; Kupsta, Martin; Malac, Marek

    2014-01-01

    Off-axis electron holography is a wavefront-split interference method for the transmission electron microscope that allows the phase shift and amplitude of the electron wavefront to be separated and quantitatively measured. An additional, third component of the holographic signal is the coherence of the electron wavefront. Historically, wavefront coherence has been evaluated by measuring the holographic fringe visibility (or contrast) based on the minimum and maximum intensity values. We present a method based on statistical moments that allows the visibility to be measured in a deterministic and reproducible fashion suitable for quantitative analysis. We also present an algorithm, based on the Fourier-ratio method, which allows the visibility to be resolved in two dimensions, which we term the local visibility. The local visibility may be used to evaluate the loss of coherence due to electron scattering within a specimen, or as an aid in image analysis and segmentation. The relationship between amplitude and visibility may be used to evaluate the composition and mass thickness of a specimen by means of a 2-D histogram. Results for a selection of elements (C, Al, Si, Ti, Cr, Cu, Ge, and Au) are provided. All presented visibility metrics are biased at low-dose conditions by the presence of shot noise, for which we provide methods of empirical normalization to achieve a linear response. - Highlights: • Report on a new statistical metric to determine holographic fringe visibility. • Adds a new signal to electron holography: a measure of electron coherence loss in 2-D. • Provides an algorithm to calculate a 2-D local visibility map. • Shows that amplitude and visibility may be used for compositional analysis and segmentation. • Corrects for data bias such as shot noise
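
    One simple moments-based visibility estimator in the spirit of the abstract (not necessarily the authors' exact metric): for an ideal cosine fringe I = I0*(1 + V*cos(phi)), the standard deviation is I0*V/sqrt(2), so V can be estimated as sqrt(2)*std/mean. The synthetic profile and counts below are hypothetical, and the Poisson noise illustrates the low-dose bias the authors mention:

```python
import numpy as np

def visibility_from_moments(intensity):
    """Moments-based fringe visibility: for an ideal cosine fringe
    I = I0*(1 + V*cos(phi)), std(I) = I0*V/sqrt(2), hence
    V = sqrt(2)*std/mean. Uses every pixel, unlike the min/max rule."""
    intensity = np.asarray(intensity, dtype=float)
    return np.sqrt(2.0) * intensity.std() / intensity.mean()

# Synthetic fringe profile, 30% visibility, Poisson (shot) noise; the
# noise slightly biases the estimate upward, as the abstract notes
rng = np.random.default_rng(4)
x = np.linspace(0.0, 100.0 * np.pi, 5000)
profile = rng.poisson(1000.0 * (1.0 + 0.3 * np.cos(x)))
print(f"estimated visibility = {visibility_from_moments(profile):.3f}")
```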

  4. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  5. Neutron activation and statistical analysis of pottery from Thera, Greece

    International Nuclear Information System (INIS)

    Kilikoglou, V.; Grimanis, A.P.; Karayannis, M.I.

    1990-01-01

    Neutron activation analysis, in combination with multivariate analysis of the generated data, was used for the chemical characterization of prehistoric pottery from the Greek islands of Thera, Melos (islands with similar geology) and Crete. The statistical procedure which proved that Theran pottery could be distinguished from Melian is described. This discrimination, attained for the first time, was mainly based on the concentrations of the trace elements Sm, Yb, Lu and Cr. Also, Cretan imports to both Thera and Melos were clearly separable from local products. (author) 22 refs.; 1 fig.; 4 tabs

  6. Vacuum Analysis of Scanning Horn of Electron Beam Machine

    International Nuclear Information System (INIS)

    Suprapto; Sukidi; Sukaryono; Setyo Atmojo; Djasiman

    2003-01-01

    Vacuum analysis of the scanning horn of an electron beam machine (EBM) has been carried out. In an EBM, the electron beam produced by the electron gun is accelerated by the accelerating tube toward the target via the scanning horn and window. To avoid disturbance of the electron beam trajectory inside the EBM, it is necessary to evacuate the EBM. In designing and constructing the scanning horn, a vacuum analysis must be carried out to find the ultimate vacuum grade, based both on the analysis and on tests of the vacuum pump. The ultimate vacuum grade is important, as it affects the electron trajectory from the electron gun to the target. The results of the vacuum analysis show that the gas loads to be evacuated were outgassing, permeation and leakage, with values of 5.96487×10⁻⁶ Torr·liter/s, 6.32083×10⁻⁷ Torr·liter/s, and 1.3116234×10⁻⁴ Torr·liter/s, respectively, so that the total gas load was 1.377587×10⁻⁴ Torr·liter/s. The total conductance according to the test results was 15.769 liter/s, while the effective pumping speed and maximum vacuum obtained with the RD 150 pump were 14.269 liter/s and 9.65×10⁻⁶ Torr, respectively. The steady-state vacuum indicated by the test was 3.5×10⁻⁵ Torr, which is close to the capability of the vacuum pump, namely 2×10⁻⁵ Torr. The vacuum test indicated a good result and showed that there was no leakage along the welding joints. The final installation is expected to reach a pressure of 5×10⁻⁶ Torr, because aluminum gaskets will be used to seal the window flanges and the system will be evacuated by a turbomolecular pump with a pumping speed of 500 liter/s and an ultimate vacuum of 10⁻¹⁰ Torr. (author)

  7. A scaling analysis of electronic localization in two-dimensional random media

    International Nuclear Information System (INIS)

    Ye Zhen

    2003-01-01

    By an improved scaling analysis, we suggest that there are two possibilities concerning electronic localization in two-dimensional random media. The first is that all electronic states are localized in two dimensions, as conjectured previously. The second possibility is that electronic behavior in two- and three-dimensional random systems is similar, in agreement with a recent calculation based on a direct computation of the conductance using the Kubo formula. In this case, non-localized states are possible in two dimensions and have some peculiar properties. A few predictions are proposed. Moreover, the present analysis accommodates results from the previous scaling analysis

  8. The statistical analysis of the mobility and the labor force use

    Directory of Open Access Journals (Sweden)

    Daniela-Emanuela Dãnãcicã

    2006-05-01

    The paper reviews some of the classical statistical methods used in the analysis of the labor force and proposes new ways of analysis required for adopting optimal economic patterns and strategies. The proposed methods (the linear mean deviation used in the analysis of the external mobility of the labor force, the coefficient of variation used in the analysis of the external mobility of the labor force, and the two-dimensional table used in the calculation of the coefficient of internal mobility) are illustrated by their premises, calculation methodology, practical applications and guidance for their use in adopting and applying optimal economic policy.

  9. Statistical analysis of the hydrodynamic pressure in the near field of compressible jets

    International Nuclear Information System (INIS)

    Camussi, R.; Di Marco, A.; Castelain, T.

    2017-01-01

    Highlights: • Statistical properties of pressure fluctuations retrieved through wavelet analysis • Time delay PDFs approximated by a log-normal distribution • Amplitude PDFs approximated by a Gamma distribution • Random variable PDFs weakly dependent upon position and Mach number • A general stochastic model achieved for the distance dependency - Abstract: This paper is devoted to the statistical characterization of the pressure fluctuations measured in the near field of a compressible jet at two subsonic Mach numbers, 0.6 and 0.9. The analysis is focused on the hydrodynamic pressure measured at different distances from the jet exit and analyzed at the typical frequency associated with the Kelvin–Helmholtz instability. Statistical properties are retrieved by applying the wavelet transform to the experimental data and computing the wavelet scalogram around that frequency. This procedure highlights traces of events that appear intermittently in time and have variable strength. A wavelet-based event-tracking procedure has been applied, providing a statistical characterization of the time delay between successive events and of their energy level. On this basis, two stochastic models are proposed and validated against the experimental data in the different flow conditions
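
    A hedged sketch of a wavelet scalogram evaluated around a single frequency, in the spirit of the analysis described above (using PyWavelets with a Morlet wavelet; the sampling rate, tone frequency, and intermittent gating are invented stand-ins for the experimental signal):

```python
import numpy as np
import pywt

# Hypothetical near-field pressure trace: an intermittent tone at f0
# (standing in for the Kelvin-Helmholtz frequency) in broadband noise
fs, f0 = 51200.0, 2000.0
t = np.arange(0.0, 0.1, 1.0 / fs)
rng = np.random.default_rng(5)
gate = (np.sin(2.0 * np.pi * 30.0 * t) > 0.5).astype(float)   # on/off bursts
p = gate * np.sin(2.0 * np.pi * f0 * t) + 0.5 * rng.standard_normal(t.size)

# Morlet-wavelet scalogram |W(a, t)|^2; the row whose pseudo-frequency
# is closest to f0 tracks the intermittent events in time
scales = np.arange(4, 64)
coeffs, freqs = pywt.cwt(p, scales, "morl", sampling_period=1.0 / fs)
row = np.argmin(np.abs(freqs - f0))
scalogram = np.abs(coeffs[row]) ** 2
print(f"tracking ~{freqs[row]:.0f} Hz, peak event power = {scalogram.max():.2f}")
```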

  10. Evaluating the statistical methodology of randomized trials on dentin hypersensitivity management.

    Science.gov (United States)

    Matranga, Domenica; Matera, Federico; Pizzo, Giuseppe

    2017-12-27

    The present study aimed to evaluate the characteristics and quality of the statistical methodology used in clinical studies on dentin hypersensitivity management. An electronic search was performed for data published from 2009 to 2014 using the PubMed, Ovid/MEDLINE, and Cochrane Library databases. The primary search terms were used in combination. Eligibility criteria included randomized clinical trials that evaluated the efficacy of desensitizing agents in terms of reducing dentin hypersensitivity. A total of 40 studies were considered eligible for the assessment of the quality of their statistical methodology. The four main concerns identified were i) use of nonparametric tests in the presence of large samples, coupled with a lack of information about the normality and equality of variances of the response; ii) lack of P-value adjustment for multiple comparisons; iii) failure to account for interactions between treatment and follow-up time; and iv) no information about the number of teeth examined per patient and a consequent lack of a cluster-specific approach in the data analysis. Owing to these concerns, the statistical methodology was judged to be inappropriate in 77.1% of the 35 studies that used parametric methods. Additional studies with appropriate statistical analysis are required to obtain an appropriate assessment of the efficacy of desensitizing agents.

  11. Categorical data processing for real estate objects valuation using statistical analysis

    Science.gov (United States)

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
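
    A minimal illustration of coding categorical property descriptors for a regression model (one-hot coding via pandas; the columns, values, and prices are hypothetical, and the specific coding scheme proposed in the paper may differ):

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical listings: categorical properties must be coded
# numerically before entering a regression model
df = pd.DataFrame({
    "district":  ["center", "north", "center", "south", "north"],
    "condition": ["new", "renovated", "old", "new", "old"],
    "area_m2":   [54.0, 38.0, 71.0, 45.0, 62.0],
    "price":     [120000, 65000, 140000, 83000, 90000],
})

# One-hot (dummy) coding; drop_first avoids the dummy-variable trap
X = pd.get_dummies(df[["district", "condition", "area_m2"]],
                   columns=["district", "condition"], drop_first=True)
model = LinearRegression().fit(X, df["price"])
print(dict(zip(X.columns, model.coef_.round(1))))
```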

  12. Statistical Compilation of the ICT Sector and Policy Analysis | Page 2 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... to widen and deepen, so too does its impact on economic development. The outcomes of such efforts will subsequently inform policy discourse and ...

  13. Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs

    Science.gov (United States)

    Irvine, Kathryn M.; Rodhouse, Thomas J.

    2014-01-01

    As of 2013, Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data and Greater Yellowstone Network has three years of vegetation data and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data aimed at exploring correlations with climate and weather data is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not been resolved yet. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in future. When data collected by different networks are combined, the survey design describing the merged dataset is (likely) a complex survey design. A complex survey design is the result of combining datasets from different sampling designs. A complex survey design is characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010 Chapter 7 for general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing package. We find that, as written, using lmer or lm for trend detection in a continuous response and clm and clmm for visually estimated cover classes with “raw” GRTS design weights specified for the weight argument leads to substantially

  14. Statistical analysis plan for the EuroHYP-1 trial

    DEFF Research Database (Denmark)

    Winkel, Per; Bath, Philip M; Gluud, Christian

    2017-01-01

    ...Score; (4) brain infarct size at 48 ± 24 hours; (5) EQ-5D-5L score, and (6) WHODAS 2.0 score. Other outcomes are: the primary safety outcome, serious adverse events; and the incremental cost-effectiveness and cost-utility ratios. The analysis sets include (1) the intention-to-treat population, and (2) the per-protocol population. The sample size is estimated at 800 patients (5% type 1 and 20% type 2 errors). All analyses are adjusted for the protocol-specified stratification variable (nationality of centre) and the minimisation variables. In the analysis, we use ordinal regression (the primary outcome), logistic regression (binary outcomes), the general linear model (continuous outcomes), and the Poisson or negative binomial model (rate outcomes). DISCUSSION: Major adjustments compared with the original statistical analysis plan encompass: (1) adjustment of analyses by nationality; (2) power...

  15. NEW PARADIGM OF ANALYSIS OF STATISTICAL AND EXPERT DATA IN PROBLEMS OF ECONOMICS AND MANAGEMENT

    OpenAIRE

    Orlov A. I.

    2014-01-01

    The article is devoted to the methods of analysis of statistical and expert data in problems of economics and management that are discussed in the framework of scientific specialization "Mathematical methods of economy", including organizational-economic and economic-mathematical modeling, econometrics and statistics, as well as economic aspects of decision theory, systems analysis, cybernetics, operations research. The main provisions of the new paradigm of this scientific and practical fiel...

  16. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    Directory of Open Access Journals (Sweden)

    Christos Chalkias

    2016-03-01

    In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index, LSI) approaches is presented. Factors related to the occurrence of landslides, such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA), were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the study area according to the probability level of landslide occurrence. The accuracy of the final map was evaluated by Receiver Operating Characteristic (ROC) analysis based on an independent (validation) dataset of landslide events. The prediction ability was found to be 76%, revealing that the integration of statistical analysis with human expertise can provide an acceptable landslide susceptibility assessment at the regional scale.

  17. Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements

    Science.gov (United States)

    Papa, A. R.; Akel, A. F.

    2009-05-01

    Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil during October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to other traditional methods (Fourier, for example), while at the same time allowing an almost continuous tracking of both the amplitude and frequency of signals as time goes by. This advantage opens some possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is in this sense that we have found what we consider our main goal. Some possible directions for future work are advanced.

  18. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab; Meseguer, José

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability

  19. Electron Waiting Times of a Cooper Pair Splitter

    Science.gov (United States)

    Walldorf, Nicklas; Padurariu, Ciprian; Jauho, Antti-Pekka; Flindt, Christian

    2018-02-01

    Electron waiting times are an important concept in the analysis of quantum transport in nanoscale conductors. Here we show that the statistics of electron waiting times can be used to characterize Cooper pair splitters that create spatially separated spin-entangled electrons. A short waiting time between electrons tunneling into different leads is associated with the fast emission of a split Cooper pair, while long waiting times are governed by the slow injection of Cooper pairs from a superconductor. Experimentally, the waiting time distributions can be measured using real-time single-electron detectors in the regime of slow tunneling, where conventional current measurements are demanding. Our work is important for understanding the fundamental transport processes in Cooper pair splitters and the predictions may be verified using current technology.

  20. Electron Waiting Times of a Cooper Pair Splitter

    DEFF Research Database (Denmark)

    Walldorf, Nicklas; Padurariu, Ciprian; Jauho, Antti-Pekka

    2018-01-01

    Electron waiting times are an important concept in the analysis of quantum transport in nanoscale conductors. Here we show that the statistics of electron waiting times can be used to characterize Cooper pair splitters that create spatially separated spin-entangled electrons. A short waiting time between electrons tunneling into different leads is associated with the fast emission of a split Cooper pair, while long waiting times are governed by the slow injection of Cooper pairs from a superconductor. Experimentally, the waiting time distributions can be measured using real-time single-electron detectors in the regime of slow tunneling, where conventional current measurements are demanding. Our work is important for understanding the fundamental transport processes in Cooper pair splitters and the predictions may be verified using current technology.

  1. Statistical analysis of the W Cyg light curve

    International Nuclear Information System (INIS)

    Klyus, I.A.

    1983-01-01

    A statistical analysis of the light curve of W Cygni has been carried out. The process of brightness variation of the star is shown to be a stationary stochastic one. The hypothesis of stationarity of the process was checked at the significance level α = 0.05. Oscillations of the brightness with average durations of 131 and 250 days have been found. It is proved that the oscillations are narrow-band noise, i.e. cycles. Peaks in the power spectrum corresponding to these cycles exceed the 99% confidence interval. It is also shown that the oscillations are independent
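
    For unevenly sampled light curves, a Lomb-Scargle periodogram is a standard way to expose such cycles (a sketch under that assumption, not the authors' procedure; the sampling epochs and the injected 131-day cycle are synthetic):

```python
import numpy as np
from scipy.signal import lombscargle

# Hypothetical unevenly sampled light curve with a ~131-day cycle
rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0.0, 3000.0, 400))     # observation epochs (days)
mag = 0.3 * np.sin(2.0 * np.pi * t / 131.0) + 0.1 * rng.standard_normal(t.size)

periods = np.linspace(50.0, 400.0, 2000)
omega = 2.0 * np.pi / periods                  # angular frequencies
power = lombscargle(t, mag - mag.mean(), omega, normalize=True)
print(f"strongest cycle near {periods[power.argmax()]:.0f} days")
```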

  2. Teaching Case: Analysis of an Electronic Voting System

    Science.gov (United States)

    Thompson, Nik; Toohey, Danny

    2014-01-01

    This teaching case discusses the analysis of an electronic voting system. The development of the case was motivated by research into information security and management, but as it includes procedural aspects, organizational structure and personnel, it is a suitable basis for all aspects of systems analysis, planning and design tasks. The material…

  3. Statistical Analysis of a Method to Predict Drug-Polymer Miscibility

    DEFF Research Database (Denmark)

    Knopp, Matthias Manne; Olesen, Niels Erik; Huang, Yanbin

    2016-01-01

    In this study, a method proposed to predict drug-polymer miscibility from differential scanning calorimetry measurements was subjected to statistical analysis. The method is relatively fast and inexpensive and has gained popularity as a result of the increasing interest in the formulation of drug...... as provided in this study. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci....

  4. A Bifactor Approach to Model Multifaceted Constructs in Statistical Mediation Analysis

    Science.gov (United States)

    Gonzalez, Oscar; MacKinnon, David P.

    2018-01-01

    Statistical mediation analysis allows researchers to identify the most important mediating constructs in the causal process studied. Identifying specific mediators is especially relevant when the hypothesized mediating construct consists of multiple related facets. The general definition of the construct and its facets might relate differently to…

  5. Statistical Compilation of the ICT Sector and Policy Analysis | Page 4 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  6. Statistical Compilation of the ICT Sector and Policy Analysis | Page 3 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  7. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    CERN Document Server

    Antcheva, I; Bellenot, B; Biskup,1, M; Brun, R; Buncic, N; Canal, Ph; Casadei, D; Couet, O; Fine, V; Franco,1, L; Ganis, G; Gheata, A; Gonzalez Maline, D; Goto, M; Iwaszkiewicz, J; Kreshuk, A; Marcos Segura, D; Maunder, R; Moneta, L; Naumann, A; Offermann, E; Onuchin, V; Panacek, S; Rademakers, F; Russo, P; Tadel, M

    2009-01-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariat...

  8. Quantile regression for the statistical analysis of immunological data with many non-detects.

    Science.gov (United States)

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an application to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects.
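
    A minimal sketch of median (quantile) regression on data with non-detects, using statsmodels; the detection limit, dose-response shape, and variable names are invented for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical assay: values below the detection limit are recorded at
# the limit itself, so the lower tail is unusable but the median is not
rng = np.random.default_rng(7)
dose = rng.uniform(0.0, 10.0, 200)
conc = np.exp(1.5 + 0.2 * dose + rng.standard_normal(200))
lod = 3.0
df = pd.DataFrame({"dose": dose, "conc": np.maximum(conc, lod)})

# Median (q = 0.5) regression is unaffected by censoring below the
# detection limit as long as fewer than half of the observations at a
# given dose are non-detects
fit = smf.quantreg("conc ~ dose", df).fit(q=0.5)
print(fit.params)
```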

  9. Analysis of electron interactions in dielectric gases

    International Nuclear Information System (INIS)

    Olivet, Aurelio; Duque, Daniel; Vega, Lourdes F.

    2007-01-01

    We present and discuss results concerning the electron interaction processes of dielectric gases and their relationship with the macroscopic behavior of these gases, in particular with their dielectric strength. The analysis is based on calculating the energies of reactions for molecular ionization, dissociative ionization, parent negative ion formation, and dissociative electron attachment processes. We hypothesize that the energy required for a reduced number of processes that take place in electrically stressed gases could be related to the gas's capability to manage the electron flow during an electrical discharge. All calculations were done with semiempirical quantum chemistry methods, including an initial optimization of the molecular geometry and heat of formation of the dielectric gases and of all species that appear during the electron interaction reactions. The performance of the semiempirical methods Austin Model 1 (AM1) and Parametric Method 3 (PM3) was compared for several compounds, PM3 being superior in most cases. Calculations performed for a sample of nine dielectric gases show that electron attachment and detachment processes occur in different energy bands that do not overlap for any value of the dielectric strength. We have also analyzed the relationship between dielectric strength and two physical properties: electron affinity and ionization energy. Calculations performed for 43 dielectric gases show no clear correlation between them, although certain guidelines for the qualitative estimation of dielectric strength can still be assessed

  10. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    Science.gov (United States)

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

    Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software is written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
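
    A hedged sketch of the first workflow step only, total-intensity-sum normalization of a fragment-by-sample matrix (mapDIA itself is a C++ tool; this Python fragment merely illustrates the idea, and the choice of the mean column total as the common scale is an assumption):

```python
import numpy as np

# Total-intensity-sum normalization of a fragment-by-sample matrix:
# rescale each sample (column) so that all columns share a common total
rng = np.random.default_rng(8)
intensities = rng.lognormal(mean=8.0, size=(1000, 6))

col_sums = intensities.sum(axis=0)
target = col_sums.mean()                        # common scale (assumption)
normalized = intensities * (target / col_sums)
print(normalized.sum(axis=0))                   # identical column totals
```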

  11. GIS and statistical analysis for landslide susceptibility mapping in the Daunia area, Italy

    Science.gov (United States)

    Mancini, F.; Ceppi, C.; Ritrovato, G.

    2010-09-01

    This study focuses on landslide susceptibility mapping in the Daunia area (Apulian Apennines, Italy) and achieves this by using a multivariate statistical method and data processing in a Geographical Information System (GIS). The Logistic Regression (hereafter LR) method was chosen to produce a susceptibility map over an area of 130 000 ha where small settlements are historically threatened by landslide phenomena. By means of LR analysis, the tendency to landslide occurrences was, therefore, assessed by relating a landslide inventory (dependent variable) to a series of causal factors (independent variables) which were managed in the GIS, while the statistical analyses were performed by means of the SPSS (Statistical Package for the Social Sciences) software. The LR analysis produced a reliable susceptibility map of the investigated area and the probability level of landslide occurrence was ranked in four classes. The overall performance achieved by the LR analysis was assessed by local comparison between the expected susceptibility and an independent dataset extrapolated from the landslide inventory. Of the samples classified as susceptible to landslide occurrences, 85% correspond to areas where landslide phenomena have actually occurred. In addition, the consideration of the regression coefficients provided by the analysis demonstrated that a major role is played by the "land cover" and "lithology" causal factors in determining the occurrence and distribution of landslide phenomena in the Apulian Apennines.
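
    A compact sketch of the LR workflow described above (scikit-learn instead of SPSS; the causal factors, coefficients, and the quartile-based four-class ranking are illustrative assumptions, not the study's actual model):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical raster cells: causal factors as columns, landslide
# inventory (0/1) as the dependent variable
rng = np.random.default_rng(9)
n = 5000
slope = rng.uniform(0.0, 45.0, n)              # slope angle (degrees)
lith = rng.integers(0, 3, n)                   # lithology class (coded)
logit = -6.0 + 0.12 * slope + 0.8 * (lith == 2)
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([slope, lith == 1, lith == 2])
model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]               # susceptibility per cell

# Rank the probability of occurrence into four susceptibility classes
classes = np.digitize(p, np.quantile(p, [0.25, 0.5, 0.75]))
print(np.bincount(classes))                    # cells per class
```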

  12. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
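
    A tiny worked example of the diagnostic-test quantities reviewed above (the 2×2 counts are invented):

```python
# Core diagnostic-test quantities from a 2x2 table (invented counts)
tp, fn = 90, 10        # diseased patients: test positive / negative
fp, tn = 30, 170       # healthy patients:  test positive / negative

sensitivity = tp / (tp + fn)                 # P(test+ | disease)
specificity = tn / (tn + fp)                 # P(test- | no disease)
accuracy = (tp + tn) / (tp + fn + fp + tn)
lr_pos = sensitivity / (1.0 - specificity)   # positive likelihood ratio
lr_neg = (1.0 - sensitivity) / specificity   # negative likelihood ratio

print(f"Se = {sensitivity:.2f}, Sp = {specificity:.2f}, "
      f"Acc = {accuracy:.2f}, LR+ = {lr_pos:.1f}, LR- = {lr_neg:.2f}")
```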

  13. On the spectrum of the secondary auroral electrons

    International Nuclear Information System (INIS)

    Fung, S.F.; Hoffman, R.A.

    1988-01-01

    We present the results from a statistical study of the spectral characteristics of the (power-law portion of the) secondary electrons associated with inverted-V electron precipitation events. A total of 106 inverted-V events observed at low altitudes were examined. The secondary electron spectra follow a power law of the form E^(-γ), with an average γ = 1.85. This spectral parameter is relatively insensitive to the location of the inverted-V events, but it depends on the primary beam parameters. We compare our results with the predictions of, and assess the applicability of, the current theories on the generation of the low-energy auroral power-law secondary electrons. Our analysis shows that collisional processes are the dominant source of the secondary electrons. Copyright American Geophysical Union 1988

  14. Statistical Analysis of Spectral Properties and Prosodic Parameters of Emotional Speech

    Science.gov (United States)

    Přibil, J.; Přibilová, A.

    2009-01-01

    The paper addresses the reflection of microintonation and spectral properties in male and female acted emotional speech. The microintonation component of speech melody is analyzed with regard to its spectral and statistical parameters. According to psychological research on emotional speech, different emotions are accompanied by different spectral noise. We control its amount by spectral flatness, according to which high-frequency noise is mixed into voiced frames during cepstral speech synthesis. Our experiments are aimed at statistical analysis of cepstral coefficient values and ranges of spectral flatness in three emotions (joy, sadness, anger) and a neutral state for comparison. Calculated histograms of the spectral flatness distribution are visually compared and modelled by a Gamma probability distribution. Histograms of the cepstral coefficient distribution are evaluated and compared using skewness and kurtosis. The statistical results show good agreement between male and female voices for all emotional states portrayed by several Czech and Slovak professional actors.
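
    Spectral flatness is the ratio of the geometric mean to the arithmetic mean of the power spectrum, and a Gamma fit of the kind the paper applies to its histograms can be reproduced with scipy. A minimal sketch on synthetic frames (the frame data and parameters are placeholders, not the paper's cepstral pipeline):

```python
# Sketch: per-frame spectral flatness and a Gamma-distribution fit to its
# distribution, loosely following the abstract. Frames are synthetic noise.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
frames = rng.normal(size=(200, 512))   # 200 hypothetical voiced frames

def spectral_flatness(frame):
    power = np.abs(np.fft.rfft(frame))**2 + 1e-12
    return np.exp(np.mean(np.log(power))) / np.mean(power)

sf = np.array([spectral_flatness(f) for f in frames])

# Fit a Gamma distribution to the flatness values, as the paper models its
# histograms; skewness and kurtosis are the statistics it uses to compare
# cepstral-coefficient distributions.
shape, loc, scale = stats.gamma.fit(sf)
print(f"gamma fit: shape={shape:.2f} scale={scale:.3f}")
print(f"skewness={stats.skew(sf):.2f} kurtosis={stats.kurtosis(sf):.2f}")
```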

  15. Multivariate statistical analysis of wildfires in Portugal

    Science.gov (United States)

    Costa, Ricardo; Caramelo, Liliana; Pereira, Mário

    2013-04-01

    Several studies demonstrate that wildfires in Portugal present high temporal and spatial variability as well as cluster behavior (Pereira et al., 2005, 2011). This study aims to contribute to the characterization of the fire regime in Portugal with a multivariate statistical analysis of the time series of the number of fires and area burned in Portugal during the 1980-2009 period. The data used in the analysis are an extended version of the Portuguese Rural Fire Database (PRFD) (Pereira et al., 2011), provided by the National Forest Authority (Autoridade Florestal Nacional, AFN), the Portuguese Forest Service, and include information for more than 500,000 fire records. Several advanced techniques exist for examining the relationships among multiple time series simultaneously (e.g., canonical correlation analysis, principal components analysis, factor analysis, path analysis, multiple analysis of variance, clustering methods); this study compares and discusses the results obtained with these different techniques.
    Pereira, M.G., Trigo, R.M., DaCamara, C.C., Pereira, J.M.C., Leite, S.M., 2005: "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology, 129, 11-25.
    Pereira, M.G., Malamud, B.D., Trigo, R.M., and Alves, P.I.: The history and characteristics of the 1980-2005 Portuguese rural fire database, Nat. Hazards Earth Syst. Sci., 11, 3343-3358, doi:10.5194/nhess-11-3343-2011, 2011.
    This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under project FCOMP-01-0124-FEDER-022692, project FLAIR (PTDC/AAC-AMB/104702/2008), and the EU 7th Framework Programme through FUME (contract number 243888).
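
    Of the techniques listed, principal components analysis is the simplest to sketch: applied to the standardized fire series, it identifies the dominant joint modes of interannual variability. The series below are random placeholders, not the PRFD data:

```python
# Sketch: PCA applied to standardized fire-regime time series, one of the
# multivariate techniques the abstract compares. The two series stand in
# for annual burned area and number of fires, 1980-2009.
import numpy as np

rng = np.random.default_rng(3)
years = 30
X = rng.lognormal(mean=[8, 6], sigma=1.0, size=(years, 2))  # [area, count]

Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each series
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
explained = eigvals[::-1] / eigvals.sum()
print(f"variance explained by PC1: {explained[0]:.0%}")
```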

  16. Cognition of and Demand for Education and Teaching in Medical Statistics in China: A Systematic Review and Meta-Analysis

    Science.gov (United States)

    Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong

    2015-01-01

    Background Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the cognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. Objectives This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. Methods We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with subgroup analysis for undergraduates, graduates, and medical staff. Results There are substantial differences in the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. Conclusion The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.
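
    The pooling step behind a meta-analysis of proportions like this one can be illustrated with inverse-variance weighting of logit-transformed proportions; the study counts below are invented, and the paper's actual model and subgrouping are more elaborate:

```python
# Sketch: inverse-variance pooling of logit-transformed proportions,
# the basic machinery behind a meta-analysis of non-controlled studies.
# Event counts and sample sizes are invented for illustration.
import numpy as np

events = np.array([147, 182, 79])   # e.g., respondents aware of theory
totals = np.array([200, 300, 200])

p = events / totals
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (totals - events)   # variance of the logit

w = 1 / var                                # fixed-effect weights
pooled_logit = np.sum(w * logit) / np.sum(w)
pooled = 1 / (1 + np.exp(-pooled_logit))
print(f"pooled proportion = {pooled:.1%}")
```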

  17. [Methods of statistical analysis in differential diagnostics of the degree of brain glioma anaplasia during preoperative stage].

    Science.gov (United States)

    Glavatskiĭ, A Ia; Guzhovskaia, N V; Lysenko, S N; Kulik, A V

    2005-12-01

    The authors proposed an approach to the preoperative diagnosis of the degree of anaplasia of supratentorial brain gliomas using statistical analysis methods. It relies on a comprehensive examination of 934 patients with grade I-IV anaplasia who were treated at the Institute of Neurosurgery from 1990 to 2004. The use of statistical analysis methods for differential diagnosis of the degree of brain glioma anaplasia may optimize the diagnostic algorithm, increase the reliability of the obtained data, and in some cases avoid unwarranted surgical interventions. Clinically important signs for the statistical methods directed at preoperative diagnosis of brain glioma anaplasia have been defined.

  18. Statistics without Tears: Complex Statistics with Simple Arithmetic

    Science.gov (United States)

    Smith, Brian

    2011-01-01

    One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
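
    The time series decomposition the abstract highlights really can be done with simple arithmetic: a centred moving average estimates the trend, and averaging the detrended values by season gives the seasonal component. A small sketch on invented quarterly data:

```python
# Sketch: additive time-series decomposition with nothing but averages,
# in the spirit of the article's title. Quarterly data are invented.
import numpy as np

y = np.array([12, 20, 16, 8, 14, 23, 18, 10, 16, 25, 20, 12], float)
period = 4

# Trend: centred moving average spanning one full period.
kernel = np.array([0.5, 1, 1, 1, 0.5]) / period
trend = np.convolve(y, kernel, mode="valid")   # loses period/2 at each end

# Seasonal component: mean detrended value for each quarter.
detrended = y[period // 2 : period // 2 + trend.size] - trend
quarters = (np.arange(detrended.size) + period // 2) % period
seasonal = np.array([detrended[quarters == q].mean() for q in range(period)])
seasonal -= seasonal.mean()                    # normalize to sum to ~0

print("trend:", np.round(trend, 1))
print("seasonal:", np.round(seasonal, 1))
```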

  19. [The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].

    Science.gov (United States)

    Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2017-01-01

    Statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference consists of drawing conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was drawn. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose a statistical test it is necessary to take into account three aspects: the research design, the number of measurements, and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric; parametric tests can be used only if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
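
    The decision rule described here, check normality first and then choose a parametric or nonparametric test, is easy to illustrate for two independent groups; a hedged sketch on synthetic data:

```python
# Sketch: pick a test for two independent groups based on a normality
# check, following the rule of thumb in the abstract. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
group_a = rng.normal(50, 10, 30)
group_b = rng.normal(55, 10, 30)

# Shapiro-Wilk normality test on each group.
normal = all(stats.shapiro(g).pvalue > 0.05 for g in (group_a, group_b))
if normal:
    # Parametric: data are consistent with a normal distribution.
    result = stats.ttest_ind(group_a, group_b)
else:
    # Nonparametric alternative when normality is rejected.
    result = stats.mannwhitneyu(group_a, group_b)

print(type(result).__name__, f"p = {result.pvalue:.3f}")
```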

  20. The research protocol VI: How to choose the appropriate statistical test. Inferential statistics

    Directory of Open Access Journals (Sweden)

    Eric Flores-Ruiz

    2017-10-01

    Statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference consists of drawing conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was drawn. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose a statistical test it is necessary to take into account three aspects: the research design, the number of measurements, and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric; parametric tests can be used only if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.