WorldWideScience

Sample records for high statistics study

  1. High statistics study of ω⁰ production

    International Nuclear Information System (INIS)

    Shaevitz, M.H.; Abolins, M.A.; Dankowych, J.A.

    1974-01-01

    Results from a study of π⁻p → ω⁰n at 6.0 GeV/c, based on 28,000 events from a charged and neutral spectrometer, are reported. Background under the ω⁰ is only 7 percent, a large improvement over deuterium bubble chamber work. Density matrix elements, projected cross sections and effective trajectories for natural and unnatural exchanges are presented.

  2. Statistical study of high-latitude plasma flow during magnetospheric substorms

    Directory of Open Access Journals (Sweden)

    G. Provan

    2004-11-01

    We have utilised the near-global imaging capabilities of the Northern Hemisphere SuperDARN radars to perform a statistical superposed epoch analysis of high-latitude plasma flows during magnetospheric substorms. The study involved 67 substorms, identified using the IMAGE FUV space-borne auroral imager. A substorm co-ordinate system was developed, centred on the magnetic local time and magnetic latitude of substorm onset determined from the auroral images. The plasma flow vectors from all 67 intervals were combined, creating global statistical plasma flow patterns and backscatter occurrence statistics during the substorm growth and expansion phases. The commencement of the substorm growth phase was clearly observed in the radar data 18-20 min before substorm onset, with an increase in the anti-sunward component of the plasma velocity flowing across the dawn sector of the polar cap and a peak in the dawn-to-dusk transpolar voltage. Nightside backscatter moved to lower latitudes as the growth phase progressed. At substorm onset a flow suppression region was observed on the nightside, with fast flows surrounding the suppressed flow region. The dawn-to-dusk transpolar voltage increased from ~40 kV just before substorm onset to ~75 kV 12 min after onset. The low-latitude return flow started to increase at substorm onset and continued to increase until 8 min after onset. The velocity flowing across the polar cap peaked 12-14 min after onset. This increase in the flux of the polar cap and the excitation of large-scale plasma flow occurred even though the IMF Bz component was increasing (becoming less negative) during most of this time. This study is the first to statistically prove that nightside reconnection creates magnetic flux and excites high-latitude plasma flow in a similar way to dayside reconnection, and that dayside and nightside reconnection are two separate time-dependent processes.
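
    A minimal sketch of the superposed-epoch step described above: segments of a measured series are aligned on a list of event onset times and averaged. All inputs here (`flow`, `onsets`) are synthetic placeholders, not the SuperDARN/IMAGE FUV data.

```python
import numpy as np

def superposed_epoch(times, series, onsets, window=(-20.0, 20.0)):
    """Average `series` over epochs aligned on `onsets` (units of `times`)."""
    dt = times[1] - times[0]
    lags = np.arange(window[0], window[1] + dt, dt)
    segments = []
    for t0 in onsets:
        idx = np.searchsorted(times, t0 + lags)
        if idx[-1] < len(series):              # keep fully covered epochs only
            segments.append(series[idx])
    return lags, np.mean(segments, axis=0)

# Toy data: 67 synthetic "onsets" embedded in a noisy background flow.
rng = np.random.default_rng(1)
t = np.arange(0.0, 10000.0)                    # minutes
flow = 300.0 + 20.0 * rng.standard_normal(t.size)
onsets = rng.uniform(100.0, 9900.0, size=67)

lags, avg = superposed_epoch(t, flow, onsets)
print(lags[0], lags[-1], round(float(avg.mean()), 1))
```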

  3. West Valley high-level nuclear waste glass development: a statistically designed mixture study

    Energy Technology Data Exchange (ETDEWEB)

    Chick, L.A.; Bowen, W.M.; Lokken, R.O.; Wald, J.W.; Bunnell, L.R.; Strachan, D.M.

    1984-10-01

    The first full-scale conversion of high-level commercial nuclear wastes to glass in the United States will be conducted at West Valley, New York, by West Valley Nuclear Services Company, Inc. (WVNS), for the US Department of Energy. Pacific Northwest Laboratory (PNL) is supporting WVNS in the design of the glass-making process and the chemical formulation of the glass. This report describes the statistically designed study performed by PNL to develop the glass composition recommended for use at West Valley. The recommended glass contains 28 wt% waste, as limited by process requirements. The waste loading and the silica content (45 wt%) are similar to those in previously developed waste glasses; however, the new formulation contains more calcium and less boron. A series of tests verified that the increased calcium results in improved chemical durability and does not adversely affect the other modeled properties. The optimization study assessed the effects of seven oxide components on glass properties. Over 100 melts combining the seven components into a wide variety of statistically chosen compositions were tested. Viscosity, electrical conductivity, thermal expansion, crystallinity, and chemical durability were measured and empirically modeled as a function of the glass composition. The mathematical models were then used to predict the optimum formulation. This glass was tested and adjusted to arrive at the final composition recommended for use at West Valley. 56 references, 49 figures, 18 tables.
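
    The empirical composition-property modeling described in this record can be illustrated with a first-order Scheffé mixture model, fitted without an intercept because the component fractions sum to one. The compositions and property values below are invented placeholders, not the PNL measurements.

```python
import numpy as np

# Hypothetical glass compositions: columns are component mass fractions
# (each row sums to 1); y is a measured property, e.g. log10(viscosity).
X = np.array([
    [0.45, 0.28, 0.10, 0.17],
    [0.50, 0.25, 0.12, 0.13],
    [0.42, 0.30, 0.08, 0.20],
    [0.48, 0.26, 0.15, 0.11],
    [0.44, 0.29, 0.11, 0.16],
    [0.47, 0.24, 0.13, 0.16],
])
y = np.array([1.51, 1.76, 1.32, 1.65, 1.48, 1.60])

# First-order Scheffe model: y = sum_i b_i * x_i (no intercept term).
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted component coefficients:", b.round(2))

# Screen a candidate composition before melting it.
candidate = np.array([0.46, 0.27, 0.12, 0.15])
print("predicted property:", round(float(candidate @ b), 2))
```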

  4. Statistical study of overvoltages by maneuvering in switches in high voltage using EMTP-RV

    International Nuclear Information System (INIS)

    Dominguez Herrera, Diego Armando

    2013-01-01

    The transient overvoltages produced by switching operations are studied statistically, by varying the sequential closing times of the switches in networks above 230 kV according to typical time delays and standard-deviation ranges, using the tool EMTP-RV (ElectroMagnetic Transient Program, Restructured Version). A conceptual framework is developed for the electromagnetic switching transients of three-phase switches installed at nominal voltages above 230 kV. The methodology established for the execution of statistical studies of switching overvoltages is reviewed and evaluated by simulating two fictitious cases in EMTP-RV.
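
    A sketch of the statistical-switch logic such EMTP studies rely on: pole closing times are drawn at random around a mean with a given standard deviation, and one time-domain run is made per draw. The Gaussian distribution and the `peak_overvoltage` function are assumptions, a toy stand-in for the actual EMTP-RV network simulation.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200                                  # number of statistical switching shots
t_mean, t_sigma = 0.010, 0.001           # mean closing time and std dev, seconds

# Independent Gaussian closing times for the three breaker poles.
t_close = rng.normal(t_mean, t_sigma, size=(N, 3))

def peak_overvoltage(tc):
    # Toy surrogate for one time-domain run: severity is driven by the worst
    # pole-closing instant relative to the 50 Hz voltage wave.
    phase = 2.0 * np.pi * 50.0 * tc
    return 1.0 + 1.5 * np.abs(np.sin(phase)).max()

pu = np.array([peak_overvoltage(tc) for tc in t_close])
print(f"mean = {pu.mean():.2f} p.u., "
      f"2% statistical overvoltage = {np.quantile(pu, 0.98):.2f} p.u.")
```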

  5. A statistical study of high-altitude electric fields measured on the Viking satellite

    International Nuclear Information System (INIS)

    Lindqvist, P.A.; Marklund, G.T.

    1990-01-01

    Characteristics of high-altitude data from the Viking electric field instrument are presented in a statistical study based on 109 Viking orbits. The study focuses in particular on the signatures of, and relationships between, various parameters measured by the electric field instrument, such as the parallel and transverse (to B) components of the electric field and the electric field variability. A major goal of the Viking mission was to investigate the occurrence and properties of parallel electric fields and their role in the auroral acceleration process. The results in this paper on the altitude distribution of the electric field variability confirm earlier findings on the distribution of small-scale electric fields and indicate the presence of parallel fields up to about 11,000 km altitude. The directly measured parallel electric field is also investigated in some detail. It is in general directed upward with an average value of 1 mV/m, but depends on, for example, altitude and plasma density. Possible sources of error in the measurement of the parallel field are also considered and accounted for.

  6. Characteristics of high altitude oxygen ion energization and outflow as observed by Cluster: a statistical study

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, H.; Waara, M.; Arvelius, S.; Yamauchi, M.; Lundin, R. [Inst. of Space Physics, Kiruna (Sweden); Marghitu, O. [Max-Planck-Inst. fuer Extraterrestriche Physik, Garching (Germany); Inst. for Space Sciences, Bucharest (Romania); Bouhram, M. [Max-Planck-Inst. fuer Extraterrestriche Physik, Garching (Germany); CETP-CNRS, Saint-Maur (France); Hobara, Y. [Inst. of Space Physics, Kiruna (Sweden); Univ. of Sheffield, Sheffield (United Kingdom); Reme, H.; Sauvaud, J.A.; Dandouras, I. [Centre d' Etude Spatiale des Rayonnements, Toulouse (France); Balogh, A. [Imperial Coll. of Science, Technology and Medicine, London (United Kingdom); Kistler, L.M. [Univ. of New Hampshire, Durham (United States); Klecker, B. [Max-Planck-Inst. fuer Extraterrestriche Physik, Garching (Germany); Carlson, C.W. [Space Science Lab., Univ. of California, Berkeley (United States); Bavassano-Cattaneo, M.B. [Ist. di Fisica dello Spazio Interplanetario, Roma (Italy); Korth, A. [Max-Planck-Inst. fuer Sonnensystemforschung, Katlenburg-Lindau (Germany)

    2006-07-01

    The results of a statistical study of oxygen ion outflow using Cluster data obtained at high altitude above the polar cap are reported. Moment data for both hydrogen ions (H⁺) and oxygen ions (O⁺) from 3 years (2001-2003) of spring orbits (January to May) have been used. The altitudes covered were mainly in the range 5-12 R_E geocentric distance. It was found that O⁺ is significantly transversely energized at high altitudes, indicated both by high perpendicular temperatures for low magnetic field values as well as by a tendency towards higher perpendicular than parallel temperature distributions for the highest observed temperatures. The O⁺ parallel bulk velocity increases with altitude, in particular for the lowest observed altitude intervals. O⁺ parallel bulk velocities in excess of 60 km s⁻¹ were found mainly at higher altitudes corresponding to magnetic field strengths of less than 100 nT. For the highest observed parallel bulk velocities of O⁺ the thermal velocity exceeds the bulk velocity, indicating that the beam-like character of the distribution is lost. The parallel bulk velocities of H⁺ and O⁺ were found to be typically close to the same throughout the observation interval when the H⁺ bulk velocity was calculated for all pitch angles. When the H⁺ bulk velocity was calculated for upward moving particles only, the H⁺ parallel bulk velocity was typically higher than that of O⁺. The parallel bulk velocity is close to the same for a wide range of relative abundances of the two ion species, including when the O⁺ ions dominate. The thermal velocity of O⁺ was always well below that of H⁺. Thus perpendicular energization that is more effective for O⁺ takes place, but this is not enough to explain the close to similar parallel velocities. Further parallel acceleration must occur. The results presented constrain the models of perpendicular heating and parallel acceleration.

  7. Characteristics of high altitude oxygen ion energization and outflow as observed by Cluster: a statistical study

    Directory of Open Access Journals (Sweden)

    H. Nilsson

    2006-05-01

    The results of a statistical study of oxygen ion outflow using Cluster data obtained at high altitude above the polar cap are reported. Moment data for both hydrogen ions (H+) and oxygen ions (O+) from 3 years (2001-2003) of spring orbits (January to May) have been used. The altitudes covered were mainly in the range 5–12 R_E geocentric distance. It was found that O+ is significantly transversely energized at high altitudes, indicated both by high perpendicular temperatures for low magnetic field values as well as by a tendency towards higher perpendicular than parallel temperature distributions for the highest observed temperatures. The O+ parallel bulk velocity increases with altitude, in particular for the lowest observed altitude intervals. O+ parallel bulk velocities in excess of 60 km s⁻¹ were found mainly at higher altitudes corresponding to magnetic field strengths of less than 100 nT. For the highest observed parallel bulk velocities of O+ the thermal velocity exceeds the bulk velocity, indicating that the beam-like character of the distribution is lost. The parallel bulk velocities of H+ and O+ were found to be typically close to the same throughout the observation interval when the H+ bulk velocity was calculated for all pitch angles. When the H+ bulk velocity was calculated for upward moving particles only, the H+ parallel bulk velocity was typically higher than that of O+. The parallel bulk velocity is close to the same for a wide range of relative abundances of the two ion species, including when the O+ ions dominate. The thermal velocity of O+ was always well below that of H+. Thus perpendicular energization that is more effective for O+ takes place, but this is not enough to explain the close to similar parallel velocities. Further parallel acceleration must occur. The results presented constrain the models of perpendicular heating and parallel acceleration, in particular centrifugal acceleration of the outflowing ions.

  8. Cluster survey of the high-altitude cusp properties: a three-year statistical study

    Directory of Open Access Journals (Sweden)

    B. Lavraud

    2004-09-01

    The global characteristics of the high-altitude cusp and its surrounding regions are investigated using a three-year statistical survey based on data obtained by the Cluster spacecraft. The analysis involves an elaborate orbit-sampling methodology that uses a model field and takes into account the actual solar wind conditions and level of geomagnetic activity. The spatial distributions of the magnetic field and various plasma parameters in the vicinity of the low magnetic field exterior cusp are determined, and it is found that: (1) the magnetic field distribution shows the presence of an intermediate region between the magnetosheath and the magnetosphere: the exterior cusp; (2) this region is characterized by the presence of dense plasma of magnetosheath origin; a comparison with the Tsyganenko (1996) magnetic field model shows that it is diamagnetic in nature; (3) the spatial distributions show that three distinct boundaries with the lobes, the dayside plasma sheet and the magnetosheath surround the exterior cusp; (4) the external boundary with the magnetosheath has a sharp bulk velocity gradient, as well as a density decrease and temperature increase as one goes from the magnetosheath to the exterior cusp; (5) while the two inner boundaries form a funnel, the external boundary shows no clear indentation; (6) the plasma and magnetic pressure distributions suggest that the exterior cusp is in equilibrium with its surroundings in a statistical sense; and (7) a preliminary analysis of the bulk flow distributions suggests that the exterior cusp is stagnant under northward IMF conditions but convective under southward IMF conditions.

  9. A statistical study towards high-mass BGPS clumps with the MALT90 survey

    Science.gov (United States)

    Liu, Xiao-Lan; Xu, Jin-Long; Ning, Chang-Chun; Zhang, Chuan-Peng; Liu, Xiao-Tao

    2018-01-01

    In this work, we perform a statistical investigation of 50 high-mass clumps using data from the Bolocam Galactic Plane Survey (BGPS) and the Millimetre Astronomy Legacy Team 90-GHz survey (MALT90). Eleven dense molecular lines (N2H+(1–0), HNC(1–0), HCO+(1–0), HCN(1–0), HN13C(1–0), H13CO+(1–0), C2H(1–0), HC3N(10–9), SiO(2–1), 13CS(2–1) and HNCO(4(0,4)–3(0,3))) are detected. N2H+ and HNC are shown to be good tracers for clumps in various evolutionary stages, since they are detected in all the fields. The detection rates of N-bearing molecules decrease as the clumps evolve, but those of O-bearing species increase with evolution. Furthermore, the abundance ratios [N2H+]/[HCO+] and log([HC3N]/[HCO+]) decline with log([HCO+]) as two linear functions, respectively. This suggests that N2H+ and HC3N transform to HCO+ as the clumps evolve. We also find that C2H is the most abundant molecule, with an abundance of order 10⁻⁸. In addition, three new infall candidates, G010.214–00.324, G011.121–00.128 and G012.215–00.118(a), are discovered to have large-scale infall motions and infall rates of order 10⁻³ M⊙ yr⁻¹.
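
    The two linear declines quoted above can be recovered with ordinary least-squares fits; the abundances below are invented placeholders, not the MALT90-derived values.

```python
import numpy as np

log_hcop   = np.array([-9.8, -9.5, -9.2, -9.0, -8.7, -8.4])  # log10 [HCO+]
ratio_n2hp = np.array([4.1, 3.4, 2.6, 2.2, 1.5, 0.9])        # [N2H+]/[HCO+]
log_hc3n_r = np.array([-0.2, -0.5, -0.9, -1.1, -1.5, -1.9])  # log([HC3N]/[HCO+])

for name, y in [("[N2H+]/[HCO+]", ratio_n2hp),
                ("log([HC3N]/[HCO+])", log_hc3n_r)]:
    slope, intercept = np.polyfit(log_hcop, y, 1)
    print(f"{name} = {slope:+.2f} * log[HCO+] {intercept:+.2f}")
```

    Negative fitted slopes are the quantitative signature that N2H+ and HC3N are converted to HCO+ as the clumps evolve.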

  10. High statistics study of $\bar{p}p$ annihilation physics at the EHS

    CERN Multimedia

    2002-01-01

    We propose to perform an experiment to study antiproton-proton annihilations at 50 GeV/c, using the EHS facility. We will study annihilation processes in detail in terms of the underlying quark structure, and compare our results with the predictions of the three principal models: quark fusion, fragmentation and recombination. These models achieved satisfactory results for pion and non-strange resonance production but failed for strange particle production. In fact, a study of the production of strange particles could yield valuable clues for understanding the annihilation mechanisms. A possible explanation of their production is that fast gluons materialize as strange quark pairs in the annihilation process. We expect to obtain 35,000 well-identified annihilation events.

  11. High-Statistics Study of the β+/EC-Decay of 110In

    Science.gov (United States)

    Diaz Varela, A.; Garrett, P. E.; Ball, G. C.; Banjay, J. C.; Cross, D. S.; Demand, G. A.; Finlay, P.; Garnsworthy, A. B.; Green, K. L.; Hackman, G.; Kulp, W. D.; Leach, K. G.; Orce, J. N.; Phillips, A. A.; Rand, E. T.; Svensson, C. E.; Sumithrarachchi, C.; Triambak, S.; Wong, J.; Wood, J. L.; Yates, S. W.

    2014-03-01

    A study of the 110In β+/EC decay was performed at the TRIUMF Isotope Separator and Accelerator (ISAC) facility to probe the nuclear structure of 110Cd. The data were collected in scaled-down γ-ray singles, γ-γ coincidence, and γ-electron coincidence mode. The data were sorted, and a random-background-subtracted γ-γ matrix was created containing a total of 850 million events. We expanded the level scheme of 110Cd significantly by identifying 75 levels below 3.8 MeV, including 12 new ones, and increased the number of previously observed transitions from these levels to 273. The γ-ray branching intensities have been extracted through an analysis of the coincidence intensities. The branching ratios were combined with a reanalysis of lifetime measurements obtained in an (n, n'γ) reaction with monoenergetic neutrons to calculate B(E2) values, and these results have led to the proposal of a γ-soft rotor, or O(6), nucleus, rather than a vibrational, or U(5), pattern for the nature of the low-lying, low-spin levels in 110Cd.

  12. High-Statistics Study of the β+/EC-Decay of 110In

    Directory of Open Access Journals (Sweden)

    Varela A. Diaz

    2014-03-01

    A study of the 110In β+/EC decay was performed at the TRIUMF Isotope Separator and Accelerator (ISAC) facility to probe the nuclear structure of 110Cd. The data were collected in scaled-down γ-ray singles, γ−γ coincidence, and γ-electron coincidence mode. The data were sorted, and a random-background-subtracted γ−γ matrix was created containing a total of 850 million events. We expanded the level scheme of 110Cd significantly by identifying 75 levels below 3.8 MeV, including 12 new ones, and increased the number of previously observed transitions from these levels to 273. The γ-ray branching intensities have been extracted through an analysis of the coincidence intensities. The branching ratios were combined with a reanalysis of lifetime measurements obtained in an (n, n'γ) reaction with monoenergetic neutrons to calculate B(E2) values, and these results have led to the proposal of a γ-soft rotor, or O(6), nucleus, rather than a vibrational, or U(5), pattern for the nature of the low-lying, low-spin levels in 110Cd.

  13. Statistical learning in high energy and astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J.

    2005-06-16

    This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case learning from examples is the only possibility, since no theory is available which would allow one to build an algorithm in the classical way. In the second case a classical algorithm exists but is too slow to cope with the time restrictions, and is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have stood out by their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods: they should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled…

  14. Statistical learning in high energy and astrophysics

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2005-01-01

    This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle "learning from examples": the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case learning from examples is the only possibility, since no theory is available which would allow one to build an algorithm in the classical way. In the second case a classical algorithm exists but is too slow to cope with the time restrictions, and is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have stood out by their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods: they should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled…

  15. Statistics for High Energy Physics

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    The lectures emphasize the frequentist approach used for Dark Matter searches and for the Higgs search, discovery and measurements of its properties. Emphasis is put on hypothesis testing using the asymptotic formulae formalism and its derivation, and on the derivation of the trial factor formulae in one and two dimensions. Various test statistics and their applications are discussed. Some keywords: Profile Likelihood, Neyman-Pearson, Feldman-Cousins, Coverage, CLs, Nuisance Parameter Impact, Look Elsewhere Effect. Selected bibliography: G. J. Feldman and R. D. Cousins, "A unified approach to the classical statistical analysis of small signals", Phys. Rev. D 57, 3873 (1998); A. L. Read, "Presentation of search results: the CL(s) technique", J. Phys. G 28, 2693 (2002); G. Cowan, K. Cranmer, E. Gross and O. Vitells, "Asymptotic formulae for likelihood-based tests of new physics", Eur. Phys. J. C 71, 1554 (2011); Erratum: Eur. Phys. J. C 73, 2501 (2013).
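
    As a concrete instance of the asymptotic formalism mentioned in these lectures, here is the median discovery significance of a single-bin counting experiment with expected signal s and background b, from the Cowan-Cranmer-Gross-Vitells paper cited above; the counts used are illustrative.

```python
import math

def asimov_discovery_significance(s, b):
    """Median significance for rejecting the background-only hypothesis
    (Cowan, Cranmer, Gross & Vitells, Eur. Phys. J. C 71, 1554 (2011))."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

print(asimov_discovery_significance(10.0, 100.0))   # ~0.98 sigma
print(asimov_discovery_significance(50.0, 100.0))   # ~4.65 sigma
```

    For s much smaller than b this reduces to the familiar s/sqrt(b) estimate.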

  16. A New Statistical Approach to Characterize Chemical-Elicited Behavioral Effects in High-Throughput Studies Using Zebrafish.

    Directory of Open Access Journals (Sweden)

    Guozhu Zhang

    Zebrafish have become an important alternative model for characterizing chemical bioactivity, partly due to the efficiency with which systematic, high-dimensional data can be generated. However, these new data present analytical challenges associated with scale and diversity. We developed a novel, robust statistical approach to characterize chemical-elicited effects in behavioral data from high-throughput screening (HTS) of all 1,060 Toxicity Forecaster (ToxCast™) chemicals across 5 concentrations at 120 hours post-fertilization (hpf). Taking advantage of the immense scale of data for a global view, we show that this new approach reduces bias introduced by extreme values yet allows for diverse response patterns that confound the application of traditional statistics. We have also shown that, as a summary measure of response for local tests of chemical-associated behavioral effects, it achieves a significant reduction in coefficient of variation compared to many traditional statistical modeling methods. This effective increase in signal-to-noise ratio augments statistical power and is observed across experimental periods (light/dark conditions) that display varied distributional response patterns. Finally, we integrated results with data from concomitant developmental endpoint measurements to show that appropriate statistical handling of HTS behavioral data can add important biological context that informs mechanistic hypotheses.

  17. Use Of Statistical Tools To Evaluate The Reductive Dechlorination Of High Levels Of TCE In Microcosm Studies

    Science.gov (United States)

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study ...

  18. Factors That Explain the Attitude towards Statistics in High-School Students: Empirical Evidence at Technological Study Center of the Sea in Veracruz, Mexico

    Science.gov (United States)

    Rojas-Kramer, Carlos; Limón-Suárez, Enrique; Moreno-García, Elena; García-Santillán, Arturo

    2018-01-01

    The aim of this paper was to analyze attitude towards statistics in high-school students using the SATS scale designed by Auzmendi (1992). The sample was 200 students from the sixth semester of the afternoon shift, who were enrolled in technical careers from the Technological Study Center of the Sea (Centro de Estudios Tecnológicos del Mar 07…

  19. High productivity chromatography refolding process for Hepatitis B Virus X (HBx) protein guided by statistical design of experiment studies.

    Science.gov (United States)

    Basu, Anindya; Leong, Susanna Su Jan

    2012-02-03

    The Hepatitis B Virus X (HBx) protein is a potential therapeutic target for the treatment of hepatocellular carcinoma. However, consistent expression of the protein as insoluble inclusion bodies in bacterial host systems has largely hindered HBx manufacturing via economical biosynthesis routes, thereby impeding the development of anti-HBx therapeutic strategies. To eliminate this roadblock, this work reports the development of the first 'chromatography refolding'-based bioprocess for HBx using immobilised metal affinity chromatography (IMAC). This process enabled production of HBx at quantities and purity that facilitate their direct use in structural and molecular characterization studies. In line with the principles of quality by design (QbD), we used a statistical design of experiments (DoE) methodology to design the optimum process, which delivered bioactive HBx at a productivity of 0.21 mg/ml/h at a refolding yield of 54% (at 10 mg/ml refolding concentration), 4.4-fold higher than that achieved in dilution refolding. The systematic DoE methodology adopted for this study enabled us to obtain important insights into the effect of different bioprocess parameters, such as the effect of buffer exchange gradients on HBx productivity and quality. Such a bioprocess design approach can play a pivotal role in developing intensified processes for other novel proteins, and hence helping to resolve validation and speed-to-market challenges faced by the biopharmaceutical industry today.
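
    The screening logic of a two-level design of experiments of the kind used here can be sketched as follows; the three factors and yield numbers are hypothetical, not the published HBx data.

```python
import numpy as np

# 2^3 full factorial in coded units (-1/+1); hypothetical refolding factors.
design = np.array([[-1, -1, -1],
                   [ 1, -1, -1],
                   [-1,  1, -1],
                   [ 1,  1, -1],
                   [-1, -1,  1],
                   [ 1, -1,  1],
                   [-1,  1,  1],
                   [ 1,  1,  1]])
yield_pct = np.array([31, 38, 27, 35, 44, 54, 40, 50])  # refolding yield, %

# Main effect = mean response at the +1 level minus mean at the -1 level.
for i, name in enumerate(["buffer gradient", "protein load", "redox ratio"]):
    effect = (yield_pct[design[:, i] == 1].mean()
              - yield_pct[design[:, i] == -1].mean())
    print(f"main effect of {name}: {effect:+.1f} % yield")
```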

  1. High impact = high statistical standards? Not necessarily so.

    Science.gov (United States)

    Tressoldi, Patrizio E; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors.

  2. High Impact = High Statistical Standards? Not Necessarily So

    Science.gov (United States)

    Tressoldi, Patrizio E.; Giofré, David; Sella, Francesco; Cumming, Geoff

    2013-01-01

    What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect size, prospective power and model estimation, is the prevalent statistical practice used in articles published in Nature, 89%, followed by articles published in Science, 42%. By contrast, in all other journals, both with high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpreted these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means to improve the statistical practices in journals with high or low impact factors. PMID:23418533

  3. Introduction to high-dimensional statistics

    CERN Document Server

    Giraud, Christophe

    2015-01-01

    Ever-greater computing technologies have given rise to an exponentially growing volume of data. Today massive data sets (with potentially thousands of variables) play an important role in almost every branch of modern human activity, including networks, finance, and genetics. However, analyzing such data has presented a challenge for statisticians and data analysts and has required the development of new statistical methods capable of separating the signal from the noise. Introduction to High-Dimensional Statistics is a concise guide to state-of-the-art models, techniques, and approaches for handling high-dimensional data.

  4. Statistical analysis for discrimination of prompt gamma ray peak induced by high energy neutron: Monte Carlo simulation study

    International Nuclear Information System (INIS)

    Do-Kun Yoon; Joo-Young Jung; Tae Suk Suh; Seong-Min Han

    2015-01-01

    The purpose of this research is a statistical analysis for the discrimination of prompt gamma-ray peaks induced by 14.1 MeV neutrons in spectra obtained from Monte Carlo simulation. For the simulation, the information of 18 detector materials was used to simulate spectra from the neutron capture reaction. The discrimination of nine prompt gamma-ray peaks from the simulation of each detector material was performed. We present several comparison indices of energy resolution performance depending on the detector material, using the simulation and statistics, for prompt gamma activation analysis. (author)

  5. Statistical study of foreshock cavitons

    Directory of Open Access Journals (Sweden)

    P. Kajdič

    2013-12-01

    In this work we perform a statistical analysis of 92 foreshock cavitons observed with the Cluster spacecraft 1 during the period 2001-2006. We analyze time intervals with durations longer than 10 min during which the spacecraft was located in the Earth's foreshock. Together these amount to ~50 days. The cavitons are transient structures in the Earth's foreshock. Their main signatures in the data include simultaneous depletions of the magnetic field intensity and plasma density, which are surrounded by a rim of enhanced values of these two quantities. Cavitons form due to nonlinear interaction of transverse and compressive ultra-low frequency (ULF) waves and are therefore always surrounded by intense compressive ULF fluctuations. They are carried by the solar wind towards the bow shock. This work represents the first systematic study of a large sample of foreshock cavitons. We find that cavitons appear for a wide range of solar wind and interplanetary magnetic field conditions and are therefore a common feature upstream of Earth's quasi-parallel bow shock, with an average occurrence rate of ~2 events per day. We also discuss their observational properties in the context of other known upstream phenomena and show that the cavitons are a distinct structure in the foreshock.

  6. Nonextensive statistical mechanics and high energy physics

    Directory of Open Access Journals (Sweden)

    Tsallis Constantino

    2014-04-01

    The use of the celebrated Boltzmann-Gibbs entropy and statistical mechanics is justified for ergodic-like systems. In contrast, complex systems typically require more powerful theories. We provide a brief introduction to nonadditive entropies (characterized by indices like q which, in the q → 1 limit, recover the standard Boltzmann-Gibbs entropy) and the associated nonextensive statistical mechanics. We then present some recent applications to systems such as high-energy collisions, black holes and others. In addition, we clarify and illustrate the neat distinction that exists between Lévy distributions and q-exponential ones, a point which occasionally causes some confusion in the literature, particularly in the LHC literature.
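
    The q-exponential that generalizes the Boltzmann factor in this formalism is easy to state; a small sketch, using the standard definition from the nonextensive-statistics literature:

```python
import numpy as np

def q_exponential(x, q):
    """e_q(x) = [1 + (1-q)x]^(1/(1-q)) where the bracket is positive,
    0 otherwise; reduces to exp(x) as q -> 1."""
    x = np.asarray(x, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    out = np.zeros_like(base)
    mask = base > 0
    out[mask] = base[mask] ** (1.0 / (1.0 - q))
    return out

x = np.linspace(0.0, 10.0, 6)
print(q_exponential(-x, 1.0))   # ordinary exponential decay
print(q_exponential(-x, 1.1))   # q > 1: heavier, asymptotically power-law tail
```

    For q > 1 the tail falls off as a power law; a Lévy distribution is a different mathematical object even though both are heavy-tailed, which is the distinction the record refers to.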

  7. A Framework for Assessing High School Students' Statistical Reasoning.

    Science.gov (United States)

    Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang

    2016-01-01

    Based on a synthesis of literature, earlier studies, analyses and observations of high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter included describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this work formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from the initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in a second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.

  8. Statistical considerations in NRDA studies

    International Nuclear Information System (INIS)

    Harner, E.G.; Parker, K.R.; Skalski, J.R.

    1993-01-01

    Biological, chemical, and toxicological variables are usually modeled with lognormal, Poisson, negative binomial, or binomial error distributions. Species counts and densities often have frequent zeros and overdispersion. Chemical concentrations can have frequent non-detects and a small proportion of high values. The feasibility of making adjustments to these response variables, such as zero-inflated models, is discussed. Toxicity measurements are usually modeled with the binomial distribution. A strategy for determining the most appropriate distribution is presented. Model-based methods, using concomitant variables and interactions, enhance assessment of impacts. Concomitant variable models reduce variability and also reduce bias by adjusting means to a common basis. Variable selection strategies are given for determining the most appropriate set of concomitant variables. Multi-year generalized linear models test impact-by-time interactions, possibly after adjusting for time-dependent concomitant variables. Communities are analyzed to make inferences about overall biological impact and recovery, and require non-normal multivariate techniques. Partial canonical correspondence analysis is an appropriate community model for ordinating spatial and temporal shifts due to impact. The Exxon Valdez oil spill is used as a case study.
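
    One simple version of the distribution-selection strategy mentioned here: check the variance-to-mean ratio of the counts and fall back to a method-of-moments negative binomial when it exceeds one, then compare observed zeros with what that fit implies. The counts below are simulated.

```python
import numpy as np

rng = np.random.default_rng(7)
# Simulated species counts with overdispersion and extra zeros.
counts = rng.negative_binomial(1.5, 0.25, size=300)
counts[rng.random(300) < 0.2] = 0

mean, var = counts.mean(), counts.var(ddof=1)
print(f"mean={mean:.2f}  var={var:.2f}  var/mean={var / mean:.2f}")

if var > mean:
    # Method-of-moments negative binomial: var = mu + mu^2 / k.
    k = mean**2 / (var - mean)
    p0_nb = (k / (k + mean)) ** k        # zero probability implied by the fit
    print(f"overdispersed: negative binomial with size k={k:.2f}")
    print(f"zero fraction: observed={np.mean(counts == 0):.2f}, "
          f"NB-implied={p0_nb:.2f}")
    # A clear excess of observed zeros argues for a zero-inflated model.
else:
    print("no overdispersion: Poisson may be adequate")
```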

  9. High-statistics study of the reaction γp → p2π{sup 0}

    Energy Technology Data Exchange (ETDEWEB)

    Sokhoyan, V.; Pee, H. van; Bartholomy, O.; Beck, R.; Fuchs, M.; Funke, C.; Hoffmeister, P.; Horn, I.; Junkersfeld, J.; Kalinowsky, H.; Klempt, E.; Lang, M.; Metsch, B.; Piontek, D.; Schmidt, C.; Seifen, T.; Szczepanek, T.; Thiel, A.; Thoma, U.; Wendel, C. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik, Bonn (Germany); Gutz, E. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik, Bonn (Germany); Universitaet Giessen, II. Physikalisches Institut, Giessen (Germany); Crede, V. [Florida State University, Department of Physics, Tallahassee (United States); Anisovich, A.V.; Bayadilov, D.; Nikonov, V.A.; Novinsky, D.; Sarantsev, A.V. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik, Bonn (Germany); Petersburg Nuclear Physics Institute, Gatchina (Russian Federation); Bacelar, J.C.S.; Castelijns, R.; Loehner, H.; Messchendorp, J.G.; Shende, S. [Kernfysisch Versneller Instituut, Groningen (Netherlands); Bantes, B.; Dutz, H.; Elsner, D.; Ewald, R.; Frommberger, F.; Hillert, W.; Kammer, S.; Kleber, V.; Klein, Frank; Klein, Friedrich; Ostrick, M.; Schmieden, H.; Suele, A. [Universitaet Bonn, Physikalisches Institut, Bonn (Germany); Beloglazov, Y.A.; Gridnev, A.B.; Lopatin, I.V.; Sumachev, V.V. [Petersburg Nuclear Physics Institute, Gatchina (Russian Federation); Gregor, R.; Lugert, S.; Metag, V.; Nanova, M.; Novotny, R.; Pfeiffer, M.; Trnka, D. [Universitaet Giessen, II. Physikalisches Institut, Giessen (Germany); Jaegle, I.; Krusche, B.; Mertens, T. [Universitaet Basel, Institut fuer Physik, Basel (Switzerland); Kotulla, M. [Universitaet Giessen, II. Physikalisches Institut, Giessen (Germany); Universitaet Basel, Institut fuer Physik, Basel (Switzerland); Pant, L.; Roy, A.; Varma, R. [Universitaet Giessen, II. Physikalisches Institut, Giessen (Germany); BARC, Nucl. Phys. Div., Mumbai (India); Walther, D. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik, Bonn (Germany); Universitaet Bonn, Physikalisches Institut, Bonn (Germany); Wilson, A. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik, Bonn (Germany); Florida State University, Department of Physics, Tallahassee (United States); Collaboration: The CBELSA/TAPS Collaboration

    2015-08-15

    The photoproduction of 2π⁰ mesons off protons was studied with the Crystal Barrel/TAPS experiment at the electron accelerator ELSA in Bonn. The energy of photons produced in a radiator was tagged in the energy range from 600 MeV to 2.5 GeV. Differential and total cross sections and pπ⁰π⁰ Dalitz plots are presented. Part of the data was taken with a diamond radiator producing linearly polarized photons, and beam asymmetries were derived. Properties of nucleon and Δ resonances contributing to the pπ⁰π⁰ final state were determined within the Bonn-Gatchina (BnGa) partial-wave analysis. The data presented here allow us to determine branching ratios of nucleon and Δ resonances for their decays into pπ⁰π⁰ via several intermediate states. Most prominent are decays proceeding via Δ(1232)π, N(1440)1/2⁺π, N(1520)3/2⁻π, N(1680)5/2⁺π, but also pf₀(500), pf₀(980), and pf₂(1270) contribute to the reaction. (orig.)

  10. A statistical study of high coronal densities from X-ray line-ratios of Mg XI

    Science.gov (United States)

    Linford, G. A.; Lemen, J. R.; Strong, K. T.

    1991-01-01

    An X-ray line-ratio density diagnostic was applied to 50 Mg XI spectra of flaring active regions on the Sun recorded by the Flat Crystal Spectrometer on SMM. The plasma density is derived from R, the flux ratio of the forbidden to intercombination lines of the He-like ion Mg XI. The R ratio for Mg XI is only density sensitive when the electron density exceeds a critical value (about 10¹² cm⁻³), the low-density limit (LDL). This theoretical value of the low-density limit is uncertain, as it depends on complex atomic theory. Reported coronal densities above 10¹² cm⁻³ are uncommon. In this study, the distribution of R ratio values about the LDL is estimated, empirical values are derived for the 1st and 2nd moments of this distribution from the 50 Mg XI spectra, and from these derived parameters the percentage of observations indicating densities above this limit is obtained.
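
    For He-like ions the density dependence of R is commonly written R(ne) = R0 / (1 + ne/Nc), with R0 the low-density limit of the ratio and Nc the critical density. A sketch of the inversion follows; the R0 and Nc values are assumed, indicative numbers for Mg XI, not taken from this study.

```python
R0 = 2.7        # assumed low-density limit of R for Mg XI
Nc = 6.0e12     # assumed critical density, cm^-3

def density_from_ratio(R):
    """Invert R = R0 / (1 + ne/Nc) for the electron density ne."""
    if not 0.0 < R < R0:
        raise ValueError("R must lie between 0 and the low-density limit R0")
    return Nc * (R0 / R - 1.0)

for R in (2.6, 2.0, 1.2):
    print(f"R = {R:.1f}  ->  ne ~ {density_from_ratio(R):.2e} cm^-3")
```

    Near the low-density limit (R close to R0) the inferred density is essentially unconstrained, which is why the study characterizes the distribution of R about the LDL rather than quoting individual densities.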

  11. A Statistical Perspective on Highly Accelerated Testing

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Edward V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Highly accelerated life testing has been heavily promoted at Sandia (and elsewhere) as a means to rapidly identify product weaknesses caused by flaws in the product's design or manufacturing process. During product development, a small number of units are forced to fail at high stress. The failed units are then examined to determine the root causes of failure. The identification of the root causes of product failures exposed by highly accelerated life testing can instigate changes to the product's design and/or manufacturing process that result in a product with increased reliability. It is widely viewed that this qualitative use of highly accelerated life testing (often associated with the acronym HALT) can be useful. However, highly accelerated life testing has also been proposed as a quantitative means for "demonstrating" the reliability of a product where unreliability is associated with loss of margin via an identified and dominating failure mechanism. It is assumed that the dominant failure mechanism can be accelerated by changing the level of a stress factor that is assumed to be related to the dominant failure mode. In extreme cases, a minimal number of units (often from a pre-production lot) are subjected to a single highly accelerated stress relative to normal use. If no (or sufficiently few) units fail at this high stress level, some might claim that a certain level of reliability has been demonstrated (relative to normal use conditions). Underlying this claim are assumptions regarding the level of knowledge associated with the relationship between the stress level and the probability of failure. The primary purpose of this document is to discuss (from a statistical perspective) the efficacy of using accelerated life testing protocols (and, in particular, "highly accelerated" protocols) to make quantitative inferences concerning the performance of a product (e.g., reliability) when in fact there is lack of knowledge and uncertainty concerning that relationship.
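
    The arithmetic behind any zero-failure "demonstration" claim is the success-run binomial relation 1 - R^n >= C. This is standard reliability-engineering arithmetic, not taken from the Sandia report, and it shows why tiny HALT samples cannot demonstrate high reliability quantitatively.

```python
import math

def units_for_demonstration(reliability, confidence):
    """Smallest n such that n units surviving with zero failures
    demonstrates `reliability` at the given `confidence` (1 - R**n >= C)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

for R in (0.90, 0.99, 0.999):
    n = units_for_demonstration(R, 0.90)
    print(f"R = {R}: {n} units must survive")   # 22, 230, 2302
```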

  12. High cumulants of conserved charges and their statistical uncertainties

    Science.gov (United States)

    Chen, Li-Zhu; Zhao, Ye-Yin; Pan, Xue; Li, Zhi-Ming; Wu, Yuan-Fang

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants randomly fluctuate with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with corresponding values of the obtained cumulants. Generally, with a given number of events, the larger the cumulants we measure, the larger the statistical uncertainties that are estimated. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, however, it is found that the three sigma rule of thumb is still applicable when the statistics are above one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
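
    A bootstrap sketch of the effect described in this record: the statistical uncertainty of a fourth-order cumulant shrinks with the number of events, so three-sigma reasoning becomes safe only at large statistics. The "events" are toy Gaussian draws, not heavy-ion data.

```python
import numpy as np

rng = np.random.default_rng(0)

def c4(x):
    """Fourth-order cumulant C4 = <d^4> - 3<d^2>^2, with d = x - <x>."""
    d = x - x.mean()
    return np.mean(d**4) - 3.0 * np.mean(d**2) ** 2

for n_events in (10_000, 100_000, 1_000_000):
    sample = rng.normal(0.0, 2.0, n_events)   # toy event-by-event observable
    boot = np.array([c4(rng.choice(sample, n_events)) for _ in range(100)])
    print(f"N={n_events:>9}:  C4 = {c4(sample):+8.4f} +/- {boot.std():.4f}")
```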

  13. Statistical health-effects study

    International Nuclear Information System (INIS)

    Gilbert, E.S.; Sever, L.E.

    1983-01-01

    A principal objective of this program is to determine if there are demonstrable effects of radiation exposure on the Hanford worker by analyzing mortality records of this population. A secondary purpose is to improve methodology for assessing health effects of chronic low-level exposure to harmful agents or substances, particularly in an occupational setting. In the past year we have updated our analyses and initiated new areas of analysis. Complete documentation was provided for our computer program for the mortality study, and a user's manual is under development. A case-control study of birth defects was started in FY 1982.

  14. Statistical principles for prospective study protocols:

    DEFF Research Database (Denmark)

    Christensen, Robin; Langberg, Henning

    2012-01-01

    In the design of scientific studies it is essential to decide on which scientific questions one aims to answer, just as it is important to decide on the correct statistical methods to use to answer these questions. The correct use of statistical methods is crucial in all aspects of research… to quantify relationships in data. Despite an increased focus on statistical content and complexity of biomedical research these topics remain difficult for most researchers. Statistical methods enable researchers to condense large spreadsheets with data into means, proportions, and differences between means… the statistical principles for trial protocols in terms of design, analysis, and reporting of findings.

  15. Statistical behavior of high doses in medical radiodiagnosis

    International Nuclear Information System (INIS)

    Barboza, Adriana Elisa

    2014-01-01

    The main purpose of this work is to statistically estimate occupational exposure in medical diagnostic radiology for the cases of high doses recorded in 2011 at the national level. For the statistical survey of this study, the doses of 372 occupationally exposed individuals (IOEs) working in diagnostic radiology in different Brazilian states were evaluated. The data were extracted from the monograph (Research Methodology of High Doses in Medical Radiodiagnosis), which contains the information of the dose management sector database of IRD/CNEN-RJ, Brazil. The identification of these states allows the responsible Sanitary Surveillance (VISA) to become aware of the events and to work with programs to reduce them. (author)

  16. A Statistical study of the Doppler spectral width of high-latitude ionospheric F-region echoes recorded with SuperDARN coherent HF radars

    Directory of Open Access Journals (Sweden)

    J.-P. Villain

    2002-11-01

    The HF radars of the Super Dual Auroral Radar Network (SuperDARN) provide measurements of the E × B drift of ionospheric plasma over extended regions of the high-latitude ionosphere. We have conducted a statistical study of the associated Doppler spectral width of ionospheric F-region echoes. The study has been conducted with all available radars from the Northern Hemisphere for 2 specific periods of time. Period 1 corresponds to the winter months of 1994, while period 2 covers October 1996 to March 1997. The distributions of data points and average spectral width are presented as a function of magnetic latitude and magnetic local time. The databases are very consistent and exhibit the same features. The most striking features are: a region of very high spectral width, collocated with the ionospheric LLBL/cusp/mantle region; and an oval-shaped region of high spectral width whose equatorward boundary matches the poleward limit of the Holzworth and Meng auroral oval. A simulation has been conducted to evaluate the geometrical and instrumental effects on the spectral width. It shows that these effects cannot account for the observed spectral features. It is then concluded that these specific spectral width characteristics are the signature of ionospheric/magnetospheric coupling phenomena. Key words: Ionosphere (auroral ionosphere; ionosphere-magnetosphere interactions; ionospheric irregularities)

  17. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Parameter estimation with confidence intervals and statistical hypothesis testing are used in statistical analysis to draw conclusions about a population from a sample extracted from it. The case study presented in this paper aims to highlight the importance of the sample size used in a study and how it is reflected in the results obtained when using confidence intervals and hypothesis tests. While statistical hypothesis testing only gives a "yes" or "no" answer to some questions, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and underlies findings reported as "marginally significant" or "almost significant" (p very close to 0.05).
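
    A small worked example of the point made in this record, using invented data: the p-value alone answers "significant or not", while the confidence interval also shows the effect size and how wide the small-sample uncertainty really is.

```python
import numpy as np
from scipy import stats

sample = np.array([5.1, 5.8, 4.9, 6.2, 5.6, 6.0, 5.3, 5.9])  # invented data
mu0 = 5.0                                                     # null value

t_stat, p_value = stats.ttest_1samp(sample, mu0)
mean, sem = sample.mean(), stats.sem(sample)
ci_low, ci_high = stats.t.interval(0.95, df=len(sample) - 1, loc=mean, scale=sem)

print(f"p = {p_value:.3f}  -> bare yes/no answer")
print(f"95% CI = ({ci_low:.2f}, {ci_high:.2f})  -> effect size plus the "
      f"uncertainty carried by n = {len(sample)}")
```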

  18. Statistics of high-level scene context.

    Science.gov (United States)

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag-of-words level, where scenes are described by the list of objects contained within them; and the structural level, where the spatial distribution of and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag-of-words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics.
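
    A toy version of the bag-of-words level described above: each scene is a vector of object counts fed to a linear classifier. The object labels and counts are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows: scenes; columns: counts of labeled objects (car, tree, sink, bed).
X = np.array([[3, 1, 0, 0],   # street
              [2, 2, 0, 0],   # street
              [0, 0, 2, 0],   # bathroom
              [0, 0, 1, 0],   # bathroom
              [0, 1, 0, 1],   # bedroom
              [0, 0, 0, 2]])  # bedroom
y = ["street", "street", "bathroom", "bathroom", "bedroom", "bedroom"]

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([[1, 0, 0, 1]]))   # a scene containing a car and a bed
```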

  20. CFD simulation of CO_2 sorption on K_2CO_3 solid sorbent in novel high flux circulating-turbulent fluidized bed riser: Parametric statistical experimental design study

    International Nuclear Information System (INIS)

    Thummakul, Theeranan; Gidaspow, Dimitri; Piumsomboon, Pornpote; Chalermsinsuwan, Benjapon

    2017-01-01

    Highlights: • Circulating-turbulent fluidization was shown to be advantageous for CO_2 sorption. • The novel regime was proven to capture CO_2 more efficiently than conventional regimes. • A uniform solid particle distribution was observed in the novel fluidization regime. • System continuity had a stronger effect on the system than process mixing. • A parametric experimental design analysis was performed to identify significant factors. - Abstract: In this study a high flux circulating-turbulent fluidized bed (CTFB) riser was confirmed to be advantageous for carbon dioxide (CO_2) sorption on a potassium carbonate solid sorbent. The effect of various parameters on the CO_2 removal level was evaluated using a statistical experimental design. The most appropriate fluidization regime was found to occur between the turbulent and fast fluidization regimes, which was shown to capture CO_2 more efficiently than conventional fluidization regimes. The highest CO_2 sorption level was 93.4% under optimized CTFB operating conditions. The important parameters for CO_2 capture were the inlet gas velocity and the interactions between the CO_2 concentration and the inlet gas velocity and water vapor concentration. The CTFB regime had a high and uniform solid particle distribution in both the axial and radial system directions and could transport the solid sorbent to the regeneration reactor. In addition, the process system continuity had a stronger effect on the CO_2 removal level in the system than the process system mixing.

  1. Statistical Power in Longitudinal Network Studies

    NARCIS (Netherlands)

    Stadtfeld, Christoph; Snijders, Tom A. B.; Steglich, Christian; van Duijn, Marijtje

    2018-01-01

    Longitudinal social network studies may easily suffer from a lack of statistical power. This is the case in particular for studies that simultaneously investigate change of network ties and change of nodal attributes. Such selection and influence studies have become increasingly popular due to the

  2. Statistical principles for prospective study protocols:

    DEFF Research Database (Denmark)

    Christensen, Robin; Langberg, Henning

    2012-01-01

    In the design of scientific studies it is essential to decide on which scientific questions one aims to answer, just as it is important to decide on the correct statistical methods to use to answer these questions. The correct use of statistical methods is crucial in all aspects of research...... to quantify relationships in data. Despite an increased focus on statistical content and complexity of biomedical research these topics remain difficult for most researchers. Statistical methods enable researchers to condense large spreadsheets with data into means, proportions, and difference between means......, risk differences, and other quantities that convey information. One of the goals in biomedical research is to develop parsimonious models - meaning as simple as possible. This approach is valid if the subsequent research report (the article) is written independent of whether the results...

  3. Statistical learning methods in high-energy and astrophysics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Forschungszentrum Juelich GmbH, Zentrallabor fuer Elektronik, 52425 Juelich (Germany) and Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de; Kiesling, C. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2004-11-21

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  4. Statistical learning methods in high-energy and astrophysics analysis

    International Nuclear Information System (INIS)

    Zimmermann, J.; Kiesling, C.

    2004-01-01

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application

  5. Statistical evidences of absorption at high latitudes

    International Nuclear Information System (INIS)

    Fesenko, B.I.

    1980-01-01

    Evidence is considered which indicates a significant effect of irregular interstellar absorption at high latitudes b. The number density of faint galaxies grows with increasing |b|, even at values of |b| exceeding 50 deg. The effects of the interstellar medium are traced even in the directions of stars and globular clusters with very low values of the colour excess. The coefficient of absorption, A_B = 0.29 ± 0.05, was estimated from the colours of the bright E-galaxies.

  6. Hemispheric Differences in White Matter Microstructure between Two Profiles of Children with High Intelligence Quotient vs. Controls: A Tract-Based Spatial Statistics Study

    Science.gov (United States)

    Nusbaum, Fanny; Hannoun, Salem; Kocevar, Gabriel; Stamile, Claudio; Fourneret, Pierre; Revol, Olivier; Sappey-Marinier, Dominique

    2017-01-01

    Objectives: The main goal of this study was to investigate and compare the neural substrate of two children's profiles of high intelligence quotient (HIQ). Methods: Two groups of HIQ children were included with either a homogeneous (Hom-HIQ: n = 20) or a heterogeneous IQ profile (Het-HIQ: n = 24) as defined by a significant difference between verbal comprehension index and perceptual reasoning index. Diffusion tensor imaging was used to assess white matter (WM) microstructure while tract-based spatial statistics (TBSS) analysis was performed to detect and localize WM regional differences in fractional anisotropy (FA), mean diffusivity, axial (AD), and radial diffusivities. Quantitative measurements were performed on 48 regions and 21 fiber-bundles of WM. Results: Hom-HIQ children presented higher FA than Het-HIQ children in widespread WM regions including central structures, and associative intra-hemispheric WM fasciculi. AD was also greater in numerous WM regions of Total-HIQ, Hom-HIQ, and Het-HIQ groups when compared to the Control group. Hom-HIQ and Het-HIQ groups also differed by their hemispheric lateralization in AD differences compared to Controls. Het-HIQ and Hom-HIQ groups showed a lateralization ratio (left/right) of 1.38 and 0.78, respectively. Conclusions: These findings suggest that both inter- and intra-hemispheric WM integrity are enhanced in HIQ children and that neural substrate differs between Hom-HIQ and Het-HIQ. The left hemispheric lateralization of Het-HIQ children is concordant with their higher verbal index while the relative right hemispheric lateralization of Hom-HIQ children is concordant with their global brain processing and adaptation capacities as evidenced by their homogeneous IQ. PMID:28420955

  7. Statistical Issues in TBI Clinical Studies

    Directory of Open Access Journals (Sweden)

    Paul eRapp

    2013-11-01

    Full Text Available The identification and longitudinal assessment of traumatic brain injury present several challenges. Because these injuries can have subtle effects, efforts to find quantitative physiological measures that can be used to characterize traumatic brain injury are receiving increased attention. The results of this research must be considered with care. Six reasons for cautious assessment are outlined in this paper. None of the issues raised here are new. They are standard elements in the technical literature that describes the mathematical analysis of clinical data. The purpose of this paper is to draw attention to these issues because they need to be considered when clinicians evaluate the usefulness of this research. In some instances these points are demonstrated by simulation studies of diagnostic processes. We take as an additional objective the explicit presentation of the mathematical methods used to reach these conclusions. This material is in the appendices. The following points are made:

    1. A statistically significant separation of a clinical population from a control population does not ensure a successful diagnostic procedure.
    2. Adding more variables to a diagnostic discrimination can, in some instances, actually reduce classification accuracy.
    3. A high sensitivity and specificity in a TBI versus control population classification does not ensure diagnostic success when the method is applied in a more general neuropsychiatric population.
    4. Evaluation of treatment effectiveness must recognize that high variability is a pronounced characteristic of an injured central nervous system and that results can be confounded by either disease progression or spontaneous recovery. A large pre-treatment versus post-treatment effect size does not, of itself, establish a successful treatment.
    5. A procedure for discriminating between treatment responders and nonresponders requires, minimally, a two-phase investigation. This procedure must include a
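
    Point 1 lends itself to a short simulation in the spirit of the paper's own demonstrations. The group sizes, effect size of 0.3, and classification threshold below are illustrative assumptions, not the paper's values.

```python
# Illustrative simulation of point 1: with n = 500 per group, a small group
# difference yields a tiny p-value, yet classifying individual subjects at
# the midpoint threshold barely beats chance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 500
controls = rng.normal(0.0, 1.0, n)
patients = rng.normal(0.3, 1.0, n)      # assumed effect size d = 0.3

_, p_value = stats.ttest_ind(patients, controls)

threshold = 0.15                        # midpoint of the two group means
accuracy = 0.5 * ((patients > threshold).mean() + (controls <= threshold).mean())
print(f"p = {p_value:.1e}, single-subject accuracy = {accuracy:.2f}")
```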

  8. Statistical Considerations of Food Allergy Prevention Studies.

    Science.gov (United States)

    Bahnson, Henry T; du Toit, George; Lack, Gideon

    Clinical studies to prevent the development of food allergy have recently helped reshape public policy recommendations on the early introduction of allergenic foods. These trials are also prompting new research, and it is therefore important to address the unique design and analysis challenges of prevention trials. We highlight statistical concepts and give recommendations that clinical researchers may wish to adopt when designing future study protocols and analysis plans for prevention studies. Topics include selecting a study sample, addressing internal and external validity, improving statistical power, choosing alpha and beta, analysis innovations to address dilution effects, and analysis methods to deal with poor compliance, dropout, and missing data.
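
    As a hedged sketch of the alpha/beta trade-off mentioned above, the standard normal-approximation formula for comparing two proportions gives a rough per-arm sample size. The incidence figures of 10% vs. 5% are invented for illustration and are not from the article.

```python
# Approximate sample size per arm to detect a drop in incidence from p1 to p2
# with a two-sided test at level alpha and the given power (normal approximation).
import math
from scipy.stats import norm

def n_per_arm(p1, p2, alpha=0.05, power=0.80):
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

print(n_per_arm(0.10, 0.05))   # roughly 435 subjects per arm
```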

  9. GALEX-SDSS CATALOGS FOR STATISTICAL STUDIES

    International Nuclear Information System (INIS)

    Budavari, Tamas; Heinis, Sebastien; Szalay, Alexander S.; Nieto-Santisteban, Maria; Bianchi, Luciana; Gupchup, Jayant; Shiao, Bernie; Smith, Myron; Chang Ruixiang; Kauffmann, Guinevere; Morrissey, Patrick; Wyder, Ted K.; Martin, D. Christopher; Barlow, Tom A.; Forster, Karl; Friedman, Peter G.; Schiminovich, David; Milliard, Bruno; Donas, Jose; Seibert, Mark

    2009-01-01

    We present a detailed study of the Galaxy Evolution Explorer's (GALEX) photometric catalogs with special focus on the statistical properties of the All-sky and Medium Imaging Surveys. We introduce the concept of primaries to resolve the issue of multiple detections and follow a geometric approach to define clean catalogs with well understood selection functions. We cross-identify the GALEX sources (GR2+3) with Sloan Digital Sky Survey (SDSS; DR6) observations, which indirectly provides an invaluable insight into the astrometric model of the UV sources and allows us to revise the band merging strategy. We derive the formal description of the GALEX footprints as well as their intersections with the SDSS coverage along with analytic calculations of their areal coverage. The crossmatch catalogs are made available for the public. We conclude by illustrating the implementation of typical selection criteria in SQL for catalog subsets geared toward statistical analyses, e.g., correlation and luminosity function studies.

  10. High-Throughput Nanoindentation for Statistical and Spatial Property Determination

    Science.gov (United States)

    Hintsala, Eric D.; Hangen, Ude; Stauffer, Douglas D.

    2018-04-01

    Standard nanoindentation tests are "high throughput" compared to nearly all other mechanical tests, such as tension or compression. However, the typical rates of tens of tests per hour can be significantly improved. These higher testing rates enable otherwise impractical studies requiring several thousands of indents, such as high-resolution property mapping and detailed statistical studies. However, care must be taken to avoid systematic errors in the measurement, including the choice of indentation depth/spacing to avoid overlap of plastic zones, pileup, and influence of neighboring microstructural features in the material being tested. Furthermore, since fast loading rates are required, the strain rate sensitivity must also be considered. A review of these effects is given, with the emphasis placed on making complementary standard nanoindentation measurements to address these issues. Experimental applications of the technique, including mapping of welds, microstructures, and composites with varying length scales, along with studying the effect of surface roughness on nominally homogeneous specimens, will be presented.

  11. Statistical studies of powerful extragalactic radio sources

    Energy Technology Data Exchange (ETDEWEB)

    Macklin, J T

    1981-01-01

    This dissertation is mainly about the use of efficient statistical tests to study the properties of powerful extragalactic radio sources. Most of the analysis is based on subsets of a sample of 166 bright (3CR) sources selected at 178 MHz. The first chapter is introductory and it is followed by three on the misalignment and symmetry of double radio sources. The properties of nuclear components in extragalactic sources are discussed in the next chapter, using statistical tests which make efficient use of upper limits, often the only available information on the flux density from the nuclear component. Multifrequency observations of four 3CR sources are presented in the next chapter. The penultimate chapter is about the analysis of correlations involving more than two variables. The Spearman partial rank correlation coefficient is shown to be the most powerful test available which is based on non-parametric statistics. It is therefore used to study the dependences of the properties of sources on their size at constant redshift, and the results are interpreted in terms of source evolution. Correlations of source properties with luminosity and redshift are then examined.

  12. High energy behaviour of particles and unified statistics

    International Nuclear Information System (INIS)

    Chang, Y.

    1984-01-01

    Theories and experiments suggest that particles at high energy appear to possess a new statistics unifying Bose-Einstein and Fermi-Dirac statistics via the gamma distribution. This hypothesis can be obtained from many models, and agrees quantitatively with scaling, the multiplicity, large transverse momentum, the mass spectrum, and other data. It may be applied to scatterings at high energy, and agrees with experiments and known QED results. The Veneziano model and other theories have implied new statistics, such as the B distribution and the Polya distribution. They revert to the gamma distribution at high energy. The possible inapplicability of Pauli's exclusion principle within the unified statistics is considered and associated with the quark constituents

  13. The clinic-statistic study of osteoporosis

    Directory of Open Access Journals (Sweden)

    Florin MARCU

    2008-05-01

    Full Text Available Osteoporosis is the most common metabolic bone disease and is characterized by a reduction in bone mass and a deterioration of bone quality, thus conferring a higher risk for fractures and injuries. Osteoporosis reaches clinical attention when it is severe enough to induce microfractures and the collapse of vertebral bodies, manifesting as back aches or a predisposition to other bone fractures. The aim of the study was to establish a statistical numeric ratio between women and men among subjects diagnosed with osteoporosis through DEXA who present with clinical symptomatology. We studied a group of male and female subjects who had been diagnosed with osteoporosis through DEXA at the EURORAD clinic in Oradea from 01.01.2007 to the present time. The result of the study was that the symptomatology of osteoporosis, with pain and even cases of fractures, is more evident in female subjects than in male patients; statistically, a woman/man ratio of 6.1:1 was established.

  14. Multivariate statistics high-dimensional and large-sample approximations

    CERN Document Server

    Fujikoshi, Yasunori; Shimizu, Ryoichi

    2010-01-01

    A comprehensive examination of high-dimensional analysis of multivariate methods and their real-world applications Multivariate Statistics: High-Dimensional and Large-Sample Approximations is the first book of its kind to explore how classical multivariate methods can be revised and used in place of conventional statistical tools. Written by prominent researchers in the field, the book focuses on high-dimensional and large-scale approximations and details the many basic multivariate methods used to achieve high levels of accuracy. The authors begin with a fundamental presentation of the basic

  15. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  16. Dust grain resonant capture: A statistical study

    Science.gov (United States)

    Marzari, F.; Vanzani, V.; Weidenschilling, S. J.

    1993-01-01

    A statistical approach, based on a large number of simultaneous numerical integrations, is adopted to study the capture in external mean motion resonances with the Earth of micron-size dust grains perturbed by solar radiation and wind forces. We explore the dependence of the resonant capture phenomenon on the initial eccentricity e_0 and perihelion argument w_0 of the dust particle orbit. The intensity of both the resonant and dissipative (Poynting-Robertson and wind drag) perturbations strongly depends on the eccentricity of the particle while the perihelion argument determines, for low inclination, the mutual geometrical configuration of the particle's orbit with respect to the Earth's orbit. We present results for three j:j+1 commensurabilities (2:3, 4:5 and 6:7) and also for particle sizes s = 15, 30 microns. This study extends our previous work on the long-term orbital evolution of single dust particles trapped into resonances with the Earth.

  17. Extrusion product defects: a statistical study

    International Nuclear Information System (INIS)

    Qamar, S.Z.; Arif, A.F.M.; Sheikh, A.K.

    2003-01-01

    In any manufacturing environment, defects resulting in rework or rejection are directly related to product cost and quality, and indirectly linked with process, tooling and product design. An analysis of product defects is therefore integral to any attempt at improving productivity, efficiency and quality. Commercial aluminum extrusion is generally a hot working process and consists of a series of different but integrated operations: billet preheating and sizing, die set and container preheating, billet loading and deformation, product sizing and stretching/roll-correction, age hardening, and painting/anodizing. Product defects can be traced back to problems in billet material and preparation, die and die set design and maintenance, process variable aberrations (ram speed, extrusion pressure, container temperature, etc), and post-extrusion treatment (age hardening, painting/anodizing, etc). The current paper attempts to analyze statistically the product defects commonly encountered in a commercial hot aluminum extrusion setup. Real-world rejection data, covering a period of nine years, has been researched and collected from a local structural aluminum extrusion facility. Rejection probabilities have been calculated for all the defects studied. The nine-year rejection data have been statistically analyzed on the basis of (i) an overall breakdown of defects, (ii) year-wise rejection behavior, (iii) breakdown of defects in each of three cost centers: press, anodizing, and painting. (author)

  18. Statistical trend of radiation chemical studies

    International Nuclear Information System (INIS)

    Yoshida, Hiroshi

    1980-01-01

    In the field of radiation chemistry, over 1,000 reports are published year after year. An attempt has been made to review the trends in this field over more than five years by looking through the lists of papers statistically. For the period from 1974 to 1978, the Annual Cumulation with Keyword and Author Indexes in the Biweekly List of Papers on Radiation Chemistry was referred to. For 1979, because of the unavailability of the Cumulation, the Chemical Abstracts Search by the Japan Information Center of Science and Technology was referred to. The contents are as follows: how extensively radiation chemistry is studied, what the trends in radiation chemistry have been in recent years, who contributes to the advance of radiation chemistry and where, and what direction radiation chemistry took in 1979. (J.P.N.)

  19. Statistical study of auroral fragmentation into patches

    Science.gov (United States)

    Hashimoto, Ayumi; Shiokawa, Kazuo; Otsuka, Yuichi; Oyama, Shin-ichiro; Nozawa, Satonori; Hori, Tomoaki; Lester, Mark; Johnsen, Magnar Gullikstad

    2015-08-01

    The study of auroral dynamics is important when considering disturbances of the magnetosphere. Shiokawa et al. (2010, 2014) reported observations of finger-like auroral structures that cause auroral fragmentation. Those structures are probably produced by macroscopic instabilities in the magnetosphere, mainly of the Rayleigh-Taylor type. However, the statistical characteristics of these structures have not yet been investigated. Here based on observations by an all-sky imager at Tromsø (magnetic latitude = 67.1°N), Norway, over three winter seasons, we statistically analyzed the occurrence conditions of 14 large-scale finger-like structures that developed from large-scale auroral regions including arcs and 6 small-scale finger-like structures that developed in auroral patches. The large-scale structures were seen from midnight to dawn local time and usually appeared at the beginning of the substorm recovery phase, near the low-latitude boundary of the auroral region. The small-scale structures were primarily seen at dawn and mainly occurred in the late recovery phase of substorms. The sizes of these large- and small-scale structures mapped in the magnetospheric equatorial plane are usually larger than the gyroradius of 10 keV protons, indicating that the finger-like structures could be caused by magnetohydrodynamic instabilities. However, the scale of small structures is only twice the gyroradius of 10 keV protons, suggesting that finite Larmor radius effects may contribute to the formation of small-scale structures. The eastward propagation velocities of the structures are -40 to +200 m/s and are comparable with those of plasma drift velocities measured by the colocating Super Dual Auroral Radar Network radar.

  20. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  1. Clinical and Statistical Study on Canine Impaction

    Directory of Open Access Journals (Sweden)

    Adina-Simona Coșarcă

    2013-08-01

    Full Text Available Aim: The aim of this study was to perform a clinical and statistical study of patients with impacted permanent canines among those with dental impaction referred to and treated at the Oral and Maxillo-Facial Surgery Clinic of Tîrgu Mureș over a four-year period (2009-2012). Materials and methods: The study included 858 patients having dental impaction, and based on clinical records, different parameters related to canine impaction, like frequency, gender, age, quadrant involvement, patient residence, associated complications, referring specialist and type of treatment, were assessed. Results: The study revealed: an approximately 10% frequency of canine impaction among dental impactions; more frequent in women and in the first quadrant (tooth 13); most cases diagnosed between the ages of 10-19 years; patients under 20 were referred by an orthodontist, those over 20 by a dentist; surgical exposure was more often performed than odontectomy. Conclusions: Canine impaction is the second most frequent dental impaction in the dental arch after third molars; it occurs especially in women. Due to its important role, canine recovery within the dental arch is a goal to be achieved whenever possible. Therefore, the diagnosis and treatment of canine impaction require an interdisciplinary approach (surgical and orthodontic)

  2. Studies in Theoretical and Applied Statistics

    CERN Document Server

    Pratesi, Monica; Ruiz-Gazen, Anne

    2018-01-01

    This book includes a wide selection of the papers presented at the 48th Scientific Meeting of the Italian Statistical Society (SIS2016), held in Salerno on 8-10 June 2016. Covering a wide variety of topics ranging from modern data sources and survey design issues to measuring sustainable development, it provides a comprehensive overview of the current Italian scientific research in the fields of open data and big data in public administration and official statistics, survey sampling, ordinal and symbolic data, statistical models and methods for network data, time series forecasting, spatial analysis, environmental statistics, economic and financial data analysis, statistics in the education system, and sustainable development. Intended for researchers interested in theoretical and empirical issues, this volume provides interesting starting points for further research.

  3. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  4. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  5. Statistical study of auroral omega bands

    Directory of Open Access Journals (Sweden)

    N. Partamies

    2017-09-01

    Full Text Available The presence of very few statistical studies on auroral omega bands motivated us to test-use a semi-automatic method for identifying large-scale undulations of the diffuse aurora boundary and to investigate their occurrence. Five identical all-sky cameras with overlapping fields of view provided data for 438 auroral omega-like structures over Fennoscandian Lapland from 1996 to 2007. The results from this set of omega band events agree remarkably well with previous observations of omega band occurrence in magnetic local time (MLT), lifetime, location between the region 1 and 2 field-aligned currents, as well as current density estimates. The average peak emission height of omega forms corresponds to estimated precipitation energies of a few keV, which experienced no significant change during the events. Analysis of both local and global magnetic indices demonstrates that omega bands are observed during substorm expansion and recovery phases that are more intense than average substorm expansion and recovery phases in the same region. The omega occurrence with respect to the substorm expansion and recovery phases is in very good agreement with an earlier observed distribution of fast earthward flows in the plasma sheet during expansion and recovery phases. These findings support the theory that omegas are produced by fast earthward flows and auroral streamers, despite the rarity of good conjugate observations.

  6. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  7. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  8. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  9. Statistical spectroscopic studies in nuclear structure physics

    International Nuclear Information System (INIS)

    Halemane, T.R.

    1979-01-01

    The spectral distribution theory establishes the centroid and width of the energy spectrum as quantities of fundamental importance and gives credence to a geometry associated with averages of the product of pairs of operators acting within a model space. Utilizing this fact and partitioning the model space according to different group symmetries, simple and physically meaningful expansions are obtained for the model interactions. In the process, a global measure for the goodness of group symmetries is also developed. This procedure could eventually lead to a new way of constructing model interactions for nuclear structure studies. Numerical results for six (ds)-shell interactions and for scalar-isospin, configuration-isospin, space symmetry, supermultiplet and SU(3) x SU(4) group structures are presented. The notion of simultaneous propagation of operator averages in the irreps of two or more groups (not necessarily commuting) is also introduced. The non-energy-weighted sum rules (NEWSR) for electric and magnetic multipole excitations in the (ds)-shell nuclei 20Ne, 24Mg, 28Si, 32S, and 36Ar are evaluated. A generally applicable procedure for evaluating the eigenvalue bound to the NEWSR is presented and numerical results obtained for the said excitations and nuclei. Comparisons are made with experimental data and shell-model results. Further, a general theory is given for the linear-energy-weighted sum rule (LEWSR). When the Hamiltonian is one-body, this has a very simple form (expressible in terms of occupancies) and amounts to an extension of the Kurath sum rule to other types of excitations and to arbitrary one-body Hamiltonians. Finally, we develop a statistical approach to perturbation theory and inverse-energy-weighted sum rules, and indicate some applications

  10. Statistical mechanics of high-density bond percolation

    Science.gov (United States)

    Timonin, P. N.

    2018-05-01

    High-density (HD) percolation describes the percolation of specific κ-clusters, which are the compact sets of sites each connected to at least κ nearest filled sites. It takes place in the classical patterns of independently distributed sites or bonds in which the ordinary percolation transition also exists. Hence, the study of the series of κ-type HD percolations amounts to the description of classical clusters' structure, for which κ-clusters constitute κ-cores nested one into another. Such data are needed for the description of a number of physical, biological, and information properties of complex systems on random lattices, graphs, and networks. They range from magnetic properties of semiconductor alloys to anomalies in supercooled water and clustering in biological and social networks. Here we present the statistical mechanics approach to study HD bond percolation on an arbitrary graph. It is shown that the generating function for the κ-clusters' size distribution can be obtained from the partition function of a specific q-state Potts-Ising model in the q → 1 limit. Using this approach we find exact κ-clusters' size distributions for the Bethe lattice and the Erdos-Renyi graph. The application of the method to Euclidean lattices is also discussed.
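
    Since the abstract notes that κ-clusters constitute nested κ-cores of ordinary clusters, the structure is easy to probe numerically on an Erdos-Renyi graph. The sketch below uses networkx's k_core routine with invented graph parameters; it illustrates the setting, not the paper's Potts-Ising calculation.

```python
# Nested k-cores of an Erdos-Renyi graph G(n, p) with mean degree ~ 8.
import networkx as nx

n, p = 10_000, 0.0008
G = nx.gnp_random_graph(n, p, seed=42)

for k in (1, 2, 3, 4):
    core = nx.k_core(G, k=k)   # maximal subgraph in which all degrees are >= k
    print(f"k={k}: {core.number_of_nodes()} nodes in the k-core")
```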

  11. TRAN-STAT: statistics for environmental studies

    International Nuclear Information System (INIS)

    Gilbert, R.O.

    1984-09-01

    This issue of TRAN-STAT discusses statistical methods for assessing the uncertainty in predictions of pollutant transport models, particularly for radionuclides. Emphasis is placed on radionuclide transport models but the statistical assessment techniques also apply in general to other types of pollutants. The report begins with an outline of why an assessment of prediction uncertainties is important. This is followed by an introduction to several methods currently used in these assessments. This in turn is followed by more detailed discussion of the methods, including examples. 43 references, 2 figures
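
    One common method in such assessments is Monte Carlo propagation of parameter uncertainty. The toy model below, a first-order decay/retardation expression with invented lognormal parameter distributions, is a generic illustration and not one of the specific methods discussed in the TRAN-STAT issue.

```python
# Illustrative Monte Carlo uncertainty propagation through a toy transport model.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Uncertain inputs: decay constant and retardation factor (assumed lognormal)
lam = rng.lognormal(mean=np.log(0.01), sigma=0.3, size=n)   # 1/yr
R = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=n)      # retardation factor

t = 100.0                                 # years
conc = np.exp(-lam * t / R)               # fraction of initial concentration

print(f"median = {np.median(conc):.3f}, "
      f"90% interval = ({np.percentile(conc, 5):.3f}, {np.percentile(conc, 95):.3f})")
```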

  12. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  13. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  14. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  15. High-dimensional statistical inference: From vector to matrix

    Science.gov (United States)

    Zhang, Anru

    estimator is easy to implement via convex programming and performs well numerically. The techniques and main results developed in the chapter also have implications for other related statistical problems. An application to estimation of spiked covariance matrices from one-dimensional random projections is considered. The results demonstrate that it is still possible to accurately estimate the covariance matrix of a high-dimensional distribution based only on one-dimensional projections. For the third part of the thesis, we consider another setting of low-rank matrix completion. Current literature on matrix completion focuses primarily on independent sampling models under which the individual observed entries are sampled independently. Motivated by applications in genomic data integration, we propose a new framework of structured matrix completion (SMC) to treat structured missingness by design. Specifically, our proposed method aims at efficient matrix recovery when a subset of the rows and columns of an approximately low-rank matrix are observed. We provide theoretical justification for the proposed SMC method and derive lower bounds for the estimation errors, which together establish the optimal rate of recovery over certain classes of approximately low-rank matrices. Simulation studies show that the method performs well in finite samples under a variety of configurations. The method is applied to integrate several ovarian cancer genomic studies with different extents of genomic measurements, which enables us to construct more accurate prediction rules for ovarian cancer survival.
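
    The block-structured missingness considered here can be illustrated with a textbook low-rank identity rather than the full SMC method: when the row and column blocks A11, A12, A21 of an exactly rank-r matrix are observed and A11 has rank r, the missing block equals A21 pinv(A11) A12. The dimensions below are arbitrary.

```python
# Recovering the unobserved block of an exactly rank-r matrix from its
# observed row/column blocks, using the pseudoinverse identity above.
import numpy as np

rng = np.random.default_rng(2)
r, m, n = 3, 50, 40
A = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))   # exact rank-r matrix

k = 20                                                   # observed rows/columns
A11, A12 = A[:k, :k], A[:k, k:]
A21 = A[k:, :k]

A22_hat = A21 @ np.linalg.pinv(A11) @ A12
print(np.allclose(A22_hat, A[k:, k:]))                   # True for exact low rank
```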

  16. [Suicide in Luxembourg: a statistical study].

    Science.gov (United States)

    1983-01-01

    A review of the situation concerning suicide in Luxembourg is presented. The existing laws are first described, and some methodological questions are summarized. A statistical analysis of suicide in the country is then presented. Data are included on trends over time, 1881-1982; and on variations in suicide by sex, age, marital status, religion, nationality, and occupation and standard of living. A bibliography is also provided.

  17. Study of statistical properties of hybrid statistic in coherent multi-detector compact binary coalescences Search

    OpenAIRE

    Haris, K; Pai, Archana

    2015-01-01

    In this article, we revisit the problem of the coherent multi-detector search for gravitational waves from compact binary coalescences of neutron stars and black holes using advanced interferometers like LIGO-Virgo. Based on the loss of optimal multi-detector signal-to-noise ratio (SNR), we construct a hybrid statistic as the best of the maximum-likelihood-ratio (MLR) statistics tuned for face-on and face-off binaries. The statistical properties of the hybrid statistic are studied. The performance of this ...

  18. Eulerian and Lagrangian statistics from high resolution numerical simulations of weakly compressible turbulence

    NARCIS (Netherlands)

    Benzi, R.; Biferale, L.; Fisher, R.T.; Lamb, D.Q.; Toschi, F.

    2009-01-01

    We report a detailed study of Eulerian and Lagrangian statistics from high resolution Direct Numerical Simulations of isotropic weakly compressible turbulence. The Reynolds number at the Taylor microscale is estimated to be around 600. Eulerian and Lagrangian statistics are evaluated over a huge data

  19. Statistical mechanics of complex neural systems and high dimensional data

    International Nuclear Information System (INIS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-01-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (paper)

  20. Binomial vs poisson statistics in radiation studies

    International Nuclear Information System (INIS)

    Foster, J.; Kouris, K.; Spyrou, N.M.; Matthews, I.P.; Welsh National School of Medicine, Cardiff

    1983-01-01

    The processes of radioactive decay, decay and growth of radioactive species in a radioactive chain, prompt emission(s) from nuclear reactions, conventional activation and cyclic activation are discussed with respect to their underlying statistical density function. By considering the transformation(s) that each nucleus may undergo it is shown that all these processes are fundamentally binomial. Formally, when the number of experiments N is large and the probability of success p is close to zero, the binomial is closely approximated by the Poisson density function. In radiation and nuclear physics, N is always large: each experiment can be conceived of as the observation of the fate of each of the N nuclei initially present. Whether p, the probability that a given nucleus undergoes a prescribed transformation, is close to zero depends on the process and nuclide(s) concerned. Hence, although a binomial description is always valid, the Poisson approximation is not always adequate. Therefore further clarification is provided as to when the binomial distribution must be used in the statistical treatment of detected events. (orig.)
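
    The closing point, that the Poisson approximation is adequate only when p is close to zero, can be checked numerically. The values of N and p below are illustrative.

```python
# Compare binomial and Poisson standard deviations for the number of decays
# among N nuclei: they agree for small p and diverge for p not close to zero.
from scipy import stats

N = 1_000_000                       # nuclei initially present
for p in (1e-5, 0.2):               # decay probability per nucleus
    mu = N * p
    binom_sd = stats.binom.std(N, p)        # sqrt(N p (1 - p))
    poisson_sd = stats.poisson.std(mu)      # sqrt(N p)
    print(f"p={p:g}: binomial sd={binom_sd:.1f}, Poisson sd={poisson_sd:.1f}")
```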

  1. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
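
    As a hedged sketch of the logistic regression use discussed in the article, the snippet below fits a binary outcome to a continuous imaging measurement on simulated data. The model coefficients and the choice of statsmodels are assumptions for illustration, not the article's analysis.

```python
# Logistic regression of a binary outcome on a continuous predictor (simulated).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
marker = rng.normal(1.0, 0.5, n)                      # imaging measurement
logit = -2.0 + 1.5 * marker                           # assumed true model
outcome = rng.random(n) < 1 / (1 + np.exp(-logit))    # binary outcome

X = sm.add_constant(marker)
fit = sm.Logit(outcome.astype(float), X).fit(disp=0)
print(fit.params)        # intercept and log-odds change per unit of the marker
```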

  2. Statistical emission of complex fragments from highly excited compound nucleus

    International Nuclear Information System (INIS)

    Matsuse, T.

    1991-01-01

    A full statistical analysis has been given in terms of the Extended Hauser-Feshbach method. The charge and kinetic energy distributions of the 35Cl + 12C reaction at E_lab = 180, 200 MeV and the 23Na + 24Mg reaction at E_lab = 89 MeV, which form the 47V compound nucleus, are investigated as a prototype of the light mass system. The measured kinetic energy distributions of the complex fragments are shown to be well reproduced by the Extended Hauser-Feshbach method, so the observed complex fragment production is understood as statistical binary decay from the compound nucleus induced by the heavy-ion reaction. Next, this method is applied to the study of complex fragment production from the 111In compound nucleus, which is formed by the 84Kr + 27Al reaction at E_lab = 890 MeV. (K.A.) 18 refs., 10 figs

  3. Statistical approach for calculating opacities of high-Z plasmas

    International Nuclear Information System (INIS)

    Nishikawa, Takeshi; Nakamura, Shinji; Takabe, Hideaki; Mima, Kunioki

    1992-01-01

    For simulating the X-ray radiation from laser-produced high-Z plasma, an appropriate atomic model is necessary. Based on the average ion model, we have used a rather simple atomic model for opacity calculations in a hydrodynamic code and obtained fairly good agreement with experiment on the X-ray spectra from laser-produced plasmas. We have investigated the accuracy of the atomic model used in the hydrodynamic code. It is found that the transition energies of 4p-4d, 4d-4f, 4p-5d, 4d-5f and 4f-5g, which are important in laser-produced high-Z plasma, can be given within an error of 15% of the values from the Hartree-Fock-Slater (HFS) calculation, and that their oscillator strengths obtained by the HFS calculation vary by a factor of two according to the charge state. We also propose a statistical method to carry out detailed configuration accounting for the electronic state by use of the population of bound electrons calculated with the average ion model. The statistical method is relatively simple and provides much improvement in calculating spectral opacities of line radiation when we use the average ion model to determine the electronic state. (author)

  4. Topics in statistical data analysis for high-energy physics

    International Nuclear Information System (INIS)

    Cowan, G.

    2011-01-01

    These lectures concern two topics that are becoming increasingly important in the analysis of high-energy physics data: Bayesian statistics and multivariate methods. In the Bayesian approach, we extend the interpretation of probability not only to cover the frequency of repeatable outcomes but also to include a degree of belief. In this way we are able to associate probability with a hypothesis and thus to answer directly questions that cannot be addressed easily with traditional frequentist methods. In multivariate analysis, we try to exploit as much information as possible from the characteristics that we measure for each event to distinguish between event types. In particular we will look at a method that has gained popularity in high-energy physics in recent years: the boosted decision tree. Finally, we give a brief sketch of how multivariate methods may be applied in a search for a new signal process. (author)
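
    The boosted decision tree mentioned above can be sketched with scikit-learn on simulated "signal" and "background" events drawn from two Gaussian blobs. All parameters are illustrative, and the lectures themselves are not tied to this library.

```python
# Boosted decision tree separating simulated signal from background events.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 5000
background = rng.normal(0.0, 1.0, size=(n, 2))   # two measured event features
signal = rng.normal(0.8, 1.0, size=(n, 2))
X = np.vstack([background, signal])
y = np.r_[np.zeros(n), np.ones(n)]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_tr, y_tr)
print(f"test accuracy: {bdt.score(X_te, y_te):.3f}")
```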

  5. Doing Qualitative Studies, Using Statistical Reasoning

    DEFF Research Database (Denmark)

    Kristensen, Tore; Gabrielsen, Gorm

    2016-01-01

    Qualitative studies are associated with interviews, focus groups and observations. We introduce experiments as a way of dealing with such studies. In contrast to the common focus on how many respondents choose a particular behaviour, we focus on how much a design affects the individual. This is often

  6. Statistics and sampling in transuranic studies

    International Nuclear Information System (INIS)

    Eberhardt, L.L.; Gilbert, R.O.

    1980-01-01

    The existing data on transuranics in the environment exhibit a remarkably high variability from sample to sample (coefficients of variation of 100% or greater). This chapter stresses the necessity of adequate sample size and suggests various ways to increase sampling efficiency. Objectives in sampling are regarded as being of great importance in making decisions as to sampling methodology. Four different classes of sampling methods are described: (1) descriptive sampling, (2) sampling for spatial pattern, (3) analytical sampling, and (4) sampling for modeling. A number of research needs are identified in the various sampling categories along with several problems that appear to be common to two or more such areas
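
    A back-of-envelope consequence of the 100% coefficients of variation noted above: with the normal approximation n ≈ (z·CV/d)², estimating a mean to within ±20% at roughly 95% confidence needs on the order of 100 samples. The sketch below encodes this rule of thumb; the target precision is an invented example.

```python
# Approximate sample size to estimate a mean within a given relative error,
# for data with coefficient of variation cv (normal approximation).
from scipy.stats import norm

def n_required(cv, rel_error, conf=0.95):
    z = norm.ppf(1 - (1 - conf) / 2)
    return int((z * cv / rel_error) ** 2) + 1

print(n_required(cv=1.0, rel_error=0.20))   # ~97 samples
```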

  7. Statistical modeling studies of turbulent reacting flows

    International Nuclear Information System (INIS)

    Dwyer, H.A.

    1987-01-01

    This paper discusses the study of turbulent wall shear flows, and we feel that this problem is both more difficult and a better challenge for the new methods we are developing. Turbulent wall flows have a wide variety of length and time scales which interact with the transport processes to produce very large fluxes of mass, heat, and momentum. At the present time we have completed the first calculation of a wall diffusion flame, and we have begun a velocity PDF calculation for the flat plate boundary layer. A summary of the various activities is contained in this report

  8. Interactions: A Study of Office Reference Statistics

    Directory of Open Access Journals (Sweden)

    Naomi Lederer

    2012-06-01

    Objective – The purpose of this study was to analyze the data from a reference statistics-gathering mechanism at Colorado State University (CSU) Libraries. It aimed primarily to better understand patron behaviours, particularly in an academic library with no reference desk. Methods – The researchers examined data from 2007 to 2010 of College Liaison Librarians' consultations with patrons. Data were analyzed by various criteria, including patron type, contact method, and time spent with the patron. The information was examined in the aggregate, meaning all librarians combined, and then specifically from the Liberal Arts and Business subject areas. Results – The researchers found that the number of librarian reference consultations is substantial. Referrals to librarians from CSU's Morgan Library's one public service desk have declined over time. The researchers also found that graduate students are the primary patrons and email is the preferred contact method overall. Conclusion – The researchers found that interactions with patrons in librarians' offices – either in person or virtually – remain substantial even without a traditional reference desk. The data suggest that librarians' efforts at marketing themselves to departments, colleges, and patrons have been successful. This study will be of value to reference, subject specialist, and public service librarians, and library administrators as they consider ways to quantify their work, not only for administrative purposes, but in order to follow trends and provide services and staffing accordingly.

  9. Statistical physics, neural networks, brain studies

    International Nuclear Information System (INIS)

    Toulouse, G.

    1999-01-01

    An overview of some aspects of a vast domain, located at the crossroads of physics, biology and computer science is presented: (1) During the last fifteen years, physicists advancing along various pathways have come into contact with biology (computational neurosciences) and engineering (formal neural nets). (2) This move may actually be viewed as one component in a larger picture. A prominent trend of recent years, observable over many countries, has been the establishment of interdisciplinary centers devoted to the study of: cognitive sciences; natural and artificial intelligence; brain, mind and behaviour; perception and action; learning and memory; robotics; man-machine communication, etc. What are the promising lines of development? What opportunities for physicists? An attempt will be made to address such questions and related issues

  10. Statistics of the Iomazenil-multicenter study

    International Nuclear Information System (INIS)

    Hasler, P.H.; Beer-Wohlfahrt, H.; Schubiger, P.A.

    1990-01-01

    123I-Ro 16-0154 (Iomazenil) has been shown to be a very potent benzodiazepine antagonist by Beer et al. (1990), in a study describing its in vitro and in vivo characteristics. The preliminary clinical results revealed clear images of the benzodiazepine receptor density in the brain. Storage defects due to pathological CBF and changed receptor density were also detected. Iomazenil thus showed potential usefulness, and therefore extended clinical tests in a multicenter study have been performed. The goals were twofold: first, to define the normal benzodiazepine receptor distribution in a healthy human brain, and second, to investigate the possible diagnostic usefulness of Iomazenil in the case of partial epilepsy. Furthermore, the pharmacokinetics were to be determined in a few normal volunteers, and some patients with other diseases (e.g. Alzheimer's) would be screened. Some preliminary clinical results have already been published. Hoell et al. (1989) compared the biodistribution of Iomazenil in normal humans with the published animal data (Beer et al., 1990) and concluded that the distribution was similar. The radioactivity concentration in the plasma was virtually cleared after 15 min; cerebral uptake reached a maximum at 10-15 min p.i. and remained stable for about 20 min. Bangerl et al. (1990) found a reduction of benzodiazepine receptors in patients with Lennox-Gastaut syndrome. Bartenstein et al. (1989) compared early (30 min) and late (2 h) SPECT images. They found that early images showed flow phenomena and receptor binding, whereas the late ones corresponded clearly to receptor binding. Their findings confirm the results published by Beer et al. (1990). In this work the patient data of all participating clinical centers are evaluated and discussed. (author) 3 figs., 7 tabs., 5 refs

  11. Detection of Doppler Microembolic Signals Using High Order Statistics

    Directory of Open Access Journals (Sweden)

    Maroun Geryes

    2016-01-01

    Robust detection of the smallest circulating cerebral microemboli is an efficient way of preventing strokes, the second leading cause of mortality worldwide. Transcranial Doppler ultrasound is widely considered the most convenient system for the detection of microemboli. The most common standard detection is achieved through the Doppler energy signal and depends on an empirically set constant threshold. On the other hand, in the past few years, higher order statistics have been an extensive field of research, as they represent descriptive statistics that can be used to detect signal outliers. In this study, we propose new types of microembolic detectors based on the windowed calculation of the third moment (skewness) and fourth moment (kurtosis) of the energy signal. During embolus-free periods the distribution of the energy is not altered, and the skewness and kurtosis signals do not exhibit any peak values. In the presence of emboli, the energy distribution is distorted and the skewness and kurtosis signals exhibit peaks corresponding to those emboli. Applied to real signals, detection of microemboli through the skewness and kurtosis signals outperformed detection through standard methods. The sensitivities and specificities reached 78% and 91%, and 80% and 90%, for the skewness and kurtosis detectors, respectively.
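
    The windowed-moment idea lends itself to a compact sketch. The following is a hypothetical illustration (the window length, hop size, detection threshold, and synthetic Doppler energy signal are all assumptions, not the paper's settings):

```python
# Sliding-window skewness/kurtosis detection of embolus-like energy bursts.
import numpy as np
from scipy.stats import skew, kurtosis

def windowed_moments(energy, win=128, hop=32):
    """Third- and fourth-moment signals of a Doppler energy trace."""
    starts = range(0, len(energy) - win + 1, hop)
    s = np.array([skew(energy[i:i + win]) for i in starts])
    k = np.array([kurtosis(energy[i:i + win]) for i in starts])  # excess kurtosis
    return s, k

rng = np.random.default_rng(1)
energy = rng.rayleigh(scale=1.0, size=4096)  # embolus-free background
energy[2000:2040] += 8.0                     # one transient high-energy event

s, k = windowed_moments(energy)
# An embolus distorts the local energy distribution, so the moment signals
# peak near the event; flag windows exceeding an illustrative threshold.
print("windows flagged by kurtosis:", np.flatnonzero(k > 5))
```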

  12. High statistics inclusive phi-meson production at SPS energies

    International Nuclear Information System (INIS)

    Dijkstra, H.B.

    1985-01-01

    This thesis describes an experiment studying the inclusive reaction hadron + Be → phi + anything → K+ + K- + anything in 100 GeV/c, 120 GeV/c and 200 GeV/c hadron interactions. A total of 8×10^6 events were recorded using both positively and negatively charged unseparated hadron beams supplied by the CERN SPS. The experiment made use of an intelligent on-line event selection system based on micro-processors (FAMPs) in conjunction with a system of large MWPCs to increase the number of phi-events recorded per unit time. In 32 days of data taking over 600,000 phi-mesons were recorded onto magnetic tape. The physics motivation for collecting a large statistics sample of inclusive phi-mesons was the investigation of the inclusive phi-meson production mechanism and phi-spectroscopy. (Auth.)

  13. Finding differentially expressed genes in high dimensional data: Rank based test statistic via a distance measure.

    Science.gov (United States)

    Mathur, Sunil; Sadana, Ajit

    2015-12-01

    We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume the distribution of the parent population. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as the paired t-test, the Wilcoxon signed-rank test, and significance analysis of microarrays (SAM), under certain non-normal distributions. The asymptotic distribution of the test statistic and the p-value function are discussed. The application of the proposed method is shown using a real-life data set.
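
    The paper's distance-based rank statistic is not reproduced here, but the comparison it draws can be sketched. As a hypothetical stand-in, the snippet below runs two of the cited baseline tests (paired t-test and Wilcoxon signed-rank) on simulated heavy-tailed expression values for a single gene:

```python
import numpy as np
from scipy.stats import ttest_rel, wilcoxon

rng = np.random.default_rng(2)
# Paired expression measurements (e.g., tumour vs normal) for 20 subjects,
# drawn from a heavy-tailed distribution with a small location shift.
normal = rng.standard_t(df=3, size=20)
tumour = normal + 0.8 + 0.2 * rng.standard_t(df=3, size=20)

print("paired t-test p-value:       ", ttest_rel(tumour, normal).pvalue)
print("Wilcoxon signed-rank p-value:", wilcoxon(tumour, normal).pvalue)
```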

  14. A Validity Study: Attitudes towards Statistics among Japanese College Students

    Science.gov (United States)

    Satake, Eike

    2015-01-01

    This cross-cultural study investigated the relationship between attitudes toward statistics (ATS) and course achievement (CA) among Japanese college students. The sample consisted of 135 male and 134 female students from the first two-year liberal arts program of a four-year college in Tokyo, Japan. Attitudes about statistics were measured using…

  15. Statistical process control using optimized neural networks: a case study.

    Science.gov (United States)

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two aspects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen to recognize the CCPs. Second, a hybrid heuristic recognition system based on the cuckoo optimization algorithm (COA) is introduced to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy.
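
    A minimal sketch of the classifier-module idea, under stated assumptions: three synthetic control chart pattern classes (in-control, upward trend, upward shift) are classified by a scikit-learn multilayer perceptron on the raw windows. The paper's shape/statistical features and COA-tuned classifier are not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, length = 600, 32

def make_pattern(kind):
    """Generate one synthetic control chart window of a given class."""
    t = np.arange(length)
    noise = rng.normal(0, 1, length)
    if kind == 0:                       # in-control: pure noise
        return noise
    if kind == 1:                       # upward trend
        return noise + 0.1 * t
    return noise + np.where(t >= length // 2, 3.0, 0.0)  # upward shift

y = rng.integers(0, 3, size=n)
X = np.array([make_pattern(k) for k in y])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"CCP recognition accuracy: {clf.score(X_te, y_te):.2f}")
```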

  16. Highly Robust Statistical Methods in Medical Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Vol. 32, No. 2 (2012), pp. 3-16. ISSN 0208-5216. R&D Projects: GA MŠk(CZ) 1M06014. Institutional research plan: CEZ:AV0Z10300504. Keywords: robust statistics * classification * faces * robust image analysis * forensic science. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.208, year: 2012. http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  17. An Entropy-Based Statistic for Genomewide Association Studies

    OpenAIRE

    Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao

    2005-01-01

    Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard χ² statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the difference...
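
    The exact form of the paper's entropy-based statistic is not given in this excerpt; as a hedged illustration of the underlying contrast between a linear and a nonlinear (entropy) function of allele frequencies, consider:

```python
import numpy as np

def shannon_entropy(p):
    """Entropy of an allele-frequency vector (0 * log 0 taken as 0)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))

cases = np.array([0.62, 0.38])     # hypothetical allele frequencies, cases
controls = np.array([0.50, 0.50])  # hypothetical allele frequencies, controls

print("linear difference: ", abs(cases[0] - controls[0]))
print("entropy difference:", abs(shannon_entropy(cases) - shannon_entropy(controls)))
```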

  18. Statistics available for site studies in registers and surveys at Statistics Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Haldorson, Marie [Statistics Sweden, Oerebro (Sweden)]

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future

  19. Statistics available for site studies in registers and surveys at Statistics Sweden

    International Nuclear Information System (INIS)

    Haldorson, Marie

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future

  1. Eddies in the Red Sea: A statistical and dynamical study

    KAUST Repository

    Zhan, Peng

    2014-06-01

    Sea level anomaly (SLA) data spanning 1992–2012 were analyzed to study the statistical properties of eddies in the Red Sea. An algorithm that identifies winding angles was employed to detect 4998 eddies propagating along 938 unique eddy tracks. Statistics suggest that eddies are generated across the entire Red Sea but that they are prevalent in certain regions. A high number of eddies is found in the central basin between 18°N and 24°N. More than 87% of the detected eddies have a radius ranging from 50 to 135 km. Both the intensity and relative vorticity scale of these eddies decrease as the eddy radii increase. The average eddy lifespan is approximately 6 weeks. Anticyclonic eddies (AEs) and cyclonic eddies (CEs) have different deformation features, and those with stronger intensities are less deformed and more circular. Analysis of long-lived eddies suggests that they are likely to appear in the central basin, with AEs tending to move northward. In addition, their eddy kinetic energy (EKE) increases gradually throughout their lifespans. The annual cycles of CEs and AEs differ, although both exhibit significant seasonal cycles of intensity, with the winter and summer peaks appearing in February and August, respectively. The seasonal cycle of EKE is negatively correlated with stratification but positively correlated with vertical shear of horizontal velocity and eddy growth rate, suggesting that baroclinic instability is responsible for eddy activity in the Red Sea.

  2. Studies on coal flotation in flotation column using statistical technique

    Energy Technology Data Exchange (ETDEWEB)

    M.S. Jena; S.K. Biswal; K.K. Rao; P.S.R. Reddy [Institute of Minerals & Materials Technology (IMMT), Orissa (India)]

    2009-07-01

    Flotation of Indian high-ash coking coal fines to obtain clean coal has been reported earlier by many authors. Here an attempt has been made to systematically analyse the factors influencing the flotation process using the statistical design-of-experiments technique. Studies carried out in a 100 mm diameter column using a factorial design to establish the weightage of factors such as feed rate, air rate and collector dosage indicated that all three parameters have equal influence on the flotation process. Subsequently an RSM-CCD design was used to obtain the best result, and it is observed that 94% of combustibles can be recovered with 82.5% weight recovery at 21.4% ash from a feed containing 31.3% ash.
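
    As a hedged sketch of the factorial-design step (coded levels, stand-in responses, and effect sizes are all assumptions, not the reported data), a two-level full factorial in the three factors named above can be built and fitted as follows:

```python
import itertools
import numpy as np

factors = ["feed_rate", "air_rate", "collector_dosage"]
# Coded levels: -1 (low), +1 (high); 2^3 = 8 runs.
design = np.array(list(itertools.product([-1, 1], repeat=3)))

rng = np.random.default_rng(4)
# Stand-in responses (combustible recovery, %); real values come from runs.
response = 85 + design @ np.array([2.0, 2.1, 1.9]) + rng.normal(0, 0.5, 8)

# Main effects via least squares on the coded design matrix.
X = np.column_stack([np.ones(8), design])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, c in zip(["intercept"] + factors, coef):
    print(f"{name}: {c:+.2f}")
```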

  3. Statistical behavior of high doses in medical radiodiagnosis; Comportamento estatistico das altas doses em radiodiagnostico medico

    Energy Technology Data Exchange (ETDEWEB)

    Barboza, Adriana Elisa, E-mail: adrianaebarboza@gmail.com, E-mail: elisa@bolsista.ird.gov.br [Instituto de Radioprotecao e Dosimetria, (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]

    2014-07-01

    The main purpose of this work is to statistically estimate occupational exposure in medical diagnostic radiology for the cases of high doses recorded in 2011 at the national level. For the statistical survey of this study, the doses of 372 occupationally exposed individuals (IOEs) in diagnostic radiology in different Brazilian states were evaluated. Data were extracted from the monograph (Research Methodology of High Doses in Medical Radiodiagnosis), which contains information from the dose management sector database of IRD/CNEN-RJ, Brazil. Identifying these states allows the responsible Sanitary Surveillance agency (VISA) to become aware of such events and to work on programs to reduce them. (author)

  4. High-temperature behavior of a deformed Fermi gas obeying interpolating statistics.

    Science.gov (United States)

    Algin, Abdullah; Senay, Mustafa

    2012-04-01

    An outstanding idea originally introduced by Greenberg is to investigate whether there is equivalence between intermediate statistics, which may be different from anyonic statistics, and q-deformed particle algebra. A model studied to address this idea could also provide new insight into the interactions of particles as well as their internal structures. Motivated mainly by this idea, in this work we consider a q-deformed Fermi gas model whose statistical properties enable us to effectively study interpolating statistics. Starting with a generalized Fermi-Dirac distribution function, we derive several thermostatistical functions of a gas of these deformed fermions in the thermodynamic limit. We study the high-temperature behavior of the system by analyzing the effects of q deformation on the most important thermostatistical characteristics of the system, such as the entropy, specific heat, and equation of state. It is shown that such a deformed fermion model in two and three spatial dimensions exhibits interpolating statistics in a specific interval of the model deformation parameter 0 < q < 1. In particular, for two and three spatial dimensions, it is found from the behavior of the third virial coefficient of the model that the deformation parameter q interpolates completely between attractive and repulsive systems, including the free boson and fermion cases. From the results obtained in this work, we conclude that such a model could provide much physical insight into some interacting theories of fermions, and could be useful for further study of particle systems with intermediate statistics.

  5. Feasibility Study of Using Gemstone Spectral Imaging (GSI) and Adaptive Statistical Iterative Reconstruction (ASIR) for Reducing Radiation and Iodine Contrast Dose in Abdominal CT Patients with High BMI Values.

    Science.gov (United States)

    Zhu, Zheng; Zhao, Xin-ming; Zhao, Yan-feng; Wang, Xiao-yi; Zhou, Chun-wu

    2015-01-01

    To prospectively investigate the effect of using Gemstone Spectral Imaging (GSI) and adaptive statistical iterative reconstruction (ASIR) for reducing radiation and iodine contrast dose in abdominal CT patients with high BMI values. 26 patients (weight > 65 kg and BMI ≥ 22) underwent abdominal CT using GSI mode with 300 mgI/kg contrast material as the study group (group A). Another 21 patients (weight ≤ 65 kg and BMI ≥ 22) were scanned with a conventional 120 kVp tube voltage for a noise index (NI) of 11 with 450 mgI/kg contrast material as the control group (group B). GSI images were reconstructed at 60 keV with 50% ASIR, and the conventional 120 kVp images were reconstructed with FBP reconstruction. The CT values, standard deviation (SD), signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) of 26 landmarks were quantitatively measured and image quality qualitatively assessed using statistical analysis. As for the quantitative analysis, the difference in CNR between groups A and B was significant for all landmarks except the mesenteric vein. The SNR in group A was higher than in B except for the mesenteric artery and splenic artery. As for the qualitative analysis, all images had diagnostic quality, and the agreement for image quality assessment between the reviewers was substantial (kappa = 0.684). CT dose index (CTDI) values for the non-enhanced, arterial phase and portal phase in group A were decreased by 49.04%, 40.51% and 40.54% compared with group B (P = 0.000), respectively. The total dose and the injection rate for the contrast material were reduced by 14.40% and 14.95% in A compared with B. The use of GSI and ASIR provides similar enhancement in vessels and image quality with reduced radiation dose and contrast dose, compared with the use of a conventional scan protocol.
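
    For readers unfamiliar with the figures of merit, a small hedged illustration of how SNR and CNR are commonly computed from region-of-interest statistics follows (the HU numbers are hypothetical, and CNR conventions for the noise term vary between studies):

```python
# Signal-to-noise and contrast-to-noise ratios from ROI statistics.
mean_roi, sd_roi = 180.0, 12.0  # mean and SD (HU) in the vessel ROI
mean_bg, sd_bg = 60.0, 10.0     # mean and SD (HU) in adjacent background

snr = mean_roi / sd_roi
cnr = (mean_roi - mean_bg) / sd_bg  # one common convention
print(f"SNR = {snr:.1f}, CNR = {cnr:.1f}")
```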

  6. Feasibility Study of Using Gemstone Spectral Imaging (GSI) and Adaptive Statistical Iterative Reconstruction (ASIR) for Reducing Radiation and Iodine Contrast Dose in Abdominal CT Patients with High BMI Values.

    Directory of Open Access Journals (Sweden)

    Zheng Zhu

    To prospectively investigate the effect of using Gemstone Spectral Imaging (GSI) and adaptive statistical iterative reconstruction (ASIR) for reducing radiation and iodine contrast dose in abdominal CT patients with high BMI values. 26 patients (weight > 65 kg and BMI ≥ 22) underwent abdominal CT using GSI mode with 300 mgI/kg contrast material as the study group (group A). Another 21 patients (weight ≤ 65 kg and BMI ≥ 22) were scanned with a conventional 120 kVp tube voltage for a noise index (NI) of 11 with 450 mgI/kg contrast material as the control group (group B). GSI images were reconstructed at 60 keV with 50% ASIR, and the conventional 120 kVp images were reconstructed with FBP reconstruction. The CT values, standard deviation (SD), signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) of 26 landmarks were quantitatively measured and image quality qualitatively assessed using statistical analysis. As for the quantitative analysis, the difference in CNR between groups A and B was significant for all landmarks except the mesenteric vein. The SNR in group A was higher than in B except for the mesenteric artery and splenic artery. As for the qualitative analysis, all images had diagnostic quality, and the agreement for image quality assessment between the reviewers was substantial (kappa = 0.684). CT dose index (CTDI) values for the non-enhanced, arterial phase and portal phase in group A were decreased by 49.04%, 40.51% and 40.54% compared with group B (P = 0.000), respectively. The total dose and the injection rate for the contrast material were reduced by 14.40% and 14.95% in A compared with B. The use of GSI and ASIR provides similar enhancement in vessels and image quality with reduced radiation dose and contrast dose, compared with the use of a conventional scan protocol.

  7. Recommendations for describing statistical studies and results in general readership science and engineering journals.

    Science.gov (United States)

    Gardenier, John S

    2012-12-01

    This paper recommends how authors of statistical studies can communicate to general audiences fully, clearly, and comfortably. The studies may use statistical methods to explore issues in science, engineering, and society or they may address issues in statistics specifically. In either case, readers without explicit statistical training should have no problem understanding the issues, the methods, or the results at a non-technical level. The arguments for those results should be clear, logical, and persuasive. This paper also provides advice for editors of general journals on selecting high quality statistical articles without the need for exceptional work or expense. Finally, readers are also advised to watch out for some common errors or misuses of statistics that can be detected without a technical statistical background.

  8. A STATISTICAL STUDY OF TRANSVERSE OSCILLATIONS IN A QUIESCENT PROMINENCE

    Energy Technology Data Exchange (ETDEWEB)

    Hillier, A. [Kwasan and Hida Observatories, Kyoto University, Kyoto 607-8471 (Japan); Morton, R. J. [Mathematics and Information Science, Northumbria University, Pandon Building, Camden Street, Newcastle upon Tyne NE1 8ST (United Kingdom); Erdélyi, R., E-mail: andrew@kwasan.kyoto-u.ac.jp [Solar Physics and Space Plasma Research Centre (SP2RC), University of Sheffield, Hicks Building, Hounsfield Road, Sheffield S3 7RH (United Kingdom)]

    2013-12-20

    The launch of the Hinode satellite has allowed for seeing-free observations at high resolution and high cadence, making it well suited to studying the dynamics of quiescent prominences. In recent years it has become clear that quiescent prominences support small-amplitude transverse oscillations; however, sample sizes are usually too small for general conclusions to be drawn. We remedy this by providing a statistical study of transverse oscillations in vertical prominence threads. Over a 4 hr period of observations it was possible to measure the properties of 3436 waves, finding periods from 50 to 6000 s with typical velocity amplitudes ranging between 0.2 and 23 km s^-1. The large number of observed waves allows the determination of the frequency dependence of the wave properties and derivation of the velocity power spectrum for the transverse waves. For frequencies less than 7 mHz, the frequency dependence of the velocity power is consistent with the velocity power spectra generated from observations of the horizontal motions of magnetic elements in the photosphere, suggesting that the prominence transverse waves are driven by photospheric motions. However, at higher frequencies the two distributions significantly diverge, with relatively more power found at higher frequencies in the prominence oscillations. These results highlight that waves over a large frequency range are ubiquitous in prominences, and that a significant amount of the wave energy is found at higher frequency.

  9. A STATISTICAL STUDY OF TRANSVERSE OSCILLATIONS IN A QUIESCENT PROMINENCE

    International Nuclear Information System (INIS)

    Hillier, A.; Morton, R. J.; Erdélyi, R.

    2013-01-01

    The launch of the Hinode satellite has allowed for seeing-free observations at high resolution and high cadence, making it well suited to studying the dynamics of quiescent prominences. In recent years it has become clear that quiescent prominences support small-amplitude transverse oscillations; however, sample sizes are usually too small for general conclusions to be drawn. We remedy this by providing a statistical study of transverse oscillations in vertical prominence threads. Over a 4 hr period of observations it was possible to measure the properties of 3436 waves, finding periods from 50 to 6000 s with typical velocity amplitudes ranging between 0.2 and 23 km s^-1. The large number of observed waves allows the determination of the frequency dependence of the wave properties and derivation of the velocity power spectrum for the transverse waves. For frequencies less than 7 mHz, the frequency dependence of the velocity power is consistent with the velocity power spectra generated from observations of the horizontal motions of magnetic elements in the photosphere, suggesting that the prominence transverse waves are driven by photospheric motions. However, at higher frequencies the two distributions significantly diverge, with relatively more power found at higher frequencies in the prominence oscillations. These results highlight that waves over a large frequency range are ubiquitous in prominences, and that a significant amount of the wave energy is found at higher frequency.

  10. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution depending on the data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommendations ...

  11. The Effect of Using Case Studies in Business Statistics

    Science.gov (United States)

    Pariseau, Susan E.; Kezim, Boualem

    2007-01-01

    The authors evaluated the effect on learning of using case studies in business statistics courses. The authors divided students into 3 groups: a control group, a group that completed 1 case study, and a group that completed 3 case studies. Results evidenced that, on average, students whom the authors required to complete a case analysis received…

  12. Excel 2016 in applied statistics for high school students a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2018-01-01

    This textbook is a step-by-step guide for high school, community college, or undergraduate students who are taking a course in applied statistics and wish to learn how to use Excel to solve statistical problems. All of the statistics problems in this book will come from the following fields of study: business, education, psychology, marketing, engineering and advertising. Students will learn how to perform key statistical tests in Excel without being overwhelmed by statistical theory. Each chapter briefly explains a topic and then demonstrates how to use Excel commands and formulas to solve specific statistics problems. This book gives practice in using Excel in two different ways: (1) writing formulas (e.g., confidence interval about the mean, one-group t-test, two-group t-test, correlation) and (2) using Excel’s drop-down formula menus (e.g., simple linear regression, multiple correlations and multiple regression, and one-way ANOVA). Three practice problems are provided at the end of each chapter, along w...

  13. A Case Study in Elementary Statistics: The Florida Panther Population

    Science.gov (United States)

    Lazowski, Andrew; Stopper, Geffrey

    2013-01-01

    We describe a case study that was created to intertwine the fields of biology and mathematics. This project is given in an elementary probability and statistics course for non-math majors. Some goals of this case study include: to expose students to biology in a math course, to apply probability to real-life situations, and to display how far a…

  14. Measuring University Students' Approaches to Learning Statistics: An Invariance Study

    Science.gov (United States)

    Chiesi, Francesca; Primi, Caterina; Bilgin, Ayse Aysin; Lopez, Maria Virginia; del Carmen Fabrizio, Maria; Gozlu, Sitki; Tuan, Nguyen Minh

    2016-01-01

    The aim of the current study was to provide evidence that an abbreviated version of the Approaches and Study Skills Inventory for Students (ASSIST) was invariant across different languages and educational contexts in measuring university students' learning approaches to statistics. Data were collected on samples of university students attending…

  15. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basic element is given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced-transitions Monte Carlo method, and the direct and weighted statistical estimation Monte Carlo methods are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest computational efficiency.
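
    A minimal sketch of the direct (analog) Monte Carlo baseline against which such estimators are compared, assuming a hypothetical 2-out-of-3 system with illustrative per-mission component failure probabilities:

```python
import numpy as np

rng = np.random.default_rng(5)
p_fail = np.array([1e-3, 2e-3, 1.5e-3])  # assumed component failure probabilities
n_trials = 1_000_000

# Sample component failures; the system fails if at least 2 of 3 fail.
fails = rng.random((n_trials, 3)) < p_fail
system_fail = fails.sum(axis=1) >= 2

u_hat = system_fail.mean()
# Binomial standard error: analog sampling needs very many trials for rare
# failures, which is why variance-reduced (weighted, forced-transition)
# estimators are attractive for highly reliable systems.
se = np.sqrt(u_hat * (1 - u_hat) / n_trials)
print(f"unreliability ~ {u_hat:.2e} +/- {se:.1e}")
```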

  16. General statistical data structure for epidemiologic studies of DOE workers

    International Nuclear Information System (INIS)

    Frome, E.L.; Hudson, D.R.

    1981-01-01

    Epidemiologic studies to evaluate the occupational risks associated with employment in the nuclear industry are currently being conducted by the Department of Energy. Data that have potential value in evaluating any long-term health effects of occupational exposure to low levels of radiation are obtained for each individual at a given facility. We propose a general data structure for statistical analysis that is used to define transformations from the data management system into the data analysis system. Statistical methods of interest in epidemiologic studies include contingency table analysis and survival analysis procedures that can be used to evaluate potential associations between occupational radiation exposure and mortality. The purposes of this paper are to discuss (1) the adequacy of this data structure for single- and multiple-facility analysis and (2) the statistical computing problems encountered in dealing with large populations over extended periods of time

  17. Advanced Placement® Statistics Students' Education Choices after High School. Research Notes. RN-38

    Science.gov (United States)

    Patterson, Brian F.

    2009-01-01

    Taking the AP Statistics course and exam does not appear to be related to greater interest in the statistical sciences. Despite this finding, with respect to deciding whether to take further statistics course work and majoring in statistics, students appear to feel prepared for, but not interested in, further study. There is certainly more…

  18. Statistical studies on quasars and active nuclei of galaxies

    International Nuclear Information System (INIS)

    Stasinska, G.

    1987-01-01

    A catalogue of optical, radio and X-ray properties of quasars and other active galactic nuclei, now in preparation, is presented. This catalogue may serve as a data base for statistical studies. As an example, we give some preliminary results concerning the determination of quasar masses.

  19. Examining reproducibility in psychology : A hybrid method for combining a statistically significant original study and a replication

    NARCIS (Netherlands)

    Van Aert, R.C.M.; Van Assen, M.A.L.M.

    2018-01-01

    The unrealistically high rate of positive results within psychology has increased attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter ...

  20. Statistical classification techniques in high energy physics (SDDT algorithm)

    International Nuclear Information System (INIS)

    Bouř, Petr; Kůs, Václav; Franc, Jiří

    2016-01-01

    We present our proposal of the supervised binary divergence decision tree with nested separation method based on the generalized linear models. A key insight we provide is the clustering driven only by a few selected physical variables. The proper selection consists of the variables achieving the maximal divergence measure between two different classes. Further, we apply our method to Monte Carlo simulations of physics processes corresponding to a data sample of top quark-antiquark pair candidate events in the lepton+jets decay channel. The data sample is produced in pp̅ collisions at √s = 1.96 TeV. It corresponds to an integrated luminosity of 9.7 fb^-1 recorded with the D0 detector during Run II of the Fermilab Tevatron Collider. The efficiency of our algorithm achieves 90% AUC in separating signal from background. We also briefly deal with the modification of statistical tests applicable to weighted data sets in order to test homogeneity of the Monte Carlo simulations and measured data. The justification of these modified tests is proposed through the divergence tests. (paper)

  1. Computational and statistical methods for high-throughput analysis of post-translational modifications of proteins

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Braga, Thiago Verano; Roepstorff, Peter

    2015-01-01

    The investigation of post-translational modifications (PTMs) represents one of the main research focuses for the study of protein function and cell signaling. Mass spectrometry instrumentation with increasing sensitivity, improved protocols for PTM enrichment and recently established pipelines for high-throughput experiments allow large-scale identification and quantification of several PTM types. This review addresses the concurrently emerging challenges for the computational analysis of the resulting data and presents PTM-centered approaches for spectra identification, statistical analysis ...

  2. Statistics for products of traces of high powers of the frobenius class of hyperelliptic curves

    OpenAIRE

    Roditty-Gershon, Edva

    2011-01-01

    We study the averages of products of traces of high powers of the Frobenius class of hyperelliptic curves of genus g over a fixed finite field. We show that for increasing genus g, the limiting expectation of these products equals the expectation when the curve varies over the unitary symplectic group USp(2g). We also consider the scaling limit of linear statistics for eigenphases of the Frobenius class of hyperelliptic curves, and show that their first few moments are Gaussian.

  3. A High School Statistics Class Investigates the Death Penalty

    Science.gov (United States)

    Brelias, Anastasia

    2015-01-01

    Recommendations for reforming high school mathematics curricula emphasize the importance of engaging students in mathematical investigations of societal issues (CCSSI [Common Core State Standards Initiative] 2010; NCTM [National Council of Teachers of Mathematics] 2000). Proponents argue that these investigations can positively influence students'…

  4. Study of developing a database of energy statistics

    Energy Technology Data Exchange (ETDEWEB)

    Park, T.S. [Korea Energy Economics Institute, Euiwang (Korea, Republic of)]

    1997-08-01

    An integrated energy database should be prepared in advance for managing energy statistics comprehensively. However, since much manpower and budget are required for developing an integrated energy database, it is difficult to establish such a database within a short period of time. Therefore, this study aims, as first-stage work for the energy database, to derive methods for analyzing existing statistical data lists and consolidating insufficient data, and at the same time to analyze the general concepts and data structure of the database. I also studied the data content and items of energy databases in operation at international energy-related organizations such as the IEA and APEC and in Japan and the USA as overseas cases, as well as domestic conditions in energy databases and the hardware operating systems of Japanese databases. I analyzed the production system of Korean energy databases, discussed the KEDB system, which is representative of total energy databases, and present design concepts for new energy databases. In addition, I present directions for establishing future Korean energy databases and their contents, the data that should be collected as supply and demand statistics, and the establishment of a data collection organization, etc., by analyzing the Korean energy statistical data and comparing them with the system of the OECD/IEA. 26 refs., 15 figs., 11 tabs.

  5. Statistical mechanics of flux lines in high-temperature superconductors

    International Nuclear Information System (INIS)

    Dasgupta, C.

    1992-01-01

    The shortness of the low-temperature coherence lengths of high-T_c materials leads to new mechanisms for the pinning of flux lines. Lattice-periodic modulation of the order parameter itself acts to pin vortex lines in the regions of the unit cell where the order parameter is small. A presentation of flux creep and flux noise at low temperatures and magnetic fields in terms of the motion of simple metastable defects on flux lines is made, together with a calculation of flux-lattice melting. 12 refs.

  6. THE STATISTICS OF RADIO ASTRONOMICAL POLARIMETRY: BRIGHT SOURCES AND HIGH TIME RESOLUTION

    International Nuclear Information System (INIS)

    Van Straten, W.

    2009-01-01

    A four-dimensional statistical description of electromagnetic radiation is developed and applied to the analysis of radio pulsar polarization. The new formalism provides an elementary statistical explanation of the modal-broadening phenomenon in single-pulse observations. It is also used to argue that the degree of polarization of giant pulses has been poorly defined in past studies. Single- and giant-pulse polarimetry typically involves sources with large flux-densities and observations with high time-resolution, factors that necessitate consideration of source-intrinsic noise and small-number statistics. Self-noise is shown to fully explain the excess polarization dispersion previously noted in single-pulse observations of bright pulsars, obviating the need for additional randomly polarized radiation. Rather, these observations are more simply interpreted as an incoherent sum of covariant, orthogonal, partially polarized modes. Based on this premise, the four-dimensional covariance matrix of the Stokes parameters may be used to derive mode-separated pulse profiles without any assumptions about the intrinsic degrees of mode polarization. Finally, utilizing the small-number statistics of the Stokes parameters, it is established that the degree of polarization of an unresolved pulse is fundamentally undefined; therefore, previous claims of highly polarized giant pulses are unsubstantiated.

  7. Progressive statistics for studies in sports medicine and exercise science.

    Science.gov (United States)

    Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri

    2009-01-01

    Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.

  8. Lifetime statistics of quantum chaos studied by a multiscale analysis

    KAUST Repository

    Di Falco, A.

    2012-04-30

    In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.

  9. Statistical damage analysis of transverse cracking in high temperature composite laminates

    International Nuclear Information System (INIS)

    Sun Zuo; Daniel, I.M.; Luo, J.J.

    2003-01-01

    High temperature polymer composites are receiving special attention because of their potential applications to high speed transport airframe structures and aircraft engine components exposed to elevated temperatures. In this study, a statistical analysis was used to study progressive transverse cracking in a typical high temperature composite. The mechanical properties of this unidirectional laminate were first characterized at both room and high temperatures. Damage mechanisms of transverse cracking in cross-ply laminates were studied by X-ray radiography at room temperature and by an in-test photography technique at high temperature. Since the tensile strength of the unidirectional laminate along the transverse direction was found to follow a Weibull distribution, a Monte Carlo simulation technique based on experimentally obtained parameters was applied to predict transverse cracking at different temperatures. Experiments and simulation agree well both at room temperature and at 149 deg. C (the stress-free temperature) in terms of applied stress versus crack density. The probability density function (PDF) of transverse crack spacing considering the statistical strength distribution was also developed, and good agreement with simulation and experimental results is reached. Finally, a generalized master curve that predicts the normalized applied stress versus normalized crack density for various lay-ups and various temperatures was established.
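
    The Weibull-based Monte Carlo idea can be sketched compactly. The following hypothetical example draws transverse strengths from a Weibull distribution and counts cracked sites as the applied stress is swept; the Weibull parameters are illustrative, and stress redistribution between neighbouring cracks is ignored:

```python
import numpy as np

rng = np.random.default_rng(6)
n_elements = 200          # potential crack sites along the 90-degree ply
shape, scale = 8.0, 60.0  # assumed Weibull modulus, characteristic strength (MPa)

strengths = scale * rng.weibull(shape, size=n_elements)

# Sweep the applied transverse stress and count elements that have cracked.
for stress in (30, 45, 60, 75):
    cracked = np.mean(strengths <= stress)  # fraction of cracked sites
    print(f"applied stress {stress} MPa -> cracked fraction {cracked:.2f}")
```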

  10. Line identification studies using traditional techniques and wavelength coincidence statistics

    International Nuclear Information System (INIS)

    Cowley, C.R.; Adelman, S.J.

    1990-01-01

    Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results are to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.

  11. A statistical study on fracture toughness data of Japanese RPVS

    International Nuclear Information System (INIS)

    Sakai, Y.; Ogura, N.

    1987-01-01

    In a cooperative study investigating the fracture toughness of pressure vessel steels produced in Japan, a number of heats of ASTM A533B cl.1 and A508 cl.3 steels have been studied. Approximately 3000 fracture toughness data and 8000 mechanical properties data were obtained and filed in a computer data bank. Statistical characterization of toughness data in the transition region has been carried out using the computer data bank, and a curve-fitting technique for the toughness data has been examined. An approach using a function to model the transition behaviour of each toughness measure has been applied. The aims of the curve-fitting technique were as follows: (1) summarization of an enormous toughness data base to permit comparison of heats, materials and testing methods; (2) investigation of the relationships among static, dynamic and arrest toughness; (3) statistical examination of the ASME K(IR) curve. The methodology used in this study for analyzing a large quantity of fracture toughness data was found to be useful for formulating a statistically based K(IR) curve. (orig./HP)

  12. Statistical significance of epidemiological data. Seminar: Evaluation of epidemiological studies

    International Nuclear Information System (INIS)

    Weber, K.H.

    1993-01-01

    In stochastic damage, the numbers of events, e.g. the persons who are affected by or have died of cancer, and thus the relative frequencies (incidence or mortality), are binomially distributed random variables. Their statistical fluctuations can be characterized by confidence intervals. For epidemiologic questions, especially for the analysis of stochastic damage in the low dose range, the following issues are of interest: - Is a sample (a group of persons) with a definite observed damage frequency part of the whole population? - Is an observed frequency difference between two groups of persons random or statistically significant? - Is an observed increase or decrease of the frequencies with increasing dose random or statistically significant, and how large is the regression coefficient (= risk coefficient) in this case? These problems can be solved by statistical tests. So-called distribution-free tests and tests which are not bound to the assumption of a normal distribution are of particular interest, such as: the χ²-independence test (test in contingency tables); the Fisher-Yates test; the trend test according to Cochran; and the rank correlation test given by Spearman. These tests are explained in terms of selected epidemiologic data, e.g. of leukaemia clusters, of the cancer mortality of the Japanese A-bomb survivors, especially in the low dose range, as well as of the sample of cancer mortality in the high-background area in Yangjiang (China). (orig.)
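
    Three of the tests listed above are easy to demonstrate; the contingency table and dose-response values below are hypothetical:

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact, spearmanr

table = np.array([[12, 88],    # exposed: cases, non-cases
                  [ 5, 95]])   # unexposed: cases, non-cases

chi2, p_chi2, dof, _ = chi2_contingency(table)
print(f"chi-square independence test: chi2 = {chi2:.2f}, p = {p_chi2:.3f}")

# Fisher-Yates (exact) test, preferable when expected counts are small.
odds, p_fisher = fisher_exact(table)
print(f"Fisher exact test: OR = {odds:.2f}, p = {p_fisher:.3f}")

# Spearman rank correlation for a dose-response trend.
dose = [0, 1, 2, 3, 4]
incidence = [1.0, 1.1, 1.5, 1.4, 2.0]
rho, p_rho = spearmanr(dose, incidence)
print(f"Spearman rank correlation: rho = {rho:.2f}, p = {p_rho:.3f}")
```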

  13. Challenges and Approaches to Statistical Design and Inference in High Dimensional Investigations

    Science.gov (United States)

    Garrett, Karen A.; Allison, David B.

    2015-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other “omic” data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology, and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.

  14. Challenges and approaches to statistical design and inference in high-dimensional investigations.

    Science.gov (United States)

    Gadbury, Gary L; Garrett, Karen A; Allison, David B

    2009-01-01

    Advances in modern technologies have facilitated high-dimensional experiments (HDEs) that generate tremendous amounts of genomic, proteomic, and other "omic" data. HDEs involving whole-genome sequences and polymorphisms, expression levels of genes, protein abundance measurements, and combinations thereof have become a vanguard for new analytic approaches to the analysis of HDE data. Such situations demand creative approaches to the processes of statistical inference, estimation, prediction, classification, and study design. The novel and challenging biological questions asked from HDE data have resulted in many specialized analytic techniques being developed. This chapter discusses some of the unique statistical challenges facing investigators studying high-dimensional biology and describes some approaches being developed by statistical scientists. We have included some focus on the increasing interest in questions involving testing multiple propositions simultaneously, appropriate inferential indicators for the types of questions biologists are interested in, and the need for replication of results across independent studies, investigators, and settings. A key consideration inherent throughout is the challenge in providing methods that a statistician judges to be sound and a biologist finds informative.
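
    As a concrete illustration of the multiple-testing theme of the two records above (not a method taken from the chapter itself), a minimal Benjamini-Hochberg false-discovery-rate sketch:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of hypotheses rejected at FDR level q."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest rank k with p_(k) <= (k/m)*q; reject everything up to it.
    below = ranked <= (np.arange(1, m + 1) / m) * q
    reject = np.zeros(m, dtype=bool)
    if below.any():
        kmax = np.max(np.nonzero(below)[0])
        reject[order[:kmax + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.81]
print(benjamini_hochberg(pvals, q=0.05))
```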

  15. A Classification of Statistics Courses (A Framework for Studying Statistical Education)

    Science.gov (United States)

    Turner, J. C.

    1976-01-01

    A classification of statistics courses is presented, with main categories of "course type," "methods of presentation," "objectives," and "syllabus." Examples and suggestions for uses of the classification are given. (DT)

  16. Open Access!: Review of Online Statistics: An Interactive Multimedia Course of Study by David Lane

    Directory of Open Access Journals (Sweden)

    Samuel L. Tunstall

    2016-01-01

    David M. Lane (project leader). Online Statistics Education: An Interactive Multimedia Course of Study (http://onlinestatbook.com/). Also: David M. Lane (primary author and editor, with David Scott, Mikki Hebl, Rudy Guerra, Dan Osherson, and Heidi Zimmer). Introduction to Statistics. Online edition (http://onlinestatbook.com/Online_Statistics_Education.pdf), 694 pp. It is rare that students receive high-quality textbooks for free, but David Lane's Online Statistics: An Interactive Multimedia Course of Study permits precisely that. This review gives an overview of the many features in Lane's online textbook, including the Java applets, the textbook itself, and the resources available for instructors. A discussion of uses of the site, as well as a comparison of the text to alternative online statistics textbooks, is included.

  17. Epilepsy and occupational accidents in Brazil: a national statistics study.

    Science.gov (United States)

    Lunardi, Mariana dos Santos; Soliman, Lucas Alexandre Pedrollo; Pauli, Carla; Lin, Katia

    2011-01-01

    Epilepsy may restrict the patient's daily life. It causes lower quality of life and an increased risk of work-related accidents (WRA). The aim of this study is to analyze the implementation of the Epidemiologic and Technical Security System Nexus (ETSSN) and WRA patterns among patients with epilepsy. Data regarding WRA between 1999 and 2008 in the historical database of the WRA Infolog Statistical Yearbook from the Brazilian Ministry of Social Security were reviewed. There was a significant increase in reported cases during the ten-year period, mainly after the establishment of the ETSSN. The increase in granted benefits evidenced the epidemiologic association between epilepsy and WRA. The ETSSN possibly raised the registration of occupational accidents and granted benefits. However, the real number of WRA may remain underestimated due to the informal economy and household workers' accidents, which are usually not included in the official statistics in Brazil.

  18. Statistical study of density fluctuations in the tore supra tokamak

    International Nuclear Information System (INIS)

    Devynck, P.; Fenzi, C.; Garbet, X.; Laviron, C.

    1998-03-01

    It is believed that the radial anomalous transport in tokamaks is caused by plasma turbulence. Using an infrared laser scattering technique on the Tore Supra tokamak, statistical properties of the density fluctuations are studied as a function of the scales, in ohmic as well as additional heating regimes using lower hybrid or ion cyclotron frequency heating. The probability distributions are compared to a Gaussian in order to estimate the role of intermittency, which is found to be negligible. The temporal behaviour of the three-dimensional spectrum is thoroughly discussed; its multifractal character is reflected in the singularity spectrum. The autocorrelation coefficients of the different scales are also examined, showing their long-time incoherence and statistical independence. We also put forward the existence of fluctuation transfer between two distinct but close wavenumbers. A rather clearer image is thus obtained of the way energy is transferred through the turbulent scales. (author)

  19. Statistical study of TCV disruptivity and H-mode accessibility

    International Nuclear Information System (INIS)

    Martin, Y.; Deschenaux, C.; Lister, J.B.; Pochelon, A.

    1997-01-01

    Optimising tokamak operation consists of finding a path, in a multidimensional parameter space, which leads to the desired plasma characteristics and avoids hazardous regions. Typically the desirable regions are the domain where an L-mode to H-mode transition can occur and then, in the H-mode, where ELMs and the required high density can be maintained. The regions to avoid are those with a high rate of disruptivity. On TCV, learning the safe and successful paths is achieved empirically. This will no longer be possible in a machine like ITER, since only a small percentage of disrupted discharges will be tolerable. A priori knowledge of the hazardous regions in ITER is therefore mandatory. This paper presents the results of a statistical analysis of the occurrence of disruptions in TCV. (author) 4 figs

  20. A study on the advanced statistical core thermal design methodology

    International Nuclear Information System (INIS)

    Lee, Seung Hyuk

    1992-02-01

    A statistical core thermal design methodology for generating the limit DNBR and the nominal DNBR is proposed and used in assessing the best-estimate thermal margin in a reactor core. Firstly, the Latin Hypercube Sampling Method, instead of the conventional Experimental Design Technique, is utilized as an input sampling method for a regression analysis, to evaluate its sampling efficiency. Secondly, and as the main topic, the Modified Latin Hypercube Sampling and Hypothesis Test Statistics method is proposed as a substitute for the current statistical core thermal design method. This new methodology adopts a 'Modified Latin Hypercube Sampling Method', which uses the mean values of each interval of the input variables instead of random values, to avoid the extreme cases that arise in the tail areas of some parameters. Next, the independence of the input variables is verified through a 'Correlation Coefficient Test' for the statistical treatment of their uncertainties. The distribution type of the DNBR response is then determined through a 'Goodness of Fit Test'. Finally, the limit DNBR with one-sided 95% probability and 95% confidence level, DNBR_95/95, is estimated. The advantage of this methodology over the conventional statistical method using Response Surface and Monte Carlo simulation techniques lies in the simplicity of the analysis procedure, while maintaining the same level of confidence in the limit DNBR result. This methodology is applied to two cases of DNBR margin calculation. The first case is the application to the determination of the limit DNBR, where the DNBR margin is determined by the difference between the nominal DNBR and the limit DNBR. The second case is the application to the determination of the nominal DNBR, where the DNBR margin is determined by the difference between the lower limit value of the nominal DNBR and the CHF correlation limit being used. From this study, it is deduced that the proposed methodology gives good agreement in the DNBR results.
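
    A minimal sketch of the modified Latin Hypercube idea described above: stratify each input into equal-probability intervals, use the (non-random) stratum centres instead of random draws, and permute the strata independently per variable. The input distributions and parameter values below are assumptions for illustration:

```python
import numpy as np
from scipy.stats import norm

def modified_lhs(n_samples, n_vars, rng):
    """Midpoint (non-random) Latin Hypercube sample on the unit cube."""
    # Centres of n equal-probability strata; taking the centre rather than a
    # random point within each stratum avoids draws from the extreme tails,
    # approximating the "mean value of each interval" idea.
    mid = (np.arange(n_samples) + 0.5) / n_samples
    u = np.empty((n_samples, n_vars))
    for j in range(n_vars):
        u[:, j] = rng.permutation(mid)   # independent stratum ordering per variable
    return u

rng = np.random.default_rng(1)
u = modified_lhs(200, 3, rng)
# Map to physical inputs through inverse CDFs, e.g. normally distributed uncertainties.
inputs = norm.ppf(u, loc=[1.0, 0.95, 300.0], scale=[0.02, 0.03, 5.0])
print(inputs.mean(axis=0))
```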

  1. Application of mathematical statistics methods to study fluorite deposits

    International Nuclear Information System (INIS)

    Chermeninov, V.B.

    1980-01-01

    The applicability of mathematical-statistical methods for increasing the reliability of sampling and for geological tasks (the study of regularities of ore formation) is considered. The reliability of core sampling (regarding the selective abrasion of fluorite) is compared with that of neutron activation logging for fluorine. The core sampling data are characterized by higher dispersion than the neutron activation logging results (mean variation coefficients of 75% and 56%, respectively). However, the hypothesis of equality of the two sampling averages is confirmed; this testifies to the absence of considerable variability in the ore bodies.

  2. Statistical Study of False Alarms of Geomagnetic Storms

    DEFF Research Database (Denmark)

    Leer, Kristoffer; Vennerstrøm, Susanne; Veronig, A.

    Coronal Mass Ejections (CMEs) are known to cause geomagnetic storms on Earth. However, not all CMEs will trigger geomagnetic storms, even if they are heading towards the Earth. In this study, front side halo CMEs with speed larger than 500 km/s have been identified from the SOHO LASCO catalogue. A subset of these halo CMEs did not cause a geomagnetic storm in the following four days and have therefore been considered as false alarms. The properties of these events are investigated and discussed here. Their statistics are compared to the geo-effective CMEs. The ability to identify potential false alarms…

  3. Large-eddy simulation in a mixing tee junction: High-order turbulent statistics analysis

    International Nuclear Information System (INIS)

    Howard, Richard J.A.; Serre, Eric

    2015-01-01

    Highlights: • Mixing and thermal fluctuations in a junction are studied using large eddy simulation. • Adiabatic and conducting steel wall boundaries are tested. • Wall thermal fluctuations are not the same between the flow and the solid. • Solid thermal fluctuations cannot be predicted from the fluid thermal fluctuations. • High-order turbulent statistics show that the turbulent transport term is important.

    Abstract: This study analyses the mixing and thermal fluctuations induced in a mixing tee junction with circular cross-sections when cold water flowing in a pipe is joined by hot water from a branch pipe. This configuration is representative of industrial piping systems in which temperature fluctuations in the fluid may cause thermal fatigue damage on the walls. Implicit large-eddy simulations (LES) are performed for equal inflow rates corresponding to a bulk Reynolds number Re = 39,080. Two different thermal boundary conditions are studied for the pipe walls: an insulating adiabatic boundary and a conducting steel wall boundary. The predicted flow structures show a satisfactory agreement with the literature. The velocity and thermal fields (including high-order statistics) are not affected by the heat transfer with the steel walls. However, predicted thermal fluctuations at the boundary are not the same between the flow and the solid, showing that solid thermal fluctuations cannot be predicted from knowledge of the fluid thermal fluctuations alone. The analysis of high-order turbulent statistics provides a better understanding of the turbulence features. In particular, the budgets of the turbulent kinetic energy and temperature variance allow a comparative analysis of the dissipation, production and transport terms. It is found that the turbulent transport term is an important term that acts to balance the production. We therefore use a priori tests to evaluate three different models for the triple correlation.

  4. Statistical ensembles and molecular dynamics studies of anisotropic solids. II

    International Nuclear Information System (INIS)

    Ray, J.R.; Rahman, A.

    1985-01-01

    We have recently discussed how the Parrinello-Rahman theory can be brought into accord with the theory of the elastic and thermodynamic behavior of anisotropic media. This involves the isoenthalpic-isotension ensemble of statistical mechanics. Nose has developed a canonical ensemble form of molecular dynamics. We combine Nose's ideas with the Parrinello-Rahman theory to obtain a canonical form of molecular dynamics appropriate to the study of anisotropic media subjected to arbitrary external stress. We employ this isothermal-isotension ensemble in a study of an fcc → close-packed structural phase transformation in a Lennard-Jones solid subjected to uniaxial compression. Our interpretation of the Nose theory does not involve a scaling of the time variable. This latter fact leads to simplifications when studying the time dependence of quantities.

  5. Statistical Indicators for Religious Studies: Indicators of Level and Structure

    Science.gov (United States)

    Herteliu, Claudiu; Isaic-Maniu, Alexandru

    2009-01-01

    Using statistical indicators as vectors of information on the operational status of a phenomenon, including a religious one, is unanimously accepted. By introducing a system of statistical indicators we can also analyze the interfacing areas of a phenomenon. In this context, we have elaborated a system of statistical indicators specific to the…

  6. Statistics as a Foreign Language--Part 2: More Things to Consider in Reading Statistical Language Studies.

    Science.gov (United States)

    Brown, James Dean

    1992-01-01

    Five new strategies are proposed to help language teachers understand statistical studies. Each strategy is discussed with appropriate tables, figures, and examples drawn from recent articles of the "TESOL Quarterly." (18 references) (Author/LB)

  7. Statistical and direct decay of high-lying single-particle excitations

    International Nuclear Information System (INIS)

    Gales, S.

    1993-01-01

    Transfer reactions induced by hadronic probes at intermediate energies have revealed a rich spectrum of high-lying excitations embedded in the nuclear continuum. The investigation of their decay properties is believed to be a severe test of their microscopic structure as predicted by microscopic nuclear models. In addition the degree of damping of these simple modes in the nuclear continuum can be obtained by means of the measured particle (n,p) decay branching ratios. The neutron and proton decay studies of high-lying single-particle states in heavy nuclei are presented. (author). 13 refs., 9 figs

  8. Statistical studies of energetic electrons in the outer radiation belt

    Energy Technology Data Exchange (ETDEWEB)

    Johnstone, A.D.; Rodgers, D.J.; Jones, G.H. E-mail: g.h.jones@ic.ac.uk

    1999-10-01

    The medium electron A (MEA) instrument aboard the CRRES spacecraft provided data on terrestrial radiation belt electrons in the energy range from 153 to 1582 keV during 1990-91. These data have previously been used to produce an empirical model of the radiation belts from L=1.1 to 8.9, ordered according to 17 energy bands, 18 pitch angle bins, and 5 Kp ranges. Empirical models such as this are very valuable, but are prone to statistical fluctuations and gaps in coverage. In this study, in order to smooth the data and make it easier to interpolate within data gaps, the pitch angle distribution at each energy in the model was fitted with a Bessel function. This provided a way to characterize the pitch angle distribution in terms of only two parameters for each energy. It was not possible to model fluxes reliably within the loss cone because of poor statistics. The fitted distributions give an indication of the way in which pitch angle diffusion varies in the outer radiation belts. The two parameters of the Bessel function were found to vary systematically with L value, energy and Kp. Through the fitting of a simple function to these systematic variations, the number of parameters required to describe the model could be reduced drastically.
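
    The record does not state the exact two-parameter Bessel form used. Purely for illustration, a fit of a pitch-angle distribution to an assumed model j(α) = A·J1(k·sin α) with scipy (the model form, data and parameter values are assumptions, not taken from the paper):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import j1

def pad_model(alpha, A, k):
    """Assumed two-parameter Bessel-function pitch-angle model."""
    return A * j1(k * np.sin(alpha))

rng = np.random.default_rng(2)
alpha = np.deg2rad(np.linspace(5.0, 90.0, 18))     # 18 pitch-angle bins, in radians
flux = pad_model(alpha, 1.0e4, 2.4) * (1 + 0.05 * rng.normal(size=alpha.size))

(A, k), _ = curve_fit(pad_model, alpha, flux, p0=(1e4, 2.0))
print(f"A = {A:.3g}, k = {k:.3g}")
```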

  9. Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach

    Energy Technology Data Exchange (ETDEWEB)

    Duran-Lobato, Matilde, E-mail: mduran@us.es [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain); Enguix-Gonzalez, Alicia [Universidad de Sevilla, Dpto. Estadistica e Investigacion Operativa, Facultad de Matematicas (Spain); Fernandez-Arevalo, Mercedes; Martin-Banderas, Lucia [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain)

    2013-02-15

    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among the LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of the high-pressure homogenization technique (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNP by HPH. Spherical LNPs with mean size ranging from 65 nm to 11.623 μm, negative zeta potential under -30 mV, and smooth surface were produced. Manageable equations based on commonly used parameters in the pharmaceutical field were obtained. The lipid to emulsifier ratio (R_L/S) was proved to statistically explain the influence of oil phase and surfactant concentration on final nanoparticle size. Besides, the homogenization pressure was found to ultimately determine LNP size for a given R_L/S, while the number of passes applied mainly determined polydispersion. α-Tocopherol was used as a model drug to illustrate the release properties of LNP as a function of particle size, which was optimized by the regression models. This study is intended as a first step to optimize production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.

  10. High-throughput optimization by statistical designs: example with rat liver slices cryopreservation.

    Science.gov (United States)

    Martin, H; Bournique, B; Blanchi, B; Lerche-Langrand, C

    2003-08-01

    The purpose of this study was to optimize cryopreservation conditions of rat liver slices in a high-throughput format, with a focus on reproducibility. A statistical design of 32 experiments was performed, and intracellular lactate dehydrogenase (LDHi) activity and antipyrine (AP) metabolism were evaluated as biomarkers. At freezing, modified University of Wisconsin solution was better than Williams' E medium, and pure dimethyl sulfoxide was better than a cryoprotectant mixture. The best cryoprotectant concentrations were 10% for LDHi and 20% for AP metabolism. Fetal calf serum could be used at 50 or 80%, and incubation of slices with the cryoprotectant could last 10 or 20 min. At thawing, 42 °C was better than 22 °C. After thawing, 1 h of preculture was better than 3 h. Cryopreservation increased the interslice variability of the biomarkers. After cryopreservation, LDHi and AP metabolism levels were up to 84 and 80% of fresh values. However, these high levels were not reproducibly achieved. Two factors involved in the day-to-day variability of LDHi were identified: the incubation time with the cryoprotectant and the preculture time. In conclusion, the statistical design was very efficient for quickly determining optimized conditions by simultaneously measuring the role of numerous factors. The cryopreservation procedure developed appears suitable for qualitative metabolic profiling studies.

  11. Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach

    International Nuclear Information System (INIS)

    Durán-Lobato, Matilde; Enguix-González, Alicia; Fernández-Arévalo, Mercedes; Martín-Banderas, Lucía

    2013-01-01

    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among the LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of the high-pressure homogenization technique (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNP by HPH. Spherical LNPs with mean size ranging from 65 nm to 11.623 μm, negative zeta potential under –30 mV, and smooth surface were produced. Manageable equations based on commonly used parameters in the pharmaceutical field were obtained. The lipid to emulsifier ratio (R_L/S) was proved to statistically explain the influence of oil phase and surfactant concentration on final nanoparticle size. Besides, the homogenization pressure was found to ultimately determine LNP size for a given R_L/S, while the number of passes applied mainly determined polydispersion. α-Tocopherol was used as a model drug to illustrate the release properties of LNP as a function of particle size, which was optimized by the regression models. This study is intended as a first step to optimize production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.

  12. Statistic rCBF study of extrapyramidal disorders

    Energy Technology Data Exchange (ETDEWEB)

    Kamei, Hiroshi; Nakajima, Takashi; Fukuhara, Nobuyoshi [National Saigata Hospital, Ogata, Niigata (Japan)

    2002-08-01

    We studied regional cerebral blood flow (rCBF) in 16 patients with Parkinson's disease (PD), 2 patients with dementia with Lewy bodies (DLB), 2 patients with progressive supranuclear palsy (PSP), 2 patients with striatonigral degeneration (SND), and 16 normal volunteers, using three-dimensional stereotactic surface projections (3D-SSP). Decreased rCBF in PD patients was shown in the posterior parietal and occipital cortex. Decreased rCBF in DLB was shown in the frontal, parietal and occipital cortex, with relative sparing of the sensorimotor cortex. Decreased rCBF in PSP was shown in the frontal cortex. Decreased rCBF in SND was shown in the frontal cortex and cerebellum. Statistical rCBF analysis using 3D-SSP was a useful measure for the early differential diagnosis of extrapyramidal disorders. (author)

  13. Statistical study of seismicity associated with geothermal reservoirs in California

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, D.M.; Cavit, D.S.

    1982-01-01

    Statistical methods are outlined to separate the spatial, temporal, and magnitude-dependent portions of both the random and non-random components of the seismicity. The methodology employed compares the seismicity distributions with a generalized Poisson distribution. Temporally related events are identified by the distribution of the interoccurrence times. The regions studied to date include the Imperial Valley, Coso, The Geysers, Lassen, and the San Jacinto fault. The spatial characteristics of the random and clustered components of the seismicity are diffuse and appear unsuitable for defining the areal extent of the reservoir. However, from the temporal characteristics of the seismicity associated with these regions, a general discriminant was constructed that combines several physical parameters for identifying the presence of a geothermal system.
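
    The temporal discriminant rests on comparing interoccurrence times with a Poisson expectation. A minimal sketch testing interevent times against an exponential distribution (synthetic data; a simpler stand-in for the authors' generalized-Poisson formulation):

```python
import numpy as np
from scipy.stats import kstest, expon

rng = np.random.default_rng(3)
# Synthetic event times: a Poisson background plus a tight aftershock-like cluster.
background = np.cumsum(rng.exponential(scale=10.0, size=200))
cluster = background[50] + np.sort(rng.uniform(0.0, 2.0, size=30))
times = np.sort(np.concatenate([background, cluster]))

dt = np.diff(times)
# Under a homogeneous Poisson process, dt is exponential with mean dt.mean().
stat, p = kstest(dt, expon(scale=dt.mean()).cdf)
print(f"KS stat = {stat:.3f}, p = {p:.3g}  (small p => clustered, non-Poisson component)")
```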

  14. Statistical study of chemical additives effects in the waste cementation

    International Nuclear Information System (INIS)

    Tello, Cledola C.O. de; Diniz, Paula S.; Haucz, Maria J.A.

    1997-01-01

    This paper presents the statistical study carried out to analyse the effect of chemical additives in the waste cementation process. Three different additives from two manufacturers were tested: a set accelerator, a set retarder and superplasticizers, in cemented pastes with and without bentonite. The experiments were planned in accordance with a 2³ factorial design, so that the effect of each type of additive, its quantity and its manufacturer on the cemented pastes and specimens could be evaluated. The results showed that the use of these additives can improve the cementation process and the product. The admixture quantity and the association with bentonite were the most important factors affecting the process and product characteristics. (author). 4 refs., 9 figs., 4 tabs
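
    A minimal sketch of a 2³ factorial analysis of the kind described: three two-level factors coded ±1 and main effects estimated as differences of means. The factor names and responses below are invented for illustration:

```python
import itertools
import numpy as np

# Full 2^3 design matrix, factors coded -1/+1.
design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
factors = ["additive type", "quantity", "manufacturer"]

# Hypothetical response, e.g. paste setting time in minutes, one run per treatment.
y = np.array([118.0, 95.0, 121.0, 99.0, 140.0, 112.0, 143.0, 116.0])

for j, name in enumerate(factors):
    # Main effect: mean response at the high level minus mean at the low level.
    effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
    print(f"main effect of {name}: {effect:+.1f} min")
```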

  15. Statistical study of ion pitch-angle distributions

    International Nuclear Information System (INIS)

    Sibeck, D.G.; Mcentire, R.W.; Lui, A.T.Y.; Krimigis, S.M.

    1987-01-01

    Preliminary results of a statistical study of energetic (34-50 keV) ion pitch-angle distributions (PADs) within 9 Re of earth provide evidence for an orderly pattern consistent with both drift-shell splitting and magnetopause shadowing. Normal ion PADs dominate the dayside and inner magnetosphere. Butterfly PADs typically occur in a narrow belt stretching from dusk to dawn through midnight, where they approach within 6 Re of earth. While those ion butterfly PADs that typically occur on closed drift paths are mainly caused by drift-shell splitting, there is also evidence for magnetopause shadowing in observations of more frequent butterfly PAD occurrence in the outer magnetosphere near dawn than dusk. Isotropic and gradient boundary PADs terminate the tailward extent of the butterfly ion PAD belt. 9 references

  16. A statistical perspective on association studies of psychiatric disorders

    DEFF Research Database (Denmark)

    Foldager, Leslie

    2014-01-01

    Gene-gene (GxG) and gene-environment (GxE) interactions likely play an important role in the aetiology of complex diseases like psychiatric disorders. Thus, we aim at investigating methodological aspects of, and applying methods from, statistical genetics taking interactions into account. In addition we… genes and maternal infection by virus. Paper 3 presents the initial steps (mainly data construction) of an ongoing simulation study aiming at guiding decisions by comparing methods for GxE interaction analysis, including both traditional two-step logistic regression and exhaustive searches using efficient… these markers. However, the validity of the identified haplotypes is also checked by inferring phased haplotypes from genotypes. Haplotype analysis is also used in paper 5, which is otherwise an example of a focused approach to narrow down a previously found signal to search for more precise positions of disease…

  17. Interplanetary sources of magnetic storms: A statistical study

    DEFF Research Database (Denmark)

    Vennerstrøm, Susanne

    2001-01-01

    Magnetic storms are mainly caused by the occurrence of intense southward magnetic fields in the interplanetary medium. These fields can be formed directly either by ejection of magnetic structures from the Sun or by stream interaction processes during solar wind propagation. In the present study we examine 30 years of satellite measurement of the solar wind during magnetic storms, with the aim of estimating the relative importance of these two processes. We use the solar wind proton temperature relative to the temperature expected from the empirical relation to the solar wind speed, T_p/T_exp, together with the speed gradient and the interplanetary magnetic field azimuth in the ecliptic, in order to distinguish between the two processes statistically. We find that compression due to stream interaction is at least as important as the direct effect of ejection of intense fields, and probably more so.

  18. Interplanetary sources to magnetic storms - A statistical study

    DEFF Research Database (Denmark)

    Vennerstrøm, Susanne

    2001-01-01

    Magnetic storms are mainly caused by the occurrence of intense southward magnetic fields in the interplanetary medium. These fields can be formed directly either by ejection of magnetic structures from the Sun or by stream interaction processes during solar wind propagation. In the present study we examine 30 years of satellite measurement of the solar wind during magnetic storms, with the aim of estimating the relative importance of these two processes. We use the solar wind proton temperature relative to the temperature expected from the empirical relation to the solar wind speed, Tp/Texp, together with the speed gradient and the interplanetary magnetic field azimuth in the ecliptic, in order to distinguish between the two processes statistically. We find that compression due to stream interaction is at least as important as the direct effect of ejection of intense fields, and probably more so. Only…

  19. IGESS: a statistical approach to integrating individual-level genotype data and summary statistics in genome-wide association studies.

    Science.gov (United States)

    Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben

    2017-09-15

    Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure statistical power of identifying these variants with small effects. However, it is often the case that a research group can only get approval for the access to individual-level genotype data with a limited sample size (e.g. a few hundreds or thousands). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available. The sample sizes associated with the summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increasing the statistical power of identifying risk variants and improving the accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over methods which take either individual-level data or summary statistics data as input. We applied IGESS to perform an integrative analysis of Crohn's disease from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240,000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS. Contact: zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online.
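
    IGESS itself uses variational inference; as a much simpler illustration of the underlying idea of pooling an individual-level estimate with published summary statistics, a fixed-effect inverse-variance combination for a single variant (all numbers invented; this is not the IGESS algorithm):

```python
import numpy as np

# Effect estimate (log odds ratio) and standard error from a small individual-level sample...
beta_ind, se_ind = 0.21, 0.10
# ...and from a large public summary-statistics dataset for the same variant.
beta_sum, se_sum = 0.15, 0.03

w = np.array([1 / se_ind**2, 1 / se_sum**2])   # inverse-variance weights
beta = np.array([beta_ind, beta_sum])
beta_pooled = np.sum(w * beta) / np.sum(w)
se_pooled = np.sqrt(1 / np.sum(w))
z = beta_pooled / se_pooled
print(f"pooled beta = {beta_pooled:.3f} +/- {se_pooled:.3f}, z = {z:.1f}")
```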

  20. A Pilot Study Teaching Metrology in an Introductory Statistics Course

    Science.gov (United States)

    Casleton, Emily; Beyler, Amy; Genschel, Ulrike; Wilson, Alyson

    2014-01-01

    Undergraduate students who have just completed an introductory statistics course often lack deep understanding of variability and enthusiasm for the field of statistics. This paper argues that by introducing the commonly underemphasized concept of measurement error, students will have a better chance of attaining both. We further present lecture…

  1. Automating Exams for a Statistics Course: II. A Case Study.

    Science.gov (United States)

    Michener, R. Dean; And Others

    A specific application of the process of automating exams for any introductory statistics course is described. The process of automating exams was accomplished by using the Statistical Test Item Collection System (STICS). This system was first used to select a set of questions based on course requirements established in advance; afterward, STICS…

  2. The Reliability of Single Subject Statistics for Biofeedback Studies.

    Science.gov (United States)

    Bremner, Frederick J.; And Others

    To test the usefulness of single subject statistical designs for biofeedback, three experiments were conducted comparing biofeedback to meditation, and to a compound stimulus recognition task. In a statistical sense, this experimental design is best described as one experiment with two replications. The apparatus for each of the three experiments…

  3. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, the statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and non-ideal lattice models, imperfect gas theory, the theory of liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of macromolecular systems, and quantum statistics.

  4. Transfer of drug dissolution testing by statistical approaches: Case study

    Science.gov (United States)

    AL-Kamarany, Mohammed Amood; EL Karbane, Miloud; Ridouan, Khadija; Alanazi, Fars K.; Hubert, Philippe; Cherrah, Yahia; Bouklouze, Abdelaziz

    2011-01-01

    The analytical transfer is a complete process that consists in transferring an analytical procedure from a sending laboratory to a receiving laboratory, after having experimentally demonstrated that the receiving laboratory also masters the procedure, in order to avoid problems in the future. Method transfer is now commonplace during the life cycle of an analytical method in the pharmaceutical industry. No official guideline exists for a transfer methodology in pharmaceutical analysis, and the regulatory wording on transfer is more ambiguous than that for validation. Therefore, in this study, gauge repeatability and reproducibility (R&R) studies, together with other appropriate multivariate statistics, were successfully applied to the transfer of the dissolution test of diclofenac sodium, as a case study, from a sending laboratory A (an accredited laboratory) to a receiving laboratory B. The HPLC method for the determination of the percent release of diclofenac sodium in solid pharmaceutical forms (one the originator product and the other a generic) was validated using the accuracy profile (total error) approach in the sending laboratory A. The results showed that the receiving laboratory B masters the dissolution test process, using the same HPLC analytical procedure developed in laboratory A. In conclusion, if the sender used the total error approach to validate its analytical method, the dissolution test can be successfully transferred without the receiving laboratory B having to repeat the full analytical method validation, and the state of the pharmaceutical analysis method should be maintained to ensure the same reliable results in the receiving laboratory. PMID:24109204

  5. Learning Statistics at the Farmers Market? A Comparison of Academic Service Learning and Case Studies in an Introductory Statistics Course

    Science.gov (United States)

    Hiedemann, Bridget; Jones, Stacey M.

    2010-01-01

    We compare the effectiveness of academic service learning to that of case studies in an undergraduate introductory business statistics course. Students in six sections of the course were assigned either an academic service learning project (ASL) or business case studies (CS). We examine two learning outcomes: students' performance on the final…

  6. Significance levels for studies with correlated test statistics.

    Science.gov (United States)

    Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S

    2008-07-01

    When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
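
    A minimal sketch of the permutation approach the abstract starts from: simulate the null distribution of the largest test statistic by relabelling sample units (two groups, many correlated outcomes; synthetic data, not the conditional method the authors propose):

```python
import numpy as np

rng = np.random.default_rng(4)
n_per_group, n_tests = 20, 500
# Correlated outcomes: a shared latent factor induces correlation among test statistics.
latent = rng.normal(size=(2 * n_per_group, 1))
data = 0.6 * latent + rng.normal(size=(2 * n_per_group, n_tests))
labels = np.array([0] * n_per_group + [1] * n_per_group)

def max_abs_t(x, g):
    """Largest |two-sample t| over all outcomes for a given group labelling."""
    d = x[g == 1].mean(axis=0) - x[g == 0].mean(axis=0)
    s = np.sqrt(x[g == 1].var(axis=0, ddof=1) / n_per_group
                + x[g == 0].var(axis=0, ddof=1) / n_per_group)
    return np.max(np.abs(d / s))

observed = max_abs_t(data, labels)
null = np.array([max_abs_t(data, rng.permutation(labels)) for _ in range(999)])
p_global = (1 + np.sum(null >= observed)) / (1 + null.size)
print(f"observed max |t| = {observed:.2f}, permutation p = {p_global:.3f}")
```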

  7. Statistical translation with scarce resources: a South African case study

    CSIR Research Space (South Africa)

    Ronald, K

    2006-11-01

    Statistical machine translation techniques offer great promise for the development of automatic translation systems. However, the realization of this potential requires the availability of significant amounts of parallel bilingual texts. This paper...

  8. High statistics study (approx. 10⁶ events) of J/psi production and T production in the energy range 150 to 280 GeV by π±, p± incident particles

    International Nuclear Information System (INIS)

    Badier, J.; Boucrot, J.; Bourotte, J.; Burgun, G.; Callot, O.; Charpentier, P.; Crozon, M.; Decamp, D.; Delpierre, P.; Diop, A.; Dube, R.; Espigat, P.; Gandois, B.; Hagelberg, R.; Hansroul, M.; Karyotakis, J.; Kienzle, W.; Lafontaine, A.; Le Du, P.; Lefrancois, J.; Leray, T.; Maillard, J.; Matthiae, G.; Michelini, A.; Mine, P.; Nguyen Ngoc, H.; Rahal, G.; Runolfsson, O.; Siegrist, P.; Tilquin, A.; Timmermans, J.; Valentin, J.; Vanderhaghen, R.; Weisz, S.

    1981-01-01

    In the NA3 experiment we have studied high-mass dimuon production by an unseparated hadron beam on hydrogen and platinum targets. The comparison of the production cross-sections for protons and antiprotons, together with the differential cross-section dσ/dx, allows us to compare the data with a production mechanism involving quark-antiquark and gluon-gluon interactions. The cos θ* distribution of the same J/psi data has also been analysed and results will be presented. Finally, we have observed T production from 150 GeV/c incident pions

  9. The Study of Second Higher Education through Mathematical Statistics

    Directory of Open Access Journals (Sweden)

    Olga V. Kremer

    2014-05-01

    The article deals with the statistical reasons, age and wages of people who get a second higher education. People opt for a second higher education mostly due to economic and psychological factors. According to our research, age is a key motivator for the second higher education. Based on statistical data, the portrait of a second higher education student was drawn.

  10. Statistical Study of the Magnetic Field Orientation in Solar Filaments

    Science.gov (United States)

    Hanaoka, Yoichiro; Sakurai, Takashi

    2017-12-01

    We have carried out a statistical study of the average orientation of the magnetic field in solar filaments with respect to their axes for more than 400 samples, based on data taken with daily full-Sun, full-Stokes spectropolarimetric observations using the He I 1083.0 nm line. The major part of the samples are filaments in quiet areas, but filaments in active areas are included as well. The average orientation of the magnetic field in filaments shows a systematic property depending on the hemisphere: the direction of the magnetic field in filaments in the northern (southern) hemisphere mostly deviates clockwise (counterclockwise) from their axes, which run along the magnetic polarity inversion line. The deviation angles of the magnetic field from the axes are concentrated between 10° and 30°. This hemispheric pattern is consistent with that revealed for the chirality of filament barbs, filament channels, and other solar features found to possess chirality. For some filaments, it was confirmed that their magnetic field direction is locally parallel to their structure seen in Hα images. Our results confirm this hemispheric pattern for the first time with direct observation of the magnetic field in filaments. Interestingly, the filaments that show the opposite magnetic field deviation to the hemispheric pattern are in many cases found above polarity inversion lines whose ambient photospheric magnetic field has a polarity alignment opposite to that of active regions following the Hale–Nicholson law.

  11. Statistical physics of medical diagnostics: Study of a probabilistic model.

    Science.gov (United States)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  12. Statistical physics of medical diagnostics: Study of a probabilistic model

    Science.gov (United States)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  13. A Statistical Study of Interplanetary Type II Bursts: STEREO Observations

    Science.gov (United States)

    Krupar, V.; Eastwood, J. P.; Magdalenic, J.; Gopalswamy, N.; Kruparova, O.; Szabo, A.

    2017-12-01

    Coronal mass ejections (CMEs) are the primary cause of the most severe and disruptive space weather events such as solar energetic particle (SEP) events and geomagnetic storms at Earth. Interplanetary type II bursts are generated via the plasma emission mechanism by energetic electrons accelerated at CME-driven shock waves and hence identify CMEs that potentially cause space weather impact. As CMEs propagate outward from the Sun, radio emissions are generated at progressively lower frequencies, corresponding to the decreasing ambient solar wind plasma density. We have performed a statistical study of 153 interplanetary type II bursts observed by the two STEREO spacecraft between March 2008 and August 2014. These events have been correlated with manually identified CMEs contained in the Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) catalogue. Our results confirm that faster CMEs are more likely to produce interplanetary type II radio bursts. We have compared observed frequency drifts with white-light observations to estimate angular deviations of type II burst propagation directions from the radial direction. We have found that interplanetary type II bursts preferentially arise from CME flanks. Finally, we discuss the visibility of radio emissions in relation to the CME propagation direction.

  14. Electron Dropout Echoes Induced by Interplanetary Shock: A Statistical Study

    Science.gov (United States)

    Liu, Z.; Zong, Q.; Hao, Y.; Zhou, X.; Ma, X.; Liu, Y.

    2017-12-01

    "Electron dropout echo" as indicated by repeated moderate dropout and recovery signatures of the flux of energetic electron in the out radiation belt region has been investigated systematically. The electron dropout and its echoes are usually found for higher energy (> 300 keV) channels fluxes, whereas the flux enhancements are obvious for lower energy electrons simultaneously after the interplanetary shock arrives at the Earth's geosynchronous orbit. 104 dropout echo events have been found from 215 interplanetary shock events from 1998 to 2007 based on LANL satellite data. In analogy to substorm injections, these 104 events could be naturally divided into two categories: dispersionless (49 events) or dispersive (55 events) according to the energy dispersion of the initial dropout. It is found that locations of dispersionless events are distributed mainly in the duskside magnetosphere. Further, the obtained locations derived from dispersive events with the time-of-flight technique of the initial dropout regions are mainly located at the duskside as well. Statistical studies have shown that the effect of shock normal, interplanetary magnetic field Bz and solar wind dynamic pressure may be insignificant to these electron dropout events. We suggest that the electric field impulse induced by the IP shock produces a more pronounced inward migration of electrons at the dusk side, resulting in the observed dusk-side moderate dropout of electron flux and its consequent echoes.

  15. Statistical approach to study of lithium magnesium metaborate glasses

    Directory of Open Access Journals (Sweden)

    Nedyalkova Miroslava

    2017-03-01

    Alkali borate glasses and alkaline earth borate glasses are commonly used materials in the field of optoelectronics. Infrared (FTIR) and Raman spectroscopy are valuable tools for the structural investigation of borate glass networks. The compositional and structural variety of lithium magnesium metaborate glasses is usually determined by traditional instrumental methods. In this study a data set is classified by structural and physicochemical parameters (FTIR and Raman spectra, and the glass transition temperature, Tg). Characterisation of magnesium-containing metaborate glasses by multivariate statistics (hierarchical cluster analysis) is conducted to reveal potential relationships (similarity or dissimilarity) between the types of glasses included in the data set, using specific structural features available in the literature. The clustering of the glass objects indicates a good separation of different magnesium-containing borate glass compositions. The grouping of variables concerning Tg and structural data for BO3 and BO4 linkages confirms that BO4/BO3 ratios strongly affect Tg. Additionally, patterns of similarity could be detected not only between the glass compositions but also between the features (variables) describing the glasses. The proposed approach can be further used as an expert tool for glass property prediction or fingerprinting (identification of unknown compositions).
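
    A minimal hierarchical-cluster-analysis sketch in the spirit of the study, with each glass described by a few standardized features (the Tg and BO4/BO3 values below are invented, not the paper's data):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical feature matrix: rows = glass compositions, cols = (Tg in K, BO4/BO3 ratio).
X = np.array([[720, 0.45], [715, 0.43], [760, 0.62],
              [765, 0.60], [690, 0.30], [695, 0.33]], dtype=float)
Xz = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize features

Z = linkage(Xz, method="ward")                   # agglomerative (Ward) clustering
groups = fcluster(Z, t=2, criterion="maxclust")  # cut the dendrogram into 2 clusters
print(groups)
```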

  16. Audiometric changes with age in Hiroshima: a statistical study

    Energy Technology Data Exchange (ETDEWEB)

    Hollingsworth, J W; Ishii, Goro

    1960-10-01

    Audiometry observations were analyzed for 290 irradiated survivors of the 1945 atomic bomb in Hiroshima and for 293 nonirradiated subjects. The study was undertaken in order to determine the age changes in audiology in irradiated and nonirradiated subjects, as well as to investigate the pattern of hearing levels in a Japanese population for comparison with patterns in Caucasians. The following statistical observations were made: the correlation between hearing levels for the right and left ear; the correlation between hearing levels at various cycles; and changes in hearing levels by age and sex. The relation between age and decibel loss was not linear, and correlation ratios with age were 0.45 to 0.72. Audiometry seems to be of some value as one of a battery of tests of physiologic aging designed for the detection of irradiation-induced nonspecific aging acceleration. In this relatively small sample, no differences in hearing acuity were detected in the atomic bomb survivors as compared with the control sample. 6 references, 3 figures, 9 tables.

  17. Statistical properties of Joule heating rate, electric field and conductances at high latitudes

    Directory of Open Access Journals (Sweden)

    A. T. Aikio

    2009-07-01

    Statistical properties of Joule heating rate, electric field and conductances in the high-latitude ionosphere are studied by a unique one-month measurement made by the EISCAT incoherent scatter radar in Tromsø (66.6° cgmlat) from 6 March to 6 April 2006. The data are from the same season (close to vernal equinox) and from similar sunspot conditions (about 1.5 years before the sunspot minimum), providing an excellent set of data to study the MLT and Kp dependence of parameters with high temporal and spatial resolution.

    All the parameters show a clear MLT variation, which is different for low and high Kp conditions. Our results indicate that the response of morning sector conductances and conductance ratios to increased magnetic activity is stronger than that of the evening sector. The co-location of the Pedersen conductance maximum and the electric field maximum in the morning sector produces the largest Joule heating rates at 03–05 MLT for Kp≥3. In the evening sector, a smaller maximum occurs at 18 MLT. Minimum Joule heating rates in the nightside are statistically observed at 23 MLT, which is the location of the electric Harang discontinuity.

    An important outcome of the paper is the set of fitted functions for the Joule heating rate as a function of electric field magnitude, separately for four MLT sectors and two activity levels (Kp<3 and Kp≥3). In addition to the squared electric field, the fit includes a linear term to study the possible anticorrelation or correlation between electric field and conductance. In the midday sector, positive correlation is found, as well as in the morning sector for the high activity case. In the midnight and evening sectors, anticorrelation between electric field and conductance is obtained, i.e. high electric fields are associated with low conductances. This is expected to occur in the return current regions adjacent to auroral arcs.

  18. Statistical properties of Joule heating rate, electric field and conductances at high latitudes

    Directory of Open Access Journals (Sweden)

    A. T. Aikio

    2009-07-01

    Statistical properties of Joule heating rate, electric field and conductances in the high-latitude ionosphere are studied by a unique one-month measurement made by the EISCAT incoherent scatter radar in Tromsø (66.6° cgmlat) from 6 March to 6 April 2006. The data are from the same season (close to vernal equinox) and from similar sunspot conditions (about 1.5 years before the sunspot minimum), providing an excellent set of data to study the MLT and Kp dependence of parameters with high temporal and spatial resolution. All the parameters show a clear MLT variation, which is different for low and high Kp conditions. Our results indicate that the response of morning sector conductances and conductance ratios to increased magnetic activity is stronger than that of the evening sector. The co-location of the Pedersen conductance maximum and the electric field maximum in the morning sector produces the largest Joule heating rates at 03–05 MLT for Kp≥3. In the evening sector, a smaller maximum occurs at 18 MLT. Minimum Joule heating rates in the nightside are statistically observed at 23 MLT, which is the location of the electric Harang discontinuity. An important outcome of the paper is the set of fitted functions for the Joule heating rate as a function of electric field magnitude, separately for four MLT sectors and two activity levels (Kp<3 and Kp≥3). In addition to the squared electric field, the fit includes a linear term to study the possible anticorrelation or correlation between electric field and conductance. In the midday sector, positive correlation is found, as well as in the morning sector for the high activity case. In the midnight and evening sectors, anticorrelation between electric field and conductance is obtained, i.e. high electric fields are associated with low conductances. This is expected to occur in the return current regions adjacent to auroral arcs as a result of ionosphere-magnetosphere coupling, as discussed by Aikio et al. (2004).
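
    The fitted functional form described above, a squared electric-field term plus a linear term, can be reproduced with ordinary least squares (synthetic data; the actual per-sector coefficients are in the paper):

```python
import numpy as np

rng = np.random.default_rng(5)
E = np.linspace(5.0, 60.0, 80)                              # electric field magnitude, mV/m
Q = 0.012 * E**2 - 0.05 * E + rng.normal(0.0, 1.0, E.size)  # Joule heating rate (invented units)

# Design matrix with the squared and linear terms: Q = a*E^2 + b*E.
X = np.column_stack([E**2, E])
(a, b), *_ = np.linalg.lstsq(X, Q, rcond=None)
print(f"Q ~ {a:.4f}*E^2 {b:+.4f}*E  (a negative b would suggest E-conductance anticorrelation)")
```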

  19. Statistical issues in searches for new phenomena in High Energy Physics

    Science.gov (United States)

    Lyons, Louis; Wardle, Nicholas

    2018-03-01

    Many analyses of data in High Energy Physics are concerned with searches for New Physics. We review the statistical issues that arise in such searches, and then illustrate these using the specific example of the recent successful search for the Higgs boson, produced in collisions between high energy protons at CERN’s Large Hadron Collider.
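
    One recurring statistical issue in such searches is converting a local p-value into a Gaussian significance (the familiar 5σ discovery convention). The conversion is a one-liner:

```python
from scipy.stats import norm

p_local = 2.87e-7          # roughly the one-sided p-value corresponding to 5 sigma
z = norm.isf(p_local)      # inverse survival function: p -> significance in sigma
print(f"p = {p_local:.2e}  ->  Z = {z:.2f} sigma")
```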

  20. Integration of statistical modeling and high-content microscopy to systematically investigate cell-substrate interactions.

    Science.gov (United States)

    Chen, Wen Li Kelly; Likhitpanichkul, Morakot; Ho, Anthony; Simmons, Craig A

    2010-03-01

    Cell-substrate interactions are multifaceted, involving the integration of various physical and biochemical signals. The interactions among these microenvironmental factors cannot be facilely elucidated and quantified by conventional experimentation, and necessitate multifactorial strategies. Here we describe an approach that integrates statistical design and analysis of experiments with automated microscopy to systematically investigate the combinatorial effects of substrate-derived stimuli (substrate stiffness and matrix protein concentration) on mesenchymal stem cell (MSC) spreading, proliferation and osteogenic differentiation. C3H10T1/2 cells were grown on type I collagen- or fibronectin-coated polyacrylamide hydrogels with tunable mechanical properties. Experimental conditions, which were defined according to central composite design, consisted of specific permutations of substrate stiffness (3-144 kPa) and adhesion protein concentration (7-520 μg/mL). Spreading area, BrdU incorporation and Runx2 nuclear translocation were quantified using high-content microscopy and modeled as mathematical functions of substrate stiffness and protein concentration. The resulting response surfaces revealed distinct patterns of protein-specific, substrate stiffness-dependent modulation of MSC proliferation and differentiation, demonstrating the advantage of statistical modeling in the detection and description of higher-order cellular responses. In a broader context, this approach can be adapted to study other types of cell-material interactions and can facilitate the efficient screening and optimization of substrate properties for applications involving cell-material interfaces.
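
    A minimal sketch of the response-surface step: fit a full quadratic model in the two substrate factors by least squares on a face-centred central composite layout (coded factor levels; the responses are invented, not the study's measurements):

```python
import numpy as np

rng = np.random.default_rng(6)
# Face-centred central composite design in 2 factors: factorial, axial and centre points.
pts = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1],
                [0, 0], [0, 0], [0, 0]], dtype=float)
x1, x2 = pts[:, 0], pts[:, 1]        # coded stiffness and protein concentration
y = 50 + 8*x1 + 5*x2 - 6*x1**2 - 3*x2**2 + 4*x1*x2 + rng.normal(0, 1, len(pts))

# Quadratic response surface: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2.
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 2))
```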

  1. Waste generated in high-rise buildings construction: a quantification model based on statistical multiple regression.

    Science.gov (United States)

    Parisi Kern, Andrea; Ferreira Dias, Michele; Piva Kulakowski, Marlova; Paulo Gomes, Luciana

    2015-05-01

    Reducing construction waste is becoming a key environmental issue in the construction industry. The quantification of waste generation rates in the construction sector is an invaluable management tool in supporting mitigation actions. However, the quantification of waste can be a difficult process because of the specific characteristics and the wide range of materials used in different construction projects. Large variations are observed in the methods used to predict the amount of waste generated because of the range of variables involved in construction processes and the different contexts in which these methods are employed. This paper proposes a statistical model to determine the amount of waste generated in the construction of high-rise buildings by assessing the influence of the design process and the production system, often mentioned as the major culprits behind the generation of waste in construction. Multiple regression was used to conduct a case study based on multiple sources of data from eighteen residential buildings. The resulting statistical model relates the dependent variable (the amount of waste generated) to independent variables associated with the design and the production system used. The best regression model obtained from the sample data had an adjusted R(2) value of 0.694, meaning that it explains approximately 69% of the variability in waste generation for similar constructions. Most independent variables showed a low determination coefficient when assessed in isolation, which emphasizes the importance of assessing their joint influence on the response (dependent) variable. Copyright © 2015 Elsevier Ltd. All rights reserved.
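
    For readers unfamiliar with the adjusted R(2) quoted above, a minimal Python sketch of a multiple regression and its adjusted R(2); the sample size (18 buildings) matches the abstract, but the predictors and responses are synthetic placeholders.

      import numpy as np

      def adjusted_r2(y, y_hat, n_predictors):
          # Adjusted R^2 penalizes R^2 for the number of predictors used.
          n = len(y)
          ss_res = np.sum((y - y_hat) ** 2)
          ss_tot = np.sum((y - np.mean(y)) ** 2)
          r2 = 1.0 - ss_res / ss_tot
          return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)

      rng = np.random.default_rng(2)
      X = rng.normal(size=(18, 3))    # 3 hypothetical design/production variables
      y = 50 + X @ np.array([5.0, -3.0, 2.0]) + rng.normal(0, 4, 18)   # waste, e.g. kg/m^2
      Xd = np.column_stack([np.ones(18), X])
      beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
      print(adjusted_r2(y, Xd @ beta, n_predictors=3))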

  2. A High-resolution Atlas and Statistical Model of the Vocal Tract from Structural MRI.

    Science.gov (United States)

    Woo, Jonghye; Lee, Junghoon; Murano, Emi Z; Xing, Fangxu; Al-Talib, Meena; Stone, Maureen; Prince, Jerry L

    Magnetic resonance imaging (MRI) is an essential tool in the study of muscle anatomy and functional activity in the tongue. Objective assessment of similarities and differences in tongue structure and function has been performed using unnormalized data, but this is biased by the differences in size, shape, and orientation of the structures. To remedy this, we propose a methodology to build a 3D vocal tract atlas based on structural MRI volumes from twenty normal subjects. We first constructed high-resolution volumes from three orthogonal stacks. We then removed extraneous data so that all 3D volumes contained the same anatomy. We used an unbiased diffeomorphic groupwise registration using a cross-correlation similarity metric. Principal component analysis was applied to the deformation fields to create a statistical model from the atlas. Various evaluations and applications were carried out to show the behaviour and utility of the atlas.

  3. Matching of experimental and statistical-model thermonuclear reaction rates at high temperatures

    International Nuclear Information System (INIS)

    Newton, J. R.; Longland, R.; Iliadis, C.

    2008-01-01

    We address the problem of extrapolating experimental thermonuclear reaction rates toward high stellar temperatures (T>1 GK) by using statistical model (Hauser-Feshbach) results. Reliable reaction rates at such temperatures are required for studies of advanced stellar burning stages, supernovae, and x-ray bursts. Generally accepted methods are based on the concept of a Gamow peak. We follow recent ideas that emphasized the fundamental shortcomings of the Gamow peak concept for narrow resonances at high stellar temperatures. Our new method defines the effective thermonuclear energy range (ETER) by using the 8th, 50th, and 92nd percentiles of the cumulative distribution of fractional resonant reaction rate contributions. This definition is unambiguous and has a straightforward probability interpretation. The ETER is used to define a temperature at which Hauser-Feshbach rates can be matched to experimental rates. This matching temperature is usually much higher compared to previous estimates that employed the Gamow peak concept. We suggest that an increased matching temperature provides more reliable extrapolated reaction rates since Hauser-Feshbach results are more trustworthy the higher the temperature. Our ideas are applied to 21 (p,γ), (p,α), and (α,γ) reactions on A=20-40 target nuclei. For many of the cases studied here, our extrapolated reaction rates at high temperatures differ significantly from those obtained using the Gamow peak concept.
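
    The percentile definition of the ETER lends itself to a very short implementation. A sketch in Python, with hypothetical resonance energies and fractional rate contributions (the real inputs would come from the experimental rate evaluation):

      import numpy as np

      def eter(energies, fractions, qs=(0.08, 0.50, 0.92)):
          # Effective thermonuclear energy range: the 8th, 50th and 92nd
          # percentiles of the cumulative fractional rate contributions.
          order = np.argsort(energies)
          e = np.asarray(energies, float)[order]
          f = np.asarray(fractions, float)[order]
          cdf = np.cumsum(f) / np.sum(f)
          return tuple(np.interp(qs, cdf, e))

      # Hypothetical resonance energies (MeV) and contributions at some temperature.
      print(eter([0.2, 0.4, 0.7, 1.1, 1.6], [0.05, 0.20, 0.40, 0.25, 0.10]))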

  4. High performance statistical computing with parallel R: applications to biology and climate modelling

    International Nuclear Information System (INIS)

    Samatova, Nagiza F; Branstetter, Marcia; Ganguly, Auroop R; Hettich, Robert; Khan, Shiraj; Kora, Guruprasad; Li, Jiangtian; Ma, Xiaosong; Pan, Chongle; Shoshani, Arie; Yoginath, Srikanth

    2006-01-01

    Ultrascale computing and high-throughput experimental technologies have enabled the production of scientific data about complex natural phenomena. With this opportunity comes a new problem: the massive quantities of data so produced. Answers to fundamental questions about the nature of those phenomena remain largely hidden in the produced data. The goal of this work is to provide a scalable, high-performance statistical data analysis framework to help scientists perform interactive analyses of these raw data to extract knowledge. Towards this goal we have been developing an open source parallel statistical analysis package, called Parallel R, that lets scientists employ a wide range of statistical analysis routines on high performance shared and distributed memory architectures without having to deal with the intricacies of parallelizing these routines.
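
    Parallel R itself is an R package, but the scatter-compute-gather pattern it implements is language neutral. A hedged Python analogy using the standard multiprocessing module (this is not the package's API, only an illustration of the pattern):

      from multiprocessing import Pool
      import numpy as np

      def column_stats(col):
          # Each column summary is independent of the others, hence trivially parallel.
          return float(col.mean()), float(col.std())

      if __name__ == "__main__":
          data = np.random.default_rng(3).normal(size=(10_000, 64))
          with Pool(processes=4) as pool:
              results = pool.map(column_stats, [data[:, j] for j in range(data.shape[1])])
          print(results[:3])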

  5. The case for increasing the statistical power of eddy covariance ecosystem studies: why, where and how?

    Science.gov (United States)

    Hill, Timothy; Chocholek, Melanie; Clement, Robert

    2017-06-01

    Eddy covariance (EC) continues to provide invaluable insights into the dynamics of Earth's surface processes. However, despite its many strengths, spatial replication of EC at the ecosystem scale is rare. High equipment costs are likely to be partially responsible. This contributes to the low sampling, and even lower replication, of ecoregions in Africa, Oceania (excluding Australia) and South America. The level of replication matters as it directly affects statistical power. While the ergodicity of turbulence and temporal replication allow an EC tower to provide statistically robust flux estimates for its footprint, these principles do not extend to larger ecosystem scales. Despite the challenge of spatially replicating EC, it is clearly of interest to be able to use EC to provide statistically robust flux estimates for larger areas. We ask: How much spatial replication of EC is required for statistical confidence in our flux estimates of an ecosystem? We provide the reader with tools to estimate the number of EC towers needed to achieve a given statistical power. We show that for a typical ecosystem, around four EC towers are needed to have 95% statistical confidence that the annual flux of an ecosystem is nonzero. Furthermore, if the true flux is small relative to instrument noise and spatial variability, the number of towers needed can rise dramatically. We discuss approaches for improving statistical power and describe one solution: an inexpensive EC system that could help by making spatial replication more affordable. However, we note that diverting limited resources from other key measurements in order to allow spatial replication may not be optimal, and a balance needs to be struck. While individual EC towers are well suited to providing fluxes from the flux footprint, we emphasize that spatial replication is essential for statistically robust fluxes if a wider ecosystem is being studied. © 2016 The Authors Global Change Biology Published by John Wiley
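
    The "around four towers" figure can be reproduced with a textbook power calculation. A minimal Python sketch under a normal approximation, with hypothetical flux and between-tower variability values:

      from math import ceil
      from scipy.stats import norm

      def towers_needed(effect, sigma, alpha=0.05, power=0.95):
          # One-sample, two-sided test that the mean annual flux differs from zero.
          z_a = norm.isf(alpha / 2)
          z_b = norm.isf(1 - power)
          return ceil(((z_a + z_b) * sigma / effect) ** 2)

      # Hypothetical: mean flux 100 gC m-2 yr-1, between-tower SD 50 gC m-2 yr-1.
      print(towers_needed(effect=100, sigma=50))   # -> 4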

  6. Statistical Surface Recovery: A Study on Ear Canals

    DEFF Research Database (Denmark)

    Jensen, Rasmus Ramsbøl; Olesen, Oline Vinter; Paulsen, Rasmus Reinhold

    2012-01-01

    We present a method for surface recovery in partial surface scans based on a statistical model. The framework is based on multivariate point prediction, where the distribution of the points is learned from an annotated data set. The training set consists of surfaces with dense correspondence...... that are Procrustes aligned. The average shape and point covariances can be estimated from this set. It is shown how missing data in a new given shape can be predicted using the learned statistics. The method is evaluated on a data set of 29 scans of ear canal impressions. By using a leave-one-out approach we...
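
    The multivariate point prediction underlying this kind of surface recovery amounts to a conditional Gaussian mean. A minimal Python sketch with a toy mean and covariance standing in for the statistics learned from the training scans:

      import numpy as np

      def predict_missing(mu, cov, x_obs, obs_idx, mis_idx):
          # Conditional mean of the missing coordinates given the observed ones.
          mu_o, mu_m = mu[obs_idx], mu[mis_idx]
          S_oo = cov[np.ix_(obs_idx, obs_idx)]
          S_mo = cov[np.ix_(mis_idx, obs_idx)]
          return mu_m + S_mo @ np.linalg.solve(S_oo, x_obs - mu_o)

      rng = np.random.default_rng(4)
      A = rng.normal(size=(6, 6))
      cov = A @ A.T + np.eye(6)     # toy SPD covariance from "training shapes"
      mu = np.zeros(6)
      print(predict_missing(mu, cov, np.array([0.5, -0.2, 0.1]),
                            obs_idx=[0, 1, 2], mis_idx=[3, 4, 5]))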

  7. Methodological difficulties of conducting agroecological studies from a statistical perspective

    DEFF Research Database (Denmark)

    Bianconi, A.; Dalgaard, Tommy; Manly, Bryan F J

    2013-01-01

    Statistical methods for analysing agroecological data might not be able to help agroecologists to solve all of the current problems concerning crop and animal husbandry, but such methods could well help agroecologists to assess, tackle, and resolve several agroecological issues in a more reliable...... and accurate manner. Therefore, our goal in this paper is to discuss the importance of statistical tools for alternative agronomic approaches, because alternative approaches, such as organic farming, should not only be promoted by encouraging farmers to deploy agroecological techniques, but also by providing...

  8. A statistical study on consumer's perception of sustainable products

    Science.gov (United States)

    Pater, Liana; Izvercian, Monica; Ivaşcu, Larisa

    2017-07-01

    Sustainability and sustainable concepts are quite often, but not always, used correctly. The statistical research on consumers' perception of sustainable products has tried to identify the level of knowledge regarding the concept of sustainability and sustainable products, the criteria selected in the buying decision, the intention of purchasing a sustainable product, and the main sustainable products preferred by consumers.

  9. Enrichment of statistical power for genome-wide association studies

    Science.gov (United States)

    The inheritance of most human diseases and agriculturally important traits is controlled by many genes with small effects. Identifying these genes, while simultaneously controlling false positives, is challenging. Among available statistical methods, the mixed linear model (MLM) has been the most fl...

  10. Some statistical aspects of the generalizability of occupational health studies

    NARCIS (Netherlands)

    D. Lugtenburg (Dirk)

    1992-01-01

    The present thesis discusses both the methodology as developed and its applications in practice. Methods of dealing with the occurrence of missing data in statistical analysis are presented in Chapter 2 as a review of recent literature on this topic. Complete issues of Biometrics,

  11. Statistics of high-altitude and high-latitude O+ ion outflows observed by Cluster/CIS

    Directory of Open Access Journals (Sweden)

    A. Korth

    2005-07-01

    The persistent outflows of O+ ions observed by the Cluster CIS/CODIF instrument were studied statistically in the high-altitude (from 3 up to 11 RE) and high-latitude (from 70 to ~90 deg invariant latitude, ILAT) polar region. The principal results are: (1) outflowing O+ ions with more than 1 keV are observed above 10 RE geocentric distance and above 85 deg ILAT; (2) at 6-8 RE geocentric distance, the latitudinal distribution of O+ ion outflow is consistent with velocity filter dispersion from a source equatorward of and below the spacecraft (e.g. the cusp/cleft); (3) however, at 8-12 RE geocentric distance the distribution of O+ outflows cannot be explained by velocity filter only. The results suggest that additional energization or acceleration processes for outflowing O+ ions occur at high altitudes and high latitudes in the dayside polar region. Keywords. Magnetospheric physics (magnetospheric configuration and dynamics; solar wind-magnetosphere interactions)

  12. Statistical analyses in the study of solar wind-magnetosphere coupling

    International Nuclear Information System (INIS)

    Baker, D.N.

    1985-01-01

    Statistical analyses provide a valuable method for establishing initially the existence (or lack of existence) of a relationship between diverse data sets. Statistical methods also allow one to make quantitative assessments of the strengths of observed relationships. This paper reviews the essential techniques and underlying statistical bases for the use of correlative methods in solar wind-magnetosphere coupling studies. Techniques of visual correlation and time-lagged linear cross-correlation analysis are emphasized, but methods of multiple regression, superposed epoch analysis, and linear prediction filtering are also described briefly. The long history of correlation analysis in the area of solar wind-magnetosphere coupling is reviewed with the assessments organized according to data averaging time scales (minutes to years). It is concluded that these statistical methods can be very useful first steps, but that case studies and various advanced analysis methods should be employed to understand fully the average response of the magnetosphere to solar wind input. It is clear that many workers have not always recognized underlying assumptions of statistical methods and thus the significance of correlation results can be in doubt. Long-term averages (greater than or equal to 1 hour) can reveal gross relationships, but only when dealing with high-resolution data (1 to 10 min) can one reach conclusions pertinent to magnetospheric response time scales and substorm onset mechanisms
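
    Since time-lagged linear cross-correlation is the workhorse technique emphasized here, a minimal Python sketch on evenly sampled series (synthetic driver and response; lag units are sample steps):

      import numpy as np

      def lagged_crosscorr(x, y, max_lag):
          # Pearson correlation of y against x for each lag; positive lag means y lags x.
          out = {}
          for lag in range(-max_lag, max_lag + 1):
              if lag >= 0:
                  a, b = x[:len(x) - lag], y[lag:]
              else:
                  a, b = x[-lag:], y[:lag]
              out[lag] = np.corrcoef(a, b)[0, 1]
          return out

      rng = np.random.default_rng(5)
      drv = rng.normal(size=500)                            # e.g. a solar wind parameter
      resp = np.roll(drv, 3) + 0.5 * rng.normal(size=500)   # response lagging by 3 steps
      cc = lagged_crosscorr(drv, resp, max_lag=10)
      print(max(cc, key=cc.get))                            # expected: about 3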

  13. Statistical Literacy: High School Students in Reading, Interpreting and Presenting Data

    Science.gov (United States)

    Hafiyusholeh, M.; Budayasa, K.; Siswono, T. Y. E.

    2018-01-01

    One of the foundations for high school students in statistics is to be able to read data and to present data in the form of tables and diagrams together with their interpretation. The purpose of this study is to describe high school students' competencies in reading, interpreting and presenting data. The subjects were a male and a female student with high levels of mathematical ability. Data were collected through task formulations and analyzed by data reduction, presentation and verification. Results showed that the students read the data based on explicit features of the diagram, such as explaining points in the diagram as the relation between the x and y axes and determining the simple trend of a graph, including the maximum and minimum points. In interpreting and summarizing the data, both subjects paid attention to general data trends and used them to predict increases or decreases in the data. The male student estimated the (n+1)th value of the weight data by using the mode of the data, while the female estimated it by using the average. The male student tended not to consider the characteristics of the data, while the female considered them more carefully.

  14. New method for eliminating the statistical bias in highly turbulent flow measurements

    International Nuclear Information System (INIS)

    Nakao, S.I.; Terao, Y.; Hirata, K.I.; Kitakyushu Industrial Research Institute, Fukuoka, Japan)

    1987-01-01

    A simple method was developed for eliminating statistical bias which can be applied to highly turbulent flows under sparse and nonuniform seeding conditions. Unlike methods proposed so far, the weighting function was determined based on the idea that the statistical bias can be eliminated if the asymmetric form of the probability density function of the velocity data is corrected. Moreover, data more than three standard deviations away from the mean were discarded to remove the apparent turbulent intensity resulting from noise. The present method was applied to data obtained in the wake of a block, with local turbulent intensities up to about 120 percent, and was found to eliminate the statistical bias with high accuracy. 9 references
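
    The outlier-rejection step described above (discarding samples more than three standard deviations from the mean) is easy to sketch; the PDF-asymmetry weighting itself depends on the measured velocity distribution, so only the clipping is shown here (Python, synthetic data):

      import numpy as np

      def clip_outliers(v, nsigma=3.0):
          # Single-pass rejection of samples beyond nsigma standard deviations.
          v = np.asarray(v, float)
          mu, sd = v.mean(), v.std()
          return v[np.abs(v - mu) <= nsigma * sd]

      rng = np.random.default_rng(6)
      v = np.concatenate([rng.normal(10.0, 3.0, 1000),   # turbulent velocity samples
                          rng.normal(10.0, 40.0, 10)])   # noise-dominated outliers
      print(len(v), len(clip_outliers(v)))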

  15. Statistical Analysis for High-Dimensional Data : The Abel Symposium 2014

    CERN Document Server

    Bühlmann, Peter; Glad, Ingrid; Langaas, Mette; Richardson, Sylvia; Vannucci, Marina

    2016-01-01

    This book features research contributions from The Abel Symposium on Statistical Analysis for High Dimensional Data, held in Nyvågar, Lofoten, Norway, in May 2014. The focus of the symposium was on statistical and machine learning methodologies specifically developed for inference in “big data” situations, with particular reference to genomic applications. The contributors, who are among the most prominent researchers on the theory of statistics for high dimensional inference, present new theories and methods, as well as challenging applications and computational solutions. Specific themes include, among others, variable selection and screening, penalised regression, sparsity, thresholding, low dimensional structures, computational challenges, non-convex situations, learning graphical models, sparse covariance and precision matrices, semi- and non-parametric formulations, multiple testing, classification, factor models, clustering, and preselection. Highlighting cutting-edge research and casting light on...

  16. A statistical approach to the study of concrete carbonation

    Directory of Open Access Journals (Sweden)

    Garcia-Lodeiro, I.

    2014-03-01

    Carbonation is one of the factors that conditions reinforced concrete durability, while porosity is one of the parameters that determines the carbonation rate: as a rule, the greater the porosity, the higher the rate. While many papers have been published on the effect of CO2 penetration in the pore solutions of concretes prepared under different experimental conditions, the literature has yet to address the joint effect of the factors considered in concrete design, such as the water/cement (w/c) ratio, type of cement, type of aggregate and presence of admixtures. The present paper discusses the findings of a statistical study of the impact of the aforementioned factors on both system porosity and carbonation rate. The type of cement, individually and in its interaction with the rest of the factors, proved to be the major determinant in concrete carbonation.

  17. Statistical study on the strength of structural materials and elements

    International Nuclear Information System (INIS)

    Blume, J.A.; Dalal, J.S.; Honda, K.K.

    1975-07-01

    Strength data for structural materials and elements including concrete, reinforcing steel, structural steel, plywood elements, reinforced concrete beams, reinforced concrete columns, brick masonry elements, and concrete masonry walls were statistically analyzed. Sample statistics were computed for these data, and distribution parameters were derived for normal, lognormal, and Weibull distributions. Goodness-of-fit tests were performed on these distributions. Most data, except those for masonry elements, displayed fairly small dispersion. Dispersion in data for structural materials was generally found to be smaller than for structural elements. Lognormal and Weibull distributions displayed better overall fits to data than normal distribution, although either Weibull or lognormal distribution can be used to represent the data analyzed. (auth)
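
    The fitting and goodness-of-fit workflow described here maps directly onto scipy.stats. A minimal Python sketch with synthetic "strength" data (note that Kolmogorov-Smirnov p-values are optimistic when the parameters are fitted from the same data):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      # Synthetic stand-in for, e.g., concrete compressive strength (MPa).
      data = rng.lognormal(mean=np.log(30.0), sigma=0.12, size=200)

      for name, dist in [("normal", stats.norm),
                         ("lognormal", stats.lognorm),
                         ("weibull", stats.weibull_min)]:
          params = dist.fit(data)
          ks = stats.kstest(data, dist.cdf, args=params)
          print(f"{name:>9}: KS = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")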

  18. Statistical modeling in phenomenological description of electromagnetic cascade processes produced by high-energy gamma quanta

    International Nuclear Information System (INIS)

    Slowinski, B.

    1987-01-01

    A description of a simple phenomenological model of electromagnetic cascade process (ECP) initiated by high-energy gamma quanta in heavy absorbents is given. Within this model spatial structure and fluctuations of ionization losses of shower electrons and positrons are described. Concrete formulae have been obtained as a result of statistical analysis of experimental data from the xenon bubble chamber of ITEP (Moscow)

  19. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon; Monteiro, Paulo J.M.; Macphee, Donald E.; Glasser, Fredrik P.; Imbabi, Mohammed Salah-Eldin

    2014-01-01

    The authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive

  20. Statistical model of a gas diffusion electrode. III. Photomicrograph study

    Energy Technology Data Exchange (ETDEWEB)

    Winsel, A W

    1965-12-01

    A linear section through a gas diffusion electrode produces a certain distribution function of sinews with the pores. From this distribution function some qualities of the pore structure are derived, and an automatic device to determine the distribution function is described. With a statistical model of a gas diffusion electrode the behavior of a DSK electrode is discussed and compared with earlier measurements of the flow resistance of this material.

  1. The Continuation of Cloud Statistics for NASA Climate Change Studies

    Science.gov (United States)

    Wylie, Donald P.

    2001-01-01

    The weather systems, cyclones and anticyclones, along with air trajectories and cloud forms, are compared to past studies of the Arctic to assess the compatibility of the four-month study of the Arctic Cloud Experiment flights of the First ISCCP Regional Experiment (FIRE/ACE) with past climatologies. The frequency and movement of cyclones (lows) and anticyclones (highs) followed the general eastward and northeastward directions indicated by past studies. Most cyclones (lows) came from eastern Siberia and the Bering Sea to the south and moved north across the Bering Strait or Alaska into the Arctic Ocean. They generally weakened in central pressure as they moved poleward. Anticyclones (highs) were most common in the eastern Beaufort Sea near Canada in June and July, as predicted from previous studies. However, many cyclones and anticyclones moved in westward directions, which is rare at other latitudes. Erratic changes in shape and intensity on a daily basis were also observed. The National Center for Environmental Prediction (NCEP) analysis generally reflected the Surface Heat Budget in the Arctic (SHEBA) Ship World Meteorological Organization (WMO) observations which it used. However, NCEP temperatures were biased warm by 1.0 to 1.5 C in April and early May. In July, when surface temperatures were at the freezing/thawing point, the NCEP analysis changed to a cold bias of -1.0 C. Dew points had smaller biases except for July, where they were biased cold by -1.4 C. Wind speeds had a -2 m/s low bias for the six windiest days. Surface barometric pressures had consistently low biases from -1.2 to -2.8 hPa in all four months. Air parcel historical trajectories were mainly from the south or from local anticyclonic gyres in the Beaufort Sea. Most air came to the SHEBA Ship from the north Pacific Ocean or from Alaska and Canada, and occasionally from eastern Siberia. Very few trajectories traced back across the pole to Europe and Central Asia. Cloud cover was high, as

  2. Applied Bayesian statistical studies in biology and medicine

    CERN Document Server

    D’Amore, G; Scalfari, F

    2004-01-01

    It was written on another occasion that "It is apparent that the scientific culture, if one means production of scientific papers, is growing exponentially, and chaotically, in almost every field of investigation". The biomedical sciences sensu lato and mathematical statistics are no exceptions. One might say then, and with good reason, that another collection of biostatistical papers would only add to the overflow and cause even more confusion. Nevertheless, this book may be greeted with some interest if we state that most of the papers in it are the result of a collaboration between biologists and statisticians, and partly the product of the Summer School "Statistical Inference in Human Biology", which reaches its 10th edition in 2003 (information about the School can be obtained at the Web site http://www2.stat.unibo.it/eventi/Sito%20scuola/index.htm). This is rather important. Indeed, it is common experience - and not only in Italy - that encounters between statisticians and researchers are spora...

  3. Outcomes Definitions and Statistical Tests in Oncology Studies: A Systematic Review of the Reporting Consistency.

    Science.gov (United States)

    Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie

    2016-01-01

    The quality of reporting for Randomized Clinical Trials (RCTs) in oncology has been analyzed in several systematic reviews, but, in this setting, there is a paucity of data on outcome definitions and the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe those two reporting aspects for OBS and RCTs in oncology. From a list of 19 medical journals, three were retained for analysis after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess the quality of reporting of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify study covariates potentially associated with the concordance of tests between the Methods and Results sections. 826 studies were included in the review, of which 698 were OBS. Variables were described in the Methods section for all OBS, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement of the reported statistical test between the Methods and Results sections. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the aOR (adjusted Odds Ratio) for the third group compared to the first group was aOR Grp3 = 0.52 [0.31-0.89] (P value = 0.009). Variables in OBS and primary endpoints in RCTs are reported and described with a high frequency. However, statistical test consistency between the Methods and Results sections of OBS is not always observed. Therefore, we encourage authors and peer reviewers to verify the consistency of statistical tests in oncology studies.

  4. Addressing economic development goals through innovative teaching of university statistics: a case study of statistical modelling in Nigeria

    Science.gov (United States)

    Oseloka Ezepue, Patrick; Ojo, Adegbola

    2012-12-01

    A challenging problem in some developing countries such as Nigeria is inadequate training of students in effective problem solving using the core concepts of their disciplines. Related to this is a disconnection between their learning and socio-economic development agenda of a country. These problems are more vivid in statistical education which is dominated by textbook examples and unbalanced assessment 'for' and 'of' learning within traditional curricula. The problems impede the achievement of socio-economic development objectives such as those stated in the Nigerian Vision 2020 blueprint and United Nations Millennium Development Goals. They also impoverish the ability of (statistics) graduates to creatively use their knowledge in relevant business and industry sectors, thereby exacerbating mass graduate unemployment in Nigeria and similar developing countries. This article uses a case study in statistical modelling to discuss the nature of innovations in statistics education vital to producing new kinds of graduates who can link their learning to national economic development goals, create wealth and alleviate poverty through (self) employment. Wider implications of the innovations for repositioning mathematical sciences education globally are explored in this article.

  5. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    Energy Technology Data Exchange (ETDEWEB)

    Chukbar, B. K., E-mail: bchukbar@mail.ru [National Research Center Kurchatov Institute (Russian Federation)]

    2015-12-15

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of the distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for microfuel concentrations up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenization of the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  6. Non-statistical fluctuations in fragmentation of target nuclei in high energy nuclear interactions

    Energy Technology Data Exchange (ETDEWEB)

    Ghosh, Dipak; Ghosh, Premomoy; Ghosh, Alokananda; Roy, Jaya [Jadavpur Univ., Calcutta (India)]

    1994-07-01

    Analysis of target fragmented ''black'' particles in nuclear emulsion from high energy relativistic interactions initiated by ¹⁶O at 2.1 GeV/nucleon and ¹²C and ²⁴Mg at 4.5 GeV/nucleon reveals the existence of non-statistical fluctuations in the azimuthal plane of interaction. The asymmetry, or non-statistical fluctuation, while found to be independent of projectile mass or incident energy, depends on the excitation energy of the target nucleus. (Author).

  7. Non-statistical fluctuations in fragmentation of target nuclei in high energy nuclear interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Ghosh, Premomoy; Ghosh, Alokananda; Roy, Jaya

    1994-01-01

    Analysis of target fragmented ''black'' particles in nuclear emulsion from high energy relativistic interactions initiated by ¹⁶O at 2.1 GeV/nucleon and ¹²C and ²⁴Mg at 4.5 GeV/nucleon reveals the existence of non-statistical fluctuations in the azimuthal plane of interaction. The asymmetry, or non-statistical fluctuation, while found to be independent of projectile mass or incident energy, depends on the excitation energy of the target nucleus. (Author)

  8. NONINVASIVE DIAGNOSIS OF BLADDER CANCER BY CROSS-POLARIZATION OPTICAL COHERENCE TOMOGRAPHY: A BLIND STATISTICAL STUDY

    Directory of Open Access Journals (Sweden)

    O. S. Streltsova

    2014-07-01

    Whether cross-polarization (CP) optical coherence tomography (OCT) could be used to detect early bladder cancer was ascertained; it was compared with traditional OCT within the framework of a blind (closed) clinical statistical study. One hundred and sixteen patients with local nonexophytic (flat) pathological processes of the bladder were examined; 360 CP OCT images were obtained and analyzed. The study used an OCT 1300-U CP optical coherence tomograph. CP OCT showed a high (94%) sensitivity and a high (84%) specificity in the identification of suspected nonexophytic areas in the urinary bladder.

  9. Statistics of Deep Convection in the Congo Basin Derived From High-Resolution Simulations.

    Science.gov (United States)

    White, B.; Stier, P.; Kipling, Z.; Gryspeerdt, E.; Taylor, S.

    2016-12-01

    Convection transports moisture, momentum, heat and aerosols through the troposphere, and so the temporal variability of convection is a major driver of global weather and climate. The Congo basin is home to some of the most intense convective activity on the planet and is under the strong seasonal influence of biomass burning aerosol. However, deep convection in the Congo basin remains understudied compared to other regions of tropical storm systems, especially the neighbouring, relatively well-understood West African climate system. We use the WRF model to perform a high-resolution, cloud-system-resolving simulation to investigate convective storm systems in the Congo. Our setup pushes the boundaries of current computational resources, using a 1 km grid length over a domain covering millions of square kilometres and for a time period of one month. This allows us to draw statistical conclusions on the nature of the simulated storm systems. Comparing data from satellite observations and the model enables us to quantify the diurnal variability of deep convection in the Congo basin, and allows us to evaluate our simulations despite the lack of in-situ observational data. This provides a more comprehensive analysis of the diurnal cycle than has previously been shown. Further, we show that high-resolution convection-permitting simulations performed over near-seasonal timescales can be used in conjunction with satellite observations as an effective tool to evaluate new convection parameterisations.

  10. Model Accuracy Comparison for High Resolution Insar Coherence Statistics Over Urban Areas

    Science.gov (United States)

    Zhang, Yue; Fu, Kun; Sun, Xian; Xu, Guangluan; Wang, Hongqi

    2016-06-01

    The interferometric coherence map derived from the cross-correlation of two complex registered synthetic aperture radar (SAR) images is a reflection of the imaged targets. In many applications, it can act as an independent information source or give additional information complementary to the intensity image. In particular, the statistical properties of the coherence are of great importance in land cover classification, segmentation and change detection. However, compared to the amount of work on the statistical characteristics of SAR intensity, there is considerably less research on interferometric SAR (InSAR) coherence statistics. To our knowledge, all of the existing work that focuses on InSAR coherence statistics models the coherence with a Gaussian distribution, with no discrimination between data resolutions or scene types. But the properties of coherence may differ for different data resolutions and scene types. In this paper, we investigate the coherence statistics for high resolution data over urban areas by comparing the accuracy of several typical statistical models. Four typical land classes, including buildings, trees, shadow and roads, are selected as representatives of urban areas. First, several regions are selected from the coherence map manually and labelled with their corresponding classes. Then we model the statistics of the pixel coherence for each type of region with different distributions, including Gaussian, Rayleigh, Weibull, Beta and Nakagami. Finally, we evaluate the model accuracy for each type of region. The experiments on TanDEM-X data show that the Beta model performs better than the other distributions.
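
    The model comparison described here can be sketched as a likelihood-based ranking of candidate distributions. A minimal Python example on synthetic coherence values (the Beta support is fixed to [0, 1]; the parameter count used for the AIC is deliberately rough):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      # Synthetic stand-in for per-pixel coherence in one land-class region.
      coh = np.clip(rng.beta(5, 2, size=2000), 1e-6, 1 - 1e-6)

      candidates = {"gaussian": stats.norm, "rayleigh": stats.rayleigh,
                    "weibull": stats.weibull_min, "beta": stats.beta,
                    "nakagami": stats.nakagami}
      for name, dist in candidates.items():
          # Coherence lies in [0, 1], so fix the Beta location and scale.
          kwargs = {"floc": 0, "fscale": 1} if name == "beta" else {}
          params = dist.fit(coh, **kwargs)
          aic = 2 * len(params) - 2 * np.sum(dist.logpdf(coh, *params))
          print(f"{name:>9}: AIC = {aic:.1f}")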

  11. MODEL ACCURACY COMPARISON FOR HIGH RESOLUTION INSAR COHERENCE STATISTICS OVER URBAN AREAS

    Directory of Open Access Journals (Sweden)

    Y. Zhang

    2016-06-01

    The interferometric coherence map derived from the cross-correlation of two complex registered synthetic aperture radar (SAR) images is a reflection of the imaged targets. In many applications, it can act as an independent information source or give additional information complementary to the intensity image. In particular, the statistical properties of the coherence are of great importance in land cover classification, segmentation and change detection. However, compared to the amount of work on the statistical characters of SAR intensity, there is considerably less research on interferometric SAR (InSAR) coherence statistics. To our knowledge, all of the existing work that focuses on InSAR coherence statistics models the coherence with a Gaussian distribution, with no discrimination between data resolutions or scene types. But the properties of coherence may differ for different data resolutions and scene types. In this paper, we investigate the coherence statistics for high resolution data over urban areas by comparing the accuracy of several typical statistical models. Four typical land classes, including buildings, trees, shadow and roads, are selected as representatives of urban areas. First, several regions are selected from the coherence map manually and labelled with their corresponding classes. Then we model the statistics of the pixel coherence for each type of region with different distributions, including Gaussian, Rayleigh, Weibull, Beta and Nakagami. Finally, we evaluate the model accuracy for each type of region. The experiments on TanDEM-X data show that the Beta model performs better than the other distributions.

  12. A Review of Study Designs and Statistical Methods for Genomic Epidemiology Studies using Next Generation Sequencing

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2015-04-01

    Results from numerous linkage and association studies have greatly deepened scientists' understanding of the genetic basis of many human diseases, yet some important questions remain unanswered. For example, although a large number of disease-associated loci have been identified from genome-wide association studies (GWAS) in the past 10 years, it is challenging to interpret these results, as most disease-associated markers have no clear functional roles in disease etiology, and all the identified genomic factors together explain only a small portion of disease heritability. With the help of next-generation sequencing (NGS), diverse types of genomic and epigenetic variations can be detected with high accuracy. More importantly, instead of using linkage disequilibrium to detect association signals based on a set of pre-set probes, NGS allows researchers to directly study all the variants in each individual, and therefore promises opportunities for identifying functional variants and a more comprehensive dissection of disease heritability. Although the current scale of NGS studies is still limited due to the high cost, the success of several recent studies suggests the great potential for applying NGS in genomic epidemiology, especially as the cost of sequencing continues to drop. In this review, we discuss several pioneering applications of NGS, summarize scientific discoveries for rare and complex diseases, and compare various study designs, including targeted sequencing and whole-genome sequencing, using population-based and family-based cohorts. Finally, we highlight recent advancements in statistical methods proposed for sequencing analysis, including group-based association tests, meta-analysis techniques, and annotation tools for variant prioritization.

  13. Statistical Analysis of 30 Years Rainfall Data: A Case Study

    Science.gov (United States)

    Arvind, G.; Ashok Kumar, P.; Girish Karthi, S.; Suribabu, C. R.

    2017-07-01

    Rainfall is a prime input for various engineering designs such as hydraulic structures, bridges and culverts, canals, storm water sewers and road drainage systems. A detailed statistical analysis of each region is essential to estimate the relevant input values for the design and analysis of engineering structures and for crop planning. A rain gauge station located in Trichy district, where agriculture is the prime occupation, is selected for statistical analysis. Daily rainfall data for a period of 30 years are used to characterize the normal rainfall, deficit rainfall, excess rainfall and seasonal rainfall of the selected circle headquarters. Further, the various available plotting-position formulae are used to evaluate the return periods of monthly, seasonal and annual rainfall. This analysis will provide useful information for water resources planners, farmers and urban engineers to assess the availability of water and create storage accordingly. The mean, standard deviation and coefficient of variation of monthly and annual rainfall were calculated to check the rainfall variability. From the calculated results, the rainfall pattern is found to be erratic. The best-fit probability distribution was identified based on the minimum deviation between actual and estimated values. The results of the analysis help determine the proper onset and withdrawal of the monsoon, which is used for land preparation and sowing.
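
    The return-period calculation from a plotting-position formula is compact enough to show directly. A Python sketch using the Weibull plotting position P = m/(n+1), with hypothetical annual rainfall totals:

      import numpy as np

      def weibull_return_periods(annual_rainfall):
          # Rank the annual totals (largest first); exceedance probability m/(n+1).
          x = np.sort(np.asarray(annual_rainfall, float))[::-1]
          m = np.arange(1, x.size + 1)
          p_exceed = m / (x.size + 1.0)
          return x, 1.0 / p_exceed          # value (mm), return period (years)

      vals, T = weibull_return_periods(
          [820, 940, 760, 1010, 890, 700, 980, 850, 1100, 790])
      for v, t in zip(vals, T):
          print(f"{v:6.0f} mm  ->  T = {t:4.1f} yr")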

  14. Surprise responses in the human brain demonstrate statistical learning under high concurrent cognitive demand

    Science.gov (United States)

    Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett

    2016-06-01

    The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than tones in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high relative to low cognitive load. Our findings suggest that statistical learning is not a capacity-limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.
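
    The core claim that an identical outlier is more surprising under a narrow distribution follows directly from the definition of surprise as negative log-probability. A tiny Python illustration with hypothetical distribution widths:

      from scipy.stats import norm

      outlier = 3.0    # the same tone, in standardized frequency units
      for label, sd in [("narrow", 0.5), ("wide", 1.5)]:
          print(label, -norm.logpdf(outlier, loc=0.0, scale=sd))
      # Surprise is larger under the narrow distribution, as observed in the EEG data.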

  15. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    Science.gov (United States)

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  16. QCD Precision Measurements and Structure Function Extraction at a High Statistics, High Energy Neutrino Scattering Experiment: NuSOnG

    International Nuclear Information System (INIS)

    Adams, T.; Batra, P.; Bugel, Leonard G.; Camilleri, Leslie Loris; Conrad, Janet Marie; Fisher, Peter H.; Formaggio, Joseph Angelo; Karagiorgi, Georgia S.; )

    2009-01-01

    We extend the physics case for a new high-energy, ultra-high statistics neutrino scattering experiment, NuSOnG (Neutrino Scattering On Glass), to address a variety of issues including precision QCD measurements, extraction of structure functions, and the derived Parton Distribution Functions (PDFs). This experiment uses a Tevatron-based neutrino beam to obtain a sample of Deep Inelastic Scattering (DIS) events which is over two orders of magnitude larger than past samples. We outline an innovative method for fitting the structure functions using a parameterized energy shift which yields reduced systematic uncertainties. High statistics measurements, in combination with improved systematics, will enable NuSOnG to perform discerning tests of fundamental Standard Model parameters as we search for deviations which may hint at 'Beyond the Standard Model' physics.

  17. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    Science.gov (United States)

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of the study designs used, the statistical tests applied and the choice of statistical analysis programmes helps determine the research activity profile and trends in the country. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. The results indicate that the cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and the statistical software programme SPSS were the most common study design, inferential statistical analysis, and statistical analysis software programme, respectively. These results echo a previously published assessment of these two journals for the year 2014.

  18. Simulation of statistical γ-spectra of highly excited rare earth nuclei

    International Nuclear Information System (INIS)

    Schiller, A.; Munos, G.; Guttormsen, M.; Bergholt, L.; Melby, E.; Rekstad, J.; Siem, S.; Tveter, T.S.

    1997-05-01

    The statistical γ-spectra of highly excited even-even rare earth nuclei are simulated by applying an appropriate level density and strength function to a given nucleus. Hindrance effects due to K-conservation are taken into account. Simulations are compared to experimental data from the ¹⁶³Dy(³He,α)¹⁶²Dy and ¹⁷³Yb(³He,α)¹⁷²Yb reactions. The influence of the K quantum number at higher energies is discussed. 21 refs., 7 figs., 2 tabs

  19. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) level in Mexico. The system was first deployed as a desktop application and then adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  20. Statistical properties of highly excited quantum eigenstates of a strongly chaotic system

    International Nuclear Information System (INIS)

    Aurich, R.; Steiner, F.

    1992-06-01

    Statistical properties of highly excited quantal eigenstates are studied for the free motion (geodesic flow) on a compact surface of constant negative curvature (hyperbolic octagon) which represents a strongly chaotic system (K-system). The eigenstates are expanded in a circular-wave basis, and it turns out that the expansion coefficients behave as Gaussian pseudo-random numbers. It is shown that this property leads to a Gaussian amplitude distribution P(ψ) in the semiclassical limit, i.e. the wavefunctions behave as Gaussian random functions. This behaviour, which should hold for chaotic systems in general, is nicely confirmed for eigenstates lying 10000 states above the ground state thus probing the semiclassical limit. In addition, the autocorrelation function and the path-correlation function are calculated and compared with a crude semiclassical Bessel-function approximation. Agreement with the semiclassical prediction is only found, if a local averaging is performed over roughly 1000 de Broglie wavelengths. On smaller scales, the eigenstates show much more structure than predicted by the first semiclassical approximation. (orig.)

  1. Statistical Study of Transformation Changes in the Ukrainian Economy

    Directory of Open Access Journals (Sweden)

    O. V.

    2017-12-01

    The article presents an economic diagnostics of some important macroeconomic indicators of Ukraine that reveals the nature and speed of the economic transformation. During the period of 2003–2007, the Ukrainian economy grew at an impressive pace. At present, however, the country is undergoing a period of serious trials and needs to address structural problems that endanger long-term economic growth. The way out of the current situation should be the realization of the growth potential of advanced sectors and an increase in productivity across the national economy. Special attention should be paid to the transition from extractive institutions to inclusive ones. Key factors in accelerating the Ukrainian economy are a more vigorous fight against corruption and the attraction of investment. A set of institutional variables is proposed which allows for a more thorough assessment of the nature of the economic transformation in Ukraine and the detection of deviations: the transformation of the national economy occurs at different speeds. Along with the traditional shifts in the structure of GDP (the dominating share of services), there is still an insignificant statistical effect of such important institutional categories as the level of political globalization, the control of corruption, the level of property rights protection, the rule of law, and the level of social globalization.

  2. Statistical Studies of Mesoscale Forecast Models MM5 and WRF

    National Research Council Canada - National Science Library

    Henmi, Teizi

    2004-01-01

    ... models were carried out and the results were compared with surface observation data. Both models tended to overforecast temperature and dew-point temperature, although the correlation coefficients between forecast and observations were fairly high...

  3. The statistical studies of the inner boundary of plasma sheet

    Directory of Open Access Journals (Sweden)

    J. B. Cao

    2011-02-01

    The penetration of plasma sheet ions into the inner magnetosphere is very important to inner magnetospheric dynamics, since plasma sheet ions are one of the major particle sources of the ring current during storm times. However, direct observations of the inner boundary of the plasma sheet are fairly rare due to the limited number of satellites in near-equatorial orbits outside 6.6 RE. In this paper, we used the ion data recorded by TC-1 from 2004 to 2006 to study the distribution of the inner boundary of the ion plasma sheet (IBIPS), and for the first time show the observational distribution of the IBIPS in the equatorial plane. The IBIPS has a dawn-dusk asymmetry, being farthest from the Earth in the 06:00–08:00 LT bin and closest to the Earth in the 18:00–20:00 LT bin. The IBIPS also has a day-night asymmetry, which may be due to the fact that ions on the dayside are exposed for more time to loss mechanisms along their drift paths. The radial distance of the IBIPS generally decreases with increasing Kp index. The mean radial distance of the IBIPS is basically larger than 6.6 RE during quiet times and smaller than 6.6 RE during active times. When the strength of the convection electric field increases, the inward shift of the IBIPS is most significant on the night side (22:00–02:00 LT). For Kp ≤ 0+, only 16% of IBIPSs penetrate inside the geosynchronous orbit. For Kp ≥ 2, however, 70% of IBIPSs penetrate inside the geosynchronous orbit. The IBIPS has weak correlations with the AE and Dst indexes. The average correlation coefficient between Ri and Kp is −0.58, while the correlation coefficient between Ri and AE/Dst is only −0.29/0.17. The correlation coefficients are local time dependent. In particular, Ri and Kp are highly correlated (r = −0.72) in the night sector, meaning that the radial distance of the IBIPS, Ri, in the night sector responds well to the Kp index. These observations indicate that Kp plays a key role in determining the position of

  4. A Study of Faculty Views of Statistics and Student Preparation beyond an Introductory Class

    Science.gov (United States)

    Doehler, Kirsten; Taylor, Laura; Smith, Jessalyn

    2013-01-01

    The purpose of this research is to better understand the role of statistics in teaching and research by faculty from all disciplines and their perceptions of the statistical preparation of their students. This study reports the findings of a survey administered to faculty from seven colleges and universities regarding the use of statistics in…

  5. A Statistical Model for Regional Tornado Climate Studies.

    Directory of Open Access Journals (Sweden)

    Thomas H Jagger

    Tornado reports are locally rare, often clustered, and of variable quality making it difficult to use them directly to describe regional tornado climatology. Here a statistical model is demonstrated that overcomes some of these difficulties and produces a smoothed regional-scale climatology of tornado occurrences. The model is applied to data aggregated at the level of counties. These data include annual population, annual tornado counts and an index of terrain roughness. The model has a term to capture the smoothed frequency relative to the state average. The model is used to examine whether terrain roughness is related to tornado frequency and whether there are differences in tornado activity by County Warning Area (CWA). A key finding is that tornado reports increase by 13% for a two-fold increase in population across Kansas after accounting for improvements in rating procedures. Independent of this relationship, tornadoes have been increasing at an annual rate of 1.9%. Another finding is the pattern of correlated residuals showing more Kansas tornadoes in a corridor of counties running roughly north to south across the west central part of the state consistent with the dryline climatology. The model is significantly improved by adding terrain roughness. The effect amounts to an 18% reduction in the number of tornadoes for every ten meter increase in elevation standard deviation. The model indicates that tornadoes are 51% more likely to occur in counties served by the CWAs of DDC and GID than elsewhere in the state. Flexibility of the model is illustrated by fitting it to data from Illinois, Mississippi, South Dakota, and Ohio.

  6. A Statistical Model for Regional Tornado Climate Studies.

    Science.gov (United States)

    Jagger, Thomas H; Elsner, James B; Widen, Holly M

    2015-01-01

    Tornado reports are locally rare, often clustered, and of variable quality, making it difficult to use them directly to describe regional tornado climatology. Here a statistical model is demonstrated that overcomes some of these difficulties and produces a smoothed regional-scale climatology of tornado occurrences. The model is applied to data aggregated at the level of counties. These data include annual population, annual tornado counts and an index of terrain roughness. The model has a term to capture the smoothed frequency relative to the state average. The model is used to examine whether terrain roughness is related to tornado frequency and whether there are differences in tornado activity by County Warning Area (CWA). A key finding is that tornado reports increase by 13% for a two-fold increase in population across Kansas after accounting for improvements in rating procedures. Independent of this relationship, tornadoes have been increasing at an annual rate of 1.9%. Another finding is the pattern of correlated residuals showing more Kansas tornadoes in a corridor of counties running roughly north to south across the west central part of the state, consistent with the dryline climatology. The model is significantly improved by adding terrain roughness. The effect amounts to an 18% reduction in the number of tornadoes for every ten-meter increase in elevation standard deviation. The model indicates that tornadoes are 51% more likely to occur in counties served by the CWAs of DDC and GID than elsewhere in the state. Flexibility of the model is illustrated by fitting it to data from Illinois, Mississippi, South Dakota, and Ohio.
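
    A minimal sketch of the kind of count regression described above: county tornado counts modeled with a Poisson GLM, with log population and terrain roughness (elevation standard deviation) as predictors. All data and variable names below are hypothetical placeholders; the paper's full model is richer (it includes spatially correlated county effects, omitted here).

```python
# Hedged sketch: Poisson regression of county tornado counts on log
# population and terrain roughness. Simulated data stand in for the
# county-level aggregates described in the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_counties = 105                                   # e.g. Kansas has 105 counties
log_pop = rng.normal(10.0, 1.0, n_counties)        # log annual population
rough = rng.gamma(2.0, 10.0, n_counties)           # elevation std dev (m)
counts = rng.poisson(np.exp(-1.0 + 0.18 * log_pop - 0.02 * rough))

X = sm.add_constant(np.column_stack([log_pop, rough]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.summary())

# A log-population coefficient b implies a 2**b - 1 fractional change in
# reports per doubling of population (the paper reports about +13%).
b = fit.params[1]
print("effect of doubling population: %+.1f%%" % (100 * (2**b - 1)))
```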

  7. AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models

    DEFF Research Database (Denmark)

    Fournier, David A.; Skaug, Hans J.; Ancheta, Johnoel

    2011-01-01

    Many criteria for statistical parameter estimation, such as maximum likelihood, are formulated as a nonlinear optimization problem. Automatic Differentiation Model Builder (ADMB) is a programming framework based on automatic differentiation, aimed at highly nonlinear models with a large number… of such a feature is the generic implementation of the Laplace approximation of high-dimensional integrals for use in latent variable models. We also review the literature in which ADMB has been used, and discuss future development of ADMB as an open source project. Overall, the main advantages of ADMB are flexibility…

  8. High-dimensional data: p >> n in mathematical statistics and bio-medical applications

    OpenAIRE

    Van De Geer, Sara A.; Van Houwelingen, Hans C.

    2004-01-01

    The workshop 'High-dimensional data: p >> n in mathematical statistics and bio-medical applications' was held at the Lorentz Center in Leiden from 9 to 20 September 2002. This special issue of Bernoulli contains a selection of papers presented at that workshop. The introduction of high-throughput micro-array technology to measure gene-expression levels and the publication of the pioneering paper by Golub et al. (1999) has brought to life a whole new branch of data analysis under the name of...

  9. Spin flip statistics and spin wave interference patterns in Ising ferromagnetic films: A Monte Carlo study.

    Science.gov (United States)

    Acharyya, Muktish

    2017-07-01

    The spin wave interference is studied in a two-dimensional Ising ferromagnet driven by two coherent spherical magnetic field waves using Monte Carlo simulation. The spin waves are found to propagate and interfere according to the classic rule of interference patterns generated by two point sources. The interference pattern of the spin wave is observed at one boundary of the lattice. The interference pattern is detected and studied through spin flip statistics at high and low temperatures. Destructive interference is manifested as a large number of spin flips, and vice versa.
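
    A minimal sketch, not the author's code, of the type of simulation described: Metropolis dynamics of a 2D Ising ferromagnet driven by two coherent point-source field waves, with per-site spin-flip counts recorded as the detection statistic. The field form h(r, t) = h0·cos(k|r − r_s| − ωt) and all parameter values are assumptions for illustration.

```python
# Hedged sketch: driven 2D Ising model with two coherent wave sources;
# spin-flip counts per site reveal the interference pattern.
import numpy as np

rng = np.random.default_rng(1)
L, J, T = 32, 1.0, 1.5                      # lattice size, coupling, temperature
h0, k, w = 0.5, 0.7, 0.3                    # wave amplitude, wavenumber, frequency
sources = [(0, L // 4), (0, 3 * L // 4)]    # two sources on one boundary (assumed)

s = rng.choice([-1, 1], size=(L, L))
y, x = np.indices((L, L))
dists = [np.hypot(y - sy, x - sx) for sy, sx in sources]
flips = np.zeros((L, L))                    # spin-flip statistics

for t in range(100):
    h = sum(h0 * np.cos(k * d - w * t) for d in dists)   # superposed field
    for _ in range(L * L):                  # one Metropolis sweep
        i, j = rng.integers(L, size=2)
        nn = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
        dE = 2 * s[i, j] * (J * nn + h[i, j])            # cost of flipping
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j] *= -1
            flips[i, j] += 1
# A map of `flips` traces out the two-source interference pattern.
```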

  10. TRAN-STAT: statistics for environmental studies, No. 21

    International Nuclear Information System (INIS)

    1982-11-01

    Three topics relevant to radionuclide studies are discussed: (1) target and sampled populations; (2) sources of errors in field radionuclide studies; and (3) an objective probability plotting procedure. 19 references, 4 figures, 3 tables

  11. A new statistic for identifying batch effects in high-throughput genomic data that uses guided principal component analysis.

    Science.gov (United States)

    Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E

    2013-11-15

    Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in the two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (available via CRAN) provides functionality and data to perform the methods in this article.
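
    A simplified sketch of the gPCA idea, not the published gPCA R package: compare the variance along the first principal component of Y'X (PCA "guided" by the batch indicator matrix Y) with that of ordinary PCA on X, and calibrate the ratio by permuting batch labels. Data and the injected batch shift are simulated for illustration.

```python
# Hedged sketch of a guided-PCA batch-effect statistic with a
# permutation-based p-value (simplified from Reese et al.).
import numpy as np

def delta_stat(X, batch):
    Xc = X - X.mean(axis=0)
    Y = (batch[:, None] == np.unique(batch)[None, :]).astype(float)
    _, _, Vg = np.linalg.svd(Y.T @ Xc, full_matrices=False)  # guided loadings
    _, _, Vu = np.linalg.svd(Xc, full_matrices=False)        # unguided loadings
    var_g = np.var(Xc @ Vg[0])      # variance along first guided PC
    var_u = np.var(Xc @ Vu[0])      # variance along first unguided PC
    return var_g / var_u            # near 1 when batch drives the top PC

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 500))
batch = np.repeat([0, 1, 2], 20)
X[batch == 1] += 0.5                # injected batch shift

obs = delta_stat(X, batch)
perm = [delta_stat(X, rng.permutation(batch)) for _ in range(500)]
pval = np.mean([p >= obs for p in perm])
print(f"delta = {obs:.3f}, permutation p-value = {pval:.3f}")
```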

  12. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    Science.gov (United States)

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].

  13. A study on the use of Gumbel approximation with the Bernoulli spatial scan statistic.

    Science.gov (United States)

    Read, S; Bath, P A; Willett, P; Maheswaran, R

    2013-08-30

    The Bernoulli version of the spatial scan statistic is a well established method of detecting localised spatial clusters in binary labelled point data, a typical application being the epidemiological case-control study. A recent study suggests the inferential accuracy of several versions of the spatial scan statistic (principally the Poisson version) can be improved, at little computational cost, by using the Gumbel distribution, a method now available in SaTScan(TM) (www.satscan.org). We study in detail the effect of this technique when applied to the Bernoulli version and demonstrate that it is highly effective, albeit with some increase in false alarm rates at certain significance thresholds. We explain how this increase is due to the discrete nature of the Bernoulli spatial scan statistic and demonstrate that it can affect even small p-values. Despite this, we argue that the Gumbel method is actually preferable for very small p-values. Furthermore, we extend previous research by running benchmark trials on 12 000 synthetic datasets, thus demonstrating that the overall detection capability of the Bernoulli version (i.e. ratio of power to false alarm rate) is not noticeably affected by the use of the Gumbel method. We also provide an example application of the Gumbel method using data on hospital admissions for chronic obstructive pulmonary disease. Copyright © 2013 John Wiley & Sons, Ltd.
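
    A small sketch of the Gumbel device discussed above: instead of ranking the observed maximum log-likelihood ratio among, say, 999 Monte Carlo replicates (which cannot yield a p-value below 1/1000), fit a Gumbel distribution to the replicate maxima and read the p-value from its tail. The numbers below are placeholders, not SaTScan output.

```python
# Hedged sketch: Gumbel-smoothed p-value for a scan statistic.
# `replicate_maxima` stands in for maxima computed under H0 by Monte Carlo.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
replicate_maxima = rng.gumbel(loc=8.0, scale=1.2, size=999)  # placeholder H0 maxima
observed = 14.5                                              # observed scan statistic

loc, scale = stats.gumbel_r.fit(replicate_maxima)
p_gumbel = stats.gumbel_r.sf(observed, loc, scale)
p_rank = (1 + np.sum(replicate_maxima >= observed)) / (1 + len(replicate_maxima))
print(f"rank-based p >= {p_rank:.4f}, Gumbel-smoothed p = {p_gumbel:.2e}")
```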

  14. Tabular statistical summay of data analysis - Calawah River Riverscape Study

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The objective of this study was to identify the patterns of juvenile salmonid distribution and relative abundance in relation to habitat correlates. It is the first...

  15. A high-resolution open biomass burning emission inventory based on statistical data and MODIS observations in mainland China

    Science.gov (United States)

    Xu, Y.; Fan, M.; Huang, Z.; Zheng, J.; Chen, L.

    2017-12-01

    Open biomass burning, which has adverse effects on air quality and human health, is an important source of gases and particulate matter (PM) in China. Current emission estimates for open biomass burning are generally based on a single source (either statistical data or satellite-derived data) and thus carry large uncertainties due to data limitations. In this study, to quantify the amount of open biomass burning in 2015, we established a new estimation method for open biomass burning activity levels by combining bottom-up statistical data and top-down MODIS observations, considering three sub-category sources that use different activity data. For open crop residue burning, the "best estimate" of activity data was obtained by averaging the statistical data from China statistical yearbooks and satellite observations from the MODIS burned area product MCD64A1, weighted by their uncertainties. For forest and grassland fires, activity levels were represented by the combination of statistical data and the MODIS active fire product MCD14ML. Using fire radiative power (FRP), which is considered a better indicator of active fire level, as the spatial allocation surrogate, coarse gridded emissions were reallocated onto 3 km × 3 km grids to obtain a high-resolution emission inventory. Our results show that emissions of CO, NOx, SO2, NH3, VOCs, PM2.5, PM10, BC and OC in mainland China were 6607, 427, 84, 79, 1262, 1198, 1222, 159 and 686 Gg/yr, respectively. Among all provinces of China, Henan, Shandong and Heilongjiang were the top three contributors to the total emissions. The open biomass burning emission inventory developed in this study, with its high resolution, can support air quality modeling and policy-making for pollution control.
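
    A tiny worked example of the uncertainty-weighted "best estimate" combination described above: two activity estimates averaged with inverse-variance weights. All numbers are illustrative placeholders, not values from the study.

```python
# Hedged sketch: inverse-variance weighted combination of a bottom-up
# (yearbook) and a top-down (MODIS MCD64A1) activity estimate.
import numpy as np

stat_est, stat_sigma = 120.0, 30.0     # hypothetical yearbook estimate +/- sigma
modis_est, modis_sigma = 90.0, 20.0    # hypothetical satellite estimate +/- sigma

w = np.array([1 / stat_sigma**2, 1 / modis_sigma**2])
best = np.average([stat_est, modis_est], weights=w)
best_sigma = np.sqrt(1 / w.sum())      # uncertainty of the weighted mean
print(f"best estimate = {best:.1f} +/- {best_sigma:.1f}")
```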

  16. Study design and statistical analysis of data in human population studies with the micronucleus assay.

    Science.gov (United States)

    Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano

    2011-01-01

    The most common study design in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies, since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data are addressed in this mini-review, starting from data description, a critical step of statistical analysis that makes it possible to detect errors in the dataset to be analysed and to check the validity of the assumptions required for more complex analyses. Basic issues in the statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
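
    A minimal sketch of the analysis the review recommends: per-subject micronucleus counts modeled with Poisson regression on exposure, adjusting for the usual confounders (age, gender, smoking), with a quick dispersion check. Data are simulated placeholders, not study data.

```python
# Hedged sketch: Poisson regression of MN counts with confounder
# adjustment; Pearson chi2/df near 1 supports the Poisson assumption.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 300
age = rng.uniform(20, 65, n)
smoker = rng.integers(0, 2, n)
exposed = rng.integers(0, 2, n)
mn = rng.poisson(np.exp(0.2 + 0.01 * age + 0.15 * smoker + 0.4 * exposed))

X = sm.add_constant(np.column_stack([age, smoker, exposed]))
fit = sm.GLM(mn, X, family=sm.families.Poisson()).fit()
print("exposure rate ratio:", np.exp(fit.params[3]))
print("dispersion (Pearson chi2/df):", fit.pearson_chi2 / fit.df_resid)
```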

  17. Application of the Thomas-Fermi statistical model to the thermodynamics of high density matter

    International Nuclear Information System (INIS)

    Martin, R.

    1977-01-01

    The Thomas-Fermi statistical model, from the N-body point of view, is used to obtain systematic corrections to the Thomas-Fermi equation. Approximate calculation methods are derived from an analytic study of the Thomas-Fermi equation at nonzero temperature. The Thomas-Fermi equation is solved with the code "Golem", written in FORTRAN V (UNIVAC). The code also provides the thermodynamic quantities and a new method to calculate several isothermal tables. (author) [es]

  18. Application of the Thomas-Fermi statistical model to the thermodynamics of high density matter

    International Nuclear Information System (INIS)

    Martin, R.

    1977-01-01

    The Thomas-Fermi statistical model, from the N-body point of view, is used to obtain systematic corrections to the Thomas-Fermi equation. Approximate calculation methods are derived from an analytic study of the Thomas-Fermi equation at nonzero temperature. The Thomas-Fermi equation is solved with the code GOLEM, written in FORTRAN V (UNIVAC). The code also provides the thermodynamic quantities and a new method to calculate several isothermal tables. (Author) 24 refs

  19. Statistical methods for studying the evolution of networks and behavior

    NARCIS (Netherlands)

    Schweinberger, Michael

    2007-01-01

    Studying longitudinal network and behavior data is important for understanding social processes, because human beings are interrelated, and the relationships among human beings (human networks) on one hand and human behavior on the other hand are not independent. The complex nature of longitudinal

  20. Regularization design for high-quality cone-beam CT of intracranial hemorrhage using statistical reconstruction

    Science.gov (United States)

    Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.

    2016-03-01

    Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost, potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity in spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.

  1. High order statistical signatures from source-driven measurements of subcritical fissile systems

    International Nuclear Information System (INIS)

    Mattingly, J.K.

    1998-01-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements
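
    A numerical illustration of the kind of higher-order counting signature described above: reduced factorial cumulants of detector counts in fixed time gates. For a pure Poisson (non-multiplying) source these vanish, while correlated fission-chain counts make them positive, with higher orders more sensitive. Counts here are simulated stand-ins, not measurement data.

```python
# Hedged sketch: second- and third-order reduced factorial cumulants
# (the second is the familiar Feynman-Y excess variance).
import numpy as np

def reduced_factorial_cumulants(n):
    m = n.mean()
    k2 = n.var()                        # second cumulant
    k3 = ((n - m) ** 3).mean()          # third cumulant
    y2 = (k2 - m) / m                   # Feynman-Y: 0 for Poisson counts
    y3 = (k3 - 3 * k2 + 2 * m) / m      # third-order analog: 0 for Poisson
    return y2, y3

rng = np.random.default_rng(5)
poisson_counts = rng.poisson(5.0, 100_000)          # non-multiplying source
chains = rng.poisson(2.0, 100_000)                  # chains per gate
correlated = rng.poisson(2.5 * chains)              # compound Poisson: clustered
for label, n in [("Poisson", poisson_counts), ("correlated", correlated)]:
    print(label, "Y2=%.3f Y3=%.3f" % reduced_factorial_cumulants(n))
```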

  2. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    Full Text Available In this research, the statistical variations in subthreshold MOSFET's high frequency characteristics, defined in terms of gate capacitance and transition frequency, have been analyzed, and comprehensive analytical models of such variations in terms of their variances have been proposed. Major imperfections in the physical level properties, including random dopant fluctuation and effects of variations in the MOSFET manufacturing process, have been taken into account in the proposed analysis and modeling. An up-to-date comprehensive analytical model of statistical variation in MOSFET parameters has been used as the basis of the analysis and modeling. The resulting models have been found to be both analytic and comprehensive, as they are precise mathematical expressions in terms of physical level variables of the MOSFET. Furthermore, they have been verified at the nanometer level by using 65 nm level BSIM4 based benchmarks and have been found to be very accurate, with average percentage errors smaller than 5%. Hence, the performed analysis yields models which can serve as a potential mathematical tool for the statistical and variability aware analysis and design of subthreshold MOSFET based VHF circuits, systems and applications.

  3. Study on statistical analysis of nonlinear and nonstationary reactor noises

    International Nuclear Information System (INIS)

    Hayashi, Koji

    1993-03-01

    For the purpose of identifying nonlinear mechanisms and diagnosing nuclear reactor systems, analysis methods for nonlinear reactor noise have been studied. By adding a newly developed approximate response function to GMDH, a conventional nonlinear identification method, a useful method for nonlinear spectral analysis and identification of nonlinear mechanisms has been established. A measurement experiment and analysis were performed on the reactor power oscillation observed in the NSRR installed at JAERI, and the cause of the instability was clarified. Furthermore, analysis and data recording methods for nonstationary noise have been studied. By improving the time resolution of the instantaneous autoregressive spectrum, a method for monitoring and diagnosis of the operational status of a nuclear reactor has been established. A preprocessing system for recording of nonstationary reactor noise was developed and its usability was demonstrated through a measurement experiment. (author) 139 refs

  4. Case study of degeneracy in quantum statistics. II

    International Nuclear Information System (INIS)

    Barker, W.A.; Raney, D.; Sy, J.

    1988-01-01

    In an earlier paper, classical and weak degeneracy, with the activity parameter r between 0 and 1, are investigated. In this paper, intermediate and strong degeneracy, with r > 1, are studied. Coefficients for the Joyce–Dixon-type series are calculated for bosons as well as fermions in one, two, and three dimensions, for either nonrelativistic or ultrarelativistic particles. The theory is applied to the electrons and Cooper pairs in mercury at four temperatures.

  5. Statistical Study of Solar Dimmings Using CoDiT

    International Nuclear Information System (INIS)

    Krista, Larisza D.; Reinard, Alysha A.

    2017-01-01

    We present the results from analyzing the physical and morphological properties of 154 dimmings (transient coronal holes) and the associated flares and coronal mass ejections (CMEs). Each dimming in our 2013 catalog was processed with the semi-automated Coronal Dimming Tracker using Solar Dynamics Observatory AIA 193 Å observations and HMI magnetograms. Instead of the typically used difference images, we used our coronal hole detection algorithm to detect transient dark regions “directly” in extreme ultraviolet (EUV) images. This allowed us to study dimmings as the footpoints of CMEs—in contrast with the larger, diffuse dimmings seen in difference images that represent the projected view of the rising, expanding plasma. Studying the footpoint-dimming morphology allowed us to better understand the CME structure in the low corona. While comparing the physical properties of dimmings, flares, and CMEs we were also able to identify relationships between the different parts of this complex eruptive phenomenon. We found that larger dimmings are longer-lived, suggesting that it takes longer to “close down” large open magnetic regions. Also, during their growth phase, smaller dimmings acquire a higher magnetic flux imbalance (i.e., become more unipolar) than larger dimmings. Furthermore, we found that the EUV intensity of dimmings (indicative of local electron density) correlates with how much plasma was removed and how energetic the eruption was. Studying the morphology of dimmings (single, double, fragmented) also helped us identify different configurations of the quasi-open magnetic field.

  6. Statistical Study of Solar Dimmings Using CoDiT

    Energy Technology Data Exchange (ETDEWEB)

    Krista, Larisza D.; Reinard, Alysha A., E-mail: larisza.krista@noaa.gov [University of Colorado/Cooperative Institute for Research in Environmental Sciences, Boulder, CO 80205 (United States)

    2017-04-10

    We present the results from analyzing the physical and morphological properties of 154 dimmings (transient coronal holes) and the associated flares and coronal mass ejections (CMEs). Each dimming in our 2013 catalog was processed with the semi-automated Coronal Dimming Tracker using Solar Dynamics Observatory AIA 193 Å observations and HMI magnetograms. Instead of the typically used difference images, we used our coronal hole detection algorithm to detect transient dark regions “directly” in extreme ultraviolet (EUV) images. This allowed us to study dimmings as the footpoints of CMEs—in contrast with the larger, diffuse dimmings seen in difference images that represent the projected view of the rising, expanding plasma. Studying the footpoint-dimming morphology allowed us to better understand the CME structure in the low corona. While comparing the physical properties of dimmings, flares, and CMEs we were also able to identify relationships between the different parts of this complex eruptive phenomenon. We found that larger dimmings are longer-lived, suggesting that it takes longer to “close down” large open magnetic regions. Also, during their growth phase, smaller dimmings acquire a higher magnetic flux imbalance (i.e., become more unipolar) than larger dimmings. Furthermore, we found that the EUV intensity of dimmings (indicative of local electron density) correlates with how much plasma was removed and how energetic the eruption was. Studying the morphology of dimmings (single, double, fragmented) also helped us identify different configurations of the quasi-open magnetic field.

  7. Statistical Study of Turbulence: Spectral Functions and Correlation Coefficients

    Science.gov (United States)

    Frenkiel, Francois N.

    1958-01-01

    In reading the publications on turbulence by different authors, one often runs the risk of confusing the various correlation coefficients and turbulence spectra. We have made a point of defining, by appropriate concepts, the differences which exist between these functions. We also introduce in the symbols a few new characteristics of turbulence. In the first chapter, we study some relations between the correlation coefficients and the different turbulence spectra. Certain relations are given by means of demonstrations which could be called intuitive rather than mathematical. In this way we demonstrate that the correlation coefficients between the simultaneous turbulent velocities at two points are identical, whether studied in Lagrange's or in Euler's system. We then consider new spectra of turbulence, obtained by studying the simultaneous velocities along a straight line of given direction. We determine some relations between these spectra and the correlation coefficients. Examining the relation between the spectrum of the turbulence measured at a fixed point and the longitudinal-correlation curve given by G. I. Taylor, we find that this equation is exact only when the coefficient is very small.
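
    A small numerical illustration of the central relation between correlation functions and spectra (the Wiener–Khinchin theorem): the power spectrum of a stationary record equals the Fourier transform of its autocorrelation. The synthetic record below stands in for a measured turbulent velocity signal.

```python
# Hedged sketch: the spectrum computed directly from a record agrees
# with the spectrum computed from its (circular) autocorrelation.
import numpy as np

rng = np.random.default_rng(6)
N = 4096
u = np.convolve(rng.normal(size=N), np.ones(20) / 20, mode="same")  # correlated record
u -= u.mean()

# Route 1: periodogram (direct spectrum of the record)
S_direct = np.abs(np.fft.rfft(u)) ** 2 / N

# Route 2: Fourier transform of the circular autocorrelation
acf = np.fft.irfft(np.abs(np.fft.rfft(u)) ** 2) / N
S_from_acf = np.fft.rfft(acf).real

print(np.allclose(S_direct, S_from_acf))   # True: the two routes agree
```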

  8. Statistical approach to predict compressive strength of high workability slag-cement mortars

    International Nuclear Information System (INIS)

    Memon, N.A.; Memon, N.A.; Sumadi, S.R.

    2009-01-01

    This paper reports an attempt to develop empirical expressions to estimate/predict the compressive strength of high workability slag-cement mortars. Experimental data from 54 mortar mixes were used. The mortars were prepared with slag as cement replacement of the order of 0, 50 and 60%. The flow (workability) was maintained at 136 ± 3%. The numerical and statistical analysis was performed using the database computer software Microsoft Office Excel 2003. Three empirical mathematical models were developed to estimate/predict the 28-day compressive strength of high workability slag-cement mortars with 0, 50 and 60% slag, which predict the values with an accuracy between 97 and 98%. Finally, a generalized empirical mathematical model was proposed which can predict the 28-day compressive strength of high workability mortars with a degree of accuracy up to 95%. (author)
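
    A hedged sketch of the kind of empirical model described: a quadratic least-squares fit of 28-day compressive strength against slag replacement level. The data points below are invented placeholders, not the paper's measurements.

```python
# Hedged sketch: quadratic regression of strength on slag replacement.
import numpy as np

slag = np.array([0, 0, 50, 50, 60, 60], dtype=float)      # % cement replaced
strength = np.array([52.0, 50.5, 46.0, 47.2, 41.8, 43.0]) # MPa, hypothetical

A = np.column_stack([np.ones_like(slag), slag, slag**2])
coef, *_ = np.linalg.lstsq(A, strength, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((strength - pred) ** 2) / np.sum((strength - strength.mean()) ** 2)
print("strength ~ %.2f + %.3f*s + %.5f*s^2, R^2 = %.3f" % (*coef, r2))
```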

  9. Infrared maritime target detection using the high order statistic filtering in fractional Fourier domain

    Science.gov (United States)

    Zhou, Anran; Xie, Weixin; Pei, Jihong

    2018-06-01

    Accurate detection of maritime targets in infrared imagery under various sea clutter conditions is always a challenging task. The fractional Fourier transform (FRFT) is the extension of the Fourier transform to fractional orders, and has richer spatial-frequency information. By combining it with high order statistic filtering, a new ship detection method is proposed. First, the proper range of the angle parameter is determined to make it easier to separate the ship components from the background. Second, a new high order statistic curve (HOSC) at each fractional frequency point is designed. It is proved that the maximal peak interval in the HOSC reflects the target information, while the points outside the interval reflect the background, and the HOSC value for the ship is much larger than that for the sea clutter. The curve's maximal target peak interval is then located and extracted by bandpass filtering in the fractional Fourier domain. The value outside the peak interval of the HOSC decreases rapidly to 0, so the background is effectively suppressed. Finally, the detection result is obtained by double-threshold segmentation and a target region selection method. The results show the proposed method is excellent for maritime target detection under strong clutter.

  10. Data analysis in high energy physics. A practical guide to statistical methods

    International Nuclear Information System (INIS)

    Behnke, Olaf; Schoerner-Sadenius, Thomas; Kroeninger, Kevin; Schott, Gregory

    2013-01-01

    This practical guide covers the essential tasks in statistical data analysis encountered in high energy physics and provides comprehensive advice for typical questions and problems. The basic methods for inferring results from data are presented as well as tools for advanced tasks such as improving the signal-to-background ratio, correcting detector effects, determining systematics and many others. Concrete applications are discussed in analysis walkthroughs. Each chapter is supplemented by numerous examples and exercises and by a list of literature and relevant links. The book targets a broad readership at all career levels - from students to senior researchers.

  11. Statistical Methods for Comparative Phenomics Using High-Throughput Phenotype Microarrays

    KAUST Repository

    Sturino, Joseph

    2010-01-24

    We propose statistical methods for comparing phenomics data generated by the Biolog Phenotype Microarray (PM) platform for high-throughput phenotyping. Instead of the routinely used visual inspection of data with no sound inferential basis, we develop two approaches. The first approach is based on quantifying the distance between mean or median curves from two treatments and then applying a permutation test; we also consider a permutation test applied to areas under mean curves. The second approach employs functional principal component analysis. Properties of the proposed methods are investigated on both simulated data and data sets from the PM platform.
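
    A minimal sketch of the first approach described: quantify the distance between mean phenotype-microarray curves from two treatments and assess it with a permutation test on curve labels. Curves are simulated stand-ins for Biolog PM kinetic data.

```python
# Hedged sketch: permutation test on the distance between mean curves.
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 48, 49)                           # hours
base = 1 / (1 + np.exp(-(t - 20) / 4))               # logistic growth template
A = base + rng.normal(0, 0.05, (12, t.size))         # 12 replicates, treatment A
B = 1.15 * base + rng.normal(0, 0.05, (12, t.size))  # treatment B, higher plateau

def dist(curves_a, curves_b):
    return np.mean((curves_a.mean(0) - curves_b.mean(0)) ** 2)

obs = dist(A, B)
pooled = np.vstack([A, B])
perm = []
for _ in range(2000):                                # relabel and recompute
    idx = rng.permutation(len(pooled))
    perm.append(dist(pooled[idx[:12]], pooled[idx[12:]]))
p = (1 + np.sum(np.array(perm) >= obs)) / (1 + len(perm))
print(f"distance = {obs:.5f}, permutation p = {p:.4f}")
```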

  12. Statistical studies of vertical and horizontal earthquake spectra

    Energy Technology Data Exchange (ETDEWEB)

    Hall, W.J.; Mohraz, B.; Newmark, N.M.

    1976-01-01

    The study reveals that there is no well-defined dependence of normalized seismic design response spectra on the earthquake ground acceleration level. Recommendations for horizontal design response spectra are close to those given in Regulatory Guide 1.60. Recommendations for vertical response spectra are somewhat lower than Regulatory Guide 1.60 provisions in the frequency range of approximately 2 to 30 Hz. The results are based on seismic information recorded along the west coast of the United States and are directly applicable to that region only.

  13. A statistical study on scaling factors for radionuclide assay

    International Nuclear Information System (INIS)

    Ahn, Sang Myun

    1993-02-01

    To comply with the classification requirements listed in 10 CFR 61, operators of nuclear power plants are recommended to identify and quantify the concentrations of several nuclides in low-level radioactive wastes (LLWs). Many of the specified radionuclides cannot be easily measured in routine plant analyses. Many indirect methods have been suggested to determine the radionuclide concentrations upon which the waste classification is based. Such indirect methods include the use of scaling factors, which infer the concentration of one radionuclide from another that can be measured easily. In this study, correlation analysis is performed to find the important variables. Regression equations are developed to provide a means of indirectly determining the concentrations of the difficult-to-measure nuclides, based on the results of the correlation analysis. Residual analysis and the corresponding stepwise procedure then follow to check the regression model and select the best regression equation. A regression equation whose log mean dispersion is smaller than 10 is suggested as an appropriate correlation formula. Most of the quadratic regression equations turn out to be usable as correlation formulas. TRUs, however, show log mean dispersions much larger than 10. It is concluded that the mechanisms of their formation and disappearance are much more complex, and it is also difficult to select the key nuclide. In the case of TRUs, further study is required to find the relevant correlation formula.
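
    A small sketch of the scaling-factor idea: regress the log concentration of a hard-to-measure nuclide on that of an easy-to-measure key nuclide and judge the fit by the dispersion of the residuals. The nuclide pair and all activities below are fabricated placeholders for illustration.

```python
# Hedged sketch: log-log scaling-factor regression with a dispersion
# measure in the spirit of the "log mean dispersion" criterion.
import numpy as np

rng = np.random.default_rng(8)
log_co60 = rng.normal(2.0, 1.0, 80)                      # log10 activity, key nuclide
log_ni63 = -0.5 + 0.9 * log_co60 + rng.normal(0, 0.3, 80)  # hard-to-measure nuclide

slope, intercept = np.polyfit(log_co60, log_ni63, 1)
resid = log_ni63 - (intercept + slope * log_co60)
dispersion = 10 ** resid.std()       # multiplicative scatter about the fit
print(f"log SF model: slope={slope:.2f}, intercept={intercept:.2f}, "
      f"dispersion factor={dispersion:.2f}")
```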

  14. Some statistical design and analysis aspects for NAEG studies

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Eberhardt, L.L.

    1975-01-01

    Some of the design and analysis aspects of the NAEG studies at safety-shot sites are reviewed in conjunction with discussions of possible new approaches. The use of double sampling to estimate inventories is suggested as a means of obtaining data for estimating the geographical distribution of plutonium using computer contouring programs. The lack of estimates of error for plutonium contours is noted, and a regression approach is discussed for obtaining such estimates. The kinds of new data that are now available for analysis from A site of Area 11 and the four Tonopah Test Range (TTR) sites are outlined, and the need for a closer look at methods for analyzing ratio-type data is pointed out. The necessity for thorough planning of environmental sampling programs is emphasized in order to obtain the maximum amount of information for fixed cost. Some general planning aspects of new studies at nuclear sites and experimental clean-up plots are discussed, as is the planning of interlaboratory comparisons. (U.S.)

  15. Studies on quantum field theory and statistical mechanics

    International Nuclear Information System (INIS)

    Zhang, S.

    1987-01-01

    This dissertation is a summary of research in various areas of theoretical physics and is divided into three parts. In the first part, quantum fluctuations of the recently proposed superconducting cosmic strings are studied. It is found that vortices on the string world sheet represent an important class of fluctuation modes which tend to disorder the system. Both heuristic arguments and detailed renormalization group analysis reveal that these vortices do not appear in bound pairs but rather form a gas of free vortices. Based on this observation we argue that this fluctuation mode violates the topological conservation law on which superconductivity is based. Anomalies and topological aspects of supersymmetric quantum field theories are studied in the second part of this dissertation. Using the superspace formulation of the N = 1 spinning string, we obtain a path integral measure which is free from the world-sheet general coordinate as well as the supersymmetry anomalies and therefore determine the conformal anomaly and critical dimension of the spinning string. We also apply Fujikawa's formalism to compute the chiral anomaly in conformal as well as ordinary supergravity. Finally, we give a Noether-method construction of the supersymmetrized Chern-Simons term in five-dimensional supergravity. In the last part of this dissertation, the soliton excitations in the quarter-filled Peierls-Hubbard model are investigated in both the large and the small U limit. For a strictly one-dimensional system at zero temperature, we find that solitons in both limits are in one-to-one correspondence, while in the presence of weak three-dimensional couplings or at finite temperature, the large U systems differ qualitatively from the small U systems in that the spin associated with the solitons ceases to be a sharp quantum observable.

  16. Statistical study on the thyroid disorders on Sudanese female undergoing in vitro investigations in Khartoum state

    International Nuclear Information System (INIS)

    Albaba, O. S. A.

    2002-10-01

    In this study, 711 Sudanese females were analyzed for thyroid function. The measured thyroid-related hormones were thyroxine (T4), triiodothyronine (T3) and thyroid-stimulating hormone (TSH). The study was conducted over one complete year. The female subjects were referred to the Sudan Atomic Energy Commission radioimmunoassay laboratory from different hospitals in Khartoum state. The age of the females varied from less than one year up to 70 years, and was divided into 10-year intervals in order to identify the dominant thyroid disorder in each interval. The Statistical Package for the Social Sciences (SPSS) program was used as the data analysis tool. The clearest observation from this study was the high incidence of disorders in the age range between 20 and 40 years. (Author)

  17. Statistical study on cancer patients of cancer research hospital

    International Nuclear Information System (INIS)

    Shim, Yoon Sang; Choi, Soo Yong; Won, Hyuk; Kim, Kee Hwa

    1991-01-01

    The total number of malignant neoplasms included in this study was 7,787 cases (10.4%) among 74,928 cases over 2 years. By sex, females (57.6%) considerably outnumbered males (42.4%). The highest proportion of cancer patients was in the 50-59 age group. The most frequent primary site among males was found to be stomach with 36.2%, followed by liver (12.3%), lung (12.2%), esophagus (15.5%) and larynx (4.9%). In females, the first order was uterine cervix with 47.3%. The most common type of morphology of malignant neoplasms was adenocarcinoma (39.0%) in males and squamous cell carcinoma (56.2%) in females. Among the cancer patients initially diagnosed in this hospital, the proportion of malignant neoplasms by the extent of disease was 4.6% for patients with carcinoma-in-situ, 76.3% for patients with localized involvement, 11.6% for patients with regional involvement and 7.5% for patients with distant involvement. Among the cancer patients initially treated in this hospital, the proportion of malignant neoplasms by the method of treatment was 19.0% for surgery, 27.7% for radiotherapy and 24.2% for chemotherapy. Among the cancer patients confirmed by medical records, 11.2% were traced for more than 5 years. (Author)

  18. Task-based statistical image reconstruction for high-quality cone-beam CT

    Science.gov (United States)

    Dang, Hao; Webster Stayman, J.; Xu, Jennifer; Zbijewski, Wojciech; Sisniega, Alejandro; Mow, Michael; Wang, Xiaohui; Foos, David H.; Aygun, Nafi; Koliatsos, Vassilis E.; Siewerdsen, Jeffrey H.

    2017-11-01

    Task-based analysis of medical imaging performance underlies many ongoing efforts in the development of new imaging systems. In statistical image reconstruction, regularization is often formulated in terms to encourage smoothness and/or sharpness (e.g. a linear, quadratic, or Huber penalty) but without explicit formulation of the task. We propose an alternative regularization approach in which a spatially varying penalty is determined that maximizes task-based imaging performance at every location in a 3D image. We apply the method to model-based image reconstruction (MBIR—viz., penalized weighted least-squares, PWLS) in cone-beam CT (CBCT) of the head, focusing on the task of detecting a small, low-contrast intracranial hemorrhage (ICH), and we test the performance of the algorithm in the context of a recently developed CBCT prototype for point-of-care imaging of brain injury. Theoretical predictions of local spatial resolution and noise are computed via an optimization by which regularization (specifically, the quadratic penalty strength) is allowed to vary throughout the image to maximize the local task-based detectability index (d′). Simulation studies and test-bench experiments were performed using an anthropomorphic head phantom. Three PWLS implementations were tested: a conventional (constant) penalty; a certainty-based penalty derived to enforce a constant point-spread function (PSF); and the task-based penalty derived to maximize local detectability at each location. Conventional (constant) regularization exhibited a fairly strong degree of spatial variation in d′, and the certainty-based method achieved a uniform PSF, but each exhibited a reduction in detectability compared to the task-based method, which improved detectability up to ~15%. The improvement was strongest in areas of high attenuation (skull base), where the conventional and certainty-based methods tended to over-smooth the data. The task-driven reconstruction method presents a

  19. Statistical study on cancer patients of Korea cancer centre hospital

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Soo Yong; Kim, Kee Hwa; Mok, Kang Sung [Korea Cancer Center Hospital of Korea Atomic Energy Research Institute, Seoul (Korea, Republic of)

    1994-12-01

    The total number of malignant neoplasms included in this study was 53,566 cases (14.1%) among 379,582 patients from 1984 to 1993. By sex, females (51.3%) outnumbered males (48.7%). The highest proportion of cancer patients by age was in the 50-59 age group, with 35.0% in males and 28.4% in females. The most frequent primary site among males was found to be stomach with 33.2%, followed by liver (15.1%), lung (14.9%), esophagus (5.3%) and larynx (3.3%). In females, the first order was uterine cervix with 37.8%, followed by stomach (16.5%), breast (14.8%), thyroid gland (4.3%) and lung (3.8%). The proportion of malignant neoplasms diagnosed by histology made up 67.0%, whereas 20.2% were diagnosed by clinical investigation (X-ray, CT, MRI etc.). Among the cancer patients initially diagnosed in this hospital, the proportion of malignant neoplasms by the extent of disease was 3.7% for patients with carcinoma-in-situ, 58.7% for patients with localized involvement, 18.4% for patients with regional involvement and 11.1% for patients with distant involvement. Among the cancer patients initially treated in this hospital, the proportion of malignant neoplasms by the method of treatment was 27.5% for surgery, 22.5% for radiotherapy and 30.1% for chemotherapy. The proportion of cancer patients traced to death was only 3.6% (1,944 cases). Among them, 72.5% survived for less than 1 year. 17 figs, 7 tabs, 28 refs. (Author).

  20. Statistical study on cancer patients of cancer research hospital

    International Nuclear Information System (INIS)

    Shim, Yun Sang; Choi, Soo Yong; Kim, Ki Wha; Kang, Sung Mok

    1993-01-01

    The total number of malignant neoplasms included in this study was 15,737 cases (11.8%) among 133,251 cases over 3 years. By sex, females (52.9%) outnumbered males (47.1%). The highest proportion of cancer patients by age was in the 50-59 age group, with 33.7% in males and 28.5% in females. The most frequent primary site among males was found to be stomach with 35.5%, followed by liver (14.7%), lung (13.0%), esophagus (5.4%) and colon (3.2%). In females, the first order was uterine cervix with 40.6%, followed by stomach (17.2%), breast (14.4%), rectum (3.7%) and lung (3.4%). The most common type of morphology of malignant neoplasms was adenocarcinoma (47.4%) in males and squamous cell carcinoma (58.0%) in females. Among the cancer patients initially diagnosed in this hospital, the proportion of malignant neoplasms by the extent of disease was 2.5% for patients with carcinoma-in-situ, 54.1% for patients with localized involvement, 13.3% for patients with regional involvement and 8.5% for patients with distant involvement. Among the cancer patients initially treated in this hospital, the proportion of malignant neoplasms by the method of treatment was 23.6% for surgery, 25.3% for radiotherapy and 30.3% for chemotherapy. Among the cancer patients confirmed by medical records, 7.7% were traced for more than 5 years. (Author)

  1. Statistical study on cancer patients of Korea cancer centre hospital

    International Nuclear Information System (INIS)

    Choi, Soo Yong; Kim, Kee Hwa; Kang Sung Mok

    1994-12-01

    The total number of malignant neoplasms included in this study was 53,566 cases (14.1%) among 379,582 patients from 1984 to 1993. By sex, females (51.3%) outnumbered males (48.7%). The highest proportion of cancer patients by age was in the 50-59 age group, with 35.0% in males and 28.4% in females. The most frequent primary site among males was found to be stomach with 33.2%, followed by liver (15.1%), lung (14.9%), esophagus (5.3%) and larynx (3.3%). In females, the first order was uterine cervix with 37.8%, followed by stomach (16.5%), breast (14.8%), thyroid gland (4.3%) and lung (3.8%). The proportion of malignant neoplasms diagnosed by histology made up 67.0%, whereas 20.2% were diagnosed by clinical investigation (X-ray, CT, MRI etc.). Among the cancer patients initially diagnosed in this hospital, the proportion of malignant neoplasms by the extent of disease was 3.7% for patients with carcinoma-in-situ, 58.7% for patients with localized involvement, 18.4% for patients with regional involvement and 11.1% for patients with distant involvement. Among the cancer patients initially treated in this hospital, the proportion of malignant neoplasms by the method of treatment was 27.5% for surgery, 22.5% for radiotherapy and 30.1% for chemotherapy. The proportion of cancer patients traced to death was only 3.6% (1,944 cases). Among them, 72.5% survived for less than 1 year. 17 figs, 7 tabs, 28 refs. (Author)

  2. Statistical and Methodological Considerations for the Interpretation of Intranasal Oxytocin Studies.

    Science.gov (United States)

    Walum, Hasse; Waldman, Irwin D; Young, Larry J

    2016-02-01

    Over the last decade, oxytocin (OT) has received focus in numerous studies associating intranasal administration of this peptide with various aspects of human social behavior. These studies in humans are inspired by animal research, especially in rodents, showing that central manipulations of the OT system affect behavioral phenotypes related to social cognition, including parental behavior, social bonding, and individual recognition. Taken together, these studies in humans appear to provide compelling, but sometimes bewildering, evidence for the role of OT in influencing a vast array of complex social cognitive processes in humans. In this article, we investigate to what extent the human intranasal OT literature lends support to the hypothesis that intranasal OT consistently influences a wide spectrum of social behavior in humans. We do this by considering statistical features of studies within this field, including factors like statistical power, prestudy odds, and bias. Our conclusion is that intranasal OT studies are generally underpowered and that there is a high probability that most of the published intranasal OT findings do not represent true effects. Thus, the remarkable reports that intranasal OT influences a large number of human social behaviors should be viewed with healthy skepticism, and we make recommendations to improve the reliability of human OT studies in the future. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
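
    A worked version of the article's statistical argument (in the spirit of Ioannidis, 2005): the probability that a "significant" finding is true (positive predictive value, PPV) as a function of statistical power, the prestudy odds R that a probed effect is real, and bias u. Parameter values below are illustrative, not estimates from the paper.

```python
# Hedged sketch: PPV of a significant result given power, prestudy odds,
# and bias, following the standard Ioannidis-style calculation.
def ppv(power, R, alpha=0.05, bias=0.0):
    """Positive predictive value of a 'significant' result."""
    true_pos = (power + bias * (1 - power)) * R    # true effects found significant
    false_pos = alpha + bias * (1 - alpha)         # null effects found significant
    return true_pos / (true_pos + false_pos)

# Well-powered, unbiased field vs. an underpowered field with some bias:
print(f"power=0.80, R=0.5:           PPV = {ppv(0.80, 0.5):.2f}")
print(f"power=0.15, R=0.25, bias=0.1: PPV = {ppv(0.15, 0.25, bias=0.1):.2f}")
```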

  3. BEAGLE: an application programming interface and high-performance computing library for statistical phylogenetics.

    Science.gov (United States)

    Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A

    2012-01-01

    Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.

  4. A statistical study of the maxillofacial diseases by radiograms

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yoo Tai; Lee, Sang Chull [College of Dentistry, Kyunghee University, Seoul (Korea, Republic of)

    1974-11-15

    This report is based on 300 cases of serious diseases in the maxillofacial region, examined by radiograms at the department of dental radiodontics, infirmary of the school of dentistry, Kyung Hee University, from October 1971 to August 1974. The maxillofacial diseases were analysed with respect to the following items: 1) the frequency of dominant diseases, 2) the sex ratio of male to female, 3) the predominant region of the diseases, 4) comparison by age, 5) the incidence of diseases relative to the individual teeth. The results were obtained as follows. 1) Among the total of 300 patients, the most frequent diseases were fractures of the facial bone (44.3 ± 2.87%), inflammatory diseases (22.7 ± 2.39%), cysts (11.1 ± 1.62%), tumors (10.7 ± 1.77%), maxillary sinusitis (7.9 ± 1.56%) and temporomandibular joint disorders (3.3 ± 1.05%), in that order. 2) The sex ratio of male to female in occurrence was 7.3:1 for jaw fractures, 2.1:1 for temporomandibular joint disorders, 1.8:1 for inflammatory diseases and 1.7:1 for maxillary sinusitis, but 1:1 for tumors and 1:1.2 for cysts. 3) The predominant regions of mandibular fractures were the symphysis (17.3 ± 3.27%), canine region (15.0 ± 3.09%), and angle region (14.3 ± 3.04%), in that order. Inflammatory diseases occurred frequently in the mandible, with the left side slightly dominant. Odontogenic cysts were observed frequently in the maxilla, regardless of side. Carcinomas involved the maxilla most frequently, while sarcomas and ameloblastomas involved the mandible. Maxillary sinusitis was dominant on the right side and in the molar area, and temporomandibular joint disorders were also dominant on the right side. 4) In the comparison by age, jaw fractures showed the highest ratio in the 2nd decade (32.3 ± 4.06%), followed by the 3rd decade (27.8 ± 3.89%), 4th decade (19.6 ± 3.44%), 6th decade (9.0 ± 2.47%), 5th decade (6.0 ± 2.06%) and 1st decade (5.3 ± 1

  5. Potential errors and misuse of statistics in studies on leakage in endodontics.

    Science.gov (United States)

    Lucena, C; Lopez, J M; Pulgar, R; Abalos, C; Valderrama, M J

    2013-04-01

    To assess the quality of the statistical methodology used in studies of leakage in Endodontics, and to compare the results found using appropriate versus inappropriate inferential statistical methods. The search strategy used the descriptors 'root filling', 'microleakage', 'dye penetration', 'dye leakage', 'polymicrobial leakage' and 'fluid filtration' for the time interval 2001-2010 in journals within the categories 'Dentistry, Oral Surgery and Medicine' and 'Materials Science, Biomaterials' of the Journal Citation Report. All retrieved articles were reviewed to find potential pitfalls in statistical methodology that may be encountered during study design, data management or data analysis. The database included 209 papers. In all the studies reviewed, the statistical methods used were appropriate for the category attributed to the outcome variable, but in 41% of the cases the chi-square test or parametric methods were subsequently selected inappropriately. In 2% of the papers, no statistical test was used. In 99% of cases, a statistically 'significant' or 'not significant' effect was reported as a main finding, whilst only 1% also presented an estimation of the magnitude of the effect. When the appropriate statistical methods were applied in the studies with originally inappropriate data analysis, the conclusions changed in 19% of the cases. Statistical deficiencies in leakage studies may affect their results and interpretation and might be one of the reasons for the poor agreement amongst the reported findings. Therefore, more effort should be made to standardize statistical methodology. © 2012 International Endodontic Journal.
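
    An illustration of one pitfall of the kind flagged above: applying the chi-square test to a small-sample leakage table where its large-sample approximation is poor, with Fisher's exact test as the usual fallback. The counts are hypothetical, not data from the review.

```python
# Hedged sketch: chi-square vs. Fisher's exact test on a small 2x2
# leakage table (leakage / no-leakage for two sealer groups).
import numpy as np
from scipy import stats

table = np.array([[8, 2],
                  [3, 7]])
chi2, p_chi2, _, expected = stats.chi2_contingency(table)
_, p_fisher = stats.fisher_exact(table)
print("min expected cell count:", expected.min())   # < 5 -> chi-square suspect
print(f"chi-square p = {p_chi2:.3f}, Fisher exact p = {p_fisher:.3f}")
```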

  6. High-resolution Statistics of Solar Wind Turbulence at Kinetic Scales Using the Magnetospheric Multiscale Mission

    Energy Technology Data Exchange (ETDEWEB)

    Chasapis, Alexandros; Matthaeus, W. H.; Parashar, T. N.; Maruca, B. A. [University of Delaware, Newark, DE (United States); Fuselier, S. A.; Burch, J. L. [Southwest Research Institute, San Antonio, TX (United States); Phan, T. D. [Space Sciences Laboratory, University of California, Berkeley, CA (United States); Moore, T. E.; Pollock, C. J.; Gershman, D. J. [NASA Goddard Space Flight Center, Greenbelt, MD (United States); Torbert, R. B. [University of New Hampshire, Durham, NH (United States); Russell, C. T.; Strangeway, R. J., E-mail: chasapis@udel.edu [University of California, Los Angeles, CA (United States)

    2017-07-20

    Using data from the Magnetospheric Multiscale (MMS) and Cluster missions obtained in the solar wind, we examine second-order and fourth-order structure functions at varying spatial lags normalized to ion inertial scales. The analysis includes direct two-spacecraft results and single-spacecraft results employing the familiar Taylor frozen-in flow approximation. Several familiar statistical results, including the spectral distribution of energy and the scale-dependent kurtosis, are extended down to unprecedented spatial scales of ∼6 km, approaching electron scales. The Taylor approximation is also confirmed at those small scales, although small deviations are present in the kinetic range. The kurtosis is seen to attain very high values at sub-proton scales, supporting the previously reported suggestion that monofractal behavior may be due to high-frequency plasma waves at kinetic scales.
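
    A minimal sketch of the diagnostics named above: second- and fourth-order structure functions of a field component and the scale-dependent kurtosis S4/S2², which grows at small lags for intermittent signals and relaxes toward the Gaussian value 3 at large lags. The signal here is a synthetic stand-in, not spacecraft data.

```python
# Hedged sketch: structure functions and scale-dependent kurtosis.
import numpy as np

rng = np.random.default_rng(9)
b = np.cumsum(rng.standard_t(df=5, size=200_000))   # heavy-tailed increments

def structure_functions(x, lag):
    dx = x[lag:] - x[:-lag]                         # increments at this lag
    return np.mean(dx**2), np.mean(dx**4)           # S2 and S4

for lag in (1, 10, 100, 1000):
    s2, s4 = structure_functions(b, lag)
    print(f"lag={lag:5d}  S2={s2:10.3g}  kurtosis={s4 / s2**2:6.2f}")
```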

  7. An instrument for the high-statistics measurement of plastic scintillating fibers

    International Nuclear Information System (INIS)

    Buontempo, S.; Ereditato, A.; Marchetti-Stasi, F.; Riccardi, F.; Strolin, P.

    1994-01-01

    There is today widespread use of plastic scintillating fibers in particle physics, mainly for calorimetric and tracking applications. In the case of calorimeters, we have to cope with very massive detectors and a large quantity of scintillating fibers. The CHORUS Collaboration has built a new detector to search for ν_μ-ν_τ oscillations in the CERN neutrino beam. A crucial role in the detector is played by the high-energy-resolution calorimeter. For its construction more than 400 000 scintillating plastic fibers were used. In this paper we report on the design and performance of a new instrument for the high-statistics measurement of the fiber properties, in terms of light yield and light attenuation length. The instrument was successfully used to test about 3% of the total number of fibers before the construction of the calorimeter.
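
    A per-fiber measurement of this kind typically reduces to fitting an exponential attenuation law to amplitude-versus-distance data. The sketch below is purely illustrative: the numbers are invented, and the single-exponential model I(x) = I0*exp(-x/λ) is the simplest common choice, not necessarily the instrument's actual fit model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Invented amplitude-versus-distance data for one fiber.
    x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])        # distance along fiber [m]
    I = np.array([95.0, 82.0, 70.5, 61.0, 52.5, 45.0])  # measured amplitude [a.u.]

    def attenuation(x, I0, lam):
        # Single-exponential attenuation law: I(x) = I0 * exp(-x / lam).
        return I0 * np.exp(-x / lam)

    (I0, lam), _ = curve_fit(attenuation, x, I, p0=(100.0, 3.0))
    print(f"light yield I0 = {I0:.1f} a.u., attenuation length = {lam:.2f} m")
    ```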

  8. The nano-mechanical signature of Ultra High Performance Concrete by statistical nanoindentation techniques

    International Nuclear Information System (INIS)

    Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois

    2008-01-01

    Advances in engineering the microstructure of cementitious composites have led to the development of fiber-reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; second, to upscale those nanoscale properties, by means of continuum micromechanics, to the macroscopic scale of engineering applications. In particular, a combined investigation of nanoindentation, scanning electron microscopy (SEM) and X-ray diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect free. On this basis, a four-level multiscale model with defect-free interfaces makes it possible to accurately determine the composite stiffness from the measured nano-mechanical properties. Besides evidencing the dominant role of high-density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites
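
    The deconvolution step in statistical nanoindentation is often implemented as a mixture-model fit to a large grid of indentation moduli. The sketch below shows the idea under simplified assumptions: synthetic moduli, three Gaussian phases, and illustrative phase labels rather than the paper's fitted values.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Synthetic indentation moduli from three hypothetical phases [GPa].
    rng = np.random.default_rng(1)
    moduli = np.concatenate([
        rng.normal(30, 4, 400),    # e.g. low-density C-S-H
        rng.normal(45, 5, 300),    # e.g. high-density C-S-H
        rng.normal(120, 15, 100),  # e.g. residual clinker
    ]).reshape(-1, 1)

    # Deconvolute the distribution of moduli into three Gaussian phases.
    gmm = GaussianMixture(n_components=3, random_state=0).fit(moduli)
    for mean, weight in zip(gmm.means_.ravel(), gmm.weights_):
        print(f"phase mean = {mean:6.1f} GPa, volume fraction = {weight:.2f}")
    ```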

  9. Quantitative study on the statistical properties of fibre architecture of genuine and numerical composite microstructures

    DEFF Research Database (Denmark)

    Hansen, Jens Zangenberg; Brøndsted, Povl

    2013-01-01

    A quantitative study is carried out regarding the statistical properties of the fibre architecture found in composite laminates and that generated numerically using Statistical Representative Volume Elements (SRVEs). The aim is to determine the reliability and consistency of SRVEs for representing the fibre architecture of genuine composite microstructures.

  10. A study of statistics anxiety levels of graduate dental hygiene students.

    Science.gov (United States)

    Welch, Paul S; Jacks, Mary E; Smiley, Lynn A; Walden, Carolyn E; Clark, William D; Nguyen, Carol A

    2015-02-01

    In light of the increased emphasis on evidence-based practice in the profession of dental hygiene, it is important that today's dental hygienist comprehend statistical measures to fully understand research articles, and thereby apply scientific evidence to practice. Therefore, the purpose of this study was to investigate statistics anxiety among graduate dental hygiene students in the U.S. A web-based, anonymous self-report survey was emailed to the directors of 17 MSDH programs in the U.S. with a request to distribute it to graduate students. The survey collected data on statistics anxiety, sociodemographic characteristics and evidence-based practice. Statistics anxiety was assessed using the Statistical Anxiety Rating Scale. The study significance level was α = 0.05. Only 8 of the 17 invited programs participated in the study. Statistical Anxiety Rating Scale data revealed that graduate dental hygiene students experience low to moderate levels of statistics anxiety. Specifically, the level of anxiety on the Interpretation Anxiety factor indicated this population could struggle with making sense of scientific research. A decisive majority (92%) of students indicated statistics is essential for evidence-based practice and should be a required course for all dental hygienists. This study served to identify statistics anxiety in a previously unexplored population. The findings should be useful in both theory building and in practical applications. Furthermore, the results can be used to direct future research. Copyright © 2015 The American Dental Hygienists' Association.

  11. Estimating annual high-flow statistics and monthly and seasonal low-flow statistics for ungaged sites on streams in Alaska and conterminous basins in Canada

    Science.gov (United States)

    Wiley, Jeffrey B.; Curran, Janet H.

    2003-01-01

    Methods for estimating daily mean flow-duration statistics for seven regions in Alaska and low-flow frequencies for one region, southeastern Alaska, were developed from daily mean discharges for streamflow-gaging stations in Alaska and conterminous basins in Canada. The 15-, 10-, 9-, 8-, 7-, 6-, 5-, 4-, 3-, 2-, and 1-percent duration flows were computed for the October-through-September water year for 222 stations in Alaska and conterminous basins in Canada. The 98-, 95-, 90-, 85-, 80-, 70-, 60-, and 50-percent duration flows were computed for the individual months of July, August, and September for 226 stations in Alaska and conterminous basins in Canada. The 98-, 95-, 90-, 85-, 80-, 70-, 60-, and 50-percent duration flows were computed for the season July-through-September for 65 stations in southeastern Alaska. The 7-day, 10-year and 7-day, 2-year low-flow frequencies for the season July-through-September were computed for 65 stations for most of southeastern Alaska. Low-flow analyses were limited to particular months or seasons in order to omit winter low flows, when ice effects reduce the quality of the records and validity of statistical assumptions. Regression equations for estimating the selected high-flow and low-flow statistics for the selected months and seasons for ungaged sites were developed from an ordinary-least-squares regression model using basin characteristics as independent variables. Drainage area and precipitation were significant explanatory variables for high flows, and drainage area, precipitation, mean basin elevation, and area of glaciers were significant explanatory variables for low flows. The estimating equations can be used at ungaged sites in Alaska and conterminous basins in Canada where streamflow regulation, streamflow diversion, urbanization, and natural damming and releasing of water do not affect the streamflow data for the given month or season. Standard errors of estimate ranged from 15 to 56 percent for high-duration flow

  12. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    Science.gov (United States)

    Potter, James Thomson, III

    2012-01-01

    Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and builds on the need for more investigation into online teaching and learning in specific content (Ferdig et al, 2009; DiPietro,…

  13. Mask effects on cosmological studies with weak-lensing peak statistics

    International Nuclear Information System (INIS)

    Liu, Xiangkun; Pan, Chuzhong; Fan, Zuhui; Wang, Qiao

    2014-01-01

    With numerical simulations, we analyze in detail how the bad data removal, i.e., the mask effect, can influence the peak statistics of the weak-lensing convergence field reconstructed from the shear measurement of background galaxies. It is found that high peak fractions are systematically enhanced because of the presence of masks; the larger the masked area is, the higher the enhancement is. In the case where the total masked area is about 13% of the survey area, the fraction of peaks with signal-to-noise ratio ν ≥ 3 is ∼11% of the total number of peaks, compared with ∼7% in the mask-free case in our considered cosmological model. This can have significant effects on cosmological studies with weak-lensing convergence peak statistics, inducing a large bias in the parameter constraints if the effects are not taken into account properly. Even for a survey area of 9 deg², the bias in (Ω_m, σ_8) is already intolerably large and close to 3σ. It is noted that most of the affected peaks are close to the masked regions. Therefore, excluding peaks in those regions in the peak statistics can reduce the bias effect but at the expense of losing usable survey areas. Further investigations find that the enhancement of the number of high peaks around the masked regions can be largely attributed to the smaller number of galaxies usable in the weak-lensing convergence reconstruction, leading to higher noise than that of the areas away from the masks. We thus develop a model in which we exclude only those very large masks with radius larger than 3' but keep all the other masked regions in peak counting statistics. For the remaining part, we treat the areas close to and away from the masked regions separately with different noise levels. It is shown that this two-noise-level model can account for the mask effect on peak statistics very well, and the bias in cosmological parameters is significantly reduced if this model is applied in the parameter fitting.

  14. A Statistical Study of Serum Cholesterol Level by Gender and Race.

    Science.gov (United States)

    Tharu, Bhikhari Prasad; Tsokos, Chris P

    2017-07-25

    Cholesterol level (CL) is a growing health concern, since it is considered one of the causes of heart disease. A study of cholesterol level can provide insight into its nature and characteristics. A cross-sectional study. The National Health and Nutrition Examination Survey (NHANES) II was conducted on a probability sample of approximately 28,000 persons in the USA, and cholesterol level was obtained from laboratory results. Samples were selected so as to include certain population groups thought to be at high risk of malnutrition. The study included 11,864 persons for CL cases, with 9,602 males and 2,262 females, across the races: whites, blacks, and others. Non-parametric statistical tests and goodness-of-fit tests were used to identify probability distributions. The study concludes that cholesterol level exhibits significant racial and gender differences in terms of probability distributions, which were identified as lognormal and gamma: white people appear to be at relatively higher risk than black people of having risk-line and high-risk cholesterol levels, black males normally have higher cholesterol, and females have lower variation in cholesterol than males.
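
    The distribution-identification step can be illustrated with a small goodness-of-fit comparison. This is a hedged sketch on simulated data, not NHANES values: it fits lognormal and gamma models and compares them with a Kolmogorov-Smirnov test.

    ```python
    import numpy as np
    from scipy import stats

    # Simulated cholesterol-level sample [mg/dL]; parameters are invented.
    rng = np.random.default_rng(2)
    chol = rng.lognormal(mean=5.3, sigma=0.2, size=1000)

    # Fit candidate distributions (location fixed at zero).
    shape_ln, loc_ln, scale_ln = stats.lognorm.fit(chol, floc=0)
    shape_g, loc_g, scale_g = stats.gamma.fit(chol, floc=0)

    # Compare goodness of fit with Kolmogorov-Smirnov tests.
    ks_ln = stats.kstest(chol, "lognorm", args=(shape_ln, loc_ln, scale_ln))
    ks_g = stats.kstest(chol, "gamma", args=(shape_g, loc_g, scale_g))
    print(f"lognormal: KS D = {ks_ln.statistic:.3f}, p = {ks_ln.pvalue:.3f}")
    print(f"gamma:     KS D = {ks_g.statistic:.3f}, p = {ks_g.pvalue:.3f}")
    ```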

  15. Mathematical problem solving ability of sport students in the statistical study

    Science.gov (United States)

    Sari, E. F. P.; Zulkardi; Putri, R. I. I.

    2017-12-01

    This study aims to determine the problem-solving ability of semester V sport students of PGRI Palembang in the statistics course. The subjects in this study were the 31 semester V sport students of PGRI Palembang. The research method used is a quasi-experiment of the one-shot case study type. Data were collected using a test, and the data analysis used quantitative descriptive statistics. The study concludes that the mathematical problem-solving ability of the semester V sport students of PGRI Palembang in the statistics course is categorized as good, with an average final test score of 80.3.

  16. Statistical characteristics of transient enclosure voltage in ultra-high-voltage gas-insulated switchgear

    Science.gov (United States)

    Cai, Yuanji; Guan, Yonggang; Liu, Weidong

    2017-06-01

    Transient enclosure voltage (TEV), which is a phenomenon induced by the inner dielectric breakdown of SF6 during disconnector operations in a gas-insulated switchgear (GIS), may cause issues relating to shock hazard and electromagnetic interference to secondary equipment. This is a critical factor regarding the electromagnetic compatibility of ultra-high-voltage (UHV) substations. In this paper, the statistical characteristics of TEV at UHV level are collected from field experiments, and are analyzed and compared to those from a repeated strike process. The TEV waveforms during disconnector operations are recorded by a self-developed measurement system first. Then, statistical characteristics, such as the pulse number, duration of pulses, frequency components, magnitude and single pulse duration, are extracted. The transmission line theory is introduced to analyze the TEV and is validated by the experimental results. Finally, the relationship between the TEV and the repeated strike process is analyzed. This proves that the pulse voltage of the TEV is proportional to the corresponding breakdown voltage. The results contribute to the definition of the standard testing waveform of the TEV, and can aid the protection of electronic devices in substations by minimizing the threat of this phenomenon.

  17. Statistical methods for elimination of guarantee-time bias in cohort studies: a simulation study

    Directory of Open Access Journals (Sweden)

    In Sung Cho

    2017-08-01

    Full Text Available Abstract Background Aspirin has been considered to be beneficial in preventing cardiovascular diseases and cancer. Several pharmaco-epidemiology cohort studies have shown protective effects of aspirin on diseases using various statistical methods, with the Cox regression model being the most commonly used approach. However, there are some inherent limitations to the conventional Cox regression approach, such as guarantee-time bias, resulting in an overestimation of the drug effect. To overcome such limitations, alternative approaches, such as the time-dependent Cox model and landmark methods, have been proposed. This study aimed to compare the performance of three methods: Cox regression, the time-dependent Cox model and the landmark method with different landmark times, in order to address the problem of guarantee-time bias. Methods Through statistical modeling and simulation studies, the performance of the above three methods was assessed in terms of type I error, bias, power, and mean squared error (MSE). In addition, the three statistical approaches were applied to a real data example from the Korean National Health Insurance Database. The effect of cumulative rosiglitazone dose on the risk of hepatocellular carcinoma was used as an example for illustration. Results In the simulated data, time-dependent Cox regression outperformed the landmark method in terms of bias and mean squared error, but the type I error rates were similar. The results from the real-data example showed the same patterns as the simulation findings. Conclusions While both the time-dependent Cox regression model and landmark analysis are useful in resolving the problem of guarantee-time bias, time-dependent Cox regression is the most appropriate method for analyzing cumulative dose effects in pharmaco-epidemiological studies.
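
    The key device in the time-dependent Cox approach is splitting each subject's follow-up at the moment exposure starts, so that unexposed person-time is never credited to the treated group. A minimal sketch using the lifelines package on simulated data (hypothetical variable names, no true drug effect) could look like this:

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    # Simulate 200 subjects whose event times are independent of exposure.
    rng = np.random.default_rng(3)
    rows = []
    for i in range(200):
        t_drug = rng.exponential(3.0)     # time until first drug exposure
        t_event = rng.exponential(6.0)    # event time (no true drug effect)
        end = min(t_event, 10.0)          # administrative censoring at 10 years
        event = int(t_event <= 10.0)
        if t_drug < end:                  # split follow-up at exposure start
            rows.append((i, 0.0, t_drug, 0, 0))
            rows.append((i, t_drug, end, 1, event))
        else:                             # never exposed during follow-up
            rows.append((i, 0.0, end, 0, event))
    df = pd.DataFrame(rows, columns=["id", "start", "stop", "exposed", "event"])

    ctv = CoxTimeVaryingFitter()
    ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
    ctv.print_summary()   # hazard ratio for "exposed" should be near 1
    ```

    A naive ever-exposed/never-exposed Cox fit on the same data would show a spurious protective effect, because subjects must survive long enough to become exposed; the interval encoding removes exactly that guarantee time.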

  18. Statistical testing and power analysis for brain-wide association study.

    Science.gov (United States)

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depression disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
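
    For orientation, the two baseline corrections the paper compares against can be reproduced in a few lines. This sketch uses simulated p-values, not fMRI data, and ignores the spatial structure that the random-field method exploits.

    ```python
    import numpy as np
    from statsmodels.stats.multitest import multipletests

    # 9000 true nulls plus 1000 p-values enriched near zero.
    rng = np.random.default_rng(4)
    pvals = np.concatenate([rng.uniform(size=9000), rng.beta(0.1, 1.0, size=1000)])

    rej_bonf, _, _, _ = multipletests(pvals, alpha=0.05, method="bonferroni")
    rej_fdr, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
    print(f"Bonferroni rejections: {rej_bonf.sum()}")   # conservative FWER control
    print(f"BH-FDR rejections:     {rej_fdr.sum()}")    # more powerful FDR control
    ```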

  19. Sub-ionospheric VLF signal anomaly due to geomagnetic storms: a statistical study

    Directory of Open Access Journals (Sweden)

    K. Tatsuta

    2015-11-01

    Full Text Available We investigate quantitatively the effect of geomagnetic storms on sub-ionospheric VLF/LF (Very Low Frequency / Low Frequency) propagation at different latitudes, based on 2 years of nighttime data from the Japanese VLF/LF observation network. Three statistical parameters, the average signal amplitude, the variability of the signal amplitude, and the nighttime fluctuation, were calculated daily for 2 years for 16–21 independent VLF/LF transmitter–receiver propagation paths consisting of three transmitters and seven receiving stations. These propagation paths are suitable for simultaneously studying high-latitude, low-mid-latitude and mid-latitude D/E-region ionospheric properties. We found that these three statistical parameters show significant anomalies, exceeding at least twice their standard deviation from the mean value, during geomagnetic storm periods in the high-latitude paths, with an anomaly occurrence rate between 40 and 50 %, presumably due to auroral energetic electron precipitation. The mid-latitude and low-mid-latitude paths are less influenced by geomagnetic activity, with a lower occurrence rate of anomalies even during geomagnetically active periods (from 20 to 30 %). Anomalies outside geomagnetic storm periods may be of atmospheric and/or lithospheric origin. The statistical occurrence rates of ionospheric anomalies for different latitudinal paths during geomagnetic storm and non-storm time periods are basic and important information, not only for identifying space weather effects on the lower ionosphere depending on latitude but also for separating the various external physical causes of lower ionospheric disturbances.

  20. Statistical methods for the analysis of high-throughput metabolomics data

    Directory of Open Access Journals (Sweden)

    Fabian J. Theis

    2013-01-01

    Full Text Available Metabolomics is a relatively new high-throughput technology that aims at measuring all endogenous metabolites within a biological sample in an unbiased fashion. The resulting metabolic profiles may be regarded as functional signatures of the physiological state, and have been shown to comprise effects of genetic regulation as well as environmental factors. This potential to connect genotypic to phenotypic information promises new insights and biomarkers for different research fields, including biomedical and pharmaceutical research. In the statistical analysis of metabolomics data, many techniques from other omics fields can be reused. Recently, however, a number of tools specific to metabolomics data have been developed as well. The focus of this mini review will be on recent advancements in the analysis of metabolomics data, especially those utilizing Gaussian graphical models and independent component analysis.
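
    As a pointer to what a Gaussian graphical model analysis involves, the sketch below estimates a sparse precision matrix over simulated "metabolite" data with scikit-learn and converts it to partial correlations; the sample sizes and the planted association are invented for illustration.

    ```python
    import numpy as np
    from sklearn.covariance import GraphicalLassoCV

    # Simulated data: rows are samples, columns are metabolites.
    rng = np.random.default_rng(5)
    X = rng.standard_normal((200, 10))
    X[:, 1] += 0.8 * X[:, 0]            # plant one direct association

    # Sparse inverse covariance: nonzero off-diagonals = direct associations.
    model = GraphicalLassoCV().fit(X)
    prec = model.precision_

    # Convert the precision matrix to partial correlations.
    d = np.sqrt(np.diag(prec))
    partial_corr = -prec / np.outer(d, d)
    np.fill_diagonal(partial_corr, 1.0)
    print(np.round(partial_corr[:3, :3], 2))
    ```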

  1. Geant4 electromagnetic physics for high statistic simulation of LHC experiments

    CERN Document Server

    Allison, J; Bagulya, A; Champion, C; Elles, S; Garay, F; Grichine, V; Howard, A; Incerti, S; Ivanchenko, V; Jacquemier, J; Maire, M; Mantero, A; Nieminen, P; Pandola, L; Santin, G; Sawkey, D; Schalicke, A; Urban, L

    2012-01-01

    An overview of the current status of electromagnetic (EM) physics of the Geant4 toolkit is presented. Recent improvements are focused on the performance of large-scale production for the LHC and on the precision of simulation results over a wide energy range. Significant efforts have been made to improve the accuracy without compromising CPU speed for EM particle transport. New biasing options have been introduced, which are applicable to any EM process. These include algorithms to enhance and suppress processes, force interactions or split secondary particles. It is shown that the performance of the EM sub-package is improved. We also report extensions of the testing suite allowing high-statistics validation of EM physics. It includes validation of multiple scattering, bremsstrahlung and other models. Cross checks between standard and low-energy EM models have been performed using evaluated data libraries and reference benchmark results.

  2. The Statistical Analysis of Relation between Compressive and Tensile/Flexural Strength of High Performance Concrete

    Directory of Open Access Journals (Sweden)

    Kępniak M.

    2016-12-01

    Full Text Available This paper addresses the tensile and flexural strength of HPC (high performance concrete). The aim of the paper is to analyse the efficiency of the models proposed in different codes. In particular, three design procedures are considered: those of ACI 318 [1], Eurocode 2 [2] and the Model Code 2010 [3]. The design tensile strengths of concrete obtained from these three codes as functions of compressive strength are compared, using statistical tools, with experimental results for tensile strength and flexural strength. The experimental tensile strengths were obtained in the splitting test. Based on this comparison, conclusions are drawn regarding the fit between the design methods and the test data. The comparison shows that the tensile strength and flexural strength of HPC depend on more factors than compressive strength alone.
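
    For context, such code relations tie tensile strength to compressive strength through simple power laws. The sketch below uses frequently quoted forms (an ACI-style 0.62*sqrt(fc) rupture expression and an EC2-style 0.30*fck^(2/3) mean tensile strength, the latter normally limited to classes up to about C50/60); the exact coefficients and validity ranges should be taken from the governing documents, especially for HPC, where the paper finds such relations insufficient.

    ```python
    import numpy as np

    # Compressive strengths typical of HPC [MPa].
    fc = np.array([60.0, 80.0, 100.0])

    # Frequently quoted code-style relations (assumptions here; check the
    # governing documents for exact coefficients and ranges of validity).
    f_t_aci = 0.62 * np.sqrt(fc)          # ACI-style modulus-of-rupture form
    f_t_ec2 = 0.30 * fc ** (2.0 / 3.0)    # EC2-style mean tensile strength form

    for f, a, e in zip(fc, f_t_aci, f_t_ec2):
        print(f"fc = {f:5.1f} MPa   ACI-style: {a:4.2f} MPa   EC2-style: {e:4.2f} MPa")
    ```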

  3. Dissipative Effects on Inertial-Range Statistics at High Reynolds Numbers.

    Science.gov (United States)

    Sinhuber, Michael; Bewley, Gregory P; Bodenschatz, Eberhard

    2017-09-29

    Using the unique capabilities of the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, Göttingen, we report experimental measurements in classical grid turbulence that uncover oscillations of the velocity structure functions in the inertial range. This was made possible by measuring extremely long time series of up to 10^10 samples of the turbulent fluctuating velocity, which corresponds to O(10^7) integral length scales. The measurements were conducted in a well-controlled environment at a wide range of high Reynolds numbers from R_λ = 110 up to R_λ = 1600, using both traditional hot-wire probes as well as the nanoscale thermal anemometry probe developed at Princeton University. An implication of the observed oscillations is that dissipation influences the inertial-range statistics of turbulent flows at scales significantly larger than predicted by current models and theories.

  4. Data analysis in high energy physics a practical guide to statistical methods

    CERN Document Server

    Behnke, Olaf; Kröninger, Kevin; Schott, Grégory; Schörner-Sadenius, Thomas

    2013-01-01

    This practical guide covers the most essential statistics-related tasks and problems encountered in high-energy physics data analyses. It addresses both advanced students entering the field of particle physics and researchers looking for a reliable source on optimal separation of signal and background, determining signals or estimating upper limits, correcting the data for detector effects and evaluating systematic uncertainties. Each chapter is dedicated to a single topic and supplemented by a substantial number of both paper and computer exercises related to real experiments, with the solutions provided at the end of the book along with references. A special feature of the book is the analysis walk-throughs used to illustrate the application of the methods discussed beforehand. The authors give examples of data analysis, referring to real problems in HEP, and display the different stages of data analysis in a descriptive manner. The accompanying website provides more algorithms as well as up-to-date...

  5. A multi-scale and model approach to estimate future tidal high water statistics in the southern German Bight

    Science.gov (United States)

    Hein, H.; Mai, S.; Mayer, B.; Pohlmann, T.; Barjenbruch, U.

    2012-04-01

    The interactions of tides, external surges, storm surges and waves, together with the coastal bathymetry, define the probability of extreme water levels at the coast. Probabilistic analyses and also process-based numerical models allow the estimation of future states. From the physical point of view, both deterministic processes and stochastic residuals are the fundamentals of high water statistics. This study uses a so-called model chain to reproduce historic statistics of tidal high water levels (Thw) as well as to predict future Thw statistics. The results of the numerical models are post-processed by a stochastic analysis. Recent studies show that for future extrapolation of extreme Thw, nonstationary parametric approaches are required. With the presented methods a better prediction of time-dependent parameter sets seems possible. The investigation region of this study is the southern German Bight. The model chain is the representation of a downscaling process, which starts with an emissions scenario. Regional atmospheric and ocean models refine the results of global climate models. The concept of downscaling was chosen to resolve the coastal topography sufficiently. The North Sea and estuaries are modeled with the three-dimensional HAMburg Shelf Ocean Model. The simulated period spans 150 years (1950-2100). Results of four different hindcast runs and also of one future prediction run are validated. Based on multi-scale analysis and the theory of entropy, we analyze whether any significant periodicities are represented numerically. Results show that hindcasting the climate of Thw with a model chain for the last 60 years is itself a challenging task. For example, an additional modeling activity must be the inclusion of tides in regional climate ocean models. It is found that the statistics of climate variables derived from model results differ from the statistics derived from measurements. E.g. there are considerable shifts in

  6. A Descriptive Study of Individual and Cross-Cultural Differences in Statistics Anxiety

    Science.gov (United States)

    Baloglu, Mustafa; Deniz, M. Engin; Kesici, Sahin

    2011-01-01

    The present study investigated individual and cross-cultural differences in statistics anxiety among 223 Turkish and 237 American college students. A 2 x 2 between-subjects factorial multivariate analysis of covariance (MANCOVA) was performed on the six dependent variables which are the six subscales of the Statistical Anxiety Rating Scale.…

  7. DATA MINING AND STATISTICS METHODS USAGE FOR ADVANCED TRAINING COURSES QUALITY MEASUREMENT: CASE STUDY

    Directory of Open Access Journals (Sweden)

    Maxim I. Galchenko

    2014-01-01

    Full Text Available In this article we consider a case of analysing data connected with educational statistics, namely the results of a survey of professional development course students, using specialized software. The need for expanded statistical processing of the results is shown, along with the scheme for carrying out the analysis. Conclusions on the studied case are presented.

  8. An Exploratory Study of Taiwanese Mathematics Teachers' Conceptions of School Mathematics, School Statistics, and Their Differences

    Science.gov (United States)

    Yang, Kai-Lin

    2014-01-01

    This study used phenomenography, a qualitative method, to investigate Taiwanese mathematics teachers' conceptions of school mathematics, school statistics, and their differences. To collect data, we interviewed five mathematics teachers using open questions. They also responded to statements drawn on mathematical/statistical conceptions and…

  9. The Effect of "Clickers" on Attendance in an Introductory Statistics Course: An Action Research Study

    Science.gov (United States)

    Amstelveen, Raoul H.

    2013-01-01

    The purpose of this study was to design and implement a Classroom Response System, also known as a "clicker," to increase attendance in introductory statistics courses at an undergraduate university. Since 2010, non-attendance had been prevalent in introductory statistics courses. Moreover, non-attendance created undesirable classrooms…

  10. A Quantitative Comparative Study of Blended and Traditional Models in the Secondary Advanced Placement Statistics Classroom

    Science.gov (United States)

    Owens, Susan T.

    2017-01-01

    Technology is becoming an integral tool in the classroom and can make a positive impact on how the students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistics students, comparing Educational Testing Service (ETS) College Board AP Statistics examination scores…

  11. Statistical dynamic image reconstruction in state-of-the-art high-resolution PET

    International Nuclear Information System (INIS)

    Rahmim, Arman; Cheng, J-C; Blinder, Stephan; Camborde, Maurie-Laure; Sossi, Vesna

    2005-01-01

    Modern high-resolution PET is now more than ever in need of scrutiny into the nature and limitations of the imaging modality itself as well as image reconstruction techniques. In this work, we have reviewed, analysed and addressed the following three considerations within the particular context of state-of-the-art dynamic PET imaging: (i) the typical average numbers of events per line-of-response (LOR) are now (much) less than unity; (ii) due to the physical and biological decay of the activity distribution, one requires robust and efficient reconstruction algorithms applicable to a wide range of statistics; and (iii) the computational considerations in dynamic imaging are much enhanced (i.e., more frames to be stored and reconstructed). Within the framework of statistical image reconstruction, we have argued theoretically and shown experimentally that the sinogram non-negativity constraint (when using the delayed-coincidence and/or scatter-subtraction techniques) is especially expected to result in an overestimation bias. Subsequently, two schemes are considered: (a) subtraction techniques in which an image non-negativity constraint has been imposed and (b) implementation of random and scatter estimates inside the reconstruction algorithms, thus enabling direct processing of Poisson-distributed prompts. Both techniques are able to remove the aforementioned bias, while the latter, being better conditioned theoretically, is able to exhibit superior noise characteristics. We have also elaborated upon and verified the applicability of the accelerated list-mode image reconstruction method as a powerful solution for accurate, robust and efficient dynamic reconstructions of high-resolution data (as well as a number of additional benefits in the context of state-of-the-art PET)
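
    The second scheme, modelling randoms and scatter inside the algorithm rather than pre-subtracting them, corresponds to adding a background term to the Poisson forward model. A toy sketch of the resulting MLEM update on a made-up 1D system (not a real scanner geometry) illustrates the mechanics:

    ```python
    import numpy as np

    # Toy 1D problem: A is the system matrix, y are Poisson prompts including
    # a known mean background r (randoms + scatter), modelled in the algorithm
    # instead of being subtracted from the sinogram (no negative bins).
    rng = np.random.default_rng(6)
    n_pix, n_lor = 16, 32
    A = rng.uniform(size=(n_lor, n_pix))
    x_true = rng.uniform(1.0, 5.0, size=n_pix)
    r = 0.5 * np.ones(n_lor)
    y = rng.poisson(A @ x_true + r)          # low-count prompts

    # MLEM with additive background: x <- x/sens * A^T( y / (A x + r) ).
    x = np.ones(n_pix)
    sens = A.T @ np.ones(n_lor)
    for _ in range(50):
        x *= (A.T @ (y / (A @ x + r))) / sens

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```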

  12. Statistical surrogate models for prediction of high-consequence climate change.

    Energy Technology Data Exchange (ETDEWEB)

    Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick

    2011-09-01

    In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. A SSM is different than a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from a SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.

  13. Statistics of vacuum breakdown in the high-gradient and low-rate regime

    Science.gov (United States)

    Wuensch, Walter; Degiovanni, Alberto; Calatroni, Sergio; Korsbäck, Anders; Djurabekova, Flyura; Rajamäki, Robin; Giner-Navarro, Jorge

    2017-01-01

    In an increasing number of high-gradient linear accelerator applications, accelerating structures must operate with both high surface electric fields and low breakdown rates. Understanding the statistical properties of breakdown occurrence in such a regime is of practical importance for optimizing accelerator conditioning and operation algorithms, as well as of interest for efforts to understand the physical processes which underlie the breakdown phenomenon. Experimental data on breakdown have been collected in two distinct high-gradient experimental set-ups: a prototype linear accelerating structure operated in the Compact Linear Collider Xbox 12 GHz test stands, and a parallel-plate electrode system operated with pulsed DC in the kV range. The collected data are presented, analyzed and compared. The two systems show similar, distinctive, two-part distributions of the number of pulses between breakdowns, with each part corresponding to a specific, constant event rate. The correlation between distance and number of pulses between breakdowns indicates that the two parts of the distribution, and their corresponding event rates, represent independent primary and induced follow-up breakdowns. The similarity of the results from pulsed DC to 12 GHz rf indicates a similar vacuum arc triggering mechanism over the range of conditions covered by the experiments.
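
    The two-part distribution with two constant event rates can be visualized by fitting the empirical survival curve of pulses-between-breakdowns with a mixture of two exponentials. The sketch below uses simulated gaps and a simple least-squares fit; it is not the collaboration's analysis, and the rates and fractions are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Simulated numbers of pulses between breakdowns: a slow "primary"
    # population plus a fast "induced follow-up" population.
    rng = np.random.default_rng(7)
    gaps = np.concatenate([
        rng.exponential(2e5, size=800),   # primary: low constant rate
        rng.exponential(2e3, size=200),   # follow-up: much higher rate
    ])

    # Empirical survival function P(N > n).
    n_sorted = np.sort(gaps)
    surv = 1.0 - np.arange(1, len(n_sorted) + 1) / len(n_sorted)

    def two_rate_survival(n, w, rate1, rate2):
        # Mixture of two exponentials, one rate per breakdown population.
        return w * np.exp(-rate1 * n) + (1 - w) * np.exp(-rate2 * n)

    (w, r1, r2), _ = curve_fit(two_rate_survival, n_sorted, surv,
                               p0=(0.8, 1e-5, 1e-3), bounds=([0, 0, 0], [1, 1, 1]))
    print(f"primary fraction = {w:.2f}, rates = {r1:.2e}, {r2:.2e} per pulse")
    ```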

  14. Statistics of vacuum breakdown in the high-gradient and low-rate regime

    Directory of Open Access Journals (Sweden)

    Walter Wuensch

    2017-01-01

    Full Text Available In an increasing number of high-gradient linear accelerator applications, accelerating structures must operate with both high surface electric fields and low breakdown rates. Understanding the statistical properties of breakdown occurrence in such a regime is of practical importance for optimizing accelerator conditioning and operation algorithms, as well as of interest for efforts to understand the physical processes which underlie the breakdown phenomenon. Experimental data on breakdown have been collected in two distinct high-gradient experimental set-ups: a prototype linear accelerating structure operated in the Compact Linear Collider Xbox 12 GHz test stands, and a parallel-plate electrode system operated with pulsed DC in the kV range. The collected data are presented, analyzed and compared. The two systems show similar, distinctive, two-part distributions of the number of pulses between breakdowns, with each part corresponding to a specific, constant event rate. The correlation between distance and number of pulses between breakdowns indicates that the two parts of the distribution, and their corresponding event rates, represent independent primary and induced follow-up breakdowns. The similarity of the results from pulsed DC to 12 GHz rf indicates a similar vacuum arc triggering mechanism over the range of conditions covered by the experiments.

  15. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction v.

    Science.gov (United States)

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

    The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing the image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed with 7 different settings: FBP and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). The image noise was measured in the first study using a body phantom. The CNR was measured in the second study using a contrast phantom, and the spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis of the first and second studies showed that the images reconstructed using ASIR-V had reduced image noise and improved CNR compared with those of FBP and ASIR (P < 0.05). In the third study, the images reconstructed using ASIR-V had significantly improved spatial resolution compared with those of FBP and ASIR (P < 0.05). ASIR-V provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.

  16. Medical school attrition-beyond the statistics a ten year retrospective study.

    Science.gov (United States)

    Maher, Bridget M; Hynes, Helen; Sweeney, Catherine; Khashan, Ali S; O'Rourke, Margaret; Doran, Kieran; Harris, Anne; Flynn, Siun O'

    2013-01-31

    Medical school attrition is important--securing a place in medical school is difficult and a high attrition rate can affect the academic reputation of a medical school and staff morale. More important, however, are the personal consequences of dropout for the student. The aims of our study were to examine factors associated with attrition over a ten-year period (2001-2011) and to study the personal effects of dropout on individual students. The study included quantitative analysis of completed cohorts and qualitative analysis of ten-year data. Data were collected from individual student files, examination and admission records, exit interviews and staff interviews. Statistical analysis was carried out on five successive completed cohorts. Qualitative data from student files was transcribed and independently analysed by three authors. Data was coded and categorized and key themes were identified. The overall attrition rate was 5.7% (45/779) in 6 completed cohorts when students who transferred to other medical courses were excluded. Students from Kuwait and United Arab Emirates had the highest dropout rate (RR = 5.70, 95% Confidence Intervals 2.65 to 12.27). Absenteeism was documented in 30% of students, academic difficulty in 55.7%, social isolation in 20%, and psychological morbidity in 40% (higher than in other studies). Qualitative analysis revealed recurrent themes of isolation, failure, and despair. Student Welfare services were only accessed by one-third of dropout students. While dropout is often multifactorial, certain red flag signals may alert us to risk of dropout, including non-EU origin, academic struggling, absenteeism, social isolation, depression and leave of absence. Psychological morbidity amongst dropout students is high and Student Welfare services should be actively promoted. Absenteeism should prompt early intervention. Behind every dropout statistic lies a personal story. All medical schools have a duty of care to support students who leave the medical programme.

  17. Medical School Attrition-Beyond the Statistics A Ten Year Retrospective Study

    Directory of Open Access Journals (Sweden)

    Maher Bridget M

    2013-01-01

    Full Text Available Abstract Background Medical school attrition is important - securing a place in medical school is difficult and a high attrition rate can affect the academic reputation of a medical school and staff morale. More important, however, are the personal consequences of dropout for the student. The aims of our study were to examine factors associated with attrition over a ten-year period (2001–2011) and to study the personal effects of dropout on individual students. Methods The study included quantitative analysis of completed cohorts and qualitative analysis of ten-year data. Data were collected from individual student files, examination and admission records, exit interviews and staff interviews. Statistical analysis was carried out on five successive completed cohorts. Qualitative data from student files was transcribed and independently analysed by three authors. Data was coded and categorized and key themes were identified. Results The overall attrition rate was 5.7% (45/779) in 6 completed cohorts when students who transferred to other medical courses were excluded. Students from Kuwait and United Arab Emirates had the highest dropout rate (RR = 5.70, 95% Confidence Intervals 2.65 to 12.27). Absenteeism was documented in 30% of students, academic difficulty in 55.7%, social isolation in 20%, and psychological morbidity in 40% (higher than in other studies). Qualitative analysis revealed recurrent themes of isolation, failure, and despair. Student Welfare services were only accessed by one-third of dropout students. Conclusions While dropout is often multifactorial, certain red flag signals may alert us to risk of dropout including non-EU origin, academic struggling, absenteeism, social isolation, depression and leave of absence. Psychological morbidity amongst dropout students is high and Student Welfare services should be actively promoted. Absenteeism should prompt early intervention. Behind every dropout statistic lies a personal story. All medical schools have a duty of care to support students who leave the medical programme.

  18. Use of a mixture statistical model in studying malaria vectors density.

    Directory of Open Access Journals (Sweden)

    Olayidé Boussari

    Full Text Available Vector control is a major step in the process of malaria control and elimination. This requires vector counts and appropriate statistical analyses of these counts. However, vector counts are often overdispersed. A non-parametric mixture of Poisson model (NPMP) is proposed to allow for overdispersion and better describe the vector distribution. Mosquito collections using Human Landing Catches, as well as collection of environmental and climatic data, were carried out from January to December 2009 in 28 villages in Southern Benin. A NPMP regression model with "village" as a random effect is used to test statistical correlations between malaria vector density and environmental and climatic factors. Furthermore, the villages were ranked using the latent classes derived from the NPMP model. Based on this classification of the villages, the impacts of four vector control strategies implemented in the villages were compared. Vector counts were highly variable and overdispersed, with an important proportion of zeros (75%). The NPMP model had a good aptitude to predict the observed values and showed that: (i) proximity to a freshwater body, market gardening, and high levels of rain were associated with high vector density; (ii) water conveyance, cattle breeding, and vegetation index were associated with low vector density. The 28 villages could then be ranked according to the mean vector number as estimated by the random part of the model after adjustment for all covariates. The NPMP model made it possible to describe the distribution of the vector across the study area. The villages were ranked according to the mean vector density after taking into account the most important covariates. This study demonstrates the necessity and possibility of adapting methods of vector counting and sampling to each setting.
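
    A fully non-parametric mixture of Poissons is beyond a few lines, but the core idea can be sketched with a two-component Poisson mixture fitted by EM. The counts below are simulated to be zero-inflated and overdispersed, loosely mimicking vector counts; this is an illustration, not the paper's NPMP implementation.

    ```python
    import numpy as np
    from scipy.stats import poisson

    # Simulated overdispersed counts: mostly near-zero, some high-density sites.
    rng = np.random.default_rng(8)
    counts = np.concatenate([rng.poisson(0.2, 750), rng.poisson(8.0, 250)])

    w, lam = np.array([0.5, 0.5]), np.array([1.0, 5.0])   # initial guesses
    for _ in range(200):
        # E-step: responsibility of each component for each count.
        pdf = w * poisson.pmf(counts[:, None], lam)
        resp = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: update mixing weights and Poisson means.
        w = resp.mean(axis=0)
        lam = (resp * counts[:, None]).sum(axis=0) / resp.sum(axis=0)

    print(f"weights = {np.round(w, 2)}, means = {np.round(lam, 2)}")
    ```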

  19. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  20. GWAPower: a statistical power calculation software for genome-wide association studies with quantitative traits.

    Science.gov (United States)

    Feng, Sheng; Wang, Shengchu; Chen, Chia-Cheng; Lan, Lan

    2011-01-21

    In designing genome-wide association (GWA) studies it is important to calculate statistical power. General statistical power calculation procedures for quantitative measures often require information concerning summary statistics of distributions, such as the mean and variance. However, in genetic studies, the effect size of quantitative traits is traditionally expressed as heritability, a quantity defined as the amount of phenotypic variation in the population that can be ascribed to the genetic variants among individuals. Heritability is hard to transform into summary statistics. Therefore, general power calculation procedures cannot be used directly in GWA studies. The development of appropriate statistical methods and a user-friendly software package to address this problem would be welcomed. This paper presents GWAPower, a statistical software package for power calculation designed for GWA studies with quantitative traits, where the genetic effect is defined as heritability. Based on several popular one-degree-of-freedom genetic models, this method avoids the need to specify the non-centrality parameter of the F-distribution under the alternative hypothesis. Therefore, it can use heritability information directly without approximation. In GWAPower, the power calculation can be easily adjusted for adding covariates and linkage disequilibrium information. An example is provided to illustrate GWAPower, followed by discussions. GWAPower is a user-friendly free software package for calculating statistical power based on heritability in GWA studies with quantitative traits. The software is freely available at: http://dl.dropbox.com/u/10502931/GWAPower.zip.
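
    To convey the underlying idea, the sketch below computes power for a single 1-df test directly from heritability, using the standard approximation that the non-centrality parameter is N*h2/(1-h2). This is a simplified stand-in, not GWAPower's actual implementation.

    ```python
    from scipy.stats import chi2, ncx2

    def gwas_power(n, h2, alpha=5e-8, df=1):
        """Power of a 1-df association test for a variant explaining h2 of
        phenotypic variance in n samples (approximation: ncp = n*h2/(1-h2))."""
        ncp = n * h2 / (1.0 - h2)
        crit = chi2.ppf(1.0 - alpha, df)          # genome-wide threshold
        return 1.0 - ncx2.cdf(crit, df, ncp)      # P(statistic exceeds it)

    print(f"power = {gwas_power(n=10000, h2=0.005):.3f}")  # N = 10k, h2 = 0.5%
    ```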

  1. Statistical Analyses of High-Resolution Aircraft and Satellite Observations of Sea Ice: Applications for Improving Model Simulations

    Science.gov (United States)

    Farrell, S. L.; Kurtz, N. T.; Richter-Menge, J.; Harbeck, J. P.; Onana, V.

    2012-12-01

    Satellite-derived estimates of ice thickness and observations of ice extent over the last decade point to a downward trend in the basin-scale ice volume of the Arctic Ocean. This loss has broad-ranging impacts on the regional climate and ecosystems, as well as implications for regional infrastructure, marine navigation, national security, and resource exploration. New observational datasets at small spatial and temporal scales are now required to improve our understanding of physical processes occurring within the ice pack and advance parameterizations in the next generation of numerical sea-ice models. High-resolution airborne and satellite observations of sea ice are now available at meter-scale resolution or better, providing new details on the properties and morphology of the ice pack across basin scales. For example, the NASA IceBridge airborne campaign routinely surveys the sea ice of the Arctic and Southern Oceans with an advanced sensor suite, including laser and radar altimeters and digital cameras, that together provide high-resolution measurements of sea ice freeboard, thickness, snow depth and lead distribution. Here we present statistical analyses of the ice pack primarily derived from the following IceBridge instruments: the Digital Mapping System (DMS), a nadir-looking, high-resolution digital camera; the Airborne Topographic Mapper, a scanning lidar; and the University of Kansas snow radar, a novel instrument designed to estimate snow depth on sea ice. Together these instruments provide data from which a wide range of sea ice properties may be derived. We provide statistics on lead distribution and spacing, lead width and area, floe size and distance between floes, as well as ridge height, frequency and distribution. The goals of this study are to (i) identify unique statistics that can be used to describe the characteristics of specific ice regions, for example first-year/multi-year ice, diffuse ice edge/consolidated ice pack, and convergent

  2. Effectiveness of mouse minute virus inactivation by high temperature short time treatment technology: a statistical assessment.

    Science.gov (United States)

    Murphy, Marie; Quesada, Guillermo Miro; Chen, Dayue

    2011-11-01

    Viral contamination of mammalian cell cultures in a GMP manufacturing facility represents a serious safety threat to the biopharmaceutical industry. Such adverse events usually require facility shutdown for cleaning/decontamination, and thus result in significant loss of production and/or delay of product development. High temperature short time (HTST) treatment of culture media has been considered an effective method to protect GMP facilities from viral contaminations. The log reduction factor (LRF) has been commonly used to measure the effectiveness of HTST treatment for viral inactivation. However, in order to prevent viral contaminations, HTST treatment must inactivate all infectious viruses (100%) in the medium batch, since a single virus is sufficient to cause contamination. Therefore, LRF may not be the most appropriate indicator for measuring the effectiveness of HTST in preventing viral contaminations. We report here the use of the probability of achieving complete (100%) virus inactivation to assess the effectiveness of HTST treatment. Using mouse minute virus (MMV) as a model virus, we have demonstrated that the effectiveness of HTST treatment depends strongly upon the level of viral contaminants, in addition to treatment temperature and duration. We believe that the statistical method described in this report can provide more accurate information about the power and potential limitation of technologies such as HTST in our shared quest to mitigate the risk of viral contamination in manufacturing facilities. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
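
    The distinction between LRF and the probability of complete inactivation is easy to make concrete. Under a simple Poisson survival model (an assumption of this sketch, consistent with the argument above but not copied from the paper), the expected number of survivors is N0*10^(-LRF), and the chance that none survive is the exponential of minus that:

    ```python
    import numpy as np

    def p_complete_inactivation(n0, lrf):
        # Poisson model: expected survivors = n0 * 10**(-lrf);
        # P(zero survivors) = exp(-expected survivors).
        return np.exp(-n0 * 10.0 ** (-lrf))

    for n0 in (1e3, 1e6, 1e9):                     # contaminants per batch
        p = p_complete_inactivation(n0, lrf=6.0)   # assume a 6-log reduction
        print(f"N0 = {n0:8.0e}   P(no survivors) = {p:.4f}")
    ```

    The same LRF can thus give near-certain sterilization at low titers and near-certain failure at high titers, which is the paper's point that effectiveness depends on the contamination level.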

  3. On the efficiency of high-energy particle identification statistical methods

    International Nuclear Information System (INIS)

    Chilingaryan, A.A.

    1982-01-01

    An attempt is made to analyze the statistical methods of making decisions on high-energy particle identification. The Bayesian approach is shown to provide the most complete account of the primary information discriminating between particles of various types. It does not impose rigid requirements on the form of the probability density function and allows the a priori information to be taken into account, as compared with the Neyman-Pearson approach, the minimax technique and heuristic rules that construct decision limits in the range of a specially chosen parameter. The methods based on the concept of the nearest neighbourhood are shown to be the most effective among the local methods of probability density function estimation. Probability distances between the training-sample classes are suggested as a criterion for selecting the optimal parameters of high-energy particle detectors. The proposed methods and the software constructed are tested on the problem of cosmic-ray hadron identification by means of transition radiation detectors (the ''PION'' experiment)
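
    The nearest-neighbourhood idea can be sketched in a few lines: local class probabilities are estimated from the k nearest training points in the detector-response space. Everything below (features, means, k) is invented for illustration.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    # Two synthetic "detector response" features for two particle classes.
    rng = np.random.default_rng(9)
    hadrons = rng.normal([1.0, 1.0], 0.6, size=(500, 2))
    electrons = rng.normal([2.5, 2.5], 0.6, size=(500, 2))
    X = np.vstack([hadrons, electrons])
    y = np.array([0] * 500 + [1] * 500)

    # k-nearest-neighbour estimate of the local class probabilities.
    clf = KNeighborsClassifier(n_neighbors=15).fit(X, y)
    probe = np.array([[1.8, 1.9]])
    print("P(hadron), P(electron) =", clf.predict_proba(probe)[0])
    ```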

  4. Integrating functional data to prioritize causal variants in statistical fine-mapping studies.

    Directory of Open Access Journals (Sweden)

    Gleb Kichaev

    2014-10-01

    Full Text Available Standard statistical approaches for prioritization of variants for functional testing in fine-mapping studies either use marginal association statistics or estimate posterior probabilities for variants to be causal under simplifying assumptions. Here, we present a probabilistic framework that integrates association strength with functional genomic annotation data to improve accuracy in selecting plausible causal variants for functional validation. A key feature of our approach is that it empirically estimates the contribution of each functional annotation to the trait of interest directly from summary association statistics while allowing for multiple causal variants at any risk locus. We devise efficient algorithms that estimate the parameters of our model across all risk loci to further increase performance. Using simulations starting from the 1000 Genomes data, we find that our framework consistently outperforms the current state-of-the-art fine-mapping methods, reducing the number of variants that need to be selected to capture 90% of the causal variants from an average of 13.3 to 10.4 SNPs per locus (as compared to the next-best performing strategy. Furthermore, we introduce a cost-to-benefit optimization framework for determining the number of variants to be followed up in functional assays and assess its performance using real and simulation data. We validate our findings using a large scale meta-analysis of four blood lipids traits and find that the relative probability for causality is increased for variants in exons and transcription start sites and decreased in repressed genomic regions at the risk loci of these traits. Using these highly predictive, trait-specific functional annotations, we estimate causality probabilities across all traits and variants, reducing the size of the 90% confidence set from an average of 17.5 to 13.5 variants per locus in this data.
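
    For readers unfamiliar with posterior-based fine-mapping, a minimal single-causal-variant sketch is shown below: Wakefield-style approximate Bayes factors are formed from summary z-scores and normalized into posterior inclusion probabilities. This is the baseline that the paper's annotation-aware, multiple-causal-variant framework extends, not the framework itself; all numbers are hypothetical.

    ```python
    import numpy as np

    def log_abf(z, se, w=0.04):
        """Wakefield-style log approximate Bayes factor for association;
        w is the assumed prior variance of the effect size."""
        v = se ** 2
        r = w / (w + v)
        return 0.5 * (np.log(1.0 - r) + r * z ** 2)

    z = np.array([5.8, 5.2, 2.1, 1.0, 0.3])   # hypothetical z-scores at a locus
    se = np.full_like(z, 0.05)                # hypothetical standard errors

    # Posterior inclusion probabilities under one causal variant per locus.
    labf = log_abf(z, se)
    pip = np.exp(labf - labf.max())
    pip /= pip.sum()
    print(np.round(pip, 3))   # rank variants; a 90% credible set keeps the top ones
    ```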

  5. High-statistics measurement of the η → 3π0 decay at the Mainz Microtron

    Science.gov (United States)

    Prakhov, S.; Abt, S.; Achenbach, P.; Adlarson, P.; Afzal, F.; Aguar-Bartolomé, P.; Ahmed, Z.; Ahrens, J.; Annand, J. R. M.; Arends, H. J.; Bantawa, K.; Bashkanov, M.; Beck, R.; Biroth, M.; Borisov, N. S.; Braghieri, A.; Briscoe, W. J.; Cherepnya, S.; Cividini, F.; Collicott, C.; Costanza, S.; Denig, A.; Dieterle, M.; Downie, E. J.; Drexler, P.; Ferretti Bondy, M. I.; Fil'kov, L. V.; Fix, A.; Gardner, S.; Garni, S.; Glazier, D. I.; Gorodnov, I.; Gradl, W.; Gurevich, G. M.; Hamill, C. B.; Heijkenskjöld, L.; Hornidge, D.; Huber, G. M.; Käser, A.; Kashevarov, V. L.; Kay, S.; Keshelashvili, I.; Kondratiev, R.; Korolija, M.; Krusche, B.; Lazarev, A.; Lisin, V.; Livingston, K.; Lutterer, S.; MacGregor, I. J. D.; Manley, D. M.; Martel, P. P.; McGeorge, J. C.; Middleton, D. G.; Miskimen, R.; Mornacchi, E.; Mushkarenkov, A.; Neganov, A.; Neiser, A.; Oberle, M.; Ostrick, M.; Otte, P. B.; Paudyal, D.; Pedroni, P.; Polonski, A.; Ron, G.; Rostomyan, T.; Sarty, A.; Sfienti, C.; Sokhoyan, V.; Spieker, K.; Steffen, O.; Strakovsky, I. I.; Strandberg, B.; Strub, Th.; Supek, I.; Thiel, A.; Thiel, M.; Thomas, A.; Unverzagt, M.; Usov, Yu. A.; Wagner, S.; Walford, N. K.; Watts, D. P.; Werthmüller, D.; Wettig, J.; Witthauer, L.; Wolfes, M.; Zana, L. A.; A2 Collaboration at MAMI

    2018-06-01

    The largest statistics at the moment, 7 × 10^6 η → 3π^0 decays, based on 6.2 × 10^7 η mesons produced in the γp → ηp reaction, has been accumulated by the A2 Collaboration at the Mainz Microtron, MAMI. It allowed a detailed study of the η → 3π^0 dynamics beyond its conventional parametrization with just the quadratic slope parameter α and enabled, for the first time, a measurement of the second-order term and a better understanding of the cusp structure in the neutral decay. The present data are also compared to recent theoretical calculations that predict a nonlinear dependence on the quadratic distance from the Dalitz-plot center.

  6. Evaluating clinical and public health interventions: a practical guide to study design and statistics

    National Research Council Canada - National Science Library

    Katz, Mitchell H

    2010-01-01

    ... and observational studies. In addition to reviewing standard statistical analysis, the book has easy-to-follow explanations of cutting-edge techniques for evaluating interventions, including propensity score analysis...

  7. Addressing Economic Development Goals through Innovative Teaching of University Statistics: A Case Study of Statistical Modelling in Nigeria

    Science.gov (United States)

    Ezepue, Patrick Oseloka; Ojo, Adegbola

    2012-01-01

    A challenging problem in some developing countries such as Nigeria is inadequate training of students in effective problem solving using the core concepts of their disciplines. Related to this is a disconnection between their learning and the socio-economic development agenda of their country. These problems are more vivid in statistical education which…

  8. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    Science.gov (United States)

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.

  9. The MAX Statistic is Less Powerful for Genome Wide Association Studies Under Most Alternative Hypotheses.

    Science.gov (United States)

    Shifflett, Benjamin; Huang, Rong; Edland, Steven D

    2017-01-01

    Genotypic association studies are prone to inflated type I error rates if multiple hypothesis testing is performed, e.g., sequentially testing for recessive, multiplicative, and dominant risk. Alternatives to multiple hypothesis testing include the model independent genotypic χ2 test, the efficiency robust MAX statistic, which corrects for multiple comparisons but with some loss of power, or a single Armitage test for multiplicative trend, which has optimal power when the multiplicative model holds but with some loss of power when dominant or recessive models underlie the genetic association. We used Monte Carlo simulations to describe the relative performance of these three approaches under a range of scenarios. All three approaches maintained their nominal type I error rates. The genotypic χ2 and MAX statistics were more powerful when testing a strictly recessive genetic effect or when testing a dominant effect when the allele frequency was high. The Armitage test for multiplicative trend was most powerful for the broad range of scenarios where heterozygote risk is intermediate between recessive and dominant risk. Moreover, all tests had limited power to detect recessive genetic risk unless the sample size was large, and conversely all tests were relatively well powered to detect dominant risk. Taken together, these results suggest the general utility of the multiplicative trend test when the underlying genetic model is unknown.
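
    A minimal Monte Carlo sketch in the spirit of the comparison above: it estimates the power of the Armitage trend test (using the identity that the trend statistic equals N times the squared correlation between genotype score and case status) and of the genotypic chi-square test under an assumed recessive model. The allele frequency, baseline risk, relative risk, and sample size are arbitrary choices, and the MAX statistic is omitted for brevity.

```python
# Hedged Monte Carlo sketch comparing the Armitage trend test with the
# genotypic chi-square test; all parameter values are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def simulate(n, p, risk, model):
    """Draw genotypes (0/1/2 minor alleles) and case status under a genetic model."""
    g = rng.binomial(2, p, size=n)
    base = 0.1
    if model == "recessive":
        pr = np.where(g == 2, base * risk, base)
    elif model == "dominant":
        pr = np.where(g >= 1, base * risk, base)
    else:  # multiplicative
        pr = base * risk**g
    y = (rng.random(n) < pr).astype(int)
    return g, y

def trend_chi2(g, y):
    """Cochran-Armitage trend statistic = N * corr(score, status)^2 (1 df)."""
    r = np.corrcoef(g, y)[0, 1]
    return len(g) * r**2

def genotypic_p(g, y):
    table = np.array([[np.sum((g == k) & (y == c)) for k in range(3)] for c in range(2)])
    return stats.chi2_contingency(table)[1]   # p-value of the 2-df genotypic test

hits_trend = hits_geno = 0
for _ in range(1000):
    g, y = simulate(2000, 0.3, 2.0, "recessive")
    hits_trend += stats.chi2.sf(trend_chi2(g, y), df=1) < 0.05
    hits_geno += genotypic_p(g, y) < 0.05
print("power trend:", hits_trend / 1000, " power genotypic:", hits_geno / 1000)
```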

  10. Study on Semi-Parametric Statistical Model of Safety Monitoring of Cracks in Concrete Dams

    Directory of Open Access Journals (Sweden)

    Chongshi Gu

    2013-01-01

    Full Text Available Cracks are one of the hidden dangers in concrete dams. The study of safety monitoring models for concrete dam cracks has always been difficult. Building on the parametric statistical model of safety monitoring of cracks in concrete dams, drawing on semi-parametric statistical theory, and considering the abnormal behaviors of these cracks, a semi-parametric statistical model of safety monitoring of concrete dam cracks is established to overcome the limitations of the parametric model in expressing the objective process. Previous projects show that the semi-parametric statistical model fits the data more closely and explains cracks in concrete dams better than the parametric statistical model. When used for forecasting, however, the forecast capability of the semi-parametric statistical model is equivalent to that of the parametric statistical model. The semi-parametric statistical model is simple to build, rests on a reasonable principle, and is highly practical, with a good application prospect in actual projects.

  11. An extensive study of Bose-Einstein condensation in liquid helium using Tsallis statistics

    Science.gov (United States)

    Guha, Atanu; Das, Prasanta Kumar

    2018-05-01

    A realistic scenario can be represented far better by a generalized canonical ensemble than by the ideal one, provided proper parameter sets are involved. We study the Bose-Einstein condensation phenomena of liquid helium within the framework of Tsallis statistics. With a comparatively high value of the deformation parameter q (∼1.4), the theoretically calculated value of the critical temperature (Tc) of the phase transition of liquid helium is found to agree with the experimentally determined value (Tc = 2.17 K), although the two differ from each other for q = 1 (the undeformed scenario). This sheds light on the understanding of the phenomenon and qualitatively connects temperature fluctuations (non-equilibrium conditions) with the interactions between atoms: more interactions between atoms give rise to stronger non-equilibrium conditions, as expected.

  12. Assessing Statistical Change Indices in Selected Social Work Intervention Research Studies

    Science.gov (United States)

    Ham, Amanda D.; Huggins-Hoyt, Kimberly Y.; Pettus, Joelle

    2016-01-01

    Objectives: This study examined how evaluation and intervention research (IR) studies assessed statistical change to ascertain effectiveness. Methods: Studies from six core social work journals (2009-2013) were reviewed (N = 1,380). Fifty-two evaluation (n = 27) and intervention (n = 25) studies met the inclusion criteria. These studies were…

  13. Comparison of Vital Statistics Definitions of Suicide against a Coroner Reference Standard: A Population-Based Linkage Study.

    Science.gov (United States)

    Gatov, Evgenia; Kurdyak, Paul; Sinyor, Mark; Holder, Laura; Schaffer, Ayal

    2018-03-01

    We sought to determine the utility of health administrative databases for population-based suicide surveillance, as these data are generally more accessible and more integrated with other data sources compared to coroners' records. In this retrospective validation study, we identified all coroner-confirmed suicides between 2003 and 2012 in Ontario residents aged 21 and over and linked this information to Statistics Canada's vital statistics data set. We examined the overlap between the underlying cause of death field and secondary causes of death using ICD-9 and ICD-10 codes for deliberate self-harm (i.e., suicide) and examined the sociodemographic and clinical characteristics of misclassified records. Among 10,153 linked deaths, there was a very high degree of overlap between records coded as deliberate self-harm in the vital statistics data set and coroner-confirmed suicides using both ICD-9 and ICD-10 definitions (96.88% and 96.84% sensitivity, respectively). This alignment steadily increased throughout the study period (from 95.9% to 98.8%). Other vital statistics diagnoses in primary fields included uncategorised signs and symptoms. Vital statistics records that were misclassified did not differ from valid records in terms of sociodemographic characteristics but were more likely to have had an unspecified place of injury on the death certificate. The close agreement between vital statistics and coroner classification of suicide deaths suggests that health administrative data can reliably be used to identify suicide deaths.

  14. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    Science.gov (United States)

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

    To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a low score. Only a minority of the studies applied appropriate statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.

  15. First high-statistics and high-resolution recoil-ion data from the WITCH retardation spectrometer

    Science.gov (United States)

    Finlay, P.; Breitenfeldt, M.; Porobić, T.; Wursten, E.; Ban, G.; Beck, M.; Couratin, C.; Fabian, X.; Fléchard, X.; Friedag, P.; Glück, F.; Herlert, A.; Knecht, A.; Kozlov, V. Y.; Liénard, E.; Soti, G.; Tandecki, M.; Traykov, E.; Van Gorp, S.; Weinheimer, Ch.; Zákoucký, D.; Severijns, N.

    2016-07-01

    The first high-statistics and high-resolution data set for the integrated recoil-ion energy spectrum following the β+ decay of 35Ar has been collected with the WITCH retardation spectrometer located at CERN-ISOLDE. Over 25 million recoil-ion events were recorded on a large-area multichannel plate (MCP) detector with a time-stamp precision of 2 ns and a position resolution of 0.1 mm due to the newly upgraded data acquisition based on the LPC Caen FASTER protocol. The number of recoil ions was measured for more than 15 different settings of the retardation potential, complemented by dedicated background and half-life measurements. Previously unidentified systematic effects, including an energy-dependent efficiency of the main MCP and a radiation-induced time-dependent background, have been identified and incorporated into the analysis. However, further understanding and treatment of the radiation-induced background requires additional dedicated measurements and remains the current limiting factor in extracting a beta-neutrino angular correlation coefficient for 35Ar decay using the WITCH spectrometer.

  16. Audit sampling: A qualitative study on the role of statistical and non-statistical sampling approaches on audit practices in Sweden

    OpenAIRE

    Ayam, Rufus Tekoh

    2011-01-01

    PURPOSE: The two approaches to audit sampling, statistical and nonstatistical, have been examined in this study. The overall purpose of the study is to explore the extent to which statistical and nonstatistical sampling approaches are currently utilized by independent auditors during auditing practices. Moreover, the study also seeks to achieve two additional purposes; the first is to find out whether auditors utilize different sampling techniques when auditing SMEs (Small and Medium-Sized Enterprises)...

  17. Statistical list-mode image reconstruction for the high resolution research tomograph

    International Nuclear Information System (INIS)

    Rahmim, A; Lenox, M; Reader, A J; Michel, C; Burbar, Z; Ruth, T J; Sossi, V

    2004-01-01

    We have investigated statistical list-mode reconstruction applicable to a depth-encoding high resolution research tomograph. An image non-negativity constraint has been employed in the reconstructions and is shown to effectively remove the overestimation bias introduced by the sinogram non-negativity constraint. We have furthermore implemented a convergent subsetized (CS) list-mode reconstruction algorithm, based on previous work (Hsiao et al 2002 Conf. Rec. SPIE Med. Imaging 4684 10-19; Hsiao et al 2002 Conf. Rec. IEEE Int. Symp. Biomed. Imaging 409-12) on convergent histogram OSEM reconstruction. We have demonstrated that the first step of the convergent algorithm is exactly equivalent (unlike the histogram-mode case) to the regular subsetized list-mode EM algorithm, while the second and final step takes the form of additive updates in image space. We have shown that in terms of contrast, noise, and FWHM behaviour, the CS algorithm is robust and does not result in limit cycles. A hybrid algorithm based on the ordinary and the convergent algorithms is also proposed, and is shown to combine the advantages of the two algorithms (i.e. it is able to reach a higher image quality in fewer iterations while maintaining the convergent behaviour), making the hybrid approach a good alternative to the ordinary subsetized list-mode EM algorithm.
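
    As a toy illustration of the list-mode EM update at the heart of such reconstructions, the sketch below runs MLEM on an invented 1-D "scanner"; the system matrix and dimensions are stand-ins, not the tomograph's geometry. Note that the multiplicative update preserves the image non-negativity constraint mentioned above.

```python
# Minimal toy sketch of list-mode EM (MLEM) for emission tomography.
# The geometry is an invented 1-D stand-in; a real scan has vastly more events.
import numpy as np

rng = np.random.default_rng(2)

J, I = 8, 20                                  # voxels, detection bins
A = rng.random((I, J)); A /= A.sum(axis=0)    # system matrix p(bin i | voxel j)
x_true = rng.random(J) * 10

counts = rng.poisson(A @ x_true)              # histogrammed data
events = np.repeat(np.arange(I), counts)      # the same data as a list of events

sens = A.sum(axis=0)                          # sensitivity image (all ones here)
x = np.ones(J)
for _ in range(50):
    # list-mode EM update: x_j <- x_j / s_j * sum_events A[i,j] / (A[i,:] @ x)
    ratio = np.zeros(J)
    for i in events:
        ratio += A[i] / (A[i] @ x)
    x *= ratio / sens                         # stays non-negative by construction
print(np.round(x, 2), np.round(x_true, 2))
```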

  18. Cluster-level statistical inference in fMRI datasets: The unexpected behavior of random fields in high dimensions.

    Science.gov (United States)

    Bansal, Ravi; Peterson, Bradley S

    2018-06-01

    Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control of false positive findings across these multiple hypothesis tests. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated distributions from those large clusters, and therefore, by construction rejected the large clusters as false positives at the nominal rate.
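
    A small simulation in the spirit of the study's GRF experiments: generate smooth 2-D Gaussian random fields, threshold them, and inspect the cluster-size distribution for the occasional very large cluster. Field size, smoothing width, threshold, and number of realizations are arbitrary choices.

```python
# Hedged sketch: cluster sizes in smooth 2-D Gaussian random fields.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
sizes = []
for _ in range(200):
    field = ndimage.gaussian_filter(rng.standard_normal((128, 128)), sigma=4)
    field /= field.std()
    clusters, n = ndimage.label(field > 1.0)          # supra-threshold clusters
    sizes += list(np.bincount(clusters.ravel())[1:])  # drop the background label

sizes = np.array(sizes)
print("mean cluster size:", sizes.mean())
print("largest / mean   :", sizes.max() / sizes.mean())  # heavy right tail
```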

  19. Statistical evaluation of the mechanical properties of high-volume class F fly ash concretes

    KAUST Repository

    Yoon, Seyoon

    2014-03-01

    High-Volume Fly Ash (HVFA) concretes are seen by many as a feasible solution for sustainable, low embodied carbon construction. At the moment, fly ash is classified as a waste by-product, primarily of thermal power stations. In this paper the authors experimentally and statistically investigated the effects of mix-design factors on the mechanical properties of high-volume class F fly ash concretes. A total of 240 and 32 samples were produced and tested in the laboratory to measure compressive strength and Young's modulus respectively. Applicability of the CEB-FIP (Comite Euro-international du Béton - Fédération Internationale de la Précontrainte) and ACI (American Concrete Institute) Building Model Code (Thomas, 2010; ACI Committee 209, 1982) [1,2] to the experimentally-derived mechanical property data for HVFA concretes was established. Furthermore, using multiple linear regression analysis, Mean Squared Residuals (MSRs) were obtained to determine whether a weight- or volume-based mix proportion is better to predict the mechanical properties of HVFA concrete. The significance levels of the design factors, which indicate how significantly the factors affect the HVFA concrete's mechanical properties, were determined using analysis of variance (ANOVA) tests. The results show that a weight-based mix proportion is a slightly better predictor of mechanical properties than a volume-based one. The significance level of fly ash substitution rate was higher than that of w/b ratio initially but reduced over time. © 2014 Elsevier Ltd. All rights reserved.
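
    The following sketch shows the kind of regression/ANOVA screening described above using statsmodels; the mix-design factors, coefficients, and noise level are fabricated placeholders rather than the study's measurements.

```python
# Hedged sketch: multiple linear regression plus ANOVA significance screening
# of mix-design factors. All data below are fabricated for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 60
df = pd.DataFrame({
    "fly_ash": rng.uniform(0.4, 0.8, n),   # fly-ash substitution rate
    "wb": rng.uniform(0.3, 0.5, n),        # water/binder ratio
})
df["strength"] = 60 - 30 * df["fly_ash"] - 40 * df["wb"] + rng.normal(0, 2, n)

model = smf.ols("strength ~ fly_ash + wb", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))     # significance level of each factor
print("MSR:", model.mse_resid)             # mean squared residual for model comparison
```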

  20. Anomalous variations of NmF2 over the Argentine Islands: a statistical study

    Directory of Open Access Journals (Sweden)

    A. V. Pavlov

    2009-04-01

    Full Text Available We present a statistical study of variations in the F2-layer peak electron density, NmF2, and altitude, hmF2, over the Argentine Islands ionosonde. The critical frequencies, foF2 and foE, of the F2 and E-layers, and the propagation factor, M(3000)F2, measured by the ionosonde during the 1957–1959 and 1962–1995 time periods were used in the statistical analysis to determine the values of NmF2 and hmF2. The probabilities of observing maximum and minimum values of NmF2 and hmF2 in a diurnal variation of the electron density are calculated. Our study shows that the main part of the maximum diurnal values of NmF2 is observed in a time sector close to midnight in November, December, January, and February, exhibiting anomalous diurnal variations of NmF2. Another anomalous feature of the diurnal variations of NmF2 is exhibited during November, December, and January, when the minimum diurnal value of NmF2 is mainly located close to the noon sector. These anomalous diurnal variations of NmF2 are found during both geomagnetically quiet and disturbed conditions. Anomalous features are not found in the diurnal variations of hmF2. A statistical study of the NmF2 winter anomaly phenomenon over the Argentine Islands ionosonde was also carried out. The variations in the maximum daytime value, R, of the ratio of a geomagnetically quiet daytime winter NmF2 to a geomagnetically quiet daytime summer NmF2, taken at a given UT and for approximately the same level of solar activity, were studied. The conditional probability of the occurrence of R in an interval of R, the most frequent value of R, the mean expected value of R, and the conditional probability of observing the F2-region winter anomaly during a daytime period were calculated for low, moderate, and high solar activity. The calculations show that the mean expected value of R and the occurrence frequency of the F2-region winter anomaly increase with increasing solar activity.

  1. From regular text to artistic writing and artworks: Fourier statistics of images with low and high aesthetic appeal

    Directory of Open Access Journals (Sweden)

    Tamara Melmer

    2013-04-01

    Full Text Available The spatial characteristics of letters and their influence on readability and letter identification have been intensely studied during the last decades. There have been few studies, however, on statistical image properties that reflect more global aspects of text, for example properties that may relate to its aesthetic appeal. It has been shown that natural scenes and a large variety of visual artworks possess a scale-invariant Fourier power spectrum that falls off linearly with increasing frequency in log-log plots. We asked whether images of text share this property. As expected, the Fourier spectrum of images of regular typed or handwritten text is highly anisotropic, i.e. the spectral image properties in vertical, horizontal and oblique orientations differ. Moreover, the spatial frequency spectra of text images are not scale-invariant in any direction. The decline is shallower in the low-frequency part of the spectrum for text than for aesthetic artworks, whereas, in the high-frequency part, it is steeper. These results indicate that, in general, images of regular text contain less global structure (low spatial frequencies) relative to fine detail (high spatial frequencies) than images of aesthetic artworks. Moreover, we studied images of text with artistic claim (ornate print and calligraphy) and ornamental art. For some measures, these images assume average values intermediate between regular text and aesthetic artworks. Finally, to answer the question of whether the statistical properties measured by us are universal amongst humans or are subject to intercultural differences, we compared images from three different cultural backgrounds (Western, East Asian and Arabic). Results for different categories (regular text, aesthetic writing, ornamental art and fine art) were similar across cultures.

  2. From regular text to artistic writing and artworks: Fourier statistics of images with low and high aesthetic appeal

    Science.gov (United States)

    Melmer, Tamara; Amirshahi, Seyed A.; Koch, Michael; Denzler, Joachim; Redies, Christoph

    2013-01-01

    The spatial characteristics of letters and their influence on readability and letter identification have been intensely studied during the last decades. There have been few studies, however, on statistical image properties that reflect more global aspects of text, for example properties that may relate to its aesthetic appeal. It has been shown that natural scenes and a large variety of visual artworks possess a scale-invariant Fourier power spectrum that falls off linearly with increasing frequency in log-log plots. We asked whether images of text share this property. As expected, the Fourier spectrum of images of regular typed or handwritten text is highly anisotropic, i.e., the spectral image properties in vertical, horizontal, and oblique orientations differ. Moreover, the spatial frequency spectra of text images are not scale-invariant in any direction. The decline is shallower in the low-frequency part of the spectrum for text than for aesthetic artworks, whereas, in the high-frequency part, it is steeper. These results indicate that, in general, images of regular text contain less global structure (low spatial frequencies) relative to fine detail (high spatial frequencies) than images of aesthetic artworks. Moreover, we studied images of text with artistic claim (ornate print and calligraphy) and ornamental art. For some measures, these images assume average values intermediate between regular text and aesthetic artworks. Finally, to answer the question of whether the statistical properties measured by us are universal amongst humans or are subject to intercultural differences, we compared images from three different cultural backgrounds (Western, East Asian, and Arabic). Results for different categories (regular text, aesthetic writing, ornamental art, and fine art) were similar across cultures. PMID:23554592

  3. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    Science.gov (United States)

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  4. Statistical Study to Check the Conformity of Aggregate in Kirkuk City to Requirement of Iraqi Specification

    OpenAIRE

    Ammar Saleem Khazaal; Nizar N Ismeel; Abdel fattah K. Hussein

    2018-01-01

    This research presents a statistical study checking the conformity of the aggregates (coarse and fine) used in Kirkuk city to the requirements of the Iraqi specifications. The sieve-analysis data (215 samples) of aggregates, obtained from the National Central Construction Laboratory and the Technical College Construction Laboratory in Kirkuk city, were analyzed using the statistical program SAS. The results showed that 5%, 17%, and 18% of fine aggregate samples are passing sieve sizes 10 mm,...

  5. Statistical learning and auditory processing in children with music training: An ERP study.

    Science.gov (United States)

    Mandikal Vasuki, Pragati Rao; Sharma, Mridula; Ibrahim, Ronny; Arciuli, Joanne

    2017-07-01

    The question of whether musical training is associated with enhanced auditory and cognitive abilities in children is of considerable interest. In the present study, we compared children with music training versus those without music training across a range of auditory and cognitive measures, including the ability to implicitly detect statistical regularities in input (statistical learning). Statistical learning of regularities embedded in auditory and visual stimuli was measured in musically trained and age-matched untrained children between the ages of 9 and 11 years. In addition to collecting behavioural measures, we recorded electrophysiological measures to obtain an online measure of segmentation during the statistical learning tasks. Musically trained children showed better performance on melody discrimination, rhythm discrimination, frequency discrimination, and auditory statistical learning. Furthermore, grand-averaged ERPs showed that triplet onset (initial stimulus) elicited larger responses in the musically trained children during both auditory and visual statistical learning tasks. In addition, children's music skills were associated with performance on auditory and visual behavioural statistical learning tasks. Our data suggest that individual differences in musical skills are associated with children's ability to detect regularities. The ERP data suggest that musical training is associated with better encoding of both auditory and visual stimuli. Although causality must be explored in further research, these results may have implications for developing music-based remediation strategies for children with learning impairments. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  6. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  7. Number projected statistics and the pairing correlations at high excitation energies

    International Nuclear Information System (INIS)

    Esebbag, C.; Egido, J.L.

    1993-01-01

    We analyze the use of particle-number projected statistics (PNPS) as an effective way to include the quantum and statistical fluctuations, associated with the pairing degree of freedom, left out in finite-temperature mean-field theories. As a numerical application the exactly soluble degenerate model is worked out. In particular, we find that the sharp temperature-induced superfluid-normal phase transition, predicted in the mean-field approximations, is washed out in the PNPS. Some approximations as well as the Landau prescription to include statistical fluctuations are also discussed. We find that the Landau prescription provides a reasonable approximation to the PNPS. (orig.)

  8. The Nonlinear Statistics of High-contrast Patches in Natural Images

    DEFF Research Database (Denmark)

    Lee, Ann; Pedersen, Kim Steenstrup; Mumford, David

    2003-01-01

    ... In this study, we explore the space of data points representing the values of 3 × 3 high-contrast patches from optical and 3D range images. We find that the distribution of data is extremely sparse, with the majority of the data points concentrated in clusters and non-linear low-dimensional manifolds...

  9. High blood levels of persistent organic pollutants are statistically correlated with smoking

    DEFF Research Database (Denmark)

    Deutch, Bente; Hansen, Jens C.

    1999-01-01

    , smoking and intake of traditional Inuit food. Multiple linear regression analyses showed highly significant positive associations between the mothers' smoking status (never, previous, present) and plasma concentrations of all the studied organic pollutants both in maternal blood and umbilical cord blood...

  10. Application of a Statistical Linear Time-Varying System Model of High Grazing Angle Sea Clutter for Computing Interference Power

    Science.gov (United States)

    2017-12-08

    Excerpt from a technical report on a statistical linear time-varying system model of high grazing angle sea clutter for computing interference power. The recoverable fragments indicate that one of the sinc factors is approximated by the Dirichlet kernel to facilitate computation of an interference integral, that the resultant autocorrelation is obtained by substituting intermediate results, and that Python code was used to generate the report's figures.
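
    As a hedged illustration of the one recoverable technical point, the sketch below checks numerically that the Dirichlet kernel is well approximated by a scaled sinc at small angles; the report's actual expressions and parameters are not reproduced here.

```python
# Hedged illustration of the sinc <-> Dirichlet-kernel relationship.
import numpy as np

def dirichlet(x, N):
    """Periodic sinc: sum_{k=-N..N} exp(i k x) = sin((N + 1/2) x) / sin(x / 2)."""
    return np.sin((N + 0.5) * x) / np.sin(x / 2)

N = 64
x = np.linspace(1e-3, 0.2, 500)   # small angles, away from x = 0

# Small-angle approximation sin(x/2) ~ x/2 turns the kernel into a scaled sinc;
# note numpy's sinc(t) = sin(pi t) / (pi t).
approx = (2 * N + 1) * np.sinc((N + 0.5) * x / np.pi)
exact = dirichlet(x, N)
print("max relative error:", np.max(np.abs(approx - exact) / np.abs(exact)))
```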

  11. Distribution of Oxycephalidae (Hyperiidea-Amphipoda) in the Indian Ocean- A statistical study

    Digital Repository Service at National Institute of Oceanography (India)

    Nair, K.K.C.; Jayalakshmy, K.V.

    Statistical analysis of the coexistence of oxycephalid species showed two clusters of high affinity in the Arabian Sea, four in the Bay of Bengal, one in the South East Indian Ocean and three in the South West Indian Ocean. Species occurring...

  12. Statistical improvements in functional magnetic resonance imaging analyses produced by censoring high-motion data points.

    Science.gov (United States)

    Siegel, Joshua S; Power, Jonathan D; Dubis, Joseph W; Vogel, Alecia C; Church, Jessica A; Schlaggar, Bradley L; Petersen, Steven E

    2014-05-01

    Subject motion degrades the quality of task functional magnetic resonance imaging (fMRI) data. Here, we test two classes of methods to counteract the effects of motion in task fMRI data: (1) a variety of motion regressions and (2) motion censoring ("motion scrubbing"). In motion regression, various regressors based on realignment estimates were included as nuisance regressors in general linear model (GLM) estimation. In motion censoring, volumes in which head motion exceeded a threshold were withheld from GLM estimation. The effects of each method were explored in several task fMRI data sets and compared using indicators of data quality and signal-to-noise ratio. Motion censoring decreased variance in parameter estimates within- and across-subjects, reduced residual error in GLM estimation, and increased the magnitude of statistical effects. Motion censoring performed better than all forms of motion regression and also performed well across a variety of parameter spaces, in GLMs with assumed or unassumed response shapes. We conclude that motion censoring improves the quality of task fMRI data and can be a valuable processing step in studies involving populations with even mild amounts of head movement. Copyright © 2013 Wiley Periodicals, Inc.
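
    A minimal sketch of the censoring step described above: framewise displacement (FD) is computed from six realignment parameters following the common Power et al. convention (rotations converted to millimetres on a 50 mm sphere), and volumes exceeding an arbitrary threshold are flagged for exclusion from GLM estimation. The parameter trace is simulated.

```python
# Hedged sketch of motion censoring via framewise displacement.
import numpy as np

rng = np.random.default_rng(5)
T = 200
# Fake realignment parameters: x, y, z (mm) and pitch, roll, yaw (radians).
params = np.cumsum(rng.normal(0, 0.02, size=(T, 6)), axis=0)

d = np.abs(np.diff(params, axis=0))
d[:, 3:] *= 50.0                         # radians -> mm on a 50 mm sphere
fd = np.concatenate([[0.0], d.sum(axis=1)])

keep = fd < 0.5                          # censor volumes with FD >= 0.5 mm (arbitrary)
print(f"censored {np.sum(~keep)} of {T} volumes")
# 'keep' can then mask both the data and the design matrix before GLM estimation.
```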

  13. A Profile of Romanian Highly Educated Eco-Consumers Interested in Product Recycling A Statistical Approach

    Directory of Open Access Journals (Sweden)

    Simionescu Mihaela

    2014-07-01

    Full Text Available The objective of this research is to create a profile of the Romanian eco-consumer with university education. The profile is not limited to information regarding the environmental and economic benefits of recycling, but focuses on ecological behaviour. A detailed statistical analysis was made based on a large representative sample of respondents with secondary and university education. Indeed, the tendency toward practical eco-behaviour is more pronounced among people with university education. For people over 30, the chance of being aware of the significance of the recycling symbols on packages decreases, with the lowest chance among people aged over 50. The respondents who are interested in environmental protection buy products with ecological symbols. However, those who already know the meaning of these symbols do not buy this type of product for ecological reasons, even if they are interested in environmental protection. This research also describes its results in detail, giving respondents an opportunity to learn more about the meaning of the recycling symbols and providing a guideline for consumers. The study achieves two main goals: an ecological one (eco-consumers were identified and ordinary consumers were attracted through ecological behaviour) and an economic one (resource allocation will be more efficient, and marketers will be able to address eco-consumers who have specific characteristics).

  14. Matched case-control studies: a review of reported statistical methodology

    Directory of Open Access Journals (Sweden)

    Niven DJ

    2012-04-01

    Full Text Available Daniel J Niven, Luc R Berthiaume, Gordon H Fick, Kevin B Laupland (Department of Critical Care Medicine, Peter Lougheed Centre, Calgary; Department of Community Health Sciences, University of Calgary, Calgary, Alberta, Canada). Background: Case-control studies are a common and efficient means of studying rare diseases or illnesses with long latency periods. Matching of cases and controls is frequently employed to control the effects of known potential confounding variables. The analysis of matched data requires specific statistical methods. Methods: The objective of this study was to determine the proportion of published, peer reviewed matched case-control studies that used statistical methods appropriate for matched data. Using a comprehensive set of search criteria we identified 37 matched case-control studies for detailed analysis. Results: Among these 37 articles, only 16 studies were analyzed with proper statistical techniques (43%). Studies that were properly analyzed were more likely to have included case patients with cancer and cardiovascular disease compared to those that did not use proper statistics (10/16 or 63%, versus 5/21 or 24%, P = 0.02). They were also more likely to have matched multiple controls for each case (14/16 or 88%, versus 13/21 or 62%, P = 0.08). In addition, studies with properly analyzed data were more likely to have been published in a journal with an impact factor listed in the top 100 according to the Journal Citation Reports index (12/16 or 69%, versus 1/21 or 5%, P ≤ 0.0001). Conclusion: The findings of this study raise concern that the majority of matched case-control studies report results that are derived from improper statistical analyses. This may lead to errors in estimating the relationship between a disease and exposure, as well as the incorrect adaptation of emerging medical literature. Keywords: case-control, matched, dependent data, statistics
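
    For 1:1 matched data, a proper analysis such as McNemar's test uses only the discordant pairs, unlike an (incorrect) unmatched chi-square on the pooled counts. The sketch below uses invented pair counts.

```python
# Hedged sketch: McNemar's test for 1:1 matched case-control pairs.
import numpy as np
from scipy import stats

# 2x2 table of exposure within (case, control) pairs:
#                  control exposed   control unexposed
# case exposed           a=30              b=25
# case unexposed         c=10              d=35
b, c = 25, 10                            # only the discordant pairs matter
chi2 = (b - c) ** 2 / (b + c)            # McNemar statistic (1 df)
p = stats.chi2.sf(chi2, df=1)
odds_ratio = b / c                       # conditional (matched) odds ratio
print(f"chi2={chi2:.2f}, p={p:.3f}, matched OR={odds_ratio:.2f}")
```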

  15. Selecting the most appropriate inferential statistical test for your quantitative research study.

    Science.gov (United States)

    Bettany-Saltikov, Josette; Whittaker, Victoria Jane

    2014-06-01

    To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data which necessitates the selection of both descriptive and inferential statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to specific statistical tests. Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.

  16. Statistical learning of multisensory regularities is enhanced in musicians: An MEG study.

    Science.gov (United States)

    Paraskevopoulos, Evangelos; Chalas, Nikolas; Kartsidis, Panagiotis; Wollbrink, Andreas; Bamidis, Panagiotis

    2018-07-15

    The present study used magnetoencephalography (MEG) to identify the neural correlates of audiovisual statistical learning, while disentangling the differential contributions of uni- and multi-modal statistical mismatch responses in humans. The applied paradigm was based on a combination of a statistical learning paradigm and a multisensory oddball one, combining an audiovisual, an auditory and a visual stimulation stream, along with the corresponding deviances. Plasticity effects due to musical expertise were investigated by comparing the behavioral and MEG responses of musicians to non-musicians. The behavioral results indicated that the learning was successful for both musicians and non-musicians. The unimodal MEG responses are consistent with previous studies, revealing the contribution of Heschl's gyrus for the identification of auditory statistical mismatches and the contribution of medial temporal and visual association areas for the visual modality. The cortical network underlying audiovisual statistical learning was found to be partly common and partly distinct from the corresponding unimodal networks, comprising right temporal and left inferior frontal sources. Musicians showed enhanced activation in superior temporal and superior frontal gyrus. Connectivity and information processing flow amongst the sources comprising the cortical network of audiovisual statistical learning, as estimated by transfer entropy, was reorganized in musicians, indicating enhanced top-down processing. This neuroplastic effect showed a cross-modal stability between the auditory and audiovisual modalities. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Microvariability in AGNs: study of different statistical methods - I. Observational analysis

    Science.gov (United States)

    Zibecchi, L.; Andruchow, I.; Cellone, S. A.; Carpintero, D. D.; Romero, G. E.; Combi, J. A.

    2017-05-01

    We present the results of a study of different statistical methods currently used in the literature to analyse the (micro)variability of active galactic nuclei (AGNs) from ground-based optical observations. In particular, we focus on the comparison between the results obtained by applying the so-called C and F statistics, which are based on the ratio of standard deviations and variances, respectively. The motivation for this is that the implementation of these methods leads to different and contradictory results, making the variability classification of the light curves of a certain source dependent on the statistics implemented. For this purpose, we re-analyse the results on an AGN sample observed over several sessions with the 2.15 m 'Jorge Sahade' telescope (CASLEO), San Juan, Argentina. For each AGN, we constructed the nightly differential light curves. We thus obtained a total of 78 light curves for 39 AGNs, and we then applied the statistical tests mentioned above, in order to re-classify the variability state of these light curves and in an attempt to find the suitable statistical methodology to study photometric (micro)variations. We conclude that, although the C criterion is not a proper statistical test, it could still be a suitable parameter to detect variability and that its application allows us to get more reliable variability results, in contrast with the F test.
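
    A minimal sketch of the two criteria being compared: C is the ratio of standard deviations of the target and control differential light curves, while F is the ratio of their variances, with a p-value from the F distribution. The magnitudes below are simulated stand-ins for real photometry, and the 2.576 cut-off is the convention commonly used with the C criterion.

```python
# Hedged sketch of the C and F variability criteria for differential light curves.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
target = rng.normal(0.0, 0.012, 60)      # AGN minus comparison star (mag)
control = rng.normal(0.0, 0.010, 60)     # comparison minus check star (mag)

C = target.std(ddof=1) / control.std(ddof=1)   # C "criterion"
F = target.var(ddof=1) / control.var(ddof=1)   # F statistic
p = stats.f.sf(F, len(target) - 1, len(control) - 1)

print(f"C = {C:.2f} (variable if C > 2.576 in the usual 99% convention)")
print(f"F = {F:.2f}, p = {p:.3f}")
```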

  18. Trends in study design and the statistical methods employed in a leading general medicine journal.

    Science.gov (United States)

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2018-02-01

    Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. The study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate study design and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Under the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (eg Kaplan-Meier estimator and Cox regression model) were most frequently applied, the Gray test and Fine-Gray proportional hazard model for considering competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. Single imputation methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in the light of the information found in some publications. Use of adaptive design with interim analyses is increasing.

  19. Statistics and Probability at Secondary Schools in the Federal State of Salzburg: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Wolfgang Voit

    2014-12-01

    Full Text Available Knowledge about the practical use of statistics and probability in today's mathematics instruction at secondary schools is vital in order to improve the academic education of future teachers. We have conducted an empirical study among school teachers to inform improvements in mathematics instruction and teacher preparation. The study provides a snapshot of the daily practice of instruction at school. The status of statistics and probability was examined around the following four questions: Where did the current mathematics teachers study? What relevance do statistics and probability have in school? Which contents are actually taught in class? What kind of continuing education would be desirable for teachers? The study population consisted of all teachers of mathematics at secondary schools in the federal state of Salzburg.

  20. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  1. Information theory and statistics

    CERN Document Server

    Kullback, Solomon

    1968-01-01

    Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.

  2. Statistical gamma-ray decay studies at iThemba LABS

    Directory of Open Access Journals (Sweden)

    Wiedeking M.

    2017-01-01

    Full Text Available A program to study the γ-ray decay from the region of high level density has been established at iThemba LABS, where a high-resolution gamma-ray detector array is used in conjunction with silicon particle-telescopes. Results from two recent projects are presented: (1) The 74Ge(α,α′γ) reaction was used to investigate the Pygmy Dipole Resonance. The results were compared to (γ,γ′) data and indicate that the dipole states split into mixed-isospin and relatively pure isovector excitations. (2) Data from the 95Mo(d,p) reaction were used to develop a novel method for the determination of spins for low-lying discrete levels utilizing statistical γ-ray decay in the vicinity of the neutron separation energy. These results provide insight into the competition of (γ,n) and (γ,γ′) reactions and highlight the need to correct for angular-momentum barrier effects.

  3. Multivariate statistical process control in product quality review assessment - A case study.

    Science.gov (United States)

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against the quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart; it neglects interactions between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, in which 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches are under control, while the MSPC, based on Principal Component Analysis (PCA), for the data being either autoscaled or robust scaled, showed four and seven batches, respectively, out of the Hotelling T2 95% ellipse. Also, an improvement of the capability of the process is observed without the most extreme batches. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
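
    The sketch below shows one common way to build such a chart: PCA on the autoscaled batch-by-assay matrix, Hotelling T2 computed from the retained scores, and an F-distribution-based control limit. The data are simulated (164 batches by six actives, matching the study's shape only), and the number of retained components is an arbitrary choice.

```python
# Hedged sketch: PCA-based Hotelling T^2 chart for multivariate batch review.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, m, k = 164, 6, 2                       # batches, actives, retained components
X = rng.normal(100, 2, size=(n, m))       # simulated assays (% label claim)
Xc = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # autoscaling

# PCA via SVD; T^2 from the first k score columns.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :k] * s[:k]
lam = (s[:k] ** 2) / (n - 1)              # eigenvalues of the covariance matrix
T2 = np.sum(scores**2 / lam, axis=1)

# 95% control limit for T^2 with k components (F-distribution form).
ucl = k * (n - 1) * (n + 1) / (n * (n - k)) * stats.f.ppf(0.95, k, n - k)
print("batches outside the 95% ellipse:", np.where(T2 > ucl)[0])
```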

  4. Statistical analysis of the limitation of half integer resonances on the available momentum acceptance of the High Energy Photon Source

    Energy Technology Data Exchange (ETDEWEB)

    Jiao, Yi, E-mail: jiaoyi@ihep.ac.cn; Duan, Zhe

    2017-01-01

    In a diffraction-limited storage ring, half integer resonances can have strong effects on the beam dynamics, associated with the large detuning terms from the strong focusing and strong sextupoles as required for an ultralow emittance. In this study, the limitation of half integer resonances on the available momentum acceptance (MA) was statistically analyzed based on one design of the High Energy Photon Source (HEPS). It was found that the probability of MA reduction due to crossing of half integer resonances is closely correlated with the level of beta beats at the nominal tunes, but independent of the error sources. The analysis indicated that for the presented HEPS lattice design, the rms amplitude of beta beats should be kept below 1.5% horizontally and 2.5% vertically to reach a small MA reduction probability of about 1%.

  5. Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.

    Science.gov (United States)

    Yalch, Matthew M

    2016-03-01

    Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. (c) 2016 APA, all rights reserved.
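
    As a small illustration of the kind of solution described above, the following sketch computes a grid posterior for a mean under a heavy-tailed Student-t likelihood, which keeps a small, non-normal sample with an outlier from dominating the estimate; the data, prior, degrees of freedom, and scale are all invented.

```python
# Hedged sketch: Bayesian grid posterior for a mean with a robust t likelihood.
import numpy as np
from scipy import stats

y = np.array([2.1, 2.4, 1.9, 2.2, 8.5])          # small sample with an outlier
mu_grid = np.linspace(0, 10, 2001)

# Flat prior on mu; the t likelihood (df=3) downweights the outlier.
log_post = np.array([stats.t.logpdf(y, df=3, loc=mu, scale=1.0).sum()
                     for mu in mu_grid])
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, mu_grid)                  # normalize the posterior density

mean = np.trapz(mu_grid * post, mu_grid)
print(f"posterior mean of mu: {mean:.2f} (sample mean is {y.mean():.2f})")
```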

  6. Methodological Problems Of Statistical Study Of Regional Tourism And Tourist Expenditure

    Directory of Open Access Journals (Sweden)

    Anton Olegovich Ovcharov

    2015-03-01

    Full Text Available The aim of this work is to analyse the problems of regional tourism statistics. The subject of the research is tourism expenditure, the specifics of its recording and modeling. The methods of statistical observation and factor analysis are used. The article shows the features and directions of the statistical methodology of tourism. A brief review of international publications on statistical studies of tourist expenditure is made. It summarizes the data from different statistical forms and shows the positive and negative trends in the development of tourism in Russia. It is concluded that the tourist industry in Russia is focused on outbound tourism rather than on inbound or internal tourism. The features of statistical accounting and statistical analysis of tourism expenditure in Russian and international statistics are described. To assess the level of development of regional tourism, the use of a tourism efficiency coefficient is proposed. The reasons for the prevalence of imports over exports of tourism services are revealed using balance-of-payments data; this is due to the raw-material orientation of Russian exports and the low share of the “Services” account in the structure of the balance of payments. An additive model is also proposed that describes the influence of three factors on changes in tourist expenditure: the number of trips, the cost of a trip, and structural changes in destinations and travel purposes. On the basis of data from 2012–2013, we estimate the strength and direction of the influence of each factor. Testing of the model showed that the increase in tourism exports was caused by the combined positive impact of all three factors, chief among which is the growing number of foreigners who visited Russia during the period concerned.

  7. Dental Calculus Links Statistically to Angina Pectoris: 26-Year Observational Study.

    Science.gov (United States)

    Söder, Birgitta; Meurman, Jukka H; Söder, Per-Östen

    2016-01-01

    Dental infections, such as periodontitis, associate with atherosclerosis and its complications. We studied a cohort followed up since 1985 for the incidence of angina pectoris, with the hypothesis that calculus accumulation, a proxy for poor oral hygiene, is linked to this symptom. Our Swedish prospective cohort study comprised 1676 randomly selected subjects followed up for 26 years. In 1985 all subjects underwent a clinical oral examination and answered a questionnaire assessing background variables such as socio-economic status and pack-years of smoking. Using data from the Center of Epidemiology, Swedish National Board of Health and Welfare, Sweden, we analyzed the association of oral health parameters with the prevalence of in-hospital verified angina pectoris classified according to the WHO International Classification of Diseases, using descriptive statistics and logistic regression analysis. Of the 1676 subjects, 51 (28 women/23 men) had been diagnosed with angina pectoris at a mean age of 59.8 ± 2.9 years. No difference was observed in age and gender between patients with angina pectoris and subjects without. Neither was there any difference in education level, smoking habits (in pack-years), Gingival index or Plaque index between the groups. Angina pectoris patients had significantly more often had their first maxillary molar tooth (d. 16) extracted than the other subjects (p = 0.02). Patients also showed significantly higher dental calculus index values than the subjects without angina pectoris (p = 0.01). Multiple regression analysis showed an odds ratio of 2.21 (95% confidence interval 1.17-4.17) for the association between a high calculus index and angina pectoris (p = 0.015). Our study hypothesis was confirmed by showing for the first time that a high dental calculus score was indeed associated with the incidence of angina pectoris in this cohort.

  8. Gauge invariant lattice quantum field theory: Implications for statistical properties of high frequency financial markets

    Science.gov (United States)

    Dupoyet, B.; Fiebig, H. R.; Musgrove, D. P.

    2010-01-01

    We report on initial studies of a quantum field theory defined on a lattice with multi-ladder geometry and the dilation group as a local gauge symmetry. The model is relevant in the cross-disciplinary area of econophysics. A corresponding proposal by Ilinski aimed at gauge modeling in non-equilibrium pricing is implemented in a numerical simulation. We arrive at a probability distribution of relative gains which matches the high frequency historical data of the NASDAQ stock exchange index.

  9. Interactions among Knowledge, Beliefs, and Goals in Framing a Qualitative Study in Statistics Education

    Science.gov (United States)

    Groth, Randall E.

    2010-01-01

    In the recent past, qualitative research methods have become more prevalent in the field of statistics education. This paper offers thoughts on the process of framing a qualitative study by means of an illustrative example. The decisions that influenced the framing of a study of pre-service teachers' understanding of the concept of statistical…

  10. The kid, the clerk, and the gambler: Critical studies in statistics and cognitive science

    NARCIS (Netherlands)

    Madsen, M.W.

    2015-01-01

    This dissertation presents a series of case studies in linguistics, psychology, and statistics. These case studies take up a variety of theories, concepts, and debates, and in each case attempt to shed new light on these topics by consistently focusing on foundational issues.

  11. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day

    Science.gov (United States)

    Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann

    2013-01-01

    Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…
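
    The abstract does not say which control chart the authors used; a minimal sketch of one common choice for individual measurements, the X (individuals) chart with limits derived from the average moving range, applied to hypothetical %SS data:

```python
import numpy as np

def individuals_chart_limits(x):
    """X-chart (individuals) limits from the average moving range."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))                 # moving ranges of consecutive points
    center = x.mean()
    ucl = center + 2.66 * mr.mean()         # 2.66 = 3/d2, d2 = 1.128 for n = 2
    lcl = max(center - 2.66 * mr.mean(), 0.0)  # %SS cannot be negative
    return center, lcl, ucl

# Hypothetical %SS measurements for one speaker across a single day.
pct_ss = [4.1, 3.8, 5.0, 4.4, 6.9, 4.0, 3.5, 4.8, 5.2, 4.3]
center, lcl, ucl = individuals_chart_limits(pct_ss)
out = [v for v in pct_ss if v < lcl or v > ucl]
print(f"CL={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, out-of-control: {out}")
```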

  12. A statistical/computational/experimental approach to study the microstructural morphology of damage

    NARCIS (Netherlands)

    Hoefnagels, J. P. M.; Du, C.; de Geus, T. W. J.; Peerlings, R. H. J.; Geers, M. G. D.; Beese, A.M.; Zehnder, A.T.; Xia, Sh.

    2016-01-01

    The fracture behavior of multi-phase materials is not well understood. Therefore, a statistical study of micro-failures is conducted to deepen our insight into the failure mechanisms. We systematically studied the influence of the morphology of dual-phase (DP) steel on the fracture behavior at the

  13. A new universality class in corpus of texts; A statistical physics study

    Science.gov (United States)

    Najafi, Elham; Darooneh, Amir H.

    2018-05-01

    Text can be regarded as a complex system, and several methods from statistical physics can be used to study it. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observed a power-law relation between the fractality of a text and its vocabulary size, for both texts and corpora. We also observed this behavior in studying biological data.

  14. Mesoscale modeling of smoke transport over Central Africa: influences of trade winds, subtropical high, ITCZ and vertical statistics

    Science.gov (United States)

    Yang, Z.; Wang, J.; Hyer, E. J.; Ichoku, C. M.

    2012-12-01

    A fully-coupled meteorology-chemistry-aerosol model, the Weather Research and Forecasting model with Chemistry (WRF-Chem), is used to simulate the transport of smoke aerosol over Central Africa during February 2008. The smoke emission used in this study is specified from the Fire Locating and Modeling of Burning Emissions (FLAMBE) database derived from Moderate Resolution Imaging Spectroradiometer (MODIS) fire products. Model performance is evaluated using MODIS true color images, measured Aerosol Optical Depth (AOD) from the space-borne MODIS (550 nm) and ground-based AERONET (500 nm), and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) level 1 and 2 products. The simulated smoke transport is in good agreement with the validation data. Analysis of three smoke events shows that, near the surface, smoke is confined to a narrow belt between the Equator and 10°N by the interplay of the trade winds, the subtropical high, and the ITCZ; at the 700 hPa level, smoke expands farther meridionally. Topography blocks smoke transport to the southeast of the study area because of the high mountains near the Great Rift Valley. The simulation with an injection height of 650 m is consistent with CALIOP measurements. The particular phenomenon of aerosol above cloud is studied statistically from CALIOP observations: the total percentage of aerosol above cloud is about 5%.

  15. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    International Nuclear Information System (INIS)

    Clerc, F; Njiki-Menga, G-H; Witschger, O

    2013-01-01

    Most of the measurement strategies suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring airborne particle concentrations in real time (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time-resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature, ranging from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the search for an appropriate and robust method continues. In this context, this exploratory study investigates a statistical method to analyse time-resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, we used particle number concentration data from a workplace study that investigated the potential for exposure via inhalation during cleanout operations, by sandpapering, of a reactor producing nanocomposite thin films. In this workplace study, the background issue was addressed through near-field and far-field approaches, and several size-integrated and time-resolved devices were used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters. While one was measuring at the source of the released particles, the other was measuring far-field in parallel. The Bayesian probabilistic approach allows a probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions. The probability distributions issuing from time-resolved data obtained at the source can be compared with those issuing from the time-resolved data obtained far-field, leading in a…
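
    The paper's Bayesian model is not given in the abstract. As an illustration only, a minimal conjugate (Poisson-Gamma) sketch of comparing near-field and far-field count distributions; all counts are synthetic and scipy is assumed available:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical 1-s particle counts from two handheld CPCs.
near_field = rng.poisson(lam=520, size=300)   # at the source
far_field = rng.poisson(lam=480, size=300)    # background, measured in parallel

def posterior(counts, a0=0.5, b0=1e-6):
    """Gamma posterior for a Poisson rate with a vague Gamma(a0, b0) prior."""
    return stats.gamma(a=a0 + counts.sum(), scale=1.0 / (b0 + counts.size))

post_near, post_far = posterior(near_field), posterior(far_field)
draws = 100_000
excess = post_near.rvs(draws, random_state=2) - post_far.rvs(draws, random_state=3)
print(f"P(source rate > background) = {(excess > 0).mean():.3f}")
print(f"95% credible interval for the excess rate: "
      f"({np.percentile(excess, 2.5):.1f}, {np.percentile(excess, 97.5):.1f})")
```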

  16. Studying the microlenses mass function from statistical analysis of the caustic concentration

    Energy Technology Data Exchange (ETDEWEB)

    Mediavilla, T; Ariza, O [Departamento de Estadistica e Investigacion Operativa, Universidad de Cadiz, Avda de Ramon Puyol, s/n 11202 Algeciras (Spain); Mediavilla, E [Instituto de Astrofisica de Canarias, Avda Via Lactea s/n, La Laguna (Spain); Munoz, J A, E-mail: teresa.mediavilla@ca.uca.es, E-mail: octavio.ariza@uca.es, E-mail: emg@iac.es [Departamento de Astrofisica y Astronomia, Universidad de Valencia, Burjassot, Valencia (Spain)

    2011-09-22

    The statistical distribution of caustic crossings by the images of a lensed quasar depends on the properties of the distribution of microlenses in the lens galaxy. We use a procedure based on Inverse Polygon Mapping to easily identify the critical and caustic curves generated by a distribution of stars in the lens galaxy. We analyze the statistical distributions of the number of caustic crossings by a pixel-size source for several projected mass densities and different mass distributions. We compare the results of simulations with theoretical binomial distributions. Finally, we apply this method to the study of the stellar mass distribution in the lens galaxy of QSO 2237+0305.

  17. [Confidence interval or p-value--similarities and differences between two important methods of statistical inference of quantitative studies].

    Science.gov (United States)

    Harari, Gil

    2014-01-01

    Statistical significance, also known as the p-value, and the CI (confidence interval) are common statistical measures and are essential for the statistical analysis of studies in medicine and the life sciences. These measures provide complementary information about the statistical probability and conclusions regarding the clinical significance of study findings. This article is intended to describe the methodologies, compare the methods, assess their suitability for the different needs of study-results analysis, and explain situations in which each method should be used.
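
    A minimal sketch of the complementarity described above, computing both the p-value and the 95% confidence interval for the same two-sample comparison (all data synthetic; scipy assumed available):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
group_a = rng.normal(120.0, 15.0, 40)   # e.g. blood pressure under treatment A
group_b = rng.normal(128.0, 15.0, 40)   # under treatment B

t, p = stats.ttest_ind(group_a, group_b)          # significance test

diff = group_a.mean() - group_b.mean()            # estimate and its precision
se = np.sqrt(group_a.var(ddof=1) / len(group_a) +
             group_b.var(ddof=1) / len(group_b))
df = len(group_a) + len(group_b) - 2
tcrit = stats.t.ppf(0.975, df)
print(f"p = {p:.4f}")
print(f"95% CI for the mean difference: "
      f"({diff - tcrit * se:.1f}, {diff + tcrit * se:.1f})")
```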

  18. State of Business Statistics Education in MENA Region: A Comparative Study with Best Practices

    Science.gov (United States)

    Hijazi, Rafiq; Zoubeidi, Taoufik

    2017-01-01

    Purpose: The purpose of this study is to investigate the state of undergraduate business statistics education in the Middle East and North Africa (MENA) and assess its alignment with the best practices in equipping business graduates with the knowledge and skills demanded by the labor market. Design/methodology/approach: A survey of 108…

  19. Learning Curves and Bootstrap Estimates for Inference with Gaussian Processes: A Statistical Mechanics Study

    DEFF Research Database (Denmark)

    Malzahn, Dorthe; Opper, Manfred

    2003-01-01

    We employ the replica method of statistical physics to study the average-case performance of learning systems. The new feature of our theory is that general distributions of data can be treated, which enables applications to real data. For a class of Bayesian prediction models which are based on Gaussian processes, we discuss Bootstrap estimates for learning curves.

  20. Nonparametric and group-based person-fit statistics : a validity study and an empirical example

    NARCIS (Netherlands)

    Meijer, R.R.

    1994-01-01

    In person-fit analysis, the object is to investigate whether an item score pattern is improbable given the item score patterns of the other persons in the group or given what is expected on the basis of a test model. In this study, several existing group-based statistics to detect such improbable

  1. Study of relationship between MUF correlation and detection sensitivity of statistical analysis

    International Nuclear Information System (INIS)

    Tamura, Toshiaki; Ihara, Hitoshi; Yamamoto, Yoichi; Ikawa, Koji

    1989-11-01

    Various kinds of statistical analysis have been proposed for NRTA (Near Real Time Materials Accountancy), which was devised to satisfy the timeliness detection goal of the IAEA. Different statistical analysis results can be presumed to occur between the case of rigorous error propagation (with MUF correlation) and the case of simplified error propagation (without MUF correlation). Therefore, measurement simulation and decision analysis were carried out using a flow simulation of an 800 MTHM/Y model reprocessing plant, and the relationship between MUF correlation and the detection sensitivity and false alarm rate of the statistical analysis was studied. The specific character of material accountancy for the 800 MTHM/Y model reprocessing plant was captured by this simulation. It also became clear that MUF correlation decreases not only the false alarm rate but also the detection probability for protracted loss in the case of the CUMUF test and Page's test applied to NRTA. (author)

  2. Ionization-potential depression and other dense plasma statistical property studies - Application to spectroscopic diagnostics.

    Science.gov (United States)

    Calisti, Annette; Ferri, Sandrine; Mossé, Caroline; Talin, Bernard

    2017-02-01

    The radiative properties of an emitter surrounded by a plasma, are modified through various mechanisms. For instance the line shapes emitted by bound-bound transitions are broadened and carry useful information for plasma diagnostics. Depending on plasma conditions the electrons occupying the upper quantum levels of radiators no longer exist as they belong to the plasma free electron population. All the charges present in the radiator environment contribute to the lowering of the energy required to free an electron in the fundamental state. This mechanism is known as ionization potential depression (IPD). The knowledge of IPD is useful as it affects both the radiative properties of the various ionic states and their populations. Its evaluation deals with highly complex n-body coupled systems, involving particles with different dynamics and attractive ion-electron forces. A classical molecular dynamics (MD) code, the BinGo-TCP code, has been recently developed to simulate neutral multi-component (various charge state ions and electrons) plasma accounting for all the charge correlations. In the present work, results on IPD and other dense plasma statistical properties obtained using the BinGo-TCP code are presented. The study focuses on aluminum plasmas for different densities and several temperatures in order to explore different plasma coupling conditions.

  3. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes and the determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles for the achievement of six-sigma-capable processes is possible. Statistical process control is the most advantageous tool for determining the quality of any production process. This tool is new for the pharmaceutical tablet production process, where the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical…
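
    A minimal sketch of the capability and sigma-level computation described above; the tablet weights and specification limits are hypothetical, and the short-term convention Z = 3 Cpk is one of several in use:

```python
import numpy as np

def capability(x, lsl, usl):
    """Cp, Cpk and the short-term sigma level of a process."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)                # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # accounts for centering
    return cp, cpk, 3 * cpk                       # sigma level Z = 3 * Cpk

rng = np.random.default_rng(5)
weights = rng.normal(250.0, 2.5, 200)             # hypothetical tablet weights, mg
cp, cpk, z = capability(weights, lsl=242.5, usl=257.5)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, sigma level = {z:.1f}")
```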

  4. Polarizing a stored proton beam by spin flip? - A high statistic reanalysis

    International Nuclear Information System (INIS)

    Oellers, Dieter

    2011-01-01

    Prompted by recent, conflicting calculations, we have carried out a measurement of the spin-flip cross section in low-energy electron-proton scattering. The experiment uses the cooling electron beam at COSY as an electron target. A reanalysis of the data leads to reduced statistical errors, resulting in a factor-of-4 smaller upper limit for the spin-flip cross section. The measured cross sections are too small for spin flip to be a viable tool for polarizing a stored beam.

  5. Study of groundwater arsenic pollution in Lanyang Plain using multivariate statistical analysis

    Science.gov (United States)

    chan, S.

    2013-12-01

    The study area, the Lanyang Plain in eastern Taiwan, has highly developed agriculture and aquaculture, which consume over 70% of the water supply. Groundwater is frequently considered as an alternative water source. However, the serious arsenic pollution of groundwater in the Lanyang Plain should be well studied to ensure the safety of groundwater usage. In this study, 39 groundwater samples were collected. The hydrochemistry results show two major trends in the Piper diagram. The major trend, containing most of the groundwater samples, has water types between Ca+Mg-HCO3 and Na+K-HCO3; this can be explained by cation exchange reactions. The minor trend clearly corresponds to seawater intrusion, with a water type of Na+K-Cl, as the localities of these samples are all in the coastal area. Multivariate statistical analysis of the hydrochemical data was conducted to further explore the mechanism of arsenic contamination. Two major factors can be extracted with factor analysis: the major factor includes Ca, Mg, and Sr, while the minor factor includes Na, K, and As. This reconfirms that cation exchange reactions mainly control the groundwater hydrochemistry in the study area. It is worth noting that arsenic is positively related to Na and K. The result of cluster analysis shows that groundwater samples with high arsenic concentrations can be grouped with those with high Na, K, and HCO3. This supports the view that cation exchange enhances the release of arsenic, and excludes the effect of seawater intrusion; in other words, the water-rock reaction time is key to obtaining higher arsenic content. In general, the major sources of arsenic in sediments include exchangeable, reducible, and oxidizable phases, which are adsorbed ions, Fe-Mn oxides, and organic matters/pyrite, respectively. However, the results of factor analysis do not show an apparent correlation between arsenic and Fe/Mn. This may exclude Fe-Mn oxides as a major source of arsenic. The other sources…

  6. Statistical inferences under the Null hypothesis: Common mistakes and pitfalls in neuroimaging studies.

    Directory of Open Access Journals (Sweden)

    Jean-Michel eHupé

    2015-02-01

    Full Text Available Published studies using functional and structural MRI include many errors in the way data are analyzed and conclusions reported. This was observed when working on a comprehensive review of the neural bases of synesthesia, but these errors are probably endemic to neuroimaging studies. All studies reviewed had based their conclusions on Null Hypothesis Significance Tests (NHST). NHST have been criticized since their inception because they are more appropriate for taking decisions related to a Null hypothesis (as in manufacturing) than for making inferences about behavioral and neuronal processes. Here I focus on a few key problems of NHST related to brain imaging techniques, and explain why or when we should not rely on significance tests. I also observed that the ill-posed logic of NHST was often not even correctly applied, and describe what I identified as common mistakes, or at least problematic practices, in published papers, in light of what could be considered the very basics of statistical inference. MRI statistics also involve much more complex issues than standard statistical inference. Analysis pipelines vary a lot between studies, even for those using the same software, and there is no consensus on which pipeline is best. I propose a synthetic view of the logic behind the possible methodological choices, and warn against the usage and interpretation of two statistical methods popular in brain imaging studies, the false discovery rate (FDR) procedure and permutation tests. I suggest that current models for the analysis of brain imaging data suffer from serious limitations and call for a revision taking into account the new statistics (confidence intervals) logic.

  7. Study of the effects of photoelectron statistics on Thomson scattering data

    International Nuclear Information System (INIS)

    Hart, G.W.; Levinton, F.M.; McNeill, D.H.

    1986-01-01

    A computer code has been developed which simulates a Thomson scattering measurement, from the counting statistics of the input channels through the mathematical analysis of the data. The scattered and background signals in each of the wavelength channels are assumed to obey Poisson statistics, and the spectral data are fitted to a Gaussian curve using a nonlinear least-squares fitting algorithm. This method goes beyond the usual calculation of the signal-to-noise ratio for the hardware and gives a quantitative measure of the effect of the noise on the final measurement. This method is applicable to Thomson scattering measurements in which the signal-to-noise ratio is low due to either low signal or high background. Thomson scattering data from the S-1 spheromak have been compared to this simulation, and they have been found to be in good agreement. This code has proven to be useful in assessing the effects of counting statistics relative to shot-to-shot variability in producing the observed spread in the data. It was also useful for designing improvements for the S-1 Thomson scattering system, and this method would be applicable to any measurement affected by counting statistics.
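
    A minimal sketch of the kind of simulation described above: Poisson noise on an 8-channel Gaussian spectrum, refit many times to see how counting statistics spread the fitted width. All channel counts and numbers are illustrative, not the S-1 system's; scipy is assumed available.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(lam, amp, lam0, width):
    return amp * np.exp(-0.5 * ((lam - lam0) / width) ** 2)

rng = np.random.default_rng(6)
lam = np.linspace(-40, 40, 8)           # channel wavelength offsets, a.u.
true = gaussian(lam, amp=200, lam0=0.0, width=15.0)
background = 30.0

widths = []
for _ in range(1000):
    # scattered + background signals both obey Poisson statistics
    counts = rng.poisson(true + background) - background
    try:
        popt, _ = curve_fit(gaussian, lam, counts, p0=(150, 0.0, 10.0))
        widths.append(abs(popt[2]))
    except RuntimeError:
        pass                            # fit failed for this realization

widths = np.array(widths)
print(f"fitted width: {widths.mean():.2f} +/- {widths.std(ddof=1):.2f} (true 15.0)")
```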

  8. Statistical study of the reproductive hormones in relation to age and PCOS for patients undergoing in vitro investigation in Khartoum

    International Nuclear Information System (INIS)

    Abdelgadir, O. M.

    2002-09-01

    In this study, 587 Sudanese women referred to gynecological clinics as infertile cases were studied. Hormonal investigations were done for them: prolactin (PRL), follicle stimulating hormone (FSH), and luteinizing hormone (LH) levels were analyzed at the Sudan Atomic Energy Commission (SAEC) lab with the radioimmunoassay (RIA) method. The objective of this study was to find the relation between age and both hyperprolactinemia and polycystic ovary syndrome (PCOS). Statistical analysis was done with the SPSS computer program. Of the 587 patients, 39.2% had high prolactin levels (hyperprolactinemia, >370 mU/l), and 10% of these were aged between 25 and 30 years. High FSH levels (>8 mU/l) were most frequent in the 30-35 age group. 29.1% of the patients were found to have a high LH/FSH ratio, a clear indication of polycystic ovary syndrome (PCOS). (Author)

  9. Statistical Optimization of Medium Compositions for High Cell Mass and Exopolysaccharide Production by Lactobacillus plantarum ATCC 8014

    Directory of Open Access Journals (Sweden)

    Nor Zalina Othman

    2018-03-01

    Full Text Available Background and Objective: Lactobacillus plantarum ATCC 8014 is known as a good producer of water-soluble exopolysaccharide. The aim of this study was therefore to optimize the medium composition concurrently for high cell mass and exopolysaccharide production by Lactobacillus plantarum ATCC 8014. Both products are useful for food and pharmaceutical applications, yet most studies typically focus on one outcome only; here the optimization was carried out using molasses as a cheaper carbon source. Material and Methods: The main medium components known to significantly affect cell mass and EPS production were selected as variables and statistically optimized based on a Box-Behnken design at shake-flask level. The optimal medium for cell mass and exopolysaccharide production was composed of (in g l-1): molasses, 40; yeast extract, 16.8; phosphate, 2.72; sodium acetate, 3.98. The model was found to be significant and was subsequently validated through growth kinetics studies in un-optimized and optimized media in shake-flask cultivation. Results and Conclusion: The maximum cell mass and exopolysaccharide in the new optimized medium were 4.40 g l-1 and 4.37 g l-1, respectively, after 44 h of cultivation. As a result, cell mass and exopolysaccharide production increased up to 4.5 and 16.5 times, respectively, and a maximal exopolysaccharide yield of 1.19 per gram of cells was obtained when molasses was used as the carbon source. In conclusion, molasses has the potential to be a cheap carbon source for the cultivation of Lactobacillus plantarum ATCC 8014 concurrently for high cell mass and exopolysaccharide production. Conflict of interest: The authors declare no conflict of interest.
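
    A minimal sketch of how a Box-Behnken design like the one above is constructed in coded units; the mapping of the four medium components to columns is hypothetical:

```python
import itertools
import numpy as np

def box_behnken(k, center_points=3):
    """Box-Behnken design in coded units (-1, 0, +1) for k factors."""
    runs = []
    for i, j in itertools.combinations(range(k), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0] * k                 # all other factors at mid level
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k] * center_points     # replicated center points
    return np.array(runs)

# coded design, e.g. molasses, yeast extract, phosphate, sodium acetate
design = box_behnken(k=4)
print(design.shape)   # (27, 4): 24 edge runs + 3 center points
print(design[:4])
```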

  10. The polar cusp from a particle point of view: A statistical study based on Viking data

    International Nuclear Information System (INIS)

    Aparicio, B.; Thelin, B.; Lundin, R.

    1991-01-01

    The authors present results from the particle measurements made on board the Viking satellite. For the period of interest, the Viking orbits covered the whole dayside sector at high latitudes. Data from the Viking V-3 particle experiment acquired during the Polar Region Outer Magnetospheric International Study period have been used to study the extension of the cusp and cleft in magnetic local time and invariant latitude and, furthermore, their dependence on solar wind and interplanetary magnetic field parameters. The study is limited to the MLT range from 0900 to 1500 and to invariant latitudes (ILAT) from 74 degrees to 82 degrees; this region is divided into bins. The authors concentrated on the region where magnetosheath solar wind plasma penetrates more directly into the magnetosphere and is measured at Viking altitudes. This region is called the cusp proper, to be distinguished from a broader region denoted the cleft, where more energetic particles are observed. Statistically, they find the cusp proper to extend from invariant latitudes of 75 degrees to 82 degrees and magnetic local times from 0930 to 1400 MLT. The width in ILAT is found to be on average ∼2 degrees and in MLT ∼2 hours. It is shown that a clear correlation exists between the densities in the cusp proper calculated from the Viking V-3 experiment and those in the solar wind calculated from IMP 8 measurements. It is also shown that the position of the cusp proper in MLT depends on the sense of the By component of the interplanetary magnetic field (IMF By), giving a well-defined displacement of the region of maximum occurrence toward earlier MLTs for one sense of IMF By

  11. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    Science.gov (United States)

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.

  12. Application of the non-extensive statistical approach to high energy particle collisions

    Science.gov (United States)

    Bíró, Gábor; Barnaföldi, Gergely Gábor; Biró, Tamás Sándor; Ürmössy, Károly

    2017-06-01

    In high-energy collisions the number of created particles is far less than the thermodynamic limit, especially in small colliding systems (e.g. proton-proton). Therefore final-state effects and fluctuations in the one-particle energy distribution are appreciable. As a consequence, the characterization of identified hadron spectra with the Boltzmann-Gibbs thermodynamical approach is insufficient [1]. Instead, particle spectra measured in high-energy collisions can be described very well with Tsallis-Pareto distributions, derived from non-extensive thermodynamics [2, 3]. Using the Tsallis q-entropy formula, a generalization of the Boltzmann-Gibbs entropy, we interpret the microscopic physics by analysing the Tsallis q and T parameters. In this paper we give a quick overview of these parameters, analyzing identified hadron spectra from recent years in a wide center-of-mass energy range. We demonstrate that the fitted Tsallis parameters depend on the center-of-mass energy and particle species. Our findings are described well by a QCD-inspired evolution ansatz. Based on this comprehensive study, apart from the evolution, both the mesonic and baryonic components are found to be non-extensive (q > 1), besides the mass-ordered hierarchy observed in the parameter T.
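
    A minimal sketch of fitting the Tsallis-Pareto form to a spectrum. This is simplified (it uses pT where full analyses typically use the transverse mass); the data are synthetic and the parameter values merely typical of such fits.

```python
import numpy as np
from scipy.optimize import curve_fit

def tsallis(pt, A, q, T):
    """Tsallis-Pareto form used for identified hadron pT spectra."""
    return A * (1.0 + (q - 1.0) * pt / T) ** (-1.0 / (q - 1.0))

# Hypothetical pT spectrum (GeV/c vs. arbitrary yield), Tsallis-like
# by construction with 5% multiplicative scatter.
pt = np.linspace(0.3, 6.0, 30)
rng = np.random.default_rng(7)
y = tsallis(pt, A=1.0e3, q=1.10, T=0.16) * rng.normal(1.0, 0.05, pt.size)

popt, pcov = curve_fit(tsallis, pt, y, p0=(1.0e3, 1.05, 0.2))
A, q, T = popt
print(f"q = {q:.3f}, T = {T:.3f} GeV  (generated with q = 1.10, T = 0.16)")
```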

  13. High blood levels of persistent organic pollutants are statistically correlated with smoking

    DEFF Research Database (Denmark)

    Deutch, Bente; Hansen, Jens C.

    1999-01-01

    Persistent Organic Pollutants (11 pesticides and 14 PCB-congeners) and heavy metals (Cd, Cu, Hg, Pb, Se, and Zn) were determined in 175 pregnant women and 160 newborn infants (umbilical cord blood) from Disko Bay, Greenland, 1994-96. Among these, 135 women filled out questionnaires about drinking, smoking and intake of traditional Inuit food. Multiple linear regression analyses showed highly significant positive associations between the mothers' smoking status (never, previous, present) and plasma concentrations of all the studied organic pollutants, both in maternal blood and umbilical cord blood. Traditional food, and not tobacco, is known to be the source of the contaminants, but smoking may influence the enzymatic turnover of toxic substances.

  14. A study of outliers in statistical distributions of mechanical properties of structural steels

    International Nuclear Information System (INIS)

    Oefverbeck, P.; Oestberg, G.

    1977-01-01

    The safety against failure of pressure vessels can be assessed by statistical methods, so-called probabilistic fracture mechanics. The data base for such estimations is admittedly rather meagre, making it necessary to assume certain conventional statistical distributions. Since the failure rates arrived at are low, for nuclear vessels of the order of 10 - to 10 - per year, the extremes of the variables involved, among other things the mechanical properties of the steel used, are of particular interest. A question sometimes raised is whether outliers, or values exceeding the extremes in the assumed distributions, might occur. In order to explore this possibility a study has been made of strength values of three qualities of structural steels, available in samples of up to about 12,000. Statistical evaluation of these samples with respect to outliers, using standard methods for this purpose, revealed the presence of such outliers in most cases, with a frequency of occurrence of, typically, a few values per thousand, estimated by the methods described. Obviously, statistical analysis alone cannot be expected to shed any light on the causes of outliers. Thus, the interpretation of these results with respect to their implication for the probabilistic estimation of the integrity of pressure vessels must await further studies of a similar nature in which the test specimens corresponding to outliers can be recovered and examined metallographically. For the moment the results should be regarded only as a factor to be considered in discussions of the safety of pressure vessels. (author)

  15. Connecting functional and statistical definitions of genotype by genotype interactions in coevolutionary studies

    Directory of Open Access Journals (Sweden)

    Katy Denise Heath

    2014-04-01

    Full Text Available Predicting how species interactions evolve requires that we understand the mechanistic basis of coevolution, and thus the functional genotype-by-genotype interactions (G × G that drive reciprocal natural selection. Theory on host-parasite coevolution provides testable hypotheses for empiricists, but depends upon models of functional G × G that remain loosely tethered to the molecular details of any particular system. In practice, reciprocal cross-infection studies are often used to partition the variation in infection or fitness in a population that is attributable to G × G (statistical G × G. Here we use simulations to demonstrate that within-population statistical G × G likely tells us little about the existence of coevolution, its strength, or the genetic basis of functional G × G. Combined with studies of multiple populations or points in time, mapping and molecular techniques can bridge the gap between natural variation and mechanistic models of coevolution, while model-based statistics can formally confront coevolutionary models with cross-infection data. Together these approaches provide a robust framework for inferring the infection genetics underlying statistical G × G, helping unravel the genetic basis of coevolution.

  16. Sizing for the apparel industry using statistical analysis - a Brazilian case study

    Science.gov (United States)

    Capelassi, C. H.; Carvalho, M. A.; El Kattel, C.; Xu, B.

    2017-10-01

    The study of the body measurements of Brazilian women used the Kinect Body Imaging system for 3D body scanning. The result of the study aims to meet the needs of the apparel industry for accurate measurements. Data were statistically treated using the IBM SPSS 23 system at the 95% confidence level, and body types were classified from the Hip-to-Height Ratio (HHR, bottom portion) into categories bounded by HHR values of 0.58 and 0.68, ranging from Small to Large.

  17. High-throughput automated system for statistical biosensing employing microcantilevers arrays

    DEFF Research Database (Denmark)

    Bosco, Filippo; Chen, Ching H.; Hwu, En T.

    2011-01-01

    In this paper we present a completely new and fully automated system for parallel microcantilever-based biosensing. Our platform is able to simultaneously monitor the change of resonance frequency (dynamic mode), of deflection (static mode), and of surface roughness of hundreds of cantilevers in a very short time over multiple biochemical reactions. We have proven that our system is capable of measuring 900 independent microsensors in less than a second. Here, we report statistical biosensing results performed on a hapten-antibody assay, where complete characterization of the biochemical…

  18. Murder-suicide of the jealous paranoia type: a multicenter statistical pilot study.

    Science.gov (United States)

    Palermo, G B; Smith, M B; Jenzten, J M; Henry, T E; Konicek, P J; Peterson, G F; Singh, R P; Witeck, M J

    1997-12-01

    The authors present a pilot statistical study of murder-suicide comprising 32 cases from the years 1990-1992, collected from the offices of the medical examiners of seven counties in five US states. The study includes brief reviews of previous statistical surveys of murder, murder-suicide, and suicide. The present study's conclusions parallel the findings of previous research on the demographic characteristics of the perpetrators of murder-suicide, the relationship between killers and victims, the types of weapon used, locations of the incidents, and the time intervals between the murder and suicide. It also highlights the similarities between the characteristics of the perpetrator of murder-suicide and those of persons who commit only suicide, supporting the thesis that murder-suicide is an extended suicide. Suggestions for prevention of such a type of crime are offered.

  19. What Should Researchers Expect When They Replicate Studies? A Statistical View of Replicability in Psychological Science.

    Science.gov (United States)

    Patil, Prasad; Peng, Roger D; Leek, Jeffrey T

    2016-07-01

    A recent study of the replicability of key psychological findings is a major contribution toward understanding the human side of the scientific process. Despite the careful and nuanced analysis reported, the simple narrative disseminated by the mass, social, and scientific media was that in only 36% of the studies were the original results replicated. In the current study, however, we showed that 77% of the replication effect sizes reported were within a 95% prediction interval calculated using the original effect size. Our analysis suggests two critical issues in understanding replication of psychological studies. First, researchers' intuitive expectations for what a replication should show do not always match with statistical estimates of replication. Second, when the results of original studies are very imprecise, they create wide prediction intervals, and a broad range of replication effects that are consistent with the original estimates. This may lead to effects that replicate successfully, in that replication results are consistent with statistical expectations, but do not provide much information about the size (or existence) of the true effect. In this light, the results of the Reproducibility Project: Psychology can be viewed as statistically consistent with what one might expect when performing a large-scale replication experiment. © The Author(s) 2016.
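
    A minimal sketch of the prediction-interval logic described above, for correlation effect sizes via the Fisher z transformation. The numbers are hypothetical, and whether this matches the paper's exact formula is not stated in the abstract.

```python
import numpy as np
from scipy import stats

def replication_pi(r_orig, n_orig, n_rep, level=0.95):
    """Prediction interval for a replication correlation, via Fisher z."""
    z = np.arctanh(r_orig)                               # Fisher z of original r
    se = np.sqrt(1.0 / (n_orig - 3) + 1.0 / (n_rep - 3)) # both studies add noise
    zcrit = stats.norm.ppf(0.5 + level / 2)
    return np.tanh(z - zcrit * se), np.tanh(z + zcrit * se)

lo, hi = replication_pi(r_orig=0.40, n_orig=30, n_rep=80)
print(f"replication PI: ({lo:.2f}, {hi:.2f})")
# A replication estimate of, say, r = 0.15 falls inside this wide interval:
# it is statistically consistent with the imprecise original study.
```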

  20. Approximations to the distribution of a test statistic in covariance structure analysis: A comprehensive study.

    Science.gov (United States)

    Wu, Hao

    2018-05-01

    In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ2 distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ2 distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.

  1. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved…

  2. Using a higher criticism statistic to detect modest effects in a genome-wide study of rheumatoid arthritis

    Science.gov (United States)

    2009-01-01

    In high-dimensional studies such as genome-wide association studies, the correction for multiple testing in order to control total type I error results in decreased power to detect modest effects. We present a new analytical approach based on the higher criticism statistic that allows identification of the presence of modest effects. We apply our method to the genome-wide study of rheumatoid arthritis provided in the Genetic Analysis Workshop 16 Problem 1 data set. There is evidence for unknown bias in this study that could be explained by the presence of undetected modest effects. We compared the asymptotic and empirical thresholds for the higher criticism statistic. Using the asymptotic threshold we detected the presence of modest effects genome-wide. We also detected modest effects using the 90th percentile of the empirical null distribution as a threshold; however, there is no such evidence when the 95th and 99th percentiles were used. While the higher criticism method suggests that there is some evidence for modest effects, interpreting individual single-nucleotide polymorphisms with significant higher criticism statistics is of undetermined value. The goal of higher criticism is to alert the researcher that genetic effects remain to be discovered and to promote the use of more targeted and powerful studies to detect the remaining effects. PMID:20018032
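
    A minimal sketch of the higher criticism statistic itself (the Donoho-Jin form, computed over the smallest fraction of p-values); the p-values are synthetic:

```python
import numpy as np

def higher_criticism(pvals, alpha0=0.10):
    """Donoho-Jin higher criticism statistic over the smallest p-values."""
    p = np.sort(np.asarray(pvals, dtype=float))
    n = p.size
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1.0 - p))
    k = max(1, int(alpha0 * n))     # search only the smallest alpha0 fraction
    return hc[:k].max()

rng = np.random.default_rng(8)
null_p = rng.uniform(size=10_000)                 # global null: all p uniform
mixed_p = null_p.copy()
mixed_p[:50] = rng.beta(0.2, 8.0, 50)             # a sprinkle of modest effects
print(f"HC under the null:   {higher_criticism(null_p):.2f}")
print(f"HC with modest hits: {higher_criticism(mixed_p):.2f}")
```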

  3. Study of energy fluctuation effect on the statistical mechanics of equilibrium systems

    International Nuclear Information System (INIS)

    Lysogorskiy, Yu V; Wang, Q A; Tayurskii, D A

    2012-01-01

    This work is devoted to modeling the effect of energy fluctuations on the behavior of small classical thermodynamic systems. It is known that when an equilibrium system gets smaller and smaller, one of the major quantities that becomes more and more uncertain is its internal energy. These increasing fluctuations can considerably modify the original statistics. The present model considers the effect of such energy fluctuations and is based on an overlap between the Boltzmann-Gibbs statistics and the statistics of the fluctuation. Within this "overlap statistics", we studied the effects of several types of energy fluctuations on the probability distribution, internal energy, and heat capacity. It was shown that the fluctuations can considerably change the temperature dependence of internal energy and heat capacity in the low-energy range and at low temperatures. In particular, it was found that, due to the lower energy limit of the systems, the fluctuations reduce the probability of the low-energy states close to the lowest energy and increase the total average energy. This energy increase is larger at lower temperatures, making negative heat capacity possible in this case.

  4. A Statistical Study of Eiscat Electron and Ion Temperature Measurements In The E-region

    Science.gov (United States)

    Hussey, G.; Haldoupis, C.; Schlegel, K.; Bösinger, T.

    Motivated by the large EISCAT data base, which covers over 15 years of common programme operation, and by previous statistical work with EISCAT data (e.g., C. Haldoupis, K. Schlegel, and G. Hussey, Auroral E-region electron density gradients measured with EISCAT, Ann. Geophysicae, 18, 1172-1181, 2000), a detailed statistical analysis of electron and ion EISCAT temperature measurements has been undertaken. This study was specifically concerned with the statistical dependence of heating events on other ambient parameters such as the electric field and electron density. The results showed previously reported dependences, such as the electron temperature being directly correlated with the ambient electric field and inversely related to the electron density. However, these correlations were found to be also dependent upon altitude. There was also evidence of the so-called "Schlegel effect" (K. Schlegel, Reduced effective recombination coefficient in the disturbed polar E-region, J. Atmos. Terr. Phys., 44, 183-185, 1982); that is, the heated electron gas leads to increases in electron density through a reduction in the recombination rate. This paper will present the statistical heating results and attempt to offer physical explanations and interpretations of the findings.

  5. Systematic Analysis of the Non-Extensive Statistical Approach in High Energy Particle Collisions—Experiment vs. Theory †

    Directory of Open Access Journals (Sweden)

    Gábor Bíró

    2017-02-01

    Full Text Available The analysis of high-energy particle collisions is an excellent testbed for the non-extensive statistical approach. In these reactions we are far from the thermodynamical limit. In small colliding systems, such as electron-positron or nuclear collisions, the number of particles is several orders of magnitude smaller than the Avogadro number; therefore, finite-size and fluctuation effects strongly influence the final-state one-particle energy distributions. Owing to its simple characterization, the Boltzmann-Gibbs thermodynamical approach is insufficient to describe the identified hadron spectra. These spectra can be described very well with Tsallis-Pareto distributions instead, derived from non-extensive thermodynamics. Using the q-entropy formula, we interpret the microscopic physics in terms of the Tsallis q and T parameters. In this paper we give an overview of these parameters, analyzing identified hadron spectra from recent years in a wide center-of-mass energy range. We demonstrate that the fitted Tsallis parameters depend on the center-of-mass energy and particle species (mass). Our findings are described well by a QCD (Quantum Chromodynamics) inspired parton evolution ansatz. Based on this comprehensive study, apart from the evolution, both the mesonic and baryonic components are found to be non-extensive (q > 1), besides the mass-ordered hierarchy observed in the parameter T. We also study and compare in detail the parameters obtained from theory for the PYTHIA8 Monte Carlo generator, perturbative QCD, and quark coalescence models.

  6. High diversity of beta-lactamases in the General Hospital Vienna verified by whole genome sequencing and statistical analysis.

    Science.gov (United States)

    Barišić, Ivan; Mitteregger, Dieter; Hirschl, Alexander M; Noehammer, Christa; Wiesinger-Mayr, Herbert

    2014-10-01

    The detailed analysis of antibiotic resistance mechanisms is essential for understanding the underlying evolutionary processes, the implementation of appropriate intervention strategies and to guarantee efficient treatment options. In the present study, 110 β-lactam-resistant clinical isolates of Enterobacteriaceae sampled in 2011 in one of Europe's largest hospitals, the General Hospital Vienna, were screened for the presence of 31 β-lactamase genes. Twenty of those isolates were selected for whole genome sequencing (WGS). In addition, the number of β-lactamase genes was estimated using biostatistical models. The carbapenemase genes blaKPC-2, blaKPC-3, and blaVIM-4 were identified in carbapenem-resistant and intermediate susceptible isolates, blaOXA-72 in an extended-spectrum β-lactamase (ESBL)-positive one. Furthermore, the observed high prevalence of the acquired blaDHA-1 and blaCMY AmpC β-lactamase genes (70%) in phenotypically AmpC-positive isolates is alarming due to their capability to become carbapenem-resistant upon changes in membrane permeability. The statistical analyses revealed that approximately 55% of all β-lactamase genes present in the General Hospital Vienna were detected by this study. In summary, this work gives a very detailed picture of the disseminated β-lactamases and other resistance genes in one of Europe's largest hospitals. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Novel asymptotic results on the high-order statistics of the channel capacity over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-06-01

    The exact analysis of the higher-order statistics of the channel capacity (i.e., higher-order ergodic capacity) often leads to complicated expressions involving advanced special functions. In this paper, we provide a generic framework for the computation of the higher-order statistics of the channel capacity over generalized fading channels. As such, this novel framework for the higher-order statistics results in simple, closed-form expressions which are shown to be asymptotically tight bounds in the high signal-to-noise ratio (SNR) regime for a variety of fading environments. In addition, it reveals the existence of differences (i.e., constant capacity gaps in log-domain) among different fading environments. By asymptotically tight bound we mean that the high SNR limit of the difference between the actual higher-order statistics of the channel capacity and its asymptotic bound (i.e., lower bound) tends to zero. The mathematical formalism is illustrated with some selected numerical examples that validate the correctness of our newly derived results. © 2012 IEEE.
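
    A minimal Monte Carlo sketch of what "higher-order statistics of the channel capacity" means: moments of C = log2(1 + SNR·h) over a fading distribution. Rayleigh fading is used as the example; the paper's closed-form asymptotes are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(10)
mean_snr = 10 ** (20.0 / 10)              # average SNR of 20 dB
h = rng.exponential(1.0, 1_000_000)       # Rayleigh fading: power gain ~ Exp(1)

c = np.log2(1.0 + mean_snr * h)           # instantaneous capacity, bit/s/Hz
for k in (1, 2, 3):
    print(f"E[C^{k}] = {np.mean(c ** k):.3f}")
# At high SNR, C ~ log2(mean_snr * h), so the moments approach those of a
# log-shifted exponential: the kind of closed-form asymptote the paper bounds.
```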

  8. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    Science.gov (United States)

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-03-29

    Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between two beams plays the key role in the first-order coherence. Different from the case of first-order coherence, in high-order optical coherence the statistical behavior of the optical phase plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we showed that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; therefore, it provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources.

  9. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  10. Monitoring and Evaluation; Statistical Support for Life-cycle Studies, 2003 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Skalski, John

    2003-12-01

    This report summarizes the statistical analysis and consulting activities performed under Contract No. 00004134, Project No. 199105100, funded by the Bonneville Power Administration during 2003. These efforts focus on providing real-time predictions of outmigration timing, assessment of life-history performance measures, evaluation of status and trends in recovery, and guidance on the design and analysis of monitoring and evaluation studies for Columbia Basin fish and wildlife. The overall objective of the project is to provide BPA and the rest of the fisheries community with statistical guidance on the design, analysis, and interpretation of monitoring data, leading to improved monitoring and evaluation of salmonid mitigation programs in the Columbia/Snake River Basin. This goal is being accomplished by making fisheries data readily available for public scrutiny, providing statistical guidance on the design and analyses of studies through hands-on support and written documents, and providing real-time analyses of tagging results during the smolt outmigration for review by decision makers. For a decade, this project has provided in-season projections of smolt outmigration timing to assist in spill management. As many as 50 different fish stocks at 8 different hydroprojects are tracked in real time to predict the 'percent of run to date' and 'date to specific percentile'. The project also conducts added-value analyses of historical tagging data to understand relationships between fish responses, environmental factors, and anthropogenic effects. The statistical analysis of historical tagging data crosses agency lines in order to assimilate information on salmon population dynamics irrespective of origin. The lessons learned from past studies are used to improve the design and analyses of future monitoring and evaluation efforts. Through these efforts, the project attempts to provide the fisheries community with reliable analyses

  11. Study on the Orion spiral arm structure by the statistical modelling method

    International Nuclear Information System (INIS)

    Basharina, T.S.; Pavlovskaya, E.D.; Filippova, A.A.

    1980-01-01

    A method for investigating spiral structure based on statistical modelling methods is suggested and is used to study the Orion spiral arm. The density maxima and the widths of the Orion arm in the directions of the areas considered, for the longitude interval 55 deg - 187 deg, are determined under the assumption of a normal distribution of stars across the arm. The Sun is shown to be at the inner edge of the arm [ru]

  12. Current and high-β sheets in CIR streams: statistics and interaction with the HCS and the magnetosphere

    Science.gov (United States)

    Potapov, A. S.

    2018-04-01

    Thirty CIR events (corotating interaction regions between fast and slow solar wind) were analyzed in order to statistically study the plasma structure within the CIR shear zones and to examine the interaction of the CIRs with the heliospheric current sheet (HCS) and the Earth's magnetosphere. The occurrence of current layers and high-beta plasma sheets in the CIR structure was estimated: on average, each CIR stream had four current layers with a current density of more than 0.12 A/m2 and about one and a half high-beta plasma regions with a beta value of more than five. We then traced how, and how often, the high-speed stream associated with the CIR can catch up with the heliospheric current sheet (HCS) and connect to it. The interface of every fourth CIR stream coincided in time, within an hour, with the HCS, but in two thirds of cases the CIR connection with the HCS was completely absent. One event of simultaneous observation of a CIR stream, in front of the magnetosphere by the ACE satellite in the vicinity of the L1 libration point and in the remote geomagnetic tail by the Wind satellite, was considered in detail. Measurements of the components of the interplanetary magnetic field and plasma parameters showed that the overall structure of the stream is conserved; moreover, some details of the fine structure are also transferred through the magnetosphere. In particular, the so-called "magnetic hole" almost does not change its shape when moving from the L1 point to a neighborhood of the L2 point.

  13. Statistical Analysis And Treatment Of Accident Black Spots: A Case Study Of Nandyal Mandal

    Science.gov (United States)

    Sudharshan Reddy, B.; Vishnu Vardhan Reddy, L.; Sreenivasa Reddy, G., Dr

    2017-08-01

    Background: Increased economic activity has raised consumption levels across the country, creating scope for an increase in travel and transportation. The growth in vehicle numbers over the last 10 years has put considerable pressure on the existing roads, ultimately resulting in road accidents. Nandyal Mandal is located in the Kurnool district of Andhra Pradesh and is well developed in both the agricultural and industrial sectors, after Kurnool. The 567 accidents that occurred over the last seven years at 143 locations show the severity of the accident problem in the Nandyal Mandal, and there is a need to treat the accident black spots there in order to reduce accidents. Methods: Seven years (2010-2016) of accident data were collected from police stations. The Weighted Severity Index (WSI), a scientific method, was used to identify the accident black spots. Statistical analysis of the collected data was carried out using the Chi-Square test to determine the independence of accidents from other attributes, and a Chi-Square goodness-of-fit test was conducted to test whether the accidents occur by chance or follow a pattern. Results: WSI values were determined for the 143 locations, and the locations with high WSI values were treated as accident black spots. Five black spots were taken up for field study; after field observations and interaction with the public, improvements were suggested for these black spots. There is no relationship between the severity of accidents and attributes such as month, season, day, hour of day, or age group, except for type of vehicle. Road accidents are distributed uniformly throughout the year, month, and season, but are not distributed uniformly throughout the day.
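    A chi-square test of independence and a goodness-of-fit test of the kind used above are standard procedures; the sketch below shows, under stated assumptions, how they could be run with SciPy. The contingency table and monthly counts are hypothetical, not the study's data.

      import numpy as np
      from scipy.stats import chi2_contingency, chisquare

      # Hypothetical table: accident severity (rows) by vehicle type (columns).
      table = np.array([[12, 30, 18],
                        [ 7, 45, 31]])
      chi2, p, dof, _ = chi2_contingency(table)
      print(f"independence: chi2={chi2:.2f}, dof={dof}, p={p:.3f}")

      # Goodness of fit: are accidents uniform across the 12 months?
      monthly = np.array([48, 41, 52, 44, 50, 47, 46, 53, 45, 49, 51, 41])
      stat, p = chisquare(monthly)  # default expectation: uniform frequencies
      print(f"goodness-of-fit: chi2={stat:.2f}, p={p:.3f}")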

  14. A statistical study of GPS loss of lock caused by ionospheric disturbances

    Science.gov (United States)

    Tsugawa, T.; Nishioka, M.; Otsuka, Y.; Saito, A.; Kato, H.; Kubota, M.; Nagatsuma, T.; Murata, K. T.

    2010-12-01

    Two-dimensional total electron content (TEC) maps have been derived from ground-based GPS receiver networks and applied to studies of various ionospheric disturbances since the mid-1990s. For the purpose of monitoring and researching ionospheric disturbances, which can degrade GNSS navigation and cause loss of lock on GNSS signals, the National Institute of Information and Communications Technology (NICT), Japan, has developed TEC maps over Japan using the dense GPS network GEONET, which consists of more than 1,200 GPS receivers and is operated by the Geographical Survey Institute, Japan. Currently, we provide two-dimensional maps over Japan of absolute TEC; detrended TEC with 60-, 30-, and 15-minute windows; the rate of TEC change index (ROTI); and loss of lock (LOL) on GPS signals. These data and quick-look maps since 1997 are archived and available on the NICT website (http://wdc.nict.go.jp/IONO/). Recently developed GPS receiver networks in North America and Europe make it possible to obtain regional TEC maps with higher spatial and temporal resolution than the global weighted-mean TEC maps in the IONEX format provided by several institutes, such as the International GNSS Service (IGS), and the global TEC map provided by the MIT Haystack Observatory. We have therefore also developed regional TEC maps over North America and Europe; these data and quick-look maps are likewise available on the NICT website. In this presentation, we show some severe ionospheric events observed over Japan using the GPS-TEC database, such as high-latitude storm-time plasma bubbles and storm-enhanced density events. These events cause loss of lock on GPS signals and large GPS positioning errors. We also discuss the statistical characteristics of LOL on GPS signals caused by ionospheric disturbances.
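    The ROTI mentioned above is conventionally defined as the standard deviation of the rate of TEC change (ROT) over a short window. A minimal sketch of that computation follows, assuming a 30-second TEC cadence and a 5-minute window; the synthetic TEC series is illustrative only.

      import numpy as np

      def rot_and_roti(tec, dt_s=30.0, window_s=300.0):
          """ROT (TECU/min) from successive TEC samples, and ROTI as the
          standard deviation of ROT over consecutive windows."""
          rot = np.diff(tec) / (dt_s / 60.0)
          n = int(window_s / dt_s)
          roti = np.array([rot[i:i + n].std()
                           for i in range(0, rot.size - n + 1, n)])
          return rot, roti

      rng = np.random.default_rng(0)
      t = np.arange(0, 3600, 30.0)             # one hour of 30 s samples
      tec = 20 + 2e-4 * t                      # smooth background TEC (TECU)
      tec[60:90] += rng.normal(0, 0.5, 30)     # a disturbed interval
      _, roti = rot_and_roti(tec)
      print(roti.round(3))                     # the spike flags the disturbance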

  15. The Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition Defensive Functioning Scale: a validity study.

    Science.gov (United States)

    Porcerelli, John H; Cogan, Rosemary; Markova, Tsveti; Miller, Kristen; Mickens, Lavonda

    2011-01-01

    We assessed the convergent and predictive validity of the Defensive Functioning Scale (DFS) with measures of life events, including childhood abuse and adult partner victimization; dimensions of psychopathology, including axis I (depressive) and axis II (borderline personality disorder) symptoms; and quality of object relations. One hundred and ten women from a university-based urban primary care clinic completed a research interview from which defense mechanisms were assessed; the quality of object relations was also assessed from the interview data. The women completed self-report measures assessing depression, borderline personality disorder symptoms, childhood physical and sexual abuse, and adult partner physical and sexual victimization. Inter-rater reliability of the scoring of the DFS levels was good. High adaptive defenses were positively correlated with the quality of object relations, and pathological defenses were positively correlated with childhood and adult victimization and with the symptom measures. Although major image-distorting defenses were infrequently used, they were robustly correlated with all study variables. In a stepwise multiple regression analysis, major image-distorting defenses, depressive symptoms, and minor image-distorting defenses significantly predicted childhood victimization, accounting for 37% of the variance. In a second stepwise multiple regression analysis, borderline personality disorder symptoms and disavowal defenses combined to significantly predict adult victimization, accounting for 16% of the variance. The DFS demonstrates good convergent validity with axis I and axis II symptoms, as well as with measures of childhood and adult victimization and object relations, and the DFS levels add nonredundant information to the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition beyond axis I and axis II.

  16. Databases and statistical methods of cohort studies (1979-1995) in Yangjiang, China

    International Nuclear Information System (INIS)

    Sun Quanfu; Zou Jianming; Liu Yusheng

    1997-01-01

    Epidemiological databases of some 40 MB are available for risk analysis, mainly comprising cohort follow-up and death databases for 12,000 subjects covering the periods 1979-1986 and 1987-1995, and a dosimetric database for 6,783 households in 526 hamlets. Because there is no strict linkage between the databases of the two periods, the authors developed methods to combine the data of the two periods into one dataset for risk analysis. The first method is to set up a theoretical cohort for 1979-1995 based on record linkage between the two periods; the other is simply to sum the stratified person-year tables of the different periods. Extensive analysis of the dosimetric data suggests that indoor exposures should be divided further into two parts (exposure received in bed and exposure received during other indoor activities), that outdoor exposure is homogeneous within a hamlet, and that occupancy factors are sex- and age-dependent. Cumulative dose estimates based on hamlet-specific averages of dose rates in the bedroom, living room, and outdoors, together with sex- and age-specific occupancy factors, are derived for each cohort member. Person-years and numbers of deaths are tabulated with stratification by sex, attained age, calendar year, and dose. Cancer risks are analyzed for the period 1979-1990. Conclusion: The epidemiological studies in the high background radiation areas of Yangjiang have been greatly improved by the extensive use of database management systems and advanced statistical analysis, with more attention paid to the standardization and systematization of survey data management
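    The second combination method, summing stratified person-year tables, reduces to aligning strata and adding person-years and deaths. A minimal pandas sketch is shown below; the strata and counts are hypothetical, not the Yangjiang data.

      import pandas as pd

      strata = ["sex", "age_group", "dose_group"]
      py_7986 = pd.DataFrame({  # person-year table, 1979-1986
          "sex": ["M", "M", "F"], "age_group": ["40-49", "50-59", "40-49"],
          "dose_group": [1, 2, 1], "person_years": [5200.0, 4100.0, 5600.0],
          "cancer_deaths": [3, 5, 2]})
      py_8795 = pd.DataFrame({  # person-year table, 1987-1995
          "sex": ["M", "F", "F"], "age_group": ["50-59", "40-49", "50-59"],
          "dose_group": [2, 1, 2], "person_years": [4900.0, 6100.0, 3800.0],
          "cancer_deaths": [6, 4, 3]})

      # Combining periods = summing person-years and deaths within each stratum.
      combined = (pd.concat([py_7986, py_8795])
                    .groupby(strata, as_index=False)[["person_years", "cancer_deaths"]]
                    .sum())
      combined["rate_per_1e4_py"] = 1e4 * combined.cancer_deaths / combined.person_years
      print(combined)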

  17. Study on loss detection algorithms for tank monitoring data using multivariate statistical analysis

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Burr, Tom

    2009-01-01

    Evaluation of solution monitoring data to support material balance evaluation was proposed about a decade ago because of concerns regarding the large throughput planned at the Rokkasho Reprocessing Plant (RRP). A numerical study using the simulation code FACSIM was carried out and showed significant increases in the detection probabilities (DP) for certain types of losses. To be accepted internationally, it is very important to verify such claims using real solution monitoring data; however, a demonstrative study with real tank data had not been carried out owing to the confidentiality of the tank data. This paper describes an experimental study that has been started using actual data from the Solution Measurement and Monitoring System (SMMS) in the Tokai Reprocessing Plant (TRP) and the Savannah River Site (SRS). Multivariate statistical methods, such as a vector cumulative sum and multi-scale statistical analysis, have been applied to real tank data with superimposed simulated losses. Although quantitative conclusions cannot yet be drawn, owing to the difficulty of baseline evaluation, the multivariate statistical methods remain promising for detecting abrupt and some types of protracted losses. (author)
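    A vector cumulative sum of the kind named above can be sketched as follows. This is Crosier's multivariate CUSUM applied to simulated balance residuals, not the SMMS data or the authors' exact scheme; the in-control parameters, loss size, and control limit are all assumed for illustration.

      import numpy as np

      def mcusum(x, mean, cov, k=0.5):
          """Crosier's multivariate CUSUM statistic for each measurement vector;
          an alarm is raised when it exceeds a chosen control limit."""
          inv = np.linalg.inv(cov)
          s = np.zeros(x.shape[1])
          out = []
          for xt in x:
              d = s + xt - mean
              c = np.sqrt(d @ inv @ d)     # Mahalanobis length of the drift
              s = np.zeros_like(s) if c <= k else d * (1 - k / c)
              out.append(np.sqrt(s @ inv @ s))
          return np.array(out)

      rng = np.random.default_rng(1)
      resid = rng.normal(0, 1, (200, 3))   # in-control residuals, 3 tanks
      resid[120:] += [0.0, 0.4, 0.4]       # small protracted loss from step 120
      y = mcusum(resid, mean=np.zeros(3), cov=np.eye(3))
      print("first alarm (limit 5.0) at step:", int(np.argmax(y > 5.0)))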

  18. High throughput label-free platform for statistical bio-molecular sensing

    DEFF Research Database (Denmark)

    Bosco, Filippo; Hwu, En-Te; Chen, Ching-Hsiu

    2011-01-01

    Sensors are crucial in many daily operations including security, environmental control, human diagnostics and patient monitoring. Screening and online monitoring require reliable and high-throughput sensing. We report on the demonstration of a high-throughput label-free sensor platform utilizing...

  19. High frequency statistical energy analysis applied to fluid filled pipe systems

    NARCIS (Netherlands)

    Beek, P.J.G. van; Smeulers, J.P.M.

    2013-01-01

    In pipe systems carrying gas at high velocities, broadband turbulent pulsations can be generated, causing strong vibrations and fatigue failure, known as acoustic fatigue. This occurs at valves with high pressure differences (i.e. chokes), relief valves and obstructions in the flow, such as sharp

  20. Technical issues relating to the statistical parametric mapping of brain SPECT studies

    International Nuclear Information System (INIS)

    Hatton, R.L.; Cordato, N.; Hutton, B.F.; Lau, Y.H.; Evans, S.G.

    2000-01-01

    Full text: Statistical Parametric Mapping (SPM) is a software tool designed for the statistical analysis of functional neuroimages, specifically positron emission tomography and functional magnetic resonance imaging, and more recently SPECT. This review examines some problems associated with the analysis of SPECT. A comparison of a patient group with normal studies revealed factors that could influence results, some that commonly occur and others that require further exploration. To optimise the differences between two groups of subjects, both spatial variability and differences in global activity must be minimised; the choice and effectiveness of the co-registration method and the approach to normalisation of activity concentration can affect this optimisation. A small number of subject scans were identified as containing truncated data, resulting in edge effects that could adversely influence the analysis. Other problems included unusual areas of significance possibly related to reconstruction methods and to the geometry associated with non-parallel collimators. Areas of extracerebral significance are a point of concern and may result from scatter effects or mis-registration. Difficulties in patient positioning, due to postural limitations, can lead to resolution differences. SPM has been used to assess areas of statistical significance arising from these technical factors, as opposed to areas of true clinical significance, when comparing subject groups. This contributes to a better understanding of the effects of technical factors so that they may be eliminated, minimised, or incorporated in the study design.

  1. A multivariate statistical study on a diversified data gathering system for nuclear power plants

    International Nuclear Information System (INIS)

    Samanta, P.K.; Teichmann, T.; Levine, M.M.; Kato, W.Y.

    1989-02-01

    In this report, multivariate statistical methods are presented and applied to demonstrate their use in analyzing nuclear power plant operational data. For analyses of nuclear power plant events, approaches are presented for detecting malfunctions and degradations within the course of the event. At the system level, approaches are investigated as a means of diagnosis of system level performance. This involves the detection of deviations from normal performance of the system. The input data analyzed are the measurable physical parameters, such as steam generator level, pressurizer water level, auxiliary feedwater flow, etc. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients and computer simulation of a plant system performance (due to lack of easily accessible operational data). Such an approach, once fully developed, can be used to explore statistically the detection of failure trends and patterns and prevention of conditions with serious safety implications. 33 refs., 18 figs., 9 tabs
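    One standard way to flag deviations of multivariate plant parameters from normal performance, in the spirit of the system-level diagnosis described above, is Hotelling's T-squared statistic. The sketch below is a generic illustration on synthetic data, not the report's method; the parameter set and the shift are assumptions.

      import numpy as np

      def hotelling_t2(X_ref, X_new):
          """T^2 distance of new observation vectors from a reference dataset
          of parameters logged during normal operation (e.g., steam generator
          level, pressurizer water level, auxiliary feedwater flow)."""
          mu = X_ref.mean(axis=0)
          S_inv = np.linalg.inv(np.cov(X_ref, rowvar=False))
          d = X_new - mu
          return np.einsum("ij,jk,ik->i", d, S_inv, d)

      rng = np.random.default_rng(0)
      normal_ops = rng.normal(0, 1, (500, 4))                    # reference data
      transient = rng.normal(0, 1, (10, 4)) + [0, 2.5, 0, -1.5]  # degraded state
      print(hotelling_t2(normal_ops, transient).round(1))        # large values flag it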

  2. [The metrology of uncertainty: a study of vital statistics from Chile and Brazil].

    Science.gov (United States)

    Carvajal, Yuri; Kottow, Miguel

    2012-11-01

    This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.
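    In its basic form, the adapted entropy measure is H = -sum_i p_i log2 p_i over the distribution of deaths across cause categories. A minimal sketch with hypothetical counts:

      import numpy as np

      def shannon_entropy(counts):
          """Shannon entropy (bits) of a vector of cause-of-death counts."""
          p = np.asarray(counts, dtype=float)
          p = p[p > 0] / p.sum()
          return -(p * np.log2(p)).sum()

      # A register that piles deaths into one ill-defined category conveys
      # less information than one that resolves causes more evenly.
      print(shannon_entropy([900, 40, 30, 20, 10]))      # concentrated coding
      print(shannon_entropy([250, 220, 200, 180, 150]))  # well-resolved coding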

  3. Statistical fission parameters for nuclei at high excitation and angular momenta

    International Nuclear Information System (INIS)

    Blann, M.; Komoto, T.A.

    1982-01-01

    Experimental fusion/fission excitation functions are analyzed by the statistical model with modified rotating liquid drop model barriers and with single-particle level densities modeled for deformation for ground-state (a_ν) and saddle-point (a_f) nuclei. Values are estimated for the errors in rotating liquid drop model barriers for the different systems analyzed. These results are found to correlate well with the trends predicted by the finite-range model of Krappe, Nix, and Sierk, although the discrepancies seem to be approximately 1 MeV greater than the finite-range model predictions over the limited range tested. The a priori values calculated for a_f and a_ν are within ±2% of the optimum free-parameter values. Analyses for barrier decrements explore the importance of collective enhancement of level densities and of nuclear deformation in calculating transmission coefficients. A calculation is performed for the 97Rh nucleus, for which a first-order angular momentum scaling is used for the J = 0 finite-range corrections; an excellent fit is found for the fission excitation function in this approach. Results are compared in which rotating liquid drop model barriers are decremented by a constant energy or, alternatively, multiplied by a constant factor. Either parametrization is shown to be capable of satisfactorily reproducing the data, although their J = 0 extrapolated values differ markedly from one another. This underscores the dangers inherent in arbitrary barrier extrapolations

  4. Statistical physics of fracture: scientific discovery through high-performance computing

    International Nuclear Information System (INIS)

    Kumar, Phani; Nukala, V V; Simunovic, Srdan; Mills, Richard T

    2006-01-01

    The paper presents state-of-the-art algorithmic developments for simulating the fracture of disordered quasi-brittle materials using discrete lattice systems. Large-scale simulations are often required to obtain accurate scaling laws; however, due to computational complexity, simulations using traditional algorithms were limited to small system sizes. We have developed two algorithms for simulating random fuse model systems: a multiple sparse Cholesky downdating scheme and a block-circulant preconditioner. Using these algorithms, we were able to simulate fracture of the largest lattice system sizes ever (L = 1024 in 2D, and L = 64 in 3D) with extensive statistical sampling. Our recent simulations on 1024 processors of Cray-XT3 and IBM Blue Gene/L have further enabled us to explore fracture of 3D lattice systems of size L = 200, a significant computational achievement. These numerical simulations have enhanced our understanding of the physics of fracture; in particular, we analyze damage localization and its deviation from percolation behavior, scaling laws for damage density, universality of the fracture strength distribution, the size effect on mean fracture strength, and the scaling of crack surface roughness

  5. A Statist Political Economy and High Demand for Education in South Korea

    Directory of Open Access Journals (Sweden)

    Ki Su Kim

    1999-06-01

    Full Text Available In the 1998 academic year, 84 percent of South Korea's high school "leavers" entered a university or college, while almost all children went on to high school. That is to say, South Korea is now moving into a new age of universal higher education. Even so, competition for university entrance remains intense. What is interesting here is South Koreans' unusually high demand for education. In this article, I criticize the existing cultural and socio-economic interpretations of the phenomenon. Instead, I explore a new interpretation by critically referring to the recent political economy debate on South Korea's state-society/market relationship. In my interpretation, the unusually high demand for education is largely due to the powerful South Korean state's loss of flexibility in the management of its "developmental" policies. For this, I blame the traditional "personalist ethic" which still prevails as the

  6. Computational and statistical methods for high-throughput mass spectrometry-based PTM analysis

    DEFF Research Database (Denmark)

    Schwämmle, Veit; Vaudel, Marc

    2017-01-01

    Cell signaling and functions heavily rely on post-translational modifications (PTMs) of proteins. Their high-throughput characterization is thus of utmost interest for multiple biological and medical investigations. In combination with efficient enrichment methods, peptide mass spectrometry analy...

  7. Statistical power of model selection strategies for genome-wide association studies.

    Directory of Open Access Journals (Sweden)

    Zheyang Wu

    2009-07-01

    Full Text Available Genome-wide association studies (GWAS) aim to identify genetic variants related to diseases by examining the associations between phenotypes and hundreds of thousands of genotyped markers. Because many genes are potentially involved in common diseases and a large number of markers are analyzed, it is crucial to devise an effective strategy to identify truly associated variants that have individual and/or interactive effects, while controlling false positives at the desired level. Although a number of model selection methods have been proposed in the literature, including marginal search, exhaustive search, and forward search, their relative performance has only been evaluated through limited simulations due to the lack of an analytical approach to calculating the power of these methods. This article develops a novel statistical approach for power calculation, derives accurate formulas for the power of different model selection strategies, and then uses the formulas to evaluate and compare these strategies in genetic model spaces. In contrast to previous studies, our theoretical framework allows for random genotypes, correlations among test statistics, and a false-positive control based on GWAS practice. After the accuracy of our analytical results is validated through simulations, they are utilized to systematically evaluate and compare the performance of these strategies in a wide class of genetic models. For a specific genetic model, our results clearly reveal how different factors, such as effect size, allele frequency, and interaction, jointly affect the statistical power of each strategy. An example is provided for the application of our approach to empirical research. The statistical approach used in our derivations is general and can be employed to address the model selection problems in other random predictor settings. We have developed an R package markerSearchPower to implement our formulas, which can be downloaded from the

  8. Mechanical behaviour of alkali-activated blast furnace slag-activated metakaolin blended pastes. Statistical study

    Directory of Open Access Journals (Sweden)

    Higuera, I.

    2012-06-01

    Full Text Available The study and development of alternative binders that are more eco-efficient than Portland cement is attracting a good deal of scientific and technological interest. Binders obtained from the chemical interaction between calcium silico-aluminous materials and highly alkaline solutions are one of several types of such possible cements. The present paper discusses the mechanical behaviour and mineralogical composition of blended pastes made from NaOH-activated vitreous blast furnace slag and metakaolin. The aim of the study was to determine how parameters such as the slag/metakaolin ratio, the activating solution concentration and the curing temperature affect strength development in these binders. A statistical study was conducted to establish the impact of each variable and to model the strength behaviour of these alkaline cements. The conclusion drawn is that the activator concentration and the slag/metakaolin ratio are both determinant parameters.

  9. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps.

    Science.gov (United States)

    Pittiglio, Claudia; Khomenko, Sergei; Beltran-Alcrudo, Daniel

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine scale environmental variables. 1) First a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e. subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled at 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each different step using input as well as independent data. We discuss advantages and limits of the method and its
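    As a rough illustration of the trend-plus-residual downscaling in steps 2a-2c, the sketch below regresses unit-level densities on covariates and kriges the residuals from unit centroids, with ordinary kriging via the pykrige package standing in for true area-to-point kriging. All data, covariate names, and model choices are invented; this is not the authors' implementation.

      import numpy as np
      from pykrige.ok import OrdinaryKriging
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(9)
      n_units = 60
      xy = rng.uniform(0, 100, (n_units, 2))                  # unit centroids (km)
      env = np.column_stack([rng.uniform(0, 1, n_units),      # e.g., forest cover
                             rng.uniform(0, 1500, n_units)])  # e.g., elevation (m)
      density = 2 + 4 * env[:, 0] - 1e-3 * env[:, 1] + rng.normal(0, 0.5, n_units)

      # Trend component from covariates (available at fine resolution).
      trend = LinearRegression().fit(env, density)
      resid = density - trend.predict(env)

      # Residual component kriged from centroids onto a fine grid.
      ok = OrdinaryKriging(xy[:, 0], xy[:, 1], resid, variogram_model="spherical")
      gx = gy = np.linspace(0, 100, 21)
      resid_grid, _ = ok.execute("grid", gx, gy)

      # Fine-scale estimate = trend evaluated on fine-scale covariate grids
      # (omitted here) + resid_grid; the residual surface shows the idea.
      print(resid_grid.shape)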

  10. A study of the feasibility of statistical analysis of airport performance simulation

    Science.gov (United States)

    Myers, R. H.

    1982-01-01

    The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis-of-variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if they were designed to detect capacity changes from condition to condition. Many of the conclusions drawn are results of Monte Carlo techniques.
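    Monte Carlo power estimation of this kind can be sketched in a few lines: draw capacity samples from a skewed, non-Gaussian distribution under two conditions, apply a one-way analysis of variance, and count rejections. The gamma model, shift sizes, and run counts below are assumptions for illustration.

      import numpy as np
      from scipy.stats import f_oneway

      def mc_power(shift, n_runs=30, n_sims=2000, alpha=0.05, seed=0):
          """Fraction of simulated experiments in which ANOVA detects a
          capacity shift between two conditions with skewed capacities."""
          rng = np.random.default_rng(seed)
          hits = 0
          for _ in range(n_sims):
              a = rng.gamma(shape=8.0, scale=5.0, size=n_runs)          # condition A
              b = rng.gamma(shape=8.0, scale=5.0, size=n_runs) + shift  # condition B
              hits += f_oneway(a, b).pvalue < alpha
          return hits / n_sims

      for shift in (1.0, 2.5, 5.0):
          print(f"shift={shift:4.1f} ops/hr -> power ~ {mc_power(shift):.2f}")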

  11. The statistical study of Chorus waves using the Double star TC1 data

    Science.gov (United States)

    Yearby, K.; Aryan, H.; Balikhin, M. A.; Krasnoselskikh, V.; Agapitov, O. V.

    2013-12-01

    The Double Star TC1 satellite was launched on 29 December 2003 into an equatorial elliptical orbit with a perigee of 570 km, an apogee of 78,970 km, and an inclination of 28.5°, and operated until 14 October 2007. The Double Star TC1 data provide extensive coverage of the inner magnetosphere regions over the range of L shells > 1.1 and a wide range of latitudes. This work presents a detailed statistical study of chorus waves during the 4 years of Double Star operation.

  12. Statistical analysis and application of quasi experiments to antimicrobial resistance intervention studies.

    Science.gov (United States)

    Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N

    2007-10-01

    Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.

  13. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages

    Science.gov (United States)

    Kim, Yoonsang; Emery, Sherry

    2013-01-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods’ performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages—SAS GLIMMIX Laplace and SuperMix Gaussian quadrature—perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes. PMID:24288415
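    To make the simulation design concrete, the sketch below generates two-level logistic data with three correlated random effects per group, of the kind compared across packages in the study. The covariate names, effect sizes, and covariance values are invented, and the model fitting itself (Laplace, quadrature, etc.) is left to a mixed-model package.

      import numpy as np

      rng = np.random.default_rng(42)
      n_groups, n_per = 100, 20
      cov = np.array([[1.0, 0.3, 0.2],      # correlated random intercept
                      [0.3, 1.0, 0.4],      # and two random slopes
                      [0.2, 0.4, 1.0]])
      b = rng.multivariate_normal(np.zeros(3), cov, size=n_groups)

      data = []
      for g in range(n_groups):
          x1 = rng.normal(size=n_per)       # e.g., ad exposure (standardized)
          x2 = rng.normal(size=n_per)       # e.g., age (standardized)
          eta = (-0.5 + b[g, 0]) + (0.8 + b[g, 1]) * x1 + (0.3 + b[g, 2]) * x2
          y = rng.binomial(1, 1 / (1 + np.exp(-eta)))
          data.append(y)

      print("overall outcome rate:", round(float(np.mean([y.mean() for y in data])), 3))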

  14. Single-cell mRNA transfection studies: delivery, kinetics and statistics by numbers.

    Science.gov (United States)

    Leonhardt, Carolin; Schwake, Gerlinde; Stögbauer, Tobias R; Rappl, Susanne; Kuhr, Jan-Timm; Ligon, Thomas S; Rädler, Joachim O

    2014-05-01

    In artificial gene delivery, messenger RNA (mRNA) is an attractive alternative to plasmid DNA (pDNA) since it does not require transfer into the cell nucleus. Here we show that, unlike for pDNA transfection, the delivery statistics and dynamics of mRNA-mediated expression are generic and predictable in terms of mathematical modeling. We measured the single-cell expression time courses and levels of enhanced green fluorescent protein (eGFP) using time-lapse microscopy and flow cytometry (FC). The single-cell analysis provides direct access to the distribution of onset times, lifetimes and expression rates of mRNA and eGFP. We introduce a two-step stochastic delivery model that reproduces the number distribution of successfully delivered and translated mRNA molecules, and thereby the dose-response relation. Our results establish a statistical framework for mRNA transfection and as such should advance the development of RNA carriers and small interfering/micro RNA-based drugs. This team of authors established a statistical framework for mRNA transfection by using a two-step stochastic delivery model that reproduces the number distribution of successfully delivered and translated mRNA molecules, and thereby their dose-response relation. This study establishes a nice connection between theory and experimental planning and will aid the cellular delivery of mRNA molecules.
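    A toy version of such a two-step model might look as follows; every distributional choice and rate constant here is an assumption for illustration, not the paper's calibrated model. Step one draws the number of delivered mRNA per cell; step two propagates it through first-order translation and decay.

      import numpy as np

      rng = np.random.default_rng(7)
      n_cells = 10_000

      # Step 1: complexes entering a cell ~ Poisson; each of the ~350 mRNA
      # per complex survives delivery with a small probability (all assumed).
      complexes = rng.poisson(2.0, n_cells)
      m0 = rng.binomial(complexes * 350, 0.05)      # delivered mRNA per cell

      # Step 2: G(t) = k_tl*m0/(d_m - d_g) * (exp(-d_g*t) - exp(-d_m*t)),
      # the solution of dm/dt = -d_m*m, dG/dt = k_tl*m - d_g*G with G(0) = 0.
      k_tl, d_m, d_g, t = 30.0, np.log(2) / 9.0, np.log(2) / 30.0, 24.0  # per h, h
      gfp = k_tl * m0 / (d_m - d_g) * (np.exp(-d_g * t) - np.exp(-d_m * t))

      print("fraction of transfected cells:", round(float((m0 > 0).mean()), 3))
      print("median eGFP (a.u.) among expressers:", round(float(np.median(gfp[m0 > 0]))))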

  15. Logistic Regression with Multiple Random Effects: A Simulation Study of Estimation Methods and Statistical Packages.

    Science.gov (United States)

    Kim, Yoonsang; Choi, Young-Ku; Emery, Sherry

    2013-08-01

    Several statistical packages are capable of estimating generalized linear mixed models and these packages provide one or more of three estimation methods: penalized quasi-likelihood, Laplace, and Gauss-Hermite. Many studies have investigated these methods' performance for the mixed-effects logistic regression model. However, the authors focused on models with one or two random effects and assumed a simple covariance structure between them, which may not be realistic. When there are multiple correlated random effects in a model, the computation becomes intensive, and often an algorithm fails to converge. Moreover, in our analysis of smoking status and exposure to anti-tobacco advertisements, we have observed that when a model included multiple random effects, parameter estimates varied considerably from one statistical package to another even when using the same estimation method. This article presents a comprehensive review of the advantages and disadvantages of each estimation method. In addition, we compare the performances of the three methods across statistical packages via simulation, which involves two- and three-level logistic regression models with at least three correlated random effects. We apply our findings to a real dataset. Our results suggest that two packages-SAS GLIMMIX Laplace and SuperMix Gaussian quadrature-perform well in terms of accuracy, precision, convergence rates, and computing speed. We also discuss the strengths and weaknesses of the two packages in regard to sample sizes.

  16. Statistical Study of Corruption in the Region (on the Example of the Rostov Region

    Directory of Open Access Journals (Sweden)

    Kirill A. Belokrylov

    2016-09-01

    Full Text Available The paper argues for the effectiveness of statistical methods for evaluating the level of corruption, given, on the one hand, its latency and concealment from official statistical accounting and, on the other, the scale of corruption in Russia. A comparative analysis of the substantive characterizations of the nature of corruption by Russian and foreign scientists, as well as their reflection in the law, made it possible to develop a questionnaire to adequately assess the level and causes of corruption and the effectiveness of the policy on fighting it, as the most important social indicator of inefficient institutions. Analysis of the results of an economic and sociological survey of the population of the Rostov region revealed a shift of corruption in the region towards the education system, above all in the selection of students, and led to the conclusion that legislation on fighting corruption needs to be tightened (72% of respondents), including confiscation of property (79.1%), dismissal of corrupt officials, and a ban on holding public office. The necessity of further in-depth statistical studies of corruption is argued, for the development of more effective measures to combat it, as a tool for lifting the Russian economy out of crisis and ensuring growth more sustainable than the 1.5% positive GDP dynamics projected for the 2020s (the 'lost decade').

  17. Detecting rater bias using a person-fit statistic: a Monte Carlo simulation study.

    Science.gov (United States)

    Aubin, André-Sébastien; St-Onge, Christina; Renaud, Jean-Sébastien

    2018-04-01

    With the Standards voicing concern for the appropriateness of response processes, we need to explore strategies that would allow us to identify inappropriate rater response processes. Although certain statistics can be used to help detect rater bias, their use is complicated either by a lack of data about their actual power to detect rater bias or by the difficulty of applying them in the context of health professions education. This exploratory study aimed to establish the worthiness of pursuing the use of l_z to detect rater bias. We conducted a Monte Carlo simulation study to investigate the power of a specific detection statistic: the standardized likelihood l_z person-fit statistic (PFS). Our primary outcome was the detection rate of biased raters, namely raters whom we manipulated into being either stringent (giving lower scores) or lenient (giving higher scores), using the l_z statistic while controlling for the number of biased raters in a sample (6 levels) and the rate of bias per rater (6 levels). Overall, stringent raters (M = 0.84, SD = 0.23) were easier to detect than lenient raters (M = 0.31, SD = 0.28), and more biased raters were easier to detect than less biased raters (60% bias: M = 0.62, SD = 0.37; 10% bias: M = 0.43, SD = 0.36). The PFS l_z seems to offer interesting potential for identifying biased raters: we observed detection rates as high as 90% for stringent raters for whom we manipulated more than half the checklist. Although we observed very interesting results, we cannot generalize them to the use of PFS with estimated item/station parameters or real data; such studies should be conducted to assess the feasibility of using PFS to identify rater bias.
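    For dichotomous item/checklist scores, the statistic has the closed form l_z = (l - E[l]) / sqrt(Var[l]); a minimal implementation is sketched below on made-up probabilities, with the model-implied probabilities taken as given.

      import numpy as np

      def lz(responses, p):
          """Standardized log-likelihood person-fit statistic.
          responses: 0/1 vector for one ratee; p: model-implied probabilities."""
          u, p = np.asarray(responses, float), np.asarray(p, float)
          l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
          e = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
          v = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
          return (l0 - e) / np.sqrt(v)

      p = np.array([0.9, 0.85, 0.7, 0.5, 0.3, 0.2])
      print(lz([1, 1, 1, 1, 0, 0], p).round(2))  # consistent pattern: near zero
      print(lz([0, 0, 0, 1, 1, 1], p).round(2))  # aberrant pattern: large negative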

  18. Statistical Analysis of Large Simulated Yield Datasets for Studying Climate Effects

    Science.gov (United States)

    Makowski, David; Asseng, Senthold; Ewert, Frank; Bassu, Simona; Durand, Jean-Louis; Martre, Pierre; Adam, Myriam; Aggarwal, Pramod K.; Angulo, Carlos; Baron, Christian; et al.

    2015-01-01

    Many studies have been carried out during the last decade to study the effect of climate change on crop yields and other key crop characteristics. In these studies, one or several crop models were used to simulate crop growth and development for different climate scenarios that correspond to different projections of atmospheric CO2 concentration, temperature, and rainfall changes (Semenov et al., 1996; Tubiello and Ewert, 2002; White et al., 2011). The Agricultural Model Intercomparison and Improvement Project (AgMIP; Rosenzweig et al., 2013) builds on these studies with the goal of using an ensemble of multiple crop models in order to assess effects of climate change scenarios for several crops in contrasting environments. These studies generate large datasets, including thousands of simulated crop yield data. They include series of yield values obtained by combining several crop models with different climate scenarios that are defined by several climatic variables (temperature, CO2, rainfall, etc.). Such datasets potentially provide useful information on the possible effects of different climate change scenarios on crop yields. However, it is sometimes difficult to analyze these datasets and to summarize them in a useful way due to their structural complexity; simulated yield data can differ among contrasting climate scenarios, sites, and crop models. Another issue is that it is not straightforward to extrapolate the results obtained for the scenarios to alternative climate change scenarios not initially included in the simulation protocols. Additional dynamic crop model simulations for new climate change scenarios are an option but this approach is costly, especially when a large number of crop models are used to generate the simulated data, as in AgMIP. Statistical models have been used to analyze responses of measured yield data to climate variables in past studies (Lobell et al., 2011), but the use of a statistical model to analyze yields simulated by complex

  19. Improving the analysis of designed studies by combining statistical modelling with study design information

    NARCIS (Netherlands)

    Thissen, U.; Wopereis, S.; Berg, S.A.A. van den; Bobeldijk, I.; Kleemann, R.; Kooistra, T.; Dijk, K.W. van; Ommen, B. van; Smilde, A.K.

    2009-01-01

    Background: In the fields of life sciences, so-called designed studies are used for studying complex biological systems. The data derived from these studies comply with a study design aimed at generating relevant information while diminishing unwanted variation (noise). Knowledge about the study

  20. High Resolution 3D Experimental Investigation of Flow Structures and Turbulence Statistics in the Viscous and Buffer Layer

    Science.gov (United States)

    Sheng, Jian; Malkiel, Edwin; Katz, Joseph

    2006-11-01

    Digital holographic microscopy is implemented to perform 3D velocity measurements in the near-wall region of a turbulent boundary layer in a square channel over a smooth wall at Reτ = 1,400. The measurements are performed at a resolution of ~1 µm over a sample volume of 1.5 × 2 × 1.5 mm (x+ = 50, y+ = 60, z+ = 50), sufficient for resolving buffer-layer structures and for measuring the instantaneous wall shear stress distributions from velocity gradients in the sublayer. The data provide detailed statistics on the spatial distribution of both wall shear stress components along with the characteristic flow structures, including streamwise counter-rotating vortex pairs, multiple streamwise vortices, and rare hairpins. Conditional sampling identifies characteristic length scales of 70 wall units in the spanwise and 10 wall units in the wall-normal direction. In regions of high stress, the conditionally averaged flow consists of a stagnation-like sweeping motion induced by a counter-rotating pair of streamwise vortices; regions of low stress are associated with an ejection motion, also generated by pairs of counter-rotating vortices. Statistics on the local strain and the geometric alignment between strain and vorticity show that the high-shear-generating vortices are inclined at 45° to the streamwise direction, indicating that the vortices are being stretched. Ongoing analysis examines statistics of helicity and strain and the impact of near-wall structures.

  1. On understanding crosstalk in the face of small, quantized, signals highly smeared by Poisson statistics

    International Nuclear Information System (INIS)

    Lincoln, D.; Hsieh, F.; Li, H.

    1995-01-01

    As detectors become smaller and more densely packed, signals become smaller and crosstalk between adjacent channels generally increases. Since it is often appropriate to use the distribution of signals in adjacent channels to make a useful measurement, it is imperative that inter-channel crosstalk be well understood. In this paper we shall describe the manner in which Poissonian fluctuations can give counter-intuitive results and offer some methods for extracting the desired information from the highly smeared, observed distributions. (orig.)
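    The effect is easy to reproduce in simulation: per-event crosstalk ratios computed from small, Poisson-distributed, quantized signals scatter enormously, while pooling many events recovers the underlying fraction. The mean signal and crosstalk fraction below are arbitrary choices.

      import numpy as np

      rng = np.random.default_rng(11)
      n_events, xtalk, mean_pe = 100_000, 0.10, 3.0

      true_pe = rng.poisson(mean_pe, n_events)   # quantized signal in channel A
      leak = rng.binomial(true_pe, xtalk)        # quanta leaking into channel B
      obs_a, obs_b = true_pe - leak, leak

      nz = (obs_a + obs_b) > 0
      per_event = obs_b[nz] / (obs_a + obs_b)[nz]
      print("per-event ratio: mean %.3f, std %.3f" % (per_event.mean(), per_event.std()))
      print("pooled ratio   : %.3f" % (obs_b.sum() / (obs_a.sum() + obs_b.sum())))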

  2. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  3. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  4. Learning Styles Preferences of Statistics Students: A Study in the Faculty of Business and Economics at the UAE University

    Science.gov (United States)

    Yousef, Darwish Abdulrahman

    2016-01-01

    Purpose: Although there are many studies addressing the learning styles of business students as well as students of other disciplines, there are few studies which address the learning style preferences of statistics students. The purpose of this study is to explore the learning style preferences of statistics students at a United Arab Emirates…

  5. Statistical Identification of Composed Visual Features Indicating High Likelihood of Grasp Success

    DEFF Research Database (Denmark)

    Thomsen, Mikkel Tang; Bodenhagen, Leon; Krüger, Norbert

    2013-01-01

    configurations of three 3D surface features that predict grasping actions with a high success probability. The strategy is based on first computing spatial relations between visual entities and secondly exploring the cross-space of this relational feature space and grasping actions. The data foundation...... for identifying such indicative feature constellations is generated in a simulated environment wherein visual features are extracted and a large number of grasping actions are evaluated through dynamic simulation. Based on the identified feature constellations, we validate by applying the acquired knowledge

  6. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  7. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  8. Statistical Study to Check the Conformity of Aggregate in Kirkuk City to Requirement of Iraqi Specification

    Directory of Open Access Journals (Sweden)

    Ammar Saleem Khazaal

    2018-01-01

    Full Text Available This paper reports a statistical study checking the conformity of the aggregates (coarse and fine) used in Kirkuk city to the requirements of the Iraqi specifications. Sieve-analysis data for 215 aggregate samples, obtained from the National Central Construction Laboratory and the Technical College Construction Laboratory in Kirkuk city, were analyzed using the statistical program SAS. The results showed that the percentages of fine aggregate passing sieve sizes 10 mm, 4.75 mm, and 2.36 mm fell below the minimum limit allowed by the Iraqi specifications in 5%, 17%, and 18% of samples, respectively. The percentages passing sieve sizes 1.18 mm, 600 micrometers, and 300 micrometers exceeded the upper limit of the specification in 5%, 20%, and 30% of samples, respectively, while the percentages passing sieve sizes 1.18 mm and 600 micrometers fell below the minimum limit in 17% and 4% of samples, respectively. Deviations above the upper limit of the specification at the 150-micrometer sieve occurred in 2% of the total number of samples. For coarse aggregate, the samples passing sieve sizes 37.5 mm and 20 mm conformed to the Iraqi specifications in 100% and 83% of cases, respectively; at the 10 mm sieve, 5% of samples exceeded the upper limit of the Iraqi specifications and 27% fell below the minimum limit, whereas at the 5 mm sieve 1% of samples exceeded the upper limit. The statistical analysis showed that, for fine aggregate, the fractions passing sieve sizes 10 mm, 2.36 mm, 1.18 mm, and 150 micrometers conform statistically to the Iraqi specifications, whereas those passing sieve sizes 4.75 mm, 600 micrometers, and 300 micrometers do not. The statistical analysis of the coarse aggregate results likewise showed conformity at sieve sizes of 37.5 mm and 20 mm and

  9. Myocardial infarction (heart attack) and its risk factors: a statistical study

    International Nuclear Information System (INIS)

    Salahuddin; Alamgir

    2005-01-01

    The statistical technique of odds-ratio analysis was used to examine the association of myocardial infarction (MI) with sex, smoking, hypertension, cholesterol, diabetes, family history, number of dependents, household income, and residence. For this purpose, a total of 506 patients were examined and their personal and medical data were collected. For each patient, the occurrence of myocardial infarction was studied in relation to the different risk factors. The analysis suggests that smoking, hypertension, cholesterol level, diabetes, and family history are important risk factors for the occurrence of MI. (author)
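    An odds ratio and its Woolf (logit) confidence interval follow directly from a 2x2 exposure-by-outcome table; the sketch below uses invented counts, not the study's data.

      import numpy as np

      def odds_ratio_ci(a, b, c, d, z=1.96):
          """OR and 95% CI from a 2x2 table: exposed cases a, exposed
          controls b, unexposed cases c, unexposed controls d."""
          or_ = (a * d) / (b * c)
          se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
          lo, hi = np.exp(np.log(or_) - z * se), np.exp(np.log(or_) + z * se)
          return or_, lo, hi

      # Hypothetical counts: smokers vs. non-smokers, MI vs. no MI.
      or_, lo, hi = odds_ratio_ci(180, 90, 120, 116)
      print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")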

  10. [Statistical study of the incidence of agenesis in a sample of 1529 subjects].

    Science.gov (United States)

    Lo Muzio, L; Mignogna, M D; Bucci, P; Sorrentino, F

    1989-09-01

    Following a short review of the main aetiopathogenetic theories of dental agenesis, a statistical study of this pathology is reported. 1529 orthopantomographs of juveniles aged between 7 and 14 were examined, and 79 cases of hypodontia were observed (5.2%): 32 in males (4.05%) and 47 in females (6.78%). The most frequently affected tooth was the second premolar, with an incidence of 58.9%, followed by the lateral incisor with an incidence of 26.38%. This is in agreement with the international literature.

  11. Geo-statistical model of Rainfall erosivity by using high temporal resolution precipitation data in Europe

    Science.gov (United States)

    Panagos, Panos; Ballabio, Cristiano; Borrelli, Pasquale; Meusburger, Katrin; Alewell, Christine

    2015-04-01

    Rainfall erosivity (R-factor) is one of the 6 input factors for estimating soil erosion risk with the empirical Revised Universal Soil Loss Equation (RUSLE). The R-factor is a driving force in soil erosion modelling and can potentially be used in flood risk assessment, landslide susceptibility, post-fire damage assessment, the application of agricultural management practices, and climate change modelling. Rainfall erosivity is extremely difficult to model at large (national, European) scales owing to the lack of high-temporal-resolution precipitation data covering long time series; in most cases, the R-factor is estimated from empirical equations that take precipitation volume into account. The Rainfall Erosivity Database on the European Scale (REDES) is the output of an extensive collection of high-resolution precipitation data in the 28 Member States of the European Union plus Switzerland, carried out during 2013-2014 in collaboration with national meteorological/environmental services. Because the data have different temporal resolutions (5, 10, 15, 30, 60 minutes), conversion equations were applied in order to homogenise the database at a 30-minute interval. The 1,541 stations included in REDES were interpolated using a Gaussian Process Regression (GPR) model, with climatic data (monthly precipitation, monthly temperature, wettest/driest month) from the WorldClim database, a digital elevation model, and latitude/longitude as covariates. GPR was selected among other candidate models (GAM, regression kriging) owing to its best performance both in cross-validation (R2 = 0.63) and on the fitting dataset (R2 = 0.72). The highest uncertainty occurs in north-western Scotland, northern Sweden and Finland, due to the limited number of stations in REDES; in highlands such as the Alpine arch and the Pyrenees, the diversity of environmental features also leads to relatively high uncertainty. The rainfall erosivity map of Europe is available at 500 m resolution plus the standard error
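    A GPR interpolation of station R-factors from coarse covariates can be sketched with scikit-learn as below; the covariates, kernel, and synthetic target are stand-ins for illustration, not the fitted REDES model.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(3)
      n = 200
      # Per-station covariates: annual precipitation (mm), mean temperature
      # (deg C), elevation (m), latitude (deg) -- all synthetic.
      X = np.column_stack([rng.uniform(400, 2000, n), rng.uniform(-2, 18, n),
                           rng.uniform(0, 2500, n), rng.uniform(35, 70, n)])
      r_factor = 0.9 * X[:, 0] - 15 * X[:, 3] + rng.normal(0, 80, n)

      kernel = 1.0 * RBF(length_scale=X.std(axis=0)) + WhiteKernel()
      gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, r_factor)
      mean, std = gpr.predict(X[:5], return_std=True)  # std maps the uncertainty
      print(mean.round(0), std.round(0))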

  12. The Communicability of Graphical Alternatives to Tabular Displays of Statistical Simulation Studies

    Science.gov (United States)

    Cook, Alex R.; Teo, Shanice W. L.

    2011-01-01

    Simulation studies are often used to assess the frequency properties and optimality of statistical methods. They are typically reported in tables, which may contain hundreds of figures to be contrasted over multiple dimensions. To assess the degree to which these tables are fit for purpose, we performed a randomised cross-over experiment in which statisticians were asked to extract information from (i) such a table sourced from the literature and (ii) a graphical adaptation designed by the authors, and were timed and assessed for accuracy. We developed hierarchical models accounting for differences between individuals of different experience levels (under- and post-graduate), within experience levels, and between different table-graph pairs. In our experiment, information could be extracted quicker and, for less experienced participants, more accurately from graphical presentations than tabular displays. We also performed a literature review to assess the prevalence of hard-to-interpret design features in tables of simulation studies in three popular statistics journals, finding that many are presented innumerately. We recommend simulation studies be presented in graphical form. PMID:22132184

  13. Marine Traffic Density Over Port Klang, Malaysia Using Statistical Analysis of AIS Data: A Preliminary Study

    Directory of Open Access Journals (Sweden)

    Masnawi MUSTAFFA

    2016-12-01

    Full Text Available Port Klang, Malaysia, is the 13th busiest port in the world and the busiest port in Malaysia; its capacity is expected to be able to meet demand until 2018. Although statistics published by the Port Klang Authority show that many ships use the port, those numbers cover only ships entering Port Klang, and no study has investigated how dense the traffic is in Port Klang and the surrounding sea, including the Strait of Malacca. This paper investigates traffic density over Port Klang, Malaysia, and the surrounding sea using statistical analysis of AIS data. As a preliminary study, AIS data were collected for only 7 days, to represent a week of daily traffic. As results, hourly and daily numbers of vessels, vessel classifications and sizes, and traffic paths are plotted.
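    Deriving hourly vessel counts from decoded AIS position reports is a simple aggregation; the pandas sketch below uses a handful of invented reports (the MMSI numbers and timestamps are made up).

      import pandas as pd

      ais = pd.DataFrame({
          "mmsi": [533000001, 533000001, 477000002, 477000002, 636000003],
          "timestamp": pd.to_datetime([
              "2016-06-01 00:05", "2016-06-01 00:35", "2016-06-01 00:50",
              "2016-06-01 01:20", "2016-06-01 01:40"]),
          "ship_type": ["cargo", "cargo", "tanker", "tanker", "cargo"],
      })

      # Hourly traffic density = distinct vessels reporting in each hour.
      hourly = ais.set_index("timestamp").resample("1h")["mmsi"].nunique()
      print(hourly)

      # Fleet composition over the whole window.
      print(ais.groupby("ship_type")["mmsi"].nunique())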

  14. Statistical study of the non-linear propagation of a partially coherent laser beam

    International Nuclear Information System (INIS)

    Ayanides, J.P.

    2001-01-01

    This research thesis is related to the LMJ project (Laser MegaJoule) and thus to the study and development of thermonuclear fusion. It reports the study of the propagation of a partially coherent laser beam using statistical modelling in order to obtain mean values of the field, thereby bypassing the complex and costly calculation of deterministic quantities. The random fluctuations of the propagated field are assumed to follow Gaussian statistics; the laser central wavelength is assumed to be small with respect to the fluctuation magnitude; and a scale factor is introduced to clearly distinguish the scale of the fast random variations of the field fluctuations from the scale of the slow deterministic variations of the field envelopes. The author studies propagation through a purely linear medium and through a non-dispersive medium; then through a slow, non-dispersive, non-linear medium (whose reaction time is large with respect to the grain correlation duration but small with respect to the variation scale of the field's macroscopic envelope); and thirdly through an instantaneous, dispersive, non-linear medium (which reacts instantaneously to the field)

  15. Characterization of Sensory-Motor Behavior Under Cognitive Load Using a New Statistical Platform for Studies of Embodied Cognition

    Directory of Open Access Journals (Sweden)

    Jihye Ryu

    2018-04-01

    Full Text Available The field of enacted/embodied cognition has emerged as a contemporary attempt to connect the mind and body in the study of cognition. However, there has been a paucity of methods that enable a multi-layered approach tapping into different levels of functionality within the nervous systems (e.g., continuously capturing in tandem multi-modal biophysical signals in naturalistic settings). The present study introduces a new theoretical and statistical framework to characterize the influences of cognitive demands on biophysical rhythmic signals harnessed from deliberate, spontaneous and autonomic activities. In this study, nine participants performed a basic pointing task to communicate a decision while they were exposed to different levels of cognitive load. Within these decision-making contexts, we examined the moment-by-moment fluctuations in the peak amplitude and timing of the biophysical time series data (e.g., continuous waveforms extracted from hand kinematics and heart signals). These spike-trains data offered high statistical power for personalized empirical statistical estimation and were well-characterized by a Gamma process. Our approach enabled the identification of different empirically estimated families of probability distributions to facilitate inference regarding the continuous physiological phenomena underlying cognitively driven decision-making. We found that the same pointing task revealed shifts in the probability distribution functions (PDFs) of the hand kinematic signals under study, accompanied by shifts in the signatures of the heart inter-beat-interval timings. Within the time scale of an experimental session, marked changes in skewness and dispersion of the distributions were tracked on the Gamma parameter plane with 95% confidence. The results suggest that traditional theoretical assumptions of stationarity and normality in biophysical data from the nervous systems are incongruent with the true statistical nature of empirical data.

  16. Characterization of Sensory-Motor Behavior Under Cognitive Load Using a New Statistical Platform for Studies of Embodied Cognition

    Science.gov (United States)

    Ryu, Jihye; Torres, Elizabeth B.

    2018-01-01

    The field of enacted/embodied cognition has emerged as a contemporary attempt to connect the mind and body in the study of cognition. However, there has been a paucity of methods that enable a multi-layered approach tapping into different levels of functionality within the nervous systems (e.g., continuously capturing in tandem multi-modal biophysical signals in naturalistic settings). The present study introduces a new theoretical and statistical framework to characterize the influences of cognitive demands on biophysical rhythmic signals harnessed from deliberate, spontaneous and autonomic activities. In this study, nine participants performed a basic pointing task to communicate a decision while they were exposed to different levels of cognitive load. Within these decision-making contexts, we examined the moment-by-moment fluctuations in the peak amplitude and timing of the biophysical time series data (e.g., continuous waveforms extracted from hand kinematics and heart signals). These spike-trains data offered high statistical power for personalized empirical statistical estimation and were well-characterized by a Gamma process. Our approach enabled the identification of different empirically estimated families of probability distributions to facilitate inference regarding the continuous physiological phenomena underlying cognitively driven decision-making. We found that the same pointing task revealed shifts in the probability distribution functions (PDFs) of the hand kinematic signals under study, accompanied by shifts in the signatures of the heart inter-beat-interval timings. Within the time scale of an experimental session, marked changes in skewness and dispersion of the distributions were tracked on the Gamma parameter plane with 95% confidence. The results suggest that traditional theoretical assumptions of stationarity and normality in biophysical data from the nervous systems are incongruent with the true statistical nature of empirical data.

  17. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    Science.gov (United States)

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
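
    A sketch of the second procedure's spirit: harvesting existing researcher Excel files into one standardised long-format table for automated analysis scripts. The folder name, sheet name, and column names are assumptions for illustration; the ACuteTox file conventions are not given in the record.

    ```python
    # Sketch: collecting per-compound Excel files into one standardised table
    # for automated analysis. The folder name, sheet name, and column names
    # are assumptions for illustration.
    from pathlib import Path
    import pandas as pd

    frames = []
    for path in Path("raw_data").glob("*.xlsx"):
        df = pd.read_excel(path, sheet_name="data")     # assumed sheet layout
        df["compound"] = path.stem                      # provenance per file
        frames.append(df[["compound", "concentration", "response"]])

    tidy = pd.concat(frames, ignore_index=True)
    tidy.to_csv("all_experiments.csv", index=False)     # input for analysis scripts
    ```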

  18. The Statistics and Mathematics of High Dimension Low Sample Size Asymptotics.

    Science.gov (United States)

    Shen, Dan; Shen, Haipeng; Zhu, Hongtu; Marron, J S

    2016-10-01

    The aim of this paper is to establish several deep theoretical properties of principal component analysis for multiple-component spike covariance models. Our new results reveal an asymptotic conical structure in critical sample eigendirections under the spike models with distinguishable (or indistinguishable) eigenvalues, when the sample size and/or the number of variables (or dimension) tend to infinity. The consistency of the sample eigenvectors relative to their population counterparts is determined by the ratio between the dimension and the product of the sample size with the spike size. When this ratio converges to a nonzero constant, the sample eigenvector converges to a cone, with a certain angle to its corresponding population eigenvector. In the High Dimension, Low Sample Size case, the angle between the sample eigenvector and its population counterpart converges to a limiting distribution. Several generalizations of the multi-spike covariance models are also explored, and additional theoretical results are presented.
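
    The conical structure can be observed numerically. A sketch under assumed settings (single spike, spike eigenvalue growing linearly with d so that the ratio d/(n·spike) stays fixed): the angle between sample and population eigenvectors stabilises at a nonzero value as the dimension grows with the sample size held fixed.

    ```python
    # Sketch: conical limit of sample eigenvectors in a single-spike model.
    # The spike eigenvalue grows linearly with d, so d/(n*spike) stays fixed
    # and the angle stabilises at a nonzero value (the "cone"). All settings
    # are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 20                                     # fixed small sample size (HDLSS)
    for d in (100, 1000, 10000):
        spike = float(d)                       # leading population eigenvalue
        u = np.zeros(d); u[0] = 1.0            # population eigenvector
        X = rng.standard_normal((n, d))        # N(0, I) ...
        X[:, 0] *= np.sqrt(spike)              # ... with a spike in direction u
        Xc = X - X.mean(axis=0)
        v = np.linalg.svd(Xc, full_matrices=False)[2][0]   # leading sample PC
        angle = np.degrees(np.arccos(min(1.0, abs(v @ u))))
        print(f"d={d:6d}  angle(sample PC, population PC) = {angle:5.1f} deg")
    ```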

  19. On Statistical Modeling of Sequencing Noise in High Depth Data to Assess Tumor Evolution

    Science.gov (United States)

    Rabadan, Raul; Bhanot, Gyan; Marsilio, Sonia; Chiorazzi, Nicholas; Pasqualucci, Laura; Khiabanian, Hossein

    2017-12-01

    One cause of cancer mortality is tumor evolution to therapy-resistant disease. First-line therapy often targets the dominant clone, and drug resistance can emerge from preexisting clones that gain fitness through therapy-induced natural selection. Such mutations may be identified using targeted sequencing assays by analysis of noise in high-depth data. Here, we develop a comprehensive, unbiased model for sequencing error background. We find that noise in sufficiently deep DNA sequencing data can be approximated by aggregating negative binomial distributions. Mutations with frequencies above noise may have prognostic value. We evaluate our model with simulated exponentially expanded populations as well as data from cell line and patient sample dilution experiments, demonstrating its utility in prognosticating tumor progression. Our results may have the potential to identify significant mutations that can cause recurrence. These results are relevant in the pretreatment clinical setting, to determine appropriate therapy and prepare for potential recurrence.
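
    A sketch of the flavour of such a noise test, assuming illustrative (not paper-derived) negative binomial parameters: is an observed variant read count implausible under the background error model?

    ```python
    # Sketch: testing whether an observed variant count exceeds background
    # sequencing noise modelled as negative binomial; all parameter values
    # here are illustrative, not taken from the paper.
    from scipy import stats

    depth = 50_000                # read depth at the locus
    err_rate = 1e-3               # assumed background per-base error rate
    size = 5.0                    # assumed NB overdispersion (size) parameter

    mu = depth * err_rate                     # expected error reads
    p = size / (size + mu)                    # SciPy's (n, p) parameterisation
    observed = 95                             # reads supporting the variant
    pval = stats.nbinom.sf(observed - 1, size, p)   # P(X >= observed) under noise
    print(f"expected noise reads ~{mu:.0f}; P-value = {pval:.3g}")
    ```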

  20. An inter-hemispheric, statistical study of nightside spectral width distributions from coherent HF scatter radars

    Directory of Open Access Journals (Sweden)

    E. E. Woodfield

    2002-12-01

    Full Text Available A statistical investigation of the Doppler spectral width parameter routinely observed by HF coherent radars has been conducted between the Northern and Southern Hemispheres for the nightside ionosphere. Data from the SuperDARN radars at Thykkvibær, Iceland and Syowa East, Antarctica have been employed for this purpose. Both radars frequently observe regions of high (>200 m s-1) spectral width polewards of low (<200 m s-1) spectral width. Three years of data from both radars have been analysed both for the spectral width and the line-of-sight velocity. The pointing direction of these two radars is such that the flow reversal boundary may be estimated from the velocity data, and therefore, we have an estimate of the open/closed field line boundary location for comparison with the high spectral widths. Five key observations regarding the behaviour of the spectral width on the nightside have been made. These are (i) the two radars observe similar characteristics on a statistical basis; (ii) a latitudinal dependence related to magnetic local time is found in both hemispheres; (iii) a seasonal dependence of the spectral width is observed by both radars, which shows a marked absence of latitudinal dependence during the summer months; (iv) in general, the Syowa East spectral width tends to be larger than that from Iceland East; and (v) the highest spectral widths seem to appear on both open and closed field lines. Points (i) and (ii) indicate that the cause of high spectral width is magnetospheric in origin. Point (iii) suggests that either the propagation of the HF radio waves to regions of high spectral width or the generating mechanism(s) for high spectral width is affected by solar illumination or other seasonal effects. Point (iv) suggests that the radar beams from each of the radars are subject either to different instrumental or propagation effects, or different geophysical conditions due to their locations, although we suggest that this result is more likely to

  1. Statistical homogeneity tests applied to large data sets from high energy physics experiments

    Science.gov (United States)

    Trusina, J.; Franc, J.; Kůs, V.

    2017-12-01

    Homogeneity tests are used in high energy physics to verify that simulated Monte Carlo samples have the same distribution as the data measured by the particle detector. The Kolmogorov-Smirnov, χ2, and Anderson-Darling tests are the techniques most often used to assess the samples’ homogeneity. Since MC generators produce plenty of entries from different models, each entry has to be re-weighted to match the size of the measured data sample. One way to perform homogeneity testing is through binning. If we do not want to lose any information, we can apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present results based on a numerical analysis which focuses on estimation of the type-I error and the power of the test. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
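
    A sketch of a weighted two-sample Kolmogorov-Smirnov distance, the basic ingredient of such generalized weighted tests; assessing significance (via the asymptotics the paper develops, or via permutation) is omitted from this sketch, and the samples are simulated.

    ```python
    # Sketch: a weighted two-sample Kolmogorov-Smirnov distance between a
    # re-weighted Monte Carlo sample and unweighted measured data; significance
    # (permutation or asymptotics) is omitted from this sketch.
    import numpy as np

    def weighted_ks(mc, w, data):
        """Max distance between the weighted MC ECDF and the data ECDF."""
        grid = np.sort(np.concatenate([mc, data]))
        order = np.argsort(mc)
        counts = np.searchsorted(mc[order], grid, side="right")
        wcum = np.concatenate([[0.0], np.cumsum(w[order]) / np.sum(w)])
        F_mc = wcum[counts]
        F_data = np.searchsorted(np.sort(data), grid, side="right") / len(data)
        return np.max(np.abs(F_mc - F_data))

    rng = np.random.default_rng(1)
    mc = rng.exponential(1.2, 5000)          # simulated entries
    w = rng.uniform(0.5, 1.5, 5000)          # per-entry MC weights
    data = rng.exponential(1.0, 2000)        # "measured" data
    print(f"weighted KS statistic: {weighted_ks(mc, w, data):.3f}")
    ```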

  2. Addressing the "Replication Crisis": Using Original Studies to Design Replication Studies with Appropriate Statistical Power.

    Science.gov (United States)

    Anderson, Samantha F; Maxwell, Scott E

    2017-01-01

    Psychology is undergoing a replication crisis. The discussion surrounding this crisis has centered on mistrust of previous findings. Researchers planning replication studies often use the original study sample effect size as the basis for sample size planning. However, this strategy ignores uncertainty and publication bias in estimated effect sizes, resulting in overly optimistic calculations. A psychologist who intends to obtain power of .80 in the replication study, and performs calculations accordingly, may have an actual power lower than .80. We performed simulations to reveal the magnitude of the difference between actual and intended power based on common sample size planning strategies and assessed the performance of methods that aim to correct for effect size uncertainty and/or bias. Our results imply that even if original studies reflect actual phenomena and were conducted in the absence of questionable research practices, popular approaches to designing replication studies may result in a low success rate, especially if the original study is underpowered. Methods correcting for bias and/or uncertainty generally had higher actual power, but were not a panacea for an underpowered original study. Thus, it becomes imperative that 1) original studies are adequately powered and 2) replication studies are designed with methods that are more likely to yield the intended level of power.
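
    The gap between intended and actual power can be reproduced in a few lines. A sketch under assumed values (true d = 0.3, original n = 50 per group), using statsmodels power routines; the paper's own simulation design is more extensive.

    ```python
    # Sketch: intended vs actual power when a replication's sample size is
    # planned from a noisy original effect-size estimate (two-sample t-test).
    # true_d and n_orig are assumptions for illustration.
    import numpy as np
    from statsmodels.stats.power import TTestIndPower

    rng = np.random.default_rng(2)
    true_d, n_orig, n_sim = 0.3, 50, 500     # underpowered original study
    solver = TTestIndPower()

    achieved = []
    for _ in range(n_sim):
        # original study: Cohen's d estimated with sampling error
        a = rng.normal(true_d, 1, n_orig)
        b = rng.normal(0.0, 1, n_orig)
        d_hat = (a.mean() - b.mean()) / np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
        if d_hat < 0.05:                     # skip degenerate sample-size plans
            continue
        n_rep = solver.solve_power(effect_size=d_hat, power=0.8, alpha=0.05)
        # actual power of that plan under the *true* effect size
        achieved.append(solver.power(effect_size=true_d, nobs1=n_rep, alpha=0.05))

    print(f"mean actual power: {np.mean(achieved):.2f} (intended: 0.80)")
    ```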

  3. Study of beta-delayed neutron with proton-neutron QRPA plus statistical model

    International Nuclear Information System (INIS)

    Minato, Futoshi; Iwamoto, Osamu

    2015-01-01

    β-delayed neutron emission is known to be important for the safe operation of nuclear reactors and for the prediction of elemental abundances after freeze-out of the r-process. A lot of research on it has been performed. However, the experimental data are far from complete, since the lifetimes of most of the relevant nuclei are so short that they cannot be measured with high efficiency. In order to estimate the half-lives and delayed neutron emission probabilities of unexplored nuclei, we developed a new theoretical method which combines a proton-neutron quasi-particle random-phase approximation with the Hauser-Feshbach statistical model. The present method reproduces experimentally known β-decay half-lives within a factor of 10, and about 40% of them within a factor of 2. However, it fails to reproduce delayed neutron emission probabilities. We discuss the problems and the remedies for them to be made in future work. (author)

  4. Statistical inference for extended or shortened phase II studies based on Simon's two-stage designs.

    Science.gov (United States)

    Zhao, Junjun; Yu, Menggang; Feng, Xi-Ping

    2015-06-07

    Simon's two-stage designs are popular choices for conducting phase II clinical trials, especially in oncology trials, to reduce the number of patients placed on ineffective experimental therapies. Recently Koyama and Chen (2008) discussed how to conduct proper inference for such studies because they found that inference procedures used with Simon's designs almost always ignore the actual sampling plan used. In particular, they proposed an inference method for studies whose actual second-stage sample sizes differ from the planned ones. We consider an alternative inference method based on the likelihood ratio. In particular, we order permissible sample paths under Simon's two-stage designs using their corresponding conditional likelihoods. In this way, we can calculate p-values using the common definition: the probability of obtaining a test statistic value at least as extreme as that observed under the null hypothesis. In addition to providing inference for a couple of scenarios where Koyama and Chen's method can be difficult to apply, the resulting estimate based on our method appears to have certain advantages in terms of inference properties in many numerical simulations. It generally led to smaller biases and narrower confidence intervals while maintaining similar coverage. We also illustrate the two methods in a real data setting. Inference procedures used with Simon's designs almost always ignore the actual sampling plan. Reported p-values, point estimates and confidence intervals for the response rate are not usually adjusted for the design's adaptiveness. Proper statistical inference procedures should be used.
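
    A sketch of exact inference under a Simon two-stage design with illustrative parameters: it enumerates all terminal outcomes under the null and orders them by the response-rate estimate. The paper's method would instead order sample paths by conditional likelihood, which would replace the ordering used below.

    ```python
    # Sketch: exact null distribution of outcomes under a Simon two-stage
    # design (n1=13, r1=3, n=43 are illustrative), with a simple p-value that
    # orders outcomes by the response-rate estimate; the paper's
    # conditional-likelihood ordering would replace this key.
    from scipy.stats import binom

    n1, r1, n_total, p0 = 13, 3, 43, 0.10     # illustrative design, null rate
    n2 = n_total - n1

    outcomes = []                              # (responses x, sample size n, prob)
    for x1 in range(n1 + 1):
        pr1 = binom.pmf(x1, n1, p0)
        if x1 <= r1:                           # stopped for futility at stage 1
            outcomes.append((x1, n1, pr1))
        else:                                  # continued to stage 2
            for x2 in range(n2 + 1):
                outcomes.append((x1 + x2, n_total, pr1 * binom.pmf(x2, n2, p0)))

    x_obs, n_obs = 16, n_total                 # suppose 16/43 responses observed
    obs_rate = x_obs / n_obs
    pval = sum(pr for x, n, pr in outcomes if x / n >= obs_rate)
    print(f"exact p-value under H0 (p0={p0}): {pval:.4f}")
    ```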

  5. Proper interpretation of chronic toxicity studies and their statistics: A critique of "Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example".

    Science.gov (United States)

    Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol

    2015-09-02

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. Published by Elsevier Ireland Ltd.
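
    Taking the abstract's own numbers at face value, a two-line sketch of how (un)surprising 209 significant results out of 4800 would be, under the idealisations of independent tests and an exact 0.05 operating level; as the abstract notes, discrete tests typically operate below the nominal level, which would shrink the expected count further.

    ```python
    # Sketch: with 4800 independent tests at an exact 5% level, the chance of
    # seeing 209 or fewer "significant" results purely by chance.
    from scipy.stats import binom
    print(binom.cdf(209, 4800, 0.05))   # P(<= 209 hits | pure chance)
    ```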

  6. Estimating summary statistics for electronic health record laboratory data for use in high-throughput phenotyping algorithms

    Science.gov (United States)

    Elhadad, N.; Claassen, J.; Perotte, R.; Goldstein, A.; Hripcsak, G.

    2018-01-01

    We study the question of how to represent or summarize raw laboratory data taken from an electronic health record (EHR) using parametric model selection to reduce or cope with biases induced through clinical care. It has been previously demonstrated that the health care process (Hripcsak and Albers, 2012, 2013), as defined by measurement context (Hripcsak and Albers, 2013; Albers et al., 2012) and measurement patterns (Albers and Hripcsak, 2010, 2012), can influence how EHR data are distributed statistically (Kohane and Weber, 2013; Pivovarov et al., 2014). We construct an algorithm, PopKLD, which is based on information criterion model selection (Burnham and Anderson, 2002; Claeskens and Hjort, 2008), is intended to reduce and cope with health care process biases and to produce an intuitively understandable continuous summary. The PopKLD algorithm can be automated and is designed to be applicable in high-throughput settings; for example, the output of the PopKLD algorithm can be used as input for phenotyping algorithms. Moreover, we develop the PopKLD-CAT algorithm that transforms the continuous PopKLD summary into a categorical summary useful for applications that require categorical data such as topic modeling. We evaluate our methodology in two ways. First, we apply the method to laboratory data collected in two different health care contexts, primary versus intensive care. We show that the PopKLD preserves known physiologic features in the data that are lost when summarizing the data using more common laboratory data summaries such as mean and standard deviation. Second, for three disease-laboratory measurement pairs, we perform a phenotyping task: we use the PopKLD and PopKLD-CAT algorithms to define high and low values of the laboratory variable that are used for defining a disease state. We then compare the relationship between the PopKLD-CAT summary disease predictions and the same predictions using empirically estimated mean and standard deviation to a

  7. Estimating summary statistics for electronic health record laboratory data for use in high-throughput phenotyping algorithms.

    Science.gov (United States)

    Albers, D J; Elhadad, N; Claassen, J; Perotte, R; Goldstein, A; Hripcsak, G

    2018-02-01

    We study the question of how to represent or summarize raw laboratory data taken from an electronic health record (EHR) using parametric model selection to reduce or cope with biases induced through clinical care. It has been previously demonstrated that the health care process (Hripcsak and Albers, 2012, 2013), as defined by measurement context (Hripcsak and Albers, 2013; Albers et al., 2012) and measurement patterns (Albers and Hripcsak, 2010, 2012), can influence how EHR data are distributed statistically (Kohane and Weber, 2013; Pivovarov et al., 2014). We construct an algorithm, PopKLD, which is based on information criterion model selection (Burnham and Anderson, 2002; Claeskens and Hjort, 2008), is intended to reduce and cope with health care process biases and to produce an intuitively understandable continuous summary. The PopKLD algorithm can be automated and is designed to be applicable in high-throughput settings; for example, the output of the PopKLD algorithm can be used as input for phenotyping algorithms. Moreover, we develop the PopKLD-CAT algorithm that transforms the continuous PopKLD summary into a categorical summary useful for applications that require categorical data such as topic modeling. We evaluate our methodology in two ways. First, we apply the method to laboratory data collected in two different health care contexts, primary versus intensive care. We show that the PopKLD preserves known physiologic features in the data that are lost when summarizing the data using more common laboratory data summaries such as mean and standard deviation. Second, for three disease-laboratory measurement pairs, we perform a phenotyping task: we use the PopKLD and PopKLD-CAT algorithms to define high and low values of the laboratory variable that are used for defining a disease state. We then compare the relationship between the PopKLD-CAT summary disease predictions and the same predictions using empirically estimated mean and standard deviation to a
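
    A sketch in the spirit of information-criterion model selection over candidate parametric families for a single laboratory variable; the actual PopKLD algorithm and its Kullback-Leibler machinery differ, and the "glucose" values here are simulated stand-ins.

    ```python
    # Sketch: information-criterion selection among candidate parametric
    # families for one laboratory variable, in the spirit of (but not
    # identical to) the PopKLD approach; the data are simulated stand-ins.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    glucose = rng.gamma(9.0, 12.0, 4000)

    candidates = {"normal": stats.norm, "lognormal": stats.lognorm,
                  "gamma": stats.gamma}
    for name, dist in candidates.items():
        params = dist.fit(glucose)
        loglik = np.sum(dist.logpdf(glucose, *params))
        aic = 2 * len(params) - 2 * loglik
        print(f"{name:10s} AIC = {aic:10.1f}")
    # the winning family's fitted parameters form the continuous summary
    ```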

  8. Linguistic Diversity and Traffic Accidents: Lessons from Statistical Studies of Cultural Traits

    Science.gov (United States)

    Roberts, Seán; Winters, James

    2013-01-01

    The recent proliferation of digital databases of cultural and linguistic data, together with new statistical techniques becoming available, has led to a rise in so-called nomothetic studies [1]–[8]. These seek relationships between demographic variables and cultural traits from large, cross-cultural datasets. The insights from these studies are important for understanding how cultural traits evolve. While these studies are fascinating and are good at generating testable hypotheses, they may underestimate the probability of finding spurious correlations between cultural traits. Here we show that this kind of approach can find links between such unlikely cultural traits as traffic accidents, levels of extra-marital sex, political collectivism and linguistic diversity. This suggests that spurious correlations, due to historical descent, geographic diffusion or increased noise-to-signal ratios in large datasets, are much more likely than some studies admit. We suggest some criteria for the evaluation of nomothetic studies and some practical solutions to the problems. Since some of these studies are receiving media attention without a widespread understanding of the complexities of the issue, there is a risk that poorly controlled studies could affect policy. We hope to contribute towards a general skepticism for correlational studies by demonstrating the ease of finding apparently rigorous correlations between cultural traits. Despite this, we see well-controlled nomothetic studies as useful tools for the development of theories. PMID:23967132

  9. Linguistic diversity and traffic accidents: lessons from statistical studies of cultural traits.

    Directory of Open Access Journals (Sweden)

    Seán Roberts

    Full Text Available The recent proliferation of digital databases of cultural and linguistic data, together with new statistical techniques becoming available, has led to a rise in so-called nomothetic studies [1]-[8]. These seek relationships between demographic variables and cultural traits from large, cross-cultural datasets. The insights from these studies are important for understanding how cultural traits evolve. While these studies are fascinating and are good at generating testable hypotheses, they may underestimate the probability of finding spurious correlations between cultural traits. Here we show that this kind of approach can find links between such unlikely cultural traits as traffic accidents, levels of extra-marital sex, political collectivism and linguistic diversity. This suggests that spurious correlations, due to historical descent, geographic diffusion or increased noise-to-signal ratios in large datasets, are much more likely than some studies admit. We suggest some criteria for the evaluation of nomothetic studies and some practical solutions to the problems. Since some of these studies are receiving media attention without a widespread understanding of the complexities of the issue, there is a risk that poorly controlled studies could affect policy. We hope to contribute towards a general skepticism for correlational studies by demonstrating the ease of finding apparently rigorous correlations between cultural traits. Despite this, we see well-controlled nomothetic studies as useful tools for the development of theories.

  10. Linguistic diversity and traffic accidents: lessons from statistical studies of cultural traits.

    Science.gov (United States)

    Roberts, Seán; Winters, James

    2013-01-01

    The recent proliferation of digital databases of cultural and linguistic data, together with new statistical techniques becoming available, has led to a rise in so-called nomothetic studies [1]-[8]. These seek relationships between demographic variables and cultural traits from large, cross-cultural datasets. The insights from these studies are important for understanding how cultural traits evolve. While these studies are fascinating and are good at generating testable hypotheses, they may underestimate the probability of finding spurious correlations between cultural traits. Here we show that this kind of approach can find links between such unlikely cultural traits as traffic accidents, levels of extra-marital sex, political collectivism and linguistic diversity. This suggests that spurious correlations, due to historical descent, geographic diffusion or increased noise-to-signal ratios in large datasets, are much more likely than some studies admit. We suggest some criteria for the evaluation of nomothetic studies and some practical solutions to the problems. Since some of these studies are receiving media attention without a widespread understanding of the complexities of the issue, there is a risk that poorly controlled studies could affect policy. We hope to contribute towards a general skepticism for correlational studies by demonstrating the ease of finding apparently rigorous correlations between cultural traits. Despite this, we see well-controlled nomothetic studies as useful tools for the development of theories.
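
    A sketch of the core warning: two historically autocorrelated traits, modelled here as fully independent random walks across 100 societies, yield "significant" Pearson correlations far more often than the nominal 5%.

    ```python
    # Sketch: independent random walks (a stand-in for historically
    # autocorrelated cultural traits measured across 100 societies) produce
    # "significant" Pearson correlations far more often than the nominal 5%.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    hits = 0
    trials = 1000
    for _ in range(trials):
        x = np.cumsum(rng.standard_normal(100))
        y = np.cumsum(rng.standard_normal(100))
        r, p = stats.pearsonr(x, y)
        if p < 0.05:
            hits += 1
    print(f"spurious 'significant' correlations: {100 * hits / trials:.0f}%")
    ```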

  11. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    This book on statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is gradually and thoroughly developed. All three types of statistical distribution functions are derived separately, together with their ranges of application and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems are…

  12. Altered glucose metabolism in juvenile myoclonic epilepsy: a PET study with statistical parametric mapping

    International Nuclear Information System (INIS)

    Lim, G. C.; Kim, J. H.; Kang, J. G.; Kim, J. S.; Yeo, J. S.; Lee, S. A.; Moon, D. H

    2004-01-01

    Juvenile myoclonic epilepsy (JME) is a hereditary, age-dependent epilepsy syndrome, characterized by myoclonic jerks on awakening and generalized tonic-clonic seizures. Although there have been considerable studies on the mechanisms underlying JME, its precise pathogenesis remains obscure. The aim of this study was to investigate alterations of cerebral glucose metabolism in patients with JME. We studied 16 JME patients (mean age: 22 yrs, M/F: 9/7) with brain FDG-PET and simultaneous EEG recording. On the basis of the number of generalized spike-and-wave (GSW) discharges on the 30 min EEG recording after the injection of FDG (370 MBq), we classified patients into two groups (patients in group A had 10 or more GSW discharges; those in group B, 9 or fewer). We applied the automated and objective technique of statistical parametric mapping (SPM) to the analysis of FDG-PET to determine the significant hyper- and hypometabolic regions compared with those of 19 age-matched normal control subjects. We found significant hypermetabolic regions in the bilateral thalamus and the central portion of the upper brainstem in the 16 patients with JME at a statistical threshold of uncorrected P < 0.05. These changes were also seen in group A (n=8), but not in group B (n=8). Additionally, we found significant hypometabolism in bilateral, widespread cortical regions in the 16 patients with JME at a threshold of uncorrected P < 0.01. Similar hypometabolic patterns were also observed in both group A and group B, being more prominent in group A. This study provides evidence for the key role of the thalamus and the brainstem reticular activating system in generating spontaneous GSW discharges, which is considered a fundamental pathogenesis underlying JME. This study also suggests that patients with JME might suffer from subtle abnormalities of cognitive and executive cortical functions.

  13. Altered glucose metabolism in juvenile myoclonic epilepsy: a PET study with statistical parametric mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lim, G. C.; Kim, J. H.; Kang, J. G.; Kim, J. S.; Yeo, J. S.; Lee, S. A.; Moon, D. H [Asan Medical Center, Seoul (Korea, Republic of)

    2004-07-01

    Juvenile myoclonic epilepsy (JME) is a hereditary, age-dependent epilepsy syndrome, characterized by myoclonic jerks on awakening and generalized tonic-clonic seizures. Although there have been considerable studies on the mechanisms underlying JME, its precise pathogenesis remains obscure. The aim of this study was to investigate alterations of cerebral glucose metabolism in patients with JME. We studied 16 JME patients (mean age: 22 yrs, M/F: 9/7) with brain FDG-PET and simultaneous EEG recording. On the basis of the number of generalized spike-and-wave (GSW) discharges on the 30 min EEG recording after the injection of FDG (370 MBq), we classified patients into two groups (patients in group A had 10 or more GSW discharges; those in group B, 9 or fewer). We applied the automated and objective technique of statistical parametric mapping (SPM) to the analysis of FDG-PET to determine the significant hyper- and hypometabolic regions compared with those of 19 age-matched normal control subjects. We found significant hypermetabolic regions in the bilateral thalamus and the central portion of the upper brainstem in the 16 patients with JME at a statistical threshold of uncorrected P < 0.05. These changes were also seen in group A (n=8), but not in group B (n=8). Additionally, we found significant hypometabolism in bilateral, widespread cortical regions in the 16 patients with JME at a threshold of uncorrected P < 0.01. Similar hypometabolic patterns were also observed in both group A and group B, being more prominent in group A. This study provides evidence for the key role of the thalamus and the brainstem reticular activating system in generating spontaneous GSW discharges, which is considered a fundamental pathogenesis underlying JME. This study also suggests that patients with JME might suffer from subtle abnormalities of cognitive and executive cortical functions.

  14. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  15. TRAN-STAT: (statistics for environmental studies), Number 23, April 1983

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Simpson, J.C.

    1983-07-01

    Various statistical computing formulas have been used to estimate the transfer of radionuclides from one environmental component to another. The parameter being estimated is the ratio μ_V/μ_U, where μ_U is the true mean concentration of the donor component (e.g., soil), and μ_V is the true mean concentration of the receiving component (e.g., vegetation). In radionuclide research μ_V/μ_U may be the concentration ratio (CR) or the inventory ratio (IR). A Monte Carlo computer simulation study in conjunction with mathematical derivations of bias and root expected mean square was conducted to provide guidance on which of eight estimators (computing formulas) are most likely to give the best estimate of μ_V/μ_U. These estimators were evaluated for both laboratory and field studies when data are normally or lognormally distributed.
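
    A sketch of the kind of Monte Carlo comparison described, for two of the common estimators of μ_V/μ_U (ratio of means vs mean of paired ratios) under lognormal data; the study itself compared eight formulas, and the distributions and sample size below are illustrative assumptions.

    ```python
    # Sketch: Monte Carlo comparison of two estimators of the transfer ratio
    # mu_V/mu_U for paired lognormal soil (U) and vegetation (V) data. The
    # distributions and sample size are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(5)
    true_ratio, n, reps = 0.25, 12, 20000
    est_rom, est_mor = [], []                  # ratio of means / mean of ratios
    for _ in range(reps):
        u = rng.lognormal(mean=1.0, sigma=0.8, size=n)           # soil
        noise = rng.lognormal(0.0, 0.5, n) / np.exp(0.5**2 / 2)  # mean-1 noise
        v = true_ratio * u * noise                               # vegetation
        est_rom.append(v.mean() / u.mean())
        est_mor.append(np.mean(v / u))

    for name, e in (("ratio of means", est_rom), ("mean of ratios", est_mor)):
        e = np.asarray(e)
        bias = e.mean() - true_ratio
        rmse = np.sqrt(np.mean((e - true_ratio) ** 2))
        print(f"{name}: bias = {bias:+.4f}, RMSE = {rmse:.4f}")
    ```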

  16. Designs and Methods for Association Studies and Population Size Inference in Statistical Genetics

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    method provides a simple goodness-of-fit test by comparing the observed SFS with the expected SFS under a given model of population size changes. By the use of Monte Carlo estimation the expected time between coalescent events can be estimated and the expected SFS can thereby be evaluated. Using… ). The OR is interpreted as the effect of an exposure on the probability of being diseased at the end of follow-up, while the interpretation of the IRR is the effect of an exposure on the probability of becoming diseased. Through a simulation study, the OR from a classical case-control study is shown to be an inconsistent… the classical chi-square statistics we are able to infer single-parameter models. Multiple-parameter models, e.g. multiple epochs, are harder to identify. By introducing the inference of population size back in time as an inverse problem, the second procedure applies the theory of smoothing splines to infer…
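
    A sketch of the goodness-of-fit idea with a toy observed SFS, compared against the constant-population expectation E[ξ_i] ∝ 1/i; the Monte Carlo machinery for non-constant histories described in the record is omitted, and the observed counts are invented for illustration.

    ```python
    # Sketch: chi-square goodness-of-fit of a toy observed site-frequency
    # spectrum (SFS) against the constant-population expectation E[xi_i] ~ 1/i.
    # The observed counts are invented for illustration.
    import numpy as np
    from scipy.stats import chisquare

    observed = np.array([880, 420, 270, 200, 160, 130, 115, 100, 90, 85])
    i = np.arange(1, len(observed) + 1)
    expected = (1 / i) / np.sum(1 / i) * observed.sum()   # same total as observed

    stat, pval = chisquare(observed, expected)
    print(f"chi-square = {stat:.1f}, p = {pval:.3g}")
    # a small p-value would indicate departure from a constant population size
    ```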

  17. Statistical Methods for Unusual Count Data: Examples From Studies of Microchimerism

    Science.gov (United States)

    Guthrie, Katherine A.; Gammill, Hilary S.; Kamper-Jørgensen, Mads; Tjønneland, Anne; Gadi, Vijayakrishna K.; Nelson, J. Lee; Leisenring, Wendy

    2016-01-01

    Natural acquisition of small amounts of foreign cells or DNA, referred to as microchimerism, occurs primarily through maternal-fetal exchange during pregnancy. Microchimerism can persist long-term and has been associated with both beneficial and adverse human health outcomes. Quantitative microchimerism data present challenges for statistical analysis, including a skewed distribution, excess zero values, and occasional large values. Methods for comparing microchimerism levels across groups while controlling for covariates are not well established. We compared statistical models for quantitative microchimerism values, applied to simulated data sets and 2 observed data sets, to make recommendations for analytic practice. Modeling the level of quantitative microchimerism as a rate via Poisson or negative binomial model with the rate of detection defined as a count of microchimerism genome equivalents per total cell equivalents tested utilizes all available data and facilitates a comparison of rates between groups. We found that both the marginalized zero-inflated Poisson model and the negative binomial model can provide unbiased and consistent estimates of the overall association of exposure or study group with microchimerism detection rates. The negative binomial model remains the more accessible of these 2 approaches; thus, we conclude that the negative binomial model may be most appropriate for analyzing quantitative microchimerism data. PMID:27769989
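
    A sketch of the recommended negative binomial rate model, with simulated data and the log of total cell equivalents tested as an offset so that groups are compared as detection rates; the dispersion value is an assumption, not taken from the paper.

    ```python
    # Sketch: negative binomial rate model for microchimerism counts, with the
    # log of total cell equivalents tested as an offset so that groups are
    # compared as detection rates. Data and the dispersion value are assumptions.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    n = 200
    group = rng.integers(0, 2, n)                   # exposure indicator
    tested = rng.integers(50_000, 200_000, n)       # cell equivalents assayed
    rate = 1e-4 * np.exp(0.7 * group)               # true per-cell detection rate
    size = 1.5                                      # overdispersion of the truth
    counts = rng.negative_binomial(size, size / (size + rate * tested))

    X = sm.add_constant(group.astype(float))
    model = sm.GLM(counts, X,
                   family=sm.families.NegativeBinomial(alpha=1.0),  # assumed alpha
                   offset=np.log(tested))
    print(model.fit().summary().tables[1])   # exp(coef on group) ~ rate ratio
    ```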

  18. Effects of statistical models and item difficulties on making trait-level inferences: A simulation study

    Directory of Open Access Journals (Sweden)

    Nelson Hauck Filho

    2014-12-01

    Full Text Available Researchers dealing with the task of estimating the locations of individuals on continuous latent variables may rely on several statistical models described in the literature. However, weighing the costs and benefits of using one specific model over alternative models depends on empirical information that is not always clearly available. Therefore, the aim of this simulation study was to compare the performance of seven popular statistical models in providing adequate latent trait estimates in conditions of item difficulties targeted at the sample mean or at the tails of the latent trait distribution. Results suggested an overall tendency of models to provide more accurate estimates of true latent scores when using items targeted at the sample mean of the latent trait distribution. The Rating Scale Model, the Graded Response Model, and Weighted Least Squares Mean- and Variance-adjusted Confirmatory Factor Analysis yielded the most reliable latent trait estimates, even when applied to items inadequate for the sample distribution of the latent variable. These findings have important implications concerning some popular methodological practices in Psychology and related areas.

  19. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study

    KAUST Repository

    MacLean, Adam L.

    2015-12-16

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
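
    A sketch of the basic ODE modelling workflow the chapter reviews, namely simulate and then estimate parameters from data by least squares, shown on a toy two-species system rather than an actual Wnt model; Bayesian and algebraic approaches mentioned in the record are omitted.

    ```python
    # Sketch: simulate-then-fit workflow for a toy two-species ODE model
    # (production, conversion, decay); not an actual Wnt pathway model.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    def rhs(t, y, k1, k2):
        a, b = y
        return [k1 - k2 * a, k2 * a - 0.1 * b]

    t_obs = np.linspace(0, 30, 16)
    true = (0.8, 0.25)
    sol = solve_ivp(rhs, (0, 30), [0.0, 0.0], t_eval=t_obs, args=true)
    data = sol.y + np.random.default_rng(7).normal(0, 0.05, sol.y.shape)

    def residuals(theta):
        s = solve_ivp(rhs, (0, 30), [0.0, 0.0], t_eval=t_obs, args=tuple(theta))
        return (s.y - data).ravel()

    fit = least_squares(residuals, x0=[0.5, 0.5], bounds=(0, 5))
    print("estimated (k1, k2):", np.round(fit.x, 3), " true:", true)
    ```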

  20. A statistical study of ion energization at 1700 km in the auroral region

    Directory of Open Access Journals (Sweden)

    M. Hamrin

    2002-12-01

    Full Text Available We present a comprehensive overview of several potentially relevant causes of oxygen energization in the auroral region. Data from the Freja satellite near 1700 km altitude are used for an unconditional statistical investigation. The data were obtained in the Northern Hemisphere during 21 months in the declining phase of the solar cycle. The importance of various wave types for the ion energization is statistically studied. We also investigate the correlation of ion heating with precipitating protons, accelerated auroral electrons, suprathermal electron bursts, electron density variations, the Kp index, and solar illumination of the nearest conjugate ionosphere. We find that sufficiently strong broad-band ELF waves, electromagnetic ion cyclotron waves, and waves around the lower hybrid frequency are foremost associated with the ion heating. However, magnetosonic waves, with a sharp lower-frequency cutoff just below the proton gyrofrequency, are not found to contribute to the ion heating. In the absence of the first three wave emissions, transversely energized ions are rare. These wave types are approximately equally efficient in heating the ions, but we find that the main source of the heating is broadband ELF waves, since they are most common in the auroral region. We have also observed that the conditions for ion heating are more favourable for smaller ratios of the spectral densities S_E/S_B of the broadband ELF waves at the oxygen gyrofrequency. Key words: Ionosphere (auroral ionosphere; wave propagation) – Magnetospheric physics (electric fields)

  1. THE MILKY WAY PROJECT: A STATISTICAL STUDY OF MASSIVE STAR FORMATION ASSOCIATED WITH INFRARED BUBBLES

    Energy Technology Data Exchange (ETDEWEB)

    Kendrew, S.; Robitaille, T. P. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany); Simpson, R.; Lintott, C. J. [Department of Astrophysics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Bressert, E. [School of Physics, University of Exeter, Stocker Road, Exeter EX4 4QL (United Kingdom); Povich, M. S. [Department of Astronomy and Astrophysics, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States); Sherman, R. [Department of Astronomy and Astrophysics, University of Chicago, 5640 S. Ellis Avenue, Chicago, IL 60637 (United States); Schawinski, K. [Yale Center for Astronomy and Astrophysics, Yale University, P.O. Box 208121, New Haven, CT 06520 (United States); Wolf-Chase, G., E-mail: kendrew@mpia.de [Astronomy Department, Adler Planetarium, 1300 S. Lake Shore Drive, Chicago, IL 60605 (United States)

    2012-08-10

    The Milky Way Project citizen science initiative recently increased the number of known infrared bubbles in the inner Galactic plane by an order of magnitude compared to previous studies. We present a detailed statistical analysis of this data set with the Red MSX Source (RMS) catalog of massive young stellar sources to investigate the association of these bubbles with massive star formation. We particularly address the question of massive triggered star formation near infrared bubbles. We find a strong positional correlation of massive young stellar objects (MYSOs) and H II regions with Milky Way Project bubbles at separations of <2 bubble radii. As bubble sizes increase, a statistically significant overdensity of massive young sources emerges in the region of the bubble rims, possibly indicating the occurrence of triggered star formation. Based on numbers of bubble-associated RMS sources, we find that 67% ± 3% of MYSOs and (ultra-)compact H II regions appear to be associated with a bubble. We estimate that approximately 22% ± 2% of massive young stars may have formed as a result of feedback from expanding H II regions. Using MYSO-bubble correlations, we serendipitously recovered the location of the recently discovered massive cluster Mercer 81, suggesting the potential of such analyses for discovery of heavily extincted distant clusters.
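
    A sketch of the positional-correlation idea on synthetic catalogues: compare the fraction of sources within 2 bubble radii against a randomised control catalogue. The catalogues below are invented, and the study's completeness corrections are omitted.

    ```python
    # Sketch: positional-correlation test on synthetic catalogues - fraction of
    # sources within 2 bubble radii, versus a randomised control catalogue.
    import numpy as np

    rng = np.random.default_rng(8)
    bubbles = rng.uniform(0, 100, (300, 2))         # bubble centres (toy units)
    radii = rng.uniform(1.0, 3.0, 300)              # bubble radii
    mysos = np.vstack([
        bubbles[:120] + rng.normal(0, 1.5, (120, 2)),   # sources near bubbles
        rng.uniform(0, 100, (180, 2)),                  # unrelated background
    ])

    def frac_within(sources, k=2.0):
        # separation of each source from each bubble, in units of bubble radii
        d = np.linalg.norm(sources[:, None, :] - bubbles[None, :, :], axis=2)
        return np.mean((d / radii).min(axis=1) < k)

    control = rng.uniform(0, 100, mysos.shape)
    print(f"observed fraction within 2 radii: {frac_within(mysos):.2f}")
    print(f"randomised control:               {frac_within(control):.2f}")
    ```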

  2. Design of nuclear fuel cells by means of a statistical analysis and a sensibility study

    International Nuclear Information System (INIS)

    Jauregui C, V.; Castillo M, J. A.; Ortiz S, J. J.; Montes T, J. L.; Perusquia del C, R.

    2013-10-01

    This work presents the results of a statistical analysis of the performance of nuclear fuel cells, considering the frequencies with which fuel rods are selected in their design. Rods were selected in three ways: in the first, the plotted frequencies resemble a normal distribution; in the second, the frequency plot resembles an inverted χ2 distribution; and in the third, the rods are chosen at random. The heuristic techniques used for the cell designs were neural networks, ant colonies, and a hybrid of scatter search and path relinking. The statistical analysis of the cell designs considered the local power peaking factor and the neutron infinite multiplication factor (k∞) of the cell. In addition, the performance of the designed cells was analyzed by verifying the positions of the rods containing gadolinium. The results show that it is possible to design nuclear fuel cells with good performance when the selection frequencies of the rods used in the design are taken into account. (Author)

  3. THE MILKY WAY PROJECT: A STATISTICAL STUDY OF MASSIVE STAR FORMATION ASSOCIATED WITH INFRARED BUBBLES

    International Nuclear Information System (INIS)

    Kendrew, S.; Robitaille, T. P.; Simpson, R.; Lintott, C. J.; Bressert, E.; Povich, M. S.; Sherman, R.; Schawinski, K.; Wolf-Chase, G.

    2012-01-01

    The Milky Way Project citizen science initiative recently increased the number of known infrared bubbles in the inner Galactic plane by an order of magnitude compared to previous studies. We present a detailed statistical analysis of this data set with the Red MSX Source (RMS) catalog of massive young stellar sources to investigate the association of these bubbles with massive star formation. We particularly address the question of massive triggered star formation near infrared bubbles. We find a strong positional correlation of massive young stellar objects (MYSOs) and H II regions with Milky Way Project bubbles at separations of <2 bubble radii. As bubble sizes increase, a statistically significant overdensity of massive young sources emerges in the region of the bubble rims, possibly indicating the occurrence of triggered star formation. Based on numbers of bubble-associated RMS sources, we find that 67% ± 3% of MYSOs and (ultra-)compact H II regions appear to be associated with a bubble. We estimate that approximately 22% ± 2% of massive young stars may have formed as a result of feedback from expanding H II regions. Using MYSO-bubble correlations, we serendipitously recovered the location of the recently discovered massive cluster Mercer 81, suggesting the potential of such analyses for discovery of heavily extincted distant clusters.

  4. More Than Filaments and Cores: Statistical Study of Structure Formation and Dynamics in Nearby Molecular Clouds

    Science.gov (United States)

    Chen, How-Huan; Goodman, Alyssa

    2018-01-01

    In the past decade, multiple attempts at understanding the connection between filaments and star-forming cores have been made using observations across the entire spectrum. However, the filaments and the cores are usually treated as predefined--and well-defined--entities, instead of structures that often come in different sizes and shapes, with substantially different dynamics, and inter-connected at different scales. In my dissertation, I present an array of studies using different statistical methods, including the dendrogram and the probability distribution function (PDF), of structures at different size scales within nearby molecular clouds. These structures are identified using observations of different density tracers and, where possible, in the multi-dimensional parameter space of key dynamic properties--the LSR velocity, the velocity dispersion, and the column density. The goal is to give an overview of structure formation in nearby star-forming clouds, as well as of the dynamics in these structures. I find that the overall statistical properties of a larger structure are often the summation/superposition of the sub-structures within, and that there can be significant variations due to local physical processes. I also find that the star formation process within molecular clouds could in fact take place in a non-monolithic manner, connecting potentially merging and/or transient structures at different scales.

  5. Ten-year clinico-statistical study of oral squamous cell carcinoma

    International Nuclear Information System (INIS)

    Aoki, Shinjiro; Kawabe, Ryoichi; Chikumaru, Hiroshi; Saito, Tomokatsu; Hirota, Makoto; Miyake, Tetsumi; Omura, Susumu; Fujita, Kiyohide

    2003-01-01

    This clinico-statistical study includes 232 cases of oral squamous cell carcinoma that underwent radical treatment in the Department of Oral and Maxillofacial Surgery, Yokohama City University Hospital, during the decade from 1991 to 2000. Surgery was adopted as the first-line treatment in 199 cases, and radiotherapy in 33 cases. The 5-year overall survival rate was 73.4%. The results according to stage were as follows: Stage I, 87.5%; Stage II, 77.9%; Stage III, 63.5%; and Stage IV A, 44.7%. The 5-year overall survival rates by primary site were as follows: upper gingiva, 85.2%; tongue, 73.7%; floor of mouth, 68.9%; lower gingiva, 66.3%; buccal mucosa, 63.9%; and hard palate, 50%. For tongue cancer, the 5-year overall survival rates by stage were: Stage I, 90.8%; Stage II, 82.1%; Stage III, 40.3%; and Stage IV A, 45.7%. A statistically significant difference was seen between cases of Stages I and II and those of Stages III and IV A. For lower gingival cancer, the 5-year overall survival rates by stage were: Stage I, 90.8%; Stage II, 82.1%; Stage III, 40.3%; and Stage IV A, 45.7%. Even Stage I lower gingival cancers had unfavorable clinical outcomes. Preventive neck dissections were performed on 52 N0-neck patients; however, the clinically negative necks showed metastasis in 14 patients (26.9%). (author)

  6. A STATISTICAL STUDY OF THE MASS AND DENSITY STRUCTURE OF INFRARED DARK CLOUDS

    International Nuclear Information System (INIS)

    Peretto, N.; Fuller, G. A.

    2010-01-01

    How and when the mass distribution of stars in the Galaxy is set is one of the main issues of modern astronomy. Here, we present a statistical study of mass and density distributions of infrared dark clouds (IRDCs) and fragments within them. These regions are pristine molecular gas structures and progenitors of stars and so provide insights into the initial conditions of star formation. This study makes use of an IRDC catalog, the largest sample of IRDC column density maps to date, containing a total of ∼11,000 IRDCs with column densities exceeding N_H2 = 1×10^22 cm^-2 and over 50,000 single-peaked IRDC fragments. The large number of objects constitutes an important strength of this study, allowing a detailed analysis of the completeness of the sample and so statistically robust conclusions. Using a statistical approach to assigning distances to clouds, the mass and density distributions of the clouds and the fragments within them are constructed. The mass distributions show a steepening of the slope when switching from IRDCs to fragments, in agreement with previous results for similar structures. IRDCs and fragments are divided into unbound/bound objects by assuming Larson's relation and calculating their virial parameter. IRDCs are mostly gravitationally bound, while a significant fraction of the fragments are not. The density distribution of gravitationally unbound fragments shows a steep characteristic slope, ΔN/Δlog(n) ∝ n^(-4.0±0.5), rather independent of the range of fragment mass. However, the incompleteness limit at a number density of ∼10^3 cm^-3 does not allow us to exclude a potential lognormal density distribution. In contrast, gravitationally bound fragments show a characteristic density peak at n ≅ 10^4 cm^-3, but the shape of the density distributions changes with the range of fragment masses. An explanation for this could be the differential dynamical evolution of the fragment density with respect to their mass as more massive

  7. The relationship between VHF radar auroral backscatter amplitude and Doppler velocity: a statistical study

    Directory of Open Access Journals (Sweden)

    B. A. Shand

    Full Text Available A statistical investigation of the relationship between VHF radar auroral backscatter intensity and Doppler velocity has been undertaken with data collected from 8 years of operation of the Wick site of the Sweden And Britain Radar-auroral Experiment (SABRE). The results indicate three different regimes within the statistical data set: firstly, for Doppler velocities <200 m s–1, the backscatter intensity (measured in decibels) remains relatively constant. Secondly, a linear relationship is observed between the backscatter intensity (in decibels) and Doppler velocity for velocities between 200 m s–1 and 700 m s–1. At velocities greater than 700 m s–1 the backscatter intensity saturates at a maximum value as the Doppler velocity increases. There are three possible geophysical mechanisms for the saturation in the backscatter intensity at high phase speeds: a saturation in the irregularity turbulence level, a maximisation of the scattering volume, and a modification of the local ambient electron density. There is also a difference in the dependence of the backscatter intensity on Doppler velocity for flow towards and away from the radar. The results for flow towards the radar exhibit a consistent relationship between backscatter intensity and measured velocities throughout the solar cycle. For flow away from the radar, however, the relationship between backscatter intensity and Doppler velocity varies during the solar cycle. The geometry of the SABRE system ensures that flow towards the radar is predominantly associated with the eastward electrojet, and flow away is associated with the westward electrojet. The difference in the backscatter intensity variation as a function of Doppler velocity is attributed to asymmetries between the eastward and westward electrojets and the geophysical parameters controlling the backscatter amplitude.

  8. The relationship between VHF radar auroral backscatter amplitude and Doppler velocity: a statistical study

    Directory of Open Access Journals (Sweden)

    B. A. Shand

    1996-08-01

    Full Text Available A statistical investigation of the relationship between VHF radar auroral backscatter intensity and Doppler velocity has been undertaken with data collected from 8 years of operation of the Wick site of the Sweden And Britain Radar-auroral Experiment (SABRE). The results indicate three different regimes within the statistical data set: firstly, for Doppler velocities <200 m s–1, the backscatter intensity (measured in decibels) remains relatively constant. Secondly, a linear relationship is observed between the backscatter intensity (in decibels) and Doppler velocity for velocities between 200 m s–1 and 700 m s–1. At velocities greater than 700 m s–1 the backscatter intensity saturates at a maximum value as the Doppler velocity increases. There are three possible geophysical mechanisms for the saturation in the backscatter intensity at high phase speeds: a saturation in the irregularity turbulence level, a maximisation of the scattering volume, and a modification of the local ambient electron density. There is also a difference in the dependence of the backscatter intensity on Doppler velocity for flow towards and away from the radar. The results for flow towards the radar exhibit a consistent relationship between backscatter intensity and measured velocities throughout the solar cycle. For flow away from the radar, however, the relationship between backscatter intensity and Doppler velocity varies during the solar cycle. The geometry of the SABRE system ensures that flow towards the radar is predominantly associated with the eastward electrojet, and flow away is associated with the westward electrojet. The difference in the backscatter intensity variation as a function of Doppler velocity is attributed to asymmetries between the eastward and westward electrojets and the geophysical parameters controlling the backscatter amplitude.

  9. Using Fun in the Statistics Classroom: An Exploratory Study of College Instructors' Hesitations and Motivations

    Science.gov (United States)

    Lesser, Lawrence M.; Wall, Amitra A.; Carver, Robert H.; Pearl, Dennis K.; Martin, Nadia; Kuiper, Shonda; Posner, Michael A.; Erickson, Patricia; Liao, Shu-Min; Albert, Jim; Weber, John J., III

    2013-01-01

    This study examines statistics instructors' use of fun as well as their motivations, hesitations, and awareness of resources. In 2011, a survey was administered to attendees at a national statistics education conference, and follow-up qualitative interviews were conducted with 16 of those ("N" = 249) surveyed to provide further…

  10. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    International Nuclear Information System (INIS)

    Edjabou, Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn; Petersen, Claus; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2015-01-01

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level hierarchy (tiered approach) facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. in detail, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single

  11. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Edjabou, Maklawe Essonanawe, E-mail: vine@env.dtu.dk [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Jensen, Morten Bang; Götze, Ramona; Pivnenko, Kostyantyn [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark); Petersen, Claus [Econet AS, Omøgade 8, 2.sal, 2100 Copenhagen (Denmark); Scheutz, Charlotte; Astrup, Thomas Fruergaard [Department of Environmental Engineering, Technical University of Denmark, 2800 Kgs. Lyngby (Denmark)

    2015-02-15

    Highlights: • Tiered approach to waste sorting ensures flexibility and facilitates comparison of solid waste composition data. • Food and miscellaneous wastes are the main fractions contributing to the residual household waste. • Separation of food packaging from food leftovers during sorting is not critical for determination of the solid waste composition. - Abstract: Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10–50 waste fractions, organised according to a three-level hierarchy (tiered approach) facilitating comparison of the waste data between individual sub-areas with different fractionation (waste from one municipality was sorted at “Level III”, i.e. in detail, while the two others were sorted only at “Level I”). The results showed that residual household waste mainly contained food waste (42 ± 5%, mass per wet basis) and miscellaneous combustibles (18 ± 3%, mass per wet basis). The residual household waste generation rate in the study areas was 3–4 kg per person per week. Statistical analyses revealed that the waste composition was independent of variations in the waste generation rate. Both waste composition and waste generation rates were statistically similar for each of the three municipalities. While the waste generation rates were similar for each of the two housing types (single
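
    The composition figures above (e.g. 42 ± 5% food waste) are means with uncertainty intervals across sub-areas, and the municipality comparison is a test of group means. A minimal sketch of both computations in Python, using hypothetical mass fractions rather than the study's measurements:

        import numpy as np
        from scipy import stats

        # Hypothetical food-waste mass fractions for 10 sub-areas (wet basis);
        # values are invented, chosen only to echo the reported 42 +/- 5%.
        food_share = np.array([0.40, 0.45, 0.38, 0.47, 0.44,
                               0.41, 0.39, 0.46, 0.43, 0.42])

        mean = food_share.mean()
        sem = stats.sem(food_share)
        lo, hi = stats.t.interval(0.95, df=len(food_share) - 1, loc=mean, scale=sem)
        print(f"food waste: {100*mean:.0f}% (95% CI {100*lo:.0f}-{100*hi:.0f}%)")

        # One-way ANOVA comparing three (hypothetical) municipality groupings
        m1, m2, m3 = food_share[:3], food_share[3:6], food_share[6:]
        print(stats.f_oneway(m1, m2, m3))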

  12. The reduction method of statistic scale applied to study of climatic change

    International Nuclear Information System (INIS)

    Bernal Suarez, Nestor Ricardo; Molina Lizcano, Alicia; Martinez Collantes, Jorge; Pabon, Jose Daniel

    2000-01-01

    In climate change studies the global circulation models of the atmosphere (GCMAs) enable one to simulate the global climate, with the field variables being represented on grid points 300 km apart. Of particular interest is the simulation of possible changes in rainfall and surface air temperature due to an assumed increase of greenhouse gases. However, the models yield the climatic projections on grid points that in most cases do not correspond to the sites of major interest. To achieve local estimates of the climatological variables, methods like the one known as statistical downscaling are applied. In this article we show a case in point by applying canonical correlation analysis (CCA) to the Guajira Region in the northeast of Colombia.
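
    As a rough illustration of statistical downscaling by CCA, the sketch below calibrates a CCA model between synthetic grid-point predictors and synthetic station rainfall, then predicts local values for an independent period. All data, dimensions and the use of scikit-learn are assumptions for illustration, not the authors' actual setup.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        # Hypothetical GCM predictors: 12 grid points over 240 months
        X = rng.normal(size=(240, 12))
        # Hypothetical predictands: rainfall anomalies at 3 local stations,
        # linearly related to the grid field plus noise
        B = rng.normal(size=(12, 3))
        Y = X @ B + 0.5 * rng.normal(size=(240, 3))

        cca = CCA(n_components=2)
        cca.fit(X[:200], Y[:200])      # calibrate on the first 200 months
        Y_hat = cca.predict(X[200:])   # downscaled estimates for the rest
        print(np.corrcoef(Y_hat[:, 0], Y[200:, 0])[0, 1])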

  13. Municipal solid waste composition: Sampling methodology, statistical analyses, and case study evaluation

    DEFF Research Database (Denmark)

    Edjabou, Vincent Maklawe Essonanawe; Jensen, Morten Bang; Götze, Ramona

    2015-01-01

    Sound waste management and optimisation of resource recovery require reliable data on solid waste generation and composition. In the absence of standardised and commonly accepted waste characterisation methodologies, various approaches have been reported in the literature. This limits both the comparability and the applicability of the results. In this study, a waste sampling and sorting methodology for efficient and statistically robust characterisation of solid waste was introduced. The methodology was applied to residual waste collected from 1442 households distributed among 10 individual sub-areas in three Danish municipalities (both single and multi-family house areas). In total 17 tonnes of waste were sorted into 10-50 waste fractions, organised according to a three-level hierarchy (tiered approach) facilitating comparison of the waste data between individual sub-areas with different fractionation (waste...

  14. Operational benefits and challenges of the use of fingerprint statistical models: a field study.

    Science.gov (United States)

    Neumann, Cedric; Mateos-Garcia, Ismael; Langenburg, Glenn; Kostroski, Jennifer; Skerrett, James E; Koolen, Martin

    2011-10-10

    Research projects aimed at proposing fingerprint statistical models based on the likelihood ratio framework have shown that low quality finger impressions left at crime scenes may have significant evidential value. These impressions are currently either not recovered, considered to be of no value when first analyzed by fingerprint examiners, or lead to inconclusive results when compared to control prints. There are growing concerns within the fingerprint community that recovering and examining these low quality impressions will result in a significant increase in the workload of fingerprint units and ultimately in the number of backlogged cases. This study was designed to measure the number of impressions currently not recovered or not considered for examination, and to assess the usefulness of these impressions in terms of the number of additional detections that would result from their examination. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  15. NATO Advanced Study Institute on Statistical Treatments for Estimation of Mineral and Energy Resources

    CERN Document Server

    Fabbri, A; Sinding-Larsen, R

    1988-01-01

    This volume contains the edited papers prepared by lecturers and participants of the NATO Advanced Study Institute on "Statistical Treatments for Estimation of Mineral and Energy Resources" held at II Ciocco (Lucca), Italy, June 22 - July 4, 1986. During the past twenty years, tremendous efforts have been made to acquire quantitative geoscience information from ore deposits, geochemical, geophysical and remotely-sensed measurements. In October 1981, a two-day symposium on "Quantitative Resource Evaluation" and a three-day workshop on "Interactive Systems for Multivariate Analysis and Image Processing for Resource Evaluation" were held in Ottawa, jointly sponsored by the Geological Survey of Canada, the International Association for Mathematical Geology, and the International Geological Correlation Programme. Thirty scientists from different countries in Europe and North America were invited to form a forum for the discussion of quantitative methods for mineral and energy resource assessment. Since then, not ...

  16. Statistical Study of Corrosion Types in Constructions in South Region of Rio De Janeiro – Brazil

    Directory of Open Access Journals (Sweden)

    Carolina Lacerda da Cruz

    2016-05-01

    Full Text Available Some of the most difficult and troubling problems encountered in construction are those caused by corrosive processes. Corrosion processes consist of the degradation of a material, generally a metallic material, through chemical or electrochemical action of the environment in which the material is placed, which may or may not be combined with mechanical stress. Corrosion affects materials in general: their deterioration is caused by the physical-chemical interaction between the material and the corrosive environment, and it causes major problems in many activities. To prevent material losses, anticorrosive techniques are used, including coatings, medium-modification techniques, anodic and cathodic protection, and corrosion inhibitors such as organic compounds. This article presents a statistical study of corrosion types in constructions in the south region of Rio de Janeiro, Brazil.

  17. Luminosity excesses in low-mass young stellar objects - a statistical study

    International Nuclear Information System (INIS)

    Strom, K.M.; Strom, S.E.; Kenyon, S.J.; Hartmann, L.

    1988-01-01

    This paper presents a statistical study in which the observed total luminosity is compared quantitatively with an estimate of the stellar luminosity for a sample of 59 low-mass young stellar objects (YSOs) in the Taurus-Auriga complex. In 13 of the analyzed YSOs, luminosity excesses greater than 0.20 are observed together with IR excesses greater than 0.6; the IR excesses typically contribute the bulk of the observed excess luminosity and are characterized by spectral energy distributions which are flat or rise toward long wavelengths. The analysis suggests that YSOs showing the largest luminosity excesses typically power optical jets and/or molecular outflows or have strong winds, as evidenced by the presence of O I emission, indicating a possible correlation between accretion and mass-outflow properties. 38 references

  18. A statistical study of the upstream intermediate ion boundary in the Earth's foreshock

    Directory of Open Access Journals (Sweden)

    K. Meziane

    1998-02-01

    Full Text Available A statistical investigation of the location of onset of intermediate and gyrating ion populations in the Earth's foreshock is presented based on Fixed Voltage Analyzer data from ISEE 1. This study reveals the existence of a spatial boundary for intermediate and gyrating ion populations that coincides with the reported ULF wave boundary. This boundary position in the Earth's foreshock depends strongly upon the magnetic cone angle θBX and appears well defined for relatively large cone angles, though not for small cone angles. As reported in a previous study of the ULF wave boundary, the position of the intermediate-gyrating ion boundary is not compatible with a fixed growth rate of the waves resulting from the interaction between a uniform beam and the ambient plasma. The present work examines the momentum associated with protons which travel along this boundary, and we show that the variation of the boundary position (or equivalently, the associated particle momentum) with the cone angle is related to classical acceleration mechanisms at the bow shock surface. The same functional behavior as a function of the cone angle is obtained for the momentum predicted by an acceleration model and for the particle momentum associated with the boundary. However, the model predicts systematically larger values of the momentum than the observation-related values by a constant amount; we suggest that this difference may be due to some momentum exchange between the incident solar-wind population and the backstreaming particles through a wave-particle interaction resulting from a beam plasma instability.

    Key words. Intermediate ion boundary · Statistical investigation · Earth's foreshock · ISEE 1 spacecraft

  19. A statistical study of the upstream intermediate ion boundary in the Earth's foreshock

    Directory of Open Access Journals (Sweden)

    K. Meziane

    Full Text Available A statistical investigation of the location of onset of intermediate and gyrating ion populations in the Earth's foreshock is presented based on Fixed Voltage Analyzer data from ISEE 1. This study reveals the existence of a spatial boundary for intermediate and gyrating ion populations that coincides with the reported ULF wave boundary. This boundary position in the Earth's foreshock depends strongly upon the magnetic cone angle θBX and appears well defined for relatively large cone angles, though not for small cone angles. As reported in a previous study of the ULF wave boundary, the position of the intermediate-gyrating ion boundary is not compatible with a fixed growth rate of the waves resulting from the interaction between a uniform beam and the ambient plasma. The present work examines the momentum associated with protons which travel along this boundary, and we show that the variation of the boundary position (or equivalently, the associated particle momentum) with the cone angle is related to classical acceleration mechanisms at the bow shock surface. The same functional behavior as a function of the cone angle is obtained for the momentum predicted by an acceleration model and for the particle momentum associated with the boundary. However, the model predicts systematically larger values of the momentum than the observation-related values by a constant amount; we suggest that this difference may be due to some momentum exchange between the incident solar-wind population and the backstreaming particles through a wave-particle interaction resulting from a beam plasma instability.

    Key words. Intermediate ion boundary · Statistical investigation · Earth's foreshock · ISEE 1 spacecraft

  20. Statistical study of the correlation in the galaxy distribution - application to the baryonic acoustic oscillations

    International Nuclear Information System (INIS)

    Labatie, Antoine

    2012-01-01

    Baryon Acoustic Oscillations (BAOs) correspond to the acoustic phenomenon in the baryon-photon plasma before recombination. BAOs imprint a particular scale, corresponding to the sound horizon, that can be observed in the large-scale structures of the Universe. Using this standard-ruler property, BAOs can be used to probe the distance-redshift relation in galaxy catalogues, thus providing a very promising tool for studying dark energy properties. BAOs can be studied through the second-order statistics (the correlation function or the power spectrum) of the galaxy distribution. In this thesis we restrict ourselves to the case of the correlation function. BAOs appear in the correlation function as a small localized bump at the scale of the sound horizon in co-moving coordinates. There are two major applications of BAO study: BAO detection and cosmological parameter constraints using the standard-ruler property. Detecting BAOs at the expected scale confirms the current cosmological model, while constraining cosmological parameters, which enables the study of dark energy, is a major goal of modern cosmology. In this thesis we tackle different statistical problems concerning correlation function analysis of the galaxy distribution, with a focus on the study of BAOs. In the first part, we make both a theoretical and a practical study of the bias due to integral constraints in correlation function estimators. We show that this bias is very small for current galaxy surveys. In the second part we study BAO detection. We show the limitations of the classical detection method and propose a new, more rigorous method. In particular, our method takes into account the model dependence of the covariance matrix of the estimators. In the third part, we focus again on the model dependence of the covariance matrix, but this time for parameter constraints. We estimate a model-dependent covariance matrix and compare our constraints with constraints obtained by
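
    For reference, the correlation function of a galaxy catalogue is commonly estimated by comparing pair counts in the data with pair counts in a synthetic random catalogue, for instance with the Landy-Szalay estimator ξ(r) = (DD - 2DR + RR)/RR. The brute-force sketch below runs on small synthetic 3-D positions; it is a generic illustration, not the estimator implementation used in the thesis.

        import numpy as np
        from scipy.spatial.distance import pdist

        def landy_szalay(data, random, bins):
            """Estimate xi(r) = (DD - 2DR + RR) / RR from normalised pair counts."""
            dd, _ = np.histogram(pdist(data), bins=bins)
            rr, _ = np.histogram(pdist(random), bins=bins)
            # Cross pairs between the data and random catalogues
            cross = np.linalg.norm(data[:, None, :] - random[None, :, :], axis=-1)
            dr, _ = np.histogram(cross.ravel(), bins=bins)
            nd, nr = len(data), len(random)
            dd = dd / (nd * (nd - 1) / 2)   # normalise by the number of pairs
            rr = rr / (nr * (nr - 1) / 2)
            dr = dr / (nd * nr)
            return (dd - 2 * dr + rr) / rr

        rng = np.random.default_rng(1)
        galaxies = rng.uniform(0, 100, size=(500, 3))   # hypothetical positions
        randoms = rng.uniform(0, 100, size=(1000, 3))
        print(landy_szalay(galaxies, randoms, bins=np.linspace(5, 50, 10)))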

  1. The use of adaptive statistical iterative reconstruction in pediatric head CT: a feasibility study.

    Science.gov (United States)

    Vorona, G A; Zuccoli, G; Sutcavage, T; Clayton, B L; Ceschin, R C; Panigrahy, A

    2013-01-01

    Iterative reconstruction techniques facilitate CT dose reduction, though to our knowledge no group has explored using iterative reconstruction with pediatric head CT. Our purpose was to perform a feasibility study to assess the use of ASIR in a small group of pediatric patients undergoing head CT. An Alderson-Rando head phantom was scanned at decreasing 10% mA intervals relative to our standard protocol, and each study was then reconstructed at 10% ASIR intervals. An intracranial region of interest was consistently placed to estimate noise. Our ventriculoperitoneal shunt CT protocol was subsequently modified, and patients were scanned at 20% ASIR with approximately 20% mA reductions. ASIR studies were anonymously compared with older non-ASIR studies from the same patients by 2 attending pediatric neuroradiologists for diagnostic utility, sharpness, noise, and artifacts. The phantom study demonstrated similar noise at 100% mA/0% ASIR (3.9) and 80% mA/20% ASIR (3.7). Twelve pediatric patients were scanned at reduced dose at 20% ASIR. The average CTDI(vol) and DLP values of the 20% ASIR studies were 22.4 mGy and 338.4 mGy-cm, and for the non-ASIR studies, they were 28.8 mGy and 444.5 mGy-cm, representing statistically significant decreases in the CTDI(vol) (22.1%, P = .00007) and DLP (23.9%, P = .0005) values. There were no significant differences between the ASIR studies and non-ASIR studies with respect to diagnostic acceptability, sharpness, noise, or artifacts. Our findings suggest that 20% ASIR can provide approximately 22% dose reduction in pediatric head CT without affecting image quality.
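
    The dose comparison above reduces to simple arithmetic: the percentage reduction is the difference of group means divided by the FBP mean, and significance can be checked with a two-sample t-test. A sketch with hypothetical per-study CTDI(vol) values chosen so that the group means match the reported 28.8 and 22.4 mGy:

        import numpy as np
        from scipy import stats

        # Hypothetical per-study CTDI(vol) values (mGy); only the group means
        # (28.8 vs 22.4) are taken from the abstract.
        ctdi_fbp = np.array([29.5, 28.1, 27.9, 30.2, 28.6, 28.5])
        ctdi_asir = np.array([22.9, 21.8, 22.6, 22.1, 22.7, 22.3])

        reduction = 100 * (ctdi_fbp.mean() - ctdi_asir.mean()) / ctdi_fbp.mean()
        t, p = stats.ttest_ind(ctdi_fbp, ctdi_asir, equal_var=False)  # Welch test
        print(f"dose reduction: {reduction:.1f}% (p = {p:.2g})")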

  2. Studies of high temperature superconductors

    International Nuclear Information System (INIS)

    Narlikar, A.

    1989-01-01

    The high temperature superconductors (HTSCs) discovered are from the family of ceramic oxides. Their large-scale utilization in electrical utilities and in microelectronic devices poses frontal challenges which can perhaps be effectively met only through consolidated efforts and expertise of a multidisciplinary nature. During the last two years the growth of the new field has occurred on an international scale and has perhaps been more rapid than in most other fields. There has been an extraordinary rush of data and results which are continually being published as short texts dispersed in many excellent journals, some of which were started to ensure rapid publication exclusively in this field. As a result, the literature on HTSCs has become so massive and so diffuse that it is increasingly difficult to keep abreast of the important and reliable facets of this fast-growing field. This provided the motivation to evolve a process whereby both professional investigators and students can have ready access to up-to-date, in-depth accounts of major technical advances happening in this field. The present series, Studies of High Temperature Superconductors, has been launched to, at least in part, fulfill this need

  3. A high-statistics measurement of the p̄p→n̄n charge-exchange reaction at 875 MeV/c

    International Nuclear Information System (INIS)

    Lamanna, M.; Ahmidouch, A.; Birsa, R.; Bradamante, F.; Bressan, A.; Bressani, T.; Dalla Torre-Colautti, S.; Giorgi, M.; Heer, E.; Hess, R.; Kunne, R.A.; Lechanoine-Le Luc, C.; Martin, A.; Mascarini, C.; Masoni, A.; Penzo, A.; Rapin, D.; Schiavon, P.; Tessarotto, F.

    1995-01-01

    A new measurement of the differential cross section and of the analysing power A0n of the charge-exchange reaction p̄p→n̄n at 875 MeV/c is presented. The A0n data cover the entire angular range and constitute a considerable improvement over previously published data, both in the forward and in the backward hemisphere. The cross-section data cover only the backward region, but are unique at this energy. A careful study of the long-term drifts of the apparatus has allowed the good statistics of the data to be fully exploited. ((orig.))

  4. Statistical correlations for thermophysical properties of Supercritical Argon (SCAR) used in cooling of futuristic High Temperature Superconducting (HTS) cables

    Energy Technology Data Exchange (ETDEWEB)

    Kalsia, Mohit [School of Mechanical Engineering, Lovely Professional University, Phagwara, 144 401 (India); Dondapati, Raja Sekhar, E-mail: drsekhar@ieee.org [School of Mechanical Engineering, Lovely Professional University, Phagwara, 144 401 (India); Usurumarti, Preeti Rao [Department of Mechanical Engineering, PVK Institute of Technology, Anantpur, 515 001 (India)

    2017-05-15

    Highlights: • The developed correlations can be integrated into thermohydraulic analysis of HTS cables. • This work also explains the phenomenon of flow with less pumping power and maximum heat transfer in HTS cables. • Pumping power required to circulate the SCAR for cooling of HTS cables would be significantly lower. • For Hg-based high temperature superconductors (T_c > 134 K), SCAR found to be a suitable coolant. - Abstract: High Temperature Superconducting (HTS) cables are emerging as an alternative to conventional cables in efficient power transmission. However, these HTS cables require cooling below the critical temperature of the superconductors used to transmit larger currents. With the invention of high temperature superconductors whose critical temperatures are up to 134 K (Hg based), it is a great challenge to identify a suitable coolant which can carry away the heating load on the superconductors. In order to accomplish such a challenge, an attempt has been made in the present work to propose supercritical Argon (SCAR) as the alternative to cool the HTS cables. Further, statistical correlations have been developed for the thermophysical properties such as density, viscosity, specific heat and thermal conductivity of SCAR. In addition, the accuracy of the developed correlations is established with the help of a few statistical parameters and validated against the standard database available in the literature. These temperature-dependent accurate correlations are useful in predicting the pressure drop and heat transfer behaviour in HTS cables using numerical or computational techniques. In recent times, with the sophistication of computer technology, the solving of various complex transport equations along with turbulence models has become popular, and hence the developed correlations would benefit the technological community. It is observed that density and viscosity are found to decrease with decreasing pressure, whereas the thermal conductivity and specific

  5. Statistical correlations for thermophysical properties of Supercritical Argon (SCAR) used in cooling of futuristic High Temperature Superconducting (HTS) cables

    International Nuclear Information System (INIS)

    Kalsia, Mohit; Dondapati, Raja Sekhar; Usurumarti, Preeti Rao

    2017-01-01

    Highlights: • The developed correlations can be integrated into thermohydraulic analysis of HTS cables. • This work also explains the phenomenon of flow with less pumping power and maximum heat transfer in HTS cables. • Pumping power required to circulate the SCAR for cooling of HTS cables would be significantly lower. • For Hg-based high temperature superconductors (T_c > 134 K), SCAR found to be a suitable coolant. - Abstract: High Temperature Superconducting (HTS) cables are emerging as an alternative to conventional cables in efficient power transmission. However, these HTS cables require cooling below the critical temperature of the superconductors used to transmit larger currents. With the invention of high temperature superconductors whose critical temperatures are up to 134 K (Hg based), it is a great challenge to identify a suitable coolant which can carry away the heating load on the superconductors. In order to accomplish such a challenge, an attempt has been made in the present work to propose supercritical Argon (SCAR) as the alternative to cool the HTS cables. Further, statistical correlations have been developed for the thermophysical properties such as density, viscosity, specific heat and thermal conductivity of SCAR. In addition, the accuracy of the developed correlations is established with the help of a few statistical parameters and validated against the standard database available in the literature. These temperature-dependent accurate correlations are useful in predicting the pressure drop and heat transfer behaviour in HTS cables using numerical or computational techniques. In recent times, with the sophistication of computer technology, the solving of various complex transport equations along with turbulence models has become popular, and hence the developed correlations would benefit the technological community. It is observed that density and viscosity are found to decrease with decreasing pressure, whereas the thermal conductivity and specific heat

  6. The development of mini project interactive media on junior statistical materials (developmental research in junior high school)

    Science.gov (United States)

    Fauziah, D.; Mardiyana; Saputro, D. R. S.

    2018-05-01

    Assessment is an integral part of the learning process. The process and the result should be aligned with respect to measuring the abilities of learners. Authentic assessment refers to a form of assessment that measures competence in attitudes, knowledge, and skills. In practice, many teachers, including mathematics teachers who have implemented the 2013 curriculum, feel confused and find it difficult to master the use of authentic assessment instruments. Therefore, it is necessary to design an authentic assessment instrument with interactive mini project media that teachers can adopt in their assessment. The type of this research is developmental research, following the 4D development model, which consists of four stages: define, design, develop and disseminate. The purpose of the research is to create valid interactive mini project media on statistical materials in junior high school. Based on expert judgment, the instrument obtained validity scores of 3.1 for the construction aspect, 3.2 for the presentation aspect, 3.25 for the content aspect, and 2.9 for the didactic aspect. The research produced interactive mini project media on statistical materials using Adobe Flash, which can help teachers and students achieve the learning objectives.

  7. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to independence of events, inequalities in probability, and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  8. Collaborative Professional Development for Statistics Teaching: A Case Study of Two Middle-School Mathematics Teachers

    Science.gov (United States)

    de Oliveira Souza, Leandro; Lopes, Celi Espasandin; Pfannkuch, Maxine

    2015-01-01

    The recent introduction of statistics into the Brazilian curriculum has presented a multi-problematic situation for teacher professional development. Drawing on research in the areas of teacher development and statistical inquiry, we propose a Teacher Professional Development Cycle (TPDC) model. This paper focuses on two teachers who planned a…

  9. Concussion Education for High School Football Players: A Pilot Study

    Science.gov (United States)

    Manasse-Cohick, Nancy J.; Shapley, Kathy L.

    2014-01-01

    This survey study compared high school football players' knowledge and attitudes about concussion before and after receiving concussion education. There were no significant changes in the Concussion Attitude Index. Results revealed a statistically significant difference in the athletes' scores for the Concussion Knowledge Index, "t"(244)…

  10. Adaptive statistical iterative reconstruction reduces patient radiation dose in neuroradiology CT studies

    Energy Technology Data Exchange (ETDEWEB)

    Komlosi, Peter; Zhang, Yanrong; Leiva-Salinas, Carlos; Ornan, David; Grady, Deborah [University of Virginia, Department of Radiology and Medical Imaging, Division of Neuroradiology, PO Box 800170, Charlottesville, VA (United States); Patrie, James T.; Xin, Wenjun [University of Virginia, Department of Public Health Sciences, Charlottesville, VA (United States); Wintermark, Max [University of Virginia, Department of Radiology and Medical Imaging, Division of Neuroradiology, PO Box 800170, Charlottesville, VA (United States); Centre Hospitalier Universitaire Vaudois, Department of Radiology, Lausanne (Switzerland)

    2014-03-15

    Adaptive statistical iterative reconstruction (ASIR) can decrease image noise, thereby generating CT images of comparable diagnostic quality with less radiation. The purpose of this study is to quantify the effect of systematic use of ASIR versus filtered back projection (FBP) for neuroradiology CT protocols on patients' radiation dose and image quality. We evaluated the effect of ASIR on six types of neuroradiologic CT studies: adult and pediatric unenhanced head CT, adult cervical spine CT, adult cervical and intracranial CT angiography, adult soft tissue neck CT with contrast, and adult lumbar spine CT. For each type of CT study, two groups of 100 consecutive studies were retrospectively reviewed: 100 studies performed with FBP and 100 studies performed with ASIR/FBP blending factor of 40 %/60 % with appropriate noise indices. The weighted volume CT dose index (CTDIvol), dose-length product (DLP) and noise were recorded. Each study was also reviewed for image quality by two reviewers. Continuous and categorical variables were compared by t test and free permutation test, respectively. For adult unenhanced brain CT, CT cervical myelography, cervical and intracranial CT angiography and lumbar spine CT, both CTDIvol and DLP were lowered by up to 10.9 % (p < 0.001), 17.9 % (p = 0.005), 20.9 % (p < 0.001), and 21.7 % (p = 0.001), respectively, by using ASIR compared with FBP alone. Image quality and noise were similar for both FBP and ASIR. We recommend routine use of iterative reconstruction for neuroradiology CT examinations because this approach affords a significant dose reduction while preserving image quality. (orig.)

  11. Adaptive statistical iterative reconstruction reduces patient radiation dose in neuroradiology CT studies

    International Nuclear Information System (INIS)

    Komlosi, Peter; Zhang, Yanrong; Leiva-Salinas, Carlos; Ornan, David; Grady, Deborah; Patrie, James T.; Xin, Wenjun; Wintermark, Max

    2014-01-01

    Adaptive statistical iterative reconstruction (ASIR) can decrease image noise, thereby generating CT images of comparable diagnostic quality with less radiation. The purpose of this study is to quantify the effect of systematic use of ASIR versus filtered back projection (FBP) for neuroradiology CT protocols on patients' radiation dose and image quality. We evaluated the effect of ASIR on six types of neuroradiologic CT studies: adult and pediatric unenhanced head CT, adult cervical spine CT, adult cervical and intracranial CT angiography, adult soft tissue neck CT with contrast, and adult lumbar spine CT. For each type of CT study, two groups of 100 consecutive studies were retrospectively reviewed: 100 studies performed with FBP and 100 studies performed with ASIR/FBP blending factor of 40 %/60 % with appropriate noise indices. The weighted volume CT dose index (CTDIvol), dose-length product (DLP) and noise were recorded. Each study was also reviewed for image quality by two reviewers. Continuous and categorical variables were compared by t test and free permutation test, respectively. For adult unenhanced brain CT, CT cervical myelography, cervical and intracranial CT angiography and lumbar spine CT, both CTDIvol and DLP were lowered by up to 10.9 % (p < 0.001), 17.9 % (p = 0.005), 20.9 % (p < 0.001), and 21.7 % (p = 0.001), respectively, by using ASIR compared with FBP alone. Image quality and noise were similar for both FBP and ASIR. We recommend routine use of iterative reconstruction for neuroradiology CT examinations because this approach affords a significant dose reduction while preserving image quality. (orig.)

  12. A statistical study of ²³⁸U and ²³⁴U/²³⁸U distributions in coral samples from the Egyptian shoreline of the north-western Red Sea and in fossil mollusk shells from the Atlantic coast of the High Atlas in Morocco: implications for ²³⁰Th/²³⁴U dating

    Energy Technology Data Exchange (ETDEWEB)

    Choukri, A.; Hakam, O.K. [Lab. des Faibles Radioactivites et d' Environnements, UFR: Faibles Radioactivites, Mathematiques physiques et environnement, Kenitra (Morocco); Reyss, J.L. [Lab. des Sciences de Climat et de l' Environnement, Domaine du CNRS, Gif sur Yvette (France); Plaziat, J.C. [Univ. de Paris-Sud, Dept. des Sciences de la Terre, Orsay (France)

    2002-07-01

    In this work, radiochemical analysis results for 126 uncrystallized coral samples from the Egyptian shoreline of the northwestern Red Sea and 120 fossil mollusk shell samples from the Atlantic coast of the Moroccan High Atlas, north of Agadir City in Morocco, are presented and discussed. The coral samples were collected in Egypt from the emerged coral reef terraces over 500 km, from the Ras Gharib-Ras Shukeir depression (28° 10') in the north to Wadi Lahami (north of Ras Banas, 24° 10') in the south. The fossil mollusk shells were collected in Morocco from Agadir-Harbour in the south to Tamri village in the north, extending over about 50 km. The statistical distributions of the results (²³⁸U content, ²³⁴U/²³⁸U activity ratio and ages) obtained on the dated materials in the two regions were compared for three fossil sea levels corresponding to three different climatic stages (Holocene, 5e, 7 and/or 9), with the aim of establishing methodological criteria for judging the validity of the measured ages. For corals, the ²³⁸U content varies in a narrow interval around the same average value of 3 ppm for the three sea levels, and the calculated initial ²³⁴U/²³⁸U values are in agreement with the present sea-water ratio (1.15), with some values slightly higher for the older sea levels. The obtained ages are in good agreement with the ages reported previously for the three emerged fossil sea levels on unrecrystallized corals by alpha spectrometry and by mass spectrometry. For mollusk shells, except for the Holocene sea level, ²³⁸U contents and initial ²³⁴U/²³⁸U activity ratios vary over wide intervals for the older levels, independent of species and of the calcite content of the samples. The high ²³⁸U contents and ²³⁴U/²³⁸U activity ratios are possibly due to post-depositional incorporation of secondary uranium from sea water or from continental waters drained by rivers. This incorporation leads to a rejuvenation of mollusk shell ages and is
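
    For context, a ²³⁰Th/²³⁴U age is obtained by solving the standard ²³⁰Th ingrowth equation for t, given the measured activity ratios and assuming no initial ²³⁰Th. The numerical sketch below uses approximate decay constants and invented ratios, not the paper's measurements:

        import numpy as np
        from scipy.optimize import brentq

        LAM_230 = 9.17e-6   # 230Th decay constant (1/yr), approximate
        LAM_234 = 2.82e-6   # 234U decay constant (1/yr), approximate

        def th230_age(th230_u234, u234_u238):
            """Solve the 230Th ingrowth equation (zero initial 230Th) for the
            age in years, given measured activity ratios."""
            th230_u238 = th230_u234 * u234_u238   # convert to 230Th/238U
            d234 = u234_u238 - 1.0                # fractional 234U excess
            def f(t):
                return (1.0 - np.exp(-LAM_230 * t)
                        + d234 * (LAM_230 / (LAM_230 - LAM_234))
                        * (1.0 - np.exp(-(LAM_230 - LAM_234) * t))
                        - th230_u238)
            return brentq(f, 1.0, 1e6)

        # Illustrative ratios give an age of roughly 114 kyr, i.e. stage 5e:
        print(th230_age(th230_u234=0.655, u234_u238=1.10))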

  13. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  14. Using Statistical and Probabilistic Methods to Evaluate Health Risk Assessment: A Case Study

    Directory of Open Access Journals (Sweden)

    Hongjing Wu

    2014-06-01

    Full Text Available Toxic chemicals and heavy metals in wastewater can have serious adverse impacts on human health. Health risk assessment (HRA) is an effective tool for supporting decision-making and corrective actions in water quality management. HRA can also help people understand water quality and quantify the adverse effects of pollutants on human health. Due to the imprecision of data, measurement error and limited available information, uncertainty is inevitable in the HRA process. The purpose of this study is to integrate statistical and probabilistic methods to deal with censored and limited numbers of input data, and so improve the reliability of the non-cancer HRA of dermal-contact exposure to contaminated river water while accounting for uncertainty. A case study in the Kelligrews River in St. John’s, Canada, was conducted to demonstrate the feasibility and capacity of the proposed approach. Five heavy metals were selected to evaluate the risk level: arsenic, molybdenum, zinc, uranium and manganese. The results showed that the probability of the total hazard index of dermal exposure exceeding 1 is very low, and there is no obvious evidence of risk in the study area.
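
    The probabilistic part of such an assessment typically propagates uncertain inputs through the hazard-quotient calculation HQ = dose/RfD by Monte Carlo simulation, the total hazard index being the sum of the HQs over the five metals. The sketch below uses a simplified dermal dose equation with invented parameter values, not the study's exposure factors:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000

        # Hypothetical lognormal concentration of one metal in river water (mg/L)
        C = rng.lognormal(mean=np.log(0.005), sigma=0.6, size=N)

        # Simplified dermal exposure factors (all values illustrative):
        Kp, SA, ET = 1e-3, 18000, 0.5   # permeability (cm/h), skin (cm2), h/event
        EF, ED = 45, 30                 # events per year, exposure years
        BW, AT = 70, 30 * 365           # body weight (kg), averaging time (days)
        RfD = 3e-4                      # reference dose (mg/kg/day), hypothetical

        dose = C / 1000 * Kp * SA * ET * EF * ED / (BW * AT)  # mg/kg/day
        hq = dose / RfD                 # hazard quotient for this one metal
        print("P(HQ > 1) =", (hq > 1).mean())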

  15. Short-period atmospheric gravity waves - A study of their statistical properties and source mechanisms

    Science.gov (United States)

    Gedzelman, S. D.

    1983-01-01

    Gravity waves around Palisades, New York, are investigated for the one-year period beginning 19 October 1976 to determine their statistical properties and sources. The waves have typical periods of 10 min, pressure amplitudes of 3 Pa and velocities of 30 m/s. In general, the largest-amplitude waves occur during late fall and early winter, when the upper tropospheric winds directly overhead are fastest and the static stability of the lower troposphere is greatest. Mean wave amplitudes correlate highly with the product of the mean maximum wind speed and the mean low-level stratification directly aloft. A distinct diurnal variation of wave amplitudes, with the largest waves occurring in the pre-dawn hours, is also observed, as a result of the increased static stability at that time. The majority of waves are generated by shear instability; however, a number of waves are generated by distant sources such as nuclear detonations or large thunderstorms. The waves with distant sources can be distinguished on the basis of their generally much higher coherency across the grid and velocities that depart markedly from the wind velocity at any point in the sounding.

  16. A case study on the design and development of minigames for research methods and statistics

    Directory of Open Access Journals (Sweden)

    P. Van Rosmalen

    2014-08-01

    Full Text Available Research methodology involves logical reasoning and critical thinking skills, which are core competences in developing a more sophisticated understanding of the world. Acquiring expertise in research methods and statistics is not easy and poses a significant challenge for many students. The subject material is highly abstract and complex, and it requires the coordination of different but inter-related knowledge and skills, all of which are necessary to develop a coherent and usable skills base in this area. Additionally, while many students embrace research methods enthusiastically, others find the area dry, abstract and boring. In this paper we discuss the design and the first evaluation of a set of mini-games for practising research methods. Games are engaging and allow students to test out scenarios, providing concrete examples in a way that students typically experience only once they are out in the field. The design of a game is a complex task. First, we describe how we used cognitive task analysis to identify the knowledge and competences required to develop a comprehensive and usable understanding of research methods. Next, we describe the games designed and how 4C-ID, an instructional design model, was used to give the games a sound instructional design basis. Finally, the evaluation approach is discussed, along with how the findings of the first evaluation phase were used to improve the games.

  17. Accounting for undetected compounds in statistical analyses of mass spectrometry 'omic studies.

    Science.gov (United States)

    Taylor, Sandra L; Leiserowitz, Gary S; Kim, Kyoungmi

    2013-12-01

    Mass spectrometry is an important high-throughput technique for profiling small molecular compounds in biological samples and is widely used to identify potential diagnostic and prognostic compounds associated with disease. Data generated by mass spectrometry commonly have many missing values, which arise when a compound is absent from a sample or is present at a concentration below the detection limit. Several strategies are available for statistically analysing data with missing values. The accelerated failure time (AFT) model assumes all missing values result from censoring below a detection limit. Under a mixture model, missing values can result from a combination of censoring and the absence of a compound. We compare the power and estimation properties of a mixture model with those of an AFT model. Based on simulated data, we found the AFT model to have greater power to detect differences in means and point-mass proportions between groups. However, the AFT model yielded biased estimates, with the bias increasing as the proportion of observations in the point mass increased, while estimates were unbiased with the mixture model except when all missing observations came from censoring. These findings suggest using the AFT model for hypothesis testing and the mixture model for estimation. We demonstrate this approach through application to glycomics data from serum samples of women with ovarian cancer and matched controls.
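
    The bias described here is easy to reproduce: if some missing values come from truly absent compounds (a point mass) but the likelihood treats every missing value as left-censored, the estimated mean is pulled downward. A simulation sketch of that censoring-only, Tobit-style fit (the log-scale analogue of the AFT model), with invented parameter values:

        import numpy as np
        from scipy import stats
        from scipy.optimize import minimize

        rng = np.random.default_rng(7)
        n, lod = 500, 1.0      # sample size and detection limit (log scale)
        p_absent = 0.3         # true point-mass proportion (compound absent)
        mu, sigma = 1.5, 1.0   # true log-abundance parameters when present

        # Mixture data: absent with probability p_absent, otherwise normal;
        # anything below the detection limit is recorded as missing.
        present = rng.random(n) > p_absent
        x = np.where(present, rng.normal(mu, sigma, n), -np.inf)
        observed = x[x >= lod]
        n_missing = n - len(observed)   # censored and absent look identical

        def censored_nll(theta):
            """Negative log-likelihood treating every missing value as censored."""
            m, s = theta[0], abs(theta[1])
            ll = stats.norm.logpdf(observed, m, s).sum()
            ll += n_missing * stats.norm.logcdf(lod, m, s)
            return -ll

        res = minimize(censored_nll, x0=[1.0, 1.0], method="Nelder-Mead")
        print("censoring-only estimate of mu:", res.x[0], "(true value 1.5)")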

  18. Studies in the statistical and thermal properties of hadronic matter under some extreme conditions

    International Nuclear Information System (INIS)

    Chase, K.C.; Mekjian, A.Z.; Bhattacharyya, P.

    1997-01-01

    The thermal and statistical properties of hadronic matter under some extreme conditions are investigated using an exactly solvable canonical ensemble model. A unified model describing both the fragmentation of nuclei and the thermal properties of hadronic matter is developed. Simple expressions are obtained for quantities such as the hadronic equation of state, specific heat, compressibility, entropy, and excitation energy as a function of temperature and density. These expressions encompass the fermionic aspects of nucleons, such as degeneracy pressure and Fermi energy at low temperatures, and the ideal gas laws at high temperatures and low density. Expressions are developed which connect these two extremes with behavior that resembles an ideal Bose gas with its associated Bose condensation. In the thermodynamic limit, an infinite cluster exists below a certain critical condition in a manner similar to the sudden appearance of the infinite cluster in percolation theory. The importance of multiplicity fluctuations is discussed, and some recent data from the EOS collaboration on critical-point behavior of nuclei can be accounted for using simple expressions obtained from the model. copyright 1997 The American Physical Society

  19. Application of statistical methods (SPC) for an optimized control of the irradiation process of high-power semiconductors

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Zwanziger, P.

    2000-01-01

    High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, i.e. high-power drive systems, static compensation and high-voltage DC transmission lines. In its factory in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) produces high-end disc-type devices with ceramic encapsulation for the world market. These elements have to fulfill special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and their dynamic switching behaviour. This can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes the trade-off. In this paper, the requirements placed on the irradiation company, Mediscan GmbH, are described from the point of view of the semiconductor manufacturer. The current strategy for controlling the irradiation results to fulfill these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored using statistical process control (SPC) techniques includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful co-operation in this business. Viewing the process the other way round, an idea is presented and discussed for developing a highly sensitive dose-detection device based on modified diodes, which could serve as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes. (author)
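
    SPC monitoring of this kind usually rests on control-chart limits, e.g. an X-bar chart with UCL/LCL equal to the grand mean plus or minus A2 times the mean subgroup range (A2 = 0.577 is the standard constant for subgroups of five). The sketch below applies these textbook limits to hypothetical dose readings; the parameters actually monitored in the process described above are of course not reproduced here.

        import numpy as np

        rng = np.random.default_rng(3)
        # Hypothetical dose measurements (kGy): 20 batches, 5 readings each
        subgroups = rng.normal(40.0, 0.8, size=(20, 5))

        xbar = subgroups.mean(axis=1)                       # subgroup means
        ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
        A2 = 0.577                                          # constant for n = 5

        center = xbar.mean()
        ucl = center + A2 * ranges.mean()
        lcl = center - A2 * ranges.mean()
        flags = (xbar > ucl) | (xbar < lcl)                 # out-of-control points
        print(f"CL {center:.2f}, UCL {ucl:.2f}, LCL {lcl:.2f}, flags {flags.sum()}")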

  20. A simple and robust statistical framework for planning, analysing and interpreting faecal egg count reduction test (FECRT) studies

    DEFF Research Database (Denmark)

    Denwood, M.J.; McKendrick, I.J.; Matthews, L.

    Introduction. There is an urgent need for a method of analysing FECRT data that is computationally simple and statistically robust. A method for evaluating the statistical power of a proposed FECRT study would also greatly enhance the current guidelines. Methods. A novel statistical framework has been developed that evaluates observed FECRT data against two null hypotheses: (1) the observed efficacy is consistent with the expected efficacy, and (2) the observed efficacy is inferior to the expected efficacy. The method requires only four simple summary statistics of the observed data. Power … that the notional type 1 error rate of the new statistical test is accurate. Power calculations demonstrate a power of only 65% with a sample size of 20 treatment and control animals, which increases to 69% with 40 control animals or 79% with 40 treatment animals. Discussion. The method proposed is simple
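
    For orientation, the quantity at the heart of a FECRT is the observed efficacy, e.g. 100 * (1 - mean treated count / mean control count) in a treated-versus-control design. The sketch below computes it with a simple percentile-bootstrap interval on simulated, overdispersed egg counts; it illustrates the arithmetic only and is not the framework proposed by the authors.

        import numpy as np

        rng = np.random.default_rng(11)
        # Hypothetical faecal egg counts (eggs/g), 20 animals per group;
        # the negative binomial gives the overdispersion typical of FEC data.
        control = rng.negative_binomial(n=1, p=1/201, size=20)  # mean ~200
        treated = rng.negative_binomial(n=1, p=1/11, size=20)   # mean ~10

        efficacy = 100 * (1 - treated.mean() / control.mean())

        # Percentile bootstrap for a rough 95% interval on the efficacy
        boot = [100 * (1 - rng.choice(treated, 20).mean()
                       / rng.choice(control, 20).mean())
                for _ in range(2000)]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"FECR = {efficacy:.1f}% (95% CI {lo:.1f} to {hi:.1f}%)")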