WorldWideScience

Sample records for previous statistical studies

  1. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    Science.gov (United States)

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of the study designs used, the statistical tests applied, and the statistical analysis programmes employed helps determine a country's research activity profile and trends. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of the study designs used, the statistical tests applied, and the statistical analysis programmes employed. JPMA and JCPSP published 192 and 128 original articles, respectively, in 2015. The results indicate that the cross-sectional design, bivariate inferential analysis comparing two variables/groups, and the software package SPSS were, respectively, the most common study design, inferential statistical analysis, and statistical analysis software. These results echo a previously published assessment of these two journals for the year 2014.

  2. A study of statistics anxiety levels of graduate dental hygiene students.

    Science.gov (United States)

    Welch, Paul S; Jacks, Mary E; Smiley, Lynn A; Walden, Carolyn E; Clark, William D; Nguyen, Carol A

    2015-02-01

    In light of increased emphasis on evidence-based practice in the profession of dental hygiene, it is important that today's dental hygienists comprehend statistical measures in order to fully understand research articles and thereby apply scientific evidence to practice. The purpose of this study was therefore to investigate statistics anxiety among graduate dental hygiene students in the U.S. A web-based, anonymous self-report survey was emailed to the directors of 17 MSDH programs in the U.S. with a request to distribute it to graduate students. The survey collected data on statistics anxiety, sociodemographic characteristics and evidence-based practice. Statistics anxiety was assessed using the Statistical Anxiety Rating Scale. The study significance level was α=0.05. Only 8 of the 17 invited programs participated in the study. The Statistical Anxiety Rating Scale data revealed that graduate dental hygiene students experience low to moderate levels of statistics anxiety. Specifically, the level of anxiety on the Interpretation Anxiety factor indicated that this population could struggle with making sense of scientific research. A decisive majority (92%) of students indicated that statistics is essential for evidence-based practice and should be a required course for all dental hygienists. This study served to identify statistics anxiety in a previously unexplored population. The findings should be useful in both theory building and practical applications, and the results can be used to direct future research. Copyright © 2015 The American Dental Hygienists' Association.

  3. Murder-suicide of the jealous paranoia type: a multicenter statistical pilot study.

    Science.gov (United States)

    Palermo, G B; Smith, M B; Jenzten, J M; Henry, T E; Konicek, P J; Peterson, G F; Singh, R P; Witeck, M J

    1997-12-01

    The authors present a pilot statistical study of murder-suicide comprising 32 cases from the years 1990-1992, collected from the offices of the medical examiners of seven counties in five of the United States. The study includes brief reviews of previous statistical surveys of murder, murder-suicide, and suicide. This present study's conclusions parallel the findings of previous research on the demographic characteristics of the perpetrators of murder-suicide, the relationship between killers and victims, the types of weapon used, locations of the incidents, and the time intervals between the murder and suicide. It also highlights the similarities between the characteristics of the perpetrator of murder-suicide and those of persons who commit only suicide, supporting the thesis that murder-suicide is an extended suicide. Suggestions for prevention of such a type of crime are offered.

  4. Estimating the effect of current, previous and never use of drugs in studies based on prescription registries

    DEFF Research Database (Denmark)

    Nielsen, Lars Hougaard; Løkkegaard, Ellen; Andreasen, Anne Helms

    2009-01-01

    PURPOSE: Many studies which investigate the effect of drugs categorize the exposure variable into never, current, and previous use of the study drug. When prescription registries are used to make this categorization, the exposure variable possibly gets misclassified since the registries do not carry any information on the time of discontinuation of treatment. In this study, we investigated the amount of misclassification of exposure (never, current, previous use) to hormone therapy (HT) when the exposure variable was based on prescription data. Furthermore, we evaluated the significance of this misclassification for analysing the risk of breast cancer. MATERIALS AND METHODS: Prescription data were obtained from Danish Registry of Medicinal Products Statistics and we applied various methods to approximate treatment episodes. We analysed the duration of HT episodes to study the ability to identify...

  5. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Full Text Available Parameters estimated with confidence intervals and tests of statistical hypotheses are both used in statistical analysis to draw conclusions about a population from an extracted sample. The case study presented in this paper aims to highlight the importance of the sample size used in a study and how it is reflected in the results obtained from confidence intervals and hypothesis tests. Whereas statistical hypothesis testing only gives a "yes" or "no" answer to a question, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and qualifies findings that are "marginally significant" or "almost significant" (p very close to 0.05).

  6. Study on Semi-Parametric Statistical Model of Safety Monitoring of Cracks in Concrete Dams

    Directory of Open Access Journals (Sweden)

    Chongshi Gu

    2013-01-01

    Full Text Available Cracks are one of the hidden dangers in concrete dams, and safety monitoring models for concrete dam cracks have always been difficult to build. Starting from the parametric statistical model of safety monitoring of cracks in concrete dams, with the help of semi-parametric statistical theory, and considering the abnormal behaviors of these cracks, a semi-parametric statistical model of safety monitoring of concrete dam cracks is established to overcome the limitation of the parametric model in expressing the objective model. Previous projects show that the semi-parametric statistical model fits the data more closely and explains cracks in concrete dams better than the parametric statistical model. When used for forecasting, however, the forecast capability of the semi-parametric statistical model is equivalent to that of the parametric statistical model. The modeling of the semi-parametric statistical model is simple, rests on a reasonable principle, and is highly practical, with good prospects for application in actual projects.

  7. The association between previous and future severe exacerbations of chronic obstructive pulmonary disease: Updating the literature using robust statistical methodology.

    Science.gov (United States)

    Sadatsafavi, Mohsen; Xie, Hui; Etminan, Mahyar; Johnson, Kate; FitzGerald, J Mark

    2018-01-01

    There is minimal evidence on the extent to which the occurrence of a severe acute exacerbation of COPD that results in hospitalization affects the subsequent disease course. Previous studies on this topic did not generate causally-interpretable estimates. Our aim was to use corrected methodology to update previously reported estimates of the associations between previous and future exacerbations in these patients. Using administrative health data in British Columbia, Canada (1997-2012), we constructed a cohort of patients with at least one severe exacerbation, defined as an episode of inpatient care with the main diagnosis of COPD based on international classification of diseases (ICD) codes. We applied a random-effects 'joint frailty' survival model that is particularly developed for the analysis of recurrent events in the presence of competing risk of death and heterogeneity among individuals in their rate of events. Previous severe exacerbations entered the model as dummy-coded time-dependent covariates, and the model was adjusted for several observable patient and disease characteristics. 35,994 individuals (mean age at baseline 73.7, 49.8% female, average follow-up 3.21 years) contributed 34,271 severe exacerbations during follow-up. The first event was associated with a hazard ratio (HR) of 1.75 (95%CI 1.69-1.82) for the risk of future severe exacerbations. This risk decreased to HR = 1.36 (95%CI 1.30-1.42) for the second event and to 1.18 (95%CI 1.12-1.25) for the third event. The first two severe exacerbations that occurred during follow-up were also significantly associated with increased risk of all-cause mortality. There was substantial heterogeneity in the individual-specific rate of severe exacerbations. Even after adjusting for observable characteristics, individuals in the 97.5th percentile of exacerbation rate had 5.6 times higher rate of severe exacerbations than those in the 2.5th percentile. Using robust statistical methodology that controlled

  8. A Statistical Study of Eiscat Electron and Ion Temperature Measurements In The E-region

    Science.gov (United States)

    Hussey, G.; Haldoupis, C.; Schlegel, K.; Bösinger, T.

    Motivated by the large EISCAT data base, which covers over 15 years of common programme operation, and previous statistical work with EISCAT data (e.g., C. Haldoupis, K. Schlegel, and G. Hussey, Auroral E-region electron density gradients measured with EISCAT, Ann. Geophysicae, 18, 1172-1181, 2000), a detailed statistical analysis of electron and ion EISCAT temperature measurements has been undertaken. This study was specifically concerned with the statistical dependence of heating events on other ambient parameters such as the electric field and electron density. The results showed previously reported dependences, such as the electron temperature being directly correlated with the ambient electric field and inversely related to the electron density. However, these correlations were found to be also dependent upon altitude. There was also evidence of the so-called "Schlegel effect" (K. Schlegel, Reduced effective recombination coefficient in the disturbed polar E-region, J. Atmos. Terr. Phys., 44, 183-185, 1982); that is, the heated electron gas leads to increases in electron density through a reduction in the recombination rate. This paper will present the statistical heating results and attempt to offer physical explanations and interpretations of the findings.

  9. On the Tengiz petroleum deposit previous study

    International Nuclear Information System (INIS)

    Nysangaliev, A.N.; Kuspangaliev, T.K.

    1997-01-01

    A preliminary study of the Tengiz petroleum deposit is described. Some considerations about the structure of the productive formation and the specific characteristic properties of the petroleum-bearing collectors are presented. Recommendations are given on their detailed study and on using experience from the exploration and development of petroleum deposits that are analogous in the most important geological and industrial parameters. (author)

  10. Statistical learning of multisensory regularities is enhanced in musicians: An MEG study.

    Science.gov (United States)

    Paraskevopoulos, Evangelos; Chalas, Nikolas; Kartsidis, Panagiotis; Wollbrink, Andreas; Bamidis, Panagiotis

    2018-07-15

    The present study used magnetoencephalography (MEG) to identify the neural correlates of audiovisual statistical learning, while disentangling the differential contributions of uni- and multi-modal statistical mismatch responses in humans. The applied paradigm was based on a combination of a statistical learning paradigm and a multisensory oddball one, combining an audiovisual, an auditory and a visual stimulation stream, along with the corresponding deviances. Plasticity effects due to musical expertise were investigated by comparing the behavioral and MEG responses of musicians to non-musicians. The behavioral results indicated that the learning was successful for both musicians and non-musicians. The unimodal MEG responses are consistent with previous studies, revealing the contribution of Heschl's gyrus for the identification of auditory statistical mismatches and the contribution of medial temporal and visual association areas for the visual modality. The cortical network underlying audiovisual statistical learning was found to be partly common and partly distinct from the corresponding unimodal networks, comprising right temporal and left inferior frontal sources. Musicians showed enhanced activation in superior temporal and superior frontal gyrus. Connectivity and information processing flow amongst the sources comprising the cortical network of audiovisual statistical learning, as estimated by transfer entropy, was reorganized in musicians, indicating enhanced top-down processing. This neuroplastic effect showed a cross-modal stability between the auditory and audiovisual modalities. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Dust grain resonant capture: A statistical study

    Science.gov (United States)

    Marzari, F.; Vanzani, V.; Weidenschilling, S. J.

    1993-01-01

    A statistical approach, based on a large number of simultaneous numerical integrations, is adopted to study the capture in external mean motion resonances with the Earth of micron size dust grains perturbed by solar radiation and wind forces. We explore the dependence of the resonant capture phenomenon on the initial eccentricity e(sub 0) and perihelion argument w(sub 0) of the dust particle orbit. The intensity of both the resonant and dissipative (Poynting-Robertson and wind drag) perturbations strongly depends on the eccentricity of the particle while the perihelion argument determines, for low inclination, the mutual geometrical configuration of the particle's orbit with respect to the Earth's orbit. We present results for three j:j+1 commensurabilities (2:3, 4:5 and 6:7) and also for particle sizes s = 15, 30 microns. This study extends our previous work on the long term orbital evolution of single dust particles trapped into resonances with the Earth.

  12. Statistical principles for prospective study protocols:

    DEFF Research Database (Denmark)

    Christensen, Robin; Langberg, Henning

    2012-01-01

    In the design of scientific studies it is essential to decide on which scientific questions one aims to answer, just as it is important to decide on the correct statistical methods to use to answer these questions. The correct use of statistical methods is crucial in all aspects of research...... to quantify relationships in data. Despite an increased focus on statistical content and complexity of biomedical research these topics remain difficult for most researchers. Statistical methods enable researchers to condense large spreadsheets with data into means, proportions, and difference between means...... the statistical principles for trial protocols in terms of design, analysis, and reporting of findings....

  13. Statistical analysis and application of quasi experiments to antimicrobial resistance intervention studies.

    Science.gov (United States)

    Shardell, Michelle; Harris, Anthony D; El-Kamary, Samer S; Furuno, Jon P; Miller, Ram R; Perencevich, Eli N

    2007-10-01

    Quasi-experimental study designs are frequently used to assess interventions that aim to limit the emergence of antimicrobial-resistant pathogens. However, previous studies using these designs have often used suboptimal statistical methods, which may result in researchers making spurious conclusions. Methods used to analyze quasi-experimental data include 2-group tests, regression analysis, and time-series analysis, and they all have specific assumptions, data requirements, strengths, and limitations. An example of a hospital-based intervention to reduce methicillin-resistant Staphylococcus aureus infection rates and reduce overall length of stay is used to explore these methods.
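    Of the methods this abstract lists, segmented (interrupted time-series) regression is the one most often recommended for intervention data of this kind. A minimal self-contained sketch follows; the monthly infection rates and the intervention month are hypothetical, and the pure-Python least-squares solver is for illustration only:

```python
# Segmented regression sketch for an interrupted time series.
# Design matrix columns: intercept, time, post-intervention level change,
# post-intervention slope change.

def solve(A, b):
    """Gaussian elimination with partial pivoting (educational, not production)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def segmented_ols(y, t_int):
    """OLS fit of level-and-slope-change model via the normal equations."""
    X = [[1.0, float(t), float(t >= t_int), max(0.0, t - t_int)]
         for t in range(len(y))]
    p = 4
    XtX = [[sum(row[a] * row[b] for row in X) for b in range(p)] for a in range(p)]
    Xty = [sum(X[i][a] * y[i] for i in range(len(y))) for a in range(p)]
    return solve(XtX, Xty)

# Hypothetical monthly rates: flat at 10 before month 12, dropping to 7 afterwards.
rates = [10.0] * 12 + [7.0] * 12
b0, b1, level, slope = segmented_ols(rates, t_int=12)
print(level)   # estimated immediate level change at the intervention
```

    With noiseless data the fit is exact: the level-change coefficient recovers the drop of 3 introduced at month 12. For real rate data, segmented Poisson regression with autocorrelation-robust errors, as provided by a statistics package, would be the appropriate refinement.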

  14. The power and statistical behaviour of allele-sharing statistics when ...

    Indian Academy of Sciences (India)

    , using seven statistics, of which five are implemented in the computer program SimWalk2, and two are implemented in GENEHUNTER. Unlike most previous reports which involve evaluations of the power of allele-sharing statistics for a single ...

  15. 1997 statistical yearbook

    International Nuclear Information System (INIS)

    1998-01-01

    The international office of energy information and studies (Enerdata) has published the second edition of its 1997 statistical yearbook, which includes consolidated 1996 data relative to the previous version of June 1997. The CD-Rom comprises the annual worldwide petroleum, natural gas, coal and electricity statistics from 1991 to 1996, with information about production, external trade, consumption, market shares, sectoral distribution of consumption and energy balance sheets. The world is divided into 12 zones (52 countries available). It also contains energy indicators: production and consumption tendencies, supply and production structures, security of supplies, energy efficiency, and CO2 emissions. (J.S.)

  16. On application of non—extensive statistical mechanics to studying ecological diversity

    International Nuclear Information System (INIS)

    Van Xuan, Le; Lan, Nguyen Tri; Viet, Nguyen Ai

    2016-01-01

    The concept of Tsallis entropy provides an extension of thermodynamics and statistical physics. In ecology, Tsallis entropy is proposed as a new class of diversity indices S_q which covers many common diversity indices found in the ecological literature. As a new statistical model for the Whittaker plots describing species abundance distributions, the truncated exponential distribution is used to calculate the diversity and evenness indices. The results obtained with the new model are graphically compared with those of a previous publication in the same field of interest, and show good agreement. A further development of a thermodynamic theory of ecological systems that is consistent with the entropic approach of statistical physics is motivated. (paper)
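    The index family S_q mentioned above can be written down directly. A small sketch follows; the abundance vector is hypothetical, and the q→1 Shannon limit and q=2 Gini-Simpson special case are standard properties of Tsallis entropy, not results from the paper:

```python
import math

def tsallis_diversity(p, q):
    """Tsallis entropy S_q of a relative-abundance vector p:
    S_q = (1 - sum_i p_i**q) / (q - 1); the q -> 1 limit is Shannon entropy,
    and S_2 is the Gini-Simpson index 1 - sum_i p_i**2."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

abundances = [50, 30, 15, 5]              # hypothetical species counts
total = sum(abundances)
p = [a / total for a in abundances]
for q in (0.5, 1.0, 2.0):
    print(q, tsallis_diversity(p, q))     # diversity profile across q
```

    Sweeping q traces a diversity profile: small q weights rare species more heavily, large q emphasizes the dominant ones.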

  17. 40 CFR 152.93 - Citation of a previously submitted valid study.

    Science.gov (United States)

    2010-07-01

    ... Data Submitters' Rights § 152.93 Citation of a previously submitted valid study. An applicant may demonstrate compliance for a data requirement by citing a valid study previously submitted to the Agency. The... the original data submitter, the applicant may cite the study only in accordance with paragraphs (b...

  18. Statistical principles for prospective study protocols:

    DEFF Research Database (Denmark)

    Christensen, Robin; Langberg, Henning

    2012-01-01

    In the design of scientific studies it is essential to decide on which scientific questions one aims to answer, just as it is important to decide on the correct statistical methods to use to answer these questions. The correct use of statistical methods is crucial in all aspects of research...... to quantify relationships in data. Despite an increased focus on statistical content and complexity of biomedical research these topics remain difficult for most researchers. Statistical methods enable researchers to condense large spreadsheets with data into means, proportions, and difference between means......, risk differences, and other quantities that convey information. One of the goals in biomedical research is to develop parsimonious models - meaning as simple as possible. This approach is valid if the subsequent research report (the article) is written independent of whether the results...

  19. Is email a reliable means of contacting authors of previously published papers? A study of the Emergency Medicine Journal for 2001.

    Science.gov (United States)

    O'Leary, F

    2003-07-01

    To determine whether it is possible to contact authors of previously published papers via email. A cross-sectional study of the Emergency Medicine Journal for 2001. 118 articles were included in the study. The response rate from those with valid email addresses was 73%. There was no statistical difference between the type of email address used and the address being invalid (p=0.392), or between the type of article and the likelihood of a reply (p=0.197). More responses were obtained from work addresses than from Hotmail addresses (86% v 57%, p=0.02). Email is a valid means of contacting authors of previously published articles, particularly within the emergency medicine specialty. A work-based email address may be a more reliable means of contact than a Hotmail address.
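    The reported work-vs-Hotmail comparison (86% v 57%) is the kind of result a two-proportion test produces. A minimal sketch follows; the underlying counts are hypothetical, chosen only to match the reported percentages, and the pooled z-test is one standard choice, not necessarily the test used in the paper:

```python
from statistics import NormalDist
import math

def two_prop_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with pooled variance (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical counts consistent with the reported rates:
# 43/50 work-address replies (86%) vs 17/30 Hotmail replies (~57%).
p_value = two_prop_z(43, 50, 17, 30)
print(p_value)
```

    With these illustrative counts the difference is significant at the 5% level; the exact p-value depends on the true group sizes, which the abstract does not report.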

  20. A generalized model to estimate the statistical power in mitochondrial disease studies involving 2×k tables.

    Directory of Open Access Journals (Sweden)

    Jacobo Pardo-Seco

    Full Text Available BACKGROUND: Mitochondrial DNA (mtDNA) variation (i.e. haplogroups) has been analyzed in regard to a number of multifactorial diseases. The statistical power of a case-control study determines the a priori probability of rejecting the null hypothesis of homogeneity between cases and controls. METHODS/PRINCIPAL FINDINGS: We critically review previous approaches to the estimation of statistical power based on the restricted scenario where the number of cases equals the number of controls, and propose a methodology that broadens procedures to more general situations. We developed statistical procedures that consider different disease scenarios, variable sample sizes in cases and controls, and variable numbers of haplogroups and effect sizes. The results indicate that the statistical power of a particular study can improve substantially by increasing the number of controls with respect to cases. In the opposite direction, the power decreases substantially when testing a growing number of haplogroups. We developed mitPower (http://bioinformatics.cesga.es/mitpower/), a web-based interface that implements the new statistical procedures and allows for the computation of the a priori statistical power in variable scenarios of case-control study designs, or, e.g., the number of controls needed to reach fixed effect sizes. CONCLUSIONS/SIGNIFICANCE: The present study provides statistical procedures for the computation of statistical power in common as well as complex case-control study designs involving 2×k tables, with special (but not exclusive) application to mtDNA studies. In order to reach a wide range of researchers, we also provide a friendly web-based tool--mitPower--that can be used in both retrospective and prospective case-control disease studies.
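    The kind of 2×k power computation the abstract describes can be approximated by Monte Carlo simulation. A self-contained sketch follows; the haplogroup frequencies and sample sizes are hypothetical, the critical value 7.815 is the standard alpha=0.05 chi-square cutoff for df=3, and mitPower itself uses its own analytical procedures, not this simulation:

```python
import bisect
import random

def chi2_stat(cases, controls):
    """Pearson chi-square statistic for a 2 x k contingency table."""
    n_cases, n_controls = sum(cases), sum(controls)
    n = n_cases + n_controls
    stat = 0.0
    for j in range(len(cases)):
        col = cases[j] + controls[j]
        for obs, rowtot in ((cases[j], n_cases), (controls[j], n_controls)):
            expected = rowtot * col / n
            stat += (obs - expected) ** 2 / expected
    return stat

def mc_power(p_cases, p_controls, n_cases, n_controls, crit, reps=2000, seed=1):
    """Monte Carlo estimate of the power of the chi-square test for a 2 x k design."""
    rng = random.Random(seed)
    k = len(p_cases)

    def draw(p, n):
        # Multinomial sample of n individuals across k haplogroups.
        cum = [sum(p[:i + 1]) for i in range(k)]
        counts = [0] * k
        for _ in range(n):
            counts[min(bisect.bisect_left(cum, rng.random()), k - 1)] += 1
        return counts

    hits = sum(
        chi2_stat(draw(p_cases, n_cases), draw(p_controls, n_controls)) > crit
        for _ in range(reps)
    )
    return hits / reps

# Hypothetical haplogroup frequencies for k = 4 groups (df = 3).
p_ctrl = [0.40, 0.30, 0.20, 0.10]
p_case = [0.30, 0.35, 0.22, 0.13]   # modest illustrative effect size
pw_equal = mc_power(p_case, p_ctrl, 200, 200, crit=7.815)
pw_more_controls = mc_power(p_case, p_ctrl, 200, 600, crit=7.815)
print(pw_equal, pw_more_controls)
```

    In this simulated scenario, enlarging the control group (200 cases vs 600 controls) raises the estimated power relative to the balanced design, mirroring the paper's conclusion.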

  1. Study of statistical properties of hybrid statistic in coherent multi-detector compact binary coalescences Search

    OpenAIRE

    Haris, K; Pai, Archana

    2015-01-01

    In this article, we revisit the problem of coherent multi-detector searches for gravitational waves from compact binary coalescences with neutron stars and black holes using advanced interferometers like LIGO-Virgo. Based on the loss of optimal multi-detector signal-to-noise ratio (SNR), we construct a hybrid statistic as the best of the maximum-likelihood-ratio (MLR) statistics tuned for face-on and face-off binaries. The statistical properties of the hybrid statistic are studied. The performance of this ...

  2. A statistical perspective on association studies of psychiatric disorders

    DEFF Research Database (Denmark)

    Foldager, Leslie

    2014-01-01

    Gene-gene (GxG) and gene-environment (GxE) interactions likely play an important role in the aetiology of complex diseases like psychiatric disorders. Thus, we aim at investigating methodological aspects of and apply methods from statistical genetics taking interactions into account. In addition we...... genes and maternal infection by virus. Paper 3 presents the initial steps (mainly data construction) of an ongoing simulation study aiming at guiding decisions by comparing methods for GxE interaction analysis including both traditional two-step logistic regression, exhaustive searches using efficient...... these markers. However, the validity of the identified haplotypes is also checked by inferring phased haplotypes from genotypes. Haplotype analysis is also used in paper 5 which is otherwise an example of a focused approach to narrow down a previously found signal to search for more precise positions of disease...

  3. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  4. Statistical power of model selection strategies for genome-wide association studies.

    Directory of Open Access Journals (Sweden)

    Zheyang Wu

    2009-07-01

    Full Text Available Genome-wide association studies (GWAS aim to identify genetic variants related to diseases by examining the associations between phenotypes and hundreds of thousands of genotyped markers. Because many genes are potentially involved in common diseases and a large number of markers are analyzed, it is crucial to devise an effective strategy to identify truly associated variants that have individual and/or interactive effects, while controlling false positives at the desired level. Although a number of model selection methods have been proposed in the literature, including marginal search, exhaustive search, and forward search, their relative performance has only been evaluated through limited simulations due to the lack of an analytical approach to calculating the power of these methods. This article develops a novel statistical approach for power calculation, derives accurate formulas for the power of different model selection strategies, and then uses the formulas to evaluate and compare these strategies in genetic model spaces. In contrast to previous studies, our theoretical framework allows for random genotypes, correlations among test statistics, and a false-positive control based on GWAS practice. After the accuracy of our analytical results is validated through simulations, they are utilized to systematically evaluate and compare the performance of these strategies in a wide class of genetic models. For a specific genetic model, our results clearly reveal how different factors, such as effect size, allele frequency, and interaction, jointly affect the statistical power of each strategy. An example is provided for the application of our approach to empirical research. The statistical approach used in our derivations is general and can be employed to address the model selection problems in other random predictor settings. We have developed an R package markerSearchPower to implement our formulas, which can be downloaded from the

  5. Statistics available for site studies in registers and surveys at Statistics Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Haldorson, Marie [Statistics Sweden, Oerebro (Sweden)

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when

  6. Statistics available for site studies in registers and surveys at Statistics Sweden

    International Nuclear Information System (INIS)

    Haldorson, Marie

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. 
The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future

  7. Statistics available for site studies in registers and surveys at Statistics Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Haldorson, Marie [Statistics Sweden, Oerebro (Sweden)]

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. 
The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future

  8. Determination of the minimum size of a statistical representative volume element from a fibre-reinforced composite based on point pattern statistics

    DEFF Research Database (Denmark)

    Hansen, Jens Zangenberg; Brøndsted, Povl

    2013-01-01

    In a previous study, Trias et al. [1] determined the minimum size of a statistical representative volume element (SRVE) of a unidirectional fibre-reinforced composite, primarily based on numerical analyses of the stress/strain field. In continuation of this, the present study determines the minimum size of an SRVE based on a statistical analysis of the spatial statistics of the fibre packing patterns found in genuine laminates and those generated numerically using a microstructure generator. © 2012 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  9. Subsequent childbirth after a previous traumatic birth.

    Science.gov (United States)

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  10. Statistical Power in Longitudinal Network Studies

    NARCIS (Netherlands)

    Stadtfeld, Christoph; Snijders, Tom A. B.; Steglich, Christian; van Duijn, Marijtje

    2018-01-01

    Longitudinal social network studies may easily suffer from a lack of statistical power. This is the case in particular for studies that simultaneously investigate change of network ties and change of nodal attributes. Such selection and influence studies have become increasingly popular due to the

  11. All of statistics a concise course in statistical inference

    CERN Document Server

    Wasserman, Larry

    2004-01-01

    This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...

  12. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  13. Statistical study of auroral omega bands

    Directory of Open Access Journals (Sweden)

    N. Partamies

    2017-09-01

    Full Text Available The presence of very few statistical studies on auroral omega bands motivated us to test a semi-automatic method for identifying large-scale undulations of the diffuse aurora boundary and to investigate their occurrence. Five identical all-sky cameras with overlapping fields of view provided data for 438 auroral omega-like structures over Fennoscandian Lapland from 1996 to 2007. The results from this set of omega band events agree remarkably well with previous observations of omega band occurrence in magnetic local time (MLT), lifetime, location between the region 1 and 2 field-aligned currents, as well as with current density estimates. The average peak emission height of omega forms corresponds to estimated precipitation energies of a few keV, which showed no significant change during the events. Analysis of both local and global magnetic indices demonstrates that omega bands are observed during substorm expansion and recovery phases that are more intense than average substorm expansion and recovery phases in the same region. The omega occurrence with respect to the substorm expansion and recovery phases is in very good agreement with an earlier observed distribution of fast earthward flows in the plasma sheet during expansion and recovery phases. These findings support the theory that omegas are produced by fast earthward flows and auroral streamers, despite the rarity of good conjugate observations.

  14. Journal data sharing policies and statistical reporting inconsistencies in psychology.

    NARCIS (Netherlands)

    Nuijten, M.B.; Borghuis, J.; Veldkamp, C.L.S.; Dominguez Alvarez, L.; van Assen, M.A.L.M.; Wicherts, J.M.

    2018-01-01

    In this paper, we present three retrospective observational studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of

  15. Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)

    International Nuclear Information System (INIS)

    2003-01-01

    This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas.

  16. Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)

    International Nuclear Information System (INIS)

    2004-01-01

    This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas.

  17. Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)

    International Nuclear Information System (INIS)

    2002-01-01

    This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas.

  18. Statistical studies of energetic electrons in the outer radiation belt

    Energy Technology Data Exchange (ETDEWEB)

    Johnstone, A.D.; Rodgers, D.J.; Jones, G.H. E-mail: g.h.jones@ic.ac.uk

    1999-10-01

    The medium electron A (MEA) instrument aboard the CRRES spacecraft provided data on terrestrial radiation belt electrons in the energy range from 153 to 1582 keV, during 1990-91. These data have previously been used to produce an empirical model of the radiation belts from L=1.1 to 8.9, ordered according to 17 energy bands, 18 pitch angle bins, and 5 Kp ranges. Empirical models such as this are very valuable, but are prone to statistical fluctuations and gaps in coverage. In this study, in order to smooth the data and make it easier to interpolate within data gaps, the pitch angle distribution at each energy in the model was fitted with a Bessel function. This provided a way to characterize the pitch angle distribution in terms of only two parameters for each energy. It was not possible to model fluxes reliably within the loss cone because of poor statistics. The fitted distributions give an indication of the way in which pitch angle diffusion varies in the outer radiation belts. The two parameters of the Bessel function were found to vary systematically with L value, energy and Kp. Through the fitting of a simple function to these systematic variations, the number of parameters required to describe the model could be reduced drastically.
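The two-parameter characterization described above can be sketched in a few lines. The functional form below (a modified Bessel function of the sine of the pitch angle) and the synthetic flux values are assumptions for illustration only; the abstract states just that each pitch angle distribution was reduced to two Bessel-function parameters.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import i0  # modified Bessel function of the first kind, order 0


def pitch_model(alpha, amplitude, shape):
    """Two-parameter pitch angle model: flux ~ amplitude * I0(shape * sin(alpha)).

    The exact functional form used in the paper is not given in the abstract;
    this one is a plausible stand-in for illustration.
    """
    return amplitude * i0(shape * np.sin(alpha))


rng = np.random.default_rng(0)
alpha = np.linspace(0.1, np.pi / 2, 18)  # 18 pitch angle bins, as in the model
true_amp, true_shape = 100.0, 2.5
flux = pitch_model(alpha, true_amp, true_shape)
noisy = flux * (1.0 + 0.02 * rng.standard_normal(alpha.size))  # 2% noise

# Fit recovers the two parameters that summarize the whole distribution
popt, _ = curve_fit(pitch_model, alpha, noisy, p0=[50.0, 1.0])
amp_fit, shape_fit = popt
```

Reducing each (energy, L, Kp) cell to two fitted parameters is what smooths statistical fluctuations and makes interpolation across data gaps tractable.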

  19. Statistical literacy and sample survey results

    Science.gov (United States)

    McAlevey, Lynn; Sullivan, Charles

    2010-10-01

    Sample surveys are widely used in the social sciences and business. The news media almost daily quote from them, yet they are widely misused. Using students with prior managerial experience embarking on an MBA course, we show that common sample survey results are misunderstood even by those managers who have previously done a statistics course. In general, they fare no better than managers who have never studied statistics. There are implications for teaching, especially in business schools, as well as for consulting.

  20. Infant Directed Speech Enhances Statistical Learning in Newborn Infants: An ERP Study.

    Directory of Open Access Journals (Sweden)

    Alexis N Bosseler

    Full Text Available Statistical learning and the social contexts of language addressed to infants are hypothesized to play important roles in early language development. Previous behavioral work has found that the exaggerated prosodic contours of infant-directed speech (IDS) facilitate statistical learning in 8-month-old infants. Here we examined the neural processes involved in on-line statistical learning and investigated whether the use of IDS facilitates statistical learning in sleeping newborns. Event-related potentials (ERPs) were recorded while newborns were exposed to 12 pseudo-words, six spoken with exaggerated pitch contours of IDS and six spoken without exaggerated pitch contours (ADS), in ten alternating blocks. We examined whether ERP amplitudes for syllable position within a pseudo-word (word-initial vs. word-medial vs. word-final), indicating statistical word learning, and speech register (ADS vs. IDS) would interact. The ADS and IDS registers elicited similar ERP patterns for syllable position in an early 0-100 ms component but elicited different ERP effects in both the polarity and topographical distribution at 200-400 ms and 450-650 ms. These results provide the first evidence that the exaggerated pitch contours of IDS result in differences in brain activity linked to on-line statistical learning in sleeping newborns.

  1. Statistics for X-chromosome associations.

    Science.gov (United States)

    Özbek, Umut; Lin, Hui-Min; Lin, Yan; Weeks, Daniel E; Chen, Wei; Shaffer, John R; Purcell, Shaun M; Feingold, Eleanor

    2018-06-13

    In a genome-wide association study (GWAS), association between genotype and phenotype at autosomal loci is generally tested by regression models. However, X-chromosome data are often excluded from published analyses of autosomes because males and females differ in their number of X chromosomes. Failure to analyze X-chromosome data at all is obviously less than ideal and can lead to missed discoveries. Even when X-chromosome data are included, they are often analyzed with suboptimal statistics. Several mathematically sensible statistics for X-chromosome association have been proposed. The optimality of these statistics, however, is based on very specific, simple genetic models. In addition, while previous simulation studies of these statistics have been informative, they have focused on single-marker tests and have not considered the types of error that occur, even under the null hypothesis, when the entire X chromosome is scanned. In this study, we comprehensively tested several X-chromosome association statistics using simulation studies that include the entire chromosome. We also considered a wide range of trait models for sex differences and phenotypic effects of X inactivation. We found that models that do not incorporate a sex effect can have large type I error in some cases. We also found that many of the best statistics perform well even when there are modest deviations from assumptions, such as trait variance differences between the sexes or small sex differences in allele frequencies. © 2018 WILEY PERIODICALS, INC.
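The core difficulty the abstract describes (males carry one X, females two) is commonly handled with a genotype coding like the one sketched below. The data, trait values, and the 0/2 male coding under an X-inactivation assumption are illustrative; they are not the paper's specific statistics.

```python
import numpy as np

# Hypothetical X-chromosome genotypes. Females carry 0, 1 or 2 copies of the
# minor allele; hemizygous males carry 0 or 1. Under the common X-inactivation
# assumption one male allele is treated like two female alleles, so male
# genotypes are coded 0/2. This is one of several proposed codings, not a
# universal standard.
female_geno = np.array([0, 1, 2, 1, 0, 2])
male_geno = np.array([0, 1, 1, 0])

male_coded = 2 * male_geno                        # 0/2 coding under X inactivation
genotype = np.concatenate([female_geno, male_coded])
sex = np.concatenate([np.zeros(6), np.ones(4)])   # 0 = female, 1 = male

# Design matrix with an intercept, genotype, and a sex covariate: the study
# found that models omitting a sex effect can show inflated type I error.
trait = np.array([1.2, 0.8, 1.5, 1.1, 0.9, 1.6, 2.0, 2.3, 2.1, 1.9])
X = np.column_stack([np.ones(genotype.size), genotype, sex])
beta, *_ = np.linalg.lstsq(X, trait, rcond=None)  # [intercept, genotype, sex]
```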

  2. Statistical Considerations of Food Allergy Prevention Studies.

    Science.gov (United States)

    Bahnson, Henry T; du Toit, George; Lack, Gideon

    Clinical studies to prevent the development of food allergy have recently helped reshape public policy recommendations on the early introduction of allergenic foods. These trials are also prompting new research, and it is therefore important to address the unique design and analysis challenges of prevention trials. We highlight statistical concepts and give recommendations that clinical researchers may wish to adopt when designing future study protocols and analysis plans for prevention studies. Topics include selecting a study sample, addressing internal and external validity, improving statistical power, choosing alpha and beta, analysis innovations to address dilution effects, and analysis methods to deal with poor compliance, dropout, and missing data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Previous Employment and Job Satisfaction Conditions: The Case of Regional Administration

    Science.gov (United States)

    Amalia, Myronaki; Nikolaos, Antonakas

    2009-08-01

    In the present work we study the different dimensions of satisfaction and the way satisfaction is constituted in a substantial sample of employees in the Regional Administration of Crete, and their connection with the variable of previous employment. We found statistically significant differences for the components of satisfaction with life, collaboration inside and outside the department, social satisfaction, and the variable of years in service, with the labour group showing greater satisfaction than the other previous-employment groups. On the other hand, the group with many years in service shows greater satisfaction than the other groups. Finally, it is worth noting that the sample presents some interesting characteristics.

  4. A study on the use of Gumbel approximation with the Bernoulli spatial scan statistic.

    Science.gov (United States)

    Read, S; Bath, P A; Willett, P; Maheswaran, R

    2013-08-30

    The Bernoulli version of the spatial scan statistic is a well established method of detecting localised spatial clusters in binary labelled point data, a typical application being the epidemiological case-control study. A recent study suggests the inferential accuracy of several versions of the spatial scan statistic (principally the Poisson version) can be improved, at little computational cost, by using the Gumbel distribution, a method now available in SaTScan(TM) (www.satscan.org). We study in detail the effect of this technique when applied to the Bernoulli version and demonstrate that it is highly effective, albeit with some increase in false alarm rates at certain significance thresholds. We explain how this increase is due to the discrete nature of the Bernoulli spatial scan statistic and demonstrate that it can affect even small p-values. Despite this, we argue that the Gumbel method is actually preferable for very small p-values. Furthermore, we extend previous research by running benchmark trials on 12 000 synthetic datasets, thus demonstrating that the overall detection capability of the Bernoulli version (i.e. ratio of power to false alarm rate) is not noticeably affected by the use of the Gumbel method. We also provide an example application of the Gumbel method using data on hospital admissions for chronic obstructive pulmonary disease. Copyright © 2013 John Wiley & Sons, Ltd.
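The Gumbel technique evaluated in this record can be sketched in general terms: fit a Gumbel distribution to Monte Carlo replicates of a maximum-type statistic, then read small p-values from the fitted analytic tail rather than from replicate ranks. The chi-square stand-in for the scan statistic and all constants below are assumptions for illustration, not SaTScan's implementation.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(42)

# Stand-in for the scan statistic's Monte Carlo null distribution: the maximum
# of 100 chi-square-like values per replicate. (The real Bernoulli scan
# statistic maximises a likelihood ratio over spatial windows.)
n_rep = 999
null_max = rng.chisquare(df=1, size=(n_rep, 100)).max(axis=1)

observed = 25.0  # hypothetical observed maximum statistic

# Rank-based Monte Carlo p-value: resolution is limited to 1/(n_rep + 1)
p_mc = (1 + np.sum(null_max >= observed)) / (n_rep + 1)

# Gumbel approximation: fit location/scale to the replicates, then read the
# upper tail analytically, allowing p-values far below 1/(n_rep + 1)
loc, scale = gumbel_r.fit(null_max)
p_gumbel = gumbel_r.sf(observed, loc, scale)
```

The analytic tail is what makes the method attractive for very small p-values, at little computational cost beyond the replicates already generated.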

  5. Statistical data analysis using SAS intermediate statistical methods

    CERN Document Server

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data to advanced undergraduate and graduate students in statistics, data science, and disciplines that involve analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, and analysis of variance with random and mixed effects models, before taking the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce the theory and methodological basis topic by topic, present a problem as an application, and follow with a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end-of-chapter exercises, downloadable SAS code and data sets, and advanced material suitab...

  6. PCNL - a comparative study in nonoperated and in previously operated (open nephrolithotomy/pyelolithotomy patients - a single-surgeon experience

    Directory of Open Access Journals (Sweden)

    Rahul Gupta

    2011-12-01

    Full Text Available PURPOSE: Repeat procedures in patients with a history of open stone surgery are usually challenging due to the alteration of the retroperitoneal anatomy. The aim of this study was to determine the possible impact of open renal surgery on the efficacy and morbidity of subsequent percutaneous nephrolithotomy (PCNL). MATERIALS AND METHODS: From March 2009 until September 2010, 120 patients underwent PCNL. Of these, 20 patients were excluded (tubeless or bilateral simultaneous PCNL). Of the remaining 100, 55 primary patients were categorized as Group 1 and the remaining (previous open nephrolithotomy) as Group 2. Standard preoperative evaluation was carried out prior to intervention. Statistical analysis was performed using SPSS v. 11 with the chi-square test, independent-samples t-test, and Mann-Whitney U test. A p-value < 0.05 was taken as statistically significant. RESULTS: Both groups were similar in demographic profile and stone burden. The number of attempts needed to access the pelvicaliceal system was lower in Group 1 than in Group 2 (1.2 ± 1.2 vs. 3 ± 1.3, respectively), and this difference was statistically significant (p < 0.04). However, the difference in mean operative time between the two groups was not statistically significant (p = 0.44). Blood transfusion rates were comparable in the two groups (p = 0.24). One patient in Group 2 developed hemothorax following a supra-11th puncture. The remaining complications were comparable in both groups. CONCLUSION: Patients with a past history of renal stone surgery may need more attempts to access the pelvicaliceal system and may have difficulty in tract dilation secondary to retroperitoneal scarring, but overall morbidity and efficacy are the same in both groups.
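The comparisons named in this abstract (chi-square, independent-samples t-test, Mann-Whitney U, significance at p < 0.05) map directly onto standard library calls. The counts below are hypothetical illustrations, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical access-attempt counts per patient:
# Group 1 = primary PCNL (n = 55), Group 2 = previously operated (n = 45).
group1 = rng.poisson(1.2, size=55) + 1
group2 = rng.poisson(3.0, size=45) + 1

# Mann-Whitney U test for a non-normally distributed outcome
u_stat, p_mw = stats.mannwhitneyu(group1, group2, alternative="two-sided")

# Welch's t-test for a continuous outcome comparison (e.g. operative time)
t_stat, p_t = stats.ttest_ind(group1, group2, equal_var=False)

significant = p_mw < 0.05  # the study's significance threshold
```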

  7. An Entropy-Based Statistic for Genomewide Association Studies

    OpenAIRE

    Zhao, Jinying; Boerwinkle, Eric; Xiong, Momiao

    2005-01-01

    Efficient genotyping methods and the availability of a large collection of single-nucleotide polymorphisms provide valuable tools for genetic studies of human disease. The standard χ2 statistic for case-control studies, which uses a linear function of allele frequencies, has limited power when the number of marker loci is large. We introduce a novel test statistic for genetic association studies that uses Shannon entropy and a nonlinear function of allele frequencies to amplify the difference...
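The entropy idea can be illustrated with a minimal sketch: compute the Shannon entropy of allele frequency vectors in cases and controls and contrast them. The frequencies below are hypothetical, and the paper's actual test combines such terms into a statistic with a known null distribution, which is not reproduced here.

```python
import numpy as np


def shannon_entropy(p):
    """Shannon entropy of a frequency vector (0 * log 0 treated as 0)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))


# Hypothetical allele/haplotype frequencies at a marker in cases vs. controls.
cases = np.array([0.45, 0.35, 0.15, 0.05])
controls = np.array([0.40, 0.40, 0.10, 0.10])

# The nonlinear (logarithmic) transform amplifies small frequency differences
# relative to the linear functions used by the standard chi-square statistic.
diff = shannon_entropy(cases) - shannon_entropy(controls)
```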

  8. The clinic-statistic study of osteoporosis

    Directory of Open Access Journals (Sweden)

    Florin MARCU

    2008-05-01

    Full Text Available Osteoporosis is the most common metabolic bone disease and is characterized by a reduction in bone mass and deterioration of bone quality, conferring a higher risk of fractures and injuries. Osteoporosis comes to clinical attention when it is severe enough to induce microfractures and the collapse of vertebral bodies, manifesting as back pain or a predisposition to other bone fractures. The aim of the study was to establish a statistical, numerical ratio between women and men among subjects diagnosed with osteoporosis through DEXA who present with clinical symptomatology. We studied a group of male and female subjects who had been diagnosed with osteoporosis through DEXA at the EURORAD clinic in Oradea from 01.01.2007 to the present. The result of the study was that the symptomatology of osteoporosis, with pain and even cases of fractures, is more evident in female subjects than in male patients; statistically, a woman-to-man ratio of 6.1:1 was established.

  9. Study of functional-performance deficits in athletes with previous ankle sprains

    Directory of Open Access Journals (Sweden)

    hamid Babaee

    2008-04-01

    Full Text Available Abstract Background: Despite the importance of functional-performance deficits in athletes with a history of ankle sprain, few studies have been carried out in this area. The aim of this research was to study the relationship between previous ankle sprains and functional-performance deficits in athletes. Materials and methods: The subjects were 40 professional athletes selected through random sampling among volunteer participants from soccer, basketball, volleyball and handball teams of Lorestan province. The subjects were divided into 2 groups: an injured group (athletes with previous ankle sprains) and a healthy group (athletes without previous ankle sprains). In this descriptive study we used functional-performance tests (the figure-8 hop test and the side hop test) to determine ankle deficits and limitations. Subjects performed the figure-8 hop test, hopping around a figure-8 course 5 meters in length, and the side hop test, consisting of 10 side-hop repetitions over a course 30 centimeters in length. Times were recorded with a stopwatch. Results: After data gathering and assessment of the distributions, Pearson correlation was used to assess relationships and the independent-samples t-test to assess differences between variables. The results showed a significant relationship between previous ankle sprains and functional-performance deficits in the athletes. Conclusion: Athletes with previous ankle sprains showed greater functional-performance deficits than healthy athletes on the functional-performance tests. The figure-8 hop and side hop tests are sensitive and suitable for assessing and detecting functional-performance deficits in athletes, and can be used for prevention, assessment and rehabilitation of ankle sprains without requiring much money or time.

  10. Statistics in a Nutshell

    CERN Document Server

    Boslaugh, Sarah

    2008-01-01

    Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat

  11. Statistical Learning and Dyslexia: A Systematic Review

    Science.gov (United States)

    Schmalz, Xenia; Altoè, Gianmarco; Mulatti, Claudio

    2017-01-01

    The existing literature on developmental dyslexia (hereafter: dyslexia) often focuses on isolating cognitive skills which differ across dyslexic and control participants. Among potential correlates, previous research has studied group differences between dyslexic and control participants in performance on statistical learning tasks. A statistical…

  12. Personality disorders in previously detained adolescent females: a prospective study

    NARCIS (Netherlands)

    Krabbendam, A.; Colins, O.F.; Doreleijers, T.A.H.; van der Molen, E.; Beekman, A.T.F.; Vermeiren, R.R.J.M.

    2015-01-01

    This longitudinal study investigated the predictive value of trauma and mental health problems for the development of antisocial personality disorder (ASPD) and borderline personality disorder (BPD) in previously detained women. The participants were 229 detained adolescent females who were assessed

  13. Survival of dental implants placed in sites of previously failed implants.

    Science.gov (United States)

    Chrcanovic, Bruno R; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann

    2017-11-01

    To assess the survival of dental implants placed in sites of previously failed implants and to explore the possible factors that might affect the outcome of this reimplantation procedure. Patients whose failed dental implants were replaced with the same implant type at the same site were included. Descriptive statistics were used to describe the patients and implants; survival analysis was also performed. The effect of systemic, environmental, and local factors on the survival of the reoperated implants was evaluated. Of 10,096 implants, 175 in 98 patients were replaced by another implant at the same location (159, 14, and 2 implants at second, third, and fourth surgeries, respectively). Newly replaced implants were generally of similar diameter but of shorter length compared to the previously placed fixtures. A statistically significant greater percentage of the lost implants had been placed in sites with low bone quantity. There was a statistically significant difference (P = 0.032) in the survival rates between implants that were inserted for the first time (94%) and implants that replaced the ones lost (73%). Reoperated implants failed at a statistically higher rate in patients taking antidepressants and antithrombotic agents. Dental implants replacing failed implants had lower survival rates than the rates reported for the previous attempts at implant placement. It is suggested that a site-specific negative effect may possibly be associated with this phenomenon, as well as the intake of antidepressants and antithrombotic agents. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
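
The survival analysis mentioned above is typically a Kaplan-Meier estimate; a minimal generic sketch (the follow-up times are invented, not the study's data):

```python
# Minimal Kaplan-Meier estimator, a generic sketch of the kind of survival
# analysis described above (the implant follow-up data are hypothetical).

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = implant failed, 0 = censored.
    Returns a list of (time, survival probability) steps."""
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)   # number still at risk at t
        if d > 0:
            surv *= 1.0 - d / n
            curve.append((t, surv))
    return curve

# Hypothetical follow-up: months to failure (event=1) or censoring (event=0)
times  = [3, 6, 6, 12, 18, 24, 24, 36]
events = [1, 1, 0, 1, 0, 1, 0, 0]
for t, s in kaplan_meier(times, events):
    print(f"month {t}: S(t) = {s:.3f}")
```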

  14. Value of computed tomography pelvimetry in patients with a previous cesarean section

    International Nuclear Information System (INIS)

    Yamani, Tarik Y.; Rouzi, Abdulrahim A.

    1998-01-01

    A case-control study was conducted at the Department of Obstetrics and Gynaecology, King Abdulaziz University Hospital, Jeddah, Saudi Arabia to determine the value of computed tomography pelvimetry in patients with a previous cesarean section. Between January 1993 and December 1995, 219 pregnant women with one previous cesarean were studied; 100 had antenatal CT pelvimetry for assessment of the pelvis, while 119 did not and served as controls. Fifty-one women (51%) in the CT pelvimetry group were delivered by cesarean section: 23 (23%) underwent elective cesarean section for contracted pelvis based upon the findings of CT pelvimetry, and 28 (28%) underwent emergency cesarean section after trial of labor. In the group that did not have CT pelvimetry, 26 women (21.8%) underwent emergency cesarean section. This was a statistically significant difference (P=0.02). There were no statistically significant differences in birthweight or Apgar scores between the groups, and no perinatal or maternal mortality in this study. Computed tomography pelvimetry increased the rate of cesarean delivery without any benefit in the immediate delivery outcomes. Therefore, the practice of documenting the adequacy of the pelvis by CT pelvimetry before vaginal birth after cesarean should be abandoned. (author)
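
Group comparisons like the P=0.02 above come from a chi-square test on a 2x2 table; a hedged sketch (the counts are adapted loosely from the abstract, and the comparison is illustrative, not a reproduction of the paper's actual test):

```python
# Pearson chi-square for a 2x2 table, pure stdlib. The counts below are
# hypothetical, loosely adapted from the abstract above.
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (df=1, no continuity correction) for the table
    [[a, b], [c, d]]; returns (statistic, two-sided p-value)."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(stat / 2.0))   # chi-square survival fn for df=1
    return stat, p

# Illustrative: 51/100 cesareans with CT pelvimetry vs 26/119 without
stat, p = chi2_2x2(51, 49, 26, 93)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```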

  15. 7th International Workshop on Statistical Simulation

    CERN Document Server

    Mignani, Stefania; Monari, Paola; Salmaso, Luigi

    2014-01-01

    The Department of Statistical Sciences of the University of Bologna in collaboration with the Department of Management and Engineering of the University of Padova, the Department of Statistical Modelling of Saint Petersburg State University, and INFORMS Simulation Society sponsored the Seventh Workshop on Simulation. This international conference was devoted to statistical techniques in stochastic simulation, data collection, analysis of scientific experiments, and studies representing broad areas of interest. The previous workshops took place in St. Petersburg, Russia in 1994, 1996, 1998, 2001, 2005, and 2009. The Seventh Workshop took place in the Rimini Campus of the University of Bologna, which is in Rimini’s historical center.

  16. Statistics in the pharmacy literature.

    Science.gov (United States)

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

    Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-square (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.

  17. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
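
As a toy illustration of the pairwise models discussed above, the following sketch fits a three-unit pairwise maximum-entropy (Ising-like) model by exact gradient ascent on the log-likelihood, matching the data's means and pairwise moments. The binary "data" and learning rate are invented; real applications involve many more units and the approximate inference methods the paper analyzes.

```python
# Toy fit of a pairwise maximum-entropy model P(x) ∝ exp(Σ h_i x_i + Σ J_ij x_i x_j)
# to binary data by matching means and pairwise moments (exact, 3 units only).
import itertools
import math

N = 3
states = list(itertools.product([0, 1], repeat=N))

def model_moments(h, J):
    """Exact means <x_i> and pairwise moments <x_i x_j> under the model."""
    weights = []
    for s in states:
        e = sum(h[i] * s[i] for i in range(N))
        e += sum(J[i][j] * s[i] * s[j] for i in range(N) for j in range(i + 1, N))
        weights.append(math.exp(e))
    Z = sum(weights)
    p = [w / Z for w in weights]
    m = [sum(p[k] * s[i] for k, s in enumerate(states)) for i in range(N)]
    c = [[sum(p[k] * s[i] * s[j] for k, s in enumerate(states))
          for j in range(N)] for i in range(N)]
    return m, c

# "Observed" moments from a hypothetical binarized recording
data = [(1,0,0), (1,1,0), (0,1,1), (1,0,1), (1,1,1), (0,0,0), (1,0,0), (0,1,0)]
dm = [sum(s[i] for s in data) / len(data) for i in range(N)]
dc = [[sum(s[i] * s[j] for s in data) / len(data) for j in range(N)] for i in range(N)]

h = [0.0] * N
J = [[0.0] * N for _ in range(N)]
for step in range(2000):            # gradient ascent on the log-likelihood
    m, c = model_moments(h, J)
    for i in range(N):
        h[i] += 0.5 * (dm[i] - m[i])
        for j in range(i + 1, N):
            J[i][j] += 0.5 * (dc[i][j] - c[i][j])

m, c = model_moments(h, J)
print("fitted means:", [round(x, 3) for x in m], "target:", dm)
```

Because the log-likelihood of an exponential-family model is concave in (h, J), this plain gradient ascent converges to the unique moment-matching solution.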

  18. Case-control study for colorectal cancer genetic susceptibility in EPICOLON: previously identified variants and mucins

    Directory of Open Access Journals (Sweden)

    Moreno Victor

    2011-08-01

    Background: Colorectal cancer (CRC) is the second leading cause of cancer death in developed countries. Familial aggregation in CRC is also important outside syndromic forms and, in this case, a polygenic model with several common low-penetrance alleles contributing to CRC genetic predisposition could be hypothesized. Mucins and GALNTs (N-acetylgalactosaminyltransferase) are interesting candidates for CRC genetic susceptibility and have not been previously evaluated. We present results for ten genetic variants linked to CRC risk in previous studies (previously identified category) and 18 selected variants from the mucin gene family in a case-control association study from the Spanish EPICOLON consortium. Methods: CRC cases and matched controls were from EPICOLON, a prospective, multicenter, nationwide Spanish initiative, comprised of two independent stages. Stage 1 corresponded to 515 CRC cases and 515 controls, whereas stage 2 consisted of 901 CRC cases and 909 controls. Also, an independent cohort of 549 CRC cases and 599 controls outside EPICOLON was available for additional replication. Genotyping was performed for ten previously identified SNPs in ADH1C, APC, CCDN1, IL6, IL8, IRS1, MTHFR, PPARG, VDR and ARL11, and 18 selected variants in the mucin gene family. Results: None of the 28 SNPs analyzed in our study was found to be associated with CRC risk. Although four SNPs were significant with a P-value < 0.05 in EPICOLON stage 1 [rs698 in ADH1C (OR = 1.63, 95% CI = 1.06-2.50, P-value = 0.02, recessive), rs1800795 in IL6 (OR = 1.62, 95% CI = 1.10-2.37, P-value = 0.01, recessive), rs3803185 in ARL11 (OR = 1.58, 95% CI = 1.17-2.15, P-value = 0.007, codominant), and rs2102302 in GALNTL2 (OR = 1.20, 95% CI = 1.00-1.44, P-value = 0.04, log-additive 0, 1, 2 alleles)], only rs3803185 achieved statistical significance in EPICOLON stage 2 (OR = 1.34, 95% CI = 1.06-1.69, P-value = 0.01, recessive). In the joint analysis for both stages, results were only significant for rs3803185 (OR = 1

  19. Case-control study for colorectal cancer genetic susceptibility in EPICOLON: previously identified variants and mucins

    International Nuclear Information System (INIS)

    Abulí, Anna; Morillas, Juan D; Rigau, Joaquim; Latorre, Mercedes; Fernández-Bañares, Fernando; Peña, Elena; Riestra, Sabino; Payá, Artemio; Jover, Rodrigo; Xicola, Rosa M; Llor, Xavier; Fernández-Rozadilla, Ceres; Carvajal-Carmona, Luis; Villanueva, Cristina M; Moreno, Victor; Piqué, Josep M; Carracedo, Angel; Castells, Antoni; Andreu, Montserrat; Ruiz-Ponte, Clara; Castellví-Bel, Sergi; Alonso-Espinaco, Virginia; Muñoz, Jenifer; Gonzalo, Victoria; Bessa, Xavier; González, Dolors; Clofent, Joan; Cubiella, Joaquin

    2011-01-01

    Colorectal cancer (CRC) is the second leading cause of cancer death in developed countries. Familial aggregation in CRC is also important outside syndromic forms and, in this case, a polygenic model with several common low-penetrance alleles contributing to CRC genetic predisposition could be hypothesized. Mucins and GALNTs (N-acetylgalactosaminyltransferase) are interesting candidates for CRC genetic susceptibility and have not been previously evaluated. We present results for ten genetic variants linked to CRC risk in previous studies (previously identified category) and 18 selected variants from the mucin gene family in a case-control association study from the Spanish EPICOLON consortium. CRC cases and matched controls were from EPICOLON, a prospective, multicenter, nationwide Spanish initiative, comprised of two independent stages. Stage 1 corresponded to 515 CRC cases and 515 controls, whereas stage 2 consisted of 901 CRC cases and 909 controls. Also, an independent cohort of 549 CRC cases and 599 controls outside EPICOLON was available for additional replication. Genotyping was performed for ten previously identified SNPs in ADH1C, APC, CCDN1, IL6, IL8, IRS1, MTHFR, PPARG, VDR and ARL11, and 18 selected variants in the mucin gene family. None of the 28 SNPs analyzed in our study was found to be associated with CRC risk. Although four SNPs were significant with a P-value < 0.05 in EPICOLON stage 1 [rs698 in ADH1C (OR = 1.63, 95% CI = 1.06-2.50, P-value = 0.02, recessive), rs1800795 in IL6 (OR = 1.62, 95% CI = 1.10-2.37, P-value = 0.01, recessive), rs3803185 in ARL11 (OR = 1.58, 95% CI = 1.17-2.15, P-value = 0.007, codominant), and rs2102302 in GALNTL2 (OR = 1.20, 95% CI = 1.00-1.44, P-value = 0.04, log-additive 0, 1, 2 alleles)], only rs3803185 achieved statistical significance in EPICOLON stage 2 (OR = 1.34, 95% CI = 1.06-1.69, P-value = 0.01, recessive). In the joint analysis for both stages, results were only significant for rs3803185 (OR = 1.12, 95% CI = 1

  20. GALEX-SDSS CATALOGS FOR STATISTICAL STUDIES

    International Nuclear Information System (INIS)

    Budavari, Tamas; Heinis, Sebastien; Szalay, Alexander S.; Nieto-Santisteban, Maria; Bianchi, Luciana; Gupchup, Jayant; Shiao, Bernie; Smith, Myron; Chang Ruixiang; Kauffmann, Guinevere; Morrissey, Patrick; Wyder, Ted K.; Martin, D. Christopher; Barlow, Tom A.; Forster, Karl; Friedman, Peter G.; Schiminovich, David; Milliard, Bruno; Donas, Jose; Seibert, Mark

    2009-01-01

    We present a detailed study of the Galaxy Evolution Explorer's (GALEX) photometric catalogs with special focus on the statistical properties of the All-sky and Medium Imaging Surveys. We introduce the concept of primaries to resolve the issue of multiple detections and follow a geometric approach to define clean catalogs with well understood selection functions. We cross-identify the GALEX sources (GR2+3) with Sloan Digital Sky Survey (SDSS; DR6) observations, which indirectly provides an invaluable insight into the astrometric model of the UV sources and allows us to revise the band merging strategy. We derive the formal description of the GALEX footprints as well as their intersections with the SDSS coverage along with analytic calculations of their areal coverage. The crossmatch catalogs are made available for the public. We conclude by illustrating the implementation of typical selection criteria in SQL for catalog subsets geared toward statistical analyses, e.g., correlation and luminosity function studies.

  1. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  2. IGESS: a statistical approach to integrating individual-level genotype data and summary statistics in genome-wide association studies.

    Science.gov (United States)

    Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben

    2017-09-15

    Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure statistical power of identifying these variants with small effects. However, it is often the case that a research group can only get approval for the access to individual-level genotype data with a limited sample size (e.g. a few hundreds or thousands). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available. The sample sizes associated with the summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increasing statistical power of identifying risk variants and improving accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over the methods which take either individual-level data or summary statistics data as input. We applied IGESS to perform integrative analysis of Crohn's disease from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240,000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS . zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
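
IGESS itself fits a variational Bayes model, which is beyond a few lines; as a much simpler illustration of the underlying idea of pooling individual-level and summary-level evidence for a variant, here is a plain fixed-effect inverse-variance combination (a different, far cruder technique than IGESS; all numbers are hypothetical):

```python
# Fixed-effect inverse-variance pooling of two independent effect estimates
# for one variant: a small individual-level study and a large published
# summary-statistics study. Illustrative only; this is not the IGESS method.
import math

def inverse_variance_pool(beta1, se1, beta2, se2):
    """Combine two independent estimates; returns (beta, se, z)."""
    w1, w2 = 1 / se1 ** 2, 1 / se2 ** 2
    beta = (w1 * beta1 + w2 * beta2) / (w1 + w2)
    se = math.sqrt(1 / (w1 + w2))
    return beta, se, beta / se

# hypothetical: small individual-level estimate vs. large summary estimate
beta, se, z = inverse_variance_pool(0.12, 0.08, 0.09, 0.02)
print(f"pooled beta = {beta:.3f}, se = {se:.3f}, z = {z:.2f}")
```

The pooled standard error is always smaller than either input, which is the sense in which adding summary data increases power.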

  3. Statistical studies of powerful extragalactic radio sources

    Energy Technology Data Exchange (ETDEWEB)

    Macklin, J T

    1981-01-01

    This dissertation is mainly about the use of efficient statistical tests to study the properties of powerful extragalactic radio sources. Most of the analysis is based on subsets of a sample of 166 bright (3CR) sources selected at 178 MHz. The first chapter is introductory and it is followed by three on the misalignment and symmetry of double radio sources. The properties of nuclear components in extragalactic sources are discussed in the next chapter, using statistical tests which make efficient use of upper limits, often the only available information on the flux density from the nuclear component. Multifrequency observations of four 3CR sources are presented in the next chapter. The penultimate chapter is about the analysis of correlations involving more than two variables. The Spearman partial rank correlation coefficient is shown to be the most powerful test available which is based on non-parametric statistics. It is therefore used to study the dependences of the properties of sources on their size at constant redshift, and the results are interpreted in terms of source evolution. Correlations of source properties with luminosity and redshift are then examined.

  4. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling directly perturbed nuclear data; a reliable uncertainty result can then be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors such as negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method using the lognormal distribution was proposed to increase sampling accuracy without negative-sampling error, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, cross-section sampling was pursued with both the normal and lognormal distributions. The uncertainties caused by the covariance of (n,.) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution efficiently solves the negative-sampling problem noted in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
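
The moment-matched lognormal sampling described above can be sketched in a few lines: choose lognormal parameters so the distribution keeps the nominal cross section's mean and standard deviation, then compare negative-sample counts against plain Gaussian sampling. The numbers are illustrative, not the paper's data.

```python
# Sampling a perturbed cross section from a normal vs. a moment-matched
# lognormal distribution; only the lognormal is guaranteed positive.
import math
import random

random.seed(1)
mean, sd = 2.0, 1.0            # nominal cross section and (large) uncertainty

# lognormal parameters mu, sigma chosen so the lognormal itself has the
# target mean and standard deviation
sigma2 = math.log(1.0 + (sd / mean) ** 2)
mu = math.log(mean) - 0.5 * sigma2
sigma = math.sqrt(sigma2)

normal_draws    = [random.gauss(mean, sd) for _ in range(10000)]
lognormal_draws = [random.lognormvariate(mu, sigma) for _ in range(10000)]

print("negative normal draws:   ", sum(x <= 0 for x in normal_draws))
print("negative lognormal draws:", sum(x <= 0 for x in lognormal_draws))
```

With mean 2.0 and standard deviation 1.0, roughly 2% of the Gaussian draws are non-physical (negative), while every lognormal draw is positive and the target moments are preserved.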

  5. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    Uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling directly perturbed nuclear data; a reliable uncertainty result can then be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors such as negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method of sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method using the lognormal distribution was proposed to increase sampling accuracy without negative-sampling error, and a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, cross-section sampling was pursued with both the normal and lognormal distributions. The uncertainties caused by the covariance of (n,.) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution efficiently solves the negative-sampling problem noted in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.

  6. Studies in Theoretical and Applied Statistics

    CERN Document Server

    Pratesi, Monica; Ruiz-Gazen, Anne

    2018-01-01

    This book includes a wide selection of the papers presented at the 48th Scientific Meeting of the Italian Statistical Society (SIS2016), held in Salerno on 8-10 June 2016. Covering a wide variety of topics ranging from modern data sources and survey design issues to measuring sustainable development, it provides a comprehensive overview of the current Italian scientific research in the fields of open data and big data in public administration and official statistics, survey sampling, ordinal and symbolic data, statistical models and methods for network data, time series forecasting, spatial analysis, environmental statistics, economic and financial data analysis, statistics in the education system, and sustainable development. Intended for researchers interested in theoretical and empirical issues, this volume provides interesting starting points for further research.

  7. Matched case-control studies: a review of reported statistical methodology

    Directory of Open Access Journals (Sweden)

    Niven DJ

    2012-04-01

    Daniel J Niven,1 Luc R Berthiaume,2 Gordon H Fick,1 Kevin B Laupland1 (1Department of Critical Care Medicine, Peter Lougheed Centre, Calgary; 2Department of Community Health Sciences, University of Calgary, Calgary, Alberta, Canada). Background: Case-control studies are a common and efficient means of studying rare diseases or illnesses with long latency periods. Matching of cases and controls is frequently employed to control for the effects of known potential confounding variables, and the analysis of matched data requires specific statistical methods. Methods: The objective of this study was to determine the proportion of published, peer-reviewed matched case-control studies that used statistical methods appropriate for matched data. Using a comprehensive set of search criteria we identified 37 matched case-control studies for detailed analysis. Results: Among these 37 articles, only 16 (43%) were analyzed with proper statistical techniques. Properly analyzed studies were more likely to have included case patients with cancer and cardiovascular disease (10/16 or 63%, versus 5/21 or 24%, P = 0.02) and to have matched multiple controls to each case (14/16 or 88%, versus 13/21 or 62%, P = 0.08). In addition, studies with properly analyzed data were more likely to have been published in a journal with an impact factor in the top 100 of the Journal Citation Reports index (12/16 or 69%, versus 1/21 or 5%, P ≤ 0.0001). Conclusion: These findings raise concern that the majority of matched case-control studies report results derived from improper statistical analyses. This may lead to errors in estimating the relationship between a disease and exposure, as well as incorrect adaptation of the emerging medical literature. Keywords: case-control, matched, dependent data, statistics
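
For 1:1 matched pairs, one appropriate method is McNemar's test, which uses only the discordant pairs; analyzing the same data with an ordinary unmatched chi-square is exactly the kind of improper analysis the review describes. A minimal sketch with hypothetical pair counts:

```python
# McNemar's test for 1:1 matched case-control pairs (pair counts invented).
import math

def mcnemar(b, c):
    """b = pairs with exposed case / unexposed control, c = the reverse.
    Returns (statistic, two-sided p) without continuity correction."""
    stat = (b - c) ** 2 / (b + c)
    p = math.erfc(math.sqrt(stat / 2.0))   # chi-square df=1 survival function
    return stat, p

stat, p = mcnemar(25, 10)
print(f"McNemar chi2 = {stat:.2f}, p = {p:.4f}")
print("matched odds ratio =", 25 / 10)    # for paired data, simply b / c
```

Note that the concordant pairs (both exposed, or neither) carry no information about the exposure-disease association and never enter the statistic.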

  8. Dynamics of EEG functional connectivity during statistical learning.

    Science.gov (United States)

    Tóth, Brigitta; Janacsek, Karolina; Takács, Ádám; Kóbor, Andrea; Zavecz, Zsófia; Nemeth, Dezso

    2017-10-01

    Statistical learning is a fundamental mechanism of the brain, which extracts and represents regularities of our environment. Statistical learning is crucial in predictive processing, and in the acquisition of perceptual, motor, cognitive, and social skills. Although previous studies have revealed competitive neurocognitive processes underlying statistical learning, the neural communication of the related brain regions (functional connectivity, FC) has not yet been investigated. The present study aimed to fill this gap by investigating FC networks that promote statistical learning in humans. Young adults (N=28) performed a statistical learning task while 128-channel EEG was acquired. The task involved probabilistic sequences, which enabled measurement of incidental/implicit learning of conditional probabilities. Phase synchronization in seven frequency bands was used to quantify FC between cortical regions during the first, second, and third periods of the learning task, respectively. Here we show that statistical learning is negatively correlated with FC of the anterior brain regions in slow (theta) and fast (beta) oscillations. These negative correlations increased as the learning progressed. Our findings provide evidence that dynamic antagonist brain networks serve as a hallmark of statistical learning. Copyright © 2017 Elsevier Inc. All rights reserved.
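
Phase synchronization between two channels is commonly quantified by the phase-locking value (PLV): the magnitude of the time-averaged unit phasor of the phase difference. This is a generic sketch on synthetic phase series, not the study's pipeline:

```python
# Phase-locking value (PLV) between two phase time series; PLV is 1 for a
# constant phase lag and near 0 for unrelated phases. Signals are synthetic.
import cmath
import math
import random

def plv(phases_a, phases_b):
    """Phase-locking value of two equal-length phase series, in [0, 1]."""
    n = len(phases_a)
    return abs(sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b)) / n)

random.seed(0)
t = [i * 0.01 for i in range(1000)]
locked_a = [2 * math.pi * 6 * x for x in t]                # 6 Hz phase ramp
locked_b = [2 * math.pi * 6 * x + 0.4 for x in t]          # same ramp, fixed lag
noise_b  = [random.uniform(-math.pi, math.pi) for _ in t]  # random phases

print("locked pair PLV:", round(plv(locked_a, locked_b), 3))
print("random pair PLV:", round(plv(locked_a, noise_b), 3))
```

In practice the instantaneous phases would come from a Hilbert transform or wavelet decomposition of band-filtered EEG rather than being constructed directly.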

  9. Study of developing a database of energy statistics

    Energy Technology Data Exchange (ETDEWEB)

    Park, T.S. [Korea Energy Economics Institute, Euiwang (Korea, Republic of)

    1997-08-01

    An integrated energy database should be prepared in advance for managing energy statistics comprehensively. However, since developing an integrated energy database requires considerable manpower and budget, it is difficult to establish one within a short period of time. Therefore, as a first stage of work toward the energy database, this study aims to derive methods for analyzing existing statistical data lists and consolidating insufficient data, and at the same time to analyze the general concepts and data structure of the database. The study also examines the data content and items of energy databases operated by international energy-related organizations such as the IEA and APEC and by Japan and the USA, the state of domestic energy databases, and the hardware operating systems of the Japanese databases. It analyzes how Korean energy databases are compiled, discusses the KEDB system, which is representative of total energy databases, and presents design concepts for new energy databases. In addition, it presents directions for establishing future Korean energy databases and their contents, the data that should be collected as supply and demand statistics, and the organization of data collection, based on an analysis of Korean energy statistical data and a comparison with the OECD/IEA system. 26 refs., 15 figs., 11 tabs.

  10. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  11. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    Science.gov (United States)

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for the statistical approaches used to analyse the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of the observation, 16 (14%) analysed data from one eye only, 36 (32%) analysed data from both eyes at the ocular level, one (1%) analysed an overall per-individual summary of the ocular findings, and three (3%) used paired comparisons. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at the ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available, and the practice of statistical analysis did not improve over the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
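
One simple way to respect the intereye correlation flagged above is to reduce each person's two eyes to a single summary before analysis (the per-individual approach the review found in only one paper). A sketch with invented IOP-like values showing how naive pooling of eyes understates the standard error:

```python
# Pooling both eyes as if independent doubles n and shrinks the standard
# error of the mean; averaging eyes within each person keeps n honest.
# All values are invented for illustration.
right = [16.0, 21.0, 14.0, 18.0, 25.0, 17.0]
left  = [15.5, 20.0, 14.5, 19.0, 24.0, 18.0]   # strongly correlated with right

pooled = right + left                                      # improper: 12 "eyes"
per_person = [(r + l) / 2 for r, l in zip(right, left)]    # proper: 6 people

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

print("pooled     n =", len(pooled),
      " SEM =", round(sd(pooled) / len(pooled) ** 0.5, 3))
print("per-person n =", len(per_person),
      " SEM =", round(sd(per_person) / len(per_person) ** 0.5, 3))
```

More general remedies (GEE or mixed models with an eye-within-person term) handle the correlation without discarding within-person information, but the per-person summary is the simplest valid fix.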

  12. Comparing Student Success and Understanding in Introductory Statistics under Consensus and Simulation-Based Curricula

    Science.gov (United States)

    Hildreth, Laura A.; Robison-Cox, Jim; Schmidt, Jade

    2018-01-01

    This study examines the transferability of results from previous studies of simulation-based curriculum in introductory statistics using data from 3,500 students enrolled in an introductory statistics course at Montana State University from fall 2013 through spring 2016. During this time, four different curricula, a traditional curriculum and…

  13. Matched cohort study of external cephalic version in women with previous cesarean delivery.

    Science.gov (United States)

    Keepanasseril, Anish; Anand, Keerthana; Soundara Raghavan, Subrahmanian

    2017-07-01

    To evaluate the efficacy and safety of external cephalic version (ECV) among women with previous cesarean delivery. A retrospective study was conducted using data for women with previous cesarean delivery and breech presentation who underwent ECV at or after 36 weeks of pregnancy during 2011-2016. For every case, two multiparous women without previous cesarean delivery who underwent ECV and were matched for age and pregnancy duration were included. Characteristics and outcomes were compared between groups. ECV was successful for 32 (84.2%) of 38 women with previous cesarean delivery and 62 (81.6%) in the control group (P=0.728). Multivariate regression analysis confirmed that previous cesarean was not associated with ECV success (odds ratio 1.89, 95% confidence interval 0.19-18.47; P=0.244). Successful vaginal delivery after successful ECV was reported for 19 (59.4%) women in the previous cesarean delivery group and 52 (83.9%) in the control group (P<0.001). No ECV-associated complications occurred in women with previous cesarean delivery. To avoid a repeat cesarean delivery, ECV can be offered to women with breech presentation and previous cesarean delivery who are otherwise eligible for a trial of labor. © 2017 International Federation of Gynecology and Obstetrics.

  14. West Valley high-level nuclear waste glass development: a statistically designed mixture study

    Energy Technology Data Exchange (ETDEWEB)

    Chick, L.A.; Bowen, W.M.; Lokken, R.O.; Wald, J.W.; Bunnell, L.R.; Strachan, D.M.

    1984-10-01

    The first full-scale conversion of high-level commercial nuclear wastes to glass in the United States will be conducted at West Valley, New York, by West Valley Nuclear Services Company, Inc. (WVNS), for the US Department of Energy. Pacific Northwest Laboratory (PNL) is supporting WVNS in the design of the glass-making process and the chemical formulation of the glass. This report describes the statistically designed study performed by PNL to develop the glass composition recommended for use at West Valley. The recommended glass contains 28 wt% waste, as limited by process requirements. The waste loading and the silica content (45 wt%) are similar to those in previously developed waste glasses; however, the new formulation contains more calcium and less boron. A series of tests verified that the increased calcium results in improved chemical durability and does not adversely affect the other modeled properties. The optimization study assessed the effects of seven oxide components on glass properties. Over 100 melts combining the seven components into a wide variety of statistically chosen compositions were tested. Viscosity, electrical conductivity, thermal expansion, crystallinity, and chemical durability were measured and empirically modeled as a function of the glass composition. The mathematical models were then used to predict the optimum formulation. This glass was tested and adjusted to arrive at the final composition recommended for use at West Valley. 56 references, 49 figures, 18 tables.

  15. Potential errors and misuse of statistics in studies on leakage in endodontics.

    Science.gov (United States)

    Lucena, C; Lopez, J M; Pulgar, R; Abalos, C; Valderrama, M J

    2013-04-01

    To assess the quality of the statistical methodology used in studies of leakage in Endodontics, and to compare the results found using appropriate versus inappropriate inferential statistical methods. The search strategy used the descriptors 'root filling', 'microleakage', 'dye penetration', 'dye leakage', 'polymicrobial leakage' and 'fluid filtration' for the time interval 2001-2010 in journals within the categories 'Dentistry, Oral Surgery and Medicine' and 'Materials Science, Biomaterials' of the Journal Citation Report. All retrieved articles were reviewed to find potential pitfalls in statistical methodology that may be encountered during study design, data management or data analysis. The database included 209 papers. In all the studies reviewed, the statistical methods used were appropriate for the category attributed to the outcome variable, but in 41% of cases the chi-square test or parametric methods were subsequently applied inappropriately. In 2% of the papers, no statistical test was used. In 99% of cases, a statistically 'significant' or 'not significant' effect was reported as a main finding, whilst only 1% also presented an estimation of the magnitude of the effect. When the appropriate statistical methods were applied in the studies with originally inappropriate data analysis, the conclusions changed in 19% of the cases. Statistical deficiencies in leakage studies may affect their results and interpretation and might be one of the reasons for the poor agreement amongst the reported findings. Therefore, more effort should be made to standardize statistical methodology. © 2012 International Endodontic Journal.
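    The misuse the authors quantify, applying a chi-square test where its assumptions fail, can be guarded against by inspecting the expected cell counts before choosing the test. A minimal sketch of that check, using an invented 2x2 leakage table (the counts are illustrative, not data from the study):

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency, fisher_exact

    # Invented 2x2 table: leakage (yes/no) for two hypothetical sealer groups
    table = np.array([[12, 2],
                      [ 8, 9]])

    chi2, p_chi2, dof, expected = chi2_contingency(table)
    if (expected < 5).any():
        # Small expected counts invalidate the chi-square approximation;
        # fall back to Fisher's exact test (second element is the p-value).
        p = fisher_exact(table)[1]
    else:
        p = p_chi2
    ```

    The expected-count rule of thumb (all expected cells at least 5) is one common criterion; journals and statisticians vary in the exact threshold they apply.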

  16. Perceptual statistical learning over one week in child speech production.

    Science.gov (United States)

    Richtsmeier, Peter T; Goffman, Lisa

    2017-07-01

    What cognitive mechanisms account for the trajectory of speech sound development, in particular, gradually increasing accuracy during childhood? An intriguing potential contributor is statistical learning, a type of learning that has been studied frequently in infant perception but less often in child speech production. To assess the relevance of statistical learning to developing speech accuracy, we carried out a statistical learning experiment with four- and five-year-olds in which statistical learning was examined over one week. Children were familiarized with and tested on word-medial consonant sequences in novel words. There was only modest evidence for statistical learning, primarily in the first few productions of the first session. This initial learning effect nevertheless aligns with previous statistical learning research. Furthermore, the overall learning effect was similar to an estimate of weekly accuracy growth based on normative studies. The results implicate other important factors in speech sound development, particularly learning via production. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. READING STATISTICS AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Reviewed by Yavuz Akbulut

    2008-10-01

    Full Text Available The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with the developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops ameliorating his instructional text writing methods. In brief, “Reading Statistics and Research” is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field’s book, Discovering Statistics Using SPSS (second edition, published by Sage in 2005).

  18. Mathematical problem solving ability of sport students in the statistical study

    Science.gov (United States)

    Sari, E. F. P.; Zulkardi; Putri, R. I. I.

    2017-12-01

    This study aims to determine the problem-solving ability of semester V sport students of PGRI Palembang in the statistics course. The subjects in this study were 31 semester V sport students of PGRI Palembang. The research method used is a quasi-experimental one-shot case study. Data were collected using a test and analysed with quantitative descriptive statistics. The study concluded that the mathematical problem-solving ability of semester V sport students of PGRI Palembang in the statistics course falls in the good category, with an average final test score of 80.3.

  19. The influence of previous subject experience on interactions during peer instruction in an introductory physics course: A mixed methods analysis

    Science.gov (United States)

    Vondruska, Judy A.

    Over the past decade, peer instruction and the introduction of student response systems have provided a means of improving student engagement and achievement in large-lecture settings. While the nature of the student discourse occurring during peer instruction is less understood, existing studies have shown student ideas about the subject, extraneous cues, and confidence level appear to matter in the student-student discourse. Using a mixed methods research design, this study examined the influence of previous subject experience on peer instruction in an introductory, one-semester Survey of Physics course. Quantitative results indicated students in discussion pairs where both had previous subject experience were more likely to answer clicker questions correctly both before and after peer discussion compared to student groups where neither partner had previous subject experience. Students in mixed discussion pairs were not statistically different in correct response rates from the other pairings. There was no statistically significant difference between the experience pairs on unit exam scores or the Peer Instruction Partner Survey. Although there was a statistically significant difference between the pre-MPEX and post-MPEX scores, there was no difference between the members of the various subject experience peer discussion pairs. The qualitative study, conducted after the quantitative study, helped to inform the quantitative results by exploring the nature of the peer interactions through survey questions and a series of focus group discussions. While the majority of participants described a benefit to the use of clickers in the lecture, their experience with their discussion partners varied. Students with previous subject experience tended to describe peer instruction more positively than students who did not have previous subject experience, regardless of the experience level of their partner. 
They were also more likely to report favorable levels of comfort with

  20. Statistical reporting inconsistencies in experimental philosophy.

    Science.gov (United States)

    Colombo, Matteo; Duev, Georgi; Nuijten, Michèle B; Sprenger, Jan

    2018-01-01

    Experimental philosophy (x-phi) is a young field of research in the intersection of philosophy and psychology. It aims to make progress on philosophical questions by using experimental methods traditionally associated with the psychological and behavioral sciences, such as null hypothesis significance testing (NHST). Motivated by recent discussions about a methodological crisis in the behavioral sciences, questions have been raised about the methodological standards of x-phi. Here, we focus on one aspect of this question, namely the rate of inconsistencies in statistical reporting. Previous research has examined the extent to which published articles in psychology and other behavioral sciences present statistical inconsistencies in reporting the results of NHST. In this study, we used the R package statcheck to detect statistical inconsistencies in x-phi, and compared rates of inconsistencies in psychology and philosophy. We found that rates of inconsistencies in x-phi are lower than in the psychological and behavioral sciences. From the point of view of statistical reporting consistency, x-phi seems to do no worse, and perhaps even better, than psychological science.
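    statcheck itself is an R package that parses APA-formatted results and accounts for rounding; the core recomputation it performs can nevertheless be illustrated in Python. A minimal sketch, with an invented report of t(28) = 2.31, p = .028 (the numbers are not from the study):

    ```python
    from scipy import stats

    def check_t_report(t_value, df, reported_p, tol=0.0005):
        """Recompute the two-sided p-value implied by a reported t statistic
        and flag an inconsistency when it disagrees with the reported p."""
        recomputed = 2 * stats.t.sf(abs(t_value), df)
        consistent = abs(recomputed - reported_p) <= tol
        return consistent, recomputed

    # Invented example report: t(28) = 2.31, p = .028
    consistent, recomputed_p = check_t_report(2.31, 28, 0.028)
    ```

    The real tool applies analogous recomputations for F, chi-square, r and z statistics, and treats a reported p as consistent whenever it could result from correctly rounding the recomputed value; this sketch uses a fixed tolerance instead.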

  2. Retention of Statistical Concepts in a Preliminary Randomization-Based Introductory Statistics Curriculum

    Science.gov (United States)

    Tintle, Nathan; Topliff, Kylie; VanderStoep, Jill; Holmes, Vicki-Lynn; Swanson, Todd

    2012-01-01

    Previous research suggests that a randomization-based introductory statistics course may improve student learning compared to the consensus curriculum. However, it is unclear whether these gains are retained by students post-course. We compared the conceptual understanding of a cohort of students who took a randomization-based curriculum (n = 76)…

  3. Managing Macroeconomic Risks by Using Statistical Simulation

    Directory of Open Access Journals (Sweden)

    Merkaš Zvonko

    2017-06-01

    Full Text Available The paper analyzes the possibilities of using statistical simulation in the measurement of macroeconomic risks. At the level of the whole world, macroeconomic risks are, due to excessive imbalance, significantly increased. Using analytical statistical methods and Monte Carlo simulation, the authors interpret the collected data sets, comparing and analyzing them in order to mitigate potential risks. The empirical part of the study is a qualitative case study that uses statistical methods and Monte Carlo simulation for managing macroeconomic risks, which is the central theme of this work. Application of statistical simulation is necessary because the system for which the model must be specified is too complex for an analytical approach. The objective of the paper is to highlight the need to consider significant macroeconomic risks (particularly the number of unemployed in the society, the movement of gross domestic product and the country’s credit rating) and, through statistical simulation of data previously processed by statistical methods, to analyze the existing model of managing macroeconomic risks and suggest elements for developing a management model that will keep the probability and consequences of emerging macroeconomic risks as low as possible. The stochastic characteristics of the system, defined by random variables as input values with given probability distributions, require a large number of iterations over which the output of the model is recorded and the mathematical expectations calculated. The paper expounds the basic procedures and techniques of discrete statistical simulation applied to systems that can be characterized by a number of events, representing sets of circumstances that have caused a change in the system’s state, and the possibility of its application in the assessment of macroeconomic risks. The method has no
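    A discrete Monte Carlo simulation of the kind the abstract describes can be sketched as follows. The distributions, parameter values and risk threshold below are illustrative assumptions for the sketch, not figures from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000  # number of simulation iterations

    # Illustrative input distributions (assumed, not taken from the paper):
    gdp_growth = rng.normal(loc=2.0, scale=1.5, size=n)    # annual GDP growth, %
    unemployment = rng.normal(loc=9.0, scale=2.0, size=n)  # unemployment rate, %

    # Macroeconomic risk event: recession combined with high unemployment
    risk_event = (gdp_growth < 0.0) & (unemployment > 12.0)
    prob_risk = risk_event.mean()  # Monte Carlo estimate of the joint probability
    ```

    Recording the indicator over many iterations and taking its mean is exactly the "large number of iterations ... mathematical expectations" step the abstract refers to; a real model would also correlate the inputs rather than sample them independently.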

  4. Teaching Statistics to Doctoral Students with Lonergan's Insight-Based Critical Realism

    DEFF Research Database (Denmark)

    Tackney, Charles T.; Gwozdz, Wencke

    2014-01-01

    offers guided study in the statistical use of SPSS using a common EU data set. Course evaluations indicate students who had previously felt uninterested or unaware of the significance and role of quantitative studies emerged from the three-day intensive with a better understanding and sense

  5. SRB states and nonequilibrium statistical mechanics close to equilibrium

    OpenAIRE

    Gallavotti, Giovannni; Ruelle, David

    1996-01-01

    Nonequilibrium statistical mechanics close to equilibrium is studied using SRB states and a formula for their derivatives with respect to parameters. We write general expressions for the thermodynamic fluxes (or currents) and the transport coefficients, generalizing previous results. In this framework we give a general proof of the Onsager reciprocity relations.

  6. Energy statistics yearbook 2002

    International Nuclear Information System (INIS)

    2005-01-01

    The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  7. Energy statistics yearbook 2001

    International Nuclear Information System (INIS)

    2004-01-01

    The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  8. Energy statistics yearbook 2000

    International Nuclear Information System (INIS)

    2002-01-01

    The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  9. A Validity Study: Attitudes towards Statistics among Japanese College Students

    Science.gov (United States)

    Satake, Eike

    2015-01-01

    This cross-cultural study investigated the relationship between attitudes toward statistics (ATS) and course achievement (CA) among Japanese college students. The sample consisted of 135 male and 134 female students from the first two-year liberal arts program of a four-year college in Tokyo, Japan. Attitudes about statistics were measured using…

  10. Female Sexual Dysfunction in the Late Postpartum Period Among Women with Previous Gestational Diabetes Mellitus

    International Nuclear Information System (INIS)

    Sargin, M. A.; Yassa, M.; Taymur, B. D.; Akca, G.; Tug, N.; Taymur, B.

    2017-01-01

    Objective: To compare the status of female sexual dysfunction (FSD) between women with a history of previous gestational diabetes mellitus (GDM) and those with follow-up of a healthy pregnancy, using the female sexual function index (FSFI) questionnaire. Study Design: Cross-sectional study. Place and Duration of Study: Department of Obstetrics and Gynecology, Fatih Sultan Mehmet Training and Research Hospital, Istanbul, Turkey, from September to December 2015. Methodology: Healthy sexually active adult parous females were included. Participants were asked to complete the validated Turkish versions of the FSFI and Hospital Anxiety and Depression Scale (HADS) questionnaires. Student's t-test was used for two-group comparisons of normally distributed variables and quantitative data. Mann-Whitney U-test was used for two-group comparisons of non-normally distributed variables. Pearson's chi-squared test, the Fisher-Freeman-Halton test, Fisher's exact test, and Yates' continuity correction test were used for comparison of qualitative data. Results: The mean FSFI score of the 179 participants was 23.50 ± 3.94. FSFI scores and scores of desire, arousal, lubrication, orgasm, satisfaction, and pain were not statistically significantly different (p>0.05) according to a history of GDM and types of FSD (none, mild, severe). HADS scores and anxiety and depression types did not statistically significantly differ according to the history of GDM (p>0.05). Conclusion: No association in FSFI scores was found between participants with a history of previous GDM and those with a healthy pregnancy; however, subclinical sexual dysfunction may be observed in the late postpartum period among women with a history of previous GDM. This may adversely affect their sexual health. (author)

  11. Average wind statistics for SRP area meteorological towers

    International Nuclear Information System (INIS)

    Laurinat, J.E.

    1987-01-01

    A quality-assured set of average wind statistics for the seven SRP area meteorological towers has been calculated for the five-year period 1982-1986 at the request of DOE/SR. A similar set of statistics was previously compiled for the years 1975-1979. The updated wind statistics will replace the old statistics as the meteorological input for calculating atmospheric radionuclide doses from stack releases, and will be used in the annual environmental report. This report details the methods used to average the wind statistics and to screen out bad measurements, and presents wind roses generated from the averaged statistics
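    The report's actual averaging procedure is not reproduced here, but vector averaging is one common way wind statistics are compiled from tower measurements. A sketch, assuming the meteorological convention that direction is where the wind blows from, in degrees clockwise from north:

    ```python
    import numpy as np

    def vector_average_wind(speeds, directions_deg):
        """Vector-average wind speed and direction from tower measurements.
        Assumes meteorological convention: direction is where the wind
        blows FROM, in degrees clockwise from north."""
        spd = np.asarray(speeds, dtype=float)
        theta = np.deg2rad(np.asarray(directions_deg, dtype=float))
        u = -spd * np.sin(theta)  # eastward wind component
        v = -spd * np.cos(theta)  # northward wind component
        mean_u, mean_v = u.mean(), v.mean()
        mean_speed = np.hypot(mean_u, mean_v)
        mean_dir = np.rad2deg(np.arctan2(-mean_u, -mean_v)) % 360
        return mean_speed, mean_dir
    ```

    Averaging the u and v components avoids the wrap-around error that arithmetic averaging of directions produces near north (e.g. averaging 350° and 10° as 180°).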

  12. Audit sampling: A qualitative study on the role of statistical and non-statistical sampling approaches on audit practices in Sweden

    OpenAIRE

    Ayam, Rufus Tekoh

    2011-01-01

    PURPOSE: The two approaches to audit sampling, statistical and non-statistical, have been examined in this study. The overall purpose of the study is to explore the extent to which statistical and non-statistical sampling approaches are currently utilized by independent auditors during auditing practices. Moreover, the study also seeks to achieve two additional purposes; the first is to find out whether auditors utilize different sampling techniques when auditing SME´s (Small and Medium-Sized Ente...

  13. Statistics as a Foreign Language--Part 2: More Things to Consider in Reading Statistical Language Studies.

    Science.gov (United States)

    Brown, James Dean

    1992-01-01

    Five new strategies are proposed to help language teachers understand statistical studies. Each strategy is discussed with appropriate tables, figures, and examples drawn from recent articles of the "TESOL Quarterly." (18 references) (Author/LB)

  14. The effect of warm-up, static stretching and dynamic stretching on hamstring flexibility in previously injured subjects

    Directory of Open Access Journals (Sweden)

    Murray Elaine

    2009-04-01

    Full Text Available Abstract Background Warm-up and stretching are suggested to increase hamstring flexibility and reduce the risk of injury. This study examined the short-term effects of warm-up, static stretching and dynamic stretching on hamstring flexibility in individuals with previous hamstring injury and uninjured controls. Methods A randomised crossover study design, over 2 separate days. Hamstring flexibility was assessed using passive knee extension range of motion (PKE ROM). 18 previously injured individuals and 18 uninjured controls participated. On both days, four measurements of PKE ROM were recorded: (1) at baseline; (2) after warm-up; (3) after stretch (static or dynamic); and (4) after a 15-minute rest. Participants carried out both static and dynamic stretches, but on different days. Data were analysed using ANOVA. Results Across both groups, there was a significant main effect for time (p < 0.05). Using ANCOVA to adjust for the non-significant (p = 0.141) baseline difference between groups, the previously injured group demonstrated a greater response to warm-up and static stretching, however this was not statistically significant (p = 0.05). Conclusion Warm-up significantly increased hamstring flexibility. Static stretching also increased hamstring flexibility, whereas dynamic did not, in agreement with previous findings on uninjured controls. The effect of warm-up and static stretching on flexibility was greater in those with reduced flexibility post-injury, but this did not reach statistical significance. Further prospective research is required to validate the hypothesis that increased flexibility improves outcomes. Trial Registration ACTRN12608000638336

  15. Statistical analyses in the study of solar wind-magnetosphere coupling

    International Nuclear Information System (INIS)

    Baker, D.N.

    1985-01-01

    Statistical analyses provide a valuable method for establishing initially the existence (or lack of existence) of a relationship between diverse data sets. Statistical methods also allow one to make quantitative assessments of the strengths of observed relationships. This paper reviews the essential techniques and underlying statistical bases for the use of correlative methods in solar wind-magnetosphere coupling studies. Techniques of visual correlation and time-lagged linear cross-correlation analysis are emphasized, but methods of multiple regression, superposed epoch analysis, and linear prediction filtering are also described briefly. The long history of correlation analysis in the area of solar wind-magnetosphere coupling is reviewed with the assessments organized according to data averaging time scales (minutes to years). It is concluded that these statistical methods can be very useful first steps, but that case studies and various advanced analysis methods should be employed to understand fully the average response of the magnetosphere to solar wind input. It is clear that many workers have not always recognized underlying assumptions of statistical methods and thus the significance of correlation results can be in doubt. Long-term averages (greater than or equal to 1 hour) can reveal gross relationships, but only when dealing with high-resolution data (1 to 10 min) can one reach conclusions pertinent to magnetospheric response time scales and substorm onset mechanisms
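    The time-lagged linear cross-correlation the review emphasizes can be sketched generically as follows; this is an illustrative implementation, not code from the paper, and the synthetic signals stand in for solar wind and magnetospheric time series:

    ```python
    import numpy as np

    def lagged_xcorr(x, y, max_lag):
        """Pearson correlation of two equal-length series at integer lags.
        A positive lag pairs x[i] with y[i + lag], i.e. y responding after x."""
        out = {}
        for lag in range(-max_lag, max_lag + 1):
            if lag >= 0:
                a, b = x[:len(x) - lag], y[lag:]
            else:
                a, b = x[-lag:], y[:len(y) + lag]
            out[lag] = np.corrcoef(a, b)[0, 1]
        return out
    ```

    The lag at which the correlation peaks estimates the response time of the driven series, which is why the review stresses that averaging intervals must be shorter than the magnetospheric response time scales being sought.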

  16. Infants generalize representations of statistically segmented words

    Directory of Open Access Journals (Sweden)

    Katharine Graf Estes

    2012-10-01

    Full Text Available The acoustic variation in language presents learners with a substantial challenge. To learn by tracking statistical regularities in speech, infants must recognize words across tokens that differ based on characteristics such as the speaker’s voice, affect, or the sentence context. Previous statistical learning studies have not investigated how these types of surface form variation affect learning. The present experiments used tasks tailored to two distinct developmental levels to investigate the robustness of statistical learning to variation. Experiment 1 examined statistical word segmentation in 11-month-olds and found that infants can recognize statistically segmented words across a change in the speaker’s voice from segmentation to testing. The direction of infants’ preferences suggests that recognizing words across a voice change is more difficult than recognizing them in a consistent voice. Experiment 2 tested whether 17-month-olds can generalize the output of statistical learning across variation to support word learning. The infants were successful in their generalization; they associated referents with statistically defined words despite a change in voice from segmentation to label learning. Infants’ learning patterns also indicate that they formed representations of across-word syllable sequences during segmentation. Thus, low probability sequences can act as object labels in some conditions. The findings of these experiments suggest that the units that emerge during statistical learning are not perceptually constrained, but rather are robust to naturalistic acoustic variation.

  17. Underestimation of Severity of Previous Whiplash Injuries

    Science.gov (United States)

    Naqui, SZH; Lovell, SJ; Lovell, ME

    2008-01-01

    INTRODUCTION We noted a report that more significant symptoms may be expressed after second whiplash injuries by a suggested cumulative effect, including degeneration. We wondered if patients were underestimating the severity of their earlier injury. PATIENTS AND METHODS We studied recent medicolegal reports, to assess subjects with a second whiplash injury. They had been asked whether their earlier injury was worse, the same or lesser in severity. RESULTS From the study cohort, 101 patients (87%) felt that they had fully recovered from their first injury and 15 (13%) had not. Seventy-six subjects considered their first injury of lesser severity, 24 worse and 16 the same. Of the 24 that felt the violence of their first accident was worse, only 8 had worse symptoms, and 16 felt their symptoms were mainly the same or less than their symptoms from their second injury. Statistical analysis of the data revealed that the proportion of those claiming a difference who said the previous injury was lesser was 76% (95% CI 66–84%). The observed proportion with a lesser injury was considerably higher than the 50% anticipated. CONCLUSIONS We feel that subjects may underestimate the severity of an earlier injury and associated symptoms. Reasons for this may include secondary gain rather than any proposed cumulative effect. PMID:18201501
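    The reported interval (76% of those claiming a difference, 95% CI 66–84%) can be reproduced with an exact confidence interval and tested against the 50% split expected under no systematic bias. The abstract does not state which CI method the authors used, so treating it as Clopper-Pearson is an assumption of this sketch.

```python
from scipy.stats import beta, binomtest

x, n = 76, 100   # "lesser" responses among the 100 subjects claiming a difference

# Exact (Clopper-Pearson) 95% confidence interval for the proportion
lower = beta.ppf(0.025, x, n - x + 1)
upper = beta.ppf(0.975, x + 1, n - x)

# Exact binomial test against the 50% split anticipated with no systematic bias
p_value = binomtest(x, n, p=0.5).pvalue
```

    The exact interval comes out at roughly 66–84%, matching the abstract, and the tiny p-value confirms that the observed 76% is far from the anticipated 50%.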

  18. Statistical approach for derivation of quantitative acceptance criteria for radioactive wastes to near surface disposal facility

    International Nuclear Information System (INIS)

    Park, Jin Beak; Park, Joo Wan; Lee, Eun Yong; Kim, Chang Lak

    2003-01-01

    For the reference human intrusion scenarios constructed in a previous study, a probabilistic safety assessment is conducted to derive radionuclide concentration limits for the low- and intermediate-level radioactive waste disposal facility. A statistical approach based on Latin hypercube sampling is introduced, and new assumptions about the disposal facility system are examined and discussed. In our previous deterministic study, the post-construction scenario emerged as the most limiting scenario for deriving the radionuclide concentration limits. In this statistical approach, by contrast, the post-drilling and post-construction scenarios compete for selection as the limiting scenario, depending on which radionuclides are more important in the safety assessment context. Introduction of the new assumption shows that the post-drilling scenario can take over the role of limiting scenario from the post-construction scenario. Comparing the concentration limits between the previous study and this one, radionuclides such as Nb-94, Cs-137 and the alpha-emitting radionuclides show higher values than in the previous study, while the remaining radionuclides, such as Sr-90, Tc-99, I-129, Ni-59 and Ni-63, show lower values.
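    Latin hypercube sampling of assessment parameters can be sketched with SciPy's `qmc` module. The two parameters, their ranges, and their units below are purely illustrative stand-ins, not values from the study.

```python
import numpy as np
from scipy.stats import qmc

# Draw 100 stratified samples over two hypothetical assessment parameters.
sampler = qmc.LatinHypercube(d=2, seed=42)
unit = sampler.random(n=100)                      # points in [0, 1)^2

# Scale to illustrative ranges: log10(Kd) in [-1, 2]; infiltration in [0.01, 0.3].
scaled = qmc.scale(unit, [-1.0, 0.01], [2.0, 0.3])
kd = 10.0 ** scaled[:, 0]                         # sorption coefficient, L/kg (assumed)
infiltration = scaled[:, 1]                       # infiltration rate, m/yr (assumed)

# Defining LHS property: exactly one sample falls in each of the 100
# equal-probability strata of every marginal distribution.
counts, _ = np.histogram(unit[:, 0], bins=100, range=(0.0, 1.0))
```

    Compared with plain Monte Carlo, the one-sample-per-stratum guarantee covers each parameter's range evenly, which is why LHS is a common choice for probabilistic safety assessments with expensive model runs.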

  19. A classical trajectory study of the photodissociation of T1 acetaldehyde: The transition from impulsive to statistical dynamics

    International Nuclear Information System (INIS)

    Thompson, Keiran C.; Crittenden, Deborah L.; Kable, Scott H.; Jordan, Meredith J.T.

    2006-01-01

    Previous experimental and theoretical studies of the radical dissociation channel of T1 acetaldehyde show conflicting behavior in the HCO and CH3 product distributions. To resolve these conflicts, a full-dimensional potential-energy surface for the dissociation of CH3CHO into HCO and CH3 fragments over the barrier on the T1 surface is developed based on RO-CCSD(T)/cc-pVTZ(DZ) ab initio calculations. 20 000 classical trajectories are calculated on this surface at each of five initial excess energies, spanning the excitation energies used in previous experimental studies, and translational, vibrational, and rotational distributions of the radical products are determined. For excess energies near the dissociation threshold, both the HCO and CH3 products are vibrationally cold; there is a small amount of HCO rotational excitation and little CH3 rotational excitation, and the reaction energy is partitioned dominantly (>90% at threshold) into relative translational motion. Close to threshold the HCO and CH3 rotational distributions are symmetrically shaped, resembling a Gaussian function, in agreement with observed experimental HCO rotational distributions. As the excess energy increases the calculated HCO and CH3 rotational distributions are observed to change from a Gaussian shape at threshold to one more resembling a Boltzmann distribution, a behavior also seen by various experimental groups. Thus the distribution of energy in these rotational degrees of freedom is observed to change from nonstatistical to apparently statistical, as excess energy increases. As the energy above threshold increases all the internal and external degrees of freedom are observed to gain population at a similar rate, broadly consistent with equipartitioning of the available energy at the transition state. These observations generally support the practice of separating the reaction dynamics into two reservoirs: an impulsive reservoir, fed by the exit channel dynamics, and a

  20. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics

    Science.gov (United States)

    Paechter, Manuela; Macher, Daniel; Martskvishvili, Khatuna; Wimmer, Sigrid; Papousek, Ilona

    2017-01-01

    In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. Statistics anxiety led to higher procrastination in the structural

  2. Mathematics Anxiety and Statistics Anxiety. Shared but Also Unshared Components and Antagonistic Contributions to Performance in Statistics

    Directory of Open Access Journals (Sweden)

    Manuela Paechter

    2017-07-01

    In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. Statistics anxiety led to higher procrastination in

  3. A cross-sectional study of tuberculosis drug resistance among previously treated patients in a tertiary hospital in Accra, Ghana: public health implications of standardized regimens.

    Science.gov (United States)

    Forson, Audrey; Kwara, Awewura; Kudzawu, Samuel; Omari, Michael; Otu, Jacob; Gehre, Florian; de Jong, Bouke; Antonio, Martin

    2018-04-02

    Mycobacterium tuberculosis drug resistance is a major challenge to the use of standardized regimens for tuberculosis (TB) therapy, especially among previously treated patients. We aimed to investigate the frequency and pattern of drug resistance among previously treated patients with smear-positive pulmonary tuberculosis at the Korle-Bu Teaching Hospital Chest Clinic, Accra. This was a cross-sectional survey of mycobacterial isolates from previously treated patients referred to the Chest Clinic Laboratory between October 2010 and October 2013. The Bactec MGIT 960 system for mycobacterial culture and drug sensitivity testing (DST) was used for sputum culture of AFB smear-positive patients with relapse, treatment failure, failure of smear conversion, or default. Descriptive statistics were used to summarize patient characteristics, and frequency and patterns of drug resistance. A total of 112 isolates were studied out of 155 from previously treated patients. Twenty contaminated (12.9%) and 23 non-viable isolates (14.8%) were excluded. Of the 112 studied isolates, 53 (47.3%) were pan-sensitive to all first-line drugs tested. Any resistance (mono and poly resistance) to isoniazid was found in 44 isolates (39.3%) and any resistance to streptomycin in 43 (38.4%). Thirty-one (27.7%) were MDR-TB. Eleven (35.5%) out of 31 MDR-TB isolates were pre-XDR. MDR-TB isolates were more likely than non-MDR isolates to have streptomycin and ethambutol resistance. The main findings of this study were the high prevalence of MDR-TB and streptomycin resistance among previously treated TB patients, as well as a high prevalence of pre-XDR-TB among the MDR-TB patients, which suggest that first-line and second-line DST is essential to aid the design of effective regimens for these groups of patients in Ghana.

  4. Selecting the most appropriate inferential statistical test for your quantitative research study.

    Science.gov (United States)

    Bettany-Saltikov, Josette; Whittaker, Victoria Jane

    2014-06-01

    To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data which necessitates the selection of both descriptive and statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to a specific statistical test(s). Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.

  5. General statistical data structure for epidemiologic studies of DOE workers

    International Nuclear Information System (INIS)

    Frome, E.L.; Hudson, D.R.

    1981-01-01

    Epidemiologic studies to evaluate the occupational risks associated with employment in the nuclear industry are currently being conducted by the Department of Energy. Data that have potential value in evaluating any long-term health effects of occupational exposure to low levels of radiation are obtained for each individual at a given facility. We propose a general data structure for statistical analysis that is used to define transformations from the data management system into the data analysis system. Statistical methods of interest in epidemiologic studies include contingency table analysis and survival analysis procedures that can be used to evaluate potential associations between occupational radiation exposure and mortality. The purposes of this paper are to discuss (1) the adequacy of this data structure for single- and multiple-facility analysis and (2) the statistical computing problems encountered in dealing with large populations over extended periods of time.

  6. Microvariability in AGNs: study of different statistical methods - I. Observational analysis

    Science.gov (United States)

    Zibecchi, L.; Andruchow, I.; Cellone, S. A.; Carpintero, D. D.; Romero, G. E.; Combi, J. A.

    2017-05-01

    We present the results of a study of different statistical methods currently used in the literature to analyse the (micro)variability of active galactic nuclei (AGNs) from ground-based optical observations. In particular, we focus on the comparison between the results obtained by applying the so-called C and F statistics, which are based on the ratio of standard deviations and variances, respectively. The motivation for this is that the implementation of these methods leads to different and contradictory results, making the variability classification of the light curves of a certain source dependent on the statistics implemented. For this purpose, we re-analyse the results on an AGN sample observed along several sessions with the 2.15 m 'Jorge Sahade' telescope (CASLEO), San Juan, Argentina. For each AGN, we constructed the nightly differential light curves. We thus obtained a total of 78 light curves for 39 AGNs, and we then applied the statistical tests mentioned above, in order to re-classify the variability state of these light curves and in an attempt to find a suitable statistical methodology for studying photometric (micro)variations. We conclude that, although the C criterion is not a proper statistical test, it could still be a suitable parameter to detect variability, and that its application allows us to obtain more reliable variability results, in contrast with the F test.
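    The two criteria compared in the study can be sketched on simulated differential light curves; the noise levels, sample sizes, and cut-offs below are illustrative assumptions, not values from the paper. Note that C is simply the square root of F for the same pair of series, which is why the two criteria, applied with their customary thresholds, can classify the same light curve differently.

```python
import numpy as np
from scipy.stats import f as f_dist

def c_statistic(target_diff, control_diff):
    # C criterion: ratio of the standard deviations of the target's and the
    # control's differential light curves; C > 2.576 is a commonly used cut.
    return np.std(target_diff, ddof=1) / np.std(control_diff, ddof=1)

def f_test(target_diff, control_diff, alpha=0.01):
    # F statistic: ratio of variances, compared against the critical F value.
    F = np.var(target_diff, ddof=1) / np.var(control_diff, ddof=1)
    crit = f_dist.ppf(1.0 - alpha, len(target_diff) - 1, len(control_diff) - 1)
    return F, bool(F > crit)

rng = np.random.default_rng(1)
control = rng.normal(0.0, 0.01, 60)    # comparison-star scatter (mag)
quiet = rng.normal(0.0, 0.01, 60)      # non-variable target
variable = rng.normal(0.0, 0.03, 60)   # genuinely variable target

C_quiet = c_statistic(quiet, control)
C_var = c_statistic(variable, control)
F_var, var_flag = f_test(variable, control)   # C_var**2 == F_var
```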

  7. Line identification studies using traditional techniques and wavelength coincidence statistics

    International Nuclear Information System (INIS)

    Cowley, C.R.; Adelman, S.J.

    1990-01-01

    Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.

  8. A statistical study on fracture toughness data of Japanese RPVS

    International Nuclear Information System (INIS)

    Sakai, Y.; Ogura, N.

    1987-01-01

    In a cooperative study investigating the fracture toughness of pressure vessel steels produced in Japan, a number of heats of ASTM A533B cl.1 and A508 cl.3 steels have been studied. Approximately 3000 fracture toughness data and 8000 mechanical properties data were obtained and filed in a computer data bank. Statistical characterization of the toughness data in the transition region has been carried out using this data bank, and a curve-fitting technique using a function that models the transition behaviour of each toughness measure has been applied. The aims of the curve-fitting technique were as follows: (1) summarization of an enormous toughness data base to permit comparison of heats, materials and testing methods; (2) investigation of the relationships among static, dynamic and arrest toughness; (3) statistical examination of the ASME K(IR) curve. The methodology used in this study for analyzing a large quantity of fracture toughness data was found to be useful for formulating a statistically based K(IR) curve. (orig./HP)
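    Transition-region toughness data are commonly fitted with a hyperbolic-tangent model, K = A + B·tanh((T − T0)/C). The report does not give the exact functional form used, so this fit on synthetic toughness data is only a sketch of the general technique.

```python
import numpy as np
from scipy.optimize import curve_fit

def transition_curve(T, A, B, T0, C):
    # Widely used hyperbolic-tangent model for toughness vs. temperature:
    # lower shelf A - B, upper shelf A + B, mid-transition at T0, width C.
    return A + B * np.tanh((T - T0) / C)

# Synthetic toughness data standing in for the data-bank entries.
rng = np.random.default_rng(7)
T = np.linspace(-150.0, 100.0, 40)                       # test temperature, deg C
K_true = transition_curve(T, 110.0, 90.0, -40.0, 50.0)   # MPa*sqrt(m), assumed
K = K_true + rng.normal(0.0, 5.0, T.size)                # heat-to-heat scatter

popt, _ = curve_fit(transition_curve, T, K, p0=[100.0, 80.0, -30.0, 40.0])
A_fit, B_fit, T0_fit, C_fit = popt
```

    With the fitted parameters in hand, heats and materials can be compared through (A, B, T0, C) instead of raw point clouds, which is the kind of summarization the study describes.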

  9. Wind energy statistics 2012; Vindkraftsstatistik 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-04-15

    The publication 'Wind Energy Statistics' is an annual publication. Since 2010, statistics on installed capacity, the number of plants, and their regional distribution have also been reported semi-annually, in tabular form, on the Agency's website. The publication is produced in a new way this year, which means that some data differ from previous publications. Thanks to the electricity certificate system, essentially complete statistics on wind energy are available and are presented here in several breakdowns. We present the regional distribution, i.e. how the number of turbines and the installed capacity are allocated to counties and municipalities. The electricity produced is also reported by county, where confidentiality considerations allow. Wind power is becoming increasingly important in the Swedish energy system, which creates increased demand for statistics and for breakdowns other than those presented in the official statistics. This publication, which is not official statistics, has therefore been developed.

  10. Statistical ensembles for money and debt

    Science.gov (United States)

    Viaggiu, Stefano; Lionetto, Andrea; Bargigli, Leonardo; Longo, Michele

    2012-10-01

    We build a statistical ensemble representation of two economic models describing respectively, in simplified terms, a payment system and a credit market. To this purpose we adopt the Boltzmann-Gibbs distribution where the role of the Hamiltonian is taken by the total money supply (i.e. including money created from debt) of a set of interacting economic agents. As a result, we can read the main thermodynamic quantities in terms of monetary ones. In particular, we define for the credit market model a work term which is related to the impact of monetary policy on credit creation. Furthermore, with our formalism we recover and extend some results concerning the temperature of an economic system, previously presented in the literature by considering only the monetary base as a conserved quantity. Finally, we study the statistical ensemble for the Pareto distribution.
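    A minimal simulation in the same spirit, money-conserving random pairwise exchanges as in kinetic exchange models (e.g. Drăgulescu and Yakovenko), illustrates how a conserved total money supply plays the Hamiltonian's role and how the mean money acts as a temperature. This is an illustrative toy, not the paper's model, and it excludes debt.

```python
import numpy as np

rng = np.random.default_rng(3)
n_agents, total_money = 1000, 100_000.0
money = np.full(n_agents, total_money / n_agents)    # all agents start equal

# Random pairwise exchanges: each trade redistributes the pair's pooled money.
# Total money (the "Hamiltonian") is conserved and no agent goes negative.
for _ in range(200_000):
    i, j = rng.integers(n_agents, size=2)
    if i == j:
        continue
    pool = money[i] + money[j]
    eps = rng.uniform()
    money[i], money[j] = eps * pool, (1.0 - eps) * pool

temperature = money.mean()   # "economic temperature" = average money per agent

# In equilibrium the distribution approaches Boltzmann-Gibbs, P(m) ~ exp(-m/T),
# so roughly 1 - 1/e (about 63%) of agents hold less than the mean.
frac_below_mean = np.mean(money < temperature)
```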

  11. Selection of the Maximum Spatial Cluster Size of the Spatial Scan Statistic by Using the Maximum Clustering Set-Proportion Statistic.

    Science.gov (United States)

    Ma, Yue; Yin, Fei; Zhang, Tao; Zhou, Xiaohua Andrew; Li, Xiaosong

    2016-01-01

    Spatial scan statistics are widely used in various fields. The performance of these statistics is influenced by parameters, such as maximum spatial cluster size, and can be improved by parameter selection using performance measures. Current performance measures are based on the presence of clusters and are thus inapplicable to data sets without known clusters. In this work, we propose a novel overall performance measure called maximum clustering set-proportion (MCS-P), which is based on the likelihood of the union of detected clusters and the applied dataset. MCS-P was compared with existing performance measures in a simulation study to select the maximum spatial cluster size. Results of other performance measures, such as sensitivity and misclassification, suggest that the spatial scan statistic achieves accurate results in most scenarios with the maximum spatial cluster sizes selected using MCS-P. Given that previously known clusters are not required in the proposed strategy, selection of the optimal maximum cluster size with MCS-P can improve the performance of the scan statistic in applications without identified clusters.

  12. Progressive statistics for studies in sports medicine and exercise science.

    Science.gov (United States)

    Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri

    2009-01-01

    Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.

  13. Improving statistical reasoning theoretical models and practical implications

    CERN Document Server

    Sedlmeier, Peter

    1999-01-01

    This book focuses on how statistical reasoning works and on training programs that can exploit people's natural cognitive capabilities to improve their statistical reasoning. Training programs that take into account findings from evolutionary psychology and instructional theory are shown to have substantially larger effects, and effects that are more stable over time, than previous training regimens. The theoretical implications are traced in a neural network model of human performance on statistical reasoning problems. This book appeals to judgment and decision making researchers and other cognitive scientists, as well as to teachers of statistics and probabilistic reasoning.

  14. Theoretical physics 8 statistical physics

    CERN Document Server

    Nolting, Wolfgang

    2018-01-01

    This textbook offers a clear and comprehensive introduction to statistical physics, one of the core components of advanced undergraduate physics courses. It follows on naturally from the previous volumes in this series, using methods of probability theory and statistics to solve physical problems. The first part of the book gives a detailed overview on classical statistical physics and introduces all mathematical tools needed. The second part of the book covers topics related to quantized states, gives a thorough introduction to quantum statistics, followed by a concise treatment of quantum gases. Ideally suited to undergraduate students with some grounding in quantum mechanics, the book is enhanced throughout with learning features such as boxed inserts and chapter summaries, with key mathematical derivations highlighted to aid understanding. The text is supported by numerous worked examples and end of chapter problem sets. About the Theoretical Physics series Translated from the renowned and highly successf...

  15. Mathematical background and attitudes toward statistics in a sample of Spanish college students.

    Science.gov (United States)

    Carmona, José; Martínez, Rafael J; Sánchez, Manuel

    2005-08-01

    To examine the relation of mathematical background and initial attitudes toward statistics among Spanish college students in the social sciences, the Survey of Attitudes Toward Statistics was given to 827 students. Multivariate analyses tested the effects of two indicators of mathematical background (amount of exposure and achievement in previous courses) on the four subscales. The analysis suggested that grades in previous courses are more strongly related to initial attitudes toward statistics than the number of mathematics courses taken. Mathematical background was related to students' affective responses to statistics but not to their valuing of statistics. Implications for further research are discussed.

  16. Effects of previous ovarian surgery for endometriosis on the outcome of assisted reproduction treatment.

    Science.gov (United States)

    Geber, Selmo; Ferreira, Daniela Parreiras; Spyer Prates, Luis Felipe Víctor; Sales, Liana; Sampaio, Marcos

    2002-01-01

    Endometriosis affects 2-50% of women of reproductive age. Surgery is an option for treatment, but there is no convincing evidence that it promotes a significant improvement in fertility. Moreover, the removal of an ovarian endometrioma might reduce the follicular reserve and the response to stimulation. Therefore, the aim of this study was to evaluate the effect of previous ovarian surgery for endometriosis on the ovarian response in assisted reproduction treatment cycles and on pregnancy outcome. A total of 61 women with primary infertility who had previously undergone ovarian surgery for endometriosis, and who received 74 IVF/intracytoplasmic sperm injection (ICSI) cycles, were studied (study group). A further 74 patients with primary infertility who underwent 77 IVF/ICSI cycles within the same period of time, at the same clinic and without previous ovarian surgery or endometriosis were studied as a control group. Patients were matched for age and treatment performed. Patients >35 years with previous ovarian surgery needed more ampoules for ovulation induction (P = 0.017) and had fewer follicles and oocytes than women in the control group (P = 0.001). Duration of folliculogenesis was similar in both groups, as was fertilization rate. A total of 10 patients achieved pregnancy in the study group (34.5%) and 14 (48.3%) in the control group. Although a lower pregnancy rate was observed in patients who had undergone previous ovarian surgery, this difference was not statistically significant (P = 0.424). In conclusion, ovarian surgery for the treatment of endometriosis reduces the ovarian response in IVF/ICSI cycles in women >35 years old, and might also decrease pregnancy rates. Therefore, for infertile patients, non-surgical treatment might be a better option to avoid reduction of the ovarian response.

  17. The Effect of Using Case Studies in Business Statistics

    Science.gov (United States)

    Pariseau, Susan E.; Kezim, Boualem

    2007-01-01

    The authors evaluated the effect on learning of using case studies in business statistics courses. The authors divided students into 3 groups: a control group, a group that completed 1 case study, and a group that completed 3 case studies. Results evidenced that, on average, students whom the authors required to complete a case analysis received…

  18. Implementing statistical equating for MRCP(UK) Parts 1 and 2.

    Science.gov (United States)

    McManus, I C; Chis, Liliana; Fox, Ray; Waller, Derek; Tang, Peter

    2014-09-26

The MRCP(UK) exam, in 2008 and 2010, changed the standard-setting of its Part 1 and Part 2 examinations from a hybrid Angoff/Hofstee method to statistical equating using Item Response Theory, the reference group being UK graduates. The present paper considers the implementation of the change, the question of whether the pass rate increased amongst non-UK candidates, any possible role of Differential Item Functioning (DIF), and changes in examination predictive validity after the change. Analysis of data of MRCP(UK) Part 1 exam from 2003 to 2013 and Part 2 exam from 2005 to 2013. Inspection suggested that Part 1 pass rates were stable after the introduction of statistical equating, but showed greater annual variation, probably due to stronger candidates taking the examination earlier. Pass rates seemed to have increased in non-UK graduates after equating was introduced, but this was not associated with any changes in DIF after statistical equating. Statistical modelling of the pass rates for non-UK graduates found that pass rates, in both Part 1 and Part 2, were increasing year on year, with the changes probably beginning before the introduction of equating. The predictive validity of Part 1 for Part 2 was higher with statistical equating than with the previous hybrid Angoff/Hofstee method, confirming the utility of IRT-based statistical equating. Statistical equating was successfully introduced into the MRCP(UK) Part 1 and Part 2 written examinations, resulting in higher predictive validity than the previous Angoff/Hofstee standard setting. Concerns about an artefactual increase in pass rates for non-UK candidates after equating were shown not to be well-founded. Most likely the changes resulted from a genuine increase in candidate ability, albeit for reasons which remain unclear, coupled with a cognitive illusion giving the impression of a step-change immediately after equating began. Statistical equating provides a robust standard-setting method, with a better
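As a toy sketch of the IRT machinery underlying statistical equating (an illustrative one-parameter Rasch model, not the MRCP(UK) implementation): once item difficulties are calibrated on a common scale across sittings, a pass mark fixed in ability terms yields comparable pass decisions regardless of how hard a particular paper happens to be.

```python
import math

def rasch_p(theta, b):
    """Rasch (1PL) probability that a candidate of ability theta
    answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def expected_score(theta, difficulties):
    """Expected raw score on a paper whose item difficulties are
    calibrated on a common (equated) scale."""
    return sum(rasch_p(theta, b) for b in difficulties)

# The same candidate ability implies different raw scores on an easy
# paper and a hard paper; equating adjusts the raw cut score for this.
easy_paper = [-1.0, -0.5, 0.0, 0.0, 0.5]
hard_paper = [0.0, 0.5, 0.5, 1.0, 1.5]
print(expected_score(0.0, easy_paper) > expected_score(0.0, hard_paper))  # → True
```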

  19. A Classification of Statistics Courses (A Framework for Studying Statistical Education)

    Science.gov (United States)

    Turner, J. C.

    1976-01-01

A classification of statistics courses is presented, with main categories of "course type," "methods of presentation," "objectives," and "syllabus." Examples and suggestions for uses of the classification are given. (DT)

  20. Addressing economic development goals through innovative teaching of university statistics: a case study of statistical modelling in Nigeria

    Science.gov (United States)

    Oseloka Ezepue, Patrick; Ojo, Adegbola

    2012-12-01

A challenging problem in some developing countries such as Nigeria is inadequate training of students in effective problem solving using the core concepts of their disciplines. Related to this is a disconnection between their learning and the socio-economic development agenda of the country. These problems are especially vivid in statistical education, which is dominated by textbook examples and unbalanced assessment 'for' and 'of' learning within traditional curricula. The problems impede the achievement of socio-economic development objectives such as those stated in the Nigerian Vision 2020 blueprint and the United Nations Millennium Development Goals. They also impoverish the ability of (statistics) graduates to creatively use their knowledge in relevant business and industry sectors, thereby exacerbating mass graduate unemployment in Nigeria and similar developing countries. This article uses a case study in statistical modelling to discuss the nature of innovations in statistics education vital to producing new kinds of graduates who can link their learning to national economic development goals, create wealth and alleviate poverty through (self) employment. Wider implications of the innovations for repositioning mathematical sciences education globally are explored in this article.

  1. Recommendations for describing statistical studies and results in general readership science and engineering journals.

    Science.gov (United States)

    Gardenier, John S

    2012-12-01

    This paper recommends how authors of statistical studies can communicate to general audiences fully, clearly, and comfortably. The studies may use statistical methods to explore issues in science, engineering, and society or they may address issues in statistics specifically. In either case, readers without explicit statistical training should have no problem understanding the issues, the methods, or the results at a non-technical level. The arguments for those results should be clear, logical, and persuasive. This paper also provides advice for editors of general journals on selecting high quality statistical articles without the need for exceptional work or expense. Finally, readers are also advised to watch out for some common errors or misuses of statistics that can be detected without a technical statistical background.

  2. Value and reliability of findings from previous epidemiologic studies in the assessment of radiation-related cancer risks. Pt. 3

    International Nuclear Information System (INIS)

    Frasch, G.; Martignoni, K.

    1990-01-01

The theories put forward here are predominantly based on pooled data from previous studies in a number of cohorts made up mostly of non-average individuals. These studies were carried out by various researchers and differed in procedures and aims. Factors of major importance to the validity and reliability of the conclusions drawn from this study are pointed out. In one chapter some light is thrown on factors known to bear a relation to the incidence of radiation-induced cancer of the breast, even though at present this can only very vaguely be described on a quantitative basis. These factors include fractionated dose regimens, pregnancies and parturitions, menarche, menopause, synergisms as well as secondary cancer of the breast. The available body of evidence suggests that exposure of each of 1 million women to a dose of 10 mGy (rad) can be linked with approximately 3 additional cases of mammary cancer per year, on average, after the latency period. The fact that there is some statistical scatter around this value is chiefly attributable to age-related causes at the beginning of exposure. Differences in ethnic and cultural characteristics between the populations investigated appeared to be less important here. (orig./MG)

  3. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  4. The (mis)reporting of statistical results in psychology journals.

    Science.gov (United States)

    Bakker, Marjan; Wicherts, Jelte M

    2011-09-01

    In order to study the prevalence, nature (direction), and causes of reporting errors in psychology, we checked the consistency of reported test statistics, degrees of freedom, and p values in a random sample of high- and low-impact psychology journals. In a second study, we established the generality of reporting errors in a random sample of recent psychological articles. Our results, on the basis of 281 articles, indicate that around 18% of statistical results in the psychological literature are incorrectly reported. Inconsistencies were more common in low-impact journals than in high-impact journals. Moreover, around 15% of the articles contained at least one statistical conclusion that proved, upon recalculation, to be incorrect; that is, recalculation rendered the previously significant result insignificant, or vice versa. These errors were often in line with researchers' expectations. We classified the most common errors and contacted authors to shed light on the origins of the errors.
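The consistency check described here — recomputing a p value from the reported test statistic and degrees of freedom and comparing it with the reported p value — can be sketched in a few lines. A minimal illustration for a reported z statistic (published checks also cover t, F, and χ² statistics, which require the corresponding distribution functions):

```python
import math

def two_sided_p_from_z(z):
    """Two-sided p value for a standard-normal test statistic."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def consistent(reported_z, reported_p, tol=0.005):
    """Flag a reported (z, p) pair as consistent when the recomputed
    p value matches the reported one within a rounding tolerance."""
    return abs(two_sided_p_from_z(reported_z) - reported_p) <= tol

# A reported "z = 1.96, p = .05" pair checks out; "z = 1.96, p = .01" does not.
print(consistent(1.96, 0.05))  # → True
print(consistent(1.96, 0.01))  # → False
```

The tolerance parameter is a hypothetical choice here; in practice it must reflect the rounding conventions of the journal being checked.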

  5. TRAN-STAT: statistics for environmental studies

    International Nuclear Information System (INIS)

    Gilbert, R.O.

    1984-09-01

    This issue of TRAN-STAT discusses statistical methods for assessing the uncertainty in predictions of pollutant transport models, particularly for radionuclides. Emphasis is placed on radionuclide transport models but the statistical assessment techniques also apply in general to other types of pollutants. The report begins with an outline of why an assessment of prediction uncertainties is important. This is followed by an introduction to several methods currently used in these assessments. This in turn is followed by more detailed discussion of the methods, including examples. 43 references, 2 figures

  6. Statistical Learning Is Not Affected by a Prior Bout of Physical Exercise.

    Science.gov (United States)

    Stevens, David J; Arciuli, Joanne; Anderson, David I

    2016-05-01

This study examined the effect of a prior bout of exercise on implicit cognition. Specifically, we examined whether a prior bout of moderate intensity exercise affected performance on a statistical learning task in healthy adults. A total of 42 participants were allocated to one of three conditions: a control group, a group that exercised for 15 min prior to the statistical learning task, and a group that exercised for 30 min prior to the statistical learning task. The participants in the exercise groups cycled at 60% of their respective V̇O2max. Each group demonstrated significant statistical learning, with similar levels of learning among the three groups. Contrary to previous research that has shown that a prior bout of exercise can affect performance on explicit cognitive tasks, the results of the current study suggest that the physiological stress induced by moderate-intensity exercise does not affect implicit cognition as measured by statistical learning. Copyright © 2015 Cognitive Science Society, Inc.

  7. Subjective randomness as statistical inference.

    Science.gov (United States)

    Griffiths, Thomas L; Daniels, Dylan; Austerweil, Joseph L; Tenenbaum, Joshua B

    2018-06-01

    Some events seem more random than others. For example, when tossing a coin, a sequence of eight heads in a row does not seem very random. Where do these intuitions about randomness come from? We argue that subjective randomness can be understood as the result of a statistical inference assessing the evidence that an event provides for having been produced by a random generating process. We show how this account provides a link to previous work relating randomness to algorithmic complexity, in which random events are those that cannot be described by short computer programs. Algorithmic complexity is both incomputable and too general to capture the regularities that people can recognize, but viewing randomness as statistical inference provides two paths to addressing these problems: considering regularities generated by simpler computing machines, and restricting the set of probability distributions that characterize regularity. Building on previous work exploring these different routes to a more restricted notion of randomness, we define strong quantitative models of human randomness judgments that apply not just to binary sequences - which have been the focus of much of the previous work on subjective randomness - but also to binary matrices and spatial clustering. Copyright © 2018 Elsevier Inc. All rights reserved.
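The core idea — randomness judged as the outcome of a statistical inference comparing a random generating process against a restricted model of regularity — can be illustrated with a deliberately simple sketch. The repetition-favoring "regular" process below is a hypothetical stand-in, not one of the models from the paper:

```python
import math

def log_evidence_for_random(seq, p_repeat=0.9):
    """Log-likelihood ratio comparing a fair Bernoulli process against a
    simple 'regular' process that repeats the previous symbol with
    probability p_repeat (a toy stand-in for a regularity model).
    Positive values mean the sequence looks more random than regular."""
    ll_random = len(seq) * math.log(0.5)
    ll_regular = math.log(0.5)  # first symbol is unconstrained
    for prev, cur in zip(seq, seq[1:]):
        ll_regular += math.log(p_repeat if cur == prev else 1.0 - p_repeat)
    return ll_random - ll_regular

print(log_evidence_for_random("HHHHHHHH"))  # strongly negative: looks regular
print(log_evidence_for_random("HTHHTHTT"))  # positive: looks random
```

This mirrors the intuition in the abstract: eight heads in a row is far better explained by the regularity model, so it does not "seem random".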

  8. Statistical Analysis of Hypercalcaemia Data related to Transferability

    DEFF Research Database (Denmark)

    Frølich, Anne; Nielsen, Bo Friis

    2005-01-01

In this report we describe statistical analysis related to a study of hypercalcaemia carried out in the Copenhagen area in the ten-year period from 1984 to 1994. Results from the study have previously been published in a number of papers [3, 4, 5, 6, 7, 8, 9] and in various abstracts and posters at conferences during the late eighties and early nineties. In this report we give a more detailed description of many of the analyses and provide some new results, primarily by simultaneous studies of several databases.

  9. A statistical study of the upstream intermediate ion boundary in the Earth's foreshock

    Directory of Open Access Journals (Sweden)

    K. Meziane

    1998-02-01

A statistical investigation of the location of onset of intermediate and gyrating ion populations in the Earth's foreshock is presented, based on Fixed Voltage Analyzer data from ISEE 1. This study reveals the existence of a spatial boundary for intermediate and gyrating ion populations that coincides with the reported ULF wave boundary. This boundary position in the Earth's foreshock depends strongly upon the magnetic cone angle θBX and appears well defined for relatively large cone angles, though not for small cone angles. As reported in a previous study of the ULF wave boundary, the position of the intermediate-gyrating ion boundary is not compatible with a fixed growth rate of the waves resulting from the interaction between a uniform beam and the ambient plasma. The present work examines the momentum associated with protons which travel along this boundary, and we show that the variation of the boundary position (or equivalently, the associated particle momentum) with the cone angle is related to classical acceleration mechanisms at the bow shock surface. The same functional behavior as a function of the cone angle is obtained for the momentum predicted by an acceleration model and for the particle momentum associated with the boundary. However, the model predicts systematically larger values of the momentum than the observation-related values by a constant amount; we suggest that this difference may be due to some momentum exchange between the incident solar-wind population and the backstreaming particles through a wave-particle interaction resulting from a beam-plasma instability. Key words: Intermediate ion boundary · Statistical investigation · Earth's foreshock · ISEE 1 spacecraft

  10. Extrusion product defects: a statistical study

    International Nuclear Information System (INIS)

    Qamar, S.Z.; Arif, A.F.M.; Sheikh, A.K.

    2003-01-01

    In any manufacturing environment, defects resulting in rework or rejection are directly related to product cost and quality, and indirectly linked with process, tooling and product design. An analysis of product defects is therefore integral to any attempt at improving productivity, efficiency and quality. Commercial aluminum extrusion is generally a hot working process and consists of a series of different but integrated operations: billet preheating and sizing, die set and container preheating, billet loading and deformation, product sizing and stretching/roll-correction, age hardening, and painting/anodizing. Product defects can be traced back to problems in billet material and preparation, die and die set design and maintenance, process variable aberrations (ram speed, extrusion pressure, container temperature, etc), and post-extrusion treatment (age hardening, painting/anodizing, etc). The current paper attempts to analyze statistically the product defects commonly encountered in a commercial hot aluminum extrusion setup. Real-world rejection data, covering a period of nine years, has been researched and collected from a local structural aluminum extrusion facility. Rejection probabilities have been calculated for all the defects studied. The nine-year rejection data have been statistically analyzed on the basis of (i) an overall breakdown of defects, (ii) year-wise rejection behavior, (iii) breakdown of defects in each of three cost centers: press, anodizing, and painting. (author)
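The rejection probabilities described here are, in essence, empirical relative frequencies of each defect category over the production volume. A minimal sketch with made-up numbers (the facility's actual nine-year counts are not given in the abstract):

```python
from collections import Counter

# Hypothetical nine-year defect tallies by cost centre (illustrative
# numbers, not the facility's actual data).
defects = Counter({"press": 420, "anodizing": 310, "painting": 150})
total_extruded = 125_000  # assumed total units produced over the period

# Rejection probability per defect category, and overall.
p_reject = {k: v / total_extruded for k, v in defects.items()}
p_overall = sum(defects.values()) / total_extruded

for centre, p in sorted(p_reject.items(), key=lambda kv: -kv[1]):
    print(f"{centre:10s} {p:.4%}")
print(f"overall    {p_overall:.4%}")
```

The same tallies can be re-grouped by year to reproduce the year-wise rejection behavior mentioned in the abstract.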

  11. THE STATISTICS OF RADIO ASTRONOMICAL POLARIMETRY: BRIGHT SOURCES AND HIGH TIME RESOLUTION

    International Nuclear Information System (INIS)

    Van Straten, W.

    2009-01-01

    A four-dimensional statistical description of electromagnetic radiation is developed and applied to the analysis of radio pulsar polarization. The new formalism provides an elementary statistical explanation of the modal-broadening phenomenon in single-pulse observations. It is also used to argue that the degree of polarization of giant pulses has been poorly defined in past studies. Single- and giant-pulse polarimetry typically involves sources with large flux-densities and observations with high time-resolution, factors that necessitate consideration of source-intrinsic noise and small-number statistics. Self-noise is shown to fully explain the excess polarization dispersion previously noted in single-pulse observations of bright pulsars, obviating the need for additional randomly polarized radiation. Rather, these observations are more simply interpreted as an incoherent sum of covariant, orthogonal, partially polarized modes. Based on this premise, the four-dimensional covariance matrix of the Stokes parameters may be used to derive mode-separated pulse profiles without any assumptions about the intrinsic degrees of mode polarization. Finally, utilizing the small-number statistics of the Stokes parameters, it is established that the degree of polarization of an unresolved pulse is fundamentally undefined; therefore, previous claims of highly polarized giant pulses are unsubstantiated.

  12. Study of some physical aspects previous to design of an exponential experiment

    International Nuclear Information System (INIS)

    Caro, R.; Francisco, J. L. de

    1961-01-01

This report presents the theoretical study of some physical aspects previous to the design of an exponential facility. These are: fast and slow flux distribution in the multiplicative medium and in the thermal column, slowing down in the thermal column, geometrical distribution and minimum needed intensity of sources, access channels and perturbations produced by possible variations in their position and intensity. (Author) 4 refs

  13. General quadrupolar statistical anisotropy: Planck limits

    Energy Technology Data Exchange (ETDEWEB)

    Ramazanov, S. [Gran Sasso Science Institute (INFN), Viale Francesco Crispi 7, I-67100 L' Aquila (Italy); Rubtsov, G. [Institute for Nuclear Research of the Russian Academy of Sciences, Prospect of the 60th Anniversary of October 7a, 117312 Moscow (Russian Federation); Thorsrud, M. [Faculty of Engineering, Østfold University College, P.O. Box 700, 1757 Halden (Norway); Urban, F.R., E-mail: sabir.ramazanov@gssi.infn.it, E-mail: grisha@ms2.inr.ac.ru, E-mail: mikjel.thorsrud@hiof.no, E-mail: federico.urban@kbfi.ee [National Institute of Chemical Physics and Biophysics, Rävala 10, 10143 Tallinn (Estonia)

    2017-03-01

Several early Universe scenarios predict a direction-dependent spectrum of primordial curvature perturbations. This translates into the violation of the statistical isotropy of cosmic microwave background radiation. Previous searches for statistical anisotropy mainly focussed on a quadrupolar direction-dependence characterised by a single multipole vector and an overall amplitude g_*. Generically, however, the quadrupole has a more complicated geometry described by two multipole vectors and g_*. This is the subject of the present work. In particular, we limit the amplitude g_* for different shapes of the quadrupole by making use of Planck 2015 maps. We also constrain certain inflationary scenarios which predict this kind of more general quadrupolar statistical anisotropy.

  14. THE MILKY WAY PROJECT: A STATISTICAL STUDY OF MASSIVE STAR FORMATION ASSOCIATED WITH INFRARED BUBBLES

    International Nuclear Information System (INIS)

    Kendrew, S.; Robitaille, T. P.; Simpson, R.; Lintott, C. J.; Bressert, E.; Povich, M. S.; Sherman, R.; Schawinski, K.; Wolf-Chase, G.

    2012-01-01

    The Milky Way Project citizen science initiative recently increased the number of known infrared bubbles in the inner Galactic plane by an order of magnitude compared to previous studies. We present a detailed statistical analysis of this data set with the Red MSX Source (RMS) catalog of massive young stellar sources to investigate the association of these bubbles with massive star formation. We particularly address the question of massive triggered star formation near infrared bubbles. We find a strong positional correlation of massive young stellar objects (MYSOs) and H II regions with Milky Way Project bubbles at separations of <2 bubble radii. As bubble sizes increase, a statistically significant overdensity of massive young sources emerges in the region of the bubble rims, possibly indicating the occurrence of triggered star formation. Based on numbers of bubble-associated RMS sources, we find that 67% ± 3% of MYSOs and (ultra-)compact H II regions appear to be associated with a bubble. We estimate that approximately 22% ± 2% of massive young stars may have formed as a result of feedback from expanding H II regions. Using MYSO-bubble correlations, we serendipitously recovered the location of the recently discovered massive cluster Mercer 81, suggesting the potential of such analyses for discovery of heavily extincted distant clusters.

  15. THE MILKY WAY PROJECT: A STATISTICAL STUDY OF MASSIVE STAR FORMATION ASSOCIATED WITH INFRARED BUBBLES

    Energy Technology Data Exchange (ETDEWEB)

    Kendrew, S.; Robitaille, T. P. [Max-Planck-Institut fuer Astronomie, Koenigstuhl 17, D-69117 Heidelberg (Germany); Simpson, R.; Lintott, C. J. [Department of Astrophysics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford OX1 3RH (United Kingdom); Bressert, E. [School of Physics, University of Exeter, Stocker Road, Exeter EX4 4QL (United Kingdom); Povich, M. S. [Department of Astronomy and Astrophysics, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States); Sherman, R. [Department of Astronomy and Astrophysics, University of Chicago, 5640 S. Ellis Avenue, Chicago, IL 60637 (United States); Schawinski, K. [Yale Center for Astronomy and Astrophysics, Yale University, P.O. Box 208121, New Haven, CT 06520 (United States); Wolf-Chase, G., E-mail: kendrew@mpia.de [Astronomy Department, Adler Planetarium, 1300 S. Lake Shore Drive, Chicago, IL 60605 (United States)

    2012-08-10

The Milky Way Project citizen science initiative recently increased the number of known infrared bubbles in the inner Galactic plane by an order of magnitude compared to previous studies. We present a detailed statistical analysis of this data set with the Red MSX Source (RMS) catalog of massive young stellar sources to investigate the association of these bubbles with massive star formation. We particularly address the question of massive triggered star formation near infrared bubbles. We find a strong positional correlation of massive young stellar objects (MYSOs) and H II regions with Milky Way Project bubbles at separations of <2 bubble radii. As bubble sizes increase, a statistically significant overdensity of massive young sources emerges in the region of the bubble rims, possibly indicating the occurrence of triggered star formation. Based on numbers of bubble-associated RMS sources, we find that 67% ± 3% of MYSOs and (ultra-)compact H II regions appear to be associated with a bubble. We estimate that approximately 22% ± 2% of massive young stars may have formed as a result of feedback from expanding H II regions. Using MYSO-bubble correlations, we serendipitously recovered the location of the recently discovered massive cluster Mercer 81, suggesting the potential of such analyses for discovery of heavily extincted distant clusters.

  16. Radon anomalies prior to earthquakes (1). Review of previous studies

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

The relationship between radon anomalies and earthquakes has been studied for more than 30 years. However, most of the studies dealt with radon in soil gas or in groundwater. Before the 1995 Hyogoken-Nanbu earthquake, an anomalous increase of atmospheric radon was observed at Kobe Pharmaceutical University. The increase was well fitted with a mathematical model related to earthquake fault dynamics. This paper reports the significance of this observation, reviewing previous studies on radon anomalies before earthquakes. Groundwater/soil radon measurements for earthquake prediction began in the 1970s in Japan as well as in other countries. One of the most famous studies in Japan is the groundwater radon anomaly before the 1978 Izu-Oshima-kinkai earthquake. We have recognized the significance of radon in earthquake prediction research, but recently its limitations have also been pointed out. Some researchers are looking for better indicators of precursors; simultaneous measurements of radon and other gases are new trials in recent studies. Contrary to soil/groundwater radon, we have not paid much attention to atmospheric radon before earthquakes. However, it might be possible to detect precursors in atmospheric radon before a large earthquake. In the next issues, we will discuss the details of the anomalous atmospheric radon data observed before the Hyogoken-Nanbu earthquake. (author)

  17. Incidence of Acneform Lesions in Previously Chemically Damaged Persons-2004

    Directory of Open Access Journals (Sweden)

    N Dabiri

    2008-04-01

ABSTRACT: Introduction & Objective: Chemical gas weapons, especially nitrogen mustard, which were used in the Iraq-Iran war against Iranian troops, have several harmful effects on skin. Some other chemical agents can also cause acneform lesions on skin. The purpose of this study was to compare the incidence of acneform lesions in previously chemically damaged soldiers and non-chemically damaged persons. Materials & Methods: In this descriptive and analytical study, 180 chemically damaged soldiers, who had been referred to a dermatology clinic between 2000-2004, and forty non-chemically damaged people were chosen randomly and examined for acneform lesions. SPSS software was used for statistical analysis of the data. Results: The mean age of the experimental group was 37.5 ± 5.2 years and that of the control group was 38.7 ± 5.9 years. The mean percentage of chemical damage in cases was 31 percent and the time since the chemical damage was 15.2 ± 1.1 years. Ninety-seven cases (53.9 percent) of the subjects and 19 people (47.5 percent) of the control group had some degree of acne. No significant difference was found in incidence, degree of lesions, site of lesions or age of subjects between the two groups. No significant correlation was noted between percentage of chemical damage and incidence and degree of lesions in the case group. Conclusion: Incidence of acneform lesions among previously chemically injured people was not higher than in normal cases.

  18. Female Sexual Dysfunction in the Late Postpartum Period Among Women with Previous Gestational Diabetes Mellitus.

    Science.gov (United States)

    Sargin, Mehmet Akif; Yassa, Murat; Taymur, Bilge Dogan; Taymur, Bulent; Akca, Gizem; Tug, Niyazi

    2017-04-01

To compare the status of female sexual dysfunction (FSD) between women with a history of previous gestational diabetes mellitus (GDM) and those with follow-up of a healthy pregnancy, using the female sexual function index (FSFI) questionnaire. Cross-sectional study. Department of Obstetrics and Gynecology, Fatih Sultan Mehmet Training and Research Hospital, Istanbul, Turkey, from September to December 2015. Healthy sexually active adult parous females were included. Participants were asked to complete the validated Turkish versions of the FSFI and Hospital Anxiety and Depression Scale (HADS) questionnaires. Student's t-test was used for two-group comparisons of normally distributed variables and quantitative data. Mann-Whitney U-test was used for two-group comparisons of non-normally distributed variables. Pearson's chi-squared test, the Fisher-Freeman-Halton test, Fisher's exact test, and Yates' continuity correction test were used for comparison of qualitative data. The mean FSFI score of the 179 participants was 23.50 ± 3.94. FSFI scores and scores of desire, arousal, lubrication, orgasm, satisfaction, and pain were not statistically significantly different (p>0.05) according to a history of GDM and types of FSD (none, mild, severe). HADS scores and anxiety and depression types did not statistically significantly differ according to the history of GDM (p>0.05). No association in FSFI scores was found between participants with a history of previous GDM and those with a healthy pregnancy; however, subclinical sexual dysfunction may be observed in the late postpartum period among women with a history of previous GDM. This may adversely affect their sexual health.
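The two-group comparison of normally distributed scores mentioned in the methods (Student's t-test) can be sketched as follows; the scores below are hypothetical, purely to illustrate the computation:

```python
import math

def student_t(a, b):
    """Student's t statistic for two independent samples (pooled
    variance), as used for two-group comparisons of normal scores."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    ssa = sum((x - ma) ** 2 for x in a)
    ssb = sum((x - mb) ** 2 for x in b)
    sp2 = (ssa + ssb) / (na + nb - 2)  # pooled variance, df = na + nb - 2
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical FSFI total scores for two groups (illustrative only).
gdm = [22.1, 24.3, 23.0, 21.8, 25.2, 22.9]
control = [23.4, 24.0, 22.5, 25.1, 23.8, 24.6]
print(round(student_t(gdm, control), 3))
```

The statistic would then be compared against the t distribution with na + nb − 2 degrees of freedom to obtain the p value.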

  19. Study design and statistical analysis of data in human population studies with the micronucleus assay.

    Science.gov (United States)

    Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano

    2011-01-01

The most common study design performed in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies, since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. The control of confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data have been addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it allows possible errors in the dataset to be detected and the validity of assumptions required for more complex analyses to be checked. Basic issues dealing with statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented, before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
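Since the review singles out the Poisson model as the most suitable method for MN count data, a minimal fitting sketch may help; the counts below are hypothetical, and the fit uses the fact that the Poisson maximum-likelihood rate is simply the sample mean:

```python
import math
from collections import Counter

# Hypothetical micronucleus counts per 1000 binucleated cells for
# 12 subjects (illustrative data, not from the review).
mn_counts = [3, 5, 4, 6, 2, 7, 5, 4, 3, 6, 5, 4]

# Under a Poisson model the maximum-likelihood rate is the sample mean.
lam = sum(mn_counts) / len(mn_counts)

def poisson_pmf(k, lam):
    """Probability of observing k events under a Poisson(lam) model."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Compare observed relative frequencies with fitted Poisson probabilities.
obs = Counter(mn_counts)
for k in sorted(obs):
    print(f"k={k}: observed {obs[k] / len(mn_counts):.3f}, "
          f"Poisson {poisson_pmf(k, lam):.3f}")
```

In practice one would extend this to a Poisson regression so that confounders such as age, gender and smoking habit enter as covariates.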

  20. [ANTITHROMBOTIC MEDICATION IN PREGNANT WOMEN WITH PREVIOUS INTRAUTERINE GROWTH RESTRICTION].

    Science.gov (United States)

    Neykova, K; Dimitrova, V; Dimitrov, R; Vakrilova, L

    2016-01-01

    To analyze pregnancy outcome in patients who were on antithrombotic medication (AM) because of a previous pregnancy with fetal intrauterine growth restriction (IUGR). The studied group (SG) included 21 pregnancies in 15 women with a history of previous IUGR. The patients were on low-dose aspirin (LDA) and/or low molecular weight heparin (LMWH). Pregnancy outcome was compared with that in two more groups: 1) a primary group (PG) including the previous 15 pregnancies with IUGR of the same women; 2) a control group (CG) including 45 pregnancies of women matched for parity with the ones in the SG, with no history of IUGR and without medication. The SG, PG and CG were compared for the following: mean gestational age (g.a.) at birth, mean birth weight (BW), proportion of cases with early preeclampsia (PE), IUGR (total, moderate and severe), intrauterine fetal death (IUFD), neonatal death (NND), admission to NICU, and cesarean section (CS) because of chronic or acute fetal distress (FD) related to IUGR, PE or placental abruption. Student's t-test was applied to assess differences between the groups. P values < 0.05 were considered statistically significant. The differences between the SG and the PG regarding mean g.a. at delivery (33.7 and 29.8 w.g. respectively) and the proportion of babies admitted to NICU (66.7% vs. 71.4%) were not statistically significant. The mean BW in the SG (2114.7 g) was significantly higher than in the PG (1090.8 g). In the SG compared with the PG there were significantly fewer cases of IUFD (14.3% and 53.3% respectively), early PE (9.5% vs. 46.7%), and moderate and severe IUGR (10.5% and 36.8% vs. 41.7% and 58.3%). Neonatal mortality in the SG (5.6%) was significantly lower than in the PG (57.1%). The proportion of CS for FD was not significantly different: 53.3% in the SG and 57.1% in the PG. On the other hand, comparison between the SG and the CG demonstrated significantly lower g.a. at delivery in the SG (33.7 vs. 38 w.g.) and lower BW (2114 vs. 3094 g
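    The group comparisons above rely on Student's t-test with p < 0.05. A minimal pooled-variance sketch follows; the birth-weight-like numbers are hypothetical, not the study's data.

```python
import math

def students_t(a, b):
    # Pooled-variance two-sample Student's t statistic
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    ssa = sum((x - ma) ** 2 for x in a)
    ssb = sum((x - mb) ** 2 for x in b)
    sp2 = (ssa + ssb) / (na + nb - 2)          # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical birth weights (g) for two small groups
group_a = [2100, 2300, 1900, 2200, 2000]
group_b = [1000, 1250, 900, 1150, 1100]
t = students_t(group_a, group_b)
print(round(t, 2))
```

    With 8 degrees of freedom the two-sided 5% critical value is about 2.306, so a |t| above that threshold would be reported as significant at p < 0.05.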

  1. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  2. GWAPower: a statistical power calculation software for genome-wide association studies with quantitative traits.

    Science.gov (United States)

    Feng, Sheng; Wang, Shengchu; Chen, Chia-Cheng; Lan, Lan

    2011-01-21

    In designing genome-wide association (GWA) studies it is important to calculate statistical power. General statistical power calculation procedures for quantitative measures often require information concerning summary statistics of distributions such as mean and variance. However, with genetic studies, the effect size of quantitative traits is traditionally expressed as heritability, a quantity defined as the amount of phenotypic variation in the population that can be ascribed to the genetic variants among individuals. Heritability is hard to transform into summary statistics. Therefore, general power calculation procedures cannot be used directly in GWA studies. The development of appropriate statistical methods and a user-friendly software package to address this problem would be welcomed. This paper presents GWAPower, a statistical software package of power calculation designed for GWA studies with quantitative traits, where genetic effect is defined as heritability. Based on several popular one-degree-of-freedom genetic models, this method avoids the need to specify the non-centrality parameter of the F-distribution under the alternative hypothesis. Therefore, it can use heritability information directly without approximation. In GWAPower, the power calculation can be easily adjusted for adding covariates and linkage disequilibrium information. An example is provided to illustrate GWAPower, followed by discussions. GWAPower is a user-friendly free software package for calculating statistical power based on heritability in GWA studies with quantitative traits. The software is freely available at: http://dl.dropbox.com/u/10502931/GWAPower.zip.
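    A heritability-based power calculation of the kind described can be sketched roughly as follows. The noncentrality approximation lambda = n*h2/(1 - h2) for a one-degree-of-freedom test and the genome-wide alpha of 5e-8 are common conventions assumed here, not GWAPower's documented internals.

```python
from scipy.stats import chi2, ncx2

def gwa_power(n, h2, alpha=5e-8):
    """Approximate power of a 1-df association test when a variant explains
    a fraction h2 of phenotypic variance.  The noncentrality parameter
    lambda = n * h2 / (1 - h2) is a standard approximation."""
    crit = chi2.ppf(1 - alpha, df=1)       # genome-wide significance threshold
    lam = n * h2 / (1.0 - h2)              # noncentrality parameter
    return ncx2.sf(crit, df=1, nc=lam)     # P(reject | effect present)

p_small = gwa_power(n=1000, h2=0.01)
p_large = gwa_power(n=10000, h2=0.01)
print(0.0 < p_small < p_large < 1.0)
```

    Because heritability enters the noncentrality parameter directly, no conversion to means and variances is needed, which mirrors the motivation given in the abstract.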

  3. A STATISTICAL STUDY OF THE MASS AND DENSITY STRUCTURE OF INFRARED DARK CLOUDS

    International Nuclear Information System (INIS)

    Peretto, N.; Fuller, G. A.

    2010-01-01

    How and when the mass distribution of stars in the Galaxy is set is one of the main issues of modern astronomy. Here, we present a statistical study of mass and density distributions of infrared dark clouds (IRDCs) and fragments within them. These regions are pristine molecular gas structures and progenitors of stars and so provide insights into the initial conditions of star formation. This study makes use of an IRDC catalog, the largest sample of IRDC column density maps to date, containing a total of ~11,000 IRDCs with column densities exceeding N(H2) = 1x10^22 cm^-2 and over 50,000 single-peaked IRDC fragments. The large number of objects constitutes an important strength of this study, allowing a detailed analysis of the completeness of the sample and so statistically robust conclusions. Using a statistical approach to assigning distances to clouds, the mass and density distributions of the clouds and the fragments within them are constructed. The mass distributions show a steepening of the slope when switching from IRDCs to fragments, in agreement with previous results for similar structures. IRDCs and fragments are divided into unbound/bound objects by assuming Larson's relation and calculating their virial parameter. IRDCs are mostly gravitationally bound, while a significant fraction of the fragments are not. The density distribution of gravitationally unbound fragments shows a steep characteristic slope, ΔN/Δlog(n) ∝ n^(-4.0±0.5), rather independent of the range of fragment mass. However, the incompleteness limit at a number density of ~10^3 cm^-3 does not allow us to exclude a potential lognormal density distribution. In contrast, gravitationally bound fragments show a characteristic density peak at n ≅ 10^4 cm^-3, but the shape of the density distributions changes with the range of fragment masses. An explanation for this could be the differential dynamical evolution of the fragment density with respect to their mass as more massive
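    The bound/unbound division via Larson's relation and the virial parameter can be sketched as follows. The linewidth-size coefficient, the alpha ~ 2 boundary, and the fragment values are illustrative assumptions, not the catalog's calibration.

```python
import math

G = 4.302e-3  # gravitational constant in pc * (km/s)^2 / Msun

def virial_parameter(mass_msun, radius_pc, sigma_kms):
    # alpha_vir = 5 sigma^2 R / (G M); alpha well above ~1-2 suggests unbound
    return 5.0 * sigma_kms**2 * radius_pc / (G * mass_msun)

def larson_sigma(radius_pc):
    # Larson-type linewidth-size relation, sigma ~ 0.7 (R/pc)^0.5 km/s
    # (coefficient is illustrative; the paper's calibration may differ)
    return 0.7 * math.sqrt(radius_pc)

# Hypothetical fragments: (mass in Msun, radius in pc)
fragments = [(500.0, 0.5), (5.0, 0.5)]
alphas = [virial_parameter(m, r, larson_sigma(r)) for m, r in fragments]
bound = [a < 2.0 for a in alphas]
print(bound)
```

    Assuming Larson's relation removes the need for a measured linewidth per object, which is what makes this classification feasible for tens of thousands of fragments.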

  4. The Role of Statistics in Business and Industry

    CERN Document Server

    Hahn, Gerald J

    2011-01-01

    An insightful guide to the use of statistics for solving key problems in modern-day business and industry This book has been awarded the Technometrics Ziegel Prize for the best book reviewed by the journal in 2010. Technometrics is a journal of statistics for the physical, chemical and engineering sciences, published jointly by the American Society for Quality and the American Statistical Association. Criteria for the award include that the book brings together in one volume a body of material previously only available in scattered research articles and having the potential to significantly im

  5. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    International Nuclear Information System (INIS)

    Dai, Wu-Sheng; Xie, Mi

    2013-01-01

    In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ^(-1)(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, which are determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► Arguments that many results on q-deformation distributions in the literature are inaccurate or incomplete
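    The Gentile distribution named above has a closed-form mean occupation number. The quick numerical check below uses the standard Gentile formula with x = (epsilon - mu)/kT and confirms its two limits: maximum occupation n = 1 recovers Fermi-Dirac, and large n approaches Bose-Einstein.

```python
import math

def gentile(x, n):
    # Mean occupation number of the Gentile distribution with maximum
    # occupation n, where x = (epsilon - mu) / (k_B T):
    #   <N> = 1/(e^x - 1) - (n + 1)/(e^((n+1) x) - 1)
    return 1.0 / (math.exp(x) - 1.0) - (n + 1.0) / (math.exp((n + 1.0) * x) - 1.0)

x = 0.8
fermi = 1.0 / (math.exp(x) + 1.0)         # Fermi-Dirac occupation
bose = 1.0 / (math.exp(x) - 1.0)          # Bose-Einstein occupation
print(abs(gentile(x, 1) - fermi) < 1e-12)   # n = 1 reproduces FD
print(abs(gentile(x, 200) - bose) < 1e-6)   # large n approaches BE
```

    The n = 1 identity follows algebraically: 1/(e^x - 1) - 2/(e^(2x) - 1) = 1/(e^x + 1).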

  6. PHYSICS OF NON-GAUSSIAN FIELDS AND THE COSMOLOGICAL GENUS STATISTIC

    International Nuclear Information System (INIS)

    James, J. Berian

    2012-01-01

    We report a technique to calculate the impact of distinct physical processes inducing non-Gaussianity on the cosmological density field. A natural decomposition of the cosmic genus statistic into an orthogonal polynomial sequence allows complete expression of the scale-dependent evolution of the topology of large-scale structure, in which effects including galaxy bias, nonlinear gravitational evolution, and primordial non-Gaussianity may be delineated. The relationship of this decomposition to previous methods for analyzing the genus statistic is briefly considered and the following applications are made: (1) the expression of certain systematics affecting topological measurements, (2) the quantification of broad deformations from Gaussianity that appear in the genus statistic as measured in the Horizon Run simulation, and (3) the study of the evolution of the genus curve for simulations with primordial non-Gaussianity. These advances improve the treatment of flux-limited galaxy catalogs for use with this measurement and further the use of the genus statistic as a tool for exploring non-Gaussianity.
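    The decomposition mentioned above expands the genus curve in an orthogonal (Hermite) polynomial sequence; for a Gaussian field the curve reduces to the second Hermite term, g(nu) ∝ (1 - nu^2) exp(-nu^2/2) = -He_2(nu) exp(-nu^2/2). A small sketch with arbitrary normalization:

```python
import math

def hermite(k, x):
    # Probabilists' Hermite polynomials He_k via the recurrence
    # He_0 = 1, He_1 = x, He_{k+1} = x He_k - k He_{k-1}
    h0, h1 = 1.0, x
    if k == 0:
        return h0
    for j in range(1, k):
        h0, h1 = h1, x * h1 - j * h0
    return h1

def genus_gaussian(nu, amplitude=1.0):
    # Gaussian-field genus curve: g(nu) = A (1 - nu^2) exp(-nu^2 / 2)
    return amplitude * (1.0 - nu**2) * math.exp(-nu**2 / 2.0)

nu = 0.6
# The Gaussian curve is exactly the (negated) He_2 term of the expansion
assert abs(genus_gaussian(nu) + hermite(2, nu) * math.exp(-nu**2 / 2)) < 1e-12
print(genus_gaussian(1.0), genus_gaussian(-1.0))  # zeros at nu = +/-1
```

    In such an expansion, non-Gaussianity from bias, nonlinear growth, or primordial physics would show up as nonzero coefficients on the other Hermite terms, which is the delineation the abstract describes.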

  7. Summary of Previous Chamber or Controlled Anthrax Studies and Recommendations for Possible Additional Studies

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Morrow, Jayne B.

    2010-12-29

    This report and an associated Excel file(a) summarize the investigations and results of previous chamber and controlled studies(b) to characterize the performance of methods for collecting, storing and/or transporting, extracting, and analyzing samples from surfaces contaminated by Bacillus anthracis (BA) or related simulants. The report and the Excel file are the joint work of the Pacific Northwest National Laboratory (PNNL) and the National Institute of Standards and Technology (NIST) for the Department of Homeland Security, Science and Technology Directorate. The report was originally released as PNNL-SA-69338, Rev. 0 in November 2009 with limited distribution, but was subsequently cleared for release with unlimited distribution in this Rev. 1. Only minor changes were made to Rev. 0 to yield Rev. 1. A more substantial update (including summarizing data from other studies and more condensed summary tables of data) is underway

  8. An fMRI study of neuronal activation in schizophrenia patients with and without previous cannabis use

    Directory of Open Access Journals (Sweden)

    Else-Marie eLøberg

    2012-10-01

    Previous studies have mostly shown positive effects of cannabis use on cognition in patients with schizophrenia, which could reflect lower neurocognitive vulnerability. There are, however, no studies comparing whether such cognitive differences have neuronal correlates. Thus, the aim of the present study was to examine whether patients with previous cannabis use differ in brain activation from patients who have never used cannabis. The patient groups were compared on the ability to up-regulate an effort mode network during a cognitive task and down-regulate activation in the same network during a task-absent condition. Task-present and task-absent brain activation was measured by functional magnetic resonance imaging (fMRI). Twenty-six patients with a DSM-IV and ICD-10 diagnosis of schizophrenia were grouped into a previous cannabis user group and a no-cannabis group. An auditory dichotic listening task with instructions to focus attention on either the right or left ear stimulus was used to tap verbal processing, attention, and cognitive control, calculated as an aggregate score. When comparing the two groups, there was remaining activation in the task-present condition for the cannabis group, not seen in the no-cannabis group, while there was remaining activation in the task-absent condition for the no-cannabis group, not seen in the cannabis group. Thus, the patients with previous cannabis use showed increased activation in an effort mode network and decreased activation in the default mode network compared with the no-cannabis group. It is concluded that the present study shows some differences in brain activation to a cognitively challenging task between schizophrenia patients with and without previous cannabis use.

  9. Use of a statistical model of the whole femur in a large scale, multi-model study of femoral neck fracture risk.

    Science.gov (United States)

    Bryan, Rebecca; Nair, Prasanth B; Taylor, Mark

    2009-09-18

    Interpatient variability is often overlooked in orthopaedic computational studies due to the substantial challenges involved in sourcing and generating large numbers of bone models. A statistical model of the whole femur incorporating both geometric and material property variation was developed as a potential solution to this problem. The statistical model was constructed using principal component analysis, applied to 21 individual computed tomography scans. To test the ability of the statistical model to generate realistic, unique, finite element (FE) femur models, it was used as a source of 1000 femurs to drive a study on femoral neck fracture risk. The study simulated the impact of an oblique fall to the side, a scenario known to account for a large proportion of hip fractures in the elderly and to have a lower fracture load than alternative loading approaches. FE model generation, application of subject-specific loading and boundary conditions, FE processing, and post-processing of the solutions were completed automatically. The generated models were within the bounds of the training data used to create the statistical model, with a high mesh quality, and could be used directly by the FE solver without remeshing. The results indicated that 28 of the 1000 femurs were at highest risk of fracture. Closer analysis revealed the percentage of cortical bone in the proximal femur to be a crucial differentiator between the failed and non-failed groups. The likely fracture location was indicated to be intertrochanteric. Comparison with previous computational, clinical and experimental work revealed support for these findings.
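    A PCA-based statistical model of this kind can be sketched as follows. The training matrix is random placeholder data standing in for the 21 CT-derived shape/property vectors, and the sampling rule (mode weights bounded by a few standard deviations) is a generic shape-model convention rather than the authors' exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 21 "femur" instances, each a flattened vector
# of node coordinates (30 numbers here; real models have thousands)
training = rng.normal(size=(21, 30))

mean_shape = training.mean(axis=0)
centered = training - mean_shape

# Principal component analysis via SVD of the centered data matrix
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
modes = Vt                             # rows are principal modes of variation
std = s / np.sqrt(len(training) - 1)   # per-mode standard deviations

def sample_shape(k=5, limit=3.0):
    # New instance: mean + sum_i b_i * mode_i, with weights drawn within
    # +/- limit standard deviations to keep instances near the training data
    b = rng.uniform(-limit, limit, size=k) * std[:k]
    return mean_shape + b @ modes[:k]

new_instance = sample_shape()
print(new_instance.shape)
```

    Bounding the mode weights is what keeps the 1000 generated femurs "within the bounds of the training data", as the abstract reports.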

  10. A Descriptive Study of Individual and Cross-Cultural Differences in Statistics Anxiety

    Science.gov (United States)

    Baloglu, Mustafa; Deniz, M. Engin; Kesici, Sahin

    2011-01-01

    The present study investigated individual and cross-cultural differences in statistics anxiety among 223 Turkish and 237 American college students. A 2 x 2 between-subjects factorial multivariate analysis of covariance (MANCOVA) was performed on the six dependent variables which are the six subscales of the Statistical Anxiety Rating Scale.…

  11. Statistical studies on quasars and active nuclei of galaxies

    International Nuclear Information System (INIS)

    Stasinska, G.

    1987-01-01

    A catalogue of optical, radio and X-ray properties of quasars and other active galactic nuclei, now in elaboration, is presented. This catalogue may serve as a database for statistical studies. As an example, we give some preliminary results concerning the determination of quasar masses. [fr

  12. COMPARISON OF STATISTICALLY CONTROLLED MACHINING SOLUTIONS OF TITANIUM ALLOYS USING USM

    Directory of Open Access Journals (Sweden)

    R. Singh

    2010-06-01

    The purpose of the present investigation is to compare statistically controlled machining solutions for titanium alloys using ultrasonic machining (USM). In this study, the previously developed Taguchi model for USM of titanium and its alloys has been investigated and compared. Relationships between the material removal rate, tool wear rate, surface roughness and other controllable machining parameters (power rating, tool type, slurry concentration, slurry type, slurry temperature and slurry size) have been deduced. The results of this study suggest that at the best settings of the controllable machining parameters for titanium alloys (based upon the Taguchi design), the machining solution with USM is statistically controlled, which is not observed for other settings of the input parameters on USM.

  13. Measuring University Students' Approaches to Learning Statistics: An Invariance Study

    Science.gov (United States)

    Chiesi, Francesca; Primi, Caterina; Bilgin, Ayse Aysin; Lopez, Maria Virginia; del Carmen Fabrizio, Maria; Gozlu, Sitki; Tuan, Nguyen Minh

    2016-01-01

    The aim of the current study was to provide evidence that an abbreviated version of the Approaches and Study Skills Inventory for Students (ASSIST) was invariant across different languages and educational contexts in measuring university students' learning approaches to statistics. Data were collected on samples of university students attending…

  14. A Flexible-Dose Study of Paliperidone ER in Patients With Nonacute Schizophrenia Previously Treated Unsuccessfully With Oral Olanzapine.

    Science.gov (United States)

    Kotler, Moshe; Dilbaz, Nesrin; Rosa, Fernanda; Paterakis, Periklis; Milanova, Vihra; Smulevich, Anatoly B; Lahaye, Marjolein; Schreiner, Andreas

    2016-01-01

    The goal of this study was to explore the tolerability, safety, and treatment response of switching from oral olanzapine to paliperidone extended release (ER). Adult patients with nonacute schizophrenia who had been treated unsuccessfully with oral olanzapine were switched to flexible doses of paliperidone ER (3 to 12 mg/d). The primary efficacy outcome was a ≥ 20% improvement in Positive and Negative Syndrome Scale (PANSS) total scores from baseline to endpoint for patients who switched medications because of lack of efficacy with olanzapine and noninferiority versus previous olanzapine treatment (mean endpoint change in PANSS total scores vs. baseline of ≤ 5 points) for patients who switched for reasons other than lack of efficacy. Safety and tolerability were assessed by monitoring adverse events, extrapyramidal symptoms, and weight change. Of 396 patients, 65.2% were men, mean age was 40.0 ± 12.0 years, and 75.5% had paranoid schizophrenia. Among the patients whose main reason for switching was lack of efficacy, an improvement in the PANSS total score of ≥ 20% occurred in 57.4% of patients. Noninferiority was confirmed for each subgroup of patients whose main reason for switching was something other than lack of efficacy. Paliperidone ER was generally well tolerated. Extrapyramidal symptoms as measured by total Extrapyramidal Symptom Rating Scale scores showed statistically significant and clinically relevant improvements at endpoint, the average weight decreased by 0.8 ± 5.2 kg at endpoint, and a clinically relevant weight gain of ≥ 7% occurred in 8.0% of patients. Paliperidone ER flexibly-dosed over 6 months was well tolerated and associated with a meaningful clinical response in patients with nonacute schizophrenia who had previously been unsuccessfully treated with oral olanzapine.

  15. A statistical study of ionopause perturbation and associated boundary wave formation at Venus.

    Science.gov (United States)

    Chong, G. S.; Pope, S. A.; Walker, S. N.; Zhang, T.; Balikhin, M. A.

    2017-12-01

    In contrast to Earth, Venus does not possess an intrinsic magnetic field. Hence the interaction between the solar wind and Venus is significantly different from that at Earth, even though these two planets were once considered similar. Within the induced magnetosphere and ionosphere of Venus, previous studies have shown the existence of ionospheric boundary waves. These structures may play an important role in the atmospheric evolution of Venus. Using Venus Express data, crossings of the ionopause boundary are determined based on the observations of photoelectrons during 2011. Pulses of dropouts in the electron energy spectrometer were observed in 92 events, which suggests potential perturbations of the boundary. Minimum variance analysis of the 1 Hz magnetic field data for the perturbations is conducted and used to confirm the occurrence of the boundary waves. Statistical analysis shows that they were propagating mainly in the ±VSO-Y direction in the polar north terminator region. The generation mechanisms of boundary waves and their evolution into the potential nonlinear regime are discussed and analysed.
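    Minimum variance analysis as used here is an eigen-decomposition of the magnetic field covariance matrix: the eigenvector of the smallest eigenvalue estimates the boundary-normal direction. A sketch on synthetic wave-like data (not Venus Express measurements):

```python
import numpy as np

def minimum_variance_directions(B):
    """Minimum variance analysis (MVA) of a field time series B, shape (N, 3).
    Returns eigenvalues (ascending) and eigenvectors of the covariance
    matrix; the first eigenvector is the minimum-variance (normal) estimate,
    and well-separated eigenvalues indicate it is well determined."""
    cov = np.cov(B, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
    return vals, vecs

# Synthetic wave-like field: large variance along x, small along z
rng = np.random.default_rng(1)
t = np.linspace(0, 8 * np.pi, 400)
B = np.column_stack([3 * np.sin(t) + rng.normal(0, 0.1, t.size),
                     1 * np.cos(t) + rng.normal(0, 0.1, t.size),
                     rng.normal(0, 0.1, t.size)])
vals, vecs = minimum_variance_directions(B)
normal = vecs[:, 0]
print(abs(normal[2]) > 0.99)  # minimum-variance direction close to z
```

    Applied to the 1 Hz magnetometer data around a perturbed crossing, the recovered normal and eigenvalue ratios are what allow wave-like boundary motion to be confirmed.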

  16. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean-scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness in small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.
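    The idea of scaling the mean and adjusting the degrees of freedom from the third moment can be illustrated by moment matching against a chi-square reference. This is an illustrative sketch in the spirit of Satorra-Bentler scaling, not the paper's exact adjustment: it picks an effective df so a chi-square reference has the observed skewness (skew of chi-square with d df is sqrt(8/d)), plus a scale factor aligning the means.

```python
import numpy as np

def moment_matched_reference(null_stats):
    """Given a sample of null test statistics (e.g. from simulation),
    return (d_adj, scale) so that null_stats / scale is compared against
    a chi-square reference with d_adj degrees of freedom."""
    m = null_stats.mean()
    sd = null_stats.std(ddof=1)
    skew = np.mean(((null_stats - m) / sd) ** 3)
    d_adj = 8.0 / skew**2          # invert skew(chi2_d) = sqrt(8/d)
    scale = m / d_adj              # mean scaling, Satorra-Bentler style
    return d_adj, scale

rng = np.random.default_rng(2)
null_stats = rng.chisquare(df=10, size=200_000)
d_adj, scale = moment_matched_reference(null_stats)
print(round(d_adj), round(scale, 2))
```

    On exactly chi-square input the procedure should recover the true df with a scale near 1; in small-sample factor analysis, where the statistic is skewed relative to its nominal reference, the adjusted df and scale shift to compensate.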

  17. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Following abdominal surgery, extensive adhesions often occur, and they can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered to be a contraindication for laparoscopy. The aim of this study is to show that insertion of the Veress needle in the umbilical region is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who previously underwent one or two laparotomies. Pathology of the digestive system, pathology of the genital organs, Cesarean section or abdominal war injuries were the most common causes of previous laparotomy. During those operations, and while entering the abdominal cavity, we did not experience any complications, while in 7 patients we performed conversion to laparotomy following the diagnostic laparoscopy. In all patients, insertion of the Veress needle and the trocar was performed in the umbilical region, i.e., the closed laparoscopy technique. In no patient were adhesions found in the umbilical region, and no abdominal organs were injured.

  18. Reading Statistics And Research

    OpenAIRE

    Akbulut, Reviewed By Yavuz

    2008-01-01

    The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and ku...

  19. Methodological Problems Of Statistical Study Of Regional Tourism And Tourist Expenditure

    Directory of Open Access Journals (Sweden)

    Anton Olegovich Ovcharov

    2015-03-01

    The aim of the work is to analyze the problems of regional tourism statistics. The subject of the research is tourism expenditure, the specifics of its recording and its modeling. The methods of statistical observation and factor analysis are used. The article shows the features and directions of the statistical methodology of tourism. A brief review of international publications on statistical studies of tourist expenditure is made. It summarizes the data from different statistical forms and shows the positive and negative trends in the development of tourism in Russia. It is concluded that the tourist industry in Russia is focused on outbound tourism rather than on inbound or internal tourism. The features of statistical accounting and statistical analysis of tourism expenditure in Russian and international statistics are described. To assess the level of development of regional tourism, the use of a coefficient of tourism efficiency is proposed. The reasons for the prevalence of imports over exports of tourism services are revealed using the data of the balance of payments. This is due to the raw material orientation of Russian exports and the low specific weight of the account “Services” in the structure of the balance of payments. An additive model is also proposed in the paper. It describes the influence of three factors on changes in tourist expenditure. These factors are the number of trips, the cost of a trip, and structural changes in destinations and travel purposes. On the basis of the data from 2012–2013, we estimate the strength and direction of the influence of each factor. Testing of the model showed that the increase in tourism exports was caused by the combined positive impact of all three factors, chief of which is the growing number of foreigners who visited Russia during the period concerned.
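    A three-factor additive model of this kind can be sketched as a telescoping decomposition of the change in total expenditure E = T * sum_d share_d * cost_d into trips, cost, and structure effects that sum exactly to the change. The numbers are hypothetical and the paper's exact specification may differ.

```python
def decompose(T0, shares0, costs0, T1, shares1, costs1):
    # Telescoping decomposition: change trips first (at base shares/costs),
    # then costs (at new trips, base shares), then structure (at new costs).
    e0 = T0 * sum(s * c for s, c in zip(shares0, costs0))
    e1 = T1 * sum(s * c for s, c in zip(shares1, costs1))
    trips_effect = (T1 - T0) * sum(s * c for s, c in zip(shares0, costs0))
    cost_effect = T1 * sum(s * (c1 - c0)
                           for s, c0, c1 in zip(shares0, costs0, costs1))
    struct_effect = T1 * sum((s1 - s0) * c
                             for s0, s1, c in zip(shares0, shares1, costs1))
    return e1 - e0, (trips_effect, cost_effect, struct_effect)

# Hypothetical two-destination example: trips, destination shares, trip costs
total_change, effects = decompose(
    T0=100, shares0=[0.7, 0.3], costs0=[500.0, 900.0],
    T1=120, shares1=[0.6, 0.4], costs1=[520.0, 950.0])
print(abs(total_change - sum(effects)) < 1e-9)  # effects sum to the change
```

    The ordering of the three steps is a modeling choice (other orderings redistribute the interaction terms), which is why such decompositions must state their convention.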

  20. Do emotional intelligence and previous caring experience influence student nurse performance? A comparative analysis.

    Science.gov (United States)

    Stenhouse, Rosie; Snowden, Austyn; Young, Jenny; Carver, Fiona; Carver, Hannah; Brown, Norrie

    2016-08-01

    Reports of poor nursing care have focused attention on values-based selection of candidates onto nursing programmes. Values-based selection lacks clarity and valid measures. Previous caring experience might lead to better care. Emotional intelligence (EI) might be associated with performance, and it is conceptualised and measurable. To examine the impact of 1) previous caring experience, 2) emotional intelligence, and 3) social connection scores on performance and retention in a cohort of first year nursing and midwifery students in Scotland. A longitudinal, quasi-experimental design. Adult and mental health nursing, and midwifery programmes in a Scottish University. Adult, mental health and midwifery students (n=598) completed the Trait Emotional Intelligence Questionnaire-short form and Schutte's Emotional Intelligence Scale on entry to their programmes at a Scottish University, alongside demographic and previous caring experience data. Social connection was calculated from a subset of questions identified within the TEIQue-SF in a prior factor and Rasch analysis. Student performance was calculated as the mean mark across the year. Withdrawal data were gathered. 598 students completed baseline measures. 315 students declared previous caring experience, 277 did not. An independent-samples t-test identified that those without previous caring experience scored higher on performance (57.33±11.38) than those with previous caring experience (54.87±11.19), a statistically significant difference of 2.47 (95% CI, 0.54 to 4.38), t(533)=2.52, p=.012. Emotional intelligence scores were not associated with performance. Social connection scores for those withdrawing (mean rank=249) and those remaining (mean rank=304.75) were statistically significantly different, U=15,300, z=-2.61, p<0.009. Previous caring experience led to worse performance in this cohort. Emotional intelligence was not a useful indicator of performance. Lower scores on the social connection factor were associated
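    The U statistic reported for the withdrawal comparison is a Mann-Whitney U on ranked scores. A minimal pure-Python sketch on hypothetical scores:

```python
def mann_whitney_u(a, b):
    # Mann-Whitney U for group a vs. b: count pairs where the a-value is
    # larger, counting ties as half (no continuity correction or z-approx.)
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical "social connection" scores for withdrawers vs. remainers
withdrew = [12, 14, 15, 11]
remained = [16, 13, 18, 17, 19]
u = mann_whitney_u(withdrew, remained)
print(u)
```

    For larger samples the U statistic is converted to a z score via its normal approximation, which is where the reported z=-2.61 comes from.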

  1. Statistical Reasoning Ability, Self-Efficacy, and Value Beliefs in a University Statistics Course

    Science.gov (United States)

    Olani, A.; Hoekstra, R.; Harskamp, E.; van der Werf, G.

    2011-01-01

    Introduction: The study investigated the degree to which students' statistical reasoning abilities, statistics self-efficacy, and perceived value of statistics improved during a reform based introductory statistics course. The study also examined whether the changes in these learning outcomes differed with respect to the students' mathematical…

  2. Data from studies of previous radioactive waste disposal in Massachusetts Bay

    International Nuclear Information System (INIS)

    Curtis, W.R.; Mardis, H.M.

    1984-12-01

    This report presents the results of studies conducted in Massachusetts Bay during 1981 and 1982. Included are data from: (1) a side-scan sonar survey of disposal areas in the Bay that was carried out by the National Oceanic and Atmospheric Administration (NOAA) for EPA; (2) collections of sediment and biota by NOAA for radiochemical analysis by EPA; (3) collections of marketplace seafood samples by the Food and Drug Administration (FDA) for radioanalysis by both FDA and EPA; and (4) a radiological monitoring survey of low-level waste (LLW) disposal areas by EPA to determine whether there should be any concern for public health resulting from previous LLW disposals in the Bay

  3. Assessing Statistical Change Indices in Selected Social Work Intervention Research Studies

    Science.gov (United States)

    Ham, Amanda D.; Huggins-Hoyt, Kimberly Y.; Pettus, Joelle

    2016-01-01

    Objectives: This study examined how evaluation and intervention research (IR) studies assessed statistical change to ascertain effectiveness. Methods: Studies from six core social work journals (2009-2013) were reviewed (N = 1,380). Fifty-two evaluation (n = 27) and intervention (n = 25) studies met the inclusion criteria. These studies were…

  4. Quantitative study on the statistical properties of fibre architecture of genuine and numerical composite microstructures

    DEFF Research Database (Denmark)

    Hansen, Jens Zangenberg; Brøndsted, Povl

    2013-01-01

A quantitative study is carried out regarding the statistical properties of the fibre architecture found in composite laminates and that generated numerically using Statistical Representative Volume Elements (SRVE’s). The aim is to determine the reliability and consistency of SRVE’s for represent…

  5. Statistical trend of radiation chemical studies

    International Nuclear Information System (INIS)

    Yoshida, Hiroshi

    1980-01-01

In the field of radiation chemistry, over 1,000 reports are published each year. An attempt has been made to review the trends in this field over more than five years by examining the lists of papers statistically. For the period from 1974 to 1978, the Annual Cumulation with Keyword and Author Indexes in the Biweekly List of Papers on Radiation Chemistry was consulted. For 1979, because the Cumulation was unavailable, the Chemical Abstracts Search by the Japan Information Center of Science and Technology was used instead. The contents are as follows: how extensively radiation chemistry is studied; what the recent trends in radiation chemistry are; who contributes to the advance of radiation chemistry, and where; and what direction radiation chemistry took in 1979. (J.P.N.)

  6. Significance levels for studies with correlated test statistics.

    Science.gov (United States)

    Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S

    2008-07-01

    When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
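The permutation scheme described in this abstract can be illustrated with a small sketch (our own generic construction, not the authors' conditional method; the data and variable names are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def global_p_value(x, y, n_perm=500):
    """Permutation estimate of the overall significance level.

    x: (n_samples, n_tests) data matrix; y: 0/1 group labels.
    The observed statistic is the largest |two-sample z| across all
    tests; its null distribution is simulated by permuting sample units.
    """
    def max_abs_z(labels):
        a, b = x[labels == 0], x[labels == 1]
        se = np.sqrt(a.var(axis=0, ddof=1) / len(a)
                     + b.var(axis=0, ddof=1) / len(b))
        return np.max(np.abs((a.mean(axis=0) - b.mean(axis=0)) / se))

    observed = max_abs_z(y)
    exceed = sum(max_abs_z(rng.permutation(y)) >= observed
                 for _ in range(n_perm))
    return (exceed + 1) / (n_perm + 1)

# Correlated test statistics: a shared latent factor correlates the
# columns, which is exactly the situation the abstract warns about.
n, m = 40, 200
x = 0.7 * rng.normal(size=(n, 1)) + rng.normal(size=(n, m))
y = np.repeat([0, 1], n // 2)
print(global_p_value(x, y))
```

Efron's point is that a shared factor inflates or deflates the whole histogram of test statistics from one dataset to the next, so an unconditional permutation p-value like this one can mislead; the abstract's remedy is to condition on the spread of the observed histogram.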

  7. Components of the Pearson-Fisher chi-squared statistic

    Directory of Open Access Journals (Sweden)

    G. D. Raynery

    2002-01-01

    interpretation of the corresponding test statistic components has not previously been investigated. This paper provides the necessary details, as well as an overview of the decomposition options available, and revisits two published examples.

  8. Introduction to probability and statistics for engineers and scientists

    CERN Document Server

    Ross, Sheldon M

    2009-01-01

This updated text provides a superior introduction to applied probability and statistics for engineering or science majors. Ross emphasizes the manner in which probability yields insight into statistical problems; ultimately resulting in an intuitive understanding of the statistical procedures most often used by practicing engineers and scientists. Real data sets are incorporated in a wide variety of exercises and examples throughout the book, and this emphasis on data motivates the probability coverage. As with the previous editions, Ross' text has tremendously clear exposition, plus real-data…

  9. Statistical testing and power analysis for brain-wide association study.

    Science.gov (United States)

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and the false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons, so it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study: our approach can identify altered functional connectivities in a major depressive disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Inferring Demographic History Using Two-Locus Statistics.

    Science.gov (United States)

    Ragsdale, Aaron P; Gutenkunst, Ryan N

    2017-06-01

Population demographic history may be learned from contemporary genetic variation data. Methods based on aggregating the statistics of many single loci into an allele frequency spectrum (AFS) have proven powerful, but such methods ignore potentially informative patterns of linkage disequilibrium (LD) between neighboring loci. To leverage such patterns, we developed a composite-likelihood framework for inferring demographic history from aggregated statistics of pairs of loci. Using this framework, we show that two-locus statistics are more sensitive to demographic history than single-locus statistics such as the AFS. In particular, two-locus statistics escape the notorious confounding of depth and duration of a bottleneck, and they provide a means to estimate effective population size based on the recombination rather than mutation rate. We applied our approach to a Zambian population of Drosophila melanogaster. Notably, using both single- and two-locus statistics, we inferred a substantially lower ancestral effective population size than previous works and did not infer a bottleneck history. Together, our results demonstrate the broad potential for two-locus statistics to enable powerful population genetic inference. Copyright © 2017 by the Genetics Society of America.

  11. Game statistics for the island of Olkiluoto in 2010-2011

    International Nuclear Information System (INIS)

    Nieminen, M.; Niemi, M.; Jussila, I.

    2011-10-01

    The game statistics for the island of Olkiluoto were updated in the summer 2011 and compared with earlier statistics. Population size estimates are based on interviews of the local hunters. No moose or deer inventories were made in the winter 2010-2011. The moose population is stable when compared with the previous year. The white-tailed deer population is stable or slightly increasing when compared with the previous year. The changes in the roe deer population are not accurately known, but population size varies somewhat from year to year. The number of hunted raccoon dogs approximately doubled in the latest hunting season. Altogether two waterfowl were hunted in 2010 (17 in the previous year). The populations of mountain hare and red squirrel are abundant, and the number of hunted mountain hares approximately doubled when compared with the previous hunting season. The brown hare population is still small. In the winter, there were observations of one lynx spending time in the area. (orig.)

  12. Game statistics for the island of Olkiluoto in 2010-2011

    Energy Technology Data Exchange (ETDEWEB)

    Nieminen, M. [Faunatica Oy, Espoo (Finland); Niemi, M. [Helsinki Univ. (Finland), Dept. of Forest Sciences; Jussila, I. [Turku Univ. (Finland), Satakunta Environmental Research Inst.

    2011-10-15

    The game statistics for the island of Olkiluoto were updated in the summer 2011 and compared with earlier statistics. Population size estimates are based on interviews of the local hunters. No moose or deer inventories were made in the winter 2010-2011. The moose population is stable when compared with the previous year. The white-tailed deer population is stable or slightly increasing when compared with the previous year. The changes in the roe deer population are not accurately known, but population size varies somewhat from year to year. The number of hunted raccoon dogs approximately doubled in the latest hunting season. Altogether two waterfowl were hunted in 2010 (17 in the previous year). The populations of mountain hare and red squirrel are abundant, and the number of hunted mountain hares approximately doubled when compared with the previous hunting season. The brown hare population is still small. In the winter, there were observations of one lynx spending time in the area. (orig.)

  13. Caregiver Statistics: Demographics

    Science.gov (United States)

... Selected Long-Term Care Statistics ... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...

  14. Statistical study of undulator radiated power by a classical detection system in the mm-wave regime

    Directory of Open Access Journals (Sweden)

    A. Eliran

    2009-05-01

The statistics of FEL spontaneous emission power, detected with a detector integration time much larger than the slippage time, have been measured in many previous works at high frequencies. In such cases the quantum (shot) noise generated in the detection process is dominant. We have measured spontaneous emission in the Israeli electrostatic accelerator FEL (EA-FEL) operating at mm wavelengths. In this regime the detector is based on a diode rectifier, for which the detector quantum noise is negligible. The measurements were repeated numerous times in order to create a sample space with sufficient data for evaluating the statistical features of the radiated power. The probability density function of the radiated power was found and its moments were calculated. The results of analytical and numerical models are compared to those obtained in experimental measurements.

  15. Introduction to statistics using interactive MM*Stat elements

    CERN Document Server

    Härdle, Wolfgang Karl; Rönz, Bernd

    2015-01-01

    MM*Stat, together with its enhanced online version with interactive examples, offers a flexible tool that facilitates the teaching of basic statistics. It covers all the topics found in introductory descriptive statistics courses, including simple linear regression and time series analysis, the fundamentals of inferential statistics (probability theory, random sampling and estimation theory), and inferential statistics itself (confidence intervals, testing). MM*Stat is also designed to help students rework class material independently and to promote comprehension with the help of additional examples. Each chapter starts with the necessary theoretical background, which is followed by a variety of examples. The core examples are based on the content of the respective chapter, while the advanced examples, designed to deepen students’ knowledge, also draw on information and material from previous chapters. The enhanced online version helps students grasp the complexity and the practical relevance of statistical...

  16. A Study of Faculty Views of Statistics and Student Preparation beyond an Introductory Class

    Science.gov (United States)

    Doehler, Kirsten; Taylor, Laura; Smith, Jessalyn

    2013-01-01

    The purpose of this research is to better understand the role of statistics in teaching and research by faculty from all disciplines and their perceptions of the statistical preparation of their students. This study reports the findings of a survey administered to faculty from seven colleges and universities regarding the use of statistics in…

  17. Subclinical delusional ideation and appreciation of sample size and heterogeneity in statistical judgment.

    Science.gov (United States)

    Galbraith, Niall D; Manktelow, Ken I; Morris, Neil G

    2010-11-01

Previous studies demonstrate that people high in delusional ideation exhibit a data-gathering bias on inductive reasoning tasks. The current study set out to investigate the factors that may underpin such a bias by examining healthy individuals, classified as either high or low scorers on the Peters et al. Delusions Inventory (PDI): more specifically, whether high PDI scorers have a relatively poor appreciation of sample size and heterogeneity when making statistical judgments. In Expt 1, high PDI scorers made higher probability estimates when generalizing from a sample of 1 with regard to the heterogeneous human property of obesity. In Expt 2, this effect was replicated and was also observed in relation to the heterogeneous property of aggression. The findings suggest that delusion-prone individuals are less appreciative of the importance of sample size when making statistical judgments about heterogeneous properties; this may underpin the data-gathering bias observed in previous studies. There was some support for the hypothesis that threatening material would exacerbate high PDI scorers' indifference to sample size.

  18. Development of a new statistical evaluation method for brain SPECT images

    International Nuclear Information System (INIS)

    Kawashima, Ryuta; Sato, Kazunori; Ito, Hiroshi; Koyama, Masamichi; Goto, Ryoui; Yoshioka, Seiro; Ono, Shuichi; Sato, Tachio; Fukuda, Hiroshi

    1996-01-01

The purpose of this study was to develop a new statistical evaluation method for brain SPECT images. First, we made normal brain image databases using 99mTc-ECD and SPECT in 10 normal subjects as described previously. Each SPECT image was globally normalized and anatomically standardized to the standard brain shape using the Human Brain Atlas (HBA) of Roland et al. and each subject's X-CT. Then, mean and SD images were calculated voxel by voxel. In the next step, 99mTc-ECD SPECT images of a patient were obtained, and global normalization and anatomical standardization were performed in the same way. A statistical map was then calculated voxel by voxel as (P - Mean)/SD x 10 + 50, where P, Mean and SD indicate the voxel value of the patient image and of the mean and SD images of the normal database, respectively. We found this statistical map to be helpful for clinical diagnosis in brain SPECT studies. (author)
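The map in this abstract, (P - Mean)/SD x 10 + 50, is a voxel-wise z-score rescaled to mean 50 with 10 units per SD. A minimal sketch, with small synthetic arrays standing in for the normalized SPECT volumes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Normal database: 10 anatomically standardized, globally normalized
# volumes (tiny 4x4x4 grids here instead of real SPECT resolution)
normals = rng.normal(loc=100, scale=10, size=(10, 4, 4, 4))
mean_img = normals.mean(axis=0)
sd_img = normals.std(axis=0, ddof=1)

# Patient volume, assumed normalized and standardized the same way
patient = rng.normal(loc=100, scale=10, size=(4, 4, 4))

# Statistical map: voxel-wise z-score, rescaled so that 50 means
# "equal to the database mean" and each 10 units is one SD
stat_map = (patient - mean_img) / sd_img * 10 + 50
print(stat_map.mean())  # near 50 when the patient resembles the database
```

Voxels far above or below 50 then flag regions where the patient deviates from the normal database.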

  19. The influence of bilingualism on statistical word learning.

    Science.gov (United States)

    Poepsel, Timothy J; Weiss, Daniel J

    2016-07-01

    Statistical learning is a fundamental component of language acquisition, yet to date, relatively few studies have examined whether these abilities differ in bilinguals. In the present study, we examine this issue by comparing English monolinguals with Chinese-English and English-Spanish bilinguals in a cross-situational statistical learning (CSSL) task. In Experiment 1, we assessed the ability of both monolinguals and bilinguals on a basic CSSL task that contained only one-to-one mappings. In Experiment 2, learners were asked to form both one-to-one and two-to-one mappings, and were tested at three points during familiarization. Overall, monolinguals and bilinguals did not differ in their learning of one-to-one mappings. However, bilinguals more quickly acquired two-to-one mappings, while also exhibiting greater proficiency than monolinguals. We conclude that the fundamental SL mechanism may not be affected by language experience, in accord with previous studies. However, when the input contains greater variability, bilinguals may be more prone to detecting the presence of multiple structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. The effect of warm-up, static stretching and dynamic stretching on hamstring flexibility in previously injured subjects.

    LENUS (Irish Health Repository)

    O'Sullivan, Kieran

    2009-01-01

BACKGROUND: Warm-up and stretching are suggested to increase hamstring flexibility and reduce the risk of injury. This study examined the short-term effects of warm-up, static stretching and dynamic stretching on hamstring flexibility in individuals with previous hamstring injury and uninjured controls. METHODS: A randomised crossover study design, over 2 separate days. Hamstring flexibility was assessed using passive knee extension range of motion (PKE ROM). 18 previously injured individuals and 18 uninjured controls participated. On both days, four measurements of PKE ROM were recorded: (1) at baseline; (2) after warm-up; (3) after stretch (static or dynamic) and (4) after a 15-minute rest. Participants carried out both static and dynamic stretches, but on different days. Data were analysed using ANOVA. RESULTS: Across both groups, there was a significant main effect for time (p < 0.001). PKE ROM significantly increased with warm-up (p < 0.001). From warm-up, PKE ROM further increased with static stretching (p = 0.04) but significantly decreased after dynamic stretching (p = 0.013). The increased flexibility after warm-up and static stretching reduced significantly (p < 0.001) after 15 minutes of rest, but remained significantly greater than at baseline (p < 0.001). Between groups, there was no main effect for group (p = 0.462), with no difference in mean PKE ROM values at any individual stage of the protocol (p > 0.05). Using ANCOVA to adjust for the non-significant (p = 0.141) baseline difference between groups, the previously injured group demonstrated a greater response to warm-up and static stretching; however, this was not statistically significant (p = 0.05). CONCLUSION: Warm-up significantly increased hamstring flexibility. Static stretching also increased hamstring flexibility, whereas dynamic did not, in agreement with previous findings on uninjured controls. The effect of warm-up and static stretching on flexibility was greater in those with reduced…

  1. From Research to Practice: Basic Mathematics Skills and Success in Introductory Statistics

    Science.gov (United States)

    Lunsford, M. Leigh; Poplin, Phillip

    2011-01-01

    Based on previous research of Johnson and Kuennen (2006), we conducted a study to determine factors that would possibly predict student success in an introductory statistics course. Our results were similar to Johnson and Kuennen in that we found students' basic mathematical skills, as measured on a test created by Johnson and Kuennen, were a…

  2. submitter Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  3. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    Science.gov (United States)

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  4. Local multiplicity adjustment for the spatial scan statistic using the Gumbel distribution.

    Science.gov (United States)

    Gangnon, Ronald E

    2012-03-01

    The spatial scan statistic is an important and widely used tool for cluster detection. It is based on the simultaneous evaluation of the statistical significance of the maximum likelihood ratio test statistic over a large collection of potential clusters. In most cluster detection problems, there is variation in the extent of local multiplicity across the study region. For example, using a fixed maximum geographic radius for clusters, urban areas typically have many overlapping potential clusters, whereas rural areas have relatively few. The spatial scan statistic does not account for local multiplicity variation. We describe a previously proposed local multiplicity adjustment based on a nested Bonferroni correction and propose a novel adjustment based on a Gumbel distribution approximation to the distribution of a local scan statistic. We compare the performance of all three statistics in terms of power and a novel unbiased cluster detection criterion. These methods are then applied to the well-known New York leukemia dataset and a Wisconsin breast cancer incidence dataset. © 2011, The International Biometric Society.
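The idea of approximating the null distribution of a maximum statistic with a Gumbel fit can be sketched generically: simulate null maxima, fit a Gumbel distribution, and read p-values from its survival function (this illustrates the approximation itself, not the paper's nested local adjustment; all numbers are invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Null distribution of a "local scan" maximum: the largest of k
# chi-square(1) likelihood-ratio statistics per simulated dataset
k = 50
null_max = stats.chi2.rvs(df=1, size=(5000, k), random_state=rng).max(axis=1)

# Extreme-value (Gumbel) approximation to the distribution of the max
loc, scale = stats.gumbel_r.fit(null_max)

observed = 18.0  # hypothetical observed scan statistic
p_gumbel = stats.gumbel_r.sf(observed, loc=loc, scale=scale)
p_empirical = np.mean(null_max >= observed)
print(p_gumbel, p_empirical)
```

The fitted Gumbel gives a smooth tail, so very small p-values can be read off without the enormous number of Monte Carlo replicates an empirical tail count would require.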

  5. Statistical hadronization and hadronic micro-canonical ensemble II

    International Nuclear Information System (INIS)

    Becattini, F.; Ferroni, L.

    2004-01-01

    We present a Monte Carlo calculation of the micro-canonical ensemble of the ideal hadron-resonance gas including all known states up to a mass of about 1.8 GeV and full quantum statistics. The micro-canonical average multiplicities of the various hadron species are found to converge to the canonical ones for moderately low values of the total energy, around 8 GeV, thus bearing out previous analyses of hadronic multiplicities in the canonical ensemble. The main numerical computing method is an importance sampling Monte Carlo algorithm using the product of Poisson distributions to generate multi-hadronic channels. It is shown that the use of this multi-Poisson distribution allows for an efficient and fast computation of averages, which can be further improved in the limit of very large clusters. We have also studied the fitness of a previously proposed computing method, based on the Metropolis Monte Carlo algorithm, for event generation in the statistical hadronization model. We find that the use of the multi-Poisson distribution as proposal matrix dramatically improves the computation performance. However, due to the correlation of subsequent samples, this method proves to be generally less robust and effective than the importance sampling method. (orig.)

  6. Infants' statistical learning: 2- and 5-month-olds' segmentation of continuous visual sequences.

    Science.gov (United States)

    Slone, Lauren Krogh; Johnson, Scott P

    2015-05-01

    Past research suggests that infants have powerful statistical learning abilities; however, studies of infants' visual statistical learning offer differing accounts of the developmental trajectory of and constraints on this learning. To elucidate this issue, the current study tested the hypothesis that young infants' segmentation of visual sequences depends on redundant statistical cues to segmentation. A sample of 20 2-month-olds and 20 5-month-olds observed a continuous sequence of looming shapes in which unit boundaries were defined by both transitional probability and co-occurrence frequency. Following habituation, only 5-month-olds showed evidence of statistically segmenting the sequence, looking longer to a statistically improbable shape pair than to a probable pair. These results reaffirm the power of statistical learning in infants as young as 5 months but also suggest considerable development of statistical segmentation ability between 2 and 5 months of age. Moreover, the results do not support the idea that infants' ability to segment visual sequences based on transitional probabilities and/or co-occurrence frequencies is functional at the onset of visual experience, as has been suggested previously. Rather, this type of statistical segmentation appears to be constrained by the developmental state of the learner. Factors contributing to the development of statistical segmentation ability during early infancy, including memory and attention, are discussed. Copyright © 2015 Elsevier Inc. All rights reserved.
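Segmentation by transitional probability, the statistic these infant studies manipulate, can be sketched as follows (the shape labels and pair inventory are invented for illustration):

```python
import random
from collections import Counter

random.seed(0)

# Three "units" (shape pairs): within-pair transitions are perfectly
# predictable, transitions across pair boundaries are variable.
units = [("A", "B"), ("C", "D"), ("E", "F")]
sequence = [shape for _ in range(60)
            for shape in random.choice(units)]

bigrams = Counter(zip(sequence, sequence[1:]))
unigrams = Counter(sequence[:-1])

def transitional_probability(x, y):
    # P(y | x): how reliably y follows x in the continuous stream
    return bigrams[(x, y)] / unigrams[x]

print(transitional_probability("A", "B"))  # within-unit: 1.0
print(transitional_probability("B", "C"))  # across a boundary: roughly 1/3
```

A learner tracking these statistics can posit unit boundaries wherever the transitional probability dips, which is the segmentation cue the habituation sequences in such experiments are built around.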

  7. A new universality class in corpus of texts; A statistical physics study

    Science.gov (United States)

    Najafi, Elham; Darooneh, Amir H.

    2018-05-01

Text can be regarded as a complex system, and methods from statistical physics can be used to study it. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observed a power-law relation between the fractality of a text and its vocabulary size, for both texts and corpora. We also observed this behavior when studying biological data.
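A power-law relation of the kind reported here is typically checked by a straight-line fit in log-log space. A toy sketch (the data are synthetic and the exponent 0.4 is invented, not the paper's value):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic corpus measurements: fractality ~ a * vocab_size**b,
# with small multiplicative noise
vocab_size = np.array([1e3, 3e3, 1e4, 3e4, 1e5])
fractality = 2.0 * vocab_size**0.4 * np.exp(rng.normal(0, 0.05, 5))

# A power law is a straight line in log-log space; the slope is the
# exponent and the intercept is log of the prefactor
b, log_a = np.polyfit(np.log(vocab_size), np.log(fractality), 1)
print(b)  # recovered exponent, close to 0.4 by construction
```

On real corpus data one would also inspect the residuals, since many heavy-tailed relations look deceptively linear on a log-log plot.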

  8. Daniel Goodman’s empirical approach to Bayesian statistics

    Science.gov (United States)

    Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina

    2016-01-01

    Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated the construction of a prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combined comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. If based on a true representation of our knowledge and uncertainty, Goodman argued that risk assessment and decision-making could be an exact science, despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior in a recent study of Steller sea lion spatial use patterns in Alaska.

  9. Statistical prediction of parametric roll using FORM

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher; Choi, Ju-hyuck; Nielsen, Ulrik Dam

    2017-01-01

Previous research has shown that the First Order Reliability Method (FORM) can be an efficient method for estimating outcrossing rates and extreme value statistics for stationary stochastic processes. This is also the case for bifurcation-type processes, such as the parametric roll of ships. The present...

  10. Subjectivism as an unavoidable feature of ecological statistics

    Directory of Open Access Journals (Sweden)

    Martínez–Abraín, A.

    2014-12-01

We approach here the handling of previous information when performing statistical inference in ecology, both when dealing with model specification and selection, and when dealing with parameter estimation. We compare the perspectives of this problem from the frequentist and Bayesian schools, including objective and subjective Bayesians. We show that the issue of making use of previous information and making a priori decisions is not only a reality for Bayesians but also for frequentists. However, the latter tend to overlook this because of the common difficulty of having previous information available on the magnitude of the effect that is thought to be biologically relevant. This prior information should be fed into a priori power tests when looking for the necessary sample sizes to couple statistical and biological significances. Ecologists should make a greater effort to make use of available prior information because this is their most legitimate contribution to the inferential process. Parameter estimation and model selection would benefit if this was done, allowing a more reliable accumulation of knowledge, and hence progress, in the biological sciences.

  11. Brief introduction to Lie-admissible formulations in statistical mechanics

    International Nuclear Information System (INIS)

    Fronteau, J.

    1981-01-01

    The present article is a summary of the essential ideas and results published in previous articles, the aim here being to describe the situation in a schematic way for the benefit of non-specialists. The non-truncated Liouville theorem and equation, natural introduction of Lie-admissible formulations into statistical mechanics, the notion of a statistical quasi-particle, and transition towards the notion of fine thermodynamics are discussed

  12. Fast optimization of statistical potentials for structurally constrained phylogenetic models

    Directory of Open Access Journals (Sweden)

    Rodrigue Nicolas

    2009-09-01

Background: Statistical approaches for protein design are relevant in the field of molecular evolutionary studies. In recent years, new, so-called structurally constrained (SC) models of protein-coding sequence evolution have been proposed, which use statistical potentials to assess sequence-structure compatibility. In a previous work, we defined a statistical framework for optimizing knowledge-based potentials especially suited to SC models. Our method used the maximum likelihood principle and provided what we call the joint potentials. However, the method required numerical estimations by the use of computationally heavy Markov chain Monte Carlo sampling algorithms. Results: Here, we develop an alternative optimization procedure, based on a leave-one-out argument coupled to fast gradient descent algorithms. We assess that the leave-one-out potential yields very similar results to the joint approach developed previously, both in terms of the resulting potential parameters and by Bayes factor evaluation in a phylogenetic context. On the other hand, the leave-one-out approach results in a considerable computational benefit (up to a 1,000-fold decrease in computational time for the optimization procedure). Conclusion: Due to its computational speed, the optimization method we propose offers an attractive alternative for the design and empirical evaluation of alternative forms of potentials, using large data sets and high-dimensional parameterizations.

  13. Statistical study of high-latitude plasma flow during magnetospheric substorms

    Directory of Open Access Journals (Sweden)

    G. Provan

    2004-11-01

    Full Text Available We have utilised the near-global imaging capabilities of the Northern Hemisphere SuperDARN radars to perform a statistical superposed epoch analysis of high-latitude plasma flows during magnetospheric substorms. The study involved 67 substorms, identified using the IMAGE FUV space-borne auroral imager. A substorm co-ordinate system was developed, centred on the magnetic local time and magnetic latitude of substorm onset determined from the auroral images. The plasma flow vectors from all 67 intervals were combined, creating global statistical plasma flow patterns and backscatter occurrence statistics during the substorm growth and expansion phases. The commencement of the substorm growth phase was clearly observed in the radar data 18-20 min before substorm onset, with an increase in the anti-sunward component of the plasma velocity flowing across the dawn sector of the polar cap and a peak in the dawn-to-dusk transpolar voltage. Nightside backscatter moved to lower latitudes as the growth phase progressed. At substorm onset a flow suppression region was observed on the nightside, with fast flows surrounding the suppressed flow region. The dawn-to-dusk transpolar voltage increased from ~40 kV just before substorm onset to ~75 kV 12 min after onset. The low-latitude return flow started to increase at substorm onset and continued to increase until 8 min after onset. The velocity flowing across the polar cap peaked 12-14 min after onset. This increase in the flux of the polar cap and the excitation of large-scale plasma flow occurred even though the IMF Bz component was increasing (becoming less negative) during most of this time. This study is the first to statistically prove that nightside reconnection creates magnetic flux and excites high-latitude plasma flow in a similar way to dayside reconnection, and that dayside and nightside reconnection are two separate time-dependent processes.

  14. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count; sodium, potassium, and creatinine levels; prothrombin time; and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by the hospital reference range and done closest to the time of but before admission (median interval, 2 months), 13 (0.4%; 95% CI, 0.2% to 0.7%) repeat values were outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 (17%; CI, 13% to 20%) repeat values at admission were outside a range considered acceptable for surgery (P less than 0.001 for the comparison of the frequency of clinically important abnormalities between patients with normal and patients with abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.
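    The reported interval for 13 abnormal repeats out of 3096 normal previous results can be reproduced with an exact binomial confidence interval. A sketch assuming the Clopper-Pearson construction (the abstract does not state which interval the authors used):

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) two-sided CI for a binomial proportion k/n."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# 13 of 3096 previously normal results were abnormal on repeat testing
lo, hi = clopper_pearson(13, 3096)
print(f"13/3096 = {13/3096:.2%}, 95% CI ({lo:.2%}, {hi:.2%})")
```

    The computed bounds land at roughly 0.2% and 0.7%, matching the interval quoted in the abstract.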

  15. Statistical inference for template aging

    Science.gov (United States)

    Schuckers, Michael E.

    2006-04-01

    A change in classification error rates for a biometric device is often referred to as template aging. Here we offer two methods for determining whether the effect of time is statistically significant. The first of these is the use of a generalized linear model to determine if these error rates change linearly over time. This approach generalizes previous work assessing the impact of covariates using generalized linear models. The second approach uses likelihood ratio test methodology. The focus here is on statistical methods for estimation, not on the underlying cause of the change in error rates over time. These methodologies are applied to data from the National Institutes of Standards and Technology Biometric Score Set Release 1. The results of these applications are discussed.
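    As an illustration of the second approach, a likelihood ratio test for a change in a classification error rate between two sessions can be written directly from binomial likelihoods. The counts below are hypothetical, and this two-time-point test is a simplified stand-in for the paper's models:

```python
import numpy as np
from scipy.stats import chi2

def binom_loglik(errors, trials, p):
    """Binomial log-likelihood (the combinatorial constant cancels in the LR)."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return errors * np.log(p) + (trials - errors) * np.log(1 - p)

# Hypothetical match-error counts at enrollment and one year later
e1, n1 = 30, 1000   # 3.0% error rate at time 1
e2, n2 = 55, 1000   # 5.5% error rate at time 2

# Null: one common error rate; alternative: a separate rate per session
p_pool = (e1 + e2) / (n1 + n2)
ll_null = binom_loglik(e1, n1, p_pool) + binom_loglik(e2, n2, p_pool)
ll_alt = binom_loglik(e1, n1, e1 / n1) + binom_loglik(e2, n2, e2 / n2)

lr_stat = 2 * (ll_alt - ll_null)          # asymptotically chi-square, df = 1
p_value = chi2.sf(lr_stat, df=1)
```

    A small p value suggests the error rate did change between sessions, i.e. evidence of template aging under this toy model.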

  16. Lagrangian statistics across the turbulent-nonturbulent interface in a turbulent plane jet.

    Science.gov (United States)

    Taveira, Rodrigo R; Diogo, José S; Lopes, Diogo C; da Silva, Carlos B

    2013-10-01

    Lagrangian statistics from millions of particles are used to study the turbulent entrainment mechanism in a direct numerical simulation of a turbulent plane jet at Re(λ) ≈ 110. The particles (tracers) are initially seeded at the irrotational region of the jet near the turbulent shear layer and are followed as they are drawn into the turbulent region across the turbulent-nonturbulent interface (TNTI), allowing the study of the enstrophy buildup and thereby characterizing the turbulent entrainment mechanism in the jet. The use of Lagrangian statistics following fluid particles gives a more correct description of the entrainment mechanism than in previous works, since the statistics in relation to the TNTI position involve data from the trajectories of the entraining fluid particles. The Lagrangian statistics for the particles show the existence of a velocity jump and a characteristic vorticity jump (with a thickness which is one order of magnitude greater than the Kolmogorov microscale), in agreement with previous results using Eulerian statistics. The particles initially acquire enstrophy by viscous diffusion and later by enstrophy production, which becomes "active" only deep inside the turbulent region. Both enstrophy diffusion and production near the TNTI differ substantially from inside the turbulent region. Only about 1% of all particles find their way into pockets of irrotational flow engulfed into the turbulent shear layer region, indicating that "engulfment" is not significant for the present flow, indirectly suggesting that the entrainment is largely due to "nibbling" small-scale mechanisms acting along the entire TNTI surface. Probability density functions of particle positions suggest that the particles spend more time crossing the region near the TNTI than traveling inside the turbulent region, consistent with the particles moving tangent to the interface around the time they cross it.

  17. Statistical Learning, Syllable Processing, and Speech Production in Healthy Hearing and Hearing-Impaired Preschool Children: A Mismatch Negativity Study.

    Science.gov (United States)

    Studer-Eichenberger, Esther; Studer-Eichenberger, Felix; Koenig, Thomas

    2016-01-01

    The objectives of the present study were to investigate temporal/spectral sound-feature processing in preschool children (4 to 7 years old) with peripheral hearing loss compared with age-matched controls. The results verified the presence of statistical learning, which was diminished in children with hearing impairments (HIs), and elucidated possible perceptual mediators of speech production. Perception and production of the syllables /ba/, /da/, /ta/, and /na/ were recorded in 13 children with normal hearing and 13 children with HI. Perception was assessed physiologically through event-related potentials (ERPs) recorded by EEG in a multifeature mismatch negativity paradigm and behaviorally through a discrimination task. Temporal and spectral features of the ERPs during speech perception were analyzed, and speech production was quantitatively evaluated using speech motor maximum performance tasks. Proximal to stimulus onset, children with HI displayed a difference in map topography, indicating diminished statistical learning. In later ERP components, children with HI exhibited reduced amplitudes in the N2 and early parts of the late discriminative negativity components specifically, which are associated with temporal and spectral control mechanisms. Abnormalities of speech perception were only subtly reflected in speech production, as the lone difference found in speech production studies was a mild delay in regulating speech intensity. In addition to previously reported deficits of sound-feature discriminations, the present study results reflect diminished statistical learning in children with HI, which plays an early and important, but so far neglected, role in phonological processing. Furthermore, the lack of corresponding behavioral abnormalities in speech production implies that impaired perceptual capacities do not necessarily translate into productive deficits.

  18. A Case Study in Elementary Statistics: The Florida Panther Population

    Science.gov (United States)

    Lazowski, Andrew; Stopper, Geffrey

    2013-01-01

    We describe a case study that was created to intertwine the fields of biology and mathematics. This project is given in an elementary probability and statistics course for non-math majors. Some goals of this case study include: to expose students to biology in a math course, to apply probability to real-life situations, and to display how far a…

  19. Open Access!: Review of Online Statistics: An Interactive Multimedia Course of Study by David Lane

    Directory of Open Access Journals (Sweden)

    Samuel L. Tunstall

    2016-01-01

    Full Text Available David M. Lane (project leader). Online Statistics Education: An Interactive Multimedia Course of Study (http://onlinestatbook.com/). Also: David M. Lane (primary author and editor), with David Scott, Mikki Hebl, Rudy Guerra, Dan Osherson, and Heidi Zimmer. Introduction to Statistics. Online edition (http://onlinestatbook.com/Online_Statistics_Education.pdf), 694 pp. It is rare that students receive high-quality textbooks for free, but David Lane's Online Statistics: An Interactive Multimedia Course of Study permits precisely that. This review gives an overview of the many features in Lane's online textbook, including the Java Applets, the textbook itself, and the resources available for instructors. A discussion of uses of the site, as well as a comparison of the text to alternative online statistics textbooks, is included.

  20. Statistical inferences under the Null hypothesis: Common mistakes and pitfalls in neuroimaging studies.

    Directory of Open Access Journals (Sweden)

    Jean-Michel eHupé

    2015-02-01

    Full Text Available Published studies using functional and structural MRI include many errors in the way data are analyzed and conclusions reported. This was observed while working on a comprehensive review of the neural bases of synesthesia, but these errors are probably endemic to neuroimaging studies. All studies reviewed had based their conclusions on Null Hypothesis Significance Tests (NHST). NHST have been criticized since their inception because they are more appropriate for taking decisions related to a Null hypothesis (as in manufacturing) than for making inferences about behavioral and neuronal processes. Here I focus on a few key problems of NHST related to brain imaging techniques, and explain why or when we should not rely on significance tests. I also observed that, often, the ill-posed logic of NHST was not even correctly applied, and I describe what I identified as common mistakes or at least problematic practices in published papers, in light of what could be considered the very basics of statistical inference. MRI statistics also involve much more complex issues than standard statistical inference. Analysis pipelines vary greatly between studies, even for those using the same software, and there is no consensus on which pipeline is best. I propose a synthetic view of the logic behind the possible methodological choices, and warn against the usage and interpretation of two statistical methods popular in brain imaging studies, the false discovery rate (FDR) procedure and permutation tests. I suggest that current models for the analysis of brain imaging data suffer from serious limitations and call for a revision taking into account the "new statistics" (confidence intervals) logic.
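    For reference, the FDR procedure criticized here is, in its most common form, the Benjamini-Hochberg step-up rule, which can be sketched in a few lines (the p values below are illustrative):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Boolean rejection mask under the Benjamini-Hochberg step-up FDR rule."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m        # q * i / m for the i-th smallest p
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])         # largest i with p_(i) <= q*i/m
        reject[order[: k + 1]] = True            # reject that p and all smaller ones
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
mask = benjamini_hochberg(pvals, q=0.05)
```

    Note the step-up character: a p value can be rejected even though it exceeds q·i/m for its own rank, as long as some larger-ranked p value falls below its threshold.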

  1. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    Science.gov (United States)

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.
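    The U-statistic method mentioned above builds on the fact that the empirical AUC is itself a two-sample U-statistic (the Mann-Whitney form). A minimal sketch with illustrative scores, not the Roe-Metz simulation model of the paper:

```python
import numpy as np

def auc_u_statistic(scores_pos, scores_neg):
    """Empirical AUC as a two-sample U-statistic; ties count one half."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return float(np.mean((pos > neg) + 0.5 * (pos == neg)))

# Hypothetical reader confidence scores for diseased and non-diseased cases
pos = [0.9, 0.8, 0.7, 0.6]
neg = [0.5, 0.4, 0.7, 0.2]
auc = auc_u_statistic(pos, neg)   # 14.5 concordant half-pairs out of 16
```

    Averaging such per-reader AUCs, and estimating their covariances, is the starting point for the multi-reader variance analyses compared in the paper.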

  2. [Pro Familia statistics for 1974].

    Science.gov (United States)

    1975-09-01

    Statistics for 1974 for the West German family planning organization Pro Familia are reported. 56 offices are now operating, and 23,726 clients were seen. Men were seen more frequently than previously. 10,000 telephone calls were also handled. 16-25 year olds were increasingly represented in the clientele, as were unmarried persons of all ages. 1,242 patients were referred to physicians or clinics for clinical diagnosis.

  3. Statistical methods for elimination of guarantee-time bias in cohort studies: a simulation study

    Directory of Open Access Journals (Sweden)

    In Sung Cho

    2017-08-01

    Full Text Available Abstract Background Aspirin has been considered to be beneficial in preventing cardiovascular diseases and cancer. Several pharmaco-epidemiology cohort studies have shown protective effects of aspirin on diseases using various statistical methods, with the Cox regression model being the most commonly used approach. However, there are some inherent limitations to the conventional Cox regression approach, such as guarantee-time bias, resulting in an overestimation of the drug effect. To overcome such limitations, alternative approaches, such as the time-dependent Cox model and landmark methods, have been proposed. This study aimed to compare the performance of three methods: Cox regression, the time-dependent Cox model, and the landmark method with different landmark times, in order to address the problem of guarantee-time bias. Methods Through statistical modeling and simulation studies, the performance of the above three methods was assessed in terms of type I error, bias, power, and mean squared error (MSE). In addition, the three statistical approaches were applied to a real data example from the Korean National Health Insurance Database. The effect of cumulative rosiglitazone dose on the risk of hepatocellular carcinoma was used as an example for illustration. Results In the simulated data, time-dependent Cox regression outperformed the landmark method in terms of bias and mean squared error, but the type I error rates were similar. The results from the real-data example showed the same patterns as the simulation findings. Conclusions While both the time-dependent Cox regression model and landmark analysis are useful in resolving the problem of guarantee-time bias, time-dependent Cox regression is the most appropriate method for analyzing cumulative dose effects in pharmaco-epidemiological studies.
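    The landmark method's data-preparation step can be sketched as follows: fix a landmark time, drop subjects whose event occurs before it, and classify exposure by status at the landmark (so exposure starting later cannot retroactively "immortalize" earlier follow-up). The subject records and helper below are hypothetical and ignore the censoring details a real analysis would handle:

```python
def landmark_dataset(subjects, landmark):
    """Keep subjects still event-free at the landmark; classify exposure by
    whether treatment started on or before the landmark time."""
    out = []
    for s in subjects:
        if s["event_time"] <= landmark:          # event before landmark: excluded
            continue
        exposed = s["treat_start"] is not None and s["treat_start"] <= landmark
        out.append({"id": s["id"], "exposed": exposed,
                    "time": s["event_time"] - landmark})   # clock restarts at landmark
    return out

subjects = [
    {"id": 1, "treat_start": 0.5, "event_time": 4.0},
    {"id": 2, "treat_start": 3.0, "event_time": 5.0},  # starts after landmark: unexposed
    {"id": 3, "treat_start": None, "event_time": 1.0}, # event before landmark: dropped
    {"id": 4, "treat_start": None, "event_time": 6.0},
]
cohort = landmark_dataset(subjects, landmark=2.0)
```

    A standard Cox model can then be fit to the resulting cohort; the time-dependent Cox model instead lets the exposure covariate switch on at the actual treatment start.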

  4. Statistical testing of association between menstruation and migraine.

    Science.gov (United States)

    Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G

    2015-02-01

    To repair and refine a previously proposed method for statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to be through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods aiming to exclude spurious associations are needed, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To the best of our knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operating characteristic curve analysis. Quick-reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended with at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
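    The core computation, Fisher's exact test with a mid-p correction on a 2 × 2 diary table, can be sketched by enumerating the hypergeometric distribution over tables with fixed margins. The counts are hypothetical, and the two-sided rule (ordering tables by probability) is an assumption here, as the abstract does not spell it out:

```python
from scipy.stats import hypergeom

def fisher_midp(table):
    """Two-sided Fisher exact test with a mid-p correction on a 2x2 table.
    Tables with the same margins are ordered by hypergeometric probability."""
    (a, b), (c, d) = table
    total, row1, col1 = a + b + c + d, a + b, a + c
    support = range(max(0, row1 + col1 - total), min(row1, col1) + 1)
    probs = [hypergeom.pmf(k, total, col1, row1) for k in support]
    p_obs = hypergeom.pmf(a, total, col1, row1)
    # mid-p: full weight for strictly more extreme tables, half weight for ties
    midp = sum(p for p in probs if p < p_obs - 1e-12)
    midp += 0.5 * sum(p for p in probs if abs(p - p_obs) <= 1e-12)
    return midp

# Hypothetical diary counts: attacks inside vs. outside perimenstrual windows,
# cross-classified against comparison days
midp = fisher_midp([[8, 2], [3, 10]])
```

    The mid-p version is slightly less conservative than the ordinary exact test because the observed table itself contributes only half its probability.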

  5. Learning Statistics at the Farmers Market? A Comparison of Academic Service Learning and Case Studies in an Introductory Statistics Course

    Science.gov (United States)

    Hiedemann, Bridget; Jones, Stacey M.

    2010-01-01

    We compare the effectiveness of academic service learning to that of case studies in an undergraduate introductory business statistics course. Students in six sections of the course were assigned either an academic service learning project (ASL) or business case studies (CS). We examine two learning outcomes: students' performance on the final…

  6. Detailed statistical analysis plan for the target temperature management after out-of-hospital cardiac arrest trial

    DEFF Research Database (Denmark)

    Nielsen, Niklas; Winkel, Per; Cronberg, Tobias

    2013-01-01

    Animal experimental studies and previous randomized trials suggest an improvement in mortality and neurological function with temperature regulation to hypothermia after cardiac arrest. According to a systematic review, previous trials were small, had a risk of bias, evaluated select populations, and did not treat hyperthermia in the control groups. The optimal target temperature management (TTM) strategy is not known. To prevent outcome reporting bias, selective reporting and data-driven results, we present the a priori defined detailed statistical analysis plan as an update to the previously

  7. Facts about Newspapers '85: A Statistical Summary of the Newspaper Business.

    Science.gov (United States)

    American Newspaper Publishers Association, Washington, DC.

    A statistical summary of the newspaper industry for 1984 and previous years is presented in this brochure. Focusing primarily on the United States newspaper industry, the brochure also contains some information on Canadian newspapers. The brochure presents statistics in the following categories: (1) number of daily newspapers, (2) daily newspaper…

  8. Previously unidentified changes in renal cell carcinoma gene expression identified by parametric analysis of microarray data

    International Nuclear Information System (INIS)

    Lenburg, Marc E; Liou, Louis S; Gerry, Norman P; Frampton, Garrett M; Cohen, Herbert T; Christman, Michael F

    2003-01-01

    Renal cell carcinoma is a common malignancy that often presents as a metastatic disease for which there are no effective treatments. To gain insights into the mechanism of renal cell carcinogenesis, a number of genome-wide expression profiling studies have been performed. Surprisingly, there is very poor agreement among these studies as to which genes are differentially regulated. To better understand this lack of agreement, we profiled renal cell tumor gene expression using genome-wide microarrays (45,000 probe sets) and compared our analysis to previous microarray studies. We hybridized total RNA isolated from renal cell tumors and adjacent normal tissue to Affymetrix U133A and U133B arrays. We removed samples with technical defects and removed probesets that failed to exhibit sequence-specific hybridization in any of the samples. We detected differential gene expression in the resulting dataset with parametric methods and identified keywords that are overrepresented in the differentially expressed genes with the Fisher exact test. We identify 1,234 genes that are more than three-fold changed in renal tumors by t-test, 800 of which have not been previously reported to be altered in renal cell tumors. Of the only 37 genes that have been identified as being differentially expressed in three or more of five previous microarray studies of renal tumor gene expression, our analysis finds 33 of these genes (89%). A key to the sensitivity and power of our analysis is filtering out defective samples and genes that are not reliably detected. The widespread use of sample-wise voting schemes for detecting differential expression that do not control for false positives likely account for the poor overlap among previous studies. Among the many genes we identified using parametric methods that were not previously reported as being differentially expressed in renal cell tumors are several oncogenes and tumor suppressor genes that likely play important roles in renal cell
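    The parametric detection step described (per-gene t-tests combined with a fold-change filter) can be sketched on synthetic data; all numbers below are illustrative stand-ins, not the study's data or thresholds:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
# Hypothetical log2-scale expression: 100 genes x (6 tumor, 6 normal) arrays
tumor = rng.normal(0.0, 0.5, size=(100, 6))
normal = rng.normal(0.0, 0.5, size=(100, 6))
tumor[:5] += 2.5                     # five genes truly up-regulated

t_stat, p = ttest_ind(tumor, normal, axis=1)
fold = 2.0 ** (tumor.mean(axis=1) - normal.mean(axis=1))
# "differentially expressed": parametric test plus a three-fold-change filter
hits = np.where((p < 0.01) & ((fold > 3.0) | (fold < 1.0 / 3.0)))[0]
```

    Unlike sample-wise voting, the per-gene p values make the false-positive rate explicit and can feed directly into a multiple-testing correction.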

  9. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

  10. Statistical Analysis of Large Simulated Yield Datasets for Studying Climate Effects

    Science.gov (United States)

    Makowski, David; Asseng, Senthold; Ewert, Frank; Bassu, Simona; Durand, Jean-Louis; Martre, Pierre; Adam, Myriam; Aggarwal, Pramod K.; Angulo, Carlos; Baron, Christian; et al.

    2015-01-01

    Many studies have been carried out during the last decade to study the effect of climate change on crop yields and other key crop characteristics. In these studies, one or several crop models were used to simulate crop growth and development for different climate scenarios that correspond to different projections of atmospheric CO2 concentration, temperature, and rainfall changes (Semenov et al., 1996; Tubiello and Ewert, 2002; White et al., 2011). The Agricultural Model Intercomparison and Improvement Project (AgMIP; Rosenzweig et al., 2013) builds on these studies with the goal of using an ensemble of multiple crop models in order to assess effects of climate change scenarios for several crops in contrasting environments. These studies generate large datasets, including thousands of simulated crop yield data. They include series of yield values obtained by combining several crop models with different climate scenarios that are defined by several climatic variables (temperature, CO2, rainfall, etc.). Such datasets potentially provide useful information on the possible effects of different climate change scenarios on crop yields. However, it is sometimes difficult to analyze these datasets and to summarize them in a useful way due to their structural complexity; simulated yield data can differ among contrasting climate scenarios, sites, and crop models. Another issue is that it is not straightforward to extrapolate the results obtained for the scenarios to alternative climate change scenarios not initially included in the simulation protocols. Additional dynamic crop model simulations for new climate change scenarios are an option but this approach is costly, especially when a large number of crop models are used to generate the simulated data, as in AgMIP. 
Statistical models have been used to analyze responses of measured yield data to climate variables in past studies (Lobell et al., 2011), but the use of a statistical model to analyze yields simulated by complex

  11. Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology

    Directory of Open Access Journals (Sweden)

    Michèle B. Nuijten

    2017-12-01

    Full Text Available In this paper, we present three retrospective observational studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of statistical significance (Wicherts, Bakker, & Molenaar, 2011). We therefore hypothesized that journal policies about data sharing and data sharing itself would reduce these inconsistencies. In Study 1, we compared the prevalence of reporting inconsistencies in two similar journals on decision making with different data sharing policies. In Study 2, we compared reporting inconsistencies in psychology articles published in PLOS journals (with a data sharing policy) and Frontiers in Psychology (without a stipulated data sharing policy). In Study 3, we looked at papers published in the journal Psychological Science to check whether papers with or without an Open Practice Badge differed in the prevalence of reporting errors. Overall, we found no relationship between data sharing and reporting inconsistencies. We did find that journal policies on data sharing seem extremely effective in promoting data sharing. We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.
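    The kind of reporting inconsistency counted in these studies can be detected automatically by recomputing a p value from the reported test statistic and degrees of freedom. A sketch for t tests; the tolerance and the example values are illustrative, not the rules of any particular checking tool:

```python
from scipy.stats import t as t_dist

def check_t_report(t_value, df, reported_p, tol=0.01):
    """Recompute the two-sided p from a reported t statistic and flag a
    reporting inconsistency if it disagrees with the reported p."""
    recomputed = 2 * t_dist.sf(abs(t_value), df)
    return recomputed, abs(recomputed - reported_p) > tol

# "t(28) = 2.20, p = .04": consistent within the tolerance
p1, bad1 = check_t_report(2.20, 28, 0.04)
# "t(28) = 1.20, p = .03": inconsistent (the recomputed p is far larger)
p2, bad2 = check_t_report(1.20, 28, 0.03)
```

    Real checkers add parsing of reported strings, rounding-aware tolerances, and special handling of one-sided tests, but the recomputation step is exactly this.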

  12. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  13. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview...
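    As a reminder of the McMC machinery credited here, a random-walk Metropolis sampler fits in a few lines. The standard-normal target below is a toy stand-in for a posterior over a single genetic parameter; names and settings are illustrative:

```python
import numpy as np

def metropolis(logp, x0, steps=20000, step_size=1.0, seed=42):
    """Random-walk Metropolis: propose a jiggle, accept with probability
    min(1, p(proposal)/p(current)); the chain's samples approximate p."""
    rng = np.random.default_rng(seed)
    x, lp = float(x0), logp(x0)
    samples = np.empty(steps)
    for i in range(steps):
        prop = x + step_size * rng.normal()
        lp_prop = logp(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

# Toy target: standard normal "posterior" (log-density up to a constant)
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
post_mean = draws[2000:].mean()   # discard burn-in, then summarize
post_var = draws[2000:].var()
```

    The same loop, with the log-posterior of a full quantitative-genetic model plugged in (usually updated coordinate-wise), is what made the previously intractable inferences listed above routine.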

  14. Statistical considerations on safety analysis

    International Nuclear Information System (INIS)

    Pal, L.; Makai, M.

    2004-01-01

    The authors have investigated the statistical methods applied to safety analysis of nuclear reactors and arrived at alarming conclusions: a series of calculations with the generally appreciated safety code ATHLET was carried out to ascertain the stability of the results against input uncertainties in a simple experimental situation. Scrutinizing those calculations, we came to the conclusion that the ATHLET results may exhibit chaotic behavior. A further conclusion is that the technological limits are incorrectly set when the output variables are correlated. Another formerly unnoticed conclusion of the previous ATHLET calculations is that certain innocent-looking parameters (like the wall roughness factor, the number of bubbles per unit volume, or the number of droplets per unit volume) can considerably influence such output parameters as water levels. The authors are concerned with the statistical foundation of present-day safety analysis practices and can only hope that their own misjudgment will be dispelled. Until then, the authors suggest applying correct statistical methods in safety analysis even if it makes the analysis more expensive. It would be desirable to continue exploring the role of internal parameters (wall roughness factor, steam-water surface in thermal hydraulics codes, homogenization methods in neutronics codes) in system safety codes and to study their effects on the analysis. In the validation and verification process of a code one carries out a series of computations. The input data are not precisely determined because measured data have errors, and calculated data are often obtained from a more or less accurate model. Some users of large codes are content with comparing the nominal output obtained from the nominal input, whereas all the possible inputs should be taken into account when judging safety. At the same time, any statement concerning safety must be aleatory, and its merit can be judged only when the probability is known with which the

  15. The effects of previous open renal stone surgery types on PNL outcomes.

    Science.gov (United States)

    Ozgor, Faruk; Kucuktopcu, Onur; Ucpinar, Burak; Sarilar, Omer; Erbin, Akif; Yanaral, Fatih; Sahan, Murat; Binbay, Murat

    2016-01-01

Our aim was to demonstrate the effect of incision of the renal parenchyma during open renal stone surgery (ORSS) on percutaneous nephrolithotomy (PNL) outcomes. Patients with a history of ORSS who underwent a PNL operation between June 2005 and June 2015 were analyzed retrospectively. Patients were divided into two groups according to the type of previous ORSS. Patients who had a history of ORSS with parenchymal incision, such as radial nephrotomies, anatrophic nephrolithotomy, lower pole resection, and partial nephrectomy, were included in Group 1. Other patients, with a history of open pyelolithotomy, were enrolled in Group 2. Preoperative characteristics, perioperative data, stone-free status, and complications were compared between the groups. Stone-free status was defined as complete clearance of stone(s) or the presence of residual fragments smaller than 4 mm. The retrospective nature of our study, the different experience levels of surgeons, and the lack of evaluation of anesthetic agents and cost of procedures were limitations of our study. 123 and 111 patients were enrolled in Groups 1 and 2, respectively. Preoperative characteristics were similar between groups. In Group 1, the mean operative time was statistically longer than in Group 2 (p=0.013). Stone-free status was significantly higher in Group 2 than in Group 1 (p=0.027). Complication rates were similar between groups. Hemorrhage requiring blood transfusion was the most common complication in both groups (10.5% vs. 9.9%). Our study demonstrated that a history of previous ORSS with parenchymal incision significantly reduces the success rate of the PNL procedure.

  16. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a first-degree homogeneous function of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamic relations as those stemming from the Boltzmann-Gibbs statistics in this limit.

  17. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    Science.gov (United States)

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

Breast cancer is a menacing cancer, primarily affecting women. Continuous research is ongoing on detecting breast cancer at an early stage, as the possibility of a cure is high in the early stages. This study has two main objectives: first, to establish statistics for breast cancer, and second, to find methodologies that can be helpful for early-stage detection of breast cancer, based on previous studies. Breast cancer incidence and mortality statistics for the UK, US, India and Egypt were considered for this study. The findings of this study show that the overall mortality rates of the UK and US have improved because of awareness, improved medical technology and screening, but in the case of India and Egypt the situation is less positive because of a lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms. It provides a strong bridge toward improving the classification and detection accuracy of breast cancer data.

  18. Online Statistical Modeling (Regression Analysis) for Independent Responses

    Science.gov (United States)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of complex relationships in data. A rich variety of advanced and recent statistical modelling approaches is available in open source software (one of them being R). However, these advanced statistical modelling tools are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny), so that the most recent and advanced statistical modelling approaches are readily available, accessible and applicable on the web. We have previously built interfaces in the form of e-tutorials for several modern and advanced statistical modelling approaches in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including modelling using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare, in order to find the most appropriate model for the data.
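The record above concerns web front-ends to regression models (LM/GLM/GAM); as a minimal sketch of the simplest case they expose, the following fits an ordinary least-squares linear model via the normal equations. The data are synthetic and the coefficient values are invented for illustration.

```python
import numpy as np

# Synthetic data: true intercept 2, true slope 3, plus Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(0, 0.5, size=x.size)

# Design matrix with an intercept column, then the normal equations (X'X)b = X'y.
X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)
intercept, slope = beta
print(round(intercept, 2), round(slope, 2))
```

With 50 points and modest noise, the recovered coefficients land close to the true values; richer models (GLM, GAM) generalize this same fit to non-Gaussian responses and smooth terms.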

  19. Structure and statistics of turbulent flow over riblets

    Science.gov (United States)

    Henderson, R. D.; Crawford, C. H.; Karniadakis, G. E.

    1993-01-01

    In this paper we present comparisons of turbulence statistics obtained from direct numerical simulation of flow over streamwise aligned triangular riblets with experimental results. We also present visualizations of the instantaneous velocity field inside and around the riblet valleys. In light of the behavior of the statistics and flowfields inside the riblet valleys, we investigate previously reported physical mechanisms for the drag reducing effect of riblets; our results here support the hypothesis of flow anchoring by the riblet valleys and the corresponding inhibition of spanwise flow motions.

  20. Statistical study of foreshock cavitons

    Directory of Open Access Journals (Sweden)

    P. Kajdič

    2013-12-01

Full Text Available In this work we perform a statistical analysis of 92 foreshock cavitons observed with the Cluster spacecraft 1 during the period 2001–2006. We analyze time intervals during which the spacecraft was located in the Earth's foreshock with durations longer than 10 min. Together these amount to ~ 50 days. The cavitons are transient structures in the Earth's foreshock. Their main signatures in the data include simultaneous depletions of the magnetic field intensity and plasma density, which are surrounded by a rim of enhanced values of these two quantities. Cavitons form due to nonlinear interaction of transverse and compressive ultra-low frequency (ULF) waves and are therefore always surrounded by intense compressive ULF fluctuations. They are carried by the solar wind towards the bow shock. This work represents the first systematic study of a large sample of foreshock cavitons. We find that cavitons appear for a wide range of solar wind and interplanetary magnetic field conditions and are therefore a common feature upstream of Earth's quasi-parallel bow shock, with an average occurrence rate of ~ 2 events per day. We also discuss their observational properties in the context of other known upstream phenomena and show that the cavitons are a distinct structure in the foreshock.

  1. The use of statistics in real and simulated investigations performed by undergraduate health sciences' students

    OpenAIRE

    Pimenta, Rui; Nascimento, Ana; Vieira, Margarida; Costa, Elísio

    2010-01-01

    In previous works, we evaluated the statistical reasoning ability acquired by health sciences’ students carrying out their final undergraduate project. We found that these students achieved a good level of statistical literacy and reasoning in descriptive statistics. However, concerning inferential statistics the students did not reach a similar level. Statistics educators therefore claim for more effective ways to learn statistics such as project based investigations. These can be simulat...

  2. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores

  3. Association of Aortic Valve Sclerosis with Previous Coronary Artery Disease and Risk Factors

    Directory of Open Access Journals (Sweden)

    Filipe Carvalho Marmelo

    2014-11-01

Full Text Available Background: Aortic valve sclerosis (AVS) is characterized by increased thickness, calcification and stiffness of the aortic leaflets without fusion of the commissures. Several studies show an association between AVS and the presence of coronary artery disease. Objective: The aim of this study is to investigate the association between the presence of AVS and the occurrence of previous coronary artery disease and classical risk factors. Methods: The sample was composed of 2,493 individuals who underwent transthoracic echocardiography between August 2011 and December 2012. The mean age of the cohort was 67.5 ± 15.9 years, and 50.7% were female. Results: The most frequent clinical indication for Doppler echocardiography was the presence of stroke (28.8%), and the most common risk factor was hypertension (60.8%). The most prevalent pathological findings on Doppler echocardiography were mitral valve sclerosis (37.1%) and AVS (36.7%). There was a statistically significant association between AVS and hypertension (p < 0.001), myocardial infarction (p = 0.007), diabetes (p = 0.006) and compromised left ventricular systolic function (p < 0.001). Conclusion: Patients with AVS have higher prevalences of hypertension, stroke, hypercholesterolemia, myocardial infarction, diabetes and compromised left ventricular systolic function when compared with patients without AVS. We conclude that there is an association between the presence of AVS and previous coronary artery disease and classical risk factors.

  4. Statistical Indicators for Religious Studies: Indicators of Level and Structure

    Science.gov (United States)

    Herteliu, Claudiu; Isaic-Maniu, Alexandru

    2009-01-01

Using statistical indicators as vectors of information about the operational status of a phenomenon, including a religious one, is unanimously accepted. By introducing a system of statistical indicators we can also analyze the interfacing areas of a phenomenon. In this context, we have elaborated a system of statistical indicators specific to the…

  5. Statistical process control using optimized neural networks: a case study.

    Science.gov (United States)

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced, based on the cuckoo optimization algorithm (COA), to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Statistical methods of parameter estimation for deterministically chaotic time series

    Science.gov (United States)

    Pisarenko, V. F.; Sornette, D.

    2004-03-01

We discuss the possibility of applying some standard statistical methods (the least-squares method, the maximum likelihood method, and the method of statistical moments for estimation of parameters) to a deterministically chaotic low-dimensional dynamical system (the logistic map) containing observational noise. A “segmentation fitting” maximum likelihood (ML) method is suggested to estimate the structural parameter of the logistic map along with the initial value x1, considered as an additional unknown parameter. The segmentation fitting method, called “piece-wise” ML, is similar in spirit to, but simpler than and with smaller bias than, the previously proposed “multiple shooting” method. Comparisons with different previously proposed techniques on simulated numerical examples give favorable results (at least for the investigated combinations of sample size N and noise level). Moreover, unlike some suggested techniques, our method does not require a priori knowledge of the noise variance. We also clarify the nature of the inherent difficulties in the statistical analysis of deterministically chaotic time series and the status of previously proposed Bayesian approaches. We note the trade-off between the need to use a large number of data points in the ML analysis to decrease the bias (to guarantee consistency of the estimation) and the unstable nature of dynamical trajectories, with exponentially fast loss of memory of the initial condition. The method of statistical moments for the estimation of the parameter of the logistic map is discussed. This method seems to be the only method whose consistency for deterministically chaotic time series has so far been proved theoretically (not only numerically).
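The record above concerns estimating the structural parameter of the noisy logistic map. As a hedged sketch (a brute-force one-step least-squares grid search, not the paper's piece-wise ML method), the following recovers the parameter from a short noisy series; the parameter values and noise level are invented for illustration.

```python
import numpy as np

def logistic_series(a, x1, n):
    """Iterate the logistic map x_{k+1} = a * x_k * (1 - x_k)."""
    x = np.empty(n)
    x[0] = x1
    for i in range(n - 1):
        x[i + 1] = a * x[i] * (1.0 - x[i])
    return x

rng = np.random.default_rng(1)
true_a = 3.8                                   # chaotic regime
clean = logistic_series(true_a, 0.3, 30)
noisy = clean + rng.normal(0, 0.01, size=clean.size)  # observational noise

# Minimize the one-step prediction error over a grid of candidate parameters.
grid = np.linspace(3.5, 4.0, 501)
sse = [np.sum((noisy[1:] - a * noisy[:-1] * (1 - noisy[:-1])) ** 2) for a in grid]
a_hat = grid[int(np.argmin(sse))]
print(a_hat)
```

With small noise the one-step estimator lands near the true parameter; the paper's point is that more care (e.g. segmentation fitting) is needed as noise grows and trajectories lose memory of the initial condition.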

  7. Monitoring and Evaluation; Statistical Support for Life-cycle Studies, 2003 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Skalski, John

    2003-12-01

This report summarizes the statistical analysis and consulting activities performed under Contract No. 00004134, Project No. 199105100, funded by the Bonneville Power Administration during 2003. These efforts are focused on providing real-time predictions of outmigration timing, assessment of life-history performance measures, evaluation of status and trends in recovery, and guidance on the design and analysis of Columbia Basin fish and wildlife monitoring and evaluation studies. The overall objective of the project is to provide BPA and the rest of the fisheries community with statistical guidance on the design, analysis, and interpretation of monitoring data, which will lead to improved monitoring and evaluation of salmonid mitigation programs in the Columbia/Snake River Basin. This overall goal is being accomplished by making fisheries data readily available for public scrutiny, providing statistical guidance on the design and analyses of studies through hands-on support and written documents, and providing real-time analyses of tagging results during the smolt outmigration for review by decision makers. For a decade, this project has been providing in-season projections of smolt outmigration timing to assist in spill management. As many as 50 different fish stocks at 8 different hydroprojects are tracked in real time to predict the 'percent of run to date' and the 'date to specific percentile'. The project also conducts added-value analyses of historical tagging data to understand relationships between fish responses, environmental factors, and anthropogenic effects. The statistical analysis of historical tagging data crosses agency lines in order to assimilate information on salmon population dynamics irrespective of origin. The lessons learned from past studies are used to improve the design and analyses of future monitoring and evaluation efforts. Through these efforts, the project attempts to provide the fisheries community with reliable analyses.

  8. A statistical study of the upstream intermediate ion boundary in the Earth's foreshock

    Directory of Open Access Journals (Sweden)

    K. Meziane

Full Text Available A statistical investigation of the location of onset of intermediate and gyrating ion populations in the Earth's foreshock is presented, based on Fixed Voltage Analyzer data from ISEE 1. This study reveals the existence of a spatial boundary for intermediate and gyrating ion populations that coincides with the reported ULF wave boundary. This boundary position in the Earth's foreshock depends strongly upon the magnetic cone angle θBX and appears well defined for relatively large cone angles, though not for small cone angles. As reported in a previous study of the ULF wave boundary, the position of the intermediate-gyrating ion boundary is not compatible with a fixed growth rate of the waves resulting from the interaction between a uniform beam and the ambient plasma. The present work examines the momentum associated with protons which travel along this boundary, and we show that the variation of the boundary position (or equivalently, the associated particle momentum) with the cone angle is related to classical acceleration mechanisms at the bow shock surface. The same functional behavior as a function of the cone angle is obtained for the momentum predicted by an acceleration model and for the particle momentum associated with the boundary. However, the model systematically predicts larger values of the momentum than the observation-related values by a constant amount; we suggest that this difference may be due to some momentum exchange between the incident solar-wind population and the backstreaming particles through a wave-particle interaction resulting from a beam-plasma instability.

    Key words. Intermediate ion boundary · Statistical investigation · Earth's foreshock · ISEE 1 spacecraft

  9. Connecting functional and statistical definitions of genotype by genotype interactions in coevolutionary studies

    Directory of Open Access Journals (Sweden)

    Katy Denise Heath

    2014-04-01

Full Text Available Predicting how species interactions evolve requires that we understand the mechanistic basis of coevolution, and thus the functional genotype-by-genotype interactions (G × G) that drive reciprocal natural selection. Theory on host-parasite coevolution provides testable hypotheses for empiricists, but depends upon models of functional G × G that remain loosely tethered to the molecular details of any particular system. In practice, reciprocal cross-infection studies are often used to partition the variation in infection or fitness in a population that is attributable to G × G (statistical G × G). Here we use simulations to demonstrate that within-population statistical G × G likely tells us little about the existence of coevolution, its strength, or the genetic basis of functional G × G. Combined with studies of multiple populations or points in time, mapping and molecular techniques can bridge the gap between natural variation and mechanistic models of coevolution, while model-based statistics can formally confront coevolutionary models with cross-infection data. Together these approaches provide a robust framework for inferring the infection genetics underlying statistical G × G, helping unravel the genetic basis of coevolution.

  10. Statistical Study of False Alarms of Geomagnetic Storms

    DEFF Research Database (Denmark)

    Leer, Kristoffer; Vennerstrøm, Susanne; Veronig, A.

Coronal Mass Ejections (CMEs) are known to cause geomagnetic storms on Earth. However, not all CMEs will trigger geomagnetic storms, even if they are heading towards the Earth. In this study, front side halo CMEs with speed larger than 500 km/s have been identified from the SOHO LASCO catalogue. A subset of these halo CMEs did not cause a geomagnetic storm in the following four days and have therefore been considered as false alarms. The properties of these events are investigated and discussed here. Their statistics are compared to the geo-effective CMEs. The ability to identify potential false…

  11. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
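The review above walks through diagnostic-test measures such as sensitivity, specificity, accuracy, and likelihood ratios. A minimal sketch of those definitions, computed from a hypothetical 2×2 confusion table (the counts below are invented for illustration):

```python
# Hypothetical counts from a 2x2 confusion table for a diagnostic test.
tp, fn = 90, 10    # diseased patients: test positive / test negative
tn, fp = 80, 20    # healthy patients:  test negative / test positive

sensitivity = tp / (tp + fn)               # P(test+ | disease present)
specificity = tn / (tn + fp)               # P(test- | disease absent)
accuracy = (tp + tn) / (tp + tn + fp + fn) # overall fraction correct
pos_lr = sensitivity / (1 - specificity)   # positive likelihood ratio

print(sensitivity, specificity, accuracy, pos_lr)
```

Here the test has sensitivity 0.90 and specificity 0.80, so a positive result multiplies the pre-test odds of disease by a likelihood ratio of 4.5.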

  12. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    Science.gov (United States)

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  13. Applied Bayesian statistical studies in biology and medicine

    CERN Document Server

    D’Amore, G; Scalfari, F

    2004-01-01

It was written on another occasion that "It is apparent that the scientific culture, if one means production of scientific papers, is growing exponentially, and chaotically, in almost every field of investigation". The biomedical sciences sensu lato and mathematical statistics are no exceptions. One might say then, and with good reason, that another collection of biostatistical papers would only add to the overflow and cause even more confusion. Nevertheless, this book may be greeted with some interest if we state that most of the papers in it are the result of a collaboration between biologists and statisticians, and partly the product of the Summer School "Statistical Inference in Human Biology", which reaches its 10th edition in 2003 (information about the School can be obtained at the Web site http://www2.stat.unibo.it/eventi/Sito%20scuola/index.htm). This is rather important. Indeed, it is common experience - and not only in Italy - that encounters between statisticians and researchers are spora...

  14. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done through various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and to use formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in terms of statistical reasoning skill. The observation was done by analyzing the results of the misconception test and the statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken a descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If the minimum value for meeting the standard achievement of course competence is 65, the students' mean values are lower than the standard competence. The results of the students' misconception study emphasized which sub-topics should be reconsidered. Based on the assessment results, it was found that students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, 3) determining the concept to be used in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  15. Previous induced abortion among young women seeking abortion-related care in Kenya: a cross-sectional analysis.

    Science.gov (United States)

    Kabiru, Caroline W; Ushie, Boniface A; Mutua, Michael M; Izugbara, Chimaraoke O

    2016-05-14

Unsafe abortion is a leading cause of death among young women aged 10-24 years in sub-Saharan Africa. Although having multiple induced abortions may exacerbate the risk for poor health outcomes, there has been minimal research on young women in this region who have multiple induced abortions. The objective of this study was therefore to assess the prevalence and correlates of reporting a previous induced abortion among young females aged 12-24 years seeking abortion-related care in Kenya. We used data on 1,378 young women aged 12-24 years who presented for abortion-related care in 246 health facilities in a nationwide survey conducted in 2012. Socio-demographic characteristics, reproductive and clinical histories, and physical examination assessment data were collected from women during a one-month data collection period using an abortion case capture form. Nine percent (n = 98) of young women reported a previous induced abortion prior to the index pregnancy for which they were receiving care. Statistically significant differences by previous history of induced abortion were observed for area of residence, religion and occupation at the bivariate level. Urban dwellers and unemployed/other young women were more likely to report a previous induced abortion. A greater proportion of young women reporting a previous induced abortion stated that they were using a contraceptive method at the time of the index pregnancy (47 %) compared with those reporting no previous induced abortion (23 %). Not surprisingly, a greater proportion of young women reporting a previous induced abortion (82 %) reported their index pregnancy as unintended (not wanted at all or mistimed) compared with women reporting no previous induced abortion (64 %). Our study results show that about one in every ten young women seeking abortion-related care in Kenya reports a previous induced abortion. Comprehensive post-abortion care services targeting young women are needed. In particular, post

  16. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  17. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  18. The Reliability of Single Subject Statistics for Biofeedback Studies.

    Science.gov (United States)

    Bremner, Frederick J.; And Others

    To test the usefulness of single subject statistical designs for biofeedback, three experiments were conducted comparing biofeedback to meditation, and to a compound stimulus recognition task. In a statistical sense, this experimental design is best described as one experiment with two replications. The apparatus for each of the three experiments…

  19. Spatio-temporal dependencies between hospital beds, physicians and health expenditure using visual variables and data classification in statistical table

    Science.gov (United States)

    Medyńska-Gulij, Beata; Cybulski, Paweł

    2016-06-01

This paper analyses the use of visual variables in tables of statistical data on hospital beds as an important tool for revealing spatio-temporal dependencies. It is argued that some of the conclusions from the data about public health and public expenditure on health have a spatio-temporal reference. Differently from previous studies, this article adopts a combination of cartographic pragmatics and spatial visualization together with previous conclusions made in the public health literature. While significant conclusions about health care and economic factors have been highlighted in research papers, this article is the first to apply visual analysis to a statistical table together with maps, an approach called previsualisation.

  20. Spatio-temporal dependencies between hospital beds, physicians and health expenditure using visual variables and data classification in statistical table

    Directory of Open Access Journals (Sweden)

    Medyńska-Gulij Beata

    2016-06-01

    This paper analyses the use of table visual variables of statistical data of hospital beds as an important tool for revealing spatio-temporal dependencies. It is argued that some of the conclusions from the data about public health and public expenditure on health have a spatio-temporal reference. Different from previous studies, this article adopts a combination of cartographic pragmatics and spatial visualization with previous conclusions made in the public health literature. While significant conclusions about health care and economic factors have been highlighted in research papers, this article is the first to apply visual analysis to a statistical table together with maps, which is called previsualisation.

  1. Statistic rCBF study of extrapyramidal disorders

    Energy Technology Data Exchange (ETDEWEB)

    Kamei, Hiroshi; Nakajima, Takashi; Fukuhara, Nobuyoshi [National Saigata Hospital, Ogata, Niigata (Japan)

    2002-08-01

    We studied regional cerebral blood flow (rCBF) in 16 patients with Parkinson's disease (PD), 2 patients with dementia with Lewy bodies (DLB), 2 patients with progressive supranuclear palsy (PSP), 2 patients with striatonigral degeneration (SND), and 16 normal volunteers, using three-dimensional stereotactic surface projections (3D-SSP). Decreased rCBF in PD patients was shown in the posterior parietal and occipital cortex. Decreased rCBF in DLB was shown in the frontal, parietal and occipital cortex with relative sparing of the sensorimotor cortex. Decreased rCBF in PSP was shown in the frontal cortex. Decreased rCBF in SND was shown in the frontal cortex and cerebellum. Statistical rCBF analysis using 3D-SSP was a useful measure for the early differential diagnosis of extrapyramidal disorders. (author)

  2. Statistical study of ion pitch-angle distributions

    International Nuclear Information System (INIS)

    Sibeck, D.G.; Mcentire, R.W.; Lui, A.T.Y.; Krimigis, S.M.

    1987-01-01

    Preliminary results of a statistical study of energetic (34-50 keV) ion pitch-angle distributions (PADs) within 9 Re of earth provide evidence for an orderly pattern consistent with both drift-shell splitting and magnetopause shadowing. Normal ion PADs dominate the dayside and inner magnetosphere. Butterfly PADs typically occur in a narrow belt stretching from dusk to dawn through midnight, where they approach within 6 Re of earth. While those ion butterfly PADs that typically occur on closed drift paths are mainly caused by drift-shell splitting, there is also evidence for magnetopause shadowing in observations of more frequent butterfly PAD occurrence in the outer magnetosphere near dawn than dusk. Isotropic and gradient boundary PADs terminate the tailward extent of the butterfly ion PAD belt. 9 references

  3. Epilepsy and occupational accidents in Brazil: a national statistics study.

    Science.gov (United States)

    Lunardi, Mariana dos Santos; Soliman, Lucas Alexandre Pedrollo; Pauli, Carla; Lin, Katia

    2011-01-01

    Epilepsy may restrict the patient's daily life. It causes lower quality of life and increased risk for work-related accidents (WRA). The aim of this study is to analyze the implementation of the Epidemiologic and Technical Security System Nexus (ETSSN) and WRA patterns among patients with epilepsy. Data regarding WRA, between 1999 and 2008, on the historical database of the WRA Infolog Statistical Yearbook from the Brazilian Ministry of Social Security were reviewed. There was a significant increase of reported cases during the ten-year period, mainly after the establishment of the ETSSN. The increase in granted benefits evidenced the epidemiologic association between epilepsy and WRA. The ETSSN possibly raised the registration of occupational accidents and granted benefits. However, the real number of WRA may remain underestimated due to the informal economy and household workers' accidents, which are usually not included in the official statistics in Brazil.

  4. Statistical learning and auditory processing in children with music training: An ERP study.

    Science.gov (United States)

    Mandikal Vasuki, Pragati Rao; Sharma, Mridula; Ibrahim, Ronny; Arciuli, Joanne

    2017-07-01

    The question whether musical training is associated with enhanced auditory and cognitive abilities in children is of considerable interest. In the present study, we compared children with music training versus those without music training across a range of auditory and cognitive measures, including the ability to implicitly detect statistical regularities in input (statistical learning). Statistical learning of regularities embedded in auditory and visual stimuli was measured in musically trained and age-matched untrained children between the ages of 9 and 11 years. In addition to collecting behavioural measures, we recorded electrophysiological measures to obtain an online measure of segmentation during the statistical learning tasks. Musically trained children showed better performance on melody discrimination, rhythm discrimination, frequency discrimination, and auditory statistical learning. Furthermore, grand-averaged ERPs showed that triplet onset (initial stimulus) elicited larger responses in the musically trained children during both auditory and visual statistical learning tasks. In addition, children's music skills were associated with performance on auditory and visual behavioural statistical learning tasks. Our data suggest that individual differences in musical skills are associated with children's ability to detect regularities. The ERP data suggest that musical training is associated with better encoding of both auditory and visual stimuli. Although causality must be explored in further research, these results may have implications for developing music-based remediation strategies for children with learning impairments. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  5. Self-assessed performance improves statistical fusion of image labels

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Frederick W., E-mail: frederick.w.bryan@vanderbilt.edu; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Reich, Daniel S. [Translational Neuroradiology Unit, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland 20892 (United States); Landman, Bennett A. [Electrical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); Biomedical Engineering, Vanderbilt University, Nashville, Tennessee 37235 (United States); and Radiology and Radiological Sciences, Vanderbilt University, Nashville, Tennessee 37235 (United States)

    2014-03-15

    Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance
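
The contrast the authors evaluate between simple majority voting and voting weighted by self-assessed confidence can be sketched in a few lines. This is a hypothetical minimal illustration (function name, labels, and confidence values are invented), not the statistical fusion algorithm used in the paper:

```python
from collections import defaultdict

def fuse_labels(votes, weights=None):
    """Fuse one voxel's labels from several raters.

    votes   -- list of labels, one per rater
    weights -- optional per-rater confidences (e.g. self-assessed);
               all-equal weights reduce to simple majority voting.
    """
    if weights is None:
        weights = [1.0] * len(votes)
    tally = defaultdict(float)
    for label, w in zip(votes, weights):
        tally[label] += w
    return max(tally, key=tally.get)

# Three raters label the same voxel; the most confident rater disagrees.
votes = ["cord", "background", "background"]
confidences = [0.9, 0.2, 0.3]
print(fuse_labels(votes))               # majority vote -> "background"
print(fuse_labels(votes, confidences))  # weighted vote -> "cord"
```

The toy example shows how one confident rater can correctly outvote two uncertain ones, which is the intuition behind the reported performance gain.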

  6. Self-assessed performance improves statistical fusion of image labels

    International Nuclear Information System (INIS)

    Bryan, Frederick W.; Xu, Zhoubing; Asman, Andrew J.; Allen, Wade M.; Reich, Daniel S.; Landman, Bennett A.

    2014-01-01

    Purpose: Expert manual labeling is the gold standard for image segmentation, but this process is difficult, time-consuming, and prone to inter-individual differences. While fully automated methods have successfully targeted many anatomies, automated methods have not yet been developed for numerous essential structures (e.g., the internal structure of the spinal cord as seen on magnetic resonance imaging). Collaborative labeling is a new paradigm that offers a robust alternative that may realize both the throughput of automation and the guidance of experts. Yet, distributing manual labeling expertise across individuals and sites introduces potential human factors concerns (e.g., training, software usability) and statistical considerations (e.g., fusion of information, assessment of confidence, bias) that must be further explored. During the labeling process, it is simple to ask raters to self-assess the confidence of their labels, but this is rarely done and has not been previously quantitatively studied. Herein, the authors explore the utility of self-assessment in relation to automated assessment of rater performance in the context of statistical fusion. Methods: The authors conducted a study of 66 volumes manually labeled by 75 minimally trained human raters recruited from the university undergraduate population. Raters were given 15 min of training during which they were shown examples of correct segmentation, and the online segmentation tool was demonstrated. The volumes were labeled 2D slice-wise, and the slices were unordered. A self-assessed quality metric was produced by raters for each slice by marking a confidence bar superimposed on the slice. Volumes produced by both voting and statistical fusion algorithms were compared against a set of expert segmentations of the same volumes. Results: Labels for 8825 distinct slices were obtained. Simple majority voting resulted in statistically poorer performance than voting weighted by self-assessed performance

  7. Trends in study design and the statistical methods employed in a leading general medicine journal.

    Science.gov (United States)

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2018-02-01

    Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. The study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate study design and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Due to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g., the Kaplan-Meier estimator and Cox regression model) were most frequently applied, the Gray test and Fine-Gray proportional hazard model for considering competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. Single imputation methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel designs, such as adaptive dose selection and sample-size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in light of the information found in some publications. The use of adaptive designs with interim analyses is increasing.
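
As an illustration of the survival method reported above as most frequently applied, here is a from-scratch sketch of the Kaplan-Meier product-limit estimator (function name and the small data set are hypothetical):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival curve.

    times  -- observation times
    events -- 1 if the event occurred at that time, 0 if censored
    Returns [(t, S(t))] at each distinct event time, where
    S(t) is the running product of (1 - deaths/at_risk).
    """
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    s = 1.0
    curve = []
    for t in sorted({t for t, _ in pairs}):
        deaths = sum(e for tt, e in pairs if tt == t)
        leaving = sum(1 for tt, _ in pairs if tt == t)  # events + censorings
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        at_risk -= leaving
    return curve

# Five subjects; events == 0 marks censored observations.
print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
```

Censored subjects leave the risk set without stepping the curve down, which is the point of the estimator.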

  8. Clinical and Statistical Study on Canine Impaction

    Directory of Open Access Journals (Sweden)

    Adina-Simona Coșarcă

    2013-08-01

    Aim: The aim of this study was to perform clinical and statistical research on permanent impacted canine patients among those with dental impaction referred to and treated at the Oral and Maxillo-Facial Surgery Clinic of Tîrgu Mureș over a four-year period (2009-2012). Materials and methods: The study included 858 patients having dental impaction, and based on clinical records, different parameters related to canine impaction, like frequency, gender, age, quadrant involvement, patient residence, associated complications, referring specialist and type of treatment, were assessed. Results: The study revealed: about 10% frequency of canine impaction among dental impactions; more frequent in women, in the first quadrant (tooth 13); most cases diagnosed between the ages of 10-19 years; patients under 20 were referred by an orthodontist, those over 20 by a dentist; surgical exposure was more often performed than odontectomy. Conclusions: Canine impaction is the second most frequent dental impaction after the third molars; it occurs especially in women. Due to its important role, canine recovery within the dental arch is a goal to be achieved whenever possible. Therefore, diagnosis and treatment of canine impaction require an interdisciplinary approach (surgical and orthodontic).

  9. [Suicide in Luxembourg: a statistical study].

    Science.gov (United States)

    1983-01-01

    A review of the situation concerning suicide in Luxembourg is presented. The existing laws are first described, and some methodological questions are summarized. A statistical analysis of suicide in the country is then presented. Data are included on trends over time, 1881-1982; and on variations in suicide by sex, age, marital status, religion, nationality, and occupation and standard of living. A bibliography is also provided.

  10. Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.

    Science.gov (United States)

    Yalch, Matthew M

    2016-03-01

    Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. (c) 2016 APA, all rights reserved.
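
A minimal example of the kind of Bayesian machinery the article advocates: a conjugate Beta-Binomial update, which remains well behaved even for the small samples common in trauma research. The counts and prior here are hypothetical, not drawn from the article:

```python
def beta_posterior(successes, failures, a_prior=1.0, b_prior=1.0):
    """Conjugate Beta-Binomial update.

    A Beta(a_prior, b_prior) prior on a proportion, combined with
    binomial data, gives a Beta(a_prior + successes, b_prior + failures)
    posterior. Returns (a, b, posterior mean).
    """
    a = a_prior + successes
    b = b_prior + failures
    return a, b, a / (a + b)

# Hypothetical small sample: 7 of 10 trauma-exposed participants
# report a symptom. A flat Beta(1, 1) prior yields Beta(8, 4).
a, b, mean = beta_posterior(7, 3)
print(a, b, round(mean, 3))  # 8.0 4.0 0.667
```

Because the posterior is an exact distribution rather than an asymptotic approximation, no large-sample normality assumption is needed.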

  11. Practical Statistics for LHC Physicists: Bayesian Inference (3/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  12. Practical Statistics for LHC Physicists: Frequentist Inference (2/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  13. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    Science.gov (United States)

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

    To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a low score; few studies applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
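
Since the chi-square test was the most commonly performed test in the reviewed studies, a self-contained sketch for a 2x2 table may be useful (the counts are hypothetical). For one degree of freedom the p-value can be computed directly from the complementary error function:

```python
from math import erfc, sqrt

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square test (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]].
    Returns (statistic, two-sided p-value), df = 1."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For df = 1, the chi-square survival function is erfc(sqrt(x / 2)),
    # since a chi-square(1) variable is the square of a standard normal.
    p = erfc(sqrt(stat / 2))
    return stat, p

# Hypothetical table: drug prescribed vs. not, in two patient groups.
stat, p = chi_square_2x2(30, 70, 45, 55)
print(round(stat, 3), round(p, 4))
```

For larger tables (df > 1) a general chi-square distribution routine would be needed instead of the erfc shortcut.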

  14. Statistical study of auroral fragmentation into patches

    Science.gov (United States)

    Hashimoto, Ayumi; Shiokawa, Kazuo; Otsuka, Yuichi; Oyama, Shin-ichiro; Nozawa, Satonori; Hori, Tomoaki; Lester, Mark; Johnsen, Magnar Gullikstad

    2015-08-01

    The study of auroral dynamics is important when considering disturbances of the magnetosphere. Shiokawa et al. (2010, 2014) reported observations of finger-like auroral structures that cause auroral fragmentation. Those structures are probably produced by macroscopic instabilities in the magnetosphere, mainly of the Rayleigh-Taylor type. However, the statistical characteristics of these structures have not yet been investigated. Here, based on observations by an all-sky imager at Tromsø (magnetic latitude = 67.1°N), Norway, over three winter seasons, we statistically analyzed the occurrence conditions of 14 large-scale finger-like structures that developed from large-scale auroral regions including arcs and 6 small-scale finger-like structures that developed in auroral patches. The large-scale structures were seen from midnight to dawn local time and usually appeared at the beginning of the substorm recovery phase, near the low-latitude boundary of the auroral region. The small-scale structures were primarily seen at dawn and mainly occurred in the late recovery phase of substorms. The sizes of these large- and small-scale structures mapped in the magnetospheric equatorial plane are usually larger than the gyroradius of 10 keV protons, indicating that the finger-like structures could be caused by magnetohydrodynamic instabilities. However, the scale of small structures is only twice the gyroradius of 10 keV protons, suggesting that finite Larmor radius effects may contribute to the formation of small-scale structures. The eastward propagation velocities of the structures are -40 to +200 m/s and are comparable with plasma drift velocities measured by the co-located Super Dual Auroral Radar Network radar.

  15. Statistical Theory of the Ideal MHD Geodynamo

    Science.gov (United States)

    Shebalin, J. V.

    2012-01-01

    A statistical theory of geodynamo action is developed, using a mathematical model of the geodynamo as a rotating outer core containing an ideal (i.e., no dissipation), incompressible, turbulent, convecting magnetofluid. On the concentric inner and outer spherical bounding surfaces the normal components of the velocity, magnetic field, vorticity and electric current are zero, as is the temperature fluctuation. This allows the use of a set of Galerkin expansion functions that are common to both velocity and magnetic field, as well as vorticity, current and the temperature fluctuation. The resulting dynamical system, based on the Boussinesq form of the magnetohydrodynamic (MHD) equations, represents MHD turbulence in a spherical domain. These basic equations (minus the temperature equation) and boundary conditions have been used previously in numerical simulations of forced, decaying MHD turbulence inside a sphere [1,2]. Here, the ideal case is studied through statistical analysis and leads to a prediction that an ideal coherent structure will be found in the form of a large-scale quasistationary magnetic field that results from broken ergodicity, an effect that has been previously studied both analytically and numerically for homogeneous MHD turbulence [3,4]. The axial dipole component becomes prominent when there is a relatively large magnetic helicity (proportional to the global correlation of magnetic vector potential and magnetic field) and a stationary, nonzero cross helicity (proportional to the global correlation of velocity and magnetic field). The expected angle of the dipole moment vector with respect to the rotation axis is found to decrease to a minimum as the average cross helicity increases for a fixed value of magnetic helicity and then to increase again when the average cross helicity approaches its maximum possible value. Only a relatively small value of cross helicity is needed to produce a dipole moment vector that is aligned at approximately 10° with the…

  16. Study of energy fluctuation effect on the statistical mechanics of equilibrium systems

    International Nuclear Information System (INIS)

    Lysogorskiy, Yu V; Wang, Q A; Tayurskii, D A

    2012-01-01

    This work is devoted to modeling the effect of energy fluctuations on the behavior of small classical thermodynamic systems. It is known that when an equilibrium system gets smaller and smaller, one of the major quantities that becomes more and more uncertain is its internal energy. These increasing fluctuations can considerably modify the original statistics. The present model considers the effect of such energy fluctuations and is based on an overlapping between the Boltzmann-Gibbs statistics and the statistics of the fluctuation. Within this "overlap statistics", we studied the effects of several types of energy fluctuations on the probability distribution, internal energy and heat capacity. It was shown that the fluctuations can considerably change the temperature dependence of internal energy and heat capacity in the low energy range and at low temperatures. Particularly, it was found that, due to the lower energy limit of the systems, the fluctuations reduce the probability of the low energy states close to the lowest energy and increase the total average energy. This energy increase is larger at lower temperatures, making negative heat capacity possible in this case.

  17. Statistics and Probability at Secondary Schools in the Federal State of Salzburg: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Wolfgang Voit

    2014-12-01

    Knowledge about the practical use of statistics and probability in today's mathematics instruction at secondary schools is vital in order to improve the academic education of future teachers. We conducted an empirical study among school teachers to inform improvements in mathematics instruction and teacher preparation. The study provides a snapshot of the daily practice of instruction at school. Centered on the following four questions, the status of statistics and probability was examined: Where did the current mathematics teachers study? What relevance do statistics and probability have in school? Which contents are actually taught in class? What kind of continuing education would be desirable for teachers? The study population consisted of all teachers of mathematics at secondary schools in the federal state of Salzburg.

  18. To P or Not to P: Backing Bayesian Statistics.

    Science.gov (United States)

    Buchinsky, Farrel J; Chadha, Neil K

    2017-12-01

    In biomedical research, it is imperative to differentiate chance variation from truth before we generalize what we see in a sample of subjects to the wider population. For decades, we have relied on null hypothesis significance testing, where we calculate P values for our data to decide whether to reject a null hypothesis. This methodology is subject to substantial misinterpretation and errant conclusions. Instead of working backward by calculating the probability of our data if the null hypothesis were true, Bayesian statistics allow us instead to work forward, calculating the probability of our hypothesis given the available data. This methodology gives us a mathematical means of incorporating our "prior probabilities" from previous study data (if any) to produce new "posterior probabilities." Bayesian statistics tell us how confidently we should believe what we believe. It is time to embrace and encourage their use in our otolaryngology research.
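
The "working forward" update the authors describe is easiest to see in odds form: posterior odds = prior odds × Bayes factor. A minimal sketch with hypothetical numbers:

```python
def posterior_probability(prior, bayes_factor):
    """Update a prior probability for a hypothesis H1 via the odds
    form of Bayes' theorem, where the Bayes factor is
    BF = P(data | H1) / P(data | H0)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * bayes_factor
    return posterior_odds / (1 + posterior_odds)

# A sceptical 20% prior, and a study whose data are 6x more likely
# under H1 than under H0 (both numbers hypothetical):
print(round(posterior_probability(0.20, 6.0), 3))  # 0.6
```

The same Bayes factor moves a sceptical prior less than a favourable one, which is exactly the "how confidently we should believe what we believe" framing of the abstract.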

  19. A statistical/computational/experimental approach to study the microstructural morphology of damage

    NARCIS (Netherlands)

    Hoefnagels, J. P. M.; Du, C.; de Geus, T. W. J.; Peerlings, R. H. J.; Geers, M. G. D.; Beese, A.M.; Zehnder, A.T.; Xia, Sh.

    2016-01-01

    The fracture behavior of multi-phase materials is not well understood. Therefore, a statistical study of micro-failures was conducted to deepen our insight into the failure mechanisms. We systematically studied the influence of the morphology of dual-phase (DP) steel on the fracture behavior at the…

  20. [Confidence interval or p-value--similarities and differences between two important methods of statistical inference of quantitative studies].

    Science.gov (United States)

    Harari, Gil

    2014-01-01

    Statistical significance, also known as the p-value, and the CI (confidence interval) are common statistical measures and are essential for the statistical analysis of studies in medicine and the life sciences. These measures provide complementary information about the statistical probability and conclusions regarding the clinical significance of study findings. This article is intended to describe the methodologies, compare the methods, assess their suitability for the different needs of study-results analysis, and explain situations in which each method should be used.
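
The complementarity of the two measures is easy to see in code: under a normal approximation, the same estimate and standard error yield both a two-sided p-value and a 95% CI. A sketch with hypothetical numbers:

```python
from math import erfc, sqrt

def z_test_mean_diff(diff, se):
    """Two-sided p-value and 95% CI for an estimated difference
    'diff' with standard error 'se', under a normal approximation."""
    z = diff / se
    p = erfc(abs(z) / sqrt(2))          # two-sided normal tail probability
    ci = (diff - 1.96 * se, diff + 1.96 * se)
    return p, ci

# Hypothetical example: a 4 mmHg blood-pressure difference, SE = 1.5 mmHg.
p, (lo, hi) = z_test_mean_diff(4.0, 1.5)
print(round(p, 4), round(lo, 2), round(hi, 2))  # p ≈ 0.0077, CI ≈ (1.06, 6.94)
```

The p-value answers "is the difference distinguishable from zero?", while the CI additionally shows the range of effect sizes compatible with the data, which is what supports judgments of clinical significance.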

  1. Statistical study of density fluctuations in the tore supra tokamak

    International Nuclear Information System (INIS)

    Devynck, P.; Fenzi, C.; Garbet, X.; Laviron, C.

    1998-03-01

    It is believed that radial anomalous transport in tokamaks is caused by plasma turbulence. Using an infra-red laser scattering technique on the Tore Supra tokamak, statistical properties of the density fluctuations are studied as a function of scale in ohmic as well as additional heating regimes using lower hybrid or ion cyclotron frequency heating. The probability distributions are compared to a Gaussian in order to estimate the role of intermittency, which is found to be negligible. The temporal behaviour of the three-dimensional spectrum is thoroughly discussed; its multifractal character is reflected in the singularity spectrum. The autocorrelation coefficients are also examined, as well as their long-time incoherence and statistical independence. We also put forward the existence of fluctuation transfer between two distinct but close wavenumbers. A clearer picture is thus obtained of the way energy is transferred through the turbulent scales. (author)
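
A standard way to quantify the departure from Gaussianity (intermittency) mentioned above is the excess kurtosis of the fluctuation signal, which vanishes for a Gaussian and is positive for heavy-tailed, intermittent signals. A sketch on synthetic data (the mixture model standing in for an intermittent signal is purely illustrative):

```python
import random

def excess_kurtosis(xs):
    """Sample excess kurtosis m4/m2^2 - 3: ~0 for a Gaussian,
    > 0 for heavy-tailed (intermittent) distributions."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / m2 ** 2 - 3.0

random.seed(0)
gauss = [random.gauss(0, 1) for _ in range(100_000)]
# Toy "intermittent" signal: a mixture of quiet and bursty amplitudes.
heavy = [random.gauss(0, 1) * random.choice([0.3, 3.0]) for _ in range(100_000)]
print(round(excess_kurtosis(gauss), 2))  # near 0
print(round(excess_kurtosis(heavy), 2))  # clearly positive
```

A near-zero value, as for the first sample, corresponds to the paper's finding that intermittency is negligible.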

  2. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

  3. Study of relationship between MUF correlation and detection sensitivity of statistical analysis

    International Nuclear Information System (INIS)

    Tamura, Toshiaki; Ihara, Hitoshi; Yamamoto, Yoichi; Ikawa, Koji

    1989-11-01

    Various kinds of statistical analysis have been proposed for NRTA (Near Real Time Materials Accountancy), which was devised to satisfy the timeliness detection goal of the IAEA. It is presumed that statistical analysis results will differ between rigorous error propagation (with MUF correlation) and simplified error propagation (without MUF correlation). Therefore, measurement simulation and decision analysis were performed using a flow simulation of an 800 MTHM/Y model reprocessing plant, and the relationship between MUF correlation and the detection sensitivity and false alarm rate of statistical analysis was studied. The specific character of material accountancy for the 800 MTHM/Y model reprocessing plant was grasped through this simulation. It also became clear that MUF correlation decreases not only the false alarm rate but also the detection probability for protracted loss in the case of the CUMUF test and Page's test applied to NRTA. (author)
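
Page's test, one of the statistical analyses mentioned, is a one-sided CUSUM scheme designed to detect a sustained upward shift, such as a protracted loss. A minimal sketch on a hypothetical standardized MUF sequence (the reference value k and threshold h are illustrative choices):

```python
def page_test(mufs, k=0.5, h=4.0):
    """One-sided Page (CUSUM) test on standardized MUF values.

    k -- reference value (allowance), h -- decision threshold.
    Returns the index of the first alarm, or None if no alarm."""
    s = 0.0
    for i, x in enumerate(mufs):
        s = max(0.0, s + x - k)   # accumulate only sustained excesses
        if s > h:
            return i
    return None

# In-control MUFs fluctuate around 0; a protracted loss shifts the mean up.
in_control = [0.2, -0.1, 0.3, -0.4, 0.1, 0.0, -0.2, 0.3]
loss       = [0.2, -0.1, 1.4, 1.2, 1.6, 1.1, 1.5, 1.3]
print(page_test(in_control))  # None
print(page_test(loss))        # alarm at index 6
```

Because the cumulative sum resets at zero, isolated fluctuations are forgiven while a persistent shift accumulates toward the threshold.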

  4. Statistical Surface Recovery: A Study on Ear Canals

    DEFF Research Database (Denmark)

    Jensen, Rasmus Ramsbøl; Olesen, Oline Vinter; Paulsen, Rasmus Reinhold

    2012-01-01

    We present a method for surface recovery in partial surface scans based on a statistical model. The framework is based on multivariate point prediction, where the distribution of the points is learned from an annotated data set. The training set consists of surfaces with dense correspondence ... that are Procrustes aligned. The average shape and point covariances can be estimated from this set. It is shown how missing data in a new given shape can be predicted using the learned statistics. The method is evaluated on a data set of 29 scans of ear canal impressions. By using a leave-one-out approach we...
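
The multivariate point prediction described above amounts to conditioning a learned Gaussian shape model on the observed points. A minimal sketch under that Gaussian assumption, with synthetic "shapes" standing in for the annotated training set (all names, dimensions, and data are illustrative, not the paper's ear-canal data):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical training set: 200 Procrustes-aligned "shapes", each a vector
# of 6 landmark coordinates (real ear-canal scans have far more points).
N, d = 200, 6
A = rng.normal(size=(d, d))
true_cov = A @ A.T / d + 0.1 * np.eye(d)
shapes = rng.multivariate_normal(np.arange(d, dtype=float), true_cov, size=N)

mu = shapes.mean(axis=0)          # average shape
S = np.cov(shapes, rowvar=False)  # point covariances

# Predict missing coordinates m from observed ones o via the conditional
# Gaussian mean: mu_m + S_mo S_oo^{-1} (x_o - mu_o).
obs, mis = [0, 1, 2, 3], [4, 5]
x = shapes[0]  # for simplicity, "new" shape taken from the set
pred = mu[mis] + S[np.ix_(mis, obs)] @ np.linalg.solve(
    S[np.ix_(obs, obs)], x[obs] - mu[obs])
print(pred.shape, np.round(pred, 2))
```

A leave-one-out evaluation, as in the paper, would refit `mu` and `S` without the held-out shape before predicting its missing part.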

  5. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  6. Mask effects on cosmological studies with weak-lensing peak statistics

    International Nuclear Information System (INIS)

    Liu, Xiangkun; Pan, Chuzhong; Fan, Zuhui; Wang, Qiao

    2014-01-01

    With numerical simulations, we analyze in detail how the bad data removal, i.e., the mask effect, can influence the peak statistics of the weak-lensing convergence field reconstructed from the shear measurement of background galaxies. It is found that high peak fractions are systematically enhanced because of the presence of masks; the larger the masked area is, the higher the enhancement is. In the case where the total masked area is about 13% of the survey area, the fraction of peaks with signal-to-noise ratio ν ≥ 3 is ∼11% of the total number of peaks, compared with ∼7% of the mask-free case in our considered cosmological model. This can have significant effects on cosmological studies with weak-lensing convergence peak statistics, inducing a large bias in the parameter constraints if the effects are not taken into account properly. Even for a survey area of 9 deg², the bias in (Ω_m, σ_8) is already intolerably large and close to 3σ. It is noted that most of the affected peaks are close to the masked regions. Therefore, excluding peaks in those regions in the peak statistics can reduce the bias effect but at the expense of losing usable survey areas. Further investigations find that the enhancement of the number of high peaks around the masked regions can be largely attributed to the smaller number of galaxies usable in the weak-lensing convergence reconstruction, leading to higher noise than that of the areas away from the masks. We thus develop a model in which we exclude only those very large masks with radius larger than 3' but keep all the other masked regions in peak counting statistics. For the remaining part, we treat the areas close to and away from the masked regions separately with different noise levels. It is shown that this two-noise-level model can account for the mask effect on peak statistics very well, and the bias in cosmological parameters is significantly reduced if this model is applied in the parameter fitting.
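
The core effect above — higher noise near masks inflating high-peak counts when a single, uniform noise level is assumed — can be illustrated with pure-noise maps. The map sizes, noise levels, and 8-neighbourhood peak definition below are illustrative assumptions, not the paper's simulation setup:

```python
import numpy as np

def count_peaks(kappa, sigma, nu=3.0):
    """Count local maxima (8-neighbourhood) of a map with S/N >= nu."""
    c = kappa[1:-1, 1:-1]
    is_max = np.ones(c.shape, dtype=bool)
    rows, cols = kappa.shape
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            is_max &= c > kappa[1 + dy:rows - 1 + dy, 1 + dx:cols - 1 + dx]
    return int(np.sum(is_max & (c / sigma >= nu)))

rng = np.random.default_rng(2)
n = 256
# Pure-noise convergence maps: noise is higher where fewer galaxies are usable.
far = rng.normal(0.0, 1.0, (n, n))    # away from masks
near = rng.normal(0.0, 1.4, (n, n))   # close to masks
# Both maps thresholded with the uniform far-field noise level, as happens
# when the mask-induced noise increase is ignored:
n_far, n_near = count_peaks(far, 1.0), count_peaks(near, 1.0)
print(n_far, n_near)
```

Counting both regions against the same noise level substantially over-counts ν ≥ 3 peaks near the masks, which is the bias the paper's two-noise-level model corrects.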

  7. Prediction of successful trial of labour in patients with a previous caesarean section

    International Nuclear Information System (INIS)

    Shaheen, N.; Khalil, S.; Iftikhar, P.

    2014-01-01

    Objective: To determine the prediction rate of success in trial of labour after one previous caesarean section. Methods: The cross-sectional study was conducted at the Department of Obstetrics and Gynaecology, Cantonment General Hospital, Rawalpindi, from January 1, 2012 to January 31, 2013, and comprised women with one previous Caesarean section and with a single live foetus at 37-41 weeks of gestation. Women with more than one Caesarean section, unknown site of uterine scar, bony pelvic deformity, placenta previa, intra-uterine growth restriction, deep transverse arrest in previous labour and non-reassuring foetal status at the time of admission were excluded. Intrapartum risk assessment included Bishop score at admission, rate of cervical dilatation and scar tenderness. SPSS 21 was used for statistical analysis. Results: Out of a total of 95 women, the trial was successful in 68 (71.6%). Estimated foetal weight and number of prior vaginal deliveries had a high predictive value for successful trial of labour after Caesarean section. Estimated foetal weight had an odds ratio of 0.46 (p<0.001), while number of prior vaginal deliveries had an odds ratio of 0.85 (p=0.010). Other factors found to be predictive of a successful trial included Bishop score at the time of admission (p<0.037) and rate of cervical dilatation in the first stage of labour (p<0.021). Conclusion: History of prior vaginal deliveries, higher Bishop score at the time of admission, rapid rate of cervical dilatation and lower estimated foetal weight were predictive of a successful trial of labour after Caesarean section. (author)

  8. DATA MINING AND STATISTICS METHODS USAGE FOR ADVANCED TRAINING COURSES QUALITY MEASUREMENT: CASE STUDY

    Directory of Open Access Journals (Sweden)

    Maxim I. Galchenko

    2014-01-01

    In this article we consider a case study of the analysis of data connected with educational statistics, namely the results of a survey of professional development course students, carried out with specialized software. The need for extended statistical processing of the results is shown, and the scheme for carrying out the analysis is presented. Conclusions on the studied case are given.

  9. A comparison of statistical methods for identifying out-of-date systematic reviews.

    Directory of Open Access Journals (Sweden)

    Porjai Pattanittum

    BACKGROUND: Systematic reviews (SRs) can provide accurate and reliable evidence, typically about the effectiveness of health interventions. Evidence is dynamic, and if SRs are out-of-date this information may not be useful; it may even be harmful. This study aimed to compare five statistical methods to identify out-of-date SRs. METHODS: A retrospective cohort of SRs registered in the Cochrane Pregnancy and Childbirth Group (CPCG), published between 2008 and 2010, were considered for inclusion. For each eligible CPCG review, data were extracted and "3-years previous" meta-analyses were assessed for the need to update, given the data from the most recent 3 years. Each of the five statistical methods was used, with random effects analyses throughout the study. RESULTS: Eighty reviews were included in this study; most were in the area of induction of labour. The numbers of reviews identified as being out-of-date using the Ottawa, recursive cumulative meta-analysis (CMA), and Barrowman methods were 34, 7, and 7, respectively. No reviews were identified as being out-of-date using the simulation-based power method, or the CMA for sufficiency and stability method. The overall agreement among the three discriminating statistical methods was slight (Kappa = 0.14; 95% CI 0.05 to 0.23). The recursive cumulative meta-analysis, Ottawa, and Barrowman methods were practical according to the study criteria. CONCLUSION: Our study shows that three practical statistical methods could be applied to examine the need to update SRs.
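
The agreement among methods is summarized above with Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch with hypothetical "out of date?" decisions for ten reviews (the data are invented for illustration, not taken from the study):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two raters/methods."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)  # observed agreement
    # chance agreement: product of marginal category frequencies
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in np.union1d(a, b))
    return (po - pe) / (1.0 - pe)

# Hypothetical decisions of two methods on ten reviews (1 = out of date)
method_1 = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
method_2 = [1, 0, 0, 0, 0, 0, 1, 0, 0, 1]
print(round(cohens_kappa(method_1, method_2), 3))  # → 0.524
```

Here the methods agree on 8 of 10 reviews, but kappa is only about 0.52 because most reviews are "not out of date" and much of that agreement is expected by chance; the study's Kappa = 0.14 is "slight" by the usual benchmarks.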

  10. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

    The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight ...

  11. Statistical mechanics and the evolution of polygenic quantitative traits

    NARCIS (Netherlands)

    Barton, N.H.; De Vladar, H.P.

    The evolution of quantitative characters depends on the frequencies of the alleles involved, yet these frequencies cannot usually be measured. Previous groups have proposed an approximation to the dynamics of quantitative traits, based on an analogy with statistical mechanics. We present a modified

  12. Forest statistics for Southeast Texas counties - 1986

    Science.gov (United States)

    William H. McWilliams; Daniel F. Bertelson

    1986-01-01

    These tables were derived from data obtained during a 1986 inventory of 22 counties comprising the Southeast Unit of Texas (fig. 1). Grimes, Leon, Madison, and Waller counties have been added to the Southeast Unit since the previous inventory of 1975. All comparisons of the 1975 and 1986 forest statistics made in this Bulletin account for this change. The data on...

  13. Previously unclassified bacteria dominate during thermophilic and mesophilic anaerobic pre-treatment of primary sludge.

    Science.gov (United States)

    Pervin, Hasina M; Batstone, Damien J; Bond, Philip L

    2013-06-01

    Thermophilic biological pre-treatment enables enhanced anaerobic digestion for treatment of wastewater sludges but, at present, there is limited understanding of the hydrolytic-acidogenic microbial composition and its contribution to this process. In this study, the process was assessed by comparing the microbiology of thermophilic (50-65 °C) and mesophilic (35 °C) pre-treatment reactors treating primary sludge. A full-cycle approach for the 16S rRNA genes was applied in order to monitor the diversity of bacteria and their abundance in a thermophilic pre-treatment reactor treating primary sludge. For the thermophilic pre-treatment (TP), over 90% of the sequences were previously undetected and these had less than 97% sequence similarity to cultured organisms. During the first 83 days, members of the Betaproteobacteria dominated the community sequences and a newly designed probe was used to monitor a previously unknown bacterium affiliated with the genus Brachymonas. Between days 85 and 183, three phylotypes that affiliated with the genera Comamonas, Clostridium and Lysobacter were persistently dominant in the TP community, as revealed by terminal-restriction fragment length polymorphism (T-RFLP). Hydrolytic and fermentative functions have been speculated for these bacteria. Mesophilic pre-treatment (MP) and TP communities were different but they were both relatively dynamic. Statistical correlation analysis and the function of closely allied reference organisms indicated that previously unclassified bacteria dominated the TP community and may have been functionally involved in the enhanced hydrolytic performance of thermophilic anaerobic pre-treatment. This study is the first to reveal the diversity and dynamics of bacteria during anaerobic digestion of primary sludge. Copyright © 2013 Elsevier GmbH. All rights reserved.

  14. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of process were studied. The overall study contributes to an assessment of process at the sigma level with respect to out-of-specification attributes produced. Finally, the study will point to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
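
The sigma-level assessment described above rests on capability indices computed from the process mean, standard deviation, and specification limits. A minimal sketch under the usual normality assumption; the specification limits and simulated tablet weights are illustrative, not the study's data:

```python
import numpy as np

def capability(data, lsl, usl):
    """Cp, Cpk and an approximate sigma level from spec limits lsl/usl."""
    mu, sd = np.mean(data), np.std(data, ddof=1)
    cp = (usl - lsl) / (6.0 * sd)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3.0 * sd)  # penalizes off-centre processes
    return cp, cpk, 3.0 * cpk                   # sigma level ~ 3 * Cpk

rng = np.random.default_rng(3)
weights = rng.normal(250.0, 2.0, 100)  # simulated tablet weights, target 250 mg
cp, cpk, sigma_level = capability(weights, 240.0, 260.0)
print(round(cp, 2), round(cpk, 2), round(sigma_level, 2))
```

Cpk is always at most Cp; the gap between them measures how far the process mean has drifted from the centre of the specification window.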

  15. Study on loss detection algorithms for tank monitoring data using multivariate statistical analysis

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Burr, Tom

    2009-01-01

    Evaluation of solution monitoring data to support material balance evaluation was proposed about a decade ago because of concerns regarding the large throughput planned at the Rokkasho Reprocessing Plant (RRP). A numerical study using the simulation code FACSIM was done, and significant increases in the detection probabilities (DP) for certain types of losses were shown. To be accepted internationally, it is very important to verify such claims using real solution monitoring data. However, a demonstrative study with real tank data had not been carried out, due to the confidentiality of the tank data. This paper describes an experimental study that has been started using actual data from the Solution Measurement and Monitoring System (SMMS) in the Tokai Reprocessing Plant (TRP) and the Savannah River Site (SRS). Multivariate statistical methods, such as a vector cumulative sum and multi-scale statistical analysis, have been applied to the real tank data on which simulated losses have been superimposed. Although quantitative conclusions have not been derived for the moment, due to the difficulty of baseline evaluation, the multivariate statistical methods remain promising for detecting abrupt and some types of protracted losses. (author)

  16. Cross-Domain Statistical-Sequential Dependencies Are Difficult To Learn

    Directory of Open Access Journals (Sweden)

    Anne McClure Walk

    2016-02-01

    Recent studies have demonstrated participants' ability to learn cross-modal associations during statistical learning tasks. However, these studies are all similar in that the cross-modal associations to be learned occur simultaneously, rather than sequentially. In addition, the majority of these studies focused on learning across sensory modalities but not across perceptual categories. To test both cross-modal and cross-categorical learning of sequential dependencies, we used an artificial grammar learning task consisting of a serial stream of auditory and/or visual stimuli containing both within- and cross-domain dependencies. Experiment 1 examined within-modal and cross-modal learning across two sensory modalities (audition and vision). Experiment 2 investigated within-categorical and cross-categorical learning across two perceptual categories within the same sensory modality (e.g. shape and color; tones and non-words). Our results indicated that individuals demonstrated learning of the within-modal and within-categorical but not the cross-modal or cross-categorical dependencies. These results stand in contrast to the previous demonstrations of cross-modal statistical learning, and highlight the presence of modality constraints that limit the effectiveness of learning in a multimodal environment.

  17. Late tamoxifen in patients previously operated for breast cancer without postoperative tamoxifen: 5-year results of a single institution randomised study

    International Nuclear Information System (INIS)

    Veronesi, Andrea; Miolo, GianMaria; Magri, Maria D; Crivellari, Diana; Scalone, Simona; Bidoli, Ettore; Lombardi, Davide

    2010-01-01

    A population of breast cancer patients exists who, for various reasons, never received adjuvant post-operative tamoxifen (TAM). This study aimed to evaluate the role of late TAM in these patients. From 1997 to 2003, patients aged 35 to 75 years, operated more than 2 years previously for monolateral breast cancer without adjuvant TAM, with no signs of metastases and no contraindication to TAM were randomized to TAM 20 mg/day orally for 2 years or follow-up alone. Events were categorized as locoregional relapse, distant metastases, metachronous breast cancer, tumours other than breast cancer and death from any cause, whichever occurred first. The sample size (197 patients per arm, plus 10% allowance) was based on the assumption of a 30% decrease in the number of events occurring at a rate of 5% annually in the 10 years following randomization. Four hundred and thirty-three patients were randomized in the study (TAM 217, follow-up 216). Patients' characteristics (TAM/follow-up) included: median age 55/55 years, median time from surgery 25/25 months (range, 25-288/25-294), in situ carcinoma 18/24, oestrogen receptor (ER) positive in 75/68, negative in 70/57, unknown in 72/91 patients. Previous adjuvant treatment included chemotherapy in 131/120 and an LHRH analogue in 11/13 patients. Thirty-six patients prematurely discontinued TAM after a median of 1 month, mostly because of subjective intolerance. Eighty-three events (TAM 39, follow-up 44) occurred: locoregional relapse in 10/8, distant metastases in 14/16, metachronous breast cancer in 4/10, other tumours in 11/10 patients. Fewer ER-positive secondary breast cancers occurred in the TAM-treated patients than in follow-up patients (1 vs 10, p = 0.005). Event-free survival was similar in both groups of patients. This 5-year analysis revealed significantly fewer metachronous ER-positive breast cancers in the TAM-treated patients. No other statistically significant differences have emerged thus far

  18. The Study of Second Higher Education through Mathematical Statistics

    Directory of Open Access Journals (Sweden)

    Olga V. Kremer

    2014-05-01

    The article deals with the statistical analysis of the reasons, age, and wages of people who pursue a second higher education. People opt for a second higher education mostly due to a number of economic and physiological factors. According to our research, age is a key motivator for pursuing a second higher education. Based on statistical data, a portrait of the typical second higher education student is drawn.

  19. Working females : a modern statistical approach

    OpenAIRE

    Kuhlenkasper, Torben

    2010-01-01

    The thesis analyzes the changing employment and economic situation of females when they become mothers. The thesis focuses on two major questions: first, when do mothers return to their previous employment after bearing a child? Second, what are the individual economic consequences after having returned to the labor market? Both questions are analyzed empirically with the latest statistical methods. The first major part of the thesis, corresponding to the first question motivated above, ...

  20. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  1. Statistical analysis of non-homogeneous Poisson processes. Statistical processing of a particle multidetector

    International Nuclear Information System (INIS)

    Lacombe, J.P.

    1985-12-01

    The statistical study of non-homogeneous and spatial Poisson processes forms the first part of this thesis. A Neyman-Pearson type test is defined concerning the intensity measurement of these processes. Conditions are given under which the consistency of the test is assured, and others ensuring the asymptotic normality of the test statistics. Then some techniques for the statistical processing of Poisson fields, and their application to the study of a particle multidetector, are given. Quality tests of the device are proposed, together with signal extraction methods. [fr]
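
A non-homogeneous Poisson process of the kind studied here can be simulated by Lewis-Shedler thinning, which is often the starting point for testing intensity estimators and tests: candidate events are drawn from a homogeneous process at a dominating rate and accepted with probability λ(t)/λ_max. The intensity function below is an illustrative example, not one from the thesis:

```python
import numpy as np

def nhpp_thinning(intensity, t_max, lam_max, rng):
    """Simulate a non-homogeneous Poisson process on [0, t_max] by
    Lewis-Shedler thinning, assuming intensity(t) <= lam_max everywhere."""
    events, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)  # candidate from rate-lam_max HPP
        if t > t_max:
            break
        if rng.uniform() < intensity(t) / lam_max:  # keep with prob λ(t)/λ_max
            events.append(t)
    return np.array(events)

rng = np.random.default_rng(4)
lam = lambda t: 2.0 + 1.5 * np.sin(t)  # illustrative intensity, bounded by 3.5
events = nhpp_thinning(lam, 100.0, 3.5, rng)
print(len(events))  # expected count ≈ ∫ λ(t) dt ≈ 200
```

The event count is Poisson with mean equal to the integrated intensity, so repeated runs fluctuate around 200 with standard deviation near 14.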

  2. Generalized $L-, M-$, and $R$-Statistics

    OpenAIRE

    Serfling, Robert J.

    1984-01-01

    A class of statistics generalizing $U$-statistics and $L$-statistics, and containing other varieties of statistics as well, such as trimmed $U$-statistics, is studied. Using the differentiable statistical function approach, differential approximations are obtained and the influence curves of these generalized $L$-statistics are derived. These results are employed to establish asymptotic normality for such statistics. Parallel generalizations of $M$- and $R$-statistics are noted. Strong converg...

  3. Implant breast reconstruction after salvage mastectomy in previously irradiated patients.

    Science.gov (United States)

    Persichetti, Paolo; Cagli, Barbara; Simone, Pierfranco; Cogliandro, Annalisa; Fortunato, Lucio; Altomare, Vittorio; Trodella, Lucio

    2009-04-01

    The most common surgical approach in case of local tumor recurrence after quadrantectomy and radiotherapy is salvage mastectomy. Breast reconstruction is the subsequent phase of the treatment, and the plastic surgeon has to operate on previously irradiated and manipulated tissues. The medical literature suggests that breast reconstruction with tissue expanders is not a viable option, as previous radiotherapy is considered a contraindication. The purpose of this retrospective study is to evaluate the influence of previous radiotherapy on two-stage breast reconstruction (tissue expander/implant). Only patients with analogous timing of radiation therapy and the same demolitive and reconstructive procedures were recruited. The results of this study show that, after salvage mastectomy in previously irradiated patients, implant reconstruction is still possible. Further comparative studies are, of course, advisable before drawing any conclusion on the possibility of performing implant reconstruction in previously irradiated patients.

  4. Statistical and Methodological Considerations for the Interpretation of Intranasal Oxytocin Studies.

    Science.gov (United States)

    Walum, Hasse; Waldman, Irwin D; Young, Larry J

    2016-02-01

    Over the last decade, oxytocin (OT) has received focus in numerous studies associating intranasal administration of this peptide with various aspects of human social behavior. These studies in humans are inspired by animal research, especially in rodents, showing that central manipulations of the OT system affect behavioral phenotypes related to social cognition, including parental behavior, social bonding, and individual recognition. Taken together, these studies in humans appear to provide compelling, but sometimes bewildering, evidence for the role of OT in influencing a vast array of complex social cognitive processes in humans. In this article, we investigate to what extent the human intranasal OT literature lends support to the hypothesis that intranasal OT consistently influences a wide spectrum of social behavior in humans. We do this by considering statistical features of studies within this field, including factors like statistical power, prestudy odds, and bias. Our conclusion is that intranasal OT studies are generally underpowered and that there is a high probability that most of the published intranasal OT findings do not represent true effects. Thus, the remarkable reports that intranasal OT influences a large number of human social behaviors should be viewed with healthy skepticism, and we make recommendations to improve the reliability of human OT studies in the future. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
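
The underpowering argument above can be made concrete: for a small-to-moderate effect size, typical sample sizes yield low power. A Monte Carlo sketch of two-sample t-test power; the effect size (d = 0.35) and group sizes are illustrative assumptions, not figures drawn from the reviewed intranasal-OT studies:

```python
import numpy as np

def simulated_power(d, n, n_sim=20000, rng=None):
    """Monte Carlo power of a two-sided two-sample t-test (sd = 1, equal n).
    Uses the normal critical value 1.96, adequate for illustration."""
    rng = rng if rng is not None else np.random.default_rng(5)
    a = rng.normal(0.0, 1.0, (n_sim, n))          # control group draws
    b = rng.normal(d, 1.0, (n_sim, n))            # treatment group draws
    sp = np.sqrt((a.var(axis=1, ddof=1) + b.var(axis=1, ddof=1)) / 2.0)
    t = (b.mean(axis=1) - a.mean(axis=1)) / (sp * np.sqrt(2.0 / n))
    return float(np.mean(np.abs(t) > 1.96))       # rejection frequency

rng = np.random.default_rng(5)
results = {n: round(simulated_power(0.35, n, rng=rng), 2) for n in (15, 30, 100)}
print(results)
```

With 15-30 subjects per arm, power stays well below the conventional 0.80 target, so most significant findings at those sample sizes are suspect in exactly the way the article argues.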

  5. Statistical assessment of the learning curves of health technologies.

    Science.gov (United States)

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. 
The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second

  6. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  7. Application of mathematical statistics methods to study fluorite deposits

    International Nuclear Information System (INIS)

    Chermeninov, V.B.

    1980-01-01

    The applicability of mathematical-statistical methods for increasing the reliability of sampling and for geological tasks (the study of regularities of ore formation) is considered. The reliability of core sampling (with regard to the selective abrasion of fluorite) is compared with that of neutron activation logging for fluorine. The core sampling data are characterized by higher dispersion than the neutron activation logging results (the mean values of the variation coefficients are 75% and 56%, respectively). However, the hypothesis of the equality of the two sample means is confirmed; this fact testifies to the absence of considerable variability of the ore bodies.

  8. Local sequence alignments statistics: deviations from Gumbel statistics in the rare-event tail

    Directory of Open Access Journals (Sweden)

    Burghardt Bernd

    2007-07-01

    Background: The optimal score for ungapped local alignments of infinitely long random sequences is known to follow a Gumbel extreme value distribution. Less is known about the important case where gaps are allowed. For this case, the distribution is only known empirically in the high-probability region, which is biologically less relevant. Results: We provide a method to obtain numerically the biologically relevant rare-event tail of the distribution. The method, which has been outlined in an earlier work, is based on generating the sequences with a parametrized probability distribution, which is biased with respect to the original biological one, in the framework of Metropolis Coupled Markov Chain Monte Carlo. Here, we first present the approach in detail and evaluate the convergence of the algorithm by considering a simple test case. In the earlier work, the method was applied to just one single example case. Therefore, we consider here a large set of parameters: we study the distributions for protein alignment with different substitution matrices (BLOSUM62 and PAM250) and affine gap costs with different parameter values. In the logarithmic phase (large gap costs) it was previously assumed that the Gumbel form still holds, hence the Gumbel distribution is usually used when evaluating p-values in databases. Here we show that for all cases, provided that the sequences are not too long (L ≤ 400), a "modified" Gumbel distribution, i.e. a Gumbel distribution with an additional Gaussian factor, is suitable to describe the data. We also provide a "scaling analysis" of the parameters used in the modified Gumbel distribution. Furthermore, via a comparison with BLAST parameters, we show that significance estimations change considerably when using the true distributions as presented here. Finally, we study also the distribution of the sum statistics of the k best alignments.
Conclusion Our results show that the statistics of gapped and ungapped local
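As a rough illustration of the distributions discussed above, the sketch below compares the tail of a plain Gumbel extreme-value distribution with a "modified" Gumbel carrying an additional Gaussian factor. The parameter values and the exact form of the Gaussian correction are assumptions for illustration only, not the paper's fitted values.

```python
import math

def gumbel_log_sf(x, mu, lam):
    """Log survival function of a Gumbel EVD: P(S >= x) = 1 - exp(-exp(-lam*(x - mu)))."""
    t = math.exp(-lam * (x - mu))
    # For very small t, 1 - exp(-t) ~ t, so log P ~ -lam*(x - mu) in the far tail.
    return math.log1p(-math.exp(-t)) if t > 1e-12 else math.log(t)

def modified_gumbel_log_sf(x, mu, lam, sigma2):
    """Gumbel log tail multiplied by an assumed Gaussian factor exp(-x**2 / (2*sigma2)).
    The correction suppresses the tail relative to the pure Gumbel form."""
    return gumbel_log_sf(x, mu, lam) - x * x / (2.0 * sigma2)

# In the rare-event tail the modified form assigns much lower probability,
# which is why p-values based on the pure Gumbel can be misleading there.
for s in (20.0, 40.0, 60.0):
    print(s, gumbel_log_sf(s, 10.0, 0.3), modified_gumbel_log_sf(s, 10.0, 0.3, 400.0))
```

The print loop shows the two log tail probabilities diverging as the score grows, mirroring the paper's point that significance estimates change considerably in the rare-event region.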

  9. Perceived Statistical Knowledge Level and Self-Reported Statistical Practice Among Academic Psychologists

    Directory of Open Access Journals (Sweden)

    Laura Badenes-Ribera

    2018-06-01

    Full Text Available Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge about effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to a misinterpretation of the results. In addition, it also found that, although the use of ES is becoming generalized, the same is not true for CIs. Finally, academics with greater knowledge about ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we elaborated an on-line survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women, mean age of 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequent ES statistics mentioned were Cohen's d and R²/η², which can be affected by outliers, non-normality, or violations of statistical assumptions. In addition, academics showed poor knowledge about meta-analytic displays (e.g., forest plots and funnel plots) and quality checklists for studies. Finally, academics with higher-level knowledge about ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not
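As a reminder of what the surveyed alternatives involve, here is a minimal sketch of Cohen's d for two independent groups together with an approximate confidence interval. The normal-approximation standard error is a common textbook formula, and the sample data are invented for illustration.

```python
import math

def cohens_d(x, y):
    """Cohen's d for two independent samples, using the pooled standard deviation."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd

def cohens_d_ci(d, nx, ny, z=1.96):
    """Approximate 95% CI for d via the usual large-sample standard error."""
    se = math.sqrt((nx + ny) / (nx * ny) + d * d / (2 * (nx + ny)))
    return d - z * se, d + z * se

# Invented scores for two groups:
group_a = [5.1, 4.8, 6.0, 5.5, 4.9, 5.7]
group_b = [4.2, 4.6, 4.1, 4.9, 4.3, 4.5]
d = cohens_d(group_a, group_b)
lo, hi = cohens_d_ci(d, len(group_a), len(group_b))
print(f"d = {d:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Reporting the interval alongside the point estimate is exactly the under-used practice the survey identifies.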

  10. Students' Perceptions of Statistics: An Exploration of Attitudes, Conceptualizations, and Content Knowledge of Statistics

    Science.gov (United States)

    Bond, Marjorie E.; Perkins, Susan N.; Ramirez, Caroline

    2012-01-01

    Although statistics education research has focused on students' learning and conceptual understanding of statistics, researchers have only recently begun investigating students' perceptions of statistics. The term perception describes the overlap between cognitive and non-cognitive factors. In this mixed-methods study, undergraduate students…

  11. The Effect of "Clickers" on Attendance in an Introductory Statistics Course: An Action Research Study

    Science.gov (United States)

    Amstelveen, Raoul H.

    2013-01-01

    The purpose of this study was to design and implement a Classroom Response System, also known as a "clicker," to increase attendance in introductory statistics courses at an undergraduate university. Since 2010, non-attendance had been prevalent in introductory statistics courses. Moreover, non-attendance created undesirable classrooms…

  12. Methods of statistical physics

    CERN Document Server

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  13. Preliminary study of energy confinement data with a statistical analysis system in HL-2A tokamak

    International Nuclear Information System (INIS)

    Xu Yuan; Cui Zhengying; Ji Xiaoquan; Dong Chunfeng; Yang Qingwei; O J W F Kardaun

    2010-01-01

    Taking advantage of the HL-2A experimental data, an energy confinement database conforming to the ITERL DB2.0 format has been established for the first time. For this database, the widely used statistical analysis system (SAS) has been adopted to analyze and evaluate the confinement data from HL-2A, and research on the scaling laws of energy confinement time with respect to plasma density has been carried out, with some preliminary results already achieved. Finally, through comparison with both the ITER scaling law and the previous ASDEX database, the L-mode confinement quality on HL-2A and the influence of temperature on the Spitzer resistivity are discussed. (authors)

  14. Statistical mechanics of magnetized pair Fermi gas

    International Nuclear Information System (INIS)

    Daicic, J.; Frankel, N.E.; Kowalenko, V.

    1993-01-01

    Following previous work on the magnetized pair Bose gas, this contribution presents the statistical mechanics of the charged relativistic Fermi gas with pair creation in d spatial dimensions. Initially, the gas in the absence of external fields is studied. Expansions for the various thermodynamic functions are obtained both in the μ/m→0 (neutrino) limit and about the point μ/m = 1, where μ is the chemical potential. The thermodynamics of a gas of quantum-number-conserving massless fermions is also discussed. Then a complete study of the pair Fermi gas in a homogeneous magnetic field is presented, investigating the behavior of the magnetization over a wide range of field strengths. The inclusion of pairs leads to new results for the net magnetization due to the paramagnetic moment of the spins and the diamagnetic Landau orbits. 20 refs

  15. Statistical Study in the mid-altitude cusp region: wave and particle data comparison using a normalized cusp crossing duration

    Science.gov (United States)

    Grison, B.; Escoubet, C. P.; Pitout, F.; Cornilleau-Wehrlin, N.; Dandouras, I.; Lucek, E.

    2009-04-01

    In the mid-altitude cusp region the DC magnetic field presents a diamagnetic cavity due to the intense earthward ion flux coming from the magnetosheath. Strong ultra-low-frequency (ULF) magnetic activity is also commonly observed in this region. Most mid-altitude cusp statistical studies have focused on the location of the cusp and its dependence on, and response to, solar wind, interplanetary magnetic field, and dipole tilt angle parameters. In our study we use the database built by Pitout et al. (2006) to study the link between wave power in the ULF range (0.35-10 Hz) measured by the STAFF SC instrument, the ion plasma properties measured by the CIS (CODIF) instrument, and the diamagnetic cavity observed in FGM data in the mid-altitude cusp region. To compare the different crossings we do not use the cusp position and dynamics; instead we use a normalized cusp crossing duration, which makes it easy to average the properties over a large number of crossings. As usual in the cusp, it is particularly relevant to sort the crossings by the corresponding interplanetary magnetic field (IMF) orientation when analysing the results. In particular, we try to find out which parameter the strong wave activity is most closely linked with. The global statistics confirm previous single-case observations of simultaneity between ion injections and wave activity enhancements. We will also present results concerning other ion parameters and the diamagnetic cavity observed in the mid-altitude cusp region.
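The normalized-duration averaging described above can be sketched as follows: each crossing's time axis is rescaled to [0, 1] so that profiles from crossings of different lengths can be binned and averaged together. The bin count and the data layout are illustrative assumptions, not the authors' actual pipeline.

```python
def normalized_profile(times, values, nbins):
    """Rescale a single crossing's time axis to [0, 1] and bin-average its values."""
    t0, t1 = times[0], times[-1]
    sums, counts = [0.0] * nbins, [0] * nbins
    for t, v in zip(times, values):
        frac = (t - t0) / (t1 - t0)
        b = min(int(frac * nbins), nbins - 1)
        sums[b] += v
        counts[b] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

def average_profiles(crossings, nbins=10):
    """Average normalized profiles over many crossings, ignoring empty bins."""
    acc, n = [0.0] * nbins, [0] * nbins
    for times, values in crossings:
        for i, p in enumerate(normalized_profile(times, values, nbins)):
            if p is not None:
                acc[i] += p
                n[i] += 1
    return [a / c if c else None for a, c in zip(acc, n)]

# Two crossings with different durations but constant wave power 1.0 and 3.0:
crossings = [
    (list(range(11)), [1.0] * 11),             # shorter crossing
    ([2 * t for t in range(11)], [3.0] * 11),  # twice as long
]
print(average_profiles(crossings, nbins=5))
```

Because both crossings are mapped onto the same unit interval, the averaged profile is 2.0 in every bin regardless of the raw durations.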

  16. Statistical assignment of DNA sequences using Bayesian phylogenetics

    DEFF Research Database (Denmark)

    Terkelsen, Kasper Munch; Boomsma, Wouter Krogh; Huelsenbeck, John P.

    2008-01-01

    We provide a new automated statistical method for DNA barcoding based on a Bayesian phylogenetic analysis. The method is based on automated database sequence retrieval, alignment, and phylogenetic analysis using a custom-built program for Bayesian phylogenetic analysis. We show on real data that the method outperforms Blast searches as a measure of confidence and can help eliminate 80% of all false assignments based on the best Blast hit. However, the most important advance of the method is that it provides statistically meaningful measures of confidence. We apply the method to a re-analysis of previously published ancient DNA data and show that, with high statistical confidence, most of the published sequences are in fact of Neanderthal origin. However, there are several cases of chimeric sequences that are a combination of both Neanderthal and modern human DNA.
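The kind of statistically meaningful confidence the method provides can be illustrated with a toy posterior computation: given per-taxon log-likelihoods from some phylogenetic analysis, normalizing them together with a prior yields assignment probabilities. The numbers and the uniform prior below are made up; the actual method derives its likelihoods from a full Bayesian phylogenetic analysis rather than this shortcut.

```python
import math

def assignment_posteriors(log_likes, log_priors=None):
    """Convert per-taxon log-likelihoods into posterior assignment probabilities,
    using the log-sum-exp trick for numerical stability."""
    taxa = list(log_likes)
    if log_priors is None:
        log_priors = {t: 0.0 for t in taxa}  # uniform prior (illustrative assumption)
    joint = {t: log_likes[t] + log_priors[t] for t in taxa}
    m = max(joint.values())
    total = m + math.log(sum(math.exp(v - m) for v in joint.values()))
    return {t: math.exp(v - total) for t, v in joint.items()}

# Invented log-likelihoods for a query sequence against three candidate taxa:
post = assignment_posteriors({"Neanderthal": -1002.3, "Modern human": -1009.8, "Chimp": -1030.1})
print(post)
```

Unlike a raw Blast bit score, the normalized posterior is directly interpretable as a probability of assignment.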

  17. Statistical modeling of static strengths of nuclear graphites with relevance to structural design

    International Nuclear Information System (INIS)

    Arai, Taketoshi

    1992-02-01

    Use of graphite materials for structural members poses the problem of how to take into account the statistical properties of static strength, especially tensile fracture stresses, in component structural design. The present study comprises comprehensive examinations of the statistical database and modeling for nuclear graphites. First, the report provides individual samples and their analyses of the strengths of IG-110 and PGX graphites for HTTR components. The statistical characteristics of other HTGR graphites are also exemplified from the literature. Most statistical distributions of individual samples are found to be approximately normal. The goodness of fit to normal distributions is more satisfactory with larger sample sizes. Molded and extruded graphites, however, possess a variety of statistical properties depending on samples from different within-log locations and/or different orientations. Second, the previous statistical models, including the Weibull theory, are assessed from the viewpoint of applicability to design procedures. This leads to the conclusion that the Weibull theory and its modified versions are satisfactory only for limited parts of tensile fracture behavior; they are not consistent with all observations. Only normal statistics are justifiable as practical approaches to discuss specified minimum ultimate strengths as statistical confidence limits for individual samples. Third, the assessment of various statistical models emphasizes the need to develop advanced analytical models which should involve modeling of the microstructural features of actual graphite materials. Improvements of other structural design methodologies are also presented. (author)
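The contrast between the two treatments of strength data discussed above can be sketched numerically: a two-parameter Weibull model gives a survival probability at a given stress, while the normal-statistics approach yields a specified minimum strength as a lower confidence-type limit. The parameter values (Weibull modulus, scale, and the factor k) are illustrative assumptions, not properties of IG-110 or PGX graphite.

```python
import math

def weibull_survival(stress, modulus, scale):
    """Two-parameter Weibull survival probability P(strength > stress)."""
    return math.exp(-((stress / scale) ** modulus))

def normal_specified_minimum(mean, sd, k=3.0):
    """Specified minimum ultimate strength as mean - k*sd (k is an assumed factor)."""
    return mean - k * sd

# Illustrative tensile-strength figures (MPa):
print(weibull_survival(20.0, modulus=10.0, scale=30.0))  # survival probability at 20 MPa
print(normal_specified_minimum(25.0, 2.0))               # normal-statistics minimum strength
```

The Weibull form ties failure probability to stressed volume and flaw statistics, whereas the normal form simply bounds the lower tail of the observed sample, which is the distinction the study's assessment turns on.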

  18. Statistical point of view on nucleus excited states and fluctuations of differential polarization of particles emitted during nuclear reactions

    International Nuclear Information System (INIS)

    Dumazet, Gerard

    1965-01-01

    Since previous works, notably those of Ericson, outlined the fact that the compound nucleus model results in variations of cross sections about their average values, and that these variations are not at all negligible as had previously been admitted, this research thesis aims at establishing theoretical predictions and at showing that Ericson's predictions can be extended to polarization. After having qualitatively and quantitatively recalled the underlying concepts used in the compound nucleus and direct interaction models, the author shows the relevance of a statistical point of view on nuclei, which must not be confused with the statistical model itself. Then, after a recall of results obtained by Ericson, the author reports the study of the fluctuations of differential polarization, addresses the experimental aspect of fluctuations, and shows which are the main factors for this kind of study [fr

  19. Studying the microlenses mass function from statistical analysis of the caustic concentration

    Energy Technology Data Exchange (ETDEWEB)

    Mediavilla, T; Ariza, O [Departamento de Estadistica e Investigacion Operativa, Universidad de Cadiz, Avda de Ramon Puyol, s/n 11202 Algeciras (Spain); Mediavilla, E [Instituto de Astrofisica de Canarias, Avda Via Lactea s/n, La Laguna (Spain); Munoz, J A, E-mail: teresa.mediavilla@ca.uca.es, E-mail: octavio.ariza@uca.es, E-mail: emg@iac.es [Departamento de Astrofisica y Astronomia, Universidad de Valencia, Burjassot, Valencia (Spain)

    2011-09-22

    The statistical distribution of caustic crossings by the images of a lensed quasar depends on the properties of the distribution of microlenses in the lens galaxy. We use a procedure based on Inverse Polygon Mapping to easily identify the critical and caustic curves generated by a distribution of stars in the lens galaxy. We analyze the statistical distributions of the number of caustic crossings by a pixel-size source for several projected mass densities and different mass distributions. We compare the results of the simulations with theoretical binomial distributions. Finally, we apply this method to the study of the stellar mass distribution in the lens galaxy of QSO 2237+0305.

  20. Initiating statistical maintenance optimization

    International Nuclear Information System (INIS)

    Doyle, E. Kevin; Tuomi, Vesa; Rowley, Ian

    2007-01-01

    Since the 1980s, maintenance optimization has been centered around various formulations of Reliability Centered Maintenance (RCM). Several such optimization techniques have been implemented at the Bruce Nuclear Station. Further cost refinement of the station's preventive maintenance strategy includes evaluation of statistical optimization techniques. A review of successful pilot efforts in this direction is provided, as well as initial work with graphical analysis. The present situation regarding data sourcing, the principal impediment to the use of stochastic methods in previous years, is discussed. The use of Crow/AMSAA (Army Materiel Systems Analysis Activity) plots is demonstrated from the point of view of justifying expenditures in optimization efforts. (author)
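A Crow/AMSAA (NHPP power-law) analysis of the kind mentioned can be sketched with the standard failure-truncated maximum-likelihood estimates. The failure times below are invented; in this model a growth parameter beta < 1 indicates that the interval between failures is lengthening, i.e. reliability is improving.

```python
import math

def crow_amsaa_mle(failure_times):
    """Failure-truncated Crow/AMSAA MLEs for N(t) = lam * t**beta:
    beta = n / sum(ln(T / t_i)), lam = n / T**beta, where T is the last failure time."""
    n = len(failure_times)
    T = failure_times[-1]
    beta = n / sum(math.log(T / t) for t in failure_times)  # last term ln(T/T) = 0
    lam = n / T ** beta
    return lam, beta

# Invented cumulative failure times (hours) showing lengthening gaps between failures:
times = [50.0, 160.0, 350.0, 650.0, 1100.0, 1800.0]
lam, beta = crow_amsaa_mle(times)
print(f"lambda = {lam:.4f}, beta = {beta:.3f}")
```

Plotting cumulative failures against time on log-log axes gives the straight-line Crow/AMSAA plot used to justify (or question) maintenance expenditures.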

  1. Statistical study of overvoltages by maneuvering in switches in high voltage using EMTP-RV

    International Nuclear Information System (INIS)

    Dominguez Herrera, Diego Armando

    2013-01-01

    The transient overvoltages produced by switching maneuvers are studied statistically, through variation of the sequential closing times of the switches, in networks above 230 kV. The study is performed according to typical time delays and deviation ranges, using the tool EMTP-RV (ElectroMagnetic Transient Program, Restructured Version). A conceptual framework related to electromagnetic switching transients is developed for three-phase switches installed at nominal voltages higher than 230 kV. The methodology established for the execution of statistical studies of switching overvoltages is reviewed and evaluated by simulating two fictitious cases in EMTP-RV [es

  2. A Questionnaire Study on the Attitudes and Previous Experience of Croatian Family Physicians toward their Preparedness for Disaster Management.

    Science.gov (United States)

    Pekez-Pavliško, Tanja; Račić, Maja; Jurišić, Dinka

    2018-04-01

    To explore family physicians' attitudes, previous experience, and self-assessed preparedness to respond or to assist in mass casualty incidents in Croatia. The cross-sectional survey was carried out during January 2017. Study participants were recruited through a Facebook group that brings together family physicians from Croatia. They were asked to complete the questionnaire, which was distributed via google.docs. Knowledge and attitudes toward disaster preparedness were evaluated by 18 questions. Analysis of variance, Student's t test, and the Kruskal-Wallis test were used for statistical analysis. Risk awareness of disasters was high among respondents (M = 4.89, SD = 0.450). Only 16.4% of respondents had participated in the management of a disaster at the scene. The majority (73.8%) of physicians had not participated in any educational activity dealing with disasters over the past two years. Family physicians believed they are not well prepared to participate in the national (M = 3.02, SD = 0.856) or local community (M = 3.16, SD = 1.119) emergency response systems for disaster. Male physicians scored higher than their female colleagues on preparedness to participate in the national emergency response system for disaster (p = 0.012), to carry out accepted triage principles used in disaster situations (p = 0.003), and to recognize differences in health assessments indicating potential exposure to specific agents (p = 0.001). The Croatian primary healthcare system attracts many young physicians, who can be an important part of disaster and emergency management. However, the lack of experience despite high motivation indicates a need for the inclusion of disaster medicine training during undergraduate studies and in annual educational activities.

  3. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  4. The use of statistical models in heavy-ion reactions studies

    International Nuclear Information System (INIS)

    Stokstad, R.G.

    1984-01-01

    This chapter reviews the use of statistical models to describe nuclear level densities and the decay of equilibrated nuclei. The statistical models of nuclear structure and nuclear reactions presented here have wide application in the analysis of heavy-ion reaction data. Applications are illustrated with examples of gamma-ray decay, the emission of light particles and heavier clusters of nucleons, and fission. In addition to the compound nucleus, the treatment of equilibrated fragments formed in binary reactions is discussed. The statistical model is shown to be an important tool for the identification of products from nonequilibrium decay

  5. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, covering the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, the statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and nonideal lattice models, imperfect gas theory of liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of macromolecular systems, and quantum statistics

  6. Storytelling, statistics and hereditary thought: the narrative support of early statistics.

    Science.gov (United States)

    López-Beltrán, Carlos

    2006-03-01

    This paper's main contention is that some basically methodological developments in science which are apparently distant and unrelated can be seen as part of a sequential story. Focusing on general inferential and epistemological matters, the paper links occurrences separated in both time and space by formal and representational issues rather than by social or disciplinary links. It focuses on a few limited aspects of several cognitive practices in medical and biological contexts separated by geography, disciplines, and decades, but connected by long-term transdisciplinary representational and inferential structures and constraints. The paper intends to show that a given set of knowledge claims based on organizing empirical data statistically can be seen to have been underpinned by a previous, more familiar, and probably more natural, narrative handling of similar evidence. To achieve this, the paper moves from medicine in France in the late eighteenth and early nineteenth century to the second half of the nineteenth century in England among gentleman naturalists, following its subject: the shift from narrative depictions of the hereditary transmission of physical peculiarities to posterior statistical articulations of the same phenomena. Some early defenders of heredity as an important (if not the most important) causal presence in the understanding of life adopted singular narratives, in the form of case stories from the medical and natural history traditions, to flesh out a special kind of causality peculiar to heredity. This work tries to reconstruct historically the rationale that drove the use of such narratives. It then shows that when this rationale was methodologically challenged, its basic narrative and probabilistic underpinnings were transferred to the statistical quantificational tools that took their place.

  7. Does local endometrial injury in the nontransfer cycle improve the IVF-ET outcome in the subsequent cycle in patients with previous unsuccessful IVF? A randomized controlled pilot study

    Directory of Open Access Journals (Sweden)

    Sachin A Narvekar

    2010-01-01

    Full Text Available Background: Management of repeated implantation failure despite transfer of good-quality embryos still remains a dilemma for ART specialists. Scraping of the endometrium in the nontransfer cycle has been shown in recent studies to improve the pregnancy rate in the subsequent IVF/ET cycle. Aim: The objective of this randomized controlled trial (RCT) was to determine whether endometrial injury caused by Pipelle sampling in the nontransfer cycle could improve the probability of pregnancy in the subsequent IVF cycle in patients who had a previous failed IVF outcome. Setting: Tertiary assisted conception center. Design: Randomized controlled study. Materials and Methods: 100 eligible patients with previous failed IVF despite transfer of good-quality embryos were randomly allocated to the intervention and control groups. In the intervention group, Pipelle endometrial sampling was done twice: once in the follicular phase and again in the luteal phase of the cycle preceding the embryo transfer cycle. Outcome Measure: The primary outcome measure was live birth rate. The secondary outcome measures were implantation and clinical pregnancy rates. Results: The live birth rate was significantly higher in the intervention group compared to the control group (22.4% vs 9.8%, P = 0.04). The clinical pregnancy rate in the intervention group was 32.7%, while that in the control group was 13.7%, which was also statistically significant (P = 0.01). The implantation rate was significantly higher in the intervention group as compared to controls (13.07% vs 7.1%, P = 0.04). Conclusions: Endometrial injury in the nontransfer cycle improves the live birth rate, clinical pregnancy, and implantation rates in the subsequent IVF-ET cycle in patients with previous unsuccessful IVF cycles.
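A comparison of two proportions like the live birth rates above is commonly tested with a pooled two-proportion z-test. The sketch below uses illustrative counts chosen only to approximate the reported percentages; they are an assumption, not the trial's actual arm sizes or analysis.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts approximating 22.4% (11/49) vs 9.8% (5/51):
z = two_proportion_z(11, 49, 5, 51)
print(f"z = {z:.2f}")
```

With these small arms the z statistic sits near the conventional significance boundary, which is why exact or one-sided tests are often preferred at this sample size.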

  8. Mathematical methods in quantum and statistical mechanics

    International Nuclear Information System (INIS)

    Fishman, L.

    1977-01-01

    The mathematical structure and closed-form solutions pertaining to several physical problems in quantum and statistical mechanics are examined in some detail. The J-matrix method, introduced previously for s-wave scattering and based upon well-established Hilbert space theory and related generalized integral transformation techniques, is extended to treat the lth partial wave kinetic energy and Coulomb Hamiltonians within the context of square integrable (L²), Laguerre (Slater), and oscillator (Gaussian) basis sets. The theory of relaxation in statistical mechanics within the context of the theory of linear integro-differential equations of the Master Equation type and their corresponding Markov processes is examined. Several topics of a mathematical nature concerning various computational aspects of the L² approach to quantum scattering theory are discussed

  9. Outcomes Definitions and Statistical Tests in Oncology Studies: A Systematic Review of the Reporting Consistency.

    Science.gov (United States)

    Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie

    2016-01-01

    Quality of reporting for Randomized Clinical Trials (RCTs) in oncology has been analyzed in several systematic reviews, but, in this setting, there is a paucity of data on outcome definitions and the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe those two reporting aspects for OBS and RCTs in oncology. From a list of 19 medical journals, three were retained for analysis after a random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess quality of reporting for the description of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify covariates of studies potentially associated with concordance of tests between the Methods and Results sections. 826 studies were included in the review, of which 698 were OBS. Variables were described in the Methods section for all OBS, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement between the statistical tests reported in the Methods and Results sections. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the adjusted Odds Ratio (aOR) for the third group compared to the first group was aOR Grp3 = 0.52 [0.31-0.89] (P value = 0.009). Variables in OBS and primary endpoints in RCTs are reported and described with high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always achieved. Therefore, we encourage authors and peer reviewers to verify the consistency of statistical tests in oncology studies.

  10. AP statistics crash course

    CERN Document Server

    D'Alessio, Michael

    2012-01-01

    AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da

  11. A statistical method for the detection of alternative splicing using RNA-seq.

    Directory of Open Access Journals (Sweden)

    Liguo Wang

    2010-01-01

    Full Text Available Deep sequencing of the transcriptome (RNA-seq) provides an unprecedented opportunity to interrogate plausible mRNA splicing patterns by mapping RNA-seq reads to exon junctions (hereafter, junction reads). In most previous studies, exon junctions were detected by using the quantitative information of junction reads. The quantitative criterion (e.g., a minimum of two junction reads), although straightforward and widely used, usually results in high false positive and false negative rates, owing to the complexity of the transcriptome. Here, we introduced a new metric, namely Minimal Match on Either Side of exon junction (MMES), to measure the quality of each junction read, and subsequently implemented an empirical statistical model to detect exon junctions. When applied to a large dataset (>200M reads) consisting of mouse brain, liver and muscle mRNA sequences, and using independent transcript databases as positive control, our method proved to be considerably more accurate than previous ones, especially for detecting junctions originating from low-abundance transcripts. Our results were also confirmed by real-time RT-PCR assay. The MMES metric can be used either in this empirical statistical model or in other more sophisticated classifiers, such as logistic regression.
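The MMES metric itself is simple to state: for a read whose alignment spans an exon junction, it is the smaller of the matched lengths on the two sides of the junction, so a read that barely overhangs one exon scores low. This toy coordinate-based implementation is a sketch of the idea; the paper couples the metric to an empirical statistical model rather than using it alone.

```python
def mmes(read_start, read_end, junction_pos):
    """Minimal Match on Either Side: min of the aligned lengths to the left and
    right of an exon junction. Coordinates are half-open [read_start, read_end)."""
    left = junction_pos - read_start
    right = read_end - junction_pos
    if left <= 0 or right <= 0:
        return 0  # the read does not actually span the junction
    return min(left, right)

print(mmes(100, 136, 118))  # centered read: 18 matched bases on each side
print(mmes(100, 136, 131))  # overhangs the junction by only 5 bases: low quality
print(mmes(100, 136, 140))  # junction outside the read: no support
```

Short overhangs align almost anywhere by chance, which is why a high MMES is a much stronger signal of a real junction than a raw junction-read count.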

  12. Binomial vs poisson statistics in radiation studies

    International Nuclear Information System (INIS)

    Foster, J.; Kouris, K.; Spyrou, N.M.; Matthews, I.P.; Welsh National School of Medicine, Cardiff

    1983-01-01

    The processes of radioactive decay, decay and growth of radioactive species in a radioactive chain, prompt emission(s) from nuclear reactions, conventional activation and cyclic activation are discussed with respect to their underlying statistical density function. By considering the transformation(s) that each nucleus may undergo it is shown that all these processes are fundamentally binomial. Formally, when the number of experiments N is large and the probability of success p is close to zero, the binomial is closely approximated by the Poisson density function. In radiation and nuclear physics, N is always large: each experiment can be conceived of as the observation of the fate of each of the N nuclei initially present. Whether p, the probability that a given nucleus undergoes a prescribed transformation, is close to zero depends on the process and nuclide(s) concerned. Hence, although a binomial description is always valid, the Poisson approximation is not always adequate. Therefore further clarification is provided as to when the binomial distribution must be used in the statistical treatment of detected events. (orig.)
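The closing point, that the Poisson form is only adequate when N is large and p is small, can be checked directly by comparing the two density functions; the parameter values below are arbitrary illustrations.

```python
import math

def binom_pmf(k, n, p):
    """Exact binomial probability of k successes in n trials."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, mu):
    """Poisson probability of k events with mean mu."""
    return math.exp(-mu) * mu ** k / math.factorial(k)

# N large, p small: the Poisson approximation (mu = N*p) is excellent.
print(binom_pmf(3, 10000, 0.0003), poisson_pmf(3, 3.0))

# N small, p large: the approximation is poor and the binomial must be used.
print(binom_pmf(5, 10, 0.5), poisson_pmf(5, 5.0))
```

The second comparison corresponds to the situations flagged in the abstract, where the probability of a given nuclear transformation is not close to zero and only the binomial treatment of the detected events is correct.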

  13. [Statistics for statistics?--Thoughts about psychological tools].

    Science.gov (United States)

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods occupy a prominent place in psychologists' educational programs. Known to be difficult to understand and hard to learn, this content is feared by students. Those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because at first glance it does not apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education only makes sense as a way of commanding respect from other professions, namely physicians. For their own work, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented in psychotherapeutic and psychological research. We therefore analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 percent were directly based upon statistics. Being able to write and critically read original articles, the backbone of research, presumes a high degree of statistical education. To ignore statistics means to ignore research, and ultimately to surrender one's own professional work to arbitrariness.

  14. A Pilot Study Teaching Metrology in an Introductory Statistics Course

    Science.gov (United States)

    Casleton, Emily; Beyler, Amy; Genschel, Ulrike; Wilson, Alyson

    2014-01-01

    Undergraduate students who have just completed an introductory statistics course often lack deep understanding of variability and enthusiasm for the field of statistics. This paper argues that by introducing the commonly underemphasized concept of measurement error, students will have a better chance of attaining both. We further present lecture…

  15. Approximations to the distribution of a test statistic in covariance structure analysis: A comprehensive study.

    Science.gov (United States)

    Wu, Hao

    2018-05-01

    In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ² distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ² distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.

  16. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  17. Statistical sampling plan for the TRU waste assay facility

    International Nuclear Information System (INIS)

    Beauchamp, J.J.; Wright, T.; Schultz, F.J.; Haff, K.; Monroe, R.J.

    1983-08-01

    Due to limited space, there is a need to dispose appropriately of the Oak Ridge National Laboratory transuranic waste which is presently stored below ground in 55-gal (208-l) drums within weather-resistant structures. Waste containing less than 100 nCi/g of transuranics can be removed from the present storage and buried, while waste containing more than 100 nCi/g of transuranics must continue to be retrievably stored. To make the measurements needed to determine which drums can be buried, a transuranic Neutron Interrogation Assay System (NIAS) has been developed at Los Alamos National Laboratory; it can make the needed measurements much faster than the previous techniques, which involved γ-ray spectroscopy and are reliable but time consuming. Therefore, a validation study has been planned to determine the ability of the NIAS to make adequate measurements. The validation of the NIAS will be based on a paired comparison of a sample of measurements made by the previous techniques and by the NIAS. The purpose of this report is to describe the proposed sampling plan and the statistical analyses needed to validate the NIAS. 5 references, 4 figures, 5 tables
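    The paired comparison planned for the NIAS validation can be sketched as a paired t statistic on drum-by-drum differences between the two techniques. The measurement values below are hypothetical, purely to illustrate the calculation:

```python
import math
import statistics

# Hypothetical paired measurements (nCi/g) on the same drums:
# previous gamma-ray technique vs. the NIAS. Values are illustrative only.
previous = [92.0, 105.5, 78.3, 110.2, 95.1, 101.7, 88.4, 99.0]
nias     = [90.5, 107.0, 77.9, 108.8, 96.0, 100.9, 89.2, 98.1]

diffs = [a - b for a, b in zip(previous, nias)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)

# Paired t statistic: is the mean difference consistent with zero?
t = mean_d / (sd_d / math.sqrt(n))
print(f"mean difference = {mean_d:.3f}, t = {t:.3f} with {n - 1} df")

# With 7 df the two-sided 5% critical value is about 2.365 (table value);
# |t| below it means no systematic bias between the methods is detected.
print("bias detected" if abs(t) > 2.365 else "no systematic bias detected")
```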

  18. Order-specific fertility estimates based on perinatal statistics and statistics on out-of-hospital births

    OpenAIRE

    Kreyenfeld, Michaela; Peters, Frederik; Scholz, Rembrandt; Wlosnewski, Ines

    2014-01-01

    Until 2008, German vital statistics did not provide information on biological birth order. We have tried to close part of this gap by providing order-specific fertility rates generated from the Perinatal Statistics and from statistics on out-of-hospital births for the period 2001-2008. This investigation has been published in Comparative Population Studies (CPoS) (see Kreyenfeld, Scholz, Peters and Wlosnewski 2010). The CPoS paper describes how data from the Perinatal Statistics and statistics on out...

  19. Statistical aspects of nuclear structure

    International Nuclear Information System (INIS)

    Parikh, J.C.

    1977-01-01

    The statistical properties of energy levels and a statistical approach to transition strengths are discussed in relation to nuclear structure studies at high excitation energies. It is shown that the calculations can also be extended to the ground-state domain. The discussion is based on the random matrix theory of level densities and level spacings, using the Gaussian Orthogonal Ensemble (GOE) concept. Short-range and long-range correlations are also studied statistically. The polynomial expansion method is used to obtain excitation strengths. (A.K.)

  20. A Divergence Statistics Extension to VTK for Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
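    The kind of discrepancy measure described above can be illustrated with the Kullback-Leibler divergence between an empirical histogram and a theoretical "ideal" distribution. This is a generic sketch of the concept, not the VTK engine's API:

```python
from collections import Counter
from math import log

def kl_divergence(empirical, theoretical):
    """D(P||Q) = sum_k P(k) * log(P(k)/Q(k)); terms with P(k)=0 contribute 0."""
    return sum(p * log(p / theoretical[k])
               for k, p in empirical.items() if p > 0)

# Observed die rolls vs. an "ideal" uniform die.
observed = [1, 2, 2, 3, 4, 4, 4, 5, 6, 6, 3, 1]
counts = Counter(observed)
n = len(observed)
empirical = {face: c / n for face, c in counts.items()}
uniform = {face: 1 / 6 for face in range(1, 7)}

d = kl_divergence(empirical, uniform)
print(f"KL divergence from uniform: {d:.4f}")  # 0 only if the distributions match
```

The divergence is zero exactly when the empirical distribution equals the theoretical one, and grows as the two drift apart, which is the "distance-like" behaviour the abstract refers to.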

  1. Does previous open renal surgery or percutaneous nephrolithotomy affect the outcomes and complications of percutaneous nephrolithotomy.

    Science.gov (United States)

    Ozgor, Faruk; Kucuktopcu, Onur; Sarılar, Omer; Toptas, Mehmet; Simsek, Abdulmuttalip; Gurbuz, Zafer Gokhan; Akbulut, Mehmet Fatih; Muslumanoglu, Ahmet Yaser; Binbay, Murat

    2015-11-01

    In this study, we aim to evaluate the effectiveness and safety of PNL in patients with a history of open renal surgery or PNL, by comparison with primary patients, and to compare the impact of previous open renal surgery and previous PNL on the success and complications of subsequent PNL. Charts of patients who underwent PNL at our institute were analyzed retrospectively. Patients were divided into three groups according to history of renal stone surgery. Patients without a history of renal surgery were enrolled into Group 1, while those with previous open renal surgery and those with previous PNL were categorized as Group 2 and Group 3, respectively. Preoperative characteristics, perioperative data, stone-free status, and complication rates were compared between the groups. Stone-free status was defined as complete clearance of stones or residual fragments smaller than 4 mm. Eventually, 2070 patients were enrolled into the study. Previous open renal surgery and previous PNL had been performed in 410 (Group 2) and 131 (Group 3) patients, respectively. The mean operation time was longer (71.3 ± 33.5 min) in Group 2 and the mean fluoroscopy time was longer (8.6 ± 5.0) in Group 3, but the differences between the groups were not statistically significant. The highest stone clearance was achieved in primary PNL patients (81.62%), compared with 77.10% in Group 2 and 75.61% in Group 3. The stone-free rate did not differ significantly between Group 2 and Group 3. Fever, pulmonary complications, and blood transfusion requirements were not statistically different between the groups, but angioembolization was significantly more frequent in Group 2. Percutaneous nephrolithotomy is a safe and effective treatment modality for patients with renal stones regardless of a history of previous PNL or open renal surgery. However, a history of open renal surgery, but not of PNL, significantly reduced PNL success.

  2. Excel 2016 for engineering statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2016-01-01

    This book shows the capabilities of Microsoft Excel in teaching engineering statistics effectively. Similar to the previously published Excel 2013 for Engineering Statistics, this book is a step-by-step, exercise-driven guide for students and practitioners who need to master Excel to solve practical engineering problems. If understanding statistics isn't your strongest suit, if you are not especially mathematically inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in engineering courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Engineering Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and...

  3. Excel 2016 for business statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2016-01-01

    This book shows the capabilities of Microsoft Excel in teaching business statistics effectively. Similar to the previously published Excel 2010 for Business Statistics, this book is a step-by-step, exercise-driven guide for students and practitioners who need to master Excel to solve practical business problems. If understanding statistics isn't your strongest suit, if you are not especially mathematically inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in business courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Business Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their courses and work. Each ch...

  4. Game statistics for the Island of Olkiluoto in 2011-2012

    Energy Technology Data Exchange (ETDEWEB)

    Niemi, M.; Nieminen, M. [Faunatica Oy, Espoo (Finland); Jussila, I. [Turku Univ. (Finland)

    2012-11-15

    The game statistics for the island of Olkiluoto were updated in the summer of 2012 and compared with earlier statistics. Population size estimates are based on interviews with local hunters. No moose or deer inventories were made in the winter of 2011-2012. The moose population has been decreasing slightly over the past ten years. The growing lynx population has a decreasing effect on the small-ungulate (white-tailed deer and roe deer) populations. The numbers of hunted mountain hares and European brown hares decreased compared with the previous year. In addition, the number of hunted raccoon dogs was about 50 per cent lower than in 2010. Altogether 27 waterfowl were hunted in 2011. The mountain hare population is abundant, even though lynx were living on the eastern part of the island during the winter of 2011. Based on track observations, pine martens also live in the area. In addition, there were some observations of wolves visiting the area. The winter of 2011-2012 was milder than the previous one, and young swans wintering in the area seemed to survive better than in the previous winter. (orig.)

  5. Game statistics for the Island of Olkiluoto in 2011-2012

    International Nuclear Information System (INIS)

    Niemi, M.; Nieminen, M.; Jussila, I.

    2012-11-01

    The game statistics for the island of Olkiluoto were updated in the summer of 2012 and compared with earlier statistics. Population size estimates are based on interviews with local hunters. No moose or deer inventories were made in the winter of 2011-2012. The moose population has been decreasing slightly over the past ten years. The growing lynx population has a decreasing effect on the small-ungulate (white-tailed deer and roe deer) populations. The numbers of hunted mountain hares and European brown hares decreased compared with the previous year. In addition, the number of hunted raccoon dogs was about 50 per cent lower than in 2010. Altogether 27 waterfowl were hunted in 2011. The mountain hare population is abundant, even though lynx were living on the eastern part of the island during the winter of 2011. Based on track observations, pine martens also live in the area. In addition, there were some observations of wolves visiting the area. The winter of 2011-2012 was milder than the previous one, and young swans wintering in the area seemed to survive better than in the previous winter. (orig.)

  6. Statistics for non-statisticians

    CERN Document Server

    Madsen, Birger Stjernholm

    2016-01-01

    This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included, e.g. in the area of statistical quality control, in order to make the book even more useful for practitioners working in industry.

  7. Automating Exams for a Statistics Course: II. A Case Study.

    Science.gov (United States)

    Michener, R. Dean; And Others

    A specific application of the process of automating exams for any introductory statistics course is described. The process of automating exams was accomplished by using the Statistical Test Item Collection System (STICS). This system was first used to select a set of questions based on course requirements established in advance; afterward, STICS…

  8. The Communicability of Graphical Alternatives to Tabular Displays of Statistical Simulation Studies

    Science.gov (United States)

    Cook, Alex R.; Teo, Shanice W. L.

    2011-01-01

    Simulation studies are often used to assess the frequency properties and optimality of statistical methods. They are typically reported in tables, which may contain hundreds of figures to be contrasted over multiple dimensions. To assess the degree to which these tables are fit for purpose, we performed a randomised cross-over experiment in which statisticians were asked to extract information from (i) such a table sourced from the literature and (ii) a graphical adaptation designed by the authors, and were timed and assessed for accuracy. We developed hierarchical models accounting for differences between individuals of different experience levels (under- and post-graduate), within experience levels, and between different table-graph pairs. In our experiment, information could be extracted quicker and, for less experienced participants, more accurately from graphical presentations than tabular displays. We also performed a literature review to assess the prevalence of hard-to-interpret design features in tables of simulation studies in three popular statistics journals, finding that many are presented innumerately. We recommend simulation studies be presented in graphical form. PMID:22132184

  9. Mathematical statistics and stochastic processes

    CERN Document Server

    Bosq, Denis

    2013-01-01

    Generally, books on mathematical statistics are restricted to the case of independent identically distributed random variables. In this book however, both this case AND the case of dependent variables, i.e. statistics for discrete and continuous time processes, are studied. This second case is very important for today's practitioners.Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of theory of probability, estimation, confidence intervals, non-parametric statistics and rob

  10. Assessing Statistical Competencies in Clinical and Translational Science Education: One Size Does Not Fit All

    Science.gov (United States)

    Lindsell, Christopher J.; Welty, Leah J.; Mazumdar, Madhu; Thurston, Sally W.; Rahbar, Mohammad H.; Carter, Rickey E.; Pollock, Bradley H.; Cucchiara, Andrew J.; Kopras, Elizabeth J.; Jovanovic, Borko D.; Enders, Felicity T.

    2014-01-01

    Abstract Introduction Statistics is an essential training component for a career in clinical and translational science (CTS). Given the increasing complexity of statistics, learners may have difficulty selecting appropriate courses. Our question was: what depth of statistical knowledge do different CTS learners require? Methods For three types of CTS learners (principal investigator, co‐investigator, informed reader of the literature), each with different backgrounds in research (no previous research experience, reader of the research literature, previous research experience), 18 experts in biostatistics, epidemiology, and research design proposed levels for 21 statistical competencies. Results Statistical competencies were categorized as fundamental, intermediate, or specialized. CTS learners who intend to become independent principal investigators require more specialized training, while those intending to become informed consumers of the medical literature require more fundamental education. For most competencies, less training was proposed for those with more research background. Discussion When selecting statistical coursework, the learner's research background and career goal should guide the decision. Some statistical competencies are considered to be more important than others. Baseline knowledge assessments may help learners identify appropriate coursework. Conclusion Rather than one size fits all, tailoring education to baseline knowledge, learner background, and future goals increases learning potential while minimizing classroom time. PMID:25212569

  11. Modern applied U-statistics

    CERN Document Server

    Kowalski, Jeanne

    2008-01-01

    A timely and applied approach to the newly discovered methods and applications of U-statistics. Built on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, as well as addressing key concepts in asymptotic theory by integrating translational and cross-disciplinary research. The authors begin with an introduction to the essential and theoretical foundations of U-statistics, such as the notion of convergence in probability and distribution, basic convergence results, stochastic Os, inference theory, generalized estimating equations, as well as the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applic...

  12. An Exploratory Study of Taiwanese Mathematics Teachers' Conceptions of School Mathematics, School Statistics, and Their Differences

    Science.gov (United States)

    Yang, Kai-Lin

    2014-01-01

    This study used phenomenography, a qualitative method, to investigate Taiwanese mathematics teachers' conceptions of school mathematics, school statistics, and their differences. To collect data, we interviewed five mathematics teachers by open questions. They also responded to statements drawn on mathematical/statistical conceptions and…

  13. Statistical shear lag model - unraveling the size effect in hierarchical composites.

    Science.gov (United States)

    Wei, Xiaoding; Filleter, Tobin; Espinosa, Horacio D

    2015-05-01

    Numerous experimental and computational studies have established that the hierarchical structures encountered in natural materials, such as the brick-and-mortar structure observed in sea shells, are essential for achieving defect tolerance. Due to this hierarchy, the mechanical properties of natural materials have a different size dependence compared to that of typical engineered materials. This study aimed to explore size effects on the strength of bio-inspired staggered hierarchical composites and to define the influence of the geometry of constituents on their outstanding defect tolerance capability. A statistical shear lag model is derived by extending the classical shear lag model to account for the statistics of the constituents' strength. A general solution emerges from rigorous mathematical derivations, unifying the various empirical formulations for the fundamental link length used in previous statistical models. The model shows that the staggered arrangement of constituents grants composites a unique size effect on mechanical strength, in contrast to homogeneous continuous materials. The model is applied to hierarchical yarns consisting of double-walled carbon nanotube bundles to assess its predictive capabilities for novel synthetic materials. Interestingly, the model predicts that yarn gauge length does not significantly influence the yarn strength, in close agreement with experimental observations. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  14. Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements

    Science.gov (United States)

    Papa, A. R.; Akel, A. F.

    2009-05-01

    Following previous works by our group (Papa et al., JASTP, 2006), in which we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil for the month of October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared with traditional methods (Fourier, for example), while the amplitude and frequency of signals can be followed almost continuously as time goes by. This advantage opens some possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is in this sense that we have found our main goal. Some possible directions for future work are advanced.

  15. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    Science.gov (United States)

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10 mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
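    An a priori power calculation of the kind the authors' tool supports can be sketched with the standard two-sample normal approximation. The effect size and standard deviation below are illustrative and are not taken from the FreeSurfer analyses:

```python
from math import ceil, sqrt, erf

def z_quantile(p):
    """Standard normal quantile via bisection on the CDF (stdlib only)."""
    cdf = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if cdf(mid) < p else (lo, mid)
    return (lo + hi) / 2

def n_per_group(delta, sigma, alpha=0.05, power=0.8):
    """Two-sample normal-approximation sample size per group:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)**2, rounded up."""
    za = z_quantile(1 - alpha / 2)
    zb = z_quantile(power)
    return ceil(2 * ((za + zb) * sigma / delta) ** 2)

# To detect a 10% group difference in a measure with mean 100
# (delta = 10) and standard deviation 15, at power 0.8 and alpha 0.05:
print(n_per_group(delta=10, sigma=15))  # 36 subjects per group
```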

  16. USING STATISTICAL SURVEY IN ECONOMICS

    Directory of Open Access Journals (Sweden)

    Delia TESELIOS

    2012-01-01

    Full Text Available Statistical survey is an effective method of statistical investigation that involves gathering quantitative data. It is often preferred in statistical reports because of the information that can be obtained about an entire population by observing only a part of it. Because of the information they provide, surveys are used in many research areas. In economics, they support decision making, the choice of competitive strategies, the analysis of economic phenomena, and the formulation of forecasts. The economic study presented in this paper illustrates how simple random sampling is used to analyse the parking-space situation in a given locality.
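    The simple-random-sampling design mentioned above can be sketched as estimating an occupancy proportion with a confidence interval. The population and sample below are hypothetical, not the paper's data:

```python
import math
import random

random.seed(42)

# Hypothetical population: 2000 parking spaces, each occupied (1) or free (0).
population = [1] * 1400 + [0] * 600
n = 200
sample = random.sample(population, n)  # simple random sample without replacement

p_hat = sum(sample) / n
# Normal-approximation 95% CI for a proportion, with finite-population
# correction since the sample is a noticeable fraction of the population.
fpc = math.sqrt((len(population) - n) / (len(population) - 1))
se = math.sqrt(p_hat * (1 - p_hat) / n) * fpc
lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"occupancy estimate: {p_hat:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

The point of the exercise is the one the abstract makes: observing 10% of the spaces already pins the occupancy rate down to within a few percentage points.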

  17. Statistics Anxiety and Instructor Immediacy

    Science.gov (United States)

    Williams, Amanda S.

    2010-01-01

    The purpose of this study was to investigate the relationship between instructor immediacy and statistics anxiety. It was predicted that students receiving immediacy would report lower levels of statistics anxiety. Using a pretest-posttest-control group design, immediacy was measured using the Instructor Immediacy scale. Statistics anxiety was…

  18. A study of outliers in statistical distributions of mechanical properties of structural steels

    International Nuclear Information System (INIS)

    Oefverbeck, P.; Oestberg, G.

    1977-01-01

    The safety against failure of pressure vessels can be assessed by statistical methods, so-called probabilistic fracture mechanics. The data base for such estimations is admittedly rather meagre, making it necessary to assume certain conventional statistical distributions. Since the failure rates arrived at are low, for nuclear vessels of the order of 10 - to 10 - per year, the extremes of the variables involved, among them the mechanical properties of the steel used, are of particular interest. A question sometimes raised is whether outliers, i.e. values exceeding the extremes in the assumed distributions, might occur. To explore this possibility, a study was made of strength values for three grades of structural steel, available in samples of up to about 12,000 values. Statistical evaluation of these samples with standard outlier tests revealed the presence of outliers in most cases, typically with a frequency of a few values per thousand. Obviously, statistical analysis alone cannot be expected to shed any light on the causes of outliers. Thus, the interpretation of these results with respect to their implications for the probabilistic estimation of the integrity of pressure vessels must await further studies of a similar nature, in which the test specimens corresponding to outliers can be recovered and examined metallographically. For the moment the results should be regarded only as a factor to be considered in discussions of the safety of pressure vessels. (author)

  19. Lifetime statistics of quantum chaos studied by a multiscale analysis

    KAUST Repository

    Di Falco, A.

    2012-04-30

    In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.

  20. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical analysis and probabilistic risk assessment (PRA) are discussed in terms of the categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
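    One standard statistical trend test for rare failure events, used here purely as an illustration (the abstract does not give the authors' own formulation), is the Laplace test on event times within an observation window:

```python
from math import sqrt

def laplace_trend_statistic(event_times, t_end):
    """Laplace test for trend in a point process observed on (0, t_end].
    Under a constant failure rate the statistic is approximately standard
    normal; large positive values suggest an increasing rate, large
    negative values a decreasing one.
    """
    n = len(event_times)
    mean_time = sum(event_times) / n
    return (mean_time - t_end / 2) / (t_end * sqrt(1 / (12 * n)))

# Hypothetical failure epochs (e.g. years into a 10-year observation period):
clustered_late = [6.0, 7.5, 8.2, 9.0, 9.6]   # failures bunch up late
u = laplace_trend_statistic(clustered_late, t_end=10.0)
print(f"Laplace u = {u:.2f}")  # well above 1.96: evidence of a worsening trend
```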

  1. Sizing for the apparel industry using statistical analysis - a Brazilian case study

    Science.gov (United States)

    Capelassi, C. H.; Carvalho, M. A.; El Kattel, C.; Xu, B.

    2017-10-01

    This study of the body measurements of Brazilian women used the Kinect Body Imaging system for 3D body scanning, and aims to meet the apparel industry's need for accurate measurements. Data were statistically treated using the IBM SPSS 23 system, with 95% confidence (P 0,58) and from the Hip-to-Height Ratio - HHR (bottom portion): Small (HHR 0,68).

  2. Common pitfalls in statistical analysis: Absolute risk reduction, relative risk reduction, and number needed to treat

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Aggarwal, Rakesh

    2016-01-01

    In the previous article in this series on common pitfalls in statistical analysis, we looked at the difference between risk and odds. Risk, which refers to the probability of occurrence of an event or outcome, can be defined in absolute or relative terms. Understanding what these measures represent is essential for the accurate interpretation of study results. PMID:26952180
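    The measures named in the title follow directly from the event risks in the two groups; a minimal sketch with illustrative trial numbers (not data from the article):

```python
def risk_measures(control_events, control_n, treated_events, treated_n):
    """Absolute risk reduction, relative risk reduction, number needed to treat."""
    cer = control_events / control_n   # control event rate
    eer = treated_events / treated_n   # experimental event rate
    arr = cer - eer                    # absolute risk reduction
    rrr = arr / cer                    # relative risk reduction
    nnt = 1 / arr                      # number needed to treat
    return arr, rrr, nnt

# Illustrative trial: 20/100 events in controls vs. 15/100 in the treated arm.
arr, rrr, nnt = risk_measures(20, 100, 15, 100)
print(f"ARR = {arr:.2f}, RRR = {rrr:.0%}, NNT = {nnt:.0f}")
# ARR = 0.05, RRR = 25%, NNT = 20
```

Note how the relative reduction (25%) sounds far more impressive than the absolute reduction (5 percentage points), which is exactly why the distinction matters for interpreting study results.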

  3. Statistical model of stress corrosion cracking based on extended ...

    Indian Academy of Sciences (India)

    2016-09-07

    Sep 7, 2016 ... Abstract. In the previous paper (Pramana – J. Phys. 81(6), 1009 (2013)), the mechanism of stress corrosion cracking (SCC) based on non-quadratic form of Dirichlet energy was proposed and its statistical features were discussed. Following those results, we discuss here how SCC propagates on pipe wall ...

  4. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

Statistical Mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination systems of the universities. The need to study this subject, and its relation to thermodynamics, is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their ranges of application and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed-matter physics, and transport phenomena (thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  5. The spin-statistics connection in quantum gravity

    International Nuclear Information System (INIS)

    Balachandran, A.P.; Batista, E.; Costa e Silva, I.P.; Teotonio-Sobrinho, P.

    2000-01-01

It is well known that in spite of sharing some properties with conventional particles, topological geons in general violate the spin-statistics theorem. On the other hand, it is generally believed that in quantum gravity theories allowing for topology change, using pair creation and annihilation of geons, one should be able to recover this theorem. In this paper, we take an alternative route, and use an algebraic formalism developed in previous work. We give a description of topological geons where an algebra of 'observables' is identified and quantized. Different irreducible representations of this algebra correspond to different kinds of geons, and are labeled by a non-abelian 'charge' and 'magnetic flux'. We then find that the usual spin-statistics theorem is indeed violated, but a new spin-statistics relation arises, when we assume that the fluxes are superselected. This assumption can be proved if all observables are local, as is generally the case in physical theories. Finally, we also discuss how our approach fits into conventional formulations of quantum gravity.

  6. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the...

  7. The Incoming Statistical Knowledge of Undergraduate Majors in a Department of Mathematics and Statistics

    Science.gov (United States)

    Cook, Samuel A.; Fukawa-Connelly, Timothy

    2016-01-01

Studies have shown that at the end of an introductory statistics course, students struggle with building-block concepts, such as the mean and standard deviation, and rely on procedural understandings of the concepts. This study aims to investigate the understandings that entering freshmen bring to a department of mathematics and statistics (including mathematics…

  8. Strange statistics, braid group representations and multipoint functions in the N-component model

    International Nuclear Information System (INIS)

    Lee, H.C.; Ge, M.L.; Couture, M.; Wu, Y.S.

    1989-01-01

The statistics of fields in low dimensions is studied from the point of view of the braid group B_n of n strings. Explicit representations M_R for the N-component model, N = 2 to 5, are derived by solving the Yang-Baxter-like braid group relations for the statistical matrix R, which describes the transformation of the bilinear product of two N-component fields under the transposition of coordinates. When R^2 ≠ 1 the statistics is neither Bose-Einstein nor Fermi-Dirac; it is strange. It is shown that for each N, the (N+1)-parameter family of solutions obtained is the most general one under a given set of constraints including charge conservation. Extended Nth-order (N > 2) Alexander-Conway relations for link polynomials are derived. They depend nonhomogeneously on only one of the N+1 parameters. The N = 3 and 4 ones agree with those previously derived.

  9. Statistical Power in Meta-Analysis

    Science.gov (United States)

    Liu, Jin

    2015-01-01

Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation for the two-sample mean difference test under different situations: (1) the discrepancy between the analytical power and…
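The analytical power the abstract refers to can be illustrated with a normal-approximation calculation for a two-sided two-sample mean difference test; this is a generic sketch, and the effect size, group size, and alpha below are arbitrary illustrative values, not taken from the study:

```python
from statistics import NormalDist

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test for a
    standardized mean difference d with n_per_group subjects per arm."""
    nd = NormalDist()
    se = (2.0 / n_per_group) ** 0.5          # SE of the standardized difference
    z_crit = nd.inv_cdf(1 - alpha / 2)       # two-sided critical value
    ncp = d / se                             # noncentrality under the alternative
    # Power = P(reject H0 | H1); the far tail is usually negligible.
    return (1 - nd.cdf(z_crit - ncp)) + nd.cdf(-z_crit - ncp)

power = two_sample_power(d=0.5, n_per_group=64)
print(round(power, 3))  # roughly 0.81 for a medium effect with 64 per arm
```

Larger groups or larger effects push the noncentrality up and hence the power toward 1, which is the behavior such simulation studies probe.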

  10. Narrative Review of Statistical Reporting Checklists, Mandatory Statistical Editing, and Rectifying Common Problems in the Reporting of Scientific Articles.

    Science.gov (United States)

    Dexter, Franklin; Shafer, Steven L

    2017-03-01

Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting. We also review several retrospective assessments of the impact of these efforts. These studies show, first, that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally assess statistical quality poorly. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.

  11. Thiele. Pioneer in statistics

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt

    This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...

  12. Infant Statistical Learning

    Science.gov (United States)

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  13. Technical issues relating to the statistical parametric mapping of brain SPECT studies

    International Nuclear Information System (INIS)

    Hatton, R.L.; Cordato, N.; Hutton, B.F.; Lau, Y.H.; Evans, S.G.

    2000-01-01

Full text: Statistical Parametric Mapping (SPM) is a software tool designed for the statistical analysis of functional neuroimages, specifically Positron Emission Tomography and functional Magnetic Resonance Imaging, and more recently SPECT. This review examines some problems associated with the analysis of SPECT. A comparison of a patient group with normal studies revealed factors that could influence results, some that commonly occur, others that require further exploration. To optimise the differences between two groups of subjects, both spatial variability and differences in global activity must be minimised. The choice and effectiveness of the coregistration method and the approach to normalisation of activity concentration can affect the optimisation. A small number of subject scans were identified as possessing truncated data, resulting in edge effects that could adversely influence the analysis. Other problems included unusual areas of significance, possibly related to reconstruction methods and the geometry associated with non-parallel collimators. Areas of extracerebral significance are a point of concern and may result from scatter effects or misregistration. Difficulties in patient positioning, due to postural limitations, can lead to resolution differences. SPM has been used to assess areas of statistical significance arising from these technical factors, as opposed to areas of true clinical significance, when comparing subject groups. This contributes to a better understanding of the effects of technical factors so that these may be eliminated, minimised, or incorporated in the study design. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  14. A Quantitative Comparative Study of Blended and Traditional Models in the Secondary Advanced Placement Statistics Classroom

    Science.gov (United States)

    Owens, Susan T.

    2017-01-01

Technology is becoming an integral tool in the classroom and can make a positive impact on how students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistics students, comparing Educational Testing Service (ETS) College Board AP Statistics examination scores…

  15. Polychlorinated biphenyl exposure, diabetes and endogenous hormones: a cross-sectional study in men previously employed at a capacitor manufacturing plant.

    Science.gov (United States)

    Persky, Victoria; Piorkowski, Julie; Turyk, Mary; Freels, Sally; Chatterton, Robert; Dimos, John; Bradlow, H Leon; Chary, Lin Kaatz; Burse, Virlyn; Unterman, Terry; Sepkovic, Daniel W; McCann, Kenneth

    2012-08-29

Studies have shown associations of diabetes and endogenous hormones with exposure to a wide variety of organochlorines. We have previously reported positive associations of polychlorinated biphenyls (PCBs), and inverse associations of selected steroid hormones, with diabetes in postmenopausal women previously employed in a capacitor manufacturing plant. This paper examines associations of PCBs with diabetes and endogenous hormones in 63 men previously employed at the same plant who in 1996 underwent surveys of their exposure and medical history and collection of blood and urine for measurement of PCBs, lipids, liver function, hematologic markers and endogenous hormones. PCB exposure was positively associated with diabetes and age, and inversely associated with thyroid-stimulating hormone and triiodothyronine uptake. History of diabetes was significantly related to total PCBs and to all PCB functional groupings, but not to quarters worked or job score, after control for potential confounders. None of the exposures was related to insulin resistance (HOMA-IR) in non-diabetic men. Associations of PCBs with specific endogenous hormones differ in some respects from previous findings in postmenopausal women employed at the capacitor plant. Results from this study, however, do confirm previous reports relating PCB exposure to diabetes, and suggest that these associations are not mediated by the measured endogenous hormones.

  16. Practical statistics for educators

    CERN Document Server

    Ravid, Ruth

    2014-01-01

    Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.

  17. Improved score statistics for meta-analysis in single-variant and gene-level association studies.

    Science.gov (United States)

    Yang, Jingjing; Chen, Sai; Abecasis, Gonçalo

    2018-06-01

Meta-analysis is now an essential tool for genetic association studies, allowing them to combine large studies and greatly accelerating the pace of genetic discovery. Although the standard meta-analysis methods perform equivalently to the more cumbersome joint analysis under ideal settings, they result in substantial power loss under unbalanced settings with various case-control ratios. Here, we investigate the power-loss problem of the standard meta-analysis methods for unbalanced studies, and further propose novel meta-analysis methods that perform equivalently to the joint analysis under both balanced and unbalanced settings. We derive improved meta-score-statistics that can accurately approximate the joint-score-statistics with combined individual-level data, for both linear and logistic regression models, with and without covariates. In addition, we propose a novel approach to adjust for population stratification by correcting for known population structures through minor allele frequencies. In the simulated gene-level association studies under unbalanced settings, our method recovered up to 85% of the power loss caused by the standard methods. We further showed the power gain of our methods in gene-level tests with 26 unbalanced studies of age-related macular degeneration. In addition, we took the meta-analysis of three unbalanced studies of type 2 diabetes as an example to discuss the challenges of meta-analyzing multi-ethnic samples. In summary, our improved meta-score-statistics with corrections for population stratification can be used to construct both single-variant and gene-level association studies, providing a useful framework for ensuring well-powered, convenient, cross-study analyses. © 2018 WILEY PERIODICALS, INC.
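The standard fixed-effect score-based meta-analysis that the authors improve upon can be sketched as follows: each study contributes a score statistic U_k and its information V_k, and the combined single-variant chi-square is (ΣU_k)²/ΣV_k. This is a generic illustration of the baseline method, not the authors' corrected statistic, and the per-study numbers are invented:

```python
def meta_score_test(scores, infos):
    """Fixed-effect meta-analysis of per-study score statistics.
    scores: per-study score statistics U_k; infos: the corresponding
    informations V_k. Returns the combined 1-df chi-square statistic."""
    u = sum(scores)
    v = sum(infos)
    return u * u / v

# Invented per-study summaries for three studies.
chi2 = meta_score_test(scores=[4.2, -1.1, 3.0], infos=[10.0, 8.5, 12.0])
print(round(chi2, 4))  # (6.1)^2 / 30.5 = 1.22
```

Because only (U_k, V_k) summaries cross study boundaries, no individual-level data need be shared; the paper's contribution is making this approximation track the joint analysis under unbalanced case-control ratios.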

  18. Evaluation of the Wishart test statistics for polarimetric SAR data

    DEFF Research Database (Denmark)

    Skriver, Henning; Nielsen, Allan Aasbjerg; Conradsen, Knut

    2003-01-01

A test statistic for equality of two covariance matrices following the complex Wishart distribution has previously been used in new algorithms for change detection, edge detection and segmentation in polarimetric SAR images. Previously, the results for change detection and edge detection have been quantitatively evaluated. This paper deals with the evaluation of segmentation. A segmentation performance measure originally developed for single-channel SAR images has been extended to polarimetric SAR images, and used to evaluate segmentation for a merge-using-moment algorithm for polarimetric SAR data.
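One common form of the likelihood-ratio criterion for equality of two covariance matrices is Q = ((n1+n2)^(p(n1+n2)) / (n1^(p·n1) · n2^(p·n2))) · |X|^n1 |Y|^n2 / |X+Y|^(n1+n2), where X and Y are the scaled covariance sums and p is the matrix dimension; Q = 1 when the two estimates coincide and Q < 1 otherwise. The sketch below uses real 2x2 matrices with invented entries rather than the complex Wishart case treated in the paper:

```python
import math

def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def madd(a, b):
    return [[a[i][j] + b[i][j] for j in range(2)] for i in range(2)]

def ln_q(x, y, n1, n2, p=2):
    """ln of the likelihood-ratio criterion Q for equality of two
    covariance matrices; x, y are scaled covariance sums (n_k * S_k)."""
    n = n1 + n2
    return (p * n * math.log(n) - p * n1 * math.log(n1) - p * n2 * math.log(n2)
            + n1 * math.log(det2(x)) + n2 * math.log(det2(y))
            - n * math.log(det2(madd(x, y))))

X = [[8.0, 1.0], [1.0, 6.0]]         # invented n1 * S1 with n1 = 4 looks
equal = ln_q(X, X, 4, 4)             # identical estimates -> ln Q = 0
Y = [[20.0, -3.0], [-3.0, 2.0]]      # an invented, rather different sum
unequal = ln_q(X, Y, 4, 4)
print(round(equal, 10), round(unequal, 3))
```

In the change/edge-detection algorithms the abstract cites, a scaled version of -2 ln Q is compared against a chi-square threshold to decide whether two image regions share a covariance structure.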

  19. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    Science.gov (United States)

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  20. Estimation of global network statistics from incomplete data.

    Directory of Open Access Journals (Sweden)

    Catherine A Bliss

    Full Text Available Complex networks underlie an enormous variety of social, biological, physical, and virtual systems. A profound complication for the science of complex networks is that in most cases, observing all nodes and all network interactions is impossible. Previous work addressing the impacts of partial network data is surprisingly limited, focuses primarily on missing nodes, and suggests that network statistics derived from subsampled data are not suitable estimators for the same network statistics describing the overall network topology. We generate scaling methods to predict true network statistics, including the degree distribution, from only partial knowledge of nodes, links, or weights. Our methods are transparent and do not assume a known generating process for the network, thus enabling prediction of network statistics for a wide variety of applications. We validate analytical results on four simulated network classes and empirical data sets of various sizes. We perform subsampling experiments by varying proportions of sampled data and demonstrate that our scaling methods can provide very good estimates of true network statistics while acknowledging limits. Lastly, we apply our techniques to a set of rich and evolving large-scale social networks, Twitter reply networks. Based on 100 million tweets, we use our scaling techniques to propose a statistical characterization of the Twitter Interactome from September 2008 to November 2008. Our treatment allows us to find support for Dunbar's hypothesis in detecting an upper threshold for the number of active social contacts that individuals maintain over the course of one week.
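A minimal illustration of the kind of scaling estimator the abstract describes: under uniform node sampling with probability q, the mean degree of the induced subgraph is approximately q times the true mean degree, so dividing the observed value by q recovers an estimate of the true statistic. The graph model and parameters below are invented for the sketch:

```python
import random

random.seed(42)

# Build an Erdos-Renyi-style random graph.
n, p_edge = 1500, 0.006
adj = {v: set() for v in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < p_edge:
            adj[i].add(j)
            adj[j].add(i)
true_mean_degree = sum(len(nb) for nb in adj.values()) / n

# Observe only a uniform sample of nodes (and the edges among them).
q = 0.5
sampled = {v for v in range(n) if random.random() < q}
observed_mean = sum(len(adj[v] & sampled) for v in sampled) / len(sampled)

# Scale the subsampled statistic back up.
estimate = observed_mean / q
print(round(true_mean_degree, 2), round(estimate, 2))
```

The paper's methods go well beyond this toy case (degree distributions, link and weight subsampling, no assumed generating model), but the rescaling logic is the same: correct the observed statistic for the known sampling process.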

  1. Common pitfalls in statistical analysis: "P" values, statistical significance and confidence intervals

    Directory of Open Access Journals (Sweden)

    Priya Ranganathan

    2015-01-01

Full Text Available In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper.
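As a reminder of what a confidence interval adds over a bare 'P' value, a large-sample 95% interval for a mean can be computed as mean ± 1.96·SE; for small samples a t critical value would be more appropriate than 1.96. The data are invented:

```python
from statistics import mean, stdev

data = [10, 12, 11, 13, 9, 10, 12, 11, 10, 12]  # invented measurements
m = mean(data)
se = stdev(data) / len(data) ** 0.5             # standard error of the mean
lo, hi = m - 1.96 * se, m + 1.96 * se           # large-sample 95% CI
print(round(m, 2), round(lo, 2), round(hi, 2))
```

Reporting the interval (here roughly 10.2 to 11.8) conveys both the estimate and its precision, which a lone P value cannot.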

  2. Statistical modeling of competitive threshold collision-induced dissociation

    Science.gov (United States)

    Rodgers, M. T.; Armentrout, P. B.

    1998-08-01

    Collision-induced dissociation of (R1OH)Li+(R2OH) with xenon is studied using guided ion beam mass spectrometry. R1OH and R2OH include the following molecules: water, methanol, ethanol, 1-propanol, 2-propanol, and 1-butanol. In all cases, the primary products formed correspond to endothermic loss of one of the neutral alcohols, with minor products that include those formed by ligand exchange and loss of both ligands. The cross-section thresholds are interpreted to yield 0 and 298 K bond energies for (R1OH)Li+-R2OH and relative Li+ binding affinities of the R1OH and R2OH ligands after accounting for the effects of multiple ion-molecule collisions, internal energy of the reactant ions, and dissociation lifetimes. We introduce a means to simultaneously analyze the cross sections for these competitive dissociations using statistical theories to predict the energy dependent branching ratio. Thermochemistry in good agreement with previous work is obtained in all cases. In essence, this statistical approach provides a detailed means of correcting for the "competitive shift" inherent in multichannel processes.
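Threshold analyses of this kind rest on fitting an empirical threshold law, commonly written sigma(E) = sigma0 (E - E0)^n / E above the threshold energy E0. Below is a minimal sketch of such a fit on invented, noise-free data, with n fixed at 1 and E0 found by grid search; the real analysis additionally convolutes over internal energies, collision broadening, and dissociation lifetimes:

```python
def model_x(e, e0):
    """Reduced threshold law (E - E0)/E for E > E0, else 0 (n = 1)."""
    return (e - e0) / e if e > e0 else 0.0

# Invented noise-free "cross sections" generated from sigma0 = 1.5, E0 = 1.2.
energies = [1.3 + 0.1 * k for k in range(28)]
sigma = [1.5 * model_x(e, 1.2) for e in energies]

best = None
for i in range(1001):                  # grid search for E0 over [1.0, 1.4]
    e0 = 1.0 + 0.0004 * i
    x = [model_x(e, e0) for e in energies]
    # For fixed E0 the model is linear in sigma0: closed-form least squares.
    s0 = sum(s * xi for s, xi in zip(sigma, x)) / sum(xi * xi for xi in x)
    sse = sum((s - s0 * xi) ** 2 for s, xi in zip(sigma, x))
    if best is None or sse < best[0]:
        best = (sse, e0, s0)

_, e0_hat, s0_hat = best
print(round(e0_hat, 3), round(s0_hat, 3))
```

The fitted E0, after the corrections the authors describe, is what yields the 0 K bond energy; competitive dissociation adds the statistical branching-ratio model on top of this single-channel form.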

  3. Using Fun in the Statistics Classroom: An Exploratory Study of College Instructors' Hesitations and Motivations

    Science.gov (United States)

    Lesser, Lawrence M.; Wall, Amitra A.; Carver, Robert H.; Pearl, Dennis K.; Martin, Nadia; Kuiper, Shonda; Posner, Michael A.; Erickson, Patricia; Liao, Shu-Min; Albert, Jim; Weber, John J., III

    2013-01-01

    This study examines statistics instructors' use of fun as well as their motivations, hesitations, and awareness of resources. In 2011, a survey was administered to attendees at a national statistics education conference, and follow-up qualitative interviews were conducted with 16 of those ("N" = 249) surveyed to provide further…

  4. Sacrococcygeal pilonidal disease: analysis of previously proposed risk factors

    Directory of Open Access Journals (Sweden)

    Ali Harlak

    2010-01-01

Full Text Available PURPOSE: Sacrococcygeal pilonidal disease is a source of one of the most common surgical problems among young adults. While male gender, obesity, occupations requiring sitting, deep natal clefts, excessive body hair, poor body hygiene and excessive sweating are described as the main risk factors for this disease, most of these need to be verified with a clinical trial. The present study aimed to evaluate the value and effect of these factors on pilonidal disease. METHOD: Previously proposed main risk factors were evaluated in a prospective case-control study that included 587 patients with pilonidal disease and 2,780 healthy control patients. RESULTS: Stiffness of body hair, number of baths and time spent seated per day were the three most predictive risk factors. Adjusted odds ratios were 9.23, 6.33 and 4.03, respectively (p<0.001). With an adjusted odds ratio of 1.3 (p<0.001), body mass index was another risk factor. Family history was not statistically different between the groups and there was no specific occupation associated with the disease. CONCLUSIONS: Hairy people who sit down for more than six hours a day and those who take a bath two or fewer times per week are at a 219-fold increased risk for sacrococcygeal pilonidal disease compared with those without these risk factors. For people with a great deal of hair, there is a greater need to clean the intergluteal sulcus. People whose work requires sitting for long periods of time should choose more comfortable seats and should also try to stand whenever possible.
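The adjusted odds ratios in the abstract come from a multivariable model, but the underlying crude calculation for a single 2x2 exposure table is simple: OR = ad/bc, with a Wald confidence interval built on the log scale. The counts below are invented, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and 95% Wald CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(a=20, b=80, c=10, d=90)  # invented counts
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval whose lower bound sits near 1, as in this toy table, is exactly the situation where adjustment for confounders (as the authors performed) matters most.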

  5. Statistical Association Criteria in Forensic Psychiatry–A criminological evaluation of casuistry

    Science.gov (United States)

    Gheorghiu, V; Buda, O; Popescu, I; Trandafir, MS

    2011-01-01

Purpose. Identification of potential shared primary psychoprophylaxis and crime prevention is measured by analyzing the rate of commitments for patients subject to forensic examination. Material and method. The statistical study is retrospective and document-based. The statistical lot consists of 770 initial examination reports performed and completed during the whole of 2007, primarily analyzed in order to summarize the data within the National Institute of Forensic Medicine, Bucharest, Romania (INML), with one of the group variables being 'particularities of the psychiatric patient history', containing the items 'forensic onset', 'commitments within the last year prior to the examination' and 'absence of commitments within the last year prior to the examination'. The method used was the Kendall bivariate correlation. For this study, the authors separately analyze only the two items regarding commitments, by other correlation alternatives and by modern, elaborate statistical analyses, i.e. recording of the standard case study variables, Kendall bivariate correlation, cross tabulation, factor analysis and hierarchical cluster analysis. Results. The results are varied, from theoretically presumed clinical nosography (such as schizophrenia or manic depression) to non-presumed (conduct disorders) or unexpected behavioral acts, and are therefore difficult to interpret. Conclusions. The authors took into consideration the features of the batch as well as the results of the previous standard correlation of the whole statistical lot. They emphasize the role of medical security measures that are actually applied in therapeutic management in general, and in risk and second-offence management in particular, as well as the role of forensic psychiatric examinations in the detection of certain aspects related to the monitoring of mental patients. PMID:21505571

  6. Basics of modern mathematical statistics

    CERN Document Server

    Spokoiny, Vladimir

    2015-01-01

    This textbook provides a unified and self-contained presentation of the main approaches to and ideas of mathematical statistics. It collects the basic mathematical ideas and tools needed as a basis for more serious studies or even independent research in statistics. The majority of existing textbooks in mathematical statistics follow the classical asymptotic framework. Yet, as modern statistics has changed rapidly in recent years, new methods and approaches have appeared. The emphasis is on finite sample behavior, large parameter dimensions, and model misspecifications. The present book provides a fully self-contained introduction to the world of modern mathematical statistics, collecting the basic knowledge, concepts and findings needed for doing further research in the modern theoretical and applied statistics. This textbook is primarily intended for graduate and postdoc students and young researchers who are interested in modern statistical methods.

  7. Statistical learning is constrained to less abstract patterns in complex sensory input (but not the least).

    Science.gov (United States)

    Emberson, Lauren L; Rubinstein, Dani Y

    2016-08-01

    objects) and suggests that, at least with the current categories and type of learner, there are biases to pick up on statistical regularities between individual objects even when robust statistical information is present at other levels of abstraction. These findings speak directly to emerging theories about how systems supporting statistical learning and prediction operate in our structure-rich environments. Moreover, the theoretical implications of the current work across multiple domains of study is already clear: statistical learning cannot be assumed to be unconstrained even if statistical learning has previously been established at a given level of abstraction when that information is presented in isolation. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Statistical ensembles and molecular dynamics studies of anisotropic solids. II

    International Nuclear Information System (INIS)

    Ray, J.R.; Rahman, A.

    1985-01-01

We have recently discussed how the Parrinello-Rahman theory can be brought into accord with the theory of the elastic and thermodynamic behavior of anisotropic media. This involves the isoenthalpic-isotension ensemble of statistical mechanics. Nosé has developed a canonical ensemble form of molecular dynamics. We combine Nosé's ideas with the Parrinello-Rahman theory to obtain a canonical form of molecular dynamics appropriate to the study of anisotropic media subjected to arbitrary external stress. We employ this isothermal-isotension ensemble in a study of a fcc → close-packed structural phase transformation in a Lennard-Jones solid subjected to uniaxial compression. Our interpretation of the Nosé theory does not involve a scaling of the time variable. This latter fact leads to simplifications when studying the time dependence of quantities

  9. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
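The link to Benford's law mentioned in the highlights can be made concrete: for a Poisson process with harmonic intensity c/x, the expected number of points whose leading digit is d is proportional to ln((d+1)/d) in every decade, which normalizes to the Benford probabilities log10(1 + 1/d). A small derivation-by-integration sketch:

```python
import math

# Expected counts of a Poisson process with intensity c/x over one
# decade [10^k, 10^(k+1)), split by leading digit d: the integral of
# c/x over [d*10^k, (d+1)*10^k) is c * ln((d+1)/d), independent of k.
c = 1.0
expected = {d: c * math.log((d + 1) / d) for d in range((1), 10)}
total = sum(expected.values())                 # = c * ln(10)
benford = {d: expected[d] / total for d in expected}
print(round(benford[1], 5))  # 0.30103, the Benford probability of digit 1
```

The scale invariance of the harmonic intensity is what makes the per-decade counts identical, which is exactly the sense in which harmonic statistics underlie Benford's law.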

  10. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  11. Effect of a supplementation with myo-inositol plus melatonin on oocyte quality in women who failed to conceive in previous in vitro fertilization cycles for poor oocyte quality: a prospective, longitudinal, cohort study.

    Science.gov (United States)

    Unfer, Vittorio; Raffone, Emanuela; Rizzo, Piero; Buffo, Silvia

    2011-11-01

Several factors can affect oocyte quality and therefore pregnancy outcome in assisted reproductive technology (ART) cycles. Recently, a number of studies have shown that the presence of several compounds in the follicular fluid positively correlates with oocyte quality and maturation (e.g., myo-inositol and melatonin). In the present study, we aimed to evaluate pregnancy outcomes after the administration of myo-inositol combined with melatonin in women who failed to conceive in previous in vitro fertilization (IVF) cycles due to poor oocyte quality. Forty-six women were treated with 4 g/day myo-inositol and 3 mg/day melatonin (inofolic® and inofolic® Plus, Lo.Lipharma, Rome) for 3 months and then underwent a new IVF cycle. After treatment, the number of mature oocytes, the fertilization rate, and the numbers of both total and top-quality embryos transferred were statistically higher compared with the previous IVF cycle, while there was no difference in the number of retrieved oocytes. After treatment, a total of 13 pregnancies occurred; 9 of them were confirmed echographically, and 4 ended in spontaneous abortion. Treatment with myo-inositol and melatonin improves ovarian stimulation protocols and pregnancy outcomes in infertile women with poor oocyte quality.

  12. Statistical mechanics for a class of quantum statistics

    International Nuclear Information System (INIS)

    Isakov, S.B.

    1994-01-01

    Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived.

  13. A study on the advanced statistical core thermal design methodology

    International Nuclear Information System (INIS)

    Lee, Seung Hyuk

    1992-02-01

    A statistical core thermal design methodology for generating the limit DNBR and the nominal DNBR is proposed and used in assessing the best-estimate thermal margin in a reactor core. First, the Latin Hypercube Sampling Method, instead of the conventional Experimental Design Technique, is utilized as an input sampling method for a regression analysis, to evaluate its sampling efficiency. Second, and as the main topic, the Modified Latin Hypercube Sampling and Hypothesis Test Statistics method is proposed as a substitute for the current statistical core thermal design method. This new methodology adopts a 'Modified Latin Hypercube Sampling Method', which uses the mean values of each interval of the input variables instead of random values, to avoid the extreme cases that arise in the tail areas of some parameters. Next, the independence of the input variables is verified through a 'Correlation Coefficient Test' for statistical treatment of their uncertainties, and the distribution type of the DNBR response is determined through a 'Goodness of Fit Test'. Finally, the limit DNBR with one-sided 95% probability at the 95% confidence level, DNBR_95/95, is estimated. The advantage of this methodology over the conventional statistical method using Response Surface and Monte Carlo simulation techniques lies in the simplicity of the analysis procedure, while maintaining the same level of confidence in the limit DNBR result. This methodology is applied to two cases of DNBR margin calculation. The first case is the application to the determination of the limit DNBR, where the DNBR margin is determined by the difference between the nominal DNBR and the limit DNBR. The second case is the application to the determination of the nominal DNBR, where the DNBR margin is determined by the difference between the lower limit value of the nominal DNBR and the CHF correlation limit being used. From this study, it is deduced that the proposed methodology gives good agreement in the DNBR results.
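
    The 'Modified Latin Hypercube Sampling' idea described above (stratum means in place of random draws within each stratum, avoiding tail extremes) can be sketched in a few lines. This is a hedged illustration on the unit hypercube, not the thesis code; `modified_lhs` is a hypothetical name, and the stratum midpoints stand in for the interval means of uniform inputs.

```python
import random

def modified_lhs(n_samples, n_vars, rng=random):
    """Modified Latin Hypercube Sampling on the unit hypercube.

    Classic LHS draws a random point inside each of n_samples
    equal-probability strata per variable; this modified variant takes
    the stratum midpoint (the interval mean for a uniform input)
    instead, so no sample falls in the extreme tails.  Each column is
    an independent random permutation of the midpoints, preserving the
    Latin-hypercube property.
    """
    midpoints = [(i + 0.5) / n_samples for i in range(n_samples)]
    columns = []
    for _ in range(n_vars):
        col = midpoints[:]
        rng.shuffle(col)
        columns.append(col)
    # Transpose so each row is one joint sample.
    return [tuple(col[i] for col in columns) for i in range(n_samples)]
```

    For non-uniform input distributions, each coordinate would then be mapped through that variable's inverse CDF.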

  14. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.

    Science.gov (United States)

    Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
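
    The automated check described above can be approximated for t-tests: recompute the two-sided p-value from the reported test statistic and degrees of freedom and compare it with the reported p. The sketch below assumes SciPy is available; the function name and tolerance are our own choices, not the authors' procedure.

```python
from scipy import stats

def check_t_report(t_value, df, reported_p, alpha=0.05, tol=0.0005):
    """Recompute the two-sided p-value implied by a reported t statistic
    and degrees of freedom, and flag inconsistencies.

    Returns (recomputed_p, inconsistent, decision_error), where
    decision_error means the discrepancy flips significance at alpha,
    mirroring the 'gross' inconsistencies counted in the abstract.
    """
    p = 2.0 * stats.t.sf(abs(t_value), df)
    inconsistent = abs(p - reported_p) > tol
    decision_error = inconsistent and ((p < alpha) != (reported_p < alpha))
    return p, inconsistent, decision_error
```

    For example, a report of "t(48) = 2.01, p = .03" is flagged as a decision error, since the recomputed two-sided p is just above .05.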

  15. Excel 2013 for physical sciences statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J; Horton, Howard F

    2016-01-01

    This book shows the capabilities of Microsoft Excel in teaching physical sciences statistics effectively. Similar to the previously published Excel 2010 for Physical Sciences Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2013 for Physical Sciences Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in their ...

  16. Excel 2016 for social science statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2016-01-01

    This book shows the capabilities of Microsoft Excel in teaching social science statistics effectively. Similar to the previously published Excel 2013 for Social Sciences Statistics, this book is a step-by-step exercise-driven guide for students and practitioners who need to master Excel to solve practical social science problems. If understanding statistics isn’t your strongest suit, you are not especially mathematically-inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in social science courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Social Science Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply Excel to statistical techniques necessary in ...

  17. The kid, the clerk, and the gambler: Critical studies in statistics and cognitive science

    NARCIS (Netherlands)

    Madsen, M.W.

    2015-01-01

    This dissertation presents a series of case studies in linguistics, psychology, and statistics. These case studies take up a variety of theories, concepts, and debates, and in each case attempt to shed new light on these topics by consistently focusing on foundational issues.

  18. Evaluating clinical and public health interventions: a practical guide to study design and statistics

    National Research Council Canada - National Science Library

    Katz, Mitchell H

    2010-01-01

    ... and observational studies. In addition to reviewing standard statistical analysis, the book has easy-to-follow explanations of cutting edge techniques for evaluating interventions, including propensity score analysis...

  19. Examining reproducibility in psychology : A hybrid method for combining a statistically significant original study and a replication

    NARCIS (Netherlands)

    Van Aert, R.C.M.; Van Assen, M.A.L.M.

    2018-01-01

    The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter

  20. Statistical deprojection of galaxy pairs

    Science.gov (United States)

    Nottale, Laurent; Chamaraux, Pierre

    2018-06-01

    Aims: The purpose of the present paper is to provide methods of statistical analysis of the physical properties of galaxy pairs. We perform this study to apply it later to catalogs of isolated pairs of galaxies, especially two new catalogs we recently constructed that contain ≈1000 and ≈13 000 pairs, respectively. We are particularly interested in the dynamics of those pairs, including the determination of their masses. Methods: We could not compute the dynamical parameters directly since the necessary data are incomplete. Indeed, we only have at our disposal one component of the intervelocity between the members, namely along the line of sight, and two components of their interdistance, i.e., the projection on the sky-plane. Moreover, we know only one point of each galaxy orbit. Hence we need statistical methods to find the probability distribution of 3D interdistances and 3D intervelocities from their projections; we designed those methods under the term deprojection. Results: We proceed in two steps to determine and use the deprojection methods. First we derive the probability distributions expected for the various relevant projected quantities, namely intervelocity vz, interdistance rp, their ratio, and the product rp v_z^2, which is involved in mass determination. In a second step, we propose various methods of deprojection of those parameters based on the previous analysis. We start from a histogram of the projected data and we apply inversion formulae to obtain the deprojected distributions; lastly, we test the methods by numerical simulations, which also allow us to determine the uncertainties involved.
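
    The projection step that the deprojection methods must invert can be illustrated with a Monte Carlo sketch (ours, not the paper's). For isotropically oriented pairs, cos θ is uniform on (-1, 1), the sky-plane separation is r_p = r sin θ, and the line-of-sight velocity is v_z = v cos θ, giving E[r_p] = (π/4) r and E[v_z²] = v²/3; these are the average factors a deprojection must undo.

```python
import math
import random

def project_pair(r, v, rng=random):
    """Project a pair's 3D interdistance r and intervelocity v onto the
    observables: sky-plane separation r_p and absolute line-of-sight
    velocity |v_z|.  Each vector gets its own isotropic orientation,
    i.e. cos(theta) ~ Uniform(-1, 1)."""
    cos_r = rng.uniform(-1.0, 1.0)  # orientation of the separation vector
    cos_v = rng.uniform(-1.0, 1.0)  # orientation of the velocity vector
    return r * math.sqrt(1.0 - cos_r * cos_r), v * abs(cos_v)

rng = random.Random(42)
n = 200_000
rp_mean = vz2_mean = 0.0
for _ in range(n):
    rp, vz = project_pair(1.0, 1.0, rng)
    rp_mean += rp / n
    vz2_mean += vz * vz / n
# Isotropy predicts E[r_p] = (pi/4) * r and E[v_z^2] = v^2 / 3.
print(rp_mean, vz2_mean)
```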

  1. Transition-Region Ultraviolet Explosive Events in IRIS Si IV: A Statistical Analysis

    Science.gov (United States)

    Bartz, Allison

    2018-01-01

    Explosive events (EEs) in the solar transition region are characterized by broad, non-Gaussian line profiles with wings at Doppler velocities exceeding the speed of sound. We present a statistical analysis of 23 IRIS (Interface Region Imaging Spectrograph) sit-and-stare observations, observed between April 2014 and March 2017. Using the IRIS Si IV 1394 Å and 1403 Å spectral windows and the 1400 Å Slit Jaw images we have identified 581 EEs. We found that most EEs last less than 20 min and have a spatial scale along the slit of less than 10”, agreeing with measurements in previous work. We observed most EEs in active regions, regardless of date of observation, but selection bias of IRIS observations cannot be ruled out. We also present preliminary findings of optical depth effects from our statistical study.

  2. Effects of previous unsuccessful extracorporeal shockwave lithotripsy treatment on the performance and outcome of percutaneous nephrolithotomy.

    Science.gov (United States)

    Türk, Hakan; Yoldaş, Mehmet; Süelözgen, Tufan; İşoğlu, Cemal Selcuk; Karabıçak, Mustafa; Ergani, Batuhan; Ün, Sıtkı

    2017-06-01

    To evaluate the effects of previous unsuccessful extracorporeal shockwave lithotripsy (ESWL) treatment on the performance and outcome of percutaneous nephrolithotomy (PCNL). Of 1625 PCNL procedures performed in our clinic, 393 renal units with similar stone burden and number of accesses were included in the present study. We categorised the study patients into two groups according to whether or not they underwent ESWL within 1 year prior to PCNL. Accordingly, Group 1 comprised 143 (36.3%) ESWL-treated patients and Group 2 comprised 250 (63.7%) non-ESWL-treated patients. Residual stones were detected in 36 (25.1%) of the ESWL-treated patients (Group 1) and in 60 (24%) of the non-ESWL-treated patients (Group 2). There were no statistically significant differences between the groups for length of hospital stay (LOS), nephrostomy tube removal time, and the presence of residual stones. When we evaluated the groups for the pre- to postoperative haemoglobin (Hb) drop and the blood transfusion rate, the ESWL-treated patients showed greater Hb declines and required more transfusions (both P = 0.01). In our study, previous ESWL treatment had no influence on the PCNL stone-free rate, operation time, incidence of postoperative complications, or LOS in patients with similar stone burdens. However, bleeding during PCNL was more prevalent in the ESWL-treated patients, so close attention should be paid to bleeding in patients who have been pretreated with ESWL.
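
    The comparison of residual-stone rates reported above (36/143 vs 60/250) can be reproduced with a standard chi-squared test on a 2×2 table. The abstract does not state which test the authors used, so the SciPy-based sketch below is illustrative only.

```python
from scipy.stats import chi2_contingency

# Residual-stone counts from the abstract: 36 of 143 ESWL-pretreated
# patients vs 60 of 250 patients without prior ESWL.
table = [[36, 143 - 36],
         [60, 250 - 60]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p:.3f}")
```

    The p-value comes out well above 0.05, consistent with the abstract's finding of no significant difference in residual stones.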

  3. VGC analyzer: a software for statistical analysis of fully crossed multiple-reader multiple-case visual grading characteristics studies

    International Nuclear Information System (INIS)

    Baath, Magnus; Hansson, Jonny

    2016-01-01

    Visual grading characteristics (VGC) analysis is a non-parametric rank-invariant method for the analysis of visual grading data. In VGC analysis, image quality ratings for two different conditions are compared by producing a VGC curve, similar to how the ratings for normal and abnormal cases in receiver operating characteristic (ROC) analysis are used to create an ROC curve. The use of established ROC software for the analysis of VGC data has therefore previously been proposed. However, ROC analysis is based on the assumption of independence between normal and abnormal cases. In VGC analysis, this independence cannot always be assumed, e.g. if the ratings are based on the same patients imaged under both conditions. Dedicated software intended for the analysis of VGC studies, which takes possible dependencies between ratings into account in the statistical analysis, has therefore been developed. The software, VGC Analyzer, determines the area under the VGC curve and its uncertainty using non-parametric re-sampling techniques. This article gives an introduction to VGC Analyzer, describes the types of analyses that can be performed, and instructs the user about the input and output data. (authors)

  4. Statistical evolution of quiet-Sun small-scale magnetic features using Sunrise observations

    Science.gov (United States)

    Anusha, L. S.; Solanki, S. K.; Hirzberger, J.; Feller, A.

    2017-02-01

    The evolution of small magnetic features in quiet regions of the Sun provides a unique window for probing solar magneto-convection. Here we analyze small-scale magnetic features in the quiet Sun, using the high-resolution, seeing-free observations from the Sunrise balloon-borne solar observatory. Our aim is to understand the contribution of different physical processes, such as splitting, merging, emergence and cancellation of magnetic fields, to the rearrangement, addition and removal of magnetic flux in the photosphere. We have employed a statistical approach for the analysis, and the evolution studies are carried out using a feature-tracking technique. In this paper we provide a detailed description of the feature-tracking algorithm that we have newly developed, and we present the results of a statistical study of several physical quantities. The results on the fractions of the flux in emergence, appearance, splitting, merging, disappearance and cancellation qualitatively agree with other recent studies. To summarize, the total flux gained in unipolar appearance is an order of magnitude larger than the total flux gained in emergence. On the other hand, bipolar cancellation contributes nearly as much to the loss of magnetic flux as unipolar disappearance. The total flux lost in cancellation is nearly six to eight times larger than the total flux gained in emergence. One big difference between our study and previous similar studies is that, thanks to the higher spatial resolution of Sunrise, we can track features with fluxes as low as 9 × 10^14 Mx. This flux is nearly an order of magnitude lower than the smallest fluxes of the features tracked in the highest-resolution previous studies based on Hinode data. The area and flux of the magnetic features follow power-law type distributions, while the lifetimes show either power-law or exponential type distributions depending on the exact definitions used to define various birth and death events. We have

  5. Improving cerebellar segmentation with statistical fusion

    Science.gov (United States)

    Plassard, Andrew J.; Yang, Zhen; Prince, Jerry L.; Claassen, Daniel O.; Landman, Bennett A.

    2016-03-01

    The cerebellum is a somatotopically organized central component of the central nervous system well known to be involved with motor coordination and with increasingly recognized roles in cognition and planning. Recent work in multiatlas labeling has created methods that offer the potential for fully automated 3-D parcellation of the cerebellar lobules and vermis (which are organizationally equivalent to cortical gray matter areas). This work explores the trade-offs of using different statistical fusion techniques and post hoc optimizations in two datasets with distinct imaging protocols. We offer a novel fusion technique by extending the ideas of the Selective and Iterative Method for Performance Level Estimation (SIMPLE) to a patch-based performance model. We demonstrate the effectiveness of our algorithm, Non-Local SIMPLE, for segmentation of a mixed population of healthy subjects and patients with severe cerebellar anatomy. Under the first imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard segmentation techniques. In the second imaging protocol, we show that Non-Local SIMPLE outperforms previous gold-standard techniques but is outperformed by a non-locally weighted vote with the deeper population of atlases available. This work advances the state of the art in open source cerebellar segmentation algorithms and offers the opportunity for routinely including cerebellar segmentation in magnetic resonance imaging studies that acquire whole brain T1-weighted volumes with approximately 1 mm isotropic resolution.
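
    As context for the fusion techniques compared above: the simplest multi-atlas fusion baseline is a per-voxel majority vote, which SIMPLE-style methods refine by iteratively re-weighting atlases according to estimated performance. The NumPy sketch below shows only that baseline; it is our illustration, not the authors' implementation.

```python
import numpy as np

def majority_vote(labelmaps):
    """Fuse candidate segmentations by per-voxel majority vote.

    labelmaps: (n_atlases, *volume_shape) integer array of labels.
    Performance-weighted fusion methods such as SIMPLE replace this
    flat vote with atlas weights that are re-estimated iteratively.
    """
    stack = np.asarray(labelmaps)
    n_labels = int(stack.max()) + 1
    # Count votes per label at every voxel, then take the winning label.
    votes = np.stack([(stack == lab).sum(axis=0) for lab in range(n_labels)])
    return votes.argmax(axis=0)
```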

  6. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    International Nuclear Information System (INIS)

    Clerc, F; Njiki-Menga, G-H; Witschger, O

    2013-01-01

    Most of the measurement strategies that are suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring, in real time, airborne particle concentrations (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature. These range from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the search for an appropriate and robust method continues. In this context, this exploratory study investigates a statistical method to analyse time resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data from a workplace study that investigated the potential for exposure via inhalation from cleanout operations by sandpapering of a reactor producing nanocomposite thin films have been used. In this workplace study, the background issue has been addressed through the near-field and far-field approaches, and several size integrated and time resolved devices have been used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters. While one was measuring at the source of the released particles, the other one was measuring in parallel far-field. The Bayesian probabilistic approach allows a probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions. The probability distributions issuing from time resolved data obtained at the source can be compared with the probability distributions issuing from the time resolved data obtained far-field, leading in a

  7. Evaluation of air quality in a megacity using statistics tools

    Science.gov (United States)

    Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana

    2018-06-01

    Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region. Meteorology and topography affect air pollutant dispersions. This study used statistical tools (PCA, HCA, Kruskal-Wallis, Mann-Whitney's test and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all parameters together. PM2.5 samples were collected in six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The results of the statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, air basins defined previously were not confirmed, because some sites presented similar emission sources. Therefore, the air basins need to be redefined, since they are important to air quality management.

  8. Evaluation of air quality in a megacity using statistics tools

    Science.gov (United States)

    Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana

    2017-03-01

    Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region. Meteorology and topography affect air pollutant dispersions. This study used statistical tools (PCA, HCA, Kruskal-Wallis, Mann-Whitney's test and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all parameters together. PM2.5 samples were collected in six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The results of the statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, air basins defined previously were not confirmed, because some sites presented similar emission sources. Therefore, the air basins need to be redefined, since they are important to air quality management.
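
    Tests like those named in the two records above (Kruskal-Wallis, Mann-Whitney) are available in SciPy. The sketch below applies them to invented, clearly separated 'seasonal' PM2.5 samples purely to show the calling pattern; unlike the study, which found no seasonal effect, the synthetic groups are constructed to differ significantly.

```python
import random
from scipy.stats import kruskal, mannwhitneyu

rng = random.Random(0)
# Invented daily PM2.5 concentrations (ug/m3) for two 'seasons' at one
# site, deliberately well separated so the tests come out significant.
summer = [rng.gauss(10.0, 1.0) for _ in range(30)]
winter = [rng.gauss(20.0, 1.0) for _ in range(30)]

h_stat, p_kw = kruskal(summer, winter)
u_stat, p_mw = mannwhitneyu(summer, winter, alternative="two-sided")
print(p_kw, p_mw)
```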

  9. Statistical modelling of transcript profiles of differentially regulated genes

    Directory of Open Access Journals (Sweden)

    Sergeant Martin J

    2008-07-01

    Full Text Available Abstract Background The vast quantities of gene expression profiling data produced in microarray studies, and the more precise quantitative PCR, are often not statistically analysed to their full potential. Previous studies have summarised gene expression profiles using simple descriptive statistics, basic analysis of variance (ANOVA) and the clustering of genes based on simple models fitted to their expression profiles over time. We report the novel application of statistical non-linear regression modelling techniques to describe the shapes of expression profiles for the fungus Agaricus bisporus, quantified by PCR, and for E. coli and Rattus norvegicus, using microarray technology. The use of parametric non-linear regression models provides a more precise description of expression profiles, reducing the "noise" of the raw data to produce a clear "signal" given by the fitted curve, and describing each profile with a small number of biologically interpretable parameters. This approach then allows the direct comparison and clustering of the shapes of response patterns between genes and potentially enables a greater exploration and interpretation of the biological processes driving gene expression. Results Quantitative reverse transcriptase PCR-derived time-course data of genes were modelled. "Split-line" or "broken-stick" regression identified the initial time of gene up-regulation, enabling the classification of genes into those with primary and secondary responses. Five-day profiles were modelled using the biologically oriented critical exponential curve, y(t) = A + (B + Ct)R^t + ε. This non-linear regression approach allowed the expression patterns for different genes to be compared in terms of curve shape, time of maximal transcript level and the decline and asymptotic response levels. Three distinct regulatory patterns were identified for the five genes studied. Applying the regression modelling approach to microarray-derived time course data
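
    The critical exponential curve y(t) = A + (B + Ct)R^t + ε quoted above can be fitted with standard non-linear least squares. The sketch below uses SciPy's curve_fit on synthetic data, writing R = exp(-k) so the optimizer never raises a negative base to a fractional power; this is our illustration, not the authors' code.

```python
import math
import numpy as np
from scipy.optimize import curve_fit

def critical_exponential(t, A, B, C, k):
    """Critical exponential curve y(t) = A + (B + C*t) * R**t,
    reparametrized with R = exp(-k) so the optimizer cannot step
    through a negative base."""
    return A + (B + C * t) * np.exp(-k * t)

# Synthetic five-day expression profile from known parameters plus
# a little noise (values are invented for illustration).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 25)
y = critical_exponential(t, 1.0, 4.0, 2.0, math.log(2.0))
y = y + rng.normal(0.0, 0.02, t.size)

popt, pcov = curve_fit(critical_exponential, t, y, p0=[0.5, 1.0, 1.0, 0.5])
A_hat, B_hat, C_hat, k_hat = popt
```

    The fitted asymptote A_hat, initial response B_hat, and decay rate k_hat (i.e. R = exp(-k)) recover the generating parameters, which is the sense in which the abstract calls them "biologically interpretable".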

  10. Interplanetary sources of magnetic storms: A statistical study

    DEFF Research Database (Denmark)

    Vennerstrøm, Susanne

    2001-01-01

    Magnetic storms are mainly caused by the occurrence of intense southward magnetic fields in the interplanetary medium. These fields can be formed directly either by ejection of magnetic structures from the Sun or by stream interaction processes during solar wind propagation. In the present study we...... examine 30 years of satellite measurement of the solar wind during magnetic storms, with the aim of estimating the relative importance of these two processes. We use the solar wind proton temperature relative to the temperature expected from the empirical relation to the solar wind speed T......-p/T-exp, together with the speed gradient, and the interplanetary magnetic field azimuth in the ecliptic, in order to distinguish between the two processes statistically. We find that compression due to stream interaction is at least as important as the direct effect of ejection of intense fields, and probably more...

  11. Interplanetary sources to magnetic storms - A statistical study

    DEFF Research Database (Denmark)

    Vennerstrøm, Susanne

    2001-01-01

    Magnetic storms are mainly caused by the occurrence of intense southward magnetic fields in the interplanetary medium. These fields can be formed directly either by ejection of magnetic structures from the Sun or by stream interaction processes during solar wind propagation. In the present study we...... examine 30 years of satellite measurement of the solar wind during magnetic storms, with the aim of estimating the relative importance of these two processes. We use the solar wind proton temperature relative to the temperature expected from the empirical relation to the solar wind speed Tp/Texp, together...... with the speed gradient, and the interplanetary magnetic field azimuth in the ecliptic, in order to distinguish between the two processes statistically. We find that compression due to stream interaction is at least as important as the direct effect of ejection of intense fields, and probably more so. Only...

  12. Composition-based statistics and translated nucleotide searches: Improving the TBLASTN module of BLAST

    Directory of Open Access Journals (Sweden)

    Schäffer Alejandro A

    2006-12-01

    Full Text Available Abstract Background TBLASTN is a mode of operation for BLAST that aligns protein sequences to a nucleotide database translated in all six frames. We present the first description of the modern implementation of TBLASTN, focusing on new techniques that were used to implement composition-based statistics for translated nucleotide searches. Composition-based statistics use the composition of the sequences being aligned to generate more accurate E-values, which allows for a more accurate distinction between true and false matches. Until recently, composition-based statistics were available only for protein-protein searches. They are now available as a command line option for recent versions of TBLASTN and as an option for TBLASTN on the NCBI BLAST web server. Results We evaluate the statistical and retrieval accuracy of the E-values reported by a baseline version of TBLASTN and by two variants that use different types of composition-based statistics. To test the statistical accuracy of TBLASTN, we ran 1000 searches using scrambled proteins from the mouse genome and a database of human chromosomes. To test retrieval accuracy, we modernize and adapt to translated searches a test set previously used to evaluate the retrieval accuracy of protein-protein searches. We show that composition-based statistics greatly improve the statistical accuracy of TBLASTN, at a small cost to the retrieval accuracy. Conclusion TBLASTN is widely used, as it is common to wish to compare proteins to chromosomes or to libraries of mRNAs. Composition-based statistics improve the statistical accuracy, and therefore the reliability, of TBLASTN results. The algorithms used by TBLASTN are not widely known, and some of the most important are reported here. The data used to test TBLASTN are available for download and may be useful in other studies of translated search algorithms.
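
    The six-frame translation that TBLASTN performs on the nucleotide database can be sketched in pure Python with the standard genetic code. This is an illustrative minimal version (uppercase DNA without ambiguity codes), not the BLAST implementation.

```python
from itertools import product

BASES = "TCAG"
# Standard genetic code, amino acids listed in TCAG codon order.
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON = {"".join(c): AMINO[i] for i, c in enumerate(product(BASES, repeat=3))}
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def translate(seq):
    """Translate an uppercase nucleotide sequence in reading frame 0
    ('*' marks a stop codon); a trailing partial codon is ignored."""
    return "".join(CODON[seq[i:i + 3]] for i in range(0, len(seq) - 2, 3))

def six_frames(seq):
    """The six translations a TBLASTN-style search aligns against:
    frames 0-2 of the forward strand, then 0-2 of the reverse
    complement."""
    rc = seq.translate(COMPLEMENT)[::-1]
    return [translate(strand[f:]) for strand in (seq, rc) for f in range(3)]
```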

  13. Statistics with JMP graphs, descriptive statistics and probability

    CERN Document Server

    Goos, Peter

    2015-01-01

    Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering, and University of Antwerp, Faculty of Applied Economics, Belgium; David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. A thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data, with partic

  14. Irrigated Area Maps and Statistics of India Using Remote Sensing and National Statistics

    Directory of Open Access Journals (Sweden)

    Prasad S. Thenkabail

    2009-04-01

    Full Text Available The goal of this research was to compare the remote-sensing derived irrigated areas with census-derived statistics reported in the national system. India, which has nearly 30% of global annualized irrigated areas (AIAs) and is the leading irrigated area country in the world, along with China, was chosen for the study. Irrigated areas were derived for nominal year 2000 using time-series remote sensing at two spatial resolutions: (a) 10-km Advanced Very High Resolution Radiometer (AVHRR) and (b) 500-m Moderate Resolution Imaging Spectroradiometer (MODIS). These areas were compared with the Indian National Statistical Data on irrigated areas reported by the: (a) Directorate of Economics and Statistics (DES) of the Ministry of Agriculture (MOA), and (b) Ministry of Water Resources (MoWR). A state-by-state comparison of remote sensing derived irrigated areas when compared with MoWR derived irrigation potential utilized (IPU), an equivalent of AIA, provided a high degree of correlation with R2 values of: (a) 0.79 with 10-km, and (b) 0.85 with MODIS 500-m. However, the remote sensing derived irrigated area estimates for India were consistently higher than the irrigated areas reported by the national statistics. The remote sensing derived total area available for irrigation (TAAI), which does not consider intensity of irrigation, was 101 million hectares (Mha) using 10-km and 113 Mha using 500-m. The AIAs, which consider intensity of irrigation, were 132 Mha using 10-km and 146 Mha using 500-m. In contrast the IPU, an equivalent of AIAs, as reported by MoWR was 83 Mha. There are “large variations” in irrigated area statistics reported, even between two ministries (e.g., Directorate of Statistics of Ministry of Agriculture and Ministry of Water Resources) of the same national system. The causes include: (a) reluctance on the part of the states to furnish irrigated area data in view of their vested interests in sharing of water, and (b) reporting of large volumes of data

  15. Statistical wind analysis for near-space applications

    Science.gov (United States)

    Roney, Jason A.

    2007-09-01

    Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18–30 km) above ground level (AGL) at two locations: Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics, such as the mean, median, maximum wind speed, and standard deviation, were compiled for these data, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20–22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were made by fitting a Weibull model to 2004 data and comparing it with the 2003 and 2005 data; the 2004 distribution was also a reasonable model for these years. Lastly, the fitted Weibull cumulative distribution function was used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
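
    The percentile winds quoted above follow directly from the inverse CDF of a fitted Weibull distribution. A minimal sketch in Python, using illustrative shape and scale parameters rather than the study's fitted values:

```python
import math

def weibull_quantile(p, shape, scale):
    """Wind speed not exceeded with probability p under a Weibull(shape, scale) fit."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

# Illustrative parameters only, not the study's fitted values.
shape, scale = 1.8, 12.0  # scale in m/s
for p in (0.50, 0.95, 0.99):
    print(f"{p:.0%} wind: {weibull_quantile(p, shape, scale):.1f} m/s")
```

    Comparing the 95% and 99% quantiles with the mean plus one standard deviation shows why the latter can understate station-keeping power requirements.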

  16. Is Previous Tubal Ligation a Risk Factor for Hysterectomy because of Abnormal Uterine Bleeding?

    Directory of Open Access Journals (Sweden)

    Sanam Moradan

    2012-07-01

    Full Text Available Objectives: Post-tubal-ligation syndrome (PTLS) is a term used to describe a variety of side effects or symptoms reported after tubal ligation, including increased menstrual bleeding and hysterectomy. Whether or not post-tubal syndrome is a real entity has been a subject of controversy in the medical literature for decades, and numerous studies have reported conflicting conclusions about these symptoms. In this study, the incidence of hysterectomy for bleeding disorders among sterilized women was compared with the incidence of hysterectomy for bleeding disorders among a non-sterilized female population of the same age. Methods: This study was carried out on 160 women, aged 38–52 years, who underwent hysterectomy in Amir University Hospital, Semnan, Iran, from September 2008 to September 2011. After gathering data from medical records, the incidence of hysterectomy for bleeding disorders among sterilized women was compared with that among non-sterilized women of the same age. Results: The mean age of the study group was 44.4±5.7 and the mean age of the control group was 45.2±5.3 (p=0.424). The mean parity of the study group was 3.8±1.8 and the mean parity of the control group was 3.5±1.4 (p=0.220); thus, with regard to age and parity, the two groups were matched. Hysterectomies were performed in 160 cases, and abnormal uterine bleeding was the cause of hysterectomy in 67 cases. Among these 67 cases, 19 (37.3%) had previous tubal sterilization before hysterectomy (study group) and 48 (44%) had not undergone tubal sterilization but had hysterectomy for abnormal bleeding causes (control group). Statistical analyses showed no significant difference between the two groups (RR=0.85; 95% CI: 0.56–1.28; p=0.418). Conclusion: The results of this study showed that previous tubal sterilization is not a risk factor for undergoing hysterectomy because of abnormal uterine bleeding.
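
    The reported effect (RR=0.85; 95% CI: 0.56–1.28) is a standard cohort relative risk with a log-scale normal-approximation interval. A sketch of the computation; the group denominators below are back-computed from the reported percentages, not given in the abstract, so treat them as illustrative:

```python
import math

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Relative risk with a 95% CI from the log-RR normal approximation."""
    rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_unexposed - 1 / n_unexposed)
    return rr, rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)

# Hypothetical denominators consistent with the reported 37.3% and 44%.
print(relative_risk(19, 51, 48, 109))
```

    A CI that straddles 1.0, as here, is what makes the difference non-significant.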

  17. Three-dimensional Reconstruction and Homogenization of Heterogeneous Materials Using Statistical Correlation Functions and FEM

    Energy Technology Data Exchange (ETDEWEB)

    Baniassadi, Majid; Mortazavi, Behzad; Hamedani, Amani; Garmestani, Hamid; Ahzi, Said; Fathi-Torbaghan, Madjid; Ruch, David; Khaleel, Mohammad A.

    2012-01-31

    In this study, a previously developed reconstruction methodology is extended to three-dimensional reconstruction of a three-phase microstructure, based on two-point correlation functions and two-point cluster functions. The reconstruction process has been implemented based on hybrid stochastic methodology for simulating the virtual microstructure. While different phases of the heterogeneous medium are represented by different cells, growth of these cells is controlled by optimizing parameters such as rotation, shrinkage, translation, distribution and growth rates of the cells. Based on the reconstructed microstructure, finite element method (FEM) was used to compute the effective elastic modulus and effective thermal conductivity. A statistical approach, based on two-point correlation functions, was also used to directly estimate the effective properties of the developed microstructures. Good agreement between the predicted results from FEM analysis and statistical methods was found confirming the efficiency of the statistical methods for prediction of thermo-mechanical properties of three-phase composites.
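
    The two-point correlation functions used here have a simple empirical estimator: the probability that two points a fixed separation apart both fall in a given phase. A minimal sketch on a synthetic binary phase map (not the paper's reconstruction pipeline, which adds cluster functions and cell-growth optimization):

```python
import numpy as np

def two_point_s2(indicator, r, axis=0):
    """Empirical two-point probability S2(r): the chance that two points
    separated by r voxels (periodic boundaries) both lie in the phase."""
    return float(np.mean(indicator & np.roll(indicator, r, axis=axis)))

rng = np.random.default_rng(0)
phase = rng.random((64, 64)) < 0.3   # synthetic phase, ~30% volume fraction
print(two_point_s2(phase, 0))        # S2(0) equals the volume fraction
```

    S2(0) recovers the phase volume fraction, and S2(r) decays toward its square for an uncorrelated medium; this information content is what makes such functions usable both for reconstruction and for direct statistical property estimation.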

  18. Statistical flaws in design and analysis of fertility treatment studies on cryopreservation raise doubts on the conclusions

    Science.gov (United States)

    van Gelder, P.H.A.J.M.; Nijs, M.

    2011-01-01

    Decisions about pharmacotherapy are taken by medical doctors and authorities based on comparative studies on the use of medications. In studies on fertility treatments in particular, the methodological quality is of utmost importance in the application of evidence-based medicine and systematic reviews. Nevertheless, flaws and omissions appear quite regularly in these types of studies. The current study aims to present an overview of some of the typical statistical flaws, illustrated by a number of example studies that have been published in peer-reviewed journals. Based on an investigation of eleven randomly selected studies on fertility treatments with cryopreservation, it appeared that the methodological quality of these studies often did not fulfil the required statistical criteria. The following statistical flaws were identified: flaws in study design, patient selection, and units of analysis or in the definition of the primary endpoints. Other errors could be found in p-value and power calculations or in critical p-value definitions. Proper interpretation of the results and/or use of these study results in a meta-analysis should therefore be conducted with care. PMID:24753877

  19. Statistical flaws in design and analysis of fertility treatment studies on cryopreservation raise doubts on the conclusions.

    Science.gov (United States)

    van Gelder, P H A J M; Nijs, M

    2011-01-01

    Decisions about pharmacotherapy are taken by medical doctors and authorities based on comparative studies on the use of medications. In studies on fertility treatments in particular, the methodological quality is of utmost importance in the application of evidence-based medicine and systematic reviews. Nevertheless, flaws and omissions appear quite regularly in these types of studies. The current study aims to present an overview of some of the typical statistical flaws, illustrated by a number of example studies that have been published in peer-reviewed journals. Based on an investigation of eleven randomly selected studies on fertility treatments with cryopreservation, it appeared that the methodological quality of these studies often did not fulfil the required statistical criteria. The following statistical flaws were identified: flaws in study design, patient selection, and units of analysis or in the definition of the primary endpoints. Other errors could be found in p-value and power calculations or in critical p-value definitions. Proper interpretation of the results and/or use of these study results in a meta-analysis should therefore be conducted with care.

  20. Initial phantom study comparing image quality in computed tomography using adaptive statistical iterative reconstruction and new adaptive statistical iterative reconstruction V.

    Science.gov (United States)

    Lim, Kyungjae; Kwon, Heejin; Cho, Jinhan; Oh, Jongyoung; Yoon, Seongkuk; Kang, Myungjin; Ha, Dongho; Lee, Jinhwa; Kang, Eunju

    2015-01-01

    The purpose of this study was to assess the image quality of a novel advanced iterative reconstruction (IR) method called "adaptive statistical IR V" (ASIR-V) by comparing its image noise, contrast-to-noise ratio (CNR), and spatial resolution with those of filtered back projection (FBP) and adaptive statistical IR (ASIR) on computed tomography (CT) phantom images. We performed CT scans at 5 different tube currents (50, 70, 100, 150, and 200 mA) using 3 types of CT phantoms. Scanned images were subsequently reconstructed in 7 different settings: FBP and 3 levels each of ASIR and ASIR-V (30%, 50%, and 70%). The image noise was measured in the first study using a body phantom, the CNR was measured in the second study using a contrast phantom, and the spatial resolution was measured in the third study using a high-resolution phantom. We compared the image noise, CNR, and spatial resolution among the 7 reconstruction settings to determine whether noise reduction, high CNR, and high spatial resolution could be achieved with ASIR-V. Quantitative analysis of the first and second studies showed that images reconstructed using ASIR-V had reduced image noise and improved CNR compared with FBP and ASIR (P < 0.05), and in the third study images reconstructed using ASIR-V had significantly better spatial resolution than those of FBP and ASIR (P < 0.05). ASIR-V provides a significant reduction in image noise and a significant improvement in CNR as well as spatial resolution. Therefore, this technique has the potential to reduce the radiation dose further without compromising image quality.
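
    CNR in phantom studies like this one is typically computed as the ROI mean-signal difference divided by the background noise SD. A minimal sketch with synthetic ROI pixel values (the study's exact ROI placement and definition may differ):

```python
import numpy as np

def cnr(roi_object, roi_background):
    """Contrast-to-noise ratio: absolute mean difference over background noise SD."""
    noise_sd = np.std(roi_background, ddof=1)
    return abs(np.mean(roi_object) - np.mean(roi_background)) / noise_sd

obj = np.full(100, 120.0)             # synthetic contrast-insert ROI (HU)
bg = np.array([90.0, 110.0] * 50)     # synthetic background ROI (HU)
print(round(float(cnr(obj, bg)), 2))
```

    Because the denominator is the noise SD, any reconstruction that lowers noise at fixed contrast (as IR methods aim to do) raises CNR directly.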

  1. Evaluation of PDA Technical Report No 33. Statistical Testing Recommendations for a Rapid Microbiological Method Case Study.

    Science.gov (United States)

    Murphy, Thomas; Schwedock, Julie; Nguyen, Kham; Mills, Anna; Jones, David

    2015-01-01

    New recommendations for the validation of rapid microbiological methods have been included in the revised Technical Report 33 release from the PDA. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This case study applies those statistical methods to accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being evaluated for water bioburden testing. Results presented demonstrate that the statistical methods described in the PDA Technical Report 33 chapter can all be successfully applied to the rapid microbiological method data sets and give the same interpretation for equivalence to the standard method. The rapid microbiological method was in general able to pass the requirements of PDA Technical Report 33, though the study shows that there can be occasional outlying results and that caution should be used when applying statistical methods to low average colony-forming unit values. Prior to use in a quality-controlled environment, any new method or technology has to be shown to work as designed by the manufacturer for the purpose required. For new rapid microbiological methods that detect and enumerate contaminating microorganisms, additional recommendations have been provided in the revised PDA Technical Report No. 33. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This paper applies those statistical methods to analyze accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being validated for water bioburden testing. The case study demonstrates that the statistical methods described in the PDA Technical Report No. 33 chapter can be successfully applied to rapid microbiological method data sets and give the same comparability results for similarity or difference as the standard method. © PDA, Inc

  2. Statistics of 2D solitons

    International Nuclear Information System (INIS)

    Brekke, L.; Imbo, T.D.

    1992-01-01

    The authors study the inequivalent quantizations of (1 + 1)-dimensional nonlinear sigma models with space manifold S^1 and target manifold X. If X is multiply connected, these models possess topological solitons. After providing a definition of spin and statistics for these solitons and demonstrating a spin-statistics correlation, the authors give various examples in which the solitons can have exotic statistics. In some of these models, the solitons may obey a generalized version of fractional statistics called ambistatistics. The relevance of these 2D models to the statistics of vortices in (2 + 1)-dimensional spontaneously broken gauge theories is discussed. The authors close with a discussion concerning the extension of these results to higher dimensions.

  3. Performance of statistical process control methods for regional surgical site infection surveillance: a 10-year multicentre pilot study.

    Science.gov (United States)

    Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J

    2017-11-24

    Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Shewhart and EWMA charts had low false-positive rates when used to analyse SSI data from separate control hospitals. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights
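
    An EWMA chart of the kind described smooths each new observation into a running statistic and signals when that statistic crosses a control limit. A minimal sketch with made-up monthly infection rates; the smoothing factor and limit width below are common textbook defaults, not the study's tuned values:

```python
import math

def ewma_signals(rates, center, sigma, lam=0.2, L=2.7):
    """Indices where the EWMA of a rate series exceeds the upper control
    limit (asymptotic limit used for brevity)."""
    ucl = center + L * sigma * math.sqrt(lam / (2.0 - lam))
    z, flags = center, []
    for i, x in enumerate(rates):
        z = lam * x + (1.0 - lam) * z   # exponentially weighted running mean
        if z > ucl:
            flags.append(i)
    return flags

# Synthetic monthly SSI rates: stable baseline, then a sustained increase.
rates = [2.0] * 12 + [4.0] * 12
print(ewma_signals(rates, center=2.0, sigma=0.5))
```

    Because the EWMA accumulates evidence across consecutive points, it flags sustained modest shifts sooner than a Shewhart chart, which reacts only to individual extreme points.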

  4. Evidence-based orthodontics. Current statistical trends in published articles in one journal.

    Science.gov (United States)

    Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J

    2010-09-01

    To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared with data from three previous years: 1975, 1985, and 2003. The AJODO original articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the share of original articles using statistics thus stayed relatively stable from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in articles using only descriptive statistics (from 7.1% to 15.6%) and a corresponding 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).

  5. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  6. Meta-analysis of prediction model performance across multiple studies: Which scale helps ensure between-study normality for the C-statistic and calibration measures?

    Science.gov (United States)

    Snell, Kym Ie; Ensor, Joie; Debray, Thomas Pa; Moons, Karel Gm; Riley, Richard D

    2017-01-01

    If individual participant data are available from multiple studies or clusters, then a prediction model can be externally validated multiple times. This allows the model's discrimination and calibration performance to be examined across different settings. Random-effects meta-analysis can then be used to quantify overall (average) performance and heterogeneity in performance. This typically assumes a normal distribution of 'true' performance across studies. We conducted a simulation study to examine this normality assumption for various performance measures relating to a logistic regression prediction model. We simulated data across multiple studies with varying degrees of variability in baseline risk or predictor effects and then evaluated the shape of the between-study distribution in the C-statistic, calibration slope, calibration-in-the-large, and E/O statistic, and possible transformations thereof. We found that a normal between-study distribution was usually reasonable for the calibration slope and calibration-in-the-large; however, the distributions of the C-statistic and E/O were often skewed across studies, particularly in settings with large variability in the predictor effects. Normality was vastly improved when using the logit transformation for the C-statistic and the log transformation for E/O, and therefore we recommend these scales to be used for meta-analysis. An illustrated example is given using a random-effects meta-analysis of the performance of QRISK2 across 25 general practices.
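
    The recommended transformations are straightforward to apply before pooling. A sketch with hypothetical per-study C-statistics; an unweighted mean is used for brevity, whereas the paper's setting calls for random-effects weights:

```python
import math

def logit(p):
    return math.log(p / (1.0 - p))

def inv_logit(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical external-validation C-statistics from five studies.
c_stats = [0.72, 0.78, 0.81, 0.69, 0.75]

# Pool on the logit scale, then back-transform to the C-statistic scale.
pooled = inv_logit(sum(logit(c) for c in c_stats) / len(c_stats))
print(round(pooled, 3))
```

    The same pattern applies to the E/O statistic, with the log transformation in place of the logit; both keep the pooled value inside the measure's valid range.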

  7. Subjective Ratings of Beauty and Aesthetics: Correlations With Statistical Image Properties in Western Oil Paintings

    Science.gov (United States)

    Lehmann, Thomas; Redies, Christoph

    2017-01-01

    For centuries, oil paintings have been a major segment of the visual arts. The JenAesthetics data set consists of a large number of high-quality images of oil paintings of Western provenance from different art periods. With this database, we studied the relationship between objective image measures and subjective evaluations of the images, especially evaluations on aesthetics (defined as artistic value) and beauty (defined as individual liking). The objective measures represented low-level statistical image properties that have been associated with aesthetic value in previous research. Subjective rating scores on aesthetics and beauty correlated not only with each other but also with different combinations of the objective measures. Furthermore, we found that paintings from different art periods vary with regard to the objective measures, that is, they exhibit specific patterns of statistical image properties. In addition, clusters of participants preferred different combinations of these properties. In conclusion, the results of the present study provide evidence that statistical image properties vary between art periods and subject matters and, in addition, they correlate with the subjective evaluation of paintings by the participants. PMID:28694958

  8. Subsequent pregnancy outcome after previous foetal death

    NARCIS (Netherlands)

    Nijkamp, J. W.; Korteweg, F. J.; Holm, J. P.; Timmer, A.; Erwich, J. J. H. M.; van Pampus, M. G.

    Objective: A history of foetal death is a risk factor for complications and foetal death in subsequent pregnancies as most previous risk factors remain present and an underlying cause of death may recur. The purpose of this study was to evaluate subsequent pregnancy outcome after foetal death and to

  9. A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L

    2014-01-01

    We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
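
    The between-groups d that this single-case statistic is designed to match is the familiar pooled-SD standardised mean difference; a minimal sketch of that reference quantity (the paper's single-case estimator itself involves additional corrections not shown here):

```python
import statistics

def cohens_d(group1, group2):
    """Between-groups standardised mean difference with pooled SD."""
    n1, n2 = len(group1), len(group2)
    pooled_var = ((n1 - 1) * statistics.variance(group1)
                  + (n2 - 1) * statistics.variance(group2)) / (n1 + n2 - 2)
    return (statistics.mean(group1) - statistics.mean(group2)) / pooled_var ** 0.5

# Toy treatment vs control outcome scores.
print(round(cohens_d([4.0, 6.0], [1.0, 3.0]), 3))
```

    Expressing a single-case effect on this scale is what allows it to be meta-analysed alongside between-groups studies of the same question.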

  10. Statistics Anxiety and Business Statistics: The International Student

    Science.gov (United States)

    Bell, James A.

    2008-01-01

    Does the international student suffer from statistics anxiety? To investigate this, the Statistics Anxiety Rating Scale (STARS) was administered to sixty-six beginning statistics students, including twelve international students and fifty-four domestic students. Due to the small number of international students, nonparametric methods were used to…

  11. The biomechanics of running in athletes with previous hamstring injury: A case-control study.

    Science.gov (United States)

    Daly, C; Persson, U McCarthy; Twycross-Lewis, R; Woledge, R C; Morrissey, D

    2016-04-01

    Hamstring injury is prevalent with persistently high reinjury rates. We aim to inform hamstring rehabilitation by exploring the electromyographic and kinematic characteristics of running in athletes with previous hamstring injury. Nine elite male Gaelic games athletes who had returned to sport after hamstring injury and eight closely matched controls sprinted while lower limb kinematics and muscle activity of the previously injured biceps femoris, bilateral gluteus maximus, lumbar erector spinae, rectus femoris, and external oblique were recorded. Intergroup comparisons of muscle activation ratios and kinematics were performed. Previously injured athletes demonstrated significantly reduced biceps femoris muscle activation ratios with respect to ipsilateral gluteus maximus (maximum difference -12.5%, P = 0.03), ipsilateral erector spinae (maximum difference -12.5%, P = 0.01), ipsilateral external oblique (maximum difference -23%, P = 0.01), and contralateral rectus femoris (maximum difference -22%, P = 0.02) in the late swing phase. We also detected sagittal asymmetry in hip flexion (maximum 8°, P = 0.01), pelvic tilt (maximum 4°, P = 0.02), and medial rotation of the knee (maximum 6°, P = 0.03) effectively putting the hamstrings in a lengthened position just before heel strike. Previous hamstring injury is associated with altered biceps femoris associated muscle activity and potentially injurious kinematics. These deficits should be considered and addressed during rehabilitation. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)

    Science.gov (United States)

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497

  13. Comparison of Vital Statistics Definitions of Suicide against a Coroner Reference Standard: A Population-Based Linkage Study.

    Science.gov (United States)

    Gatov, Evgenia; Kurdyak, Paul; Sinyor, Mark; Holder, Laura; Schaffer, Ayal

    2018-03-01

    We sought to determine the utility of health administrative databases for population-based suicide surveillance, as these data are generally more accessible and better integrated with other data sources than coroners' records. In this retrospective validation study, we identified all coroner-confirmed suicides between 2003 and 2012 in Ontario residents aged 21 and over and linked this information to Statistics Canada's vital statistics data set. We examined the overlap between the underlying cause of death field and secondary causes of death using ICD-9 and ICD-10 codes for deliberate self-harm (i.e., suicide) and examined the sociodemographic and clinical characteristics of misclassified records. Among 10,153 linked deaths, there was a very high degree of overlap between records coded as deliberate self-harm in the vital statistics data set and coroner-confirmed suicides using both ICD-9 and ICD-10 definitions (96.88% and 96.84% sensitivity, respectively). This alignment steadily increased throughout the study period (from 95.9% to 98.8%). Other vital statistics diagnoses in primary fields included uncategorised signs and symptoms. Vital statistics records that were misclassified did not differ from valid records in terms of sociodemographic characteristics but were more likely to have had an unspecified place of injury on the death certificate (P < 0.05). The high degree of agreement between vital statistics and coroner classification of suicide deaths suggests that health administrative data can reliably be used to identify suicide deaths.
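
    The sensitivity figures reported here are the share of reference-standard cases that the administrative definition captures. A minimal sketch; the case split below is back-computed from the reported 96.88% over 10,153 deaths and is illustrative only:

```python
def sensitivity(true_positives, false_negatives):
    """Share of reference-standard (coroner-confirmed) cases that the
    administrative-data definition captures."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical split consistent with the reported 96.88% of 10,153 deaths.
print(round(100 * sensitivity(9836, 317), 2))
```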

  14. Statistical analysis plan for the Adjunctive Corticosteroid Treatment in Critically Ill Patients with Septic Shock (ADRENAL) trial

    DEFF Research Database (Denmark)

    Billot, Laurent; Venkatesh, Balasubramanian; Myburgh, John

    2017-01-01

    BACKGROUND: The Adjunctive Corticosteroid Treatment in Critically Ill Patients with Septic Shock (ADRENAL) trial, a 3800-patient, multicentre, randomised controlled trial, will be the largest study to date of corticosteroid therapy in patients with septic shock. OBJECTIVE: To describe a statistical...... and statisticians and approved by the ADRENAL management committee. All authors were blind to treatment allocation and to the unblinded data produced during two interim analyses conducted by the Data Safety and Monitoring Committee. The data shells were produced from a previously published protocol. Statistical...... analyses are described in broad detail. Trial outcomes were selected and categorised into primary, secondary and tertiary outcomes, and appropriate statistical comparisons between groups are planned and described in a way that is transparent, available to the public, verifiable and determined before...

  15. 47 CFR 1.363 - Introduction of statistical data.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Introduction of statistical data. 1.363 Section... Proceedings Evidence § 1.363 Introduction of statistical data. (a) All statistical studies, offered in... analyses, and experiments, and those parts of other studies involving statistical methodology shall be...

  16. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    Science.gov (United States)

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. © 2015 J. Masel et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
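
    The biomedical Bayes' theorem applications mentioned above typically center on diagnostic testing, where the positive predictive value depends heavily on prevalence. A minimal sketch with illustrative test characteristics (not figures from the course):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value of a diagnostic test via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# A 99%-sensitive, 95%-specific test for a condition with 1% prevalence:
print(round(ppv(0.99, 0.95, 0.01), 3))
```

    That a positive result from an accurate test can still most likely be a false positive is exactly the kind of documented misconception such a course aims to counter.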

  17. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day

    Science.gov (United States)

    Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann

    2013-01-01

    Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…
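
    A common statistical-process-control tool for single-case data of this kind is the individuals (I) chart, which flags observations outside 3-sigma limits. A minimal sketch (the %SS values are made up for illustration; the study's charting choices may differ):

```python
from statistics import mean

def individuals_chart_limits(samples):
    """Center line and 3-sigma limits for an individuals (I) chart,
    estimating sigma from the average moving range (d2 = 1.128 for n = 2)."""
    center = mean(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma_hat = mean(moving_ranges) / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

# Hypothetical %SS measurements across one day (not the study's data)
pct_ss = [4.2, 5.1, 3.8, 6.0, 4.5, 5.4, 4.9, 3.9, 5.7, 4.4]
lcl, cl, ucl = individuals_chart_limits(pct_ss)
out_of_control = [x for x in pct_ss if x < lcl or x > ucl]
```

    Points inside the limits are consistent with ordinary within-day variability; points outside them suggest a situational effect worth investigating.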

  18. Statistical summaries of selected Iowa streamflow data through September 2013

    Science.gov (United States)

    Eash, David A.; O'Shea, Padraic S.; Weber, Jared R.; Nguyen, Kevin T.; Montgomery, Nicholas L.; Simonson, Adrian J.

    2016-01-04

    Statistical summaries of streamflow data collected at 184 streamgages in Iowa are presented in this report. All streamgages included for analysis have at least 10 years of continuous record collected before or through September 2013. This report is an update to two previously published reports that presented statistical summaries of selected Iowa streamflow data through September 1988 and September 1996. The statistical summaries include (1) monthly and annual flow durations, (2) annual exceedance probabilities of instantaneous peak discharges (flood frequencies), (3) annual exceedance probabilities of high discharges, and (4) annual nonexceedance probabilities of low discharges and seasonal low discharges. Also presented for each streamgage are graphs of the annual mean discharges, mean annual mean discharges, 50-percent annual flow-duration discharges (median flows), harmonic mean flows, mean daily mean discharges, and flow-duration curves. Two sets of statistical summaries are presented for each streamgage, which include (1) long-term statistics for the entire period of streamflow record and (2) recent-term statistics for the 30-year period of record from 1984 to 2013. The recent-term statistics are only calculated for streamgages with streamflow records pre-dating the 1984 water year and with at least 10 years of record during 1984–2013. The streamflow statistics in this report are not adjusted for the effects of water use; although some of this water is used consumptively, most of it is returned to the streams.
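
    A flow-duration point such as the 50-percent discharge is the flow equaled or exceeded that percentage of the time. A minimal sketch using the Weibull plotting position (the discharge values are made up; the report's actual procedures are more involved):

```python
def flow_duration(daily_discharge, percent_exceeded):
    """Discharge equaled or exceeded `percent_exceeded` percent of the time,
    using the Weibull plotting position p = rank / (n + 1)."""
    ranked = sorted(daily_discharge, reverse=True)  # rank 1 = largest flow
    n = len(ranked)
    # return the first discharge whose exceedance probability reaches the target
    for rank, q in enumerate(ranked, start=1):
        if 100 * rank / (n + 1) >= percent_exceeded:
            return q
    return ranked[-1]

flows = [12.0, 30.0, 7.5, 55.0, 9.0, 14.0, 22.0, 5.0, 18.0]  # hypothetical values
median_flow = flow_duration(flows, 50)  # the 50-percent flow-duration discharge
```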

  19. Dexamethasone intravitreal implant in previously treated patients with diabetic macular edema : Subgroup analysis of the MEAD study

    OpenAIRE

    Augustin, A.J.; Kuppermann, B.D.; Lanzetta, P.; Loewenstein, A.; Li, X.; Cui, H.; Hashad, Y.; Whitcup, S.M.; Abujamra, S.; Acton, J.; Ali, F.; Antoszyk, A.; Awh, C.C.; Barak, A.; Bartz-Schmidt, K.U.

    2015-01-01

    Background Dexamethasone intravitreal implant 0.7 mg (DEX 0.7) was approved for treatment of diabetic macular edema (DME) after demonstration of its efficacy and safety in the MEAD registration trials. We performed subgroup analysis of MEAD study results to evaluate the efficacy and safety of DEX 0.7 treatment in patients with previously treated DME. Methods Three-year, randomized, sham-controlled phase 3 study in patients with DME, best-corrected visual acuity (BCVA) of 34–68 Early Treatment...

  20. Spreadsheets as tools for statistical computing and statistics education

    OpenAIRE

    Neuwirth, Erich

    2000-01-01

    Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.

  1. Multispecies Coevolution Particle Swarm Optimization Based on Previous Search History

    Directory of Open Access Journals (Sweden)

    Danping Wang

    2017-01-01

    Full Text Available A hybrid coevolution particle swarm optimization algorithm with a dynamic multispecies strategy based on K-means clustering and a nonrevisit strategy based on a Binary Space Partitioning fitness tree (called MCPSO-PSH) is proposed. Previous search history memorized in the Binary Space Partitioning fitness tree can effectively restrain the individuals' revisit phenomenon. The whole population is partitioned into several subspecies, and cooperative coevolution is realized by an information communication mechanism between subspecies, which can enhance the global search ability of particles and avoid premature convergence to a local optimum. To demonstrate the power of the method, comparisons between the proposed algorithm and state-of-the-art algorithms are grouped into three categories: 10 basic benchmark functions (10-dimensional and 30-dimensional), 10 CEC2005 benchmark functions (30-dimensional), and a real-world problem (multilevel image segmentation). Experimental results show that MCPSO-PSH displays a competitive performance compared to the other swarm-based or evolutionary algorithms in terms of solution accuracy and statistical tests.
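
    For readers unfamiliar with the base algorithm, the global-best PSO update that such hybrids build on can be sketched as follows. This is a plain baseline only, not MCPSO-PSH itself (which adds K-means subspecies and the BSP-tree nonrevisit strategy on top); the inertia and acceleration parameters are common textbook values:

```python
import random

def pso_sphere(dim=5, swarm=20, iters=200, seed=1):
    """Minimal global-best PSO minimizing the sphere function sum(x_i^2)."""
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]          # personal bests
    gbest = min(pbest, key=f)            # global best
    w, c1, c2 = 0.72, 1.49, 1.49         # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return f(gbest)

best = pso_sphere()  # converges close to 0 on this convex test function
```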

  2. Common pitfalls in statistical analysis: “P” values, statistical significance and confidence intervals

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the ‘P’ value, explain the importance of ‘confidence intervals’ and clarify the importance of including both values in a paper. PMID:25878958
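
    The link between the two quantities can be made concrete: for a difference in means, the two-sided p-value falls below the significance level exactly when the corresponding confidence interval excludes zero. A minimal large-sample sketch (the group summaries are made-up numbers):

```python
from math import sqrt
from statistics import NormalDist

def mean_diff_ci(m1, s1, n1, m2, s2, n2, level=0.95):
    """Large-sample (z-based) CI for a difference in means; the two-sided
    p-value is below (1 - level) exactly when this interval excludes zero."""
    z = NormalDist().inv_cdf(0.5 + level / 2)
    se = sqrt(s1**2 / n1 + s2**2 / n2)
    diff = m1 - m2
    return diff - z * se, diff + z * se

# Hypothetical summaries: mean 10.0 (SD 2.0, n 50) vs mean 9.0 (SD 2.0, n 50)
lo, hi = mean_diff_ci(10.0, 2.0, 50, 9.0, 2.0, 50)
significant = not (lo <= 0 <= hi)  # True here: the CI excludes zero
```

    Reporting the interval (here roughly 0.22 to 1.78) conveys both the significance verdict and the plausible size of the effect, which a bare p-value does not.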

  3. Statistics II essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Statistics II discusses sampling theory, statistical inference, independent and dependent variables, correlation theory, experimental design, count data, chi-square test, and time series

  4. Exploring Statistics Anxiety: Contrasting Mathematical, Academic Performance and Trait Psychological Predictors

    Science.gov (United States)

    Bourne, Victoria J.

    2018-01-01

    Statistics anxiety is experienced by a large number of psychology students, and previous research has examined a range of potential correlates, including academic performance, mathematical ability and psychological predictors. These varying predictors are often considered separately, although there may be shared variance between them. In the…

  5. A comparative analysis of the statistical properties of large mobile phone calling networks.

    Science.gov (United States)

    Li, Ming-Xia; Jiang, Zhi-Qiang; Xie, Wen-Jie; Miccichè, Salvatore; Tumminello, Michele; Zhou, Wei-Xing; Mantegna, Rosario N

    2014-05-30

    Mobile phone calling is one of the most widely used communication methods in modern society. The records of calls among mobile phone users provide a valuable proxy for understanding human communication patterns embedded in social networks. Mobile phone users call each other, forming a directed calling network. If only reciprocal calls are considered, we obtain an undirected mutual calling network. The preferential communication behavior between two connected users can be statistically tested, and it results in two Bonferroni networks with statistically validated edges. We perform a comparative analysis of the statistical properties of these four networks, which are constructed from the calling records of more than nine million individuals in Shanghai over a period of 110 days. We find that these networks share many common structural properties and also exhibit idiosyncratic features when compared with previously studied large mobile calling networks. The empirical findings provide an intriguing picture of a representative large social network that might shed new light on the modelling of large social networks.
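
    The directed-versus-mutual distinction the abstract describes is easy to make concrete: a directed edge exists for each caller-callee pair, and a mutual edge only when both directions occur. A minimal sketch with toy call records (the Bonferroni edge-validation step is a separate statistical test not shown here):

```python
def build_networks(call_records):
    """Directed calling network and its undirected mutual (reciprocal)
    subnetwork; `call_records` is an iterable of (caller, callee) pairs."""
    directed = set(call_records)
    mutual = {frozenset((a, b)) for (a, b) in directed if (b, a) in directed}
    return directed, mutual

# Toy records: A and B call each other; A calls C but C never calls back
calls = [("A", "B"), ("B", "A"), ("A", "C"), ("C", "D"), ("D", "C")]
directed, mutual = build_networks(calls)
# mutual keeps the A-B and C-D reciprocal pairs but drops the A->C edge
```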

  6. Applied Statistics with SPSS

    Science.gov (United States)

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  7. Register-based statistics statistical methods for administrative data

    CERN Document Server

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up-to-date treatment of theory and practical implementation in register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi

  8. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  9. Parallel auto-correlative statistics with VTK.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

    This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
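
    The core quantity such an engine computes is the sample autocorrelation at a given lag. A minimal serial sketch (in Python rather than the report's C++, and without VTK's pipeline machinery or parallel reduction):

```python
from statistics import mean

def autocorrelation(x, lag):
    """Sample autocorrelation of a series at the given lag."""
    mu = mean(x)
    num = sum((x[t] - mu) * (x[t + lag] - mu) for t in range(len(x) - lag))
    den = sum((v - mu) ** 2 for v in x)
    return num / den

series = [1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 2.0, 3.0, 4.0]
r1 = autocorrelation(series, 1)  # positive: neighboring values move together
```

    The parallel engines in the report distribute the sums in the numerator and denominator across processes and combine them, which is why scalability analysis focuses on those reductions.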

  10. ON STATISTICALLY CONVERGENT IN FINITE DIMENSIONAL SPACES

    OpenAIRE

    GÜNCAN, Ayşe Nur

    2009-01-01

    Abstract: In this paper, the notion of statistical convergence, introduced by Steinhaus (1951), is studied in R^m; some concepts and theorems whose statistical counterparts had previously been given for real number sequences are carried over to R^m. In addition, the concepts of the statistical limit point and the statistical cluster point are given, and it is noted that Fridy's 1993 study showed these two concepts are not equivalent. These concepts are given in R^m and the i...
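
    For reference, the underlying real-sequence definition that the paper extends to R^m can be stated via natural density (standard notation, not quoted from the paper):

```latex
% Natural density of a set A of positive integers:
d(A) = \lim_{n\to\infty} \frac{1}{n}\,\bigl|\{\, k \le n : k \in A \,\}\bigr|

% A sequence (x_k) is statistically convergent to L if, for every
% \varepsilon > 0, the "bad" indices form a density-zero set:
d\bigl(\{\, k : |x_k - L| \ge \varepsilon \,\}\bigr) = 0
```

    A statistical limit point is the limit of a subsequence indexed by a set of positive (nonzero) density, while a statistical cluster point only requires the nearby indices to have nonzero upper density; Fridy showed these need not coincide.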

  11. Statistics for Learning Genetics

    Science.gov (United States)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college-level classes at two universities were surveyed, one located in the northeastern US and the other in the West Indies. The sample size was 42 students, and a supplementary interview was administered to 9 selected students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students performed well on exams, with 60% receiving an A or B grade, 77% did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as the genetics syllabi used by instructors, do not help the issue. The textbooks often either did not give effective explanations for students or left certain topics out entirely. The same omission of statistically/mathematically oriented topics was true of the genetics syllabi reviewed for this study. Nonetheless

  12. Learning Styles Preferences of Statistics Students: A Study in the Faculty of Business and Economics at the UAE University

    Science.gov (United States)

    Yousef, Darwish Abdulrahman

    2016-01-01

    Purpose: Although there are many studies addressing the learning styles of business students as well as students of other disciplines, there are few studies which address the learning style preferences of statistics students. The purpose of this study is to explore the learning style preferences of statistics students at a United Arab Emirates…

  13. Experimental statistics for biological sciences.

    Science.gov (United States)

    Bang, Heejung; Davidian, Marie

    2010-01-01

    In this chapter, we cover basic and fundamental principles and methods in statistics - from "What are Data and Statistics?" to "ANOVA and linear regression" - which are the basis of any statistical thinking and undertaking. Readers can easily find the selected topics in most introductory statistics textbooks, but we have tried to assemble and structure them in a succinct and reader-friendly manner in a stand-alone chapter. This text has long been used in real classroom settings for both undergraduate and graduate students who do or do not major in statistical sciences. We hope that from this chapter, readers will understand the key statistical concepts and terminologies, how to design a study (experimental or observational), how to analyze the data (e.g., describe the data and/or estimate the parameter(s) and make inference), and how to interpret the results. This text would be most useful as supplemental material while readers take their own statistics courses, or as a reference text accompanying the manual of any statistical software, serving as a self-teaching guide.

  14. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    Science.gov (United States)

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  15. Statistical mechanics of learning orthogonal signals for general covariance models

    International Nuclear Information System (INIS)

    Hoyle, David C

    2010-01-01

    Statistical mechanics techniques have proved to be useful tools in quantifying the accuracy with which signal vectors are extracted from experimental data. However, analysis has previously been limited to specific model forms for the population covariance C, which may be inappropriate for real world data sets. In this paper we obtain new statistical mechanical results for a general population covariance matrix C. For data sets consisting of p sample points in R^N we use the replica method to study the accuracy of orthogonal signal vectors estimated from the sample data. In the asymptotic limit of N,p→∞ at fixed α = p/N, we derive analytical results for the signal direction learning curves. In the asymptotic limit the learning curves follow a single universal form, each displaying a retarded learning transition. An explicit formula for the location of the retarded learning transition is obtained and we find marked variation in the location of the retarded learning transition dependent on the distribution of population covariance eigenvalues. The results of the replica analysis are confirmed against simulation

  16. Software Used to Generate Cancer Statistics - SEER Cancer Statistics

    Science.gov (United States)

    Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.

  17. Single-cell mRNA transfection studies: delivery, kinetics and statistics by numbers.

    Science.gov (United States)

    Leonhardt, Carolin; Schwake, Gerlinde; Stögbauer, Tobias R; Rappl, Susanne; Kuhr, Jan-Timm; Ligon, Thomas S; Rädler, Joachim O

    2014-05-01

    In artificial gene delivery, messenger RNA (mRNA) is an attractive alternative to plasmid DNA (pDNA) since it does not require transfer into the cell nucleus. Here we show that, unlike for pDNA transfection, the delivery statistics and dynamics of mRNA-mediated expression are generic and predictable in terms of mathematical modeling. We measured the single-cell expression time-courses and levels of enhanced green fluorescent protein (eGFP) using time-lapse microscopy and flow cytometry (FC). The single-cell analysis provides direct access to the distribution of onset times, life times and expression rates of mRNA and eGFP. We introduce a two-step stochastic delivery model that reproduces the number distribution of successfully delivered and translated mRNA molecules and thereby the dose-response relation. Our results establish a statistical framework for mRNA transfection and as such should advance the development of RNA carriers and small interfering/micro RNA-based drugs. This team of authors established a statistical framework for mRNA transfection by using a two-step stochastic delivery model that reproduces the number distribution of successfully delivered and translated mRNA molecules and thereby their dose-response relation. This study establishes a nice connection between theory and experimental planning and will aid the cellular delivery of mRNA molecules. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
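
    The "two-step stochastic delivery" idea can be illustrated with a toy simulation: a random number of carrier complexes reaches each cell, and each complex contributes a random mRNA dose, producing the broad cell-to-cell distribution the study measures. The functional forms and parameter values below are illustrative assumptions, not the paper's fitted model:

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's inversion-by-multiplication Poisson sampler (fine for small lambda)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_expression(n_cells, mean_complexes, mrna_per_complex_mean, seed=0):
    """Toy two-step delivery model: step 1, the number of complexes entering
    a cell is Poisson; step 2, each complex releases an exponentially
    distributed mRNA dose. A cell's total delivered mRNA is their sum."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_cells):
        n_complexes = poisson_sample(rng, mean_complexes)
        totals.append(sum(rng.expovariate(1.0 / mrna_per_complex_mean)
                          for _ in range(n_complexes)))
    return totals

totals = simulate_expression(n_cells=1000, mean_complexes=2.0,
                             mrna_per_complex_mean=350.0)
frac_silent = sum(t == 0 for t in totals) / len(totals)  # cells with no delivery
```

    Even this crude sketch reproduces a qualitative feature of single-cell transfection data: a sizeable fraction of cells receives nothing at all (here roughly e^-2 of them), while the rest show widely varying doses.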

  18. Towards evidence-based computational statistics: lessons from clinical research on the role and design of real-data benchmark studies.

    Science.gov (United States)

    Boulesteix, Anne-Laure; Wilson, Rory; Hapfelmeier, Alexander

    2017-09-09

    The goal of medical research is to develop interventions that are in some sense superior, with respect to patient outcome, to interventions currently in use. Similarly, the goal of research in methodological computational statistics is to develop data analysis tools that are themselves superior to the existing tools. The methodology of the evaluation of medical interventions continues to be discussed extensively in the literature and it is now well accepted that medicine should be at least partly "evidence-based". Although we statisticians are convinced of the importance of unbiased, well-thought-out study designs and evidence-based approaches in the context of clinical research, we tend to ignore these principles when designing our own studies for evaluating statistical methods in the context of our methodological research. In this paper, we draw an analogy between clinical trials and real-data-based benchmarking experiments in methodological statistical science, with datasets playing the role of patients and methods playing the role of medical interventions. Through this analogy, we suggest directions for improvement in the design and interpretation of studies which use real data to evaluate statistical methods, in particular with respect to dataset inclusion criteria and the reduction of various forms of bias. More generally, we discuss the concept of "evidence-based" statistical research, its limitations and its impact on the design and interpretation of real-data-based benchmark experiments. We suggest that benchmark studies, a method of assessment of statistical methods using real-world datasets, might benefit from adopting (some) concepts from evidence-based medicine towards the goal of more evidence-based statistical research.

  19. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted e.g. to independency of events, inequalities in probability and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  20. Challenging previous conceptions of vegetarianism and eating disorders.

    Science.gov (United States)

    Fisak, B; Peterson, R D; Tantleff-Dunn, S; Molnar, J M

    2006-12-01

    The purpose of this study was to replicate and expand upon previous research that has examined the potential association between vegetarianism and disordered eating. Limitations of previous research are addressed, including possibly low reliability of measures of eating pathology within vegetarian samples, use of only a few dietary restraint measures, and a paucity of research examining potential differences in body image and food choice motives of vegetarians versus nonvegetarians. Two hundred and fifty-six college students completed a number of measures of eating pathology and body image, and a food choice motives questionnaire. Interestingly, no significant differences were found between vegetarians and nonvegetarians in measures of eating pathology or body image. However, significant differences in food choice motives were found. Implications for both researchers and clinicians are discussed.

  1. Excel 2016 for health services management statistics a guide to solving problems

    CERN Document Server

    Quirk, Thomas J

    2016-01-01

    This book shows the capabilities of Microsoft Excel in teaching health services management statistics effectively. Similar to the previously published Excel 2013 for Health Services Management Statistics, this book is a step-by-step, exercise-driven guide for students and practitioners who need to master Excel to solve practical health service management problems. If understanding statistics isn't your strongest suit, you are not especially mathematically inclined, or you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in health service courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Health Services Management Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and managers how to apply...

  2. Methodological difficulties of conducting agroecological studies from a statistical perspective

    DEFF Research Database (Denmark)

    Bianconi, A.; Dalgaard, Tommy; Manly, Bryan F J

    2013-01-01

    Statistical methods for analysing agroecological data might not be able to help agroecologists to solve all of the current problems concerning crop and animal husbandry, but such methods could well help agroecologists to assess, tackle, and resolve several agroecological issues in a more reliable and accurate manner. Therefore, our goal in this paper is to discuss the importance of statistical tools for alternative agronomic approaches, because alternative approaches, such as organic farming, should not only be promoted by encouraging farmers to deploy agroecological techniques, but also by providing

  3. Understanding Statistics and Statistics Education: A Chinese Perspective

    Science.gov (United States)

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  4. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  5. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  6. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics

    Science.gov (United States)

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
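
    The core of the sufficient-summary-statistic idea, weighting each subject by the precision of its summary rather than treating all subject means equally, can be sketched with inverse-variance weighting (the subject summaries below are made-up numbers, and this is one simple instance of the approach, not the paper's full recipe):

```python
from math import sqrt

def weighted_group_z(subject_means, subject_vars, subject_ns):
    """Group-level z-statistic weighting each subject's mean by the inverse of
    its estimated variance-of-the-mean (1 / (var / n)), rather than treating
    all subject means equally as a naive group-level t-test would."""
    weights = [n / v for v, n in zip(subject_vars, subject_ns)]
    pooled = sum(w * m for w, m in zip(weights, subject_means)) / sum(weights)
    se = sqrt(1 / sum(weights))
    return pooled / se

# Hypothetical per-subject summaries: mean effect, within-subject variance, trials
z = weighted_group_z(
    subject_means=[0.8, 1.1, 0.9, 1.3],
    subject_vars=[4.0, 1.0, 9.0, 2.0],
    subject_ns=[40, 40, 40, 40],
)
```

    Subjects with noisier data (larger within-subject variance) contribute less to the pooled estimate, which is where the power gain over the naive approach comes from.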

  7. A previous hamstring injury affects kicking mechanics in soccer players.

    Science.gov (United States)

    Navandar, Archit; Veiga, Santiago; Torres, Gonzalo; Chorro, David; Navarro, Enrique

    2018-01-10

    Although the kicking skill is influenced by limb dominance and sex, how a previous hamstring injury affects kicking has not been studied in detail. Thus, the objective of this study was to evaluate the effect of sex and limb dominance on kicking in limbs with and without a previous hamstring injury. 45 professional players (males: n=19, previously injured players=4, age=21.16 ± 2.00 years; females: n=19, previously injured players=10, age=22.15 ± 4.50 years) performed 5 kicks each with their preferred and non-preferred limb at a target 7 m away, which were recorded with a three-dimensional motion capture system. Kinematic and kinetic variables were extracted for the backswing, leg cocking, leg acceleration and follow through phases. A shorter backswing (20.20 ± 3.49% vs 25.64 ± 4.57%), and differences in knee flexion angle (58 ± 10° vs 72 ± 14°) and hip flexion velocity (8 ± 0 rad/s vs 10 ± 2 rad/s) were observed in previously injured, non-preferred limb kicks for females. A lower peak hip linear velocity (3.50 ± 0.84 m/s vs 4.10 ± 0.45 m/s) was observed in previously injured, preferred limb kicks of females. These differences occurred in the backswing and leg-cocking phases where the hamstring muscles were the most active. A variation in the functioning of the hamstring muscles and that of the gluteus maximus and iliopsoas in the case of a previous injury could account for the differences observed in the kicking pattern. Therefore, the effects of a previous hamstring injury must be considered while designing rehabilitation programs to re-educate kicking movement.

  8. Statistical theory of resistive drift-wave turbulence and transport

    International Nuclear Information System (INIS)

    Hu, G.; Krommes, J.A.; Bowman, J.C.

    1997-01-01

    Resistive drift-wave turbulence in a slab geometry is studied by statistical closure methods and direct numerical simulations. The two-field Hasegawa–Wakatani (HW) fluid model, which evolves the electrostatic potential and plasma density self-consistently, is a paradigm for understanding the generic nonlinear behavior of multiple-field plasma turbulence. A gyrokinetic derivation of the HW model is sketched. The recently developed Realizable Markovian Closure (RMC) is applied to the HW model; spectral properties, nonlinear energy transfers, and turbulent transport calculations are discussed. The closure results are also compared to direct numerical simulation results; excellent agreement is found. The transport scaling with the adiabaticity parameter, which measures the strength of the parallel electron resistivity, is analytically derived and understood through weak- and strong-turbulence analyses. No evidence is found to support previous suggestions that coherent structures cause a large depression of saturated transport from its quasilinear value in the hydrodynamic regime of the HW model. Instead, the depression of transport is well explained by the spectral balance equation of the (second-order) statistical closure when account is taken of incoherent noise. copyright 1997 American Institute of Physics

  9. Statistical mechanics of anyons

    International Nuclear Information System (INIS)

    Arovas, D.P.

    1985-01-01

    We study the statistical mechanics of a two-dimensional gas of free anyons - particles which interpolate between Bose-Einstein and Fermi-Dirac character. Thermodynamic quantities are discussed in the low-density regime. In particular, the second virial coefficient is evaluated by two different methods and is found to exhibit a simple, periodic, but nonanalytic behavior as a function of the statistics-determining parameter. (orig.)

  10. Multimodal integration in statistical learning

    DEFF Research Database (Denmark)

    Mitchell, Aaron; Christiansen, Morten Hyllekvist; Weiss, Dan

    2014-01-01

    Recent advances in the field of statistical learning have established that learners are able to track regularities of multimodal stimuli, yet it is unknown whether the statistical computations are performed on integrated representations or on separate, unimodal representations. In the present study, we investigated the ability of adults to integrate audio and visual input during statistical learning. We presented learners with a speech stream synchronized with a video of a speaker's face. In the critical condition, the visual (e.g., /gi/) and auditory (e.g., /mi/) signals were occasionally … facilitated participants' ability to segment the speech stream. Our results therefore demonstrate that participants can integrate audio and visual input to perceive the McGurk illusion during statistical learning. We interpret our findings as support for modality-interactive accounts of statistical learning.

  11. Electricity Statistics for France. Definitive results for the year 2015

    International Nuclear Information System (INIS)

    2016-01-01

    The mission of RTE, the French power transmission system operator, a public service assignment, is to balance electricity supply and demand in real time. This report presents detailed statistics on electricity flows in France, on electricity market mechanisms and on facilities: consumption, generation, trade, and RTE's network performance and evolution with respect to the previous year.

  12. Electricity Statistics for France. Definitive results for the year 2013

    International Nuclear Information System (INIS)

    2014-01-01

    The mission of RTE, the French power transmission system operator, a public service assignment, is to balance electricity supply and demand in real time. This report presents detailed statistics on electricity flows in France, on electricity market mechanisms and on facilities: consumption, generation, trade, and RTE's network performance and evolution with respect to the previous year.

  13. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  14. FDG-PET and CT patterns of bone metastases and their relationship to previously administered anti-cancer therapy

    International Nuclear Information System (INIS)

    Israel, Ora; Bar-Shalom, Rachel; Keidar, Zohar; Goldberg, Anat; Nachtigal, Alicia; Militianu, Daniela; Fogelman, Ignac

    2006-01-01

    To assess 18F-fluorodeoxyglucose (FDG) uptake in bone metastases in patients with and without previous treatment, and to compare positive positron emission tomography (PET) with osteolytic or osteoblastic changes on computed tomography (CT). One hundred and thirty-one FDG-PET/CT studies were reviewed for bone metastases. A total of 294 lesions were found in 76 patients: 81 in untreated patients and 213 in previously treated patients. PET was assessed for abnormal FDG uptake localised by PET/CT to the skeleton. CT was evaluated for bone metastases and for a blastic or lytic pattern. The relationship between the presence and pattern of bone metastases on PET and CT, and prior treatment, was statistically analysed using the chi-square test. PET identified 174 (59%) metastases, while CT detected 280 (95%). FDG-avid metastases included 74/81 (91%) untreated and 100/213 (47%) treated lesions (p<0.001). On CT there were 76/81 (94%) untreated and 204/213 (96%) treated metastases (p not significant). In untreated patients, 85% of lesions were seen on both PET and CT (26 blastic, 43 lytic). In treated patients, 53% of lesions were seen only on CT (95 blastic, 18 lytic). Of the osteoblastic metastases, 65/174 (37%) were PET positive and 98/120 (82%) were PET negative (p<0.001). The results of the present study indicate that when imaging bone metastases, prior treatment can alter the relationship between PET and CT findings. Most untreated bone metastases are PET positive and lytic on CT, while in previously treated patients most lesions are PET negative and blastic on CT. PET and CT therefore appear to be complementary in the assessment of bone metastases. (orig.)
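    The chi-square comparison reported above can be reproduced from the abstract's own counts (74/81 FDG-avid untreated lesions vs. 100/213 treated). The helper below is a generic Pearson chi-square for a 2x2 table, not the authors' code:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 dof, no continuity correction)
    for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Counts from the abstract: FDG-avid vs non-avid lesions,
# split by prior treatment status.
chi2 = chi2_2x2(74, 81 - 74, 100, 213 - 100)
print(f"chi2 = {chi2:.1f}")  # far above 10.83, the 1-dof cutoff for p < 0.001
```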

  15. Statistics and finance an introduction

    CERN Document Server

    Ruppert, David

    2004-01-01

    This textbook emphasizes the applications of statistics and probability to finance. Students are assumed to have had a prior course in statistics, but no background in finance or economics. The basics of probability and statistics are reviewed and more advanced topics in statistics, such as regression, ARMA and GARCH models, the bootstrap, and nonparametric regression using splines, are introduced as needed. The book covers the classical methods of finance such as portfolio theory, CAPM, and the Black-Scholes formula, and it introduces the somewhat newer area of behavioral finance. Applications and use of MATLAB and SAS software are stressed. The book will serve as a text in courses aimed at advanced undergraduates and masters students in statistics, engineering, and applied mathematics as well as quantitatively oriented MBA students. Those in the finance industry wishing to know more statistics could also use it for self-study. David Ruppert is the Andrew Schultz, Jr. Professor of Engineering, School of Oper...
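    As a flavor of the book's material, the Black-Scholes formula mentioned above prices a European call in a few lines (a standard textbook formula, not code from the book; the parameters below are illustrative):

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call on a non-dividend-paying
    stock: spot S, strike K, maturity T (years), risk-free rate r,
    volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# At-the-money call, 1 year, 5% rate, 20% volatility: about 10.45.
print(f"{bs_call(100, 100, 1.0, 0.05, 0.2):.2f}")
```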

  16. Statistical methods and errors in family medicine articles between 2010 and 2014-Suez Canal University, Egypt: A cross-sectional study.

    Science.gov (United States)

    Nour-Eldein, Hebatallah

    2016-01-01

    Given the limited statistical knowledge of most physicians, it is not uncommon to find statistical errors in research articles. The aims were to determine the statistical methods used and to assess the statistical errors in family medicine (FM) research articles published between 2010 and 2014. This was a cross-sectional study. All 66 FM research articles published over 5 years by FM authors with affiliation to Suez Canal University were screened by the researcher between May and August 2015. Types and frequencies of statistical methods were reviewed in all 66 FM articles. All 60 articles with identified inferential statistics were examined for statistical errors and deficiencies. A comprehensive 58-item checklist based on statistical guidelines was used to evaluate the statistical quality of the FM articles. Inferential methods were recorded in 62/66 (93.9%) of FM articles, and advanced analyses were used in 29/66 (43.9%). Contingency tables (38/66, 57.6%), regression (logistic, linear) (26/66, 39.4%) and t-tests (17/66, 25.8%) were the most commonly used inferential tests. Within the 60 FM articles with identified inferential statistics, deficiencies included no prior sample size calculation (19/60, 31.7%), application of the wrong statistical test (17/60, 28.3%), incomplete documentation of statistics (59/60, 98.3%), reporting a P value without the test statistic (32/60, 53.3%), no confidence interval reported with effect size measures (12/60, 20.0%) and use of the mean (standard deviation) to describe ordinal/non-normal data (8/60, 13.3%); errors of interpretation were mainly conclusions unsupported by the study data (5/60, 8.3%). Inferential statistics were used in the majority of FM articles. Data analysis and the reporting of statistics are areas for improvement in FM research articles.

  17. Statistical prediction of biomethane potentials based on the composition of lignocellulosic biomass

    DEFF Research Database (Denmark)

    Thomsen, Sune Tjalfe; Spliid, Henrik; Østergård, Hanne

    2014-01-01

    Mixture models are introduced as a new and stronger methodology for statistical prediction of biomethane potentials (BMP) from lignocellulosic biomass, compared to the linear regression models previously used. A large dataset from the literature combined with our own data were analysed using canonical...
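    A minimal sketch of the mixture-model idea: because biomass composition proportions sum to one, a Scheffé-type linear mixture model regresses BMP on the proportions with no intercept. The composition and BMP numbers below are hypothetical, and the paper's canonical mixture models are richer than this:

```python
import numpy as np

# Hypothetical compositions (cellulose, hemicellulose, lignin fractions;
# rows sum to 1) and measured biomethane potentials -- illustrative only.
X = np.array([[0.50, 0.30, 0.20],
              [0.40, 0.40, 0.20],
              [0.60, 0.25, 0.15],
              [0.35, 0.35, 0.30],
              [0.45, 0.30, 0.25]])
y = np.array([310.0, 305.0, 330.0, 260.0, 290.0])  # mL CH4 / g VS

# Scheffe linear mixture model: y = b1*x1 + b2*x2 + b3*x3, fitted by
# least squares without an intercept (the proportions already sum to 1).
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
print(dict(zip(["cellulose", "hemicellulose", "lignin"], coef.round(1))))
```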

  18. Adjusting the Adjusted X[superscript 2]/df Ratio Statistic for Dichotomous Item Response Theory Analyses: Does the Model Fit?

    Science.gov (United States)

    Tay, Louis; Drasgow, Fritz

    2012-01-01

    Two Monte Carlo simulation studies investigated the effectiveness of the mean adjusted X[superscript 2]/df statistic proposed by Drasgow and colleagues and, because of problems with the method, a new approach for assessing the goodness of fit of an item response theory model was developed. It has been previously recommended that mean adjusted…

  19. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes, precautionary stock fees and oil pollution fees

  20. Interactions among Knowledge, Beliefs, and Goals in Framing a Qualitative Study in Statistics Education

    Science.gov (United States)

    Groth, Randall E.

    2010-01-01

    In the recent past, qualitative research methods have become more prevalent in the field of statistics education. This paper offers thoughts on the process of framing a qualitative study by means of an illustrative example. The decisions that influenced the framing of a study of pre-service teachers' understanding of the concept of statistical…

  1. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
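    As a small illustration of divergence-based testing (a standard G-test sketch, not code from the book): the likelihood-ratio statistic equals 2n times the Kullback-Leibler divergence of the observed proportions from the expected ones, and is referred to a chi-square distribution. The die-roll counts are hypothetical:

```python
from math import log

def g_statistic(observed, expected):
    """Likelihood-ratio (G) statistic: 2 * sum O_i * ln(O_i / E_i).
    Equals 2n times the Kullback-Leibler divergence of observed from
    expected proportions; compared to chi-square with k - 1 dof."""
    return 2.0 * sum(o * log(o / e) for o, e in zip(observed, expected) if o > 0)

# Goodness of fit of hypothetical die-roll counts to a uniform model:
obs = [18, 22, 16, 25, 19, 20]
exp_counts = [sum(obs) / 6.0] * 6
G = g_statistic(obs, exp_counts)
print(f"G = {G:.2f}")  # compare with the chi-square(5) cutoff 11.07 at alpha = 0.05
```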

  2. Frog Statistics

    Science.gov (United States)

    Whole Frog Project and Virtual Frog Dissection Statistics wwwstats output for January 1 through duplicate or extraneous accesses. For example, in these statistics, while a POST requesting an image is as well. Note that this under-represents the bytes requested. Starting date for following statistics

  3. Myocardial infarction (heart attack) and its risk factors: a statistical study

    International Nuclear Information System (INIS)

    Salahuddin; Alamgir

    2005-01-01

    The statistical technique of odds ratio analysis was performed to examine the association of myocardial infarction (MI) with sex, smoking, hypertension, cholesterol, diabetes, family history, number of dependents, household income and residence. For this purpose a total of 506 patients were examined and their personal and medical data were collected. For each patient, the phenomenon of myocardial infarction was studied in relation to the different risk factors. The analysis suggests that smoking, hypertension, cholesterol level, diabetes and family history are important risk factors for the occurrence of MI. (author)
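    An odds ratio of the kind used above, with the standard Woolf (logit) confidence interval, can be computed from a 2x2 exposure table as follows (the counts are hypothetical, not the study's data):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI (Woolf logit method) for the
    2x2 table [[a, b], [c, d]]: rows = exposed/unexposed,
    columns = cases/controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts: smokers vs non-smokers among MI cases/controls.
or_, lo, hi = odds_ratio_ci(60, 40, 30, 70)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Since the interval excludes 1, the (hypothetical) exposure would be judged associated with the outcome.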

  4. Statistical study of the non-linear propagation of a partially coherent laser beam

    International Nuclear Information System (INIS)

    Ayanides, J.P.

    2001-01-01

    This research thesis is related to the LMJ project (Laser MegaJoule) and thus to the study and development of thermonuclear fusion. It reports the study of the propagation of a partially coherent laser beam using statistical modelling to obtain mean values for the field, thus bypassing a complex and costly calculation of deterministic quantities. Random fluctuations of the propagated field are assumed to obey Gaussian statistics; the laser central wavelength is assumed to be small with respect to the fluctuation magnitude; and a scale factor is introduced to clearly distinguish the scale of the fast random variations of the field fluctuations from the scale of the slow deterministic variations of the field envelopes. The author reports the study of propagation through a purely linear medium and through a non-dispersive medium; then through a slow, non-dispersive and non-linear medium (in which the reaction time is large with respect to the grain correlation duration, but small with respect to the variation scale of the macroscopic field envelope); and thirdly through an instantaneous dispersive and non-linear medium (which reacts instantaneously to the field) [fr]

  5. Proceedings of the first ERDA statistical symposium, Los Alamos, NM, November 3--5, 1975. [Sixteen papers

    Energy Technology Data Exchange (ETDEWEB)

    Nicholson, W.L.; Harris, J.L. (eds.)

    1976-03-01

    The First ERDA Statistical Symposium was organized to provide a means for communication among ERDA statisticians, and the sixteen papers presented at the meeting are given. Topics include techniques of numerical analysis used for accelerators, nuclear reactors, skewness and kurtosis statistics, radiochemical spectral analysis, quality control, and other statistics problems. Nine of the papers were previously announced in Nuclear Science Abstracts (NSA), while the remaining seven were abstracted for ERDA Energy Research Abstracts (ERA) and INIS Atomindex. (PMA)

  6. A study of the feasibility of statistical analysis of airport performance simulation

    Science.gov (United States)

    Myers, R. H.

    1982-01-01

    The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis of variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if they were designed to detect capacity changes from condition to condition. Many of the conclusions drawn are results of Monte-Carlo techniques.
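    A power computation of this kind can be sketched by Monte Carlo: draw non-Gaussian "capacity" samples under two conditions and count how often a test detects the shift. The skewed gamma distributions, effect size and test below are illustrative assumptions, not the study's actual simulation model:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_power(delta, n, sims=2000, z_crit=1.96):
    """Monte Carlo estimate of the power of a two-sample z-test on the
    means when the underlying 'capacity' distributions are skewed
    (gamma) rather than Gaussian. Illustrative parameters only."""
    hits = 0
    for _ in range(sims):
        a = rng.gamma(shape=4.0, scale=10.0, size=n)          # baseline condition
        b = rng.gamma(shape=4.0, scale=10.0, size=n) + delta  # shifted condition
        se = np.sqrt(a.var(ddof=1) / n + b.var(ddof=1) / n)
        hits += abs((b.mean() - a.mean()) / se) > z_crit
    return hits / sims

power = mc_power(delta=10.0, n=50)  # chance of detecting a +10 shift
size = mc_power(delta=0.0, n=50)    # false-positive rate, near 0.05
print(power, size)
```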

  7. Study of the effects of photoelectron statistics on Thomson scattering data

    International Nuclear Information System (INIS)

    Hart, G.W.; Levinton, F.M.; McNeill, D.H.

    1986-01-01

    A computer code has been developed which simulates a Thomson scattering measurement, from the counting statistics of the input channels through the mathematical analysis of the data. The scattered and background signals in each of the wavelength channels are assumed to obey Poisson statistics, and the spectral data are fitted to a Gaussian curve using a nonlinear least-squares fitting algorithm. This method goes beyond the usual calculation of the signal-to-noise ratio for the hardware and gives a quantitative measure of the effect of the noise on the final measurement. This method is applicable to Thomson scattering measurements in which the signal-to-noise ratio is low due to either low signal or high background. Thomson scattering data from the S-1 spheromak have been compared to this simulation, and they have been found to be in good agreement. This code has proven to be useful in assessing the effects of counting statistics relative to shot-to-shot variability in producing the observed spread in the data. It was also useful for designing improvements for the S-1 Thomson scattering system, and this method would be applicable to any measurement affected by counting statistics
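    The simulation described above can be sketched as follows (illustrative channel counts and spectral parameters, not the S-1 system's actual values): Poisson-distributed counts are drawn around a Gaussian spectrum and refit by nonlinear least squares, so the scatter of the fitted parameters reflects the counting statistics.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def gaussian(lam, amp, mu, sigma):
    return amp * np.exp(-(lam - mu) ** 2 / (2.0 * sigma ** 2))

# Simulated spectral channels: a Gaussian scattered-light spectrum plus
# a flat background, with Poisson counting noise in each channel.
lam = np.linspace(-4.0, 4.0, 16)          # channel positions (arb. units)
background = 20.0
mean_counts = gaussian(lam, 200.0, 0.0, 1.5) + background
counts = rng.poisson(mean_counts) - background  # background-subtracted

popt, pcov = curve_fit(gaussian, lam, counts, p0=[150.0, 0.5, 1.0])
perr = np.sqrt(np.diag(pcov))             # 1-sigma parameter uncertainties
print(f"amp={popt[0]:.0f}, mu={popt[1]:.2f}, sigma={abs(popt[2]):.2f}")
```

Repeating the draw-and-fit loop many times gives the parameter spread attributable to photoelectron statistics alone.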

  8. What Should Researchers Expect When They Replicate Studies? A Statistical View of Replicability in Psychological Science.

    Science.gov (United States)

    Patil, Prasad; Peng, Roger D; Leek, Jeffrey T

    2016-07-01

    A recent study of the replicability of key psychological findings is a major contribution toward understanding the human side of the scientific process. Despite the careful and nuanced analysis reported, the simple narrative disseminated by the mass, social, and scientific media was that in only 36% of the studies were the original results replicated. In the current study, however, we showed that 77% of the replication effect sizes reported were within a 95% prediction interval calculated using the original effect size. Our analysis suggests two critical issues in understanding replication of psychological studies. First, researchers' intuitive expectations for what a replication should show do not always match with statistical estimates of replication. Second, when the results of original studies are very imprecise, they create wide prediction intervals-and a broad range of replication effects that are consistent with the original estimates. This may lead to effects that replicate successfully, in that replication results are consistent with statistical expectations, but do not provide much information about the size (or existence) of the true effect. In this light, the results of the Reproducibility Project: Psychology can be viewed as statistically consistent with what one might expect when performing a large-scale replication experiment. © The Author(s) 2016.
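    Under normal theory, a 95% prediction interval of the kind used in this analysis follows from the fact that the difference between the original and replication estimates has variance equal to the sum of their squared standard errors. The effect sizes below are hypothetical:

```python
from math import sqrt

def replication_pi(est, se_orig, se_rep, z=1.96):
    """95% prediction interval for a replication effect estimate: under
    normal theory, the difference between original and replication
    estimates has variance se_orig**2 + se_rep**2."""
    half = z * sqrt(se_orig ** 2 + se_rep ** 2)
    return est - half, est + half

# Hypothetical: original effect 0.40 (SE 0.15), replication SE 0.12.
lo, hi = replication_pi(0.40, 0.15, 0.12)
print(f"replication is statistically consistent if its effect lies in ({lo:.2f}, {hi:.2f})")
```

Note how an imprecise original estimate (large SE) widens the interval, so even a near-zero replication effect can count as "consistent".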

  9. Classical model of intermediate statistics

    International Nuclear Information System (INIS)

    Kaniadakis, G.

    1994-01-01

    In this work we present a classical kinetic model of intermediate statistics. In the case of Brownian particles we show that the Fermi-Dirac (FD) and Bose-Einstein (BE) distributions can be obtained, just as the Maxwell-Boltzmann (MB) distribution, as steady states of a classical kinetic equation that intrinsically takes into account an exclusion-inclusion principle. In our model the intermediate statistics are obtained as steady states of a system of coupled nonlinear kinetic equations, where the coupling constants are the transmutational potentials η_κκ'. We show that, besides the FD-BE intermediate statistics extensively studied from the quantum point of view, we can also study the MB-FD and MB-BE ones. Moreover, our model allows us to treat the three-state mixing FD-MB-BE intermediate statistics. For boson and fermion mixing in a D-dimensional space, we obtain a family of FD-BE intermediate statistics by varying the transmutational potential η_BF. This family contains, as a particular case when η_BF=0, the quantum statistics recently proposed by L. Wu, Z. Wu, and J. Sun [Phys. Lett. A 170, 280 (1992)]. When we consider the two-dimensional FD-BE statistics, we derive an analytic expression for the fraction of fermions. When the temperature T→∞, the system is composed of an equal number of bosons and fermions, regardless of the value of η_BF. On the contrary, when T=0, η_BF becomes important and, according to its value, the system can be completely bosonic or fermionic, or composed of both bosons and fermions.
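    One common one-parameter interpolation between the three classical occupation numbers makes the FD/MB/BE hierarchy concrete (a simplified illustration; the paper's kinetic model with transmutational potentials is more general than this):

```python
from math import exp

def occupation(x, kappa):
    """Mean occupation number n(x), with x = (E - mu)/kT, for a
    one-parameter family interpolating between the classical cases:
    kappa = +1 -> Fermi-Dirac, kappa = 0 -> Maxwell-Boltzmann,
    kappa = -1 -> Bose-Einstein. (A common interpolation, used here
    only for illustration.)"""
    return 1.0 / (exp(x) + kappa)

x = 2.0
# Exclusion (FD) suppresses occupation, inclusion (BE) enhances it:
print(occupation(x, +1), occupation(x, 0), occupation(x, -1))
```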

  10. Is Cognitive Activity of Speech Based On Statistical Independence?

    DEFF Research Database (Denmark)

    Feng, Ling; Hansen, Lars Kai

    2008-01-01

    This paper explores the generality of COgnitive Component Analysis (COCA), which is defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. The hypothesis of COCA is ecological: the essentially independent features in a context-defined ensemble can be efficiently coded using a sparse independent component representation. Our devised protocol aims at comparing the performance of supervised learning (invoking cognitive activity) and unsupervised learning (statistical regularities) based on similar representations, the only difference lying in the human-inferred labels. Inspired by previous research on COCA, we introduce a new pair of models which directly employ the independence hypothesis. Statistical regularities are revealed at multiple time scales: phoneme, gender, age...

  11. Annual Bulletin of General Energy Statistics for Europe. V. 23, 1990

    International Nuclear Information System (INIS)

    1992-01-01

    The purpose of the Bulletin is to provide basic data on the energy situation as a whole in European countries, Canada and the United States of America. This publication is purely statistical in character. As from the 1980 edition of the Bulletin, the scope of the statistics comprises production of energy by form, overall energy balance sheets and deliveries of petroleum products for inland consumption. While fewer details are given for solid and gaseous fuels as sources of energy than in previous editions of the Bulletin, more information is available for liquid fuels and nuclear, hydro- and geothermal energy.

  12. A rank-based algorithm of differential expression analysis for small cell line data with statistical control.

    Science.gov (United States)

    Li, Xiangyu; Cai, Hao; Wang, Xianlong; Ao, Lu; Guo, You; He, Jun; Gu, Yunyan; Qi, Lishuang; Guan, Qingzhou; Lin, Xu; Guo, Zheng

    2017-10-13

    To detect differentially expressed genes (DEGs) in small-scale cell line experiments, usually with only two or three technical replicates for each state, commonly used statistical methods such as significance analysis of microarrays (SAM), limma and RankProd (RP) lack statistical power, while the fold-change method lacks any statistical control. In this study, we demonstrated that the within-sample relative expression orderings (REOs) of gene pairs were highly stable among technical replicates of a cell line but often widely disrupted after certain treatments such as gene knockdown, gene transfection and drug treatment. Based on this finding, we customized the RankComp algorithm, previously designed for individualized differential expression analysis through REO comparison, to identify DEGs with certain statistical control for small-scale cell line data. In both simulated and real data, the new algorithm, named CellComp, exhibited high precision with much higher sensitivity than the original RankComp, SAM, limma and RP methods. Therefore, CellComp provides an efficient tool for analyzing small-scale cell line data. © The Author 2017. Published by Oxford University Press.
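    The within-sample relative expression ordering (REO) idea can be illustrated by counting, for every gene pair, whether the ordering agrees between two profiles. The expression values below are toy numbers, and this concordance count is only a sketch of the concept, not the CellComp implementation:

```python
from itertools import combinations

def reo_concordance(profile_a, profile_b):
    """Fraction of gene pairs (i, j) whose within-sample ordering
    (expr[i] > expr[j]) is the same in both expression profiles."""
    pairs = list(combinations(range(len(profile_a)), 2))
    same = sum((profile_a[i] > profile_a[j]) == (profile_b[i] > profile_b[j])
               for i, j in pairs)
    return same / len(pairs)

# Two hypothetical technical replicates (stable orderings) vs a
# treated sample whose orderings are widely disrupted.
rep1 = [5.1, 3.2, 8.7, 1.0, 6.4]
rep2 = [5.0, 3.5, 8.9, 1.2, 6.1]
treated = [3.0, 6.2, 2.1, 7.5, 1.8]
print(reo_concordance(rep1, rep2), reo_concordance(rep1, treated))
```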

  13. Medical school attrition-beyond the statistics a ten year retrospective study.

    Science.gov (United States)

    Maher, Bridget M; Hynes, Helen; Sweeney, Catherine; Khashan, Ali S; O'Rourke, Margaret; Doran, Kieran; Harris, Anne; O'Flynn, Siun

    2013-01-31

    Medical school attrition is important: securing a place in medical school is difficult, and a high attrition rate can affect the academic reputation of a medical school and staff morale. More important, however, are the personal consequences of dropout for the student. The aims of our study were to examine factors associated with attrition over a ten-year period (2001-2011) and to study the personal effects of dropout on individual students. The study included quantitative analysis of completed cohorts and qualitative analysis of ten-year data. Data were collected from individual student files, examination and admission records, exit interviews and staff interviews. Statistical analysis was carried out on five successive completed cohorts. Qualitative data from student files were transcribed and independently analysed by three authors. Data were coded and categorized and key themes were identified. The overall attrition rate was 5.7% (45/779) in 6 completed cohorts when students who transferred to other medical courses were excluded. Students from Kuwait and the United Arab Emirates had the highest dropout rate (RR = 5.70, 95% Confidence Interval 2.65 to 12.27; p …). Psychological morbidity was present in 40% (higher than in other studies). Qualitative analysis revealed recurrent themes of isolation, failure, and despair. Student Welfare services were accessed by only one-third of dropout students. While dropout is often multifactorial, certain red-flag signals may alert us to risk of dropout, including non-EU origin, academic struggling, absenteeism, social isolation, depression and leave of absence. Psychological morbidity amongst dropout students is high, and Student Welfare services should be actively promoted. Absenteeism should prompt early intervention. Behind every dropout statistic lies a personal story. All medical schools have a duty of care to support students who leave the medical programme.

  14. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  15. Proceedings of the 1980 DOE statistical symposium

    International Nuclear Information System (INIS)

    Truett, T.; Margolies, D.; Mensing, R.W.

    1981-04-01

    Separate abstracts were prepared for 8 of the 16 papers presented at the DOE Statistical Symposium in California in October 1980. The topics of those papers not included cover the relative detection efficiency on sets of irradiated fuel elements, estimating failure rates for pumps in nuclear reactors, estimating fragility functions, application of bounded-influence regression, the influence function method applied to energy time series data, reliability problems in power generation systems and uncertainty analysis associated with radioactive waste disposal. The other 8 papers have previously been added to the data base

  16. No discrimination against previous mates in a sexually cannibalistic spider

    Science.gov (United States)

    Fromhage, Lutz; Schneider, Jutta M.

    2005-09-01

    In several animal species, females discriminate against previous mates in subsequent mating decisions, increasing the potential for multiple paternity. In spiders, female choice may take the form of selective sexual cannibalism, which has been shown to bias paternity in favor of particular males. If cannibalistic attacks function to restrict a male's paternity, females may have little interest to remate with males having survived such an attack. We therefore studied the possibility of female discrimination against previous mates in sexually cannibalistic Argiope bruennichi, where females almost always attack their mate at the onset of copulation. We compared mating latency and copulation duration of males having experienced a previous copulation either with the same or with a different female, but found no evidence for discrimination against previous mates. However, males copulated significantly shorter when inserting into a used, compared to a previously unused, genital pore of the female.

  17. Abiraterone in metastatic prostate cancer without previous chemotherapy

    NARCIS (Netherlands)

    Ryan, Charles J.; Smith, Matthew R.; de Bono, Johann S.; Molina, Arturo; Logothetis, Christopher J.; de Souza, Paul; Fizazi, Karim; Mainwaring, Paul; Piulats, Josep M.; Ng, Siobhan; Carles, Joan; Mulders, Peter F. A.; Basch, Ethan; Small, Eric J.; Saad, Fred; Schrijvers, Dirk; van Poppel, Hendrik; Mukherjee, Som D.; Suttmann, Henrik; Gerritsen, Winald R.; Flaig, Thomas W.; George, Daniel J.; Yu, Evan Y.; Efstathiou, Eleni; Pantuck, Allan; Winquist, Eric; Higano, Celestia S.; Taplin, Mary-Ellen; Park, Youn; Kheoh, Thian; Griffin, Thomas; Scher, Howard I.; Rathkopf, Dana E.; Boyce, A.; Costello, A.; Davis, I.; Ganju, V.; Horvath, L.; Lynch, R.; Marx, G.; Parnis, F.; Shapiro, J.; Singhal, N.; Slancar, M.; van Hazel, G.; Wong, S.; Yip, D.; Carpentier, P.; Luyten, D.; de Reijke, T.

    2013-01-01

    Abiraterone acetate, an androgen biosynthesis inhibitor, improves overall survival in patients with metastatic castration-resistant prostate cancer after chemotherapy. We evaluated this agent in patients who had not received previous chemotherapy. In this double-blind study, we randomly assigned

  18. Excel 2016 for educational and psychological statistics a guide to solving practical problems

    CERN Document Server

    Quirk, Thomas J

    2016-01-01

    This book shows the capabilities of Microsoft Excel in teaching educational and psychological statistics effectively. Similar to the previously published Excel 2013 for Educational and Psychological Statistics, this book is a step-by-step, exercise-driven guide for students and practitioners who need to master Excel to solve practical education and psychology problems. If understanding statistics isn't your strongest suit, if you are not especially mathematically inclined, or if you are wary of computers, this is the right book for you. Excel, a widely available computer program for students and managers, is also an effective teaching and learning tool for quantitative analyses in education and psychology courses. Its powerful computational ability and graphical functions make learning statistics much easier than in years past. However, Excel 2016 for Educational and Psychological Statistics: A Guide to Solving Practical Problems is the first book to capitalize on these improvements by teaching students and man...

  19. Secondary recurrent miscarriage is associated with previous male birth.

    LENUS (Irish Health Repository)

    Ooi, Poh Veh

    2012-01-31

Secondary recurrent miscarriage (RM) is defined as three or more consecutive pregnancy losses after delivery of a viable infant. Previous reports suggest that a firstborn male child is associated with less favourable subsequent reproductive potential, possibly due to maternal immunisation against male-specific minor histocompatibility antigens. In a retrospective cohort study of 85 cases of secondary RM we aimed to determine if secondary RM was associated with (i) gender of previous child, maternal age, or duration of miscarriage history, and (ii) increased risk of pregnancy complications. Fifty-three women (62.0%; 53/85) gave birth to a male child prior to RM compared to 32 (38.0%; 32/85) who gave birth to a female child (p=0.002). The majority (91.7%; 78/85) had uncomplicated, term deliveries and normal birth weight neonates, with one quarter of the women previously delivered by Caesarean section. All had routine RM investigations and 19.0% (16/85) had an abnormal result. Fifty-seven women conceived again and 33.3% (19/57) miscarried, but there was no significant difference in failure rates between those with a previous male or female child (13/32 vs. 6/25, p=0.2). When patients with abnormal results were excluded, or when women with only one previous child were considered, there was still no difference in these rates. A previous male birth may be associated with an increased risk of secondary RM but numbers preclude concluding whether this increases recurrence risk. The suggested association with previous male birth provides a basis for further investigations at a molecular level.

  1. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
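The SPC reasoning that the paper contrasts with hypothesis testing is typically operationalized with a Shewhart control chart: a center line and 3-sigma limits estimated from in-control baseline data, with later points flagged when they fall outside the limits. A minimal sketch in Python; the baseline and sample values below are invented for illustration, not taken from the paper:

```python
import statistics

def shewhart_limits(baseline, k=3.0):
    """Center line and k-sigma control limits from in-control baseline data."""
    center = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return center - k * sd, center, center + k * sd

def out_of_control(samples, lcl, ucl):
    """Indices of points falling outside the control limits."""
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9, 10.0, 10.0]
lcl, cl, ucl = shewhart_limits(baseline)

new = [10.0, 9.9, 10.1, 11.5]          # last point drifts upward
flagged = out_of_control(new, lcl, ucl)  # flags index 3 only
```

A point outside the limits signals special-cause variation, the workplace analogue of rejecting a null hypothesis of a stable process.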

  2. Statistical distributions applications and parameter estimates

    CERN Document Server

    Thomopoulos, Nick T

    2017-01-01

    This book gives a description of the group of statistical distributions that have ample application to studies in statistics and probability.  Understanding statistical distributions is fundamental for researchers in almost all disciplines.  The informed researcher will select the statistical distribution that best fits the data in the study at hand.  Some of the distributions are well known to the general researcher and are in use in a wide variety of ways.  Other useful distributions are less understood and are not in common use.  The book describes when and how to apply each of the distributions in research studies, with a goal to identify the distribution that best applies to the study.  The distributions are for continuous, discrete, and bivariate random variables.  In most studies, the parameter values are not known a priori, and sample data is needed to estimate parameter values.  In other scenarios, no sample data is available, and the researcher seeks some insight that allows the estimate of ...

  3. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
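The averaging step of such a meta-analysis, a fixed-effect (inverse-variance) mean, the heterogeneity statistic Q, and a DerSimonian-Laird between-study variance, can be sketched in a few lines. The effect sizes and variances below are made-up illustrations; the paper's own workflow uses SPSS macros and R:

```python
def meta_average(d, v):
    """Fixed-effect mean of effect sizes d with variances v,
    heterogeneity Q, and DerSimonian-Laird tau^2."""
    w = [1.0 / vi for vi in v]                      # inverse-variance weights
    d_fixed = sum(wi * di for wi, di in zip(w, d)) / sum(w)
    q = sum(wi * (di - d_fixed) ** 2 for wi, di in zip(w, d))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(d) - 1)) / c)         # truncated at zero
    return d_fixed, q, tau2

# three hypothetical single-case d estimates with equal variances
d_fixed, q, tau2 = meta_average([0.5, 0.7, 0.6], [0.04, 0.04, 0.04])
```

When tau^2 is zero, as here, the random-effects mean coincides with the fixed-effect mean.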

  4. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...
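As a concrete instance of the inferential techniques the article surveys, the Pearson chi-square statistic for a 2x2 contingency table can be computed by hand; the table values here are invented for illustration:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]],
    using the shortcut n*(ad - bc)^2 / (row and column totals)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# e.g. 30/10 vs. 20/40 successes/failures in two groups
stat = chi_square_2x2(30, 10, 20, 40)
# compare against the chi-square(df=1) critical value 3.84 at alpha = 0.05
```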

  5. Solution of the statistical bootstrap with Bose statistics

    International Nuclear Information System (INIS)

    Engels, J.; Fabricius, K.; Schilling, K.

    1977-01-01

    A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density

  6. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  7. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  8. Statistical and theoretical research

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where and how much of a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology.

  9. A longitudinal study of plasma insulin and glucagon in women with previous gestational diabetes

    DEFF Research Database (Denmark)

    Damm, P; Kühl, C; Hornnes, P

    1995-01-01

    OBJECTIVE: To investigate whether plasma insulin or glucagon predicts later development of diabetes in women with gestational diabetes mellitus (GDM). RESEARCH DESIGN AND METHODS: The subjects studied were 91 women with diet-treated GDM and 33 healthy women. Plasma insulin and glucagon during a 50...... at follow-up (2 had insulin-dependent diabetes mellitus, 13 had non-insulin-dependent diabetes mellitus, and 12 had impaired glucose tolerance). Compared with the control subjects, women with previous GDM had relatively impaired insulin secretion (decreased insulinogenic index and delayed peak insulin...... for subsequent development of overt diabetes (logistic regression analysis). CONCLUSIONS: Women who develop GDM have a relative insulin secretion deficiency, the severity of which is predictive for later development of diabetes. Furthermore, our data indicate that their relatively reduced beta-cell function may...

  10. Effect of ultrasound frequency on the Nakagami statistics of human liver tissues.

    Directory of Open Access Journals (Sweden)

    Po-Hsiang Tsui

Full Text Available The analysis of the backscattered statistics using the Nakagami parameter is an emerging ultrasound technique for assessing hepatic steatosis and fibrosis. Previous studies indicated that the echo amplitude distribution of a normal liver follows the Rayleigh distribution (the Nakagami parameter m is close to 1). However, using different frequencies may change the backscattered statistics of normal livers. This study explored the frequency dependence of the backscattered statistics in human livers and then discussed the sources of ultrasound scattering in the liver. A total of 30 healthy participants were enrolled to undergo a standard care ultrasound examination on the liver, which is a natural model containing diffuse and coherent scatterers. The liver of each volunteer was scanned from the right intercostal view to obtain image raw data at different central frequencies ranging from 2 to 3.5 MHz. Phantoms with diffuse scatterers only were also made to perform ultrasound scanning using the same protocol for comparisons with clinical data. The Nakagami parameter-frequency correlation was evaluated using Pearson correlation analysis. The median and interquartile range of the Nakagami parameter obtained from livers was 1.00 (0.98-1.05) for 2 MHz, 0.93 (0.89-0.98) for 2.3 MHz, 0.87 (0.84-0.92) for 2.5 MHz, 0.82 (0.77-0.88) for 3.3 MHz, and 0.81 (0.76-0.88) for 3.5 MHz. The Nakagami parameter decreased with the increasing central frequency (r = -0.67, p < 0.0001). However, the effect of ultrasound frequency on the statistical distribution of the backscattered envelopes was not found in the phantom results (r = -0.147, p = 0.0727). The current results demonstrated that the backscattered statistics of normal livers is frequency-dependent. Moreover, the coherent scatterers may be the primary factor to dominate the frequency dependence of the backscattered statistics in a liver.
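The Nakagami m parameter used in this record is commonly estimated from the backscattered envelope by the moment (inverse normalized variance) method, m = (E[R^2])^2 / Var(R^2). A sketch using simulated Rayleigh-distributed envelope data, which is an assumption for illustration, not the study's clinical data; for a Rayleigh envelope the estimate should come out near 1, matching the normal-liver result at 2 MHz:

```python
import math
import random

def nakagami_m(envelope):
    """Moment-based Nakagami m estimate: (E[R^2])^2 / Var(R^2)."""
    r2 = [x * x for x in envelope]
    mu = sum(r2) / len(r2)
    var = sum((x - mu) ** 2 for x in r2) / len(r2)
    return mu * mu / var

# Simulate a Rayleigh envelope: magnitude of a zero-mean complex Gaussian.
rng = random.Random(0)
env = [math.hypot(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(20000)]
m = nakagami_m(env)   # close to 1 for a Rayleigh envelope
```

Values of m below 1 indicate pre-Rayleigh (more heavily tailed) statistics, which is the direction the study reports at higher frequencies.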

  11. Statistical approaches to assessing single and multiple outcome measures in dry eye therapy and diagnosis.

    Science.gov (United States)

    Tomlinson, Alan; Hair, Mario; McFadyen, Angus

    2013-10-01

    Dry eye is a multifactorial disease which would require a broad spectrum of test measures in the monitoring of its treatment and diagnosis. However, studies have typically reported improvements in individual measures with treatment. Alternative approaches involve multiple, combined outcomes being assessed by different statistical analyses. In order to assess the effect of various statistical approaches to the use of single and combined test measures in dry eye, this review reanalyzed measures from two previous studies (osmolarity, evaporation, tear turnover rate, and lipid film quality). These analyses assessed the measures as single variables within groups, pre- and post-intervention with a lubricant supplement, by creating combinations of these variables and by validating these combinations with the combined sample of data from all groups of dry eye subjects. The effectiveness of single measures and combinations in diagnosis of dry eye was also considered. Copyright © 2013. Published by Elsevier Inc.

  12. Is There a Common Summary Statistical Process for Representing the Mean and Variance? A Study Using Illustrations of Familiar Items.

    Science.gov (United States)

    Yang, Yi; Tokita, Midori; Ishiguchi, Akira

    2018-01-01

    A number of studies revealed that our visual system can extract different types of summary statistics, such as the mean and variance, from sets of items. Although the extraction of such summary statistics has been studied well in isolation, the relationship between these statistics remains unclear. In this study, we explored this issue using an individual differences approach. Observers viewed illustrations of strawberries and lollypops varying in size or orientation and performed four tasks in a within-subject design, namely mean and variance discrimination tasks with size and orientation domains. We found that the performances in the mean and variance discrimination tasks were not correlated with each other and demonstrated that extractions of the mean and variance are mediated by different representation mechanisms. In addition, we tested the relationship between performances in size and orientation domains for each summary statistic (i.e. mean and variance) and examined whether each summary statistic has distinct processes across perceptual domains. The results illustrated that statistical summary representations of size and orientation may share a common mechanism for representing the mean and possibly for representing variance. Introspections for each observer performing the tasks were also examined and discussed.

  13. Statistical Seismology and Induced Seismicity

    Science.gov (United States)

    Tiampo, K. F.; González, P. J.; Kazemian, J.

    2014-12-01

While seismicity triggered or induced by natural resource production such as mining or water impoundment in large dams has long been recognized, the recent increase in the unconventional production of oil and gas has been linked to a rapid rise in seismicity in many places, including central North America (Ellsworth et al., 2012; Ellsworth, 2013). Worldwide, induced events of M~5 have occurred and, although rare, have resulted in both damage and public concern (Horton, 2012; Keranen et al., 2013). In addition, over the past twenty years, the increase in both number and coverage of seismic stations has resulted in an unprecedented ability to precisely record the magnitude and location of large numbers of small-magnitude events. The increase in the number and type of seismic sequences available for detailed study has revealed differences in their statistics that were previously difficult to quantify. For example, seismic swarms that produce significant numbers of foreshocks as well as aftershocks have been observed in different tectonic settings, including California, Iceland, and the East Pacific Rise (McGuire et al., 2005; Shearer, 2012; Kazemian et al., 2014). Similarly, smaller events have been observed prior to larger induced events in several cases associated with energy production. The field of statistical seismology has long focused on the question of triggering and the mechanisms responsible (Stein et al., 1992; Hill et al., 1993; Steacy et al., 2005; Parsons, 2005; Main et al., 2006). For example, in most cases the associated stress perturbations are much smaller than the earthquake stress drop, suggesting an inherent sensitivity to relatively small stress changes (Nalbant et al., 2005). Induced seismicity provides the opportunity to investigate triggering and, in particular, the differences between long- and short-range triggering. 
Here we investigate the statistics of induced seismicity sequences from around the world, including central North America and Spain, and

  14. Supposed cancer risk from mammography. Reply to previous statements

    Energy Technology Data Exchange (ETDEWEB)

    Oeser, H; Koeppe, P; Rach, K [Freie Univ. Berlin (Germany, F.R.). Klinik fuer Radiologie, Nuklearmedizin und Physikalische Therapie

    1976-12-01

The view that exposure to diagnostic radiation presents a cancer risk to the female breast should be considered together with the fact that the major factor is ageing of the patient. This risk factor is hidden in experimental and statistical studies of cancer induction by exogenous agents; for instance, in studies of radiation effects, it is inherent in the time elapsed. The assumption that mammography presents a cancer risk is unjustifiable and is rejected.

  15. Evaluation of undergraduate nursing students' attitudes towards statistics courses, before and after a course in applied statistics.

    Science.gov (United States)

    Hagen, Brad; Awosoga, Olu; Kellett, Peter; Dei, Samuel Ofori

    2013-09-01

    Undergraduate nursing students must often take a course in statistics, yet there is scant research to inform teaching pedagogy. The objectives of this study were to assess nursing students' overall attitudes towards statistics courses - including (among other things) overall fear and anxiety, preferred learning and teaching styles, and the perceived utility and benefit of taking a statistics course - before and after taking a mandatory course in applied statistics. The authors used a pre-experimental research design (a one-group pre-test/post-test research design), by administering a survey to nursing students at the beginning and end of the course. The study was conducted at a University in Western Canada that offers an undergraduate Bachelor of Nursing degree. Participants included 104 nursing students, in the third year of a four-year nursing program, taking a course in statistics. Although students only reported moderate anxiety towards statistics, student anxiety about statistics had dropped by approximately 40% by the end of the course. Students also reported a considerable and positive change in their attitudes towards learning in groups by the end of the course, a potential reflection of the team-based learning that was used. Students identified preferred learning and teaching approaches, including the use of real-life examples, visual teaching aids, clear explanations, timely feedback, and a well-paced course. Students also identified preferred instructor characteristics, such as patience, approachability, in-depth knowledge of statistics, and a sense of humor. Unfortunately, students only indicated moderate agreement with the idea that statistics would be useful and relevant to their careers, even by the end of the course. 
Our findings validate anecdotal reports on statistics teaching pedagogy, although more research is clearly needed, particularly on how to increase students' perceptions of the benefit and utility of statistics courses for their nursing

  16. MQSA National Statistics

    Science.gov (United States)


  17. Catch statistics for belugas in West Greenland 1862 to 1999

    Directory of Open Access Journals (Sweden)

    MP Heide-Jørgensen

    2002-07-01

Full Text Available Information and statistics, including trade statistics, on catches of white whales or belugas (Delphinapterus leucas) in West Greenland since 1862 are presented. The period before 1952 was dominated by large catches south of 66°N that peaked with 1,380 reported kills in 1922. Catch levels in the past five decades are evaluated on the basis of official catch statistics, trade in mattak (whale skin), sampling of jaws and reports from local residents and other observers. Options are given for corrections of catch statistics based upon auxiliary statistics on trade of mattak, catches in previous decades for areas without reporting, and on likely levels of loss rates in different hunting operations. The fractions of the reported catches that are caused by ice entrapments of whales are estimated. During 1954-1999 total reported catches ranged from 216 to 1,874 and they peaked around 1970. Correcting for underreporting and killed-but-lost whales increases the catch reports by 42% on average for 1954-1998. If the whales killed in ice entrapments are removed, then the corrected catch estimate is on average 28% larger than the reported catches.
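The two corrections the record applies, inflating reported catches for underreporting and then for killed-but-lost whales, amount to simple arithmetic. A sketch with illustrative rates (the specific fractions below are assumptions, not the paper's values, though the combined effect is comparable in size to its average 42% correction):

```python
def corrected_catch(reported, underreporting=0.0, loss_rate=0.0):
    """Scale a reported catch for unreported kills, then for
    struck-but-lost animals (kills that were never landed)."""
    landed = reported * (1.0 + underreporting)  # add unreported takes
    return landed / (1.0 - loss_rate)           # account for killed-but-lost whales

# hypothetical: 1,000 reported, 20% underreporting, 15% loss rate
total = corrected_catch(1000, underreporting=0.2, loss_rate=0.15)  # ~1412, about 41% above reported
```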

  18. Index of subfactors and statistics of quantum fields. Pt. 2

    International Nuclear Information System (INIS)

    Longo, R.

    1990-01-01

The endomorphism semigroup End(M) of an infinite factor M is endowed with a natural conjugation (modulo inner automorphisms) ρ̄ = ρ⁻¹·γ, where γ is the canonical endomorphism of ρ(M) into M. In Quantum Field Theory conjugate endomorphisms are shown to correspond to conjugate superselection sectors in the description of Doplicher, Haag and Roberts. On the other hand one easily sees that conjugate endomorphisms correspond to conjugate correspondences in the setting of A. Connes. In particular we identify the canonical tower associated with the inclusion ρ(A(O)) ⊂ A(O) relative to a sector ρ. As a corollary, making use of our previously established index-statistics correspondence, we completely describe, in low-dimensional theories, the statistics of a self-conjugate superselection sector ρ with 3 or fewer channels, in particular with statistical dimension d(ρ) < 2, by obtaining the braid group representations of V. Jones and of Birman, Wenzl and Murakami. The statistics is thus described in these cases by the polynomial invariants for knots and links of Jones and Kauffman. Self-conjugate sectors are subdivided into real and pseudoreal ones and the effect of this distinction on the statistics is analyzed. The FYHLMO polynomial describes arbitrary 2-channel sectors. (orig.)

  19. Changing viewer perspectives reveals constraints to implicit visual statistical learning.

    Science.gov (United States)

    Jiang, Yuhong V; Swallow, Khena M

    2014-10-07

Statistical learning, the learning of environmental regularities to guide behavior, likely plays an important role in natural human behavior. One potential use is in search for valuable items. Because visual statistical learning can be acquired quickly and without intention or awareness, it could optimize search and thereby conserve energy. For this to be true, however, visual statistical learning needs to be viewpoint invariant, facilitating search even when people walk around. To test whether implicit visual statistical learning of spatial information is viewpoint independent, we asked participants to perform a visual search task from variable locations around a monitor placed flat on a stand. Unbeknownst to participants, the target was more often in some locations than others. In contrast to previous research on stationary observers, visual statistical learning failed to produce a search advantage for targets in high-probability regions that were stable within the environment but variable relative to the viewer. This failure was observed even when conditions for spatial updating were optimized. However, learning was successful when the rich locations were referenced relative to the viewer. We conclude that changing viewer perspective disrupts implicit learning of the target's location probability. This form of learning shows limited integration with spatial updating or spatiotopic representations. © 2014 ARVO.

  20. Applied statistics in the pharmaceutical industry with case studies using S-PLUS

    CERN Document Server

    Krause, Andreas

    2001-01-01

The purpose of this book is to provide a general guide to statistical methods used in the pharmaceutical industry, and to illustrate how to use S-PLUS to implement these methods. Specifically, the goals are to: illustrate statistical applications in the pharmaceutical industry; illustrate how these applications can be carried out using S-PLUS; illustrate why S-PLUS is a useful software package for carrying out these applications; and discuss the results and implications of particular applications. The target audience for this book is very broad, including: graduate students in biostatistics; statisticians who are involved in the industry as research scientists, regulators, academics, and/or consultants who want to know more about how to use S-PLUS and learn about other sub-fields within the industry that they may not be familiar with; and statisticians in other fields who want to know more about statistical applications in the pharmaceutical industry.