Statistical significance of cis-regulatory modules
Directory of Open Access Journals (Sweden)
Smith Andrew D
2007-01-01
Full Text Available Abstract Background It is becoming increasingly important for researchers to be able to scan through large genomic regions for transcription factor binding sites or clusters of binding sites forming cis-regulatory modules. Correspondingly, there has been a push to develop algorithms for the rapid detection and assessment of cis-regulatory modules. While various algorithms for this purpose have been introduced, most are not well suited for rapid, genome scale scanning. Results We introduce methods designed for the detection and statistical evaluation of cis-regulatory modules, modeled as either clusters of individual binding sites or as combinations of sites with constrained organization. In order to determine the statistical significance of module sites, we first need a method to determine the statistical significance of single transcription factor binding site matches. We introduce a straightforward method of estimating the statistical significance of single site matches using a database of known promoters to produce data structures that can be used to estimate p-values for binding site matches. We next introduce a technique to calculate the statistical significance of the arrangement of binding sites within a module using a max-gap model. If the module being scanned for has defined organizational parameters, the probability of the module is corrected to account for organizational constraints. The statistical significance of single site matches and the architecture of sites within the module can be combined to provide an overall estimation of statistical significance of cis-regulatory module sites. Conclusion The methods introduced in this paper allow for the detection and statistical evaluation of single transcription factor binding sites and cis-regulatory modules. The features described are implemented in the Search Tool for Occurrences of Regulatory Motifs (STORM) and MODSTORM software.
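The empirical approach described in the abstract, estimating a p-value for a single binding site match from background sequence, can be sketched as follows. This is an illustrative reconstruction, not the STORM implementation: the position weight matrix, the motif, and the use of a randomly generated background in place of a real promoter database are all assumptions made for the example.

```python
import random

# Hypothetical position weight matrix (PWM) for a 4-bp motif: each row
# gives illustrative log-odds scores for one motif position over A,C,G,T.
PWM = [
    {"A": 1.2, "C": -0.8, "G": -0.8, "T": -0.5},
    {"A": -0.7, "C": 1.1, "G": -0.9, "T": -0.4},
    {"A": -0.6, "C": -0.7, "G": 1.3, "T": -0.8},
    {"A": -0.5, "C": -0.9, "G": -0.6, "T": 1.0},
]

def score(site):
    """Log-odds score of a candidate binding site under the PWM."""
    return sum(PWM[i][base] for i, base in enumerate(site))

def empirical_pvalue(site, background, width=4):
    """P(random background window scores >= observed site score).

    `background` stands in for sequence drawn from a promoter database;
    the p-value is the fraction of its windows scoring at least as high.
    """
    observed = score(site)
    windows = [background[i:i + width] for i in range(len(background) - width + 1)]
    hits = sum(1 for w in windows if score(w) >= observed)
    return (hits + 1) / (len(windows) + 1)  # add-one to avoid p = 0

random.seed(0)
background = "".join(random.choice("ACGT") for _ in range(5000))
p_consensus = empirical_pvalue("ACGT", background)  # consensus site
p_weak = empirical_pvalue("TGCA", background)       # poorly matching site
```

A strong match to the consensus receives a much smaller empirical p-value than a weak match, which is the behavior a genome-scale scanner needs before combining site p-values into a module-level score.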
REANALYSIS OF F-STATISTIC GRAVITATIONAL-WAVE SEARCHES WITH THE HIGHER CRITICISM STATISTIC
International Nuclear Information System (INIS)
Bennett, M. F.; Melatos, A.; Delaigle, A.; Hall, P.
2013-01-01
We propose a new method of gravitational-wave detection using a modified form of higher criticism, a statistical technique introduced by Donoho and Jin. Higher criticism is designed to detect a group of sparse, weak sources, none of which are strong enough to be reliably estimated or detected individually. We apply higher criticism as a second-pass method to synthetic F-statistic and C-statistic data for a monochromatic periodic source in a binary system and quantify the improvement relative to the first-pass methods. We find that higher criticism on C-statistic data is more sensitive by ∼6% than the C-statistic alone under optimal conditions (i.e., binary orbit known exactly) and the relative advantage increases as the error in the orbital parameters increases. Higher criticism is robust even when the source is not monochromatic (e.g., phase-wandering in an accreting system). Applying higher criticism to a phase-wandering source over multiple time intervals gives a ≳30% increase in detectability with few assumptions about the frequency evolution. By contrast, in all-sky searches for unknown periodic sources, which are dominated by the brightest source, second-pass higher criticism does not provide any benefits over a first-pass search.
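The Donoho–Jin higher criticism statistic referred to above can be sketched directly from a collection of p-values. This is a minimal illustration of the standard definition, not the modified form used in the gravitational-wave search; the example p-value sets are invented.

```python
import math

def higher_criticism(pvalues):
    """Donoho-Jin higher criticism statistic.

    Sorts the p-values and returns the maximum standardized deviation of
    the empirical CDF above the uniform expectation: a large value signals
    a group of weak effects even when no single p-value is individually
    small enough to be detected on its own.
    """
    n = len(pvalues)
    hc = 0.0
    for i, p in enumerate(sorted(pvalues), start=1):
        if 0.0 < p < 1.0:
            z = math.sqrt(n) * (i / n - p) / math.sqrt(p * (1.0 - p))
            hc = max(hc, z)
    return hc

# Pure-noise p-values (a uniform grid) vs. the same grid with a sparse
# set of weak effects mixed in: HC is clearly larger in the second case.
n = 200
null_ps = [(i + 0.5) / n for i in range(n)]
mixed_ps = [p * 0.2 if i < 20 else p for i, p in enumerate(null_ps)]
hc_null = higher_criticism(null_ps)
hc_mixed = higher_criticism(mixed_ps)
```

The contrast between the two values illustrates why higher criticism works as a second-pass detector: the sparse weak effects shift the low end of the p-value distribution without producing any individually significant detection.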
The thresholds for statistical and clinical significance
DEFF Research Database (Denmark)
Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per
2014-01-01
BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore … of the probability that a given trial result is compatible with a 'null' effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance …
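The ratio described in the abstract, the probability of the trial result under a 'null' effect divided by its probability under the effect hypothesised in the sample size calculation, can be sketched with a normal approximation. This is an illustration of that general idea only; the effect sizes and standard error are invented, and the authors' actual procedure may differ in detail.

```python
import math

def normal_pdf(x, mean, sd):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def bayes_factor(observed_effect, se, hypothesised_effect):
    """Likelihood of the observed trial result under a 'null' effect
    divided by its likelihood under the intervention effect hypothesised
    in the sample size calculation (normal approximation).

    Values well below 1 favour the hypothesised effect over the null.
    """
    return (normal_pdf(observed_effect, 0.0, se)
            / normal_pdf(observed_effect, hypothesised_effect, se))

# Illustrative trial: observed mean difference 8.0 (SE 3.0) when the
# sample size calculation anticipated a difference of 10.0.
bf = bayes_factor(8.0, 3.0, 10.0)
```

Here the result is far more compatible with the anticipated effect than with a null effect, which is the kind of evidence summary the P-value alone does not provide.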
The Study of Second Higher Education through Mathematical Statistics
Directory of Open Access Journals (Sweden)
Olga V. Kremer
2014-05-01
Full Text Available The article examines the statistical reasons, age, and wages of people who pursue a second higher education. People opt for a second higher education mostly due to economic and psychological factors. According to our research, age is a key motivator for the second higher education. Based on statistical data, a portrait of the typical second-higher-education student is drawn.
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
Statistics Report on TEQSA Registered Higher Education Providers
Australian Government Tertiary Education Quality and Standards Agency, 2015
2015-01-01
This statistics report provides a comprehensive snapshot of national statistics on all parts of the sector for the year 2013, by bringing together data collected directly by TEQSA with data sourced from the main higher education statistics collections managed by the Australian Government Department of Education and Training. The report provides…
Significance levels for studies with correlated test statistics.
Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S
2008-07-01
When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
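The baseline procedure that the abstract builds on, permuting sample units to simulate the null distribution of the test statistic of largest magnitude, can be sketched as follows. This shows only the plain permutation estimate that the authors argue can mislead under correlation, not their proposed conditional correction; the data are invented.

```python
import random

def max_stat(data, labels):
    """Test statistic of largest magnitude across features: the maximum
    absolute difference in group means (labels are 0/1)."""
    best = 0.0
    for feature in data:
        g1 = [x for x, l in zip(feature, labels) if l == 1]
        g0 = [x for x, l in zip(feature, labels) if l == 0]
        best = max(best, abs(sum(g1) / len(g1) - sum(g0) / len(g0)))
    return best

def permutation_pvalue(data, labels, n_perm=500, seed=1):
    """Overall significance level of the global null hypothesis: permute
    the sample labels to simulate the null distribution of the max
    statistic, then count how often it reaches the observed value."""
    rng = random.Random(seed)
    observed = max_stat(data, labels)
    perm = labels[:]
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(perm)
        if max_stat(data, perm) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)

# Two strongly correlated features with a genuine group difference.
labels = [1] * 5 + [0] * 5
f1 = [5.1, 4.9, 5.3, 5.2, 5.0, 3.0, 3.2, 2.9, 3.1, 3.0]
f2 = [x + 0.1 for x in f1]  # near-duplicate of f1
p_global = permutation_pvalue([f1, f2], labels)
```

The point of the article is that when many features are correlated, as f1 and f2 are here, the shape of the observed histogram varies between studies, so this unconditional permutation p-value should be replaced by one conditioned on a measure of the histogram's spread.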
Caveats for using statistical significance tests in research assessments
DEFF Research Database (Denmark)
Schneider, Jesper Wiborg
2013-01-01
This article raises concerns about the advantages of using statistical significance tests in research assessments, as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly controversial, and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice … We argue that applying statistical significance tests and mechanically adhering to their results is highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators …
Farrell, Mary Beth
2018-06-01
This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic, the higher the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability (P) value. Calculation of the P value is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around the mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution, taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being …
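The Cohen κ-statistic described in the abstract has a short closed form: observed agreement minus chance agreement, divided by one minus chance agreement. A minimal sketch, with an invented two-reader example of 50 scans chosen so the arithmetic can be checked by hand:

```python
def cohen_kappa(reader1, reader2):
    """Cohen's kappa: agreement between two readers corrected for the
    agreement expected by chance alone."""
    assert len(reader1) == len(reader2)
    n = len(reader1)
    categories = set(reader1) | set(reader2)
    # Observed agreement: fraction of cases where the readers match.
    observed = sum(1 for a, b in zip(reader1, reader2) if a == b) / n
    # Chance agreement: product of each reader's marginal rates, summed
    # over categories.
    expected = sum(
        (reader1.count(c) / n) * (reader2.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# 50 scans read as positive (1) or negative (0) by two readers:
# 20 agreed positive, 15 agreed negative, 15 disagreements.
r1 = [1] * 20 + [1] * 5 + [0] * 10 + [0] * 15
r2 = [1] * 20 + [0] * 5 + [1] * 10 + [0] * 15
kappa = cohen_kappa(r1, r2)
```

Here observed agreement is 0.7 and chance agreement is 0.5, giving κ = 0.4, which would usually be read as moderate agreement; a simple percent agreement of 70% would overstate how well the readers actually concur.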
Statistically significant relational data mining :
Energy Technology Data Exchange (ETDEWEB)
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second are statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Directory of Open Access Journals (Sweden)
Priya Ranganathan
2015-01-01
Full Text Available In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals', and clarify the importance of including both values in a paper.
Health significance and statistical uncertainty. The value of P-value.
Consonni, Dario; Bertazzi, Pier Alberto
2017-10-27
The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P<0.05" ("statistically significant") and "P>0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. We aim to show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors. We provide examples of distorted use of the P-value and of the negative consequences for science and public health of such a black-and-white vision. The rigid interpretation of the P-value as a dichotomy favors the confusion between health relevance and statistical significance, discourages thoughtful thinking, and diverts attention from what really matters: the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios or risk differences) and their confidence intervals (CI), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not usually consider the whole interval of the CI but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistical significance or not). In reporting statistical results of scientific research, present effect estimates with their confidence intervals, and do not qualify the P-value as "significant" or "not significant".
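The reporting style the abstract recommends, an effect estimate with its confidence interval rather than a significance verdict, can be sketched for a risk ratio using the standard log-normal approximation. The cohort counts are invented for illustration.

```python
import math

def risk_ratio_ci(events_exposed, n_exposed, events_unexposed, n_unexposed, z=1.96):
    """Risk ratio with an approximate 95% confidence interval
    (log-normal method): report the effect estimate and its uncertainty
    instead of a significant/not-significant verdict."""
    r1 = events_exposed / n_exposed
    r0 = events_unexposed / n_unexposed
    rr = r1 / r0
    # Standard error of log(RR) for cumulative-incidence data.
    se_log = math.sqrt(
        1 / events_exposed - 1 / n_exposed + 1 / events_unexposed - 1 / n_unexposed
    )
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Illustrative cohort: 30/100 events among exposed, 15/100 among unexposed.
rr, lo, hi = risk_ratio_ci(30, 100, 15, 100)
```

Reporting "RR = 2.0, 95% CI 1.15 to 3.48" conveys both the health significance (a doubling of risk) and the statistical uncertainty, exactly the information a bare "P < 0.05" discards.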
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2015-01-01
In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the ‘P’ value, explain the importance of ‘confidence intervals’ and clarify the importance of including both values in a paper. PMID:25878958
Statistics Report on TEQSA Registered Higher Education Providers, 2016
Australian Government Tertiary Education Quality and Standards Agency, 2016
2016-01-01
This Statistics Report is the third release of selected higher education sector data held by the Australian Government Tertiary Education Quality and Standards Agency (TEQSA) for its quality assurance activities. It provides a snapshot of national statistics on all parts of the sector by bringing together data collected directly by TEQSA with data…
Statistical insights from Romanian data on higher education
Directory of Open Access Journals (Sweden)
Andreea Ardelean
2015-09-01
Full Text Available This paper aims to use cluster analysis to make a comparative analysis at regional level concerning the Romanian higher education. The evolution of higher education in post-communist period will also be presented, using quantitative traits. Although the focus is on university education, this will also include references to the total education by comparison. Then, to highlight the importance of higher education, the chi-square test will be applied to check whether there is an association between statistical regions and education level of the unemployed.
Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.
Breunig, Nancy A.
Despite the increasing criticism of statistical significance testing by researchers, particularly in the publication of the 1994 American Psychological Association's style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…
South Carolina Higher Education Statistical Abstract, 2014. 36th Edition
Armour, Mim, Ed.
2014-01-01
The South Carolina Higher Education Statistical Abstract is a comprehensive, single-source compilation of tables and graphs which report data frequently requested by the Governor, Legislators, college and university staff, other state government officials, and the general public. The 2014 edition of the Statistical Abstract marks the 36th year of…
South Carolina Higher Education Statistical Abstract, 2015. 37th Edition
Armour, Mim, Ed.
2015-01-01
The South Carolina Higher Education Statistical Abstract is a comprehensive, single-source compilation of tables and graphs which report data frequently requested by the Governor, Legislators, college and university staff, other state government officials, and the general public. The 2015 edition of the Statistical Abstract marks the 37th year of…
Swiss solar power statistics 2007 - Significant expansion
International Nuclear Information System (INIS)
Hostettler, T.
2008-01-01
This article presents and discusses the 2007 statistics for solar power in Switzerland. A significant number of new installations is noted as is the high production figures from newer installations. The basics behind the compilation of the Swiss solar power statistics are briefly reviewed and an overview for the period 1989 to 2007 is presented which includes figures on the number of photovoltaic plant in service and installed peak power. Typical production figures in kilowatt-hours (kWh) per installed kilowatt-peak power (kWp) are presented and discussed for installations of various sizes. Increased production after inverter replacement in older installations is noted. Finally, the general political situation in Switzerland as far as solar power is concerned are briefly discussed as are international developments.
Test for the statistical significance of differences between ROC curves
International Nuclear Information System (INIS)
Metz, C.E.; Kronman, H.B.
1979-01-01
A test for the statistical significance of observed differences between two measured Receiver Operating Characteristic (ROC) curves has been designed and evaluated. The set of observer response data for each ROC curve is assumed to be independent and to arise from a ROC curve having a form which, in the absence of statistical fluctuations in the response data, graphs as a straight line on double normal-deviate axes. To test the significance of an apparent difference between two measured ROC curves, maximum likelihood estimates of the two parameters of each curve and the associated parameter variances and covariance are calculated from the corresponding set of observer response data. An approximate Chi-square statistic with two degrees of freedom is then constructed from the differences between the parameters estimated for each ROC curve and from the variances and covariances of these estimates. This statistic is known to be truly Chi-square distributed only in the limit of large numbers of trials in the observer performance experiments. Performance of the statistic for data arising from a limited number of experimental trials was evaluated. Independent sets of rating scale data arising from the same underlying ROC curve were paired, and the fraction of differences found (falsely) significant was compared to the significance level, α, used with the test. Although test performance was found to be somewhat dependent on both the number of trials in the data and the position of the underlying ROC curve in the ROC space, the results for various significance levels showed the test to be reliable under practical experimental conditions
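The statistic described in the abstract, built from the differences between the two fitted binormal ROC parameters and their variances and covariances, can be sketched directly, since a 2×2 covariance matrix can be inverted in closed form and the chi-square survival function with two degrees of freedom is simply exp(-x/2). The fitted parameter values and covariances below are invented placeholders for what a maximum-likelihood fit would supply.

```python
import math

def roc_difference_chi2(params1, cov1, params2, cov2):
    """Approximate chi-square statistic (2 degrees of freedom) for the
    difference between two fitted binormal ROC curves: d^T S^-1 d, where
    d is the parameter difference and S the summed covariance matrices
    (the data sets are assumed independent)."""
    da = params1[0] - params2[0]
    db = params1[1] - params2[1]
    s11 = cov1[0][0] + cov2[0][0]
    s12 = cov1[0][1] + cov2[0][1]
    s22 = cov1[1][1] + cov2[1][1]
    det = s11 * s22 - s12 * s12
    # Quadratic form via the explicit 2x2 inverse.
    chi2 = (da * da * s22 - 2 * da * db * s12 + db * db * s11) / det
    p_value = math.exp(-chi2 / 2)  # chi-square survival function, 2 df
    return chi2, p_value

# Hypothetical fits (intercept a, slope b on double normal-deviate axes)
# with their estimated covariance matrices.
chi2, p = roc_difference_chi2(
    (1.50, 0.90), [[0.04, 0.01], [0.01, 0.02]],
    (1.10, 0.85), [[0.05, 0.01], [0.01, 0.03]],
)
```

As the abstract cautions, this statistic is only asymptotically chi-square distributed, so the p-value should be trusted only when each observer study contains a reasonably large number of trials.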
On detection and assessment of statistical significance of Genomic Islands
Directory of Open Access Journals (Sweden)
Chaudhuri Probal
2008-04-01
Full Text Available Abstract Background Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquisition. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values, along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
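The unsupervised Monte Carlo strategy described above, comparing a candidate segment against randomly selected segments of the same length from the chromosome, can be sketched for the simplest marker the abstract mentions, GC content. This is an illustration of the general idea only, not the Design-Island algorithm; the toy chromosome and the planted GC-rich island are invented.

```python
import random

def gc_content(seq):
    """Fraction of G and C bases in a sequence."""
    return sum(1 for b in seq if b in "GC") / len(seq)

def monte_carlo_pvalue(chromosome, start, end, n_samples=1000, seed=7):
    """Monte Carlo test: what fraction of randomly selected same-length
    segments of the chromosome have GC content at least as extreme as
    the candidate segment?"""
    rng = random.Random(seed)
    length = end - start
    observed = gc_content(chromosome[start:end])
    exceed = 0
    for _ in range(n_samples):
        s = rng.randrange(len(chromosome) - length)
        if gc_content(chromosome[s:s + length]) >= observed:
            exceed += 1
    return (exceed + 1) / (n_samples + 1)

# Toy chromosome: AT-rich background (~33% GC) with a GC-rich insert
# (~67% GC) planted at positions 4000-4200.
rng = random.Random(3)
chromosome = "".join(rng.choices("ATATGC", k=10000))
chromosome = chromosome[:4000] + "".join(rng.choices("GCGCAT", k=200)) + chromosome[4200:]
p_island = monte_carlo_pvalue(chromosome, 4000, 4200)
```

Because the null segments are drawn from the genome itself, the resulting p-value has a precise interpretation without any training data, which is the property the authors emphasize.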
Increasing the statistical significance of entanglement detection in experiments.
Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei
2010-05-28
Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.
Testing the Difference of Correlated Agreement Coefficients for Statistical Significance
Gwet, Kilem L.
2016-01-01
This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…
Higher-order scene statistics of breast images
Abbey, Craig K.; Sohl-Dickstein, Jascha N.; Olshausen, Bruno A.; Eckstein, Miguel P.; Boone, John M.
2009-02-01
Researchers studying human and computer vision have found description and construction of these systems greatly aided by analysis of the statistical properties of naturally occurring scenes. More specifically, it has been found that receptive fields with directional selectivity and bandwidth properties similar to mammalian visual systems are more closely matched to the statistics of natural scenes. It is argued that this allows for sparse representation of the independent components of natural images [Olshausen and Field, Nature, 1996]. These theories have important implications for medical image perception. For example, will a system that is designed to represent the independent components of natural scenes, where objects occlude one another and illumination is typically reflected, be appropriate for X-ray imaging, where features superimpose on one another and illumination is transmissive? In this research we begin to examine these issues by evaluating higher-order statistical properties of breast images from X-ray projection mammography (PM) and dedicated breast computed tomography (bCT). We evaluate kurtosis in responses of octave bandwidth Gabor filters applied to PM and to coronal slices of bCT scans. We find that kurtosis in PM rises and quickly saturates for filter center frequencies with an average value above 0.95. By contrast, kurtosis in bCT peaks near 0.20 cyc/mm with kurtosis of approximately 2. Our findings suggest that the human visual system may be tuned to represent breast tissue more effectively in bCT over a specific range of spatial frequencies.
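The summary statistic used throughout the abstract, the kurtosis of bandpass filter responses, is easy to compute; heavy-tailed (sparse) responses have high kurtosis, Gaussian responses have excess kurtosis near zero. The sketch below uses synthetic Laplacian-like and uniform samples as stand-ins for filter responses, since the mammography data are obviously not reproducible here.

```python
import random

def excess_kurtosis(xs):
    """Excess kurtosis (0 for a Gaussian): the fourth standardized
    moment minus 3, used as a measure of the sparseness of filter
    responses."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (var * var) - 3.0

rng = random.Random(5)
# Sparse, heavy-tailed responses (Laplacian-like, excess kurtosis 3),
# as reported for bandpass filters applied to natural scenes, vs.
# light-tailed uniform noise (excess kurtosis -1.2).
sparse = [rng.expovariate(1.0) * rng.choice((-1, 1)) for _ in range(20000)]
flat = [rng.uniform(-1, 1) for _ in range(20000)]
k_sparse = excess_kurtosis(sparse)
k_flat = excess_kurtosis(flat)
```

In the study, the same computation applied to Gabor-filter responses distinguishes projection mammography from breast CT as a function of spatial frequency.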
Statistical Significance for Hierarchical Clustering
Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.
2017-01-01
Summary Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990
Statistical significance of trends in monthly heavy precipitation over the US
Mahajan, Salil
2011-05-11
Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upward trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation, as some GCM simulations show a downward trend even in the twenty-first century projections when the greenhouse gas forcings are strong. © 2011 Springer-Verlag.
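The non-parametric side of this analysis, Kendall's τ for a trend plus a Monte Carlo null distribution built by resampling the series, can be sketched as follows. The precipitation numbers are invented for illustration and the shuffling scheme is the simplest possible null; the paper's bootstrapping of heavy-precipitation extremes is more involved.

```python
import random

def kendall_tau(xs):
    """Kendall's tau of a series against its time index: concordant
    minus discordant pairs, normalized by the number of pairs."""
    n = len(xs)
    s = 0
    for i in range(n):
        for j in range(i + 1, n):
            if xs[j] > xs[i]:
                s += 1
            elif xs[j] < xs[i]:
                s -= 1
    return 2 * s / (n * (n - 1))

def trend_pvalue(xs, n_boot=2000, seed=11):
    """Monte Carlo non-parametric test: shuffle the series to build the
    no-trend null distribution of tau (one-sided, upward trend)."""
    rng = random.Random(seed)
    observed = kendall_tau(xs)
    perm = list(xs)
    exceed = 0
    for _ in range(n_boot):
        rng.shuffle(perm)
        if kendall_tau(perm) >= observed:
            exceed += 1
    return (exceed + 1) / (n_boot + 1)

# Illustrative annual series of monthly heavy-precipitation intensity.
series = [3.1, 2.8, 3.4, 3.3, 3.9, 3.6, 4.2, 4.0, 4.5, 4.4, 4.8, 5.1]
p_trend = trend_pvalue(series)
```

Agreement between this Monte Carlo p-value and the classical Kendall's τ test is the kind of consistency check the authors use to argue their approach is robust.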
2009-01-01
In high-dimensional studies such as genome-wide association studies, the correction for multiple testing in order to control total type I error results in decreased power to detect modest effects. We present a new analytical approach based on the higher criticism statistic that allows identification of the presence of modest effects. We apply our method to the genome-wide study of rheumatoid arthritis provided in the Genetic Analysis Workshop 16 Problem 1 data set. There is evidence for unknown bias in this study that could be explained by the presence of undetected modest effects. We compared the asymptotic and empirical thresholds for the higher criticism statistic. Using the asymptotic threshold we detected the presence of modest effects genome-wide. We also detected modest effects using the 90th percentile of the empirical null distribution as a threshold; however, there is no such evidence when the 95th and 99th percentiles were used. While the higher criticism method suggests that there is some evidence for modest effects, interpreting individual single-nucleotide polymorphisms with significant higher criticism statistics is of undetermined value. The goal of higher criticism is to alert the researcher that genetic effects remain to be discovered and to promote the use of more targeted and powerful studies to detect the remaining effects. PMID:20018032
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.
Kramer, Karen L; Veile, Amanda; Otárola-Castillo, Erik
2016-01-01
Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.
Directory of Open Access Journals (Sweden)
Karen L Kramer
Full Text Available Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.
Increasing the statistical significance of entanglement detection in experiments
Energy Technology Data Exchange (ETDEWEB)
Jungnitsch, Bastian; Niekamp, Soenke; Kleinmann, Matthias; Guehne, Otfried [Institut fuer Quantenoptik und Quanteninformation, Innsbruck (Austria); Lu, He; Gao, Wei-Bo; Chen, Zeng-Bing [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Chen, Yu-Ao; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Physikalisches Institut, Universitaet Heidelberg (Germany)
2010-07-01
Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. We show this to be the case for an error model in which the variance of an observable is interpreted as its error and for the standard error model in photonic experiments. Specifically, we demonstrate that the Mermin inequality yields a Bell test which is statistically more significant than the Ardehali inequality in the case of a photonic four-qubit state that is close to a GHZ state. Experimentally, we observe this phenomenon in a four-photon experiment, testing the above inequalities for different levels of noise.
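The trade-off described above, that a larger violation is not automatically a better test, can be illustrated with a toy calculation; all numbers here are hypothetical, not the experiment's data.

```python
def significance(measured, classical_bound, std_error):
    """Number of standard errors by which an inequality's classical
    bound is violated -- the quantity that matters statistically."""
    return (measured - classical_bound) / std_error

# A larger raw violation with a larger error...
s_big = significance(measured=3.2, classical_bound=2.0, std_error=0.4)
# ...can be less significant than a smaller but more precise violation.
s_small = significance(measured=2.8, classical_bound=2.0, std_error=0.2)
print(s_big, s_small)  # 3.0 vs 4.0 standard errors
```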
Higher order capacity statistics of multi-hop transmission systems over Rayleigh fading channels
Yilmaz, Ferkan
2012-03-01
In this paper, we present an exact analytical expression to evaluate the higher order statistics of the channel capacity for amplify and forward (AF) multihop transmission systems operating over Rayleigh fading channels. Furthermore, we present a simple and efficient closed-form expression for the higher order moments of the channel capacity of a dual hop transmission system with Rayleigh fading channels. In order to analyze the behavior of the higher order capacity statistics and investigate the usefulness of the mathematical analysis, some selected numerical and simulation results are presented. Our results are found to be in perfect agreement. © 2012 IEEE.
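Closed-form moment expressions of this kind can be sanity-checked by Monte Carlo simulation. The sketch below assumes a single Rayleigh-fading hop (so the channel power gain is exponentially distributed) and an illustrative average SNR; it is not the paper's AF multihop model.

```python
import numpy as np

rng = np.random.default_rng(0)
avg_snr = 10.0        # assumed average SNR (linear scale), illustrative
n = 200_000

# Rayleigh fading: the channel power gain |h|^2 is exponentially distributed.
gain = rng.exponential(1.0, n)
capacity = np.log2(1.0 + avg_snr * gain)   # bits/s/Hz, single hop

# Higher-order moments E[C^k], k = 1..4, estimated by Monte Carlo.
moments = [np.mean(capacity ** k) for k in (1, 2, 3, 4)]
```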
Reporting effect sizes as a supplement to statistical significance ...
African Journals Online (AJOL)
The purpose of the article is to review the statistical significance reporting practices in reading instruction studies and to provide guidelines for when to calculate and report effect sizes in educational research. A review of six readily accessible (online) and accredited journals publishing research on reading instruction ...
Your Chi-Square Test Is Statistically Significant: Now What?
Sharpe, Donald
2015-01-01
Applied researchers have employed chi-square tests for more than one hundred years. This paper addresses the question of how one should follow up a statistically significant chi-square test result in order to determine the source of that result. Four approaches were evaluated: calculating residuals, comparing cells, ransacking, and partitioning. Data…
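The first of the four follow-up approaches, calculating residuals, can be sketched as follows; the 2x3 table is hypothetical. Cells with large Pearson residuals (roughly |r| > 2) point to the source of the significant omnibus result.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x3 contingency table that yields a significant chi-square.
table = np.array([[30, 20, 10],
                  [10, 20, 30]])

chi2, p, dof, expected = stats.chi2_contingency(table)

# Pearson residuals; adjusted standardized residuals refine this idea.
residuals = (table - expected) / np.sqrt(expected)
```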
Directory of Open Access Journals (Sweden)
Melissa Coulson
2010-07-01
Full Text Available A statistically significant result and a non-significant result may differ little, although significance status may tempt an interpretation of difference. Two studies are reported that compared interpretation of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioural neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant, and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs but, for full benefit, such highly desirable statistical reform requires also that researchers interpret CIs without recourse to NHST.
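The contrast the study describes can be made concrete with two fictitious results of the kind the authors mailed out: similar effects, one significant and one not, whose confidence intervals nevertheless overlap heavily (normal approximation, hypothetical numbers).

```python
def ci95(mean, se):
    """Normal-approximation 95% confidence interval."""
    half = 1.96 * se
    return (mean - half, mean + half)

# Study A is statistically significant (its CI excludes 0); study B is not.
ci_a = ci95(mean=0.30, se=0.10)   # (0.104, 0.496)
ci_b = ci95(mean=0.25, se=0.15)   # (-0.044, 0.544)

# The heavily overlapping intervals show the two results are consistent.
overlap = ci_a[0] < ci_b[1] and ci_b[0] < ci_a[1]
```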
Testing statistical significance scores of sequence comparison methods with structure similarity
Directory of Open Access Journals (Sweden)
Leunissen Jack AM
2006-10-01
Full Text Available Abstract Background In the past years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search is not only determined by the algorithm but also by the statistical significance testing for an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score as compared to the e-value, using simulated data. We were interested in whether this could be validated when applied to existing, evolutionarily related protein sequences. Results All experiments are performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics is evaluated, using ROC, CVE and AP measures. The BLAST and FASTA algorithms are used as reference. We find that two out of three Smith-Waterman implementations with e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with Z-score. SSEARCH especially has very high scores. Conclusion The compute-intensive Z-score does not have a clear advantage over the e-value. The Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.
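A Monte-Carlo Z-score of the kind used for CluSTr can be sketched as follows. The toy identity score below merely stands in for a real Smith-Waterman alignment score; the shuffling logic is the point of the sketch.

```python
import random

random.seed(42)

def score(a, b):
    """Toy comparison score: identities at aligned positions
    (a stand-in for a real Smith-Waterman alignment score)."""
    return sum(x == y for x, y in zip(a, b))

def z_score(a, b, n_shuffles=1000):
    """Z-score of the real score against scores of shuffled sequences."""
    real = score(a, b)
    shuffled_scores = []
    for _ in range(n_shuffles):
        perm = list(b)
        random.shuffle(perm)
        shuffled_scores.append(score(a, "".join(perm)))
    mean = sum(shuffled_scores) / n_shuffles
    var = sum((s - mean) ** 2 for s in shuffled_scores) / n_shuffles
    return (real - mean) / (var ** 0.5 or 1.0)

z = z_score("ACDEFGHIKLMNPQRS", "ACDEFGHIKLMNPQRS")  # identical sequences
```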
Connection between weighted LPC and higher-order statistics for AR model estimation
Kamp, Y.; Ma, C.
1993-01-01
This paper establishes the relationship between a weighted linear prediction method used for robust analysis of voiced speech and the autoregressive modelling based on higher-order statistics, known as cumulants.
Statistical significance versus clinical relevance.
van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G
2017-04-01
In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the times upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
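The correct reading of the p-value given above can be demonstrated by simulation: when the null hypothesis is true, about 5% of repeated studies still yield p < 0.05. A minimal sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Two groups drawn from the SAME distribution: the null hypothesis is true.
n_studies, n_per_group = 2000, 30
false_positives = 0
for _ in range(n_studies):
    a = rng.normal(0.0, 1.0, n_per_group)
    b = rng.normal(0.0, 1.0, n_per_group)
    if stats.ttest_ind(a, b).pvalue < 0.05:
        false_positives += 1

# The p-value speaks about the data given the null, not the reverse:
# roughly 5% of these null studies come out "significant" by construction.
rate = false_positives / n_studies
```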
Statistical significance of epidemiological data. Seminar: Evaluation of epidemiological studies
International Nuclear Information System (INIS)
Weber, K.H.
1993-01-01
In stochastic damages, the numbers of events, e.g. the persons who are affected by or have died of cancer, and thus the relative frequencies (incidence or mortality) are binomially distributed random variables. Their statistical fluctuations can be characterized by confidence intervals. For epidemiologic questions, especially for the analysis of stochastic damages in the low dose range, the following issues are interesting: - Is a sample (a group of persons) with a definite observed damage frequency part of the whole population? - Is an observed frequency difference between two groups of persons random or statistically significant? - Is an observed increase or decrease of the frequencies with increasing dose random or statistically significant, and how large is the regression coefficient (= risk coefficient) in this case? These problems can be solved by statistical tests. So-called distribution-free tests, i.e. tests that are not bound to the assumption of a normal distribution, are of particular interest, such as: - χ²-independence test (test in contingency tables); - Fisher-Yates test; - trend test according to Cochran; - rank correlation test given by Spearman. These tests are explained in terms of selected epidemiologic data, e.g. of leukaemia clusters, of the cancer mortality of the Japanese A-bomb survivors especially in the low dose range, as well as on the sample of the cancer mortality in the high background area in Yangjiang (China). (orig.) [de]
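The distribution-free tests listed above are all available in standard libraries; a minimal sketch with a hypothetical dose-response dataset (not the seminar's data):

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: cancer cases vs. non-cases in two dose groups.
table = np.array([[12, 188],    # exposed:   cases, non-cases
                  [ 4, 196]])   # unexposed: cases, non-cases

# Chi-square independence test (test in contingency tables).
chi2, p_chi2, dof, _ = stats.chi2_contingency(table)

# Fisher-Yates (Fisher's exact) test, preferable for small counts.
odds_ratio, p_fisher = stats.fisher_exact(table)

# Spearman rank correlation of observed frequency against dose.
dose = [0, 1, 2, 3, 4]
freq = [0.01, 0.015, 0.02, 0.03, 0.05]
rho, p_rho = stats.spearmanr(dose, freq)
```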
Statistical Significance and Effect Size: Two Sides of a Coin.
Fan, Xitao
This paper suggests that statistical significance testing and effect size are two sides of the same coin; they complement each other, but do not substitute for one another. Good research practice requires that both should be taken into consideration to make sound quantitative decisions. A Monte Carlo simulation experiment was conducted, and a…
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-10-01
To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the significant results ranging from 47% to 86%. Multivariable modeling identified that geographical area and involvement of a statistician were predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.
Significant Statistics: Viewed with a Contextual Lens
Tait-McCutcheon, Sandi
2010-01-01
This paper examines the pedagogical and organisational changes three lead teachers made to their statistics teaching and learning programs. The lead teachers posed the research question: What would the effect of contextually integrating statistical investigations and literacies into other curriculum areas be on student achievement? By finding the…
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
DEFF Research Database (Denmark)
Engsted, Tom
I comment on the controversy between McCloskey & Ziliak and Hoover & Siegler on statistical versus economic significance, in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey & Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot...... be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays a relatively minor role in model...
Wilkinson, Michael
2014-03-01
Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson (N-P) significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by conducting an a priori sample size calculation. The N-P approach does not provide the probability of hypotheses or indicate the strength of support for hypotheses in light of data, yet many scientists believe it does. Outcomes of analyses allow conclusions only about the existence of non-zero effects, and provide no information about the likely size of true effects or their practical/clinical value. Bayesian inference can show how much support data provide for different hypotheses, and how personal convictions should be altered in light of data, but the approach is complicated by formulating probability distributions about prior subjective estimates of population effects. A pragmatic solution is magnitude-based inference, which allows scientists to estimate the true magnitude of population effects and how likely they are to exceed an effect magnitude of practical/clinical importance, thereby integrating elements of subjective Bayesian-style thinking. While this approach is gaining acceptance, progress might be hastened if scientists appreciate the shortcomings of traditional N-P null hypothesis significance testing.
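The magnitude-based idea, estimating how likely the true effect is to exceed a practically important threshold, can be sketched under a normal approximation (this is a simplification, not the full published method; numbers are hypothetical):

```python
from math import erf, sqrt

def prob_exceeds(effect, se, threshold):
    """P(true effect > threshold) for a normally distributed estimate."""
    z = (effect - threshold) / se
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Estimated effect 0.4 (SE 0.2); smallest practically important effect 0.2.
p_beneficial = prob_exceeds(effect=0.4, se=0.2, threshold=0.2)
```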
Directory of Open Access Journals (Sweden)
Zhang Zhang
2012-03-01
Full Text Available Abstract Background Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results Here we propose a novel measure--Codon Deviation Coefficient (CDC)--that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions.
Higher-Order Statistical Correlations and Mutual Information Among Particles in a Quantum Well
Yépez, V. S.; Sagar, R. P.; Laguna, H. G.
2017-12-01
The influence of wave function symmetry on statistical correlation is studied for the case of three non-interacting spin-free quantum particles in a unidimensional box, in position and in momentum space. Higher-order statistical correlations occurring among the three particles in this quantum system are quantified via higher-order mutual information and compared to the correlation between pairs of variables in this model, and to the correlation in the two-particle system. The results for the higher-order mutual information show that there are states where the symmetric wave functions are more correlated than the antisymmetric ones with the same quantum numbers. This holds in position as well as in momentum space. This behavior is opposite to that observed for the correlation between pairs of variables in this model, and the two-particle system, where the antisymmetric wave functions are in general more correlated. These results are also consistent with those observed in a system of three uncoupled oscillators. The use of higher-order mutual information as a correlation measure is monitored and examined by considering a superposition of states or systems with two Slater determinants.
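The pairwise building block of such an analysis, mutual information between two discrete variables, can be computed directly from a joint distribution; higher-order (interaction) information is then built from pairwise and conditional terms of the same form. A minimal sketch with two illustrative 2x2 joints:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) = sum p(x,y) * log[ p(x,y) / (p(x) p(y)) ] in nats."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (px @ py)[mask])))

independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])   # I = 0
correlated = np.array([[0.5, 0.0],
                       [0.0, 0.5]])      # I = log 2

mi_indep = mutual_information(independent)
mi_corr = mutual_information(correlated)
```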
Higher-Order Statistical Correlations and Mutual Information Among Particles in a Quantum Well
International Nuclear Information System (INIS)
Yépez, V. S.; Sagar, R. P.; Laguna, H. G.
2017-01-01
The influence of wave function symmetry on statistical correlation is studied for the case of three non-interacting spin-free quantum particles in a unidimensional box, in position and in momentum space. Higher-order statistical correlations occurring among the three particles in this quantum system are quantified via higher-order mutual information and compared to the correlation between pairs of variables in this model, and to the correlation in the two-particle system. The results for the higher-order mutual information show that there are states where the symmetric wave functions are more correlated than the antisymmetric ones with the same quantum numbers. This holds in position as well as in momentum space. This behavior is opposite to that observed for the correlation between pairs of variables in this model, and the two-particle system, where the antisymmetric wave functions are in general more correlated. These results are also consistent with those observed in a system of three uncoupled oscillators. The use of higher-order mutual information as a correlation measure is monitored and examined by considering a superposition of states or systems with two Slater determinants. (author)
Systematic reviews of anesthesiologic interventions reported as statistically significant
DEFF Research Database (Denmark)
Imberger, Georgina; Gluud, Christian; Boylan, John
2015-01-01
statistically significant meta-analyses of anesthesiologic interventions, we used TSA to estimate power and imprecision in the context of sparse data and repeated updates. METHODS: We conducted a search to identify all systematic reviews with meta-analyses that investigated an intervention that may......: From 11,870 titles, we found 682 systematic reviews that investigated anesthesiologic interventions. In the 50 sampled meta-analyses, the median number of trials included was 8 (interquartile range [IQR], 5-14), the median number of participants was 964 (IQR, 523-1736), and the median number...
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
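The proposed procedure, a distance statistic plus a bootstrap under the null of a common parent distribution, can be sketched with the Euclidean distance (synthetic data in place of the cloud-object histograms):

```python
import numpy as np

rng = np.random.default_rng(7)

def euclidean(h1, h2):
    """Euclidean distance between two normalized histograms."""
    return np.sqrt(np.sum((h1 - h2) ** 2))

def bootstrap_pvalue(x, y, bins, n_boot=2000):
    """Resample from the pooled data to build the null distribution
    of the histogram distance, then count exceedances."""
    hx = np.histogram(x, bins)[0] / len(x)
    hy = np.histogram(y, bins)[0] / len(y)
    observed = euclidean(hx, hy)

    pooled = np.concatenate([x, y])
    exceed = 0
    for _ in range(n_boot):
        bx = rng.choice(pooled, len(x), replace=True)
        by = rng.choice(pooled, len(y), replace=True)
        d = euclidean(np.histogram(bx, bins)[0] / len(x),
                      np.histogram(by, bins)[0] / len(y))
        if d >= observed:
            exceed += 1
    return exceed / n_boot

bins = np.linspace(-4, 4, 17)
p_same = bootstrap_pvalue(rng.normal(0, 1, 500), rng.normal(0, 1, 500), bins)
p_diff = bootstrap_pvalue(rng.normal(0, 1, 500), rng.normal(1, 1, 500), bins)
```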
Higher order capacity statistics of multi-hop transmission systems over Rayleigh fading channels
Yilmaz, Ferkan; Tabassum, Hina; Alouini, Mohamed-Slim
2012-01-01
In this paper, we present an exact analytical expression to evaluate the higher order statistics of the channel capacity for amplify and forward (AF) multihop transmission systems operating over Rayleigh fading channels. Furthermore, we present
P-Value, a true test of statistical significance? a cautionary note ...
African Journals Online (AJOL)
While it's not the intention of the founders of significance testing and hypothesis testing to have the two ideas intertwined as if they are complementary, the inconvenient marriage of the two practices into one coherent, convenient, incontrovertible and misinterpreted practice has dotted our standard statistics textbooks and ...
Zhang, Zhang
2012-03-22
Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure--Codon Deviation Coefficient (CDC)--that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions. 2012 Zhang et al; licensee BioMed Central Ltd.
Classification of lung sounds using higher-order statistics: A divide-and-conquer approach.
Naves, Raphael; Barbosa, Bruno H G; Ferreira, Danton D
2016-06-01
Lung sound auscultation is one of the most commonly used methods to evaluate respiratory diseases. However, the effectiveness of this method depends on the physician's training. If the physician does not have the proper training, he/she will be unable to distinguish between normal and abnormal sounds generated by the human body. Thus, the aim of this study was to implement a pattern recognition system to classify lung sounds. We used a dataset composed of five types of lung sounds: normal, coarse crackle, fine crackle, monophonic and polyphonic wheezes. We used higher-order statistics (HOS) to extract features (second-, third- and fourth-order cumulants), Genetic Algorithms (GA) and Fisher's Discriminant Ratio (FDR) to reduce dimensionality, and k-Nearest Neighbors and Naive Bayes classifiers to recognize the lung sound events in a tree-based system. We used the cross-validation procedure to analyze the classifiers' performance and Tukey's Honestly Significant Difference criterion to compare the results. Our results showed that the Genetic Algorithms outperformed the Fisher's Discriminant Ratio for feature selection. Moreover, each lung class had a different signature pattern according to their cumulants, showing that HOS is a promising feature extraction tool for lung sounds. In addition, the proposed divide-and-conquer approach can accurately classify different types of lung sounds. The classification accuracy obtained by the best tree-based classifier was 98.1% on training data and 94.6% on validation data. The proposed approach achieved good results even using only one feature extraction tool (higher-order statistics). Additionally, the implementation of the proposed classifier in an embedded system is feasible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
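The HOS feature-extraction step can be sketched with unbiased cumulant estimators (k-statistics); the signals below are synthetic stand-ins for lung sounds, chosen to show why cumulants discriminate:

```python
import numpy as np
from scipy.stats import kstat

rng = np.random.default_rng(3)

def hos_features(signal):
    """Second-, third- and fourth-order cumulants of a 1-D signal."""
    return np.array([kstat(signal, n) for n in (2, 3, 4)])

# A Gaussian signal has (near-)zero third- and fourth-order cumulants;
# a skewed signal does not -- which is what makes HOS discriminative.
f_gauss = hos_features(rng.normal(0.0, 1.0, 50_000))
f_skew = hos_features(rng.exponential(1.0, 50_000))
```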
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significant tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
Brouwer, D.; Meijer, R.R.; Zevalkink, D.J.
2013-01-01
Several researchers have emphasized that item response theory (IRT)-based methods should be preferred over classical approaches in measuring change for individual patients. In the present study we discuss and evaluate the use of IRT-based statistics to measure statistically significant individual
Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza
2014-01-01
This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…
DEFF Research Database (Denmark)
Jakobsen, Janus Christian; Wetterslev, Jorn; Winkel, Per
2014-01-01
BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are being insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS......: Methodologies for assessing statistical and clinical significance of intervention effects in systematic reviews were considered. Balancing simplicity and comprehensiveness, an operational procedure was developed, based mainly on The Cochrane Collaboration methodology and the Grading of Recommendations...... Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews (1) Obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most...
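Step (1) of the proposed procedure, obtaining both fixed-effect and random-effects estimates, can be sketched with the DerSimonian-Laird τ² estimator (the per-trial effects and standard errors below are hypothetical):

```python
import numpy as np

# Hypothetical per-trial effect estimates and their standard errors.
effects = np.array([0.05, 0.45, 0.30, -0.10, 0.55])
ses = np.array([0.10, 0.10, 0.08, 0.12, 0.15])

# Fixed-effect model: inverse-variance weighting.
w = 1.0 / ses**2
fixed = np.sum(w * effects) / np.sum(w)

# DerSimonian-Laird estimate of the between-trial variance tau^2.
q = np.sum(w * (effects - fixed) ** 2)
df = len(effects) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects model: weights incorporate tau^2.
w_re = 1.0 / (ses**2 + tau2)
random_effects = np.sum(w_re * effects) / np.sum(w_re)
```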
Jordan, Julie-Ann; McGladdery, Gary; Dyer, Kevin
2014-08-01
This study examined levels of mathematics and statistics anxiety, as well as general mental health amongst undergraduate students with dyslexia (n = 28) and those without dyslexia (n = 71). Students with dyslexia had higher levels of mathematics anxiety relative to those without dyslexia, while statistics anxiety and general mental health were comparable for both reading ability groups. In terms of coping strategies, undergraduates with dyslexia tended to use planning-based strategies and seek instrumental support more frequently than those without dyslexia. Higher mathematics anxiety was associated with having a dyslexia diagnosis, as well as greater levels of worrying, denial, seeking instrumental support and less use of the positive reinterpretation coping strategy. By contrast, statistics anxiety was not predicted by dyslexia diagnosis, but was instead predicted by overall worrying and the use of denial and emotion focused coping strategies. The results suggest that disability practitioners should be aware that university students with dyslexia are at risk of high mathematics anxiety. Additionally, effective anxiety reduction strategies such as positive reframing and thought challenging would form a useful addition to the support package delivered to many students with dyslexia. Copyright © 2014 John Wiley & Sons, Ltd.
The Statistical Knowledge Gap in Higher Degree by Research Students: The Supervisors' Perspective
Baglin, James; Hart, Claire; Stow, Sarah
2017-01-01
This study sought to gain an understanding of the current statistical training and support needs for Australian Higher Degree by Research (HDR) students and their supervisors. The data reported herein are based on the survey responses of 191 (18.7%) eligible supervisors from a single Australian institution. The survey was composed of both…
Competitive Intelligence: Significance in Higher Education
Barrett, Susan E.
2010-01-01
Historically noncompetitive, the higher education sector is now having to adjust dramatically to new and increasing demands on numerous levels. To remain successfully operational within the higher education market, universities today must consider all relevant forces that can impact present and future planning. Those institutions that were…
International Nuclear Information System (INIS)
Pérez-Holanda, Sergio; Blanco, Ignacio; Menéndez, Manuel; Rodrigo, Luis
2014-01-01
The association between alpha-1 antitrypsin (AAT) deficiency and colorectal cancer (CRC) is currently controversial. The present study compares AAT serum concentrations and gene frequencies between a group of CRC patients and a control group of healthy unrelated people (HUP). 267 CRC subjects (63% males, mean age 72 ± 10 years) were enrolled from a hospital clinic setting in Asturias, Spain. The HUP group comprised 327 subjects (67% males, mean age 70 ± 7.5 years) from the same geographical region. Outcome measures were AAT serum concentrations measured by nephelometry, and AAT phenotyping characterization by isoelectric focusing. Significantly higher serum concentrations were found among CRC subjects (208 ± 60) than among HUP individuals (144 ± 20.5) (p = 0.0001). No differences were found in the distribution of the Pi*S and Pi*Z allelic frequencies (p = 0.639), although the frequency of Pi*Z was higher in CRC (21%) than in HUP subjects (15%). The only statistically significant finding in this study was the markedly higher AAT serum concentrations found in CRC subjects compared with HUP controls, irrespective of whether their Pi* phenotype was normal (Pi*MM) or deficient (Pi*MS, Pi*MZ and Pi*SZ). Although there was a trend towards the more deficient Pi* phenotype the more advanced the tumor, the results were inconclusive due to the small sample size. Consequently, more powerful studies are needed to reach firmer conclusions on this matter.
Diedrich, Alice; Schlegl, Sandra; Greetfeld, Martin; Fumi, Markus; Voderholzer, Ulrich
2018-03-01
This study examines the statistical and clinical significance of symptom changes during an intensive inpatient treatment program with a strong psychotherapeutic focus for individuals with severe bulimia nervosa. 295 consecutively admitted bulimic patients were administered the Structured Interview for Anorexic and Bulimic Syndromes-Self-Rating (SIAB-S), the Eating Disorder Inventory-2 (EDI-2), the Brief Symptom Inventory (BSI), and the Beck Depression Inventory-II (BDI-II) at treatment intake and discharge. Results indicated statistically significant symptom reductions with large effect sizes regarding severity of binge eating and compensatory behavior (SIAB-S), overall eating disorder symptom severity (EDI-2), overall psychopathology (BSI), and depressive symptom severity (BDI-II) even when controlling for antidepressant medication. The majority of patients showed either reliable (EDI-2: 33.7%, BSI: 34.8%, BDI-II: 18.1%) or even clinically significant symptom changes (EDI-2: 43.2%, BSI: 33.9%, BDI-II: 56.9%). Patients with clinically significant improvement were less distressed at intake and less likely to suffer from a comorbid borderline personality disorder when compared with those who did not improve to a clinically significant extent. Findings indicate that intensive psychotherapeutic inpatient treatment may be effective in about 75% of severely affected bulimic patients. For the remaining non-responding patients, inpatient treatment might be improved through an even stronger focus on the reduction of comorbid borderline personality traits.
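The distinction the abstract draws between reliable and clinically significant change is usually operationalized with the Jacobson-Truax criteria. A minimal sketch of those criteria follows; all numeric values here are illustrative assumptions, not the paper's cutoffs or reliability coefficients.

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson-Truax RCI: raw change scaled by the standard error of the difference."""
    se_measure = sd_pre * math.sqrt(1.0 - reliability)
    s_diff = math.sqrt(2.0) * se_measure
    return (post - pre) / s_diff

def clinically_significant(post, clinical_mean, clinical_sd,
                           functional_mean, functional_sd):
    """Criterion c: post-score crosses the SD-weighted midpoint between the
    clinical and functional population means (lower score = healthier)."""
    c = (clinical_sd * functional_mean + functional_sd * clinical_mean) \
        / (clinical_sd + functional_sd)
    return post < c

rci = reliable_change_index(pre=30, post=18, sd_pre=8, reliability=0.9)
# |RCI| > 1.96 indicates change beyond what measurement error alone would produce
```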
Recent Literature on Whether Statistical Significance Tests Should or Should Not Be Banned.
Deegear, James
This paper summarizes the literature regarding statistical significance testing, with an emphasis on recent literature in various disciplines and on literature exploring why researchers have demonstrably failed to be influenced by the American Psychological Association publication manual's encouragement to report effect sizes. Also considered are…
Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B
2013-03-23
Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: large-scale and complex data set management; MS peak identification and indexing; and high-dimensional peak differential analysis with concurrent statistical tests and the associated false discovery rate (FDR). "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports large-scale MS data uploading and analysis online with a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
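The FDR control that the abstract mentions for the many concurrent peak tests is commonly handled with the Benjamini-Hochberg step-up procedure; whether this tool uses exactly that procedure is not stated, so the sketch below is a generic illustration.

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of p-values declared significant at FDR level q."""
    p = np.asarray(pvals, float)
    n = len(p)
    order = np.argsort(p)
    ranked = p[order]
    # find the largest k with p_(k) <= (k/n) * q
    below = ranked <= (np.arange(1, n + 1) / n) * q
    mask = np.zeros(n, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        mask[order[: k + 1]] = True   # everything up to rank k passes
    return mask
```

Because the threshold scales with rank, the procedure admits more discoveries than a Bonferroni correction while still bounding the expected proportion of false positives among them.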
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Van Aert, R.C.M.; Van Assen, M.A.L.M.
2018-01-01
The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter
A tutorial on hunting statistical significance by chasing N
Directory of Open Access Journals (Sweden)
Denes Szucs
2016-09-01
Full Text Available There is increasing concern about the replicability of studies in psychology and cognitive neuroscience. Hidden data dredging (also called p-hacking) is a major contributor to this crisis because it substantially increases Type I error, resulting in a much larger proportion of false positive findings than the usually expected 5%. In order to build better intuition to avoid, detect and criticise some typical problems, here I systematically illustrate the large impact of some easy-to-implement and therefore perhaps frequent data dredging techniques on boosting false positive findings. I illustrate several forms of two special cases of data dredging. First, researchers may violate the data collection stopping rules of null hypothesis significance testing by repeatedly checking for statistical significance with various numbers of participants. Second, researchers may group participants post hoc along potential but unplanned independent grouping variables. The first approach 'hacks' the number of participants in studies; the second approach 'hacks' the number of variables in the analysis. I demonstrate the high number of false positive findings generated by these techniques with data from true null distributions. I also illustrate that it is extremely easy to introduce strong bias into data by very mild selection and re-testing. Similar, usually undocumented data dredging steps can easily lead to 20-50% or more false positives.
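The first technique, repeatedly testing while recruiting, is easy to simulate. The following sketch (illustrative parameters, not the paper's simulation code) shows how peeking after every few participants inflates the Type I error rate well above the nominal 5%.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def false_positive_rate(n_sims=2000, start=10, step=5, n_max=100, alpha=0.05):
    """One-sample t-test on true-null data, re-tested after every `step` new subjects."""
    hits = 0
    for _ in range(n_sims):
        data = rng.normal(size=n_max)            # H0 is true: population mean is 0
        for n in range(start, n_max + 1, step):
            p = stats.ttest_1samp(data[:n], 0.0).pvalue
            if p < alpha:                        # stop at the first 'significant' peek
                hits += 1
                break
    return hits / n_sims

# repeated peeking drives the realized false positive rate well above the nominal 5%
```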
International Nuclear Information System (INIS)
Kim, Jong Ho; Shin, Eak Kyun
1999-01-01
The volume-LVEF relationship is one of the most important factors in automatic EF quantification algorithms for gated myocardial perfusion SPECT (gMPS) (Germano et al., JNM, 1995). A gender difference whereby normal LVEF measurements are higher in females has been reported for gMPS (Yao et al., JNM, 1997). To determine whether this reflects a true physiologic difference in LVEF or a sampling/measurement error, various parameters were evaluated statistically in 200 gender- and age-matched subjects (mean age = 58.41 ± 15.01) with normal LVEF (more than 50%) and a low likelihood of coronary artery disease. The correlation between LVEDVi (ml/m2) and LVEF was highly significant (r = -0.62, p < 0.0001), with similar correlations noted in both male (r = -0.45, p < 0.0001) and female (r = -0.67, p < 0.0001) subgroups. By multivariate analysis, LV volume and stroke volume were the most significant factors influencing LVEF in males and females, respectively. In conclusion, there is a significant negative correlation between LV volume and LVEF as measured by Tc-99m gated SPECT. A higher normal LVEF value should be applied to females as assessed by post-stress resting Tc-99m Sestamibi gated myocardial perfusion SPECT.
Symmetries, invariants and generating functions: higher-order statistics of biased tracers
Munshi, Dipak
2018-01-01
Gravitationally collapsed objects are known to be biased tracers of an underlying density contrast. Using symmetry arguments, generalised biasing schemes have recently been developed to relate the halo density contrast δh with the underlying density contrast δ, the divergence of velocity θ and their higher-order derivatives. This is done by constructing invariants such as s, t, ψ and η. We show how the generating function formalism in Eulerian standard perturbation theory (SPT) can be used to show that many of the additional terms based on extended Galilean and Lifshitz symmetry actually do not make any contribution to the higher-order statistics of biased tracers. Other terms can also be drastically simplified, allowing us to write the vertices associated with δh in terms of the vertices of δ and θ, the higher-order derivatives and the bias coefficients. We also compute the cumulant correlators (CCs) for two different tracer populations. These perturbative results are valid for tree-level contributions but at an arbitrary order. We also take into account the stochastic nature of bias in our analysis. Extending previous results of a local polynomial model of bias, we express the one-point cumulants S_N and their two-point counterparts, the CCs C_pq, of biased tracers in terms of those of their underlying density contrast counterparts. As a by-product of our calculation we also discuss the results using approximations based on Lagrangian perturbation theory (LPT).
Higher order statistical moment application for solar PV potential analysis
Basri, Mohd Juhari Mat; Abdullah, Samizee; Azrulhisham, Engku Ahmad; Harun, Khairulezuan
2016-10-01
Solar photovoltaic energy is an alternative to fossil fuels, which are being depleted and contribute to global warming. However, this renewable resource is variable and intermittent, so knowledge of the energy potential of a site is very important before building a solar photovoltaic power generation system there. Here, a higher-order statistical moment model is analyzed using data collected from a 5 MW grid-connected photovoltaic system. Because the skewness and kurtosis of the AC power and solar irradiance distributions of the solar farm change dynamically, the Pearson system, in which a probability distribution is selected by matching its theoretical moments to the empirical moments of the data, is well suited for this purpose. Taking advantage of the Pearson system in MATLAB, software has been developed to assist in data processing, distribution fitting and potential analysis for future projection of the amount of AC power and solar irradiance available.
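Pearson-system fitting selects a distribution family by matching moments; one common selector is Pearson's κ criterion computed from the sample skewness and kurtosis. The sketch below is a generic illustration in Python rather than the authors' MATLAB program, and the boundary and symmetric cases (κ near 0 or 1) need extra care in practice.

```python
import numpy as np
from scipy import stats

def pearson_criterion(x):
    """Classify a sample into a Pearson distribution family via the kappa criterion."""
    b1 = stats.skew(x) ** 2                # Pearson's beta_1 (squared skewness)
    b2 = stats.kurtosis(x, fisher=False)   # Pearson's beta_2 (normal = 3)
    kappa = b1 * (b2 + 3) ** 2 / (4 * (4 * b2 - 3 * b1) * (2 * b2 - 3 * b1 - 6))
    if kappa < 0:
        family = "Type I (beta-like)"
    elif kappa < 1:
        family = "Type IV"
    else:
        family = "Type VI"
    return kappa, family

# a bounded, skewed sample such as clipped irradiance is beta-like (Type I)
rng = np.random.default_rng(3)
kappa, family = pearson_criterion(rng.beta(2, 5, size=100_000))
```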
DEFF Research Database (Denmark)
Jones, Allan; Sommerlund, Bo
2007-01-01
The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis...
Statistical significance estimation of a signal within the GooFit framework on GPUs
Directory of Open Access Journals (Sweden)
Cristella Leonardo
2017-01-01
Full Text Available In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performance with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service with a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks theorem may or may not apply because its regularity conditions are not satisfied.
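The Wilks-theorem situation described here can be illustrated with a toy Monte Carlo in miniature: for a nested Gaussian-mean test with known unit variance, the likelihood ratio statistic reduces to 2Δln L = n·x̄², and under H0 (with the regularity conditions satisfied) its distribution should match χ²(1). This is a generic sketch, not the GooFit analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def toy_lr_stats(n_toys=5000, n=50):
    """2*Delta(lnL) for testing mu=0 vs free mu with known unit variance: n * xbar^2."""
    xbar = rng.normal(size=(n_toys, n)).mean(axis=1)
    return n * xbar ** 2

t = toy_lr_stats()
# Wilks: under H0 and regularity, t ~ chi2(1); compare a tail quantile
emp = np.quantile(t, 0.95)
ref = stats.chi2.ppf(0.95, df=1)
```

When the regularity conditions fail (e.g. a parameter pinned at a boundary, as near a kinematical edge), the empirical toy distribution departs from χ², which is exactly why the toy-MC approach is needed.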
Holtzman, Jessica N; Miller, Shefali; Hooshmand, Farnaz; Wang, Po W; Chang, Kiki D; Hill, Shelley J; Rasgon, Natalie L; Ketter, Terence A
2015-07-01
The strengths and limitations of considering childhood- and adolescent-onset bipolar disorder (BD) separately versus together remain to be established. We assessed this issue. BD patients referred to the Stanford Bipolar Disorder Clinic during 2000-2011 were assessed with the Systematic Treatment Enhancement Program for BD Affective Disorders Evaluation. Patients with childhood- and adolescent-onset were compared to those with adult-onset for 7 unfavorable bipolar illness characteristics with replicated associations with early onset. Among 502 BD outpatients, those with childhood- (<13 years, N=110) and adolescent- (13-18 years, N=218) onset had significantly higher rates for 4/7 unfavorable illness characteristics, including lifetime comorbid anxiety disorder, at least ten lifetime mood episodes, lifetime alcohol use disorder, and prior suicide attempt, than those with adult-onset (>18 years, N=174). Childhood- but not adolescent-onset BD patients also had significantly higher rates of first-degree relative with mood disorder, lifetime substance use disorder, and rapid cycling in the prior year. Patients with pooled childhood/adolescent-onset compared to adult-onset had significantly higher rates for 5/7 of these unfavorable illness characteristics, while patients with childhood- compared to adolescent-onset had significantly higher rates for 4/7. The Caucasian, insured, suburban, low-substance-abuse, American specialty-clinic-referred sample limits generalizability. Onset age is based on retrospective recall. Childhood- compared to adolescent-onset BD was more robustly related to unfavorable bipolar illness characteristics, so pooling these groups attenuated such relationships. Further study is warranted to determine the extent to which adolescent-onset BD represents an intermediate phenotype between childhood- and adult-onset BD. Copyright © 2015 Elsevier B.V. All rights reserved.
Sierevelt, Inger N.; van Oldenrijk, Jakob; Poolman, Rudolf W.
2007-01-01
In this paper we describe several issues that influence the reporting of statistical significance in relation to clinical importance, since misinterpretation of p values is a common issue in orthopaedic literature. Orthopaedic research is tormented by the risks of false-positive (type I error) and
Fernández, Leandro; Monbaliu, Jaak; Onorato, Miguel; Toffoli, Alessandro
2014-05-01
This research is focused on the study of the nonlinear evolution of irregular wave fields in water of arbitrary depth by comparing field measurements and numerical simulations. It is now well accepted that modulational instability, known as one of the main mechanisms for the formation of rogue waves, induces strong departures from Gaussian statistics. However, whereas non-Gaussian properties are remarkable when wave fields follow one direction of propagation over infinite water depth, wave statistics only weakly deviate from Gaussianity when waves spread over a range of different directions. Over finite water depth, furthermore, wave instability attenuates overall and eventually vanishes for relative water depths as low as kh = 1.36 (where k is the wavenumber of the dominant waves and h the water depth). Recent experimental results, nonetheless, seem to indicate that oblique perturbations are capable of triggering and sustaining modulational instability even if kh < 1.36. The aim of this research is therefore to understand whether the combined effect of directionality and finite water depth has a significant effect on wave statistics and particularly on the occurrence of extremes. For this purpose, numerical experiments have been performed solving the Euler equations of motion with the Higher Order Spectral Method (HOSM) and compared with data from short-crested wave fields for different sea states observed at Lake George (Australia). A comparative analysis of the statistical properties (i.e. the density function of the surface elevation and its statistical moments, skewness and kurtosis) between simulations and in-situ data allows the numerical developments to be checked against real observations in field conditions.
Yilmaz, Ferkan
2012-12-01
The higher-order statistics (HOS) of the channel capacity, μ_n = E[log^n(1 + γ_end)], where n ∈ N denotes the order of the statistics, has received relatively little attention in the literature, due in part to the intractability of its analysis. In this letter, we propose a novel and unified analysis, which is based on the moment generating function (MGF) technique, to exactly compute the HOS of the channel capacity. More precisely, our mathematical formalism can be readily applied to maximal-ratio-combining (MRC) receivers operating in generalized fading environments. The mathematical formalism is illustrated by some numerical examples focusing on correlated generalized fading environments. © 2012 IEEE.
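The quantity μ_n can be checked by direct Monte Carlo for a concrete fading model. The sketch below is an illustration, not the letter's MGF-based method: it estimates the first few HOS of capacity for single-branch Rayleigh fading, where the instantaneous SNR γ is exponentially distributed.

```python
import numpy as np

rng = np.random.default_rng(2)

def capacity_hos(n_orders=3, gamma_bar=5.0, n_samples=1_000_000):
    """Monte Carlo estimate of mu_n = E[ln^n(1 + gamma)] for Rayleigh fading,
    where the SNR gamma is exponential with mean gamma_bar."""
    gamma = rng.exponential(gamma_bar, size=n_samples)
    log_c = np.log1p(gamma)                      # instantaneous capacity (nats)
    return [np.mean(log_c ** n) for n in range(1, n_orders + 1)]

mu = capacity_hos()
variance = mu[1] - mu[0] ** 2    # second central moment: variance of the capacity
```

The first moment is the familiar ergodic capacity; the higher orders capture the spread and asymmetry of the capacity distribution that the letter's exact MGF analysis targets.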
International Nuclear Information System (INIS)
DUDEK, J; SZPAK, B; FORNAL, B; PORQUET, M-G
2011-01-01
In this and the follow-up article we briefly discuss what we believe represents one of the most serious problems in contemporary nuclear structure: the question of statistical significance of parametrizations of nuclear microscopic Hamiltonians and the implied predictive power of the underlying theories. In the present Part I, we introduce the main lines of reasoning of the so-called Inverse Problem Theory, an important sub-field in the contemporary Applied Mathematics, here illustrated on the example of the Nuclear Mean-Field Approach.
Linting, Marielle; van Os, Bart Jan; Meulman, Jacqueline J.
2011-01-01
In this paper, the statistical significance of the contribution of variables to the principal components in principal components analysis (PCA) is assessed nonparametrically by the use of permutation tests. We compare a new strategy to a strategy used in previous research consisting of permuting the columns (variables) of a data matrix…
A Note on Comparing the Power of Test Statistics at Low Significance Levels.
Morris, Nathan; Elston, Robert
2011-01-01
It is an obvious fact that the power of a test statistic is dependent upon the significance (alpha) level at which the test is performed. It is perhaps a less obvious fact that the relative performance of two statistics in terms of power is also a function of the alpha level. Through numerous personal discussions, we have noted that even some competent statisticians have the mistaken intuition that relative power comparisons at traditional levels such as α = 0.05 will be roughly similar to relative power comparisons at very low levels, such as α = 5 × 10^-8, which is commonly used in genome-wide association studies. In this brief note, we demonstrate that this notion is in fact quite wrong, especially with respect to comparing tests with differing degrees of freedom. In fact, at very low alpha levels the cost of additional degrees of freedom is often comparatively low. Thus we recommend that statisticians exercise caution when interpreting the results of power comparison studies which use alpha levels that will not be used in practice.
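The point about alpha levels can be reproduced directly with noncentral chi-square power calculations. A minimal sketch follows; the noncentrality value is an illustrative assumption.

```python
from scipy import stats

def chi2_power(alpha, df, noncentrality):
    """Power of a chi-square test with the given alpha and degrees of freedom."""
    crit = stats.chi2.ppf(1 - alpha, df)           # critical value under H0
    return stats.ncx2.sf(crit, df, noncentrality)  # tail probability under H1

# the same signal strength (noncentrality) tested with 1 vs 2 degrees of freedom,
# at a traditional level and at a genome-wide level
for alpha in (0.05, 5e-8):
    p1 = chi2_power(alpha, df=1, noncentrality=20)
    p2 = chi2_power(alpha, df=2, noncentrality=20)
    # comparing p1 and p2 across the two alphas shows that the relative
    # cost of the extra degree of freedom depends on the alpha level
```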
DEFF Research Database (Denmark)
Serviss, Jason T.; Gådin, Jesper R.; Eriksson, Per
2017-01-01
...e.g. genes in a specific pathway, alone can separate samples into these established classes. Despite this, the evaluation of class separations is often subjective and performed via visualization. Here we present the ClusterSignificance package: a set of tools designed to assess the statistical significance of class separations downstream of dimensionality reduction algorithms. In addition, we demonstrate the design and utility of the ClusterSignificance package and utilize it to determine the importance of long non-coding RNA expression in the identity of multiple hematological malignancies.
van Tulder, M.W.; Malmivaara, A.; Hayden, J.; Koes, B.
2007-01-01
STUDY DESIGN. Critical appraisal of the literature. OBJECTIVES. The objective of this study was to assess whether the results of back pain trials are statistically significant and clinically important. SUMMARY OF BACKGROUND DATA. There seems to be a discrepancy between conclusions reported by authors and
Kaiser, Franciscus; Hillegers, Harm; Legro, Iwen
2005-01-01
In this fourth IHEM trend report, key statistical issues such as student flows (new entrants, enrolment and graduation, broken down by discipline and gender), rates of participation, staff, and finance are presented for the ten IHEM core countries over the period from 1995 onwards. National statistical
Directory of Open Access Journals (Sweden)
JiYeoun Lee
2009-01-01
Full Text Available A preprocessing scheme based on the linear prediction coefficient (LPC) residual is applied to higher-order statistics (HOS) for automatic assessment of overall pathological voice quality. The normalized skewness and kurtosis are estimated from the LPC residual and show statistically meaningful distributions that characterize pathological voice quality. 83 voice samples of sustained vowel /a/ phonation are used in this study and are independently assessed by a speech and language therapist (SALT) according to the grade of severity of dysphonia on the GRBAS scale. These are used to train and test a classification and regression tree (CART). The best result is obtained using an optimal decision tree implemented with a combination of the normalized skewness and kurtosis, with an accuracy of 92.9%. It is concluded that the method can be used as an assessment tool, providing a valuable aid to the SALT during clinical evaluation of overall pathological voice quality.
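A preprocessing chain of this kind, LPC residual followed by normalized skewness and kurtosis, can be sketched with standard numerical tools. This is a generic reconstruction (autocorrelation-method LPC); the paper's exact model order and normalization are assumptions.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.stats import skew, kurtosis

def lpc_residual(x, order=12):
    """Autocorrelation-method LPC; returns the linear-prediction residual."""
    x = np.asarray(x, float)
    r = np.correlate(x, x, mode="full")[len(x) - 1:]
    # solve the Toeplitz normal equations R a = r[1..order]
    a = solve_toeplitz((r[:order], r[:order]), r[1:order + 1])
    pred = np.convolve(x, np.concatenate(([0.0], a)))[:len(x)]
    return x - pred

def hos_features(x, order=12):
    """Normalized skewness and kurtosis of the LPC residual."""
    e = lpc_residual(x, order)
    return skew(e), kurtosis(e, fisher=False)
```

For a voice signal, the LPC model absorbs the vocal-tract resonances, so the residual's skewness and kurtosis reflect the excitation irregularities that correlate with dysphonia severity.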
Indirectional statistics and the significance of an asymmetry discovered by Birch
International Nuclear Information System (INIS)
Kendall, D.G.; Young, G.A.
1984-01-01
Birch (1982, Nature, 298, 451) reported an apparent 'statistical asymmetry of the Universe'. The authors here develop 'indirectional analysis' as a technique for investigating statistical effects of this kind and conclude that the reported effect (whatever may be its origin) is strongly supported by the observations. The estimated pole of the asymmetry is at RA 13h 30m, Dec. -37deg. The angular error in its estimation is unlikely to exceed 20-30deg. (author)
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y; Drake, Steven K; Gucek, Marjan; Sacks, David B; Yu, Yi-Kuo
2018-06-05
Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 publicly available MS/MS data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
Access to Higher Education in China: Differences in Opportunity
Wang, Houxiong
2011-01-01
Access to higher education in China has opened up significantly in the move towards a mass higher education system. However, aggregate growth does not necessarily imply fair or reasonable distribution of opportunity. In fact, the expansion of higher education has a rather more complex influence on opportunity when admissions statistics are viewed…
Kim, Sung-Min; Choi, Yosoon
2017-06-18
To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z -score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z -scores: high content with a high z -score (HH), high content with a low z -score (HL), low content with a high z -score (LH), and low content with a low z -score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1-4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
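The Getis-Ord Gi* z-score underlying the hot spot analysis can be computed directly from a spatial weights matrix. Below is a minimal sketch with a toy one-dimensional configuration; the data are illustrative, not the Busan survey values.

```python
import numpy as np

def getis_ord_gi_star(values, weights):
    """Gi* z-scores; `weights` is an (n, n) spatial weights matrix whose
    diagonal is nonzero so each feature is included in its own neighbourhood."""
    x = np.asarray(values, float)
    w = np.asarray(weights, float)
    n = len(x)
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)     # population std. dev.
    wx = w @ x                                   # local weighted sums
    wsum = w.sum(axis=1)
    w2sum = (w ** 2).sum(axis=1)
    denom = s * np.sqrt((n * w2sum - wsum ** 2) / (n - 1))
    return (wx - xbar * wsum) / denom

# five sample points along a line; binary contiguity weights including self
vals = np.array([0.0, 0.0, 10.0, 0.0, 0.0])
w = np.eye(5)
for i in range(4):
    w[i, i + 1] = w[i + 1, i] = 1.0
z = getis_ord_gi_star(vals, w)
```

A large positive z identifies a feature whose neighbourhood of high values is unlikely under spatial randomness, which is exactly the HH/LH distinction the grouping scheme builds on.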
Energy Technology Data Exchange (ETDEWEB)
Crow, C.J.
1985-01-01
Middle Ordovician age Chickamauga Group carbonates crop out along the Birmingham and Murphrees Valley anticlines in central Alabama. The macrofossil contents on exposed surfaces of seven bioherms have been counted to determine their various paleontologic characteristics. Twelve groups of organisms are present in these bioherms. Dominant organisms include bryozoans, algae, brachiopods, sponges, pelmatozoans, stromatoporoids and corals. Minor accessory fauna include predators, scavengers and grazers such as gastropods, ostracods, trilobites, cephalopods and pelecypods. Vertical and horizontal niche zonation has been detected for some of the bioherm dwelling fauna. No one bioherm of those studied exhibits all 12 groups of organisms; rather, individual bioherms display various subsets of the total diversity. Statistical treatment (G-test) of the diversity data indicates a lack of statistical homogeneity of the bioherms, both within and between localities. Between-locality population heterogeneity can be ascribed to differences in biologic responses to such gross environmental factors as water depth and clarity, and energy levels. At any one locality, gross aspects of the paleoenvironments are assumed to have been more uniform. Significant differences among bioherms at any one locality may have resulted from patchy distribution of species populations, differential preservation and other factors.
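The G-test applied to the diversity data is the log-likelihood-ratio variant of the chi-square test of homogeneity, available in scipy via the `lambda_` argument. Below is a sketch with made-up counts; the paper's actual bioherm counts are not reproduced.

```python
import numpy as np
from scipy.stats import chi2_contingency

# hypothetical counts of three organism groups (rows) in three bioherms (columns)
table = np.array([[30, 12,  8],
                  [14, 40,  9],
                  [ 6, 10, 33]])

# lambda_="log-likelihood" turns the Pearson chi-square statistic into the G statistic
g, p, dof, expected = chi2_contingency(table, lambda_="log-likelihood")
# a small p-value rejects homogeneity of group composition across bioherms
```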
ASSESSING SELF-STUDY WORK’S SIGNIFICANT SKILLS FOR SUCCESSFUL LEARNING IN THE HIGHER SCHOOL
Directory of Open Access Journals (Sweden)
Galina V. Milovanova
2017-06-01
Full Text Available Introduction: the problem of organizing students’ independent work (self-study) is not new, but the changes in higher education over the last two decades show that the experience accumulated in the traditional educational model can be applied only when adapted to present-day conditions. The article analyses the innovative component of the educational process in terms of a significant increase in the volume of compulsory independent work in the university. Particular attention is paid to determining the levels of the formation of skills for independent work in terms of students’ readiness for its implementation. The aim of the research is to identify the most significant skills of independent work for successful study at the university. Materials and Methods: the research is based on general scholarly methods: analysis, comparison, generalisation. A questionnaire survey was carried out and a correlation analysis of the results was presented. Mathematical statistics methods in the Excel application were used for processing the survey data. Results: the article focuses on the relevance of forming students’ ability to work independently in the learning process. Requirements for professionals recognize the need for knowledge and skills but, more importantly, the ability and readiness to extend this knowledge through continuous education and self-education. In turn, readiness for self-education cannot exist without independent work. Students’ attitudes toward independent work and their skill levels in the gnostic, design, structural, organisational and communicative blocks were identified as a result of the research. Discussion and Conclusions: the levels of formation of the skills for independent work influence learning success. There is a correlation between indicators of achievement and the ability to work independently. Organisation and communication skills have significant
Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks
2016-04-26
Systems, Statistics & Management Science, University of Alabama, USA. DISTRIBUTION A: Distribution approved for public release.
Institute of Scientific and Technical Information of China (English)
XU Dian-Yan
2003-01-01
The free energy and entropy of Reissner-Nordstrom black holes in higher-dimensional space-time are calculated by the quantum statistical method with a brick wall model. The space-time of the black holes is divided into three regions: region 1 (r > r_0), region 2 (r_0 > r > r_i), and region 3 (r_i > r > 0), where r_0 is the radius of the outer event horizon and r_i is the radius of the inner event horizon. Detailed calculation shows that the entropy contributed by region 2 is zero, the entropy contributed by region 1 is positive and proportional to the outer event horizon area, and the entropy contributed by region 3 is negative and proportional to the inner event horizon area. The total entropy contributed by all three regions is positive and proportional to the area difference between the outer and inner event horizons. As r_i approaches r_0 in the nearly extreme case, the total quantum statistical entropy approaches zero.
Kellerer-Pirklbauer, Andreas
2016-04-01
Longer data series (e.g. >10 a) of ground temperatures in alpine regions are helpful to improve the understanding of the effects of present climate change on the distribution and thermal characteristics of seasonal frost- and permafrost-affected areas. Beginning in 2004 - and more intensively since 2006 - a permafrost and seasonal frost monitoring network was established in Central and Eastern Austria by the University of Graz. This network consists of c.60 ground temperature (surface and near-surface) monitoring sites which are located at 1922-3002 m a.s.l., at latitude 46°55'-47°22'N and at longitude 12°44'-14°41'E. These data allow conclusions about general ground thermal conditions, potential permafrost occurrence, trends during the observation period, and regional patterns of change. Calculations and analyses of several different temperature-related parameters were accomplished. At an annual scale, a region-wide statistically significant warming during the observation period was revealed by e.g. an increase in mean annual temperature values (mean, maximum) or the significant lowering of the surface frost number (F+). At a seasonal scale, no significant trend of any temperature-related parameter was revealed in most cases for spring (MAM) and autumn (SON). Winter (DJF) shows only a weak warming. In contrast, the summer (JJA) season generally reveals a significant warming, as confirmed by several different temperature-related parameters such as mean seasonal temperature, number of thawing degree days, number of freezing degree days, or days without night frost. On a monthly basis, August shows the statistically most robust and strongest warming of all months, although regional differences occur. Despite the fact that the general ground temperature warming during the last decade is confirmed by the field data in the study region, complications in trend analyses arise from temperature anomalies (e.g. the warm winter 2006/07) or substantial variations in the winter
Conducting tests for statistically significant differences using forest inventory data
James A. Westfall; Scott A. Pugh; John W. Coulston
2013-01-01
Many forest inventory and monitoring programs are based on a sample of ground plots from which estimates of forest resources are derived. In addition to evaluating metrics such as number of trees or amount of cubic wood volume, it is often desirable to make comparisons between resource attributes. To properly conduct statistical tests for differences, it is imperative...
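In the simplest large-sample case, the comparison alluded to here is a two-sided z-test on the difference of two independent design-based estimates; the volumes and standard errors below are hypothetical, not from any inventory program.

```python
import math

def diff_test(est1, se1, est2, se2, crit=1.959964):
    """Two-sided large-sample test for a difference between two
    independent inventory estimates reported with standard errors."""
    z = (est1 - est2) / math.sqrt(se1 ** 2 + se2 ** 2)
    return z, abs(z) > crit  # crit = 5% two-sided normal critical value

# Hypothetical: mean cubic volume (m^3/ha) from two inventory cycles.
z, significant = diff_test(120.0, 8.0, 95.0, 7.0)
```

With correlated panels or small plot counts the proper test needs covariances and t-quantiles, which is part of what the paper addresses; the sketch above only covers the independent, large-sample case.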
Higher-Order Moment Characterisation of Rogue Wave Statistics in Supercontinuum Generation
DEFF Research Database (Denmark)
Sørensen, Simon Toft; Bang, Ole; Wetzel, Benjamin
2012-01-01
The noise characteristics of supercontinuum generation are characterized using higher-order statistical moments. Measures of skew and kurtosis, and the coefficient of variation, allow quantitative identification of spectral regions dominated by rogue-wave-like behaviour.
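The three measures named in this abstract can be computed directly from sample moments; the five-point sample below is a toy illustration of a long-tailed (rogue-wave-like) distribution, not supercontinuum data.

```python
import math

def moments(x):
    """Population-style moment estimates: skewness, Pearson kurtosis
    (equal to 3 for a Gaussian), and coefficient of variation."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m3 = sum((v - mean) ** 3 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    sd = math.sqrt(m2)
    return m3 / sd ** 3, m4 / m2 ** 2, sd / mean

# A single extreme event among ordinary ones gives positive skew and
# kurtosis above the Gaussian value of 3.
skew, kurt, cv = moments([1, 1, 1, 1, 10])
```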
Gontscharuk, Veronika; Landwehr, Sandra; Finner, Helmut
2015-01-01
The higher criticism (HC) statistic, which can be seen as a normalized version of the famous Kolmogorov-Smirnov statistic, has a long history, dating back to the mid seventies. Originally, HC statistics were used in connection with goodness of fit (GOF) tests but they recently gained some attention in the context of testing the global null hypothesis in high dimensional data. The continuing interest for HC seems to be inspired by a series of nice asymptotic properties related to this statistic. For example, unlike Kolmogorov-Smirnov tests, GOF tests based on the HC statistic are known to be asymptotically sensitive in the moderate tails, hence it is favorably applied for detecting the presence of signals in sparse mixture models. However, some questions around the asymptotic behavior of the HC statistic are still open. We focus on two of them, namely, why a specific intermediate range is crucial for GOF tests based on the HC statistic and why the convergence of the HC distribution to the limiting one is extremely slow. Moreover, the inconsistency in the asymptotic and finite behavior of the HC statistic prompts us to provide a new HC test that has better finite properties than the original HC test while showing the same asymptotics. This test is motivated by the asymptotic behavior of the so-called local levels related to the original HC test. By means of numerical calculations and simulations we show that the new HC test is typically more powerful than the original HC test in normal mixture models. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
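A minimal form of the HC statistic can be written down directly; this sketch maximizes over all order statistics, omitting the restriction to the intermediate range that the paper shows is crucial, so it is illustrative rather than the authors' test.

```python
import math

def higher_criticism(pvals):
    """Higher-criticism statistic: the maximum standardized gap between
    the empirical CDF of the sorted p-values and the uniform CDF.
    Minimal version without the usual intermediate-range restriction."""
    p = sorted(pvals)
    n = len(p)
    return max(math.sqrt(n) * (i / n - pi) / math.sqrt(pi * (1 - pi))
               for i, pi in enumerate(p, start=1) if 0 < pi < 1)

# Under the global null, p-values look uniform and HC stays small;
# a few very small p-values (sparse signals) drive HC up sharply.
null_p = [i / 101 for i in range(1, 101)]
signal_p = [0.001] * 5 + [i / 101 for i in range(6, 101)]
```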
Perneger, Thomas V; Combescure, Christophe
2017-07-01
Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant (P values values equal to 1, and (3) about twice as many P-values less than 0.05 compared with those more than 0.05. The latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results. Copyright © 2017 Elsevier Inc. All rights reserved.
I. Arismendi; S. L. Johnson; J. B. Dunham
2015-01-01
Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical...
Kiekkas, Panagiotis; Panagiotarou, Aliki; Malja, Alvaro; Tahirai, Daniela; Zykai, Rountina; Bakalis, Nick; Stefanopoulos, Nikolaos
2015-12-01
Although statistical knowledge and skills are necessary for promoting evidence-based practice, health sciences students have expressed anxiety about statistics courses, which may hinder their learning of statistical concepts. To evaluate the effects of a biostatistics course on nursing students' attitudes toward statistics and to explore the association between these attitudes and their performance in the course examination. One-group quasi-experimental pre-test/post-test design. Undergraduate nursing students of the fifth or higher semester of studies, who attended a biostatistics course. Participants were asked to complete the pre-test and post-test forms of The Survey of Attitudes Toward Statistics (SATS)-36 scale at the beginning and end of the course respectively. Pre-test and post-test scale scores were compared, while correlations between post-test scores and participants' examination performance were estimated. Among 156 participants, post-test scores of the overall SATS-36 scale and of the Affect, Cognitive Competence, Interest and Effort components were significantly higher than pre-test ones, indicating that the course was followed by more positive attitudes toward statistics. Among 104 students who participated in the examination, higher post-test scores of the overall SATS-36 scale and of the Affect, Difficulty, Interest and Effort components were significantly but weakly correlated with higher examination performance. Students' attitudes toward statistics can be improved through appropriate biostatistics courses, while positive attitudes contribute to higher course achievements and possibly to improved statistical skills in later professional life. Copyright © 2015 Elsevier Ltd. All rights reserved.
Statistical learning and selective inference.
Taylor, Jonathan; Tibshirani, Robert J
2015-06-23
We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
Hashim, Muhammad Jawad
2010-09-01
Post-hoc secondary data analysis with no prespecified hypotheses has been discouraged by textbook authors and journal editors alike. Unfortunately no single term describes this phenomenon succinctly. I would like to coin the term "sigsearch" to define this practice and bring it within the teaching lexicon of statistics courses. Sigsearch would include any unplanned, post-hoc search for statistical significance using multiple comparisons of subgroups. It would also include data analysis with outcomes other than the prespecified primary outcome measure of a study as well as secondary data analyses of earlier research.
Naik, Ganesh R; Kumar, Dinesh K
2011-01-01
The electromyography (EMG) signal provides information about the performance of muscles and nerves. The shape of the muscle signal and motor unit action potential (MUAP) varies due to movement of the electrode position or changes in contraction level. This research evaluates the non-Gaussianity of the surface electromyogram (sEMG) signal using higher order statistics (HOS) parameters. To achieve this, experiments were conducted for four different finger and wrist actions at different levels of maximum voluntary contraction (MVC). Our experimental analysis shows that at constant force and for non-fatiguing contractions, the probability density functions (PDF) of sEMG signals were non-Gaussian. For lower contraction levels (below 30% of MVC), the PDF tends toward Gaussian. These measures were verified by computing kurtosis values for different MVCs.
National Center for Educational Statistics (DHEW/OE), Washington, DC.
In response to needs expressed by the community of higher education institutions, the National Center for Educational Statistics has produced early estimates of a selected group of mean salaries of instructional faculty in institutions of higher education in 1972-73. The number and salaries of male and female instructional staff by rank are of…
Changing world extreme temperature statistics
Finkel, J. M.; Katz, J. I.
2018-04-01
We use the Global Historical Climatology Network--daily database to calculate a nonparametric statistic that describes the rate at which all-time daily high and low temperature records have been set in nine geographic regions (continents or major portions of continents) during periods mostly from the mid-20th Century to the present. This statistic was defined in our earlier work on temperature records in the 48 contiguous United States. In contrast to this earlier work, we find that in every region except North America all-time high records were set at a rate significantly (at least $3\sigma$) higher than in the null hypothesis of a stationary climate. Except in Antarctica, all-time low records were set at a rate significantly lower than in the null hypothesis. In Europe, North Africa and North Asia the rate of setting new all-time highs increased suddenly in the 1990s, suggesting a change in regional climate regime; in most other regions there was a steadier increase.
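The stationary-climate null hypothesis invoked here has a classical closed form: for i.i.d. annual values, year k sets a new record with probability 1/k, so the expected number of records in n years is the harmonic number H_n. The simulation below is a toy check of that fact, not the GHCN analysis.

```python
import random

def record_counts(series):
    """Number of times a new all-time high is set in a series."""
    records, best = 0, float("-inf")
    for x in series:
        if x > best:
            records, best = records + 1, x
    return records

# Null hypothesis of a stationary climate: annual maxima are i.i.d.,
# so the expected record count over n years is H_n = 1 + 1/2 + ... + 1/n.
random.seed(1)
n, trials = 70, 2000
mean_records = sum(record_counts([random.random() for _ in range(n)])
                   for _ in range(trials)) / trials
H_n = sum(1.0 / k for k in range(1, n + 1))
```

Observed record rates well above this H_n baseline (for highs) or below it (for lows) are what the paper's significance statements quantify.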
DEFF Research Database (Denmark)
Nielsen, Tine; Kreiner, Svend
Short abstract: Motivated by experience with students’ psychological barriers to learning statistics, we modified and extended the Statistical Anxiety Rating Scale (STARS) to develop a contemporary Danish measure of attitudes toward and relationship with statistics for use with higher education students...... with evidence of DIF in all cases: one TCA-item functioned differentially relative to age, one WS-item functioned differentially relative to statistics course (first or second), and two IA-items functioned differentially relative to statistics course and academic discipline (sociology, public health...
Directory of Open Access Journals (Sweden)
E. A. Tatokchin
2017-01-01
Full Text Available The development of modern educational technologies, driven by the broad introduction of computer testing and of distance education, makes a revision of methods for examining students necessary. The paper argues for a transition to mathematical assessment criteria that are free of subjectivity, reviews the problems arising in this task, and proposes approaches to solving them. The greatest attention is paid to the problem of objectively transforming the expert's rating estimates onto the student's assessment scale. The discussion concludes that the solution lies in building specialized intelligent systems. The basis for constructing such a system is a mathematical model of a self-organizing nonequilibrium dissipative system, here a group of students: the dissipative character is provided by the constant influx of new test items from the expert, and the nonequilibrium character by the individual psychological characteristics of the students in the group. As a result, the system self-organizes into stable patterns, which make it possible, relying on large amounts of data, to obtain a statistically significant assessment of each student. To justify the proposed approach, the paper presents a statistical analysis of the test results of a large sample of students (n > 90). The conclusions from this analysis allowed the development of an intelligent system for statistically significant examination of student performance, based on a data clustering algorithm (k-means) over three key parameters. It is shown that this approach allows a dynamic and objective expert evaluation.
Directory of Open Access Journals (Sweden)
M. Amate
2007-01-01
Full Text Available An original algorithm for the detection of small objects in a noisy background is proposed. Its application to underwater object detection by sonar imaging is addressed. This new method is based on the use of higher-order statistics (HOS) that are locally estimated on the images. The proposed algorithm is divided into two steps. In a first step, HOS (skewness and kurtosis) are estimated locally using a square sliding computation window. Small deterministic objects have different statistical properties from the background; they are thus highlighted. The influence of the signal-to-noise ratio (SNR) on the results is studied in the case of Gaussian noise. Mathematical expressions of the estimators and of the expected performances are derived and are experimentally confirmed. In a second step, the results are focused by a matched filter using a theoretical model. This enables the precise localization of the regions of interest. The proposed method generalizes to other statistical distributions, and we derive the theoretical expressions of the HOS estimators in the case of a Weibull distribution (both when only noise is present and when a small deterministic object is present within the filtering window). This enables the application of the proposed technique to the processing of synthetic aperture sonar data containing underwater mines whose echoes have to be detected and located. Results on real data sets are presented and quantitatively evaluated using receiver operating characteristic (ROC) curves.
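The first step (local HOS estimation in a sliding window) can be sketched as follows. The window size, image values, and the use of kurtosis alone are illustrative simplifications of the paper's two-step method, not its implementation.

```python
import random

def local_kurtosis(image, w):
    """Pearson kurtosis estimated in a (2w+1) x (2w+1) sliding window.
    A Gaussian background gives values near 3; a small deterministic
    object pushes the local estimate far above that."""
    h, n = len(image), len(image[0])
    out = [[0.0] * n for _ in range(h)]
    for r in range(w, h - w):
        for c in range(w, n - w):
            vals = [image[r + dr][c + dc]
                    for dr in range(-w, w + 1) for dc in range(-w, w + 1)]
            m = sum(vals) / len(vals)
            m2 = sum((v - m) ** 2 for v in vals) / len(vals)
            m4 = sum((v - m) ** 4 for v in vals) / len(vals)
            out[r][c] = m4 / (m2 * m2) if m2 > 0 else 0.0
    return out

# Gaussian noise background with one strong point echo in the centre.
random.seed(0)
img = [[random.gauss(0.0, 1.0) for _ in range(11)] for _ in range(11)]
img[5][5] = 50.0
kurt_map = local_kurtosis(img, 2)
```

Windows covering the echo show a kurtosis far above the Gaussian background level, which is the highlighting effect the abstract describes before the matched-filter focusing step.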
Directory of Open Access Journals (Sweden)
Anita Lindmark
Full Text Available When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations, while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical
Lindmark, Anita; van Rompaye, Bart; Goetghebeur, Els; Glader, Eva-Lotta; Eriksson, Marie
2016-01-01
When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical confidence when
Directory of Open Access Journals (Sweden)
Laura Badenes-Ribera
2018-06-01
Full Text Available Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge about effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to a misinterpretation of the results. In addition, it also found that, although the use of ES is becoming generalized, the same thing is not true for CIs. Finally, academics with greater knowledge about ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we elaborated an on-line survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, their CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women; mean age 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequent ES statistics mentioned were Cohen's d and R2/η2, which can have outliers or show non-normality or violate statistical assumptions. In addition, academics showed poor knowledge about meta-analytic displays (e.g., forest plot and funnel plot) and quality checklists for studies. Finally, academics with higher-level knowledge about ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not
Arismendi, Ivan; Johnson, Sherri L.; Dunham, Jason B.
2015-01-01
Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
Directory of Open Access Journals (Sweden)
Sadreyev Ruslan I
2004-08-01
Full Text Available Abstract Background Profile-based analysis of multiple sequence alignments (MSA) allows for accurate comparison of protein families. Here, we address the problems of detecting statistically confident dissimilarities between (1) an MSA position and a set of predicted residue frequencies, and (2) two MSA positions. These problems are important for (i) evaluation and optimization of methods predicting residue occurrence at protein positions; (ii) detection of potentially misaligned regions in automatically produced alignments and their further refinement; and (iii) detection of sites that determine functional or structural specificity in two related families. Results For problems (1) and (2), we propose analytical estimates of P-value and apply them to the detection of significant positional dissimilarities in various experimental situations. (a) We compare structure-based predictions of residue propensities at a protein position to the actual residue frequencies in the MSA of homologs. (b) We evaluate our method by the ability to detect erroneous position matches produced by an automatic sequence aligner. (c) We compare MSA positions that correspond to residues aligned by automatic structure aligners. (d) We compare MSA positions that are aligned by high-quality manual superposition of structures. Detected dissimilarities reveal shortcomings of the automatic methods for residue frequency prediction and alignment construction. For the high-quality structural alignments, the dissimilarities suggest sites of potential functional or structural importance. Conclusion The proposed computational method is of significant potential value for the analysis of protein families.
Fang, Yongxiang; Wit, Ernst
2008-01-01
Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher’s statistic is markedly more sensitive to small p-values than to large ones, so a single small p-value may overrule the other p-values
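Fisher's method combines k independent p-values via X = -2 Σ ln p_i, which follows a chi-square distribution with 2k degrees of freedom under the global null. For even degrees of freedom the chi-square survival function has a closed form, so a sketch needs only the standard library.

```python
import math

def fisher_combined(pvals):
    """Fisher's combined probability test.
    X = -2 * sum(ln p_i) ~ chi-square with 2k df under the global null;
    for 2k df the survival function is exp(-x/2) * sum_{i<k} (x/2)^i / i!."""
    k = len(pvals)
    x = -2.0 * sum(math.log(p) for p in pvals)
    half = x / 2.0
    p_combined = math.exp(-half) * sum(half ** i / math.factorial(i)
                                       for i in range(k))
    return x, p_combined
```

The sensitivity asymmetry the abstract describes is easy to see: one p-value of 0.001 among three of 0.5 already makes the combination significant at the 5% level, whereas no collection of moderately large p-values can pull a combination far in the other direction.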
Statistical determination of significant curved I-girder bridge seismic response parameters
Seo, Junwon
2013-06-01
Curved steel bridges are commonly used at interchanges in transportation networks and more of these structures continue to be designed and built in the United States. Though the use of these bridges continues to increase in locations that experience high seismicity, the effects of curvature and other parameters on their seismic behaviors have been neglected in current risk assessment tools. These tools can evaluate the seismic vulnerability of a transportation network using fragility curves. One critical component of fragility curve development for curved steel bridges is the completion of sensitivity analyses that help identify influential parameters related to their seismic response. In this study, an accessible inventory of existing curved steel girder bridges located primarily in the Mid-Atlantic United States (MAUS) was used to establish statistical characteristics used as inputs for a seismic sensitivity study. Critical seismic response quantities were captured using 3D nonlinear finite element models. Influential parameters from these quantities were identified using statistical tools that incorporate experimental Plackett-Burman Design (PBD), which included Pareto optimal plots and prediction profiler techniques. The findings revealed that the potential variation in the influential parameters included number of spans, radius of curvature, maximum span length, girder spacing, and cross-frame spacing. These parameters showed varying levels of influence on the critical bridge response.
Are studies reporting significant results more likely to be published?
Koletsi, Despina; Karagianni, Anthi; Pandis, Nikolaos; Makou, Margarita; Polychronopoulou, Argy; Eliades, Theodore
2009-11-01
Our objective was to assess the hypothesis that there are variations of the proportion of articles reporting a significant effect, with a higher percentage of those articles published in journals with impact factors. The contents of 5 orthodontic journals (American Journal of Orthodontics and Dentofacial Orthopedics, Angle Orthodontist, European Journal of Orthodontics, Journal of Orthodontics, and Orthodontics and Craniofacial Research), published between 2004 and 2008, were hand-searched. Articles with statistical analysis of data were included in the study and classified into 4 categories: behavior and psychology, biomaterials and biomechanics, diagnostic procedures and treatment, and craniofacial growth, morphology, and genetics. In total, 2622 articles were examined, with 1785 included in the analysis. Univariate and multivariate logistic regression analyses were applied with statistical significance as the dependent variable, and whether the journal had an impact factor, the subject, and the year were the independent predictors. A higher percentage of articles showed significant results relative to those without significant associations (on average, 88% vs 12%) for those journals. Overall, these journals published significantly more studies with significant results, ranging from 75% to 90% (P = 0.02). Multivariate modeling showed that journals with impact factors had a 100% increased probability of publishing a statistically significant result compared with journals with no impact factor (odds ratio [OR], 1.99; 95% CI, 1.19-3.31). Compared with articles on biomaterials and biomechanics, all other subject categories showed lower probabilities of significant results. Nonsignificant findings in behavior and psychology and diagnosis and treatment were 1.8 (OR, 1.75; 95% CI, 1.51-2.67) and 3.5 (OR, 3.50; 95% CI, 2.27-5.37) times more likely to be published, respectively. Journals seem to prefer reporting significant results; this might be because of authors
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Directory of Open Access Journals (Sweden)
Xiao-Juan Wang
2017-09-01
Full Text Available Objective: To investigate the value and significance of serum CEA, CA125, SCC-Ag, CA199 and CYFRA21-1 in the diagnosis of cervical cancer by comparing the detection of five serum markers. Methods: A total of 108 cases were divided into three groups, including 60 cervical cancer patients and 20 cervical intraepithelial neoplasia (CIN) patients treated in our hospital from September 2015 to September 2016, and 28 healthy women. Radioimmunoassay was used to detect and compare the serum levels of CA125, CA199 and CYFRA21-1, and the ELISA method was used to detect and compare the serum levels of SCC-Ag and CEA. Results: (1) There was no statistically significant difference in the serum CEA, CA125, SCC-Ag, CA199 and CYFRA21-1 levels between the CIN group and the control group. The serum CEA, CA125, SCC-Ag, CA199 and CYFRA21-1 levels of cervical cancer patients were significantly higher than those of the other two groups. The differences were statistically significant. (2) There were statistically significant differences in the serum CEA, CA125, SCC-Ag, CA199 and CYFRA21-1 levels between the different cervical pathological type groups. The serum CA125, CA199 and CEA levels of cervical glandular cancer patients were significantly higher than those of the other two groups. The differences were statistically significant. The serum SCC-Ag and CYFRA21-1 levels of cervical squamous cancer patients were significantly higher than those of the other two groups. The differences were statistically significant. Conclusion: The serum CEA, CA125, SCC-Ag, CA199 and CYFRA21-1 levels of cervical cancer patients were significantly higher than those of cervical intraepithelial neoplasia patients and healthy women. The serum CA125, CA199 and CEA levels of cervical glandular cancer patients were significantly higher, and the serum SCC-Ag and CYFRA21-1 levels of cervical squamous cancer patients were significantly higher. The five tumor markers can be used in the diagnosis of cervical cancer and they are also worthy in distinguishing cervical pathological types.
Exclusion statistics and integrable models
International Nuclear Information System (INIS)
Mashkevich, S.
1998-01-01
The definition of exclusion statistics, as given by Haldane, allows for a statistical interaction between distinguishable particles (multi-species statistics). The thermodynamic quantities for such statistics can be evaluated exactly. The explicit expressions for the cluster coefficients are presented. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models. The interesting questions of generalizing this correspondence onto the higher-dimensional and the multi-species cases remain essentially open
Higher-order turbulence statistics of wave–current flow over a submerged hemisphere
Energy Technology Data Exchange (ETDEWEB)
Barman, Krishnendu; Debnath, Koustuv; Mazumder, Bijoy S, E-mail: debnath_koustuv@yahoo.com [Department of Aerospace Engineering and Applied Mechanics, Indian Institute of Engineering Science and Technology, Shibpur, Howrah 711103, West Bengal (India)
2017-04-15
Higher-order turbulence characteristics such as turbulence production, turbulence kinetic energy flux, third order moments and velocity spectra associated with turbulent bursting events due to the influence of a submerged hemisphere under wave–current interactions are presented. The velocity components were measured using three dimensional (3D) 16 MHz micro-acoustic Doppler velocimetry (Micro-ADV). In the wave–current interactions, the contributions of turbulent bursting events such as ejections and sweeps significantly reduce in comparison to the current-only case. The distributions of the mean time intervals of ejection and sweeping events are found to alter due to the superposition of surface waves. Results also depict that the turbulence production in the wake region of the hemisphere reduces remarkably, due to the superposition of surface waves on the current. Further, spectral and co-spectral analysis demonstrates that there is a significant reduction of power spectral peak for both longitudinal and bottom-normal velocities upon superposition of surface waves, which signifies a remarkable change in energy distribution between different frequencies of waves. (paper)
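The ejection and sweep events referred to above are conventionally identified by quadrant analysis of the velocity fluctuations (u', w'); a minimal sketch under the usual sign conventions, with illustrative function and argument names:

```python
def quadrant_contributions(u_prime, w_prime):
    """Fractional contributions of ejections (Q2: u' < 0, w' > 0) and
    sweeps (Q4: u' > 0, w' < 0) to the total Reynolds-stress sum of -u'w'."""
    total = sum(-u * w for u, w in zip(u_prime, w_prime))
    ejections = sum(-u * w for u, w in zip(u_prime, w_prime) if u < 0 and w > 0)
    sweeps = sum(-u * w for u, w in zip(u_prime, w_prime) if u > 0 and w < 0)
    return ejections / total, sweeps / total
```

Comparing these fractions between the current-only and wave-current cases is one way to quantify the reduction in bursting-event contributions the abstract describes.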
The prevalence of coeliac disease is significantly higher in children compared with adults.
Mariné, M; Farre, C; Alsina, M; Vilar, P; Cortijo, M; Salas, A; Fernández-Bañares, F; Rosinach, M; Santaolalla, R; Loras, C; Marquès, T; Cusí, V; Hernández, M I; Carrasco, A; Ribes, J; Viver, J M; Esteve, M
2011-02-01
Some limited studies of coeliac disease have shown higher frequency of coeliac disease in infancy and adolescence than in adulthood. This finding has remained unnoticed and not adequately demonstrated. To assess whether there are age and gender differences in coeliac disease prevalence. A total of 4230 subjects were included consecutively (1 to ≥80 years old) reproducing the reference population by age and gender. Sample size was calculated assuming a population-based coeliac disease prevalence of 1:250. After an interim analysis, the paediatric sample was expanded (2010 children) due to high prevalence in this group. Anti-transglutaminase and antiendomysial antibodies were determined and duodenal biopsy was performed if positive. Log-linear models were fitted to coeliac disease prevalence by age allowing calculation of percentage change of prevalence. Differences between groups were compared using Chi-squared test. Twenty-one subjects had coeliac disease (male/female 1:2.5). Coeliac disease prevalence in the total population was 1:204. Coeliac disease prevalence was higher in children (1:71) than in adults (1:357) (P = 0.00005). A significant decrease of prevalence in older generations was observed [change of prevalence by age of -5% (95% CI: -7.58 to -2.42%)]. In the paediatric expanded group (1-14 years), a decrease of coeliac disease prevalence was also observed [prevalence change: -17% (95% CI: -25.02 to -6.10)]. The prevalence of coeliac disease in childhood was five times higher than in adults. Whether this difference is due to environmental factors influencing infancy, or latency of coeliac disease in adulthood, remains to be demonstrated in prospective longitudinal studies. © 2010 Blackwell Publishing Ltd.
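The child-versus-adult prevalence comparison above is, in essence, a two-proportion test; below is a hedged sketch with hypothetical counts chosen only to approximate the reported 1:71 and 1:357 rates (the exact group sizes are not taken from the paper):

```python
import math

def two_proportion_z(cases1, n1, cases2, n2):
    """Two-sided two-proportion z-test using the pooled normal approximation."""
    p1, p2 = cases1 / n1, cases2 / n2
    pooled = (cases1 + cases2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
```

With illustrative counts such as 28/2010 children versus 6/2142 adults, the two-sided p-value lands in the same order of magnitude as the paper's P = 0.00005.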
Siddiqi, Ariba; Arjunan, Sridhar P; Kumar, Dinesh K
2016-08-01
Age-associated changes in the surface electromyogram (sEMG) of the Tibialis Anterior (TA) muscle can be attributed to neuromuscular alterations that precede strength loss. We have used our sEMG model of the Tibialis Anterior to interpret the age-related changes and compared it with the experimental sEMG. Eighteen young (20-30 years) and 18 older (60-85 years) participants performed isometric dorsiflexion at 6 different percentage levels of maximum voluntary contraction (MVC), and their sEMG from the TA muscle was recorded. Six different age-related changes in the neuromuscular system were simulated using the sEMG model at the same MVCs as the experiment. The maximal power of the spectrum, Gaussianity and Linearity test statistics were computed from the simulated and experimental sEMG. A correlation analysis at α=0.05 was performed between the simulated and experimental age-related changes in the sEMG features. The results show that the loss of motor units was distinguished by the Gaussianity and Linearity test statistics, while the maximal power of the PSD distinguished between the muscular factors. The simulated condition of 40% loss of motor units with the number of fast fibers halved best correlated with the age-related change observed in the experimental sEMG higher order statistical features. The simulated aging condition found by this study corresponds with the moderate motor unit remodelling and negligible strength loss reported in literature for cohorts aged 60-70 years.
Directory of Open Access Journals (Sweden)
Leitner Dietmar
2005-04-01
Full Text Available Abstract Background A reliable prediction of the Xaa-Pro peptide bond conformation would be a useful tool for many protein structure calculation methods. We have analyzed the Protein Data Bank and show that the combined use of sequential and structural information has a predictive value for the assessment of the cis versus trans peptide bond conformation of Xaa-Pro within proteins. For the analysis of the data sets different statistical methods such as the calculation of the Chou-Fasman parameters and occurrence matrices were used. Furthermore we analyzed the relationship between the relative solvent accessibility and the relative occurrence of prolines in the cis and in the trans conformation. Results One of the main results of the statistical investigations is the ranking of the secondary structure and sequence information with respect to the prediction of the Xaa-Pro peptide bond conformation. We observed a significant impact of secondary structure information on the occurrence of the Xaa-Pro peptide bond conformation, while the sequence information of amino acids neighboring proline is of little predictive value for the conformation of this bond. Conclusion In this work, we present an extensive analysis of the occurrence of the cis and trans proline conformation in proteins. Based on the data set, we derived patterns and rules for a possible prediction of the proline conformation. Upon adoption of the Chou-Fasman parameters, we are able to derive statistically relevant correlations between the secondary structure of amino acid fragments and the Xaa-Pro peptide bond conformation.
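A Chou-Fasman-style propensity is simply a residue's frequency in one conformational class divided by its overall frequency; a minimal sketch with made-up counts (P > 1 marks residues over-represented next to cis prolines):

```python
def cis_propensity(cis_counts, trans_counts):
    """Chou-Fasman-style propensity: how over-represented each residue is
    at positions adjacent to a cis Xaa-Pro bond, relative to its overall
    frequency across both conformations."""
    total_cis = sum(cis_counts.values())
    totals = {r: cis_counts[r] + trans_counts.get(r, 0) for r in cis_counts}
    grand = sum(totals.values())
    return {r: (cis_counts[r] / total_cis) / (totals[r] / grand)
            for r in cis_counts}
```

The residue names and counts in any call are purely illustrative; the paper derives such parameters from Protein Data Bank statistics.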
CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY
Directory of Open Access Journals (Sweden)
ILEANA BRUDIU
2009-05-01
Full Text Available Estimating parameters with confidence intervals and testing statistical hypotheses are both used in statistical analysis to draw conclusions about a population from a sample. The case study presented in this paper aims to highlight the importance of the sample size used in the study and how it is reflected in the results obtained from confidence intervals and hypothesis testing. While statistical hypothesis testing only gives a "yes" or "no" answer to certain questions, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and qualifies findings as "marginally significant" or "almost significant" (p very close to 0.05).
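The point that an interval carries more information than a bare test verdict can be illustrated with a normal-approximation confidence interval for a mean; this is a generic sketch, not the paper's case-study data:

```python
import math

def mean_ci(sample, z=1.96):
    """Approximate 95% confidence interval for the mean (normal approximation).

    The interval's half-width shrinks with sqrt(n), which makes the
    sample-size effect the abstract discusses directly visible.
    """
    n = len(sample)
    mean = sum(sample) / n
    var = sum((v - mean) ** 2 for v in sample) / (n - 1)  # sample variance
    half = z * math.sqrt(var / n)
    return mean - half, mean + half
```

Quadrupling the sample size (at the same spread) halves the interval width, turning a "marginally significant" verdict into a precise estimate.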
Piotrowska-Piatek, Agnieszka
2017-01-01
In the context of the ongoing changes in the management systems of higher education, the issue of higher education institutions' (HEIs) relationships with external stakeholders are of key importance. This article discusses this problem from the perspective of Polish higher education system. The aim of it is to answer the following questions: (1)…
Predictors of Career Adaptability Skill among Higher Education Students in Nigeria
Directory of Open Access Journals (Sweden)
Amos Shaibu Ebenehi
2016-12-01
Full Text Available This paper examined predictors of career adaptability skill among higher education students in Nigeria. A sample of 603 higher education students randomly selected from six colleges of education in Nigeria participated in this study. A self-reported questionnaire was used for data collection, and multiple linear regression analysis was used to analyze the data. Results indicated that 33.3% of career adaptability skill was explained by the model. Four out of the five predictor variables significantly predicted career adaptability skill among higher education students in Nigeria. Among the four predictors, career self-efficacy sources was the most statistically significant predictor of career adaptability skill among higher education students in Nigeria, followed by personal goal orientation, career future concern, and perceived social support respectively. Vocational identity did not statistically predict career adaptability skill among higher education students in Nigeria. The study suggested that similar studies should be replicated in other parts of the world in view of the importance of career adaptability skill to the smooth transition of graduates from school to the labor market. The study concluded by calling on stakeholders of higher institutions in Nigeria to provide a career exploration database for the students and to encourage career intervention programs in order to enhance career adaptability skill among the students.
Statistics in the Workplace: A Survey of Use by Recent Graduates with Higher Degrees
Harraway, John A.; Barker, Richard J.
2005-01-01
A postal survey was conducted regarding statistical techniques, research methods and software used in the workplace by 913 graduates with PhD and Masters degrees in the biological sciences, psychology, business, economics, and statistics. The study identified gaps between topics and techniques learned at university and those used in the workplace,…
SyntEyes KTC: higher order statistical eye model for developing keratoconus.
Rozema, Jos J; Rodriguez, Pablo; Ruiz Hidalgo, Irene; Navarro, Rafael; Tassignon, Marie-José; Koppen, Carina
2017-05-01
To present and validate a stochastic eye model for developing keratoconus to e.g. improve optical corrective strategies. This could be particularly useful for researchers that do not have access to original keratoconic data. The Scheimpflug tomography, ocular biometry and wavefront of 145 keratoconic right eyes were collected. These data were processed using principal component analysis for parameter reduction, followed by a multivariate Gaussian fit that produces a stochastic model for keratoconus (SyntEyes KTC). The output of this model is filtered to remove the occasional incorrect topography patterns by either an automatic or manual procedure. Finally, the output of this keratoconus model is matched to that of the original model for normal eyes using the non-corneal biometry to obtain a description of keratoconus development. The synthetic data generated by the model were found to be significantly equal to the original data (non-parametric Mann-Whitney equivalence test; 145/154 passed). The variability of the synthetic data, however, was often significantly less than that of the original data, especially for the higher order Zernike terms of corneal elevation (non-parametric Levene test; p eyes with incorrect topographies. Interpolation between matched pairs of normal and keratoconic SyntEyes appears to provide an adequate model for keratoconus progression. The synthetic data provided by the proposed keratoconus model closely resembles actual clinical data and may be used for a range of research applications when (sufficient) real data is not available. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.
Yilmaz, Ferkan; Tabassum, Hina; Alouini, Mohamed-Slim
2014-01-01
Higher order statistics (HOS) of the channel capacity provide useful information regarding the level of reliability of signal transmission at a particular rate. In this paper, we propose a novel and unified analysis, which is based on the moment-generating function (MGF) approach, to efficiently and accurately compute the HOS of the channel capacity for amplify-and-forward (AF) multihop transmission over generalized fading channels. More precisely, our easy-to-use and tractable mathematical formalism requires only the reciprocal MGFs of the transmission hop signal-to-noise ratio (SNR). Numerical and simulation results, which are performed to exemplify the usefulness of the proposed MGF-based analysis, are shown to be in perfect agreement. © 2013 IEEE.
International Nuclear Information System (INIS)
Parvan, A.S.
2016-01-01
The Tsallis statistics was applied to describe the experimental data on the transverse momentum distributions of hadrons. We considered the energy dependence of the parameters of the Tsallis-factorized statistics, which is now widely used for the description of the experimental transverse momentum distributions of hadrons, and the Tsallis statistics for the charged pions produced in pp collisions at high energies. We found that the results of the Tsallis-factorized statistics deviate from the results of the Tsallis statistics only at low NA61/SHINE energies when the value of the entropic parameter is close to unity. At higher energies, when the value of the entropic parameter deviates essentially from unity, the Tsallis-factorized statistics satisfactorily recovers the results of the Tsallis statistics. (orig.)
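One common Tsallis-like parameterization of the transverse momentum spectrum can be sketched as below; conventions for the power-law exponent vary across the literature, so this is an illustration rather than the paper's exact formula. In the limit q → 1 the power law reduces to the Boltzmann factor exp(-mT/T):

```python
import math

def tsallis_yield(pt, mass, T, q):
    """Unnormalized Tsallis-like transverse momentum density:
        pt * mt * [1 + (q - 1) * mt / T] ** (-q / (q - 1)),
    with transverse mass mt = sqrt(pt^2 + mass^2).
    Reduces to a Boltzmann factor pt * mt * exp(-mt / T) as q -> 1."""
    mt = math.sqrt(pt * pt + mass * mass)
    return pt * mt * (1.0 + (q - 1.0) * mt / T) ** (-q / (q - 1.0))
```

The entropic parameter q controls the high-pT power-law tail; the abstract's observation is that the factorized and full Tsallis treatments agree once q deviates appreciably from unity.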
Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior
2011-09-23
Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs) that are regions inside exons enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases determined by codon usage and constraints that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
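The island-selection step can be illustrated by a greedy choice of non-overlapping candidate regions ordered by significance; this is a simplified stand-in for the paper's algorithm, which additionally optimizes a biologically motivated criterion and controls the false discovery rate:

```python
def greedy_select_islands(regions):
    """Greedily pick mutually non-overlapping regions, most significant first.

    `regions` is a list of (start, end, pvalue) tuples with `end` exclusive.
    A hypothetical sketch of greedy island selection, not the paper's code.
    """
    chosen = []
    for start, end, p in sorted(regions, key=lambda r: r[2]):
        # Keep the candidate only if it overlaps none of the regions taken so far.
        if all(end <= s or start >= e for s, e, _ in chosen):
            chosen.append((start, end, p))
    return sorted(chosen)
```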
High order statistical signatures from source-driven measurements of subcritical fissile systems
International Nuclear Information System (INIS)
Mattingly, J.K.
1998-01-01
This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements
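The successively higher-order counting statistics referred to above start from the central moments of the detector count distribution; a minimal sketch (the actual source-driven signatures involve correlations with the introduced source as well):

```python
def counting_moments(counts):
    """Mean and second/third central moments of a list of detector counts.

    Higher moments respond more strongly to the correlated (fission-chain)
    component of the count distribution, which is what makes them more
    sensitive to reactivity than the mean count rate alone.
    """
    n = len(counts)
    mean = sum(counts) / n
    m2 = sum((c - mean) ** 2 for c in counts) / n
    m3 = sum((c - mean) ** 3 for c in counts) / n
    return mean, m2, m3
```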
Jiang, Haiping; Marot, Julien; Fossati, Caroline; Bourennane, Salah
2011-12-01
In real-world conditions, contours are most often blurred in digital images because of acquisition conditions such as movement, light transmission environment, and defocus. Among image segmentation methods, Hough transform requires a computational load which increases with the number of noise pixels, level set methods also require a high computational load, and some other methods assume that the contours are one-pixel wide. For the first time, we retrieve the characteristics of multiple possibly concentric blurred circles. We face correlated noise environment, to get closer to real-world conditions. For this, we model a blurred circle by a few parameters--center coordinates, radius, and spread--which characterize its mean position and gray level variations. We derive the signal model which results from signal generation on circular antenna. Linear antennas provide the center coordinates. To retrieve the circle radii, we adapt the second-order statistics TLS-ESPRIT method for non-correlated noise environment, and propose a novel version of TLS-ESPRIT based on higher-order statistics for correlated noise environment. Then, we derive a least-squares criterion and propose an alternating least-squares algorithm to retrieve simultaneously all spread values of concentric circles. Experiments performed on hand-made and real-world images show that the proposed methods outperform the Hough transform and a level set method dedicated to blurred contours in terms of computational load. Moreover, the proposed model and optimization method provide the information of the contour grey level variations.
DEFF Research Database (Denmark)
Bagge, Søren; Lund, Malthe; Rønn, Regin
2012-01-01
Carabid beetles play an important role as consumers of pest organisms in forestry and agriculture. Application of pesticides may negatively affect abundance and activity of carabid beetles, thus reducing their potential beneficial effect. We investigated how abundance and diversity of pitfall trapped carabid beetles (Coleoptera, Carabidae) varied between conventionally and organically managed Caucasian Fir (Abies nordmanniana (Stev.)) plantations, in northern Zealand, Denmark. We recorded significantly higher numbers of carabid beetle specimens and species at conventionally than at organically...
Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu
2015-09-21
Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from next generation sequencing technology based studies. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
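The local trend score at the heart of this kind of analysis can be sketched, in its no-delay form, as a maximum-sum run over matched change signs, found with Kadane's algorithm; eLSA's actual scoring additionally handles time delays and normalizes the score:

```python
def local_trend_score(x, y):
    """Local trend score without delay: convert each series to change signs
    (+1 up, -1 down, 0 flat), then find the maximum-sum contiguous stretch
    of the elementwise products, where matching trends contribute +1 and
    opposite trends -1. A minimal sketch of the scoring step only."""
    sign = lambda d: (d > 0) - (d < 0)
    tx = [sign(b - a) for a, b in zip(x, x[1:])]
    ty = [sign(b - a) for a, b in zip(y, y[1:])]
    best = run = 0
    for s in (a * b for a, b in zip(tx, ty)):
        run = max(0, run + s)   # Kadane's algorithm over the product series
        best = max(best, run)
    return best
```

The permutation test the abstract seeks to replace would recompute this score over many shuffled series; the paper's contribution is an analytic approximation to that score's tail probability.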
Geometric statistical inference
International Nuclear Information System (INIS)
Periwal, Vipul
1999-01-01
A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined
A novel statistic for genome-wide interaction analysis.
Directory of Open Access Journals (Sweden)
Xuesen Wu
2010-09-01
Full Text Available Although great progress in genome-wide association studies (GWAS) has been made, the significant SNP associations identified by GWAS account for only a few percent of the genetic variance, leading many to question where and how we can find the missing heritability. There is increasing interest in genome-wide interaction analysis as a possible source of finding heritability unexplained by current GWAS. However, the existing statistics for testing interaction have low power for genome-wide interaction analysis. To meet the challenges raised by genome-wide interaction analysis, we have developed a novel statistic for testing interaction between two loci (either linked or unlinked). The null distribution and the type I error rates of the new statistic for testing interaction are validated using simulations. Extensive power studies show that the developed statistic has much higher power to detect interaction than classical logistic regression. The results identified 44 and 211 pairs of SNPs showing significant evidence of interactions with FDR<0.001 and 0.001
D'Alessio, Michael
2012-01-01
AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da
Wavelet Transform Based Higher Order Statistical Analysis of Wind and Wave Time Histories
Habib Huseni, Gulamhusenwala; Balaji, Ramakrishnan
2017-10-01
Wind, blowing on the surface of the ocean, imparts the energy to generate the waves. Understanding the wind-wave interactions is essential for an oceanographer. This study involves higher order spectral analyses of wind speed and significant wave height time histories, extracted from the European Centre for Medium-Range Weather Forecast database at an offshore location off the Mumbai coast, through continuous wavelet transform. The time histories were divided by season (pre-monsoon, monsoon, post-monsoon and winter) and the analyses were carried out on the individual data sets to assess the effect of the seasons on the wind-wave interactions. The analysis revealed the frequency coupling of wind speeds and wave heights for the various seasons. The details of the data, analysing technique and results are presented in this paper.
International Nuclear Information System (INIS)
Brekke, L.; Imbo, T.D.
1992-01-01
The authors study the inequivalent quantizations of (1 + 1)-dimensional nonlinear sigma models with space manifold S^1 and target manifold X. If X is multiply connected, these models possess topological solitons. After providing a definition of spin and statistics for these solitons and demonstrating a spin-statistics correlation, we give various examples where the solitons can have exotic statistics. In some of these models, the solitons may obey a generalized version of fractional statistics called ambistatistics. In this paper the relevance of these 2d models to the statistics of vortices in (2 + 1)-dimensional spontaneously broken gauge theories is discussed. The authors close with a discussion concerning the extension of our results to higher dimensions
Kepler Planet Detection Metrics: Statistical Bootstrap Test
Jenkins, Jon M.; Burke, Christopher J.
2016-01-01
This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1Q17) aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1Q17) aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
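The general shape of a bootstrap significance test for a transit-like detection can be sketched as follows (a generic toy with a permutation null, not the actual Kepler SOC Statistical Bootstrap Test; the light curve and detection statistic are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# toy light curve: white noise with a weak 10-sample box "transit" injected
n = 2000
flux = rng.standard_normal(n)
flux[1000:1010] -= 1.5

def detection_stat(y, width=10):
    """Deepest box-average dip, scaled to unit variance under white noise."""
    box = np.convolve(y, np.ones(width) / width, mode="valid")
    return -box.min() * np.sqrt(width)

obs = detection_stat(flux)

# bootstrap null: scrambling the flux destroys any coherent dip
null = np.array([detection_stat(rng.permutation(flux)) for _ in range(500)])
p_value = (1 + np.sum(null >= obs)) / (1 + null.size)
```

The p-value estimates how often pure noise, resampled under the null, produces a dip at least as deep as the observed one.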
After statistics reform : Should we still teach significance testing?
A. Hak (Tony)
2014-01-01
In the longer term, null hypothesis significance testing (NHST) will disappear because p-values are not informative and not replicable. Should we continue to teach in the future the procedures of then abolished routines (i.e., NHST)? Three arguments are discussed for not teaching NHST in
Efficacy of a Word- and Text-Based Intervention for Students With Significant Reading Difficulties.
Vaughn, Sharon; Roberts, Garrett J; Miciak, Jeremy; Taylor, Pat; Fletcher, Jack M
2018-05-01
We examine the efficacy of an intervention to improve word reading and reading comprehension in fourth- and fifth-grade students with significant reading problems. Using a randomized control trial design, we compare the fourth- and fifth-grade reading outcomes of students with severe reading difficulties who were provided a researcher-developed treatment with the reading outcomes of students in a business-as-usual (BAU) comparison condition. A total of 280 fourth- and fifth-grade students were randomly assigned within school in a 1:1 ratio to either the BAU comparison condition (n = 139) or the treatment condition (n = 141). Treatment students were provided small-group tutoring for 30 to 45 minutes for an average of 68 lessons (mean hours of instruction = 44.4, SD = 11.2). Treatment students performed statistically significantly higher than BAU students on a word reading measure (effect size [ES] = 0.58) and a measure of reading fluency (ES = 0.46). Though not statistically significant, effect sizes for students in the treatment condition were consistently higher than for BAU students on decoding measures (ES = 0.06, 0.08), and mixed for comprehension (ES = -0.02, 0.14).
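Effect sizes of the kind reported above (ES = 0.58, 0.46) are standardized mean differences; a minimal Cohen's d sketch with a pooled standard deviation, computed on hypothetical score distributions rather than the study's data:

```python
import numpy as np

def cohens_d(treatment, comparison):
    """Standardized mean difference with a pooled standard deviation."""
    nt, nc = len(treatment), len(comparison)
    pooled_var = ((nt - 1) * np.var(treatment, ddof=1)
                  + (nc - 1) * np.var(comparison, ddof=1)) / (nt + nc - 2)
    return (np.mean(treatment) - np.mean(comparison)) / np.sqrt(pooled_var)

# hypothetical word-reading scores: treatment mean shifted by ~0.6 pooled SD
rng = np.random.default_rng(5)
bau = rng.normal(100.0, 15.0, size=139)   # comparison condition
trt = rng.normal(109.0, 15.0, size=141)   # treatment condition
es = cohens_d(trt, bau)
```

With a true shift of 0.6 SD and these sample sizes, the estimate lands near 0.6 give or take sampling error of roughly 0.12.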
Paechter, Manuela; Macher, Daniel; Martskvishvili, Khatuna; Wimmer, Sigrid; Papousek, Ilona
2017-01-01
In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. Statistics anxiety led to higher procrastination in the structural
Directory of Open Access Journals (Sweden)
Manuela Paechter
2017-07-01
Full Text Available In many social science majors, e.g., psychology, students report high levels of statistics anxiety. However, these majors are often chosen by students who are less prone to mathematics and who might have experienced difficulties and unpleasant feelings in their mathematics courses at school. The present study investigates whether statistics anxiety is a genuine form of anxiety that impairs students' achievements or whether learners mainly transfer previous experiences in mathematics and their anxiety in mathematics to statistics. The relationship between mathematics anxiety and statistics anxiety, their relationship to learning behaviors and to performance in a statistics examination were investigated in a sample of 225 undergraduate psychology students (164 women, 61 men). Data were recorded at three points in time: At the beginning of term students' mathematics anxiety, general proneness to anxiety, school grades, and demographic data were assessed; 2 weeks before the end of term, they completed questionnaires on statistics anxiety and their learning behaviors. At the end of term, examination scores were recorded. Mathematics anxiety and statistics anxiety correlated highly but the comparison of different structural equation models showed that they had genuine and even antagonistic contributions to learning behaviors and performance in the examination. Surprisingly, mathematics anxiety was positively related to performance. It might be that students realized over the course of their first term that knowledge and skills in higher secondary education mathematics are not sufficient to be successful in statistics. Part of mathematics anxiety may then have strengthened positive extrinsic effort motivation by the intention to avoid failure and may have led to higher effort for the exam preparation. However, via statistics anxiety mathematics anxiety also had a negative contribution to performance. Statistics anxiety led to higher procrastination in
Worry, Intolerance of Uncertainty, and Statistics Anxiety
Williams, Amanda S.
2013-01-01
Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…
The Role of Cultural Capital in Higher Education Access and Institutional Choice
Directory of Open Access Journals (Sweden)
Iva Košutić
2017-03-01
Full Text Available This paper aims to explore social inequalities in school achievement and educational decision-making of the final-year students of secondary schools in the City of Zagreb and Zagreb County, Croatia (N = 534). The theoretical framework of the paper was Bourdieu's theory of cultural and social reproduction (1977a). The main objectives were an analysis of the association between the students' cultural capital and their school achievement, and analyses of the predictive power of the cultural capital theory in the context of educational decisions in the transition to tertiary education. In the analysis of school achievement, sequential multiple regression analysis was used, while in the analyses of educational decisions logistic regression analyses were performed (binary and multinomial logistic regression). The results indicated that cultural capital had a statistically significant correlation with school performance. Among the cultural capital indicators, statistically significant predictors of the probability of the intention to enrol in vocational higher education were the material dimension of cultural capital and the naturalness of the students' higher education aspirations. For the prediction of the probability of the intention to enrol in university, significant predictors were embodied cultural capital, the naturalness of the students' higher education aspirations, and the father's educational level. The study results on a selected sample of graduates tend to support Bourdieu's theory of cultural reproduction through education.
Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze
2014-08-01
Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
Directory of Open Access Journals (Sweden)
Vujović Svetlana R.
2013-01-01
Full Text Available This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and the identification of pollution sources/factors, with a view to obtaining better information about water quality and the design of monitoring networks for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and for the interpretation of a water quality data set of natural water bodies obtained during the 2010 monitoring year, covering 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors, which are not directly observable. Factor analysis is applied to physico-chemical parameters of natural water bodies with the aim of classification and data summation, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with Eigenvalues >1, summing to more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor, F1, accounts for 28% of the total variance and represents the hydrochemical dimension of water quality. The second factor, F2, accounts for 18% of the total variance and may be taken as a factor of water eutrophication. The third factor, F3, accounts for 17% of the total variance and represents the influence of point sources of pollution on water quality. The fourth factor, F4, accounts for 13% of the total variance and may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
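The FA/PCA recipe described above, eigen decomposition of the correlation matrix, loadings categorized as strong (>0.75) or moderate (0.50-0.75), and retention of factors with Eigenvalues >1, can be sketched on a synthetic data set (the six parameters and two latent factors below are assumptions for illustration, not the study's 13 monitored parameters):

```python
import numpy as np

rng = np.random.default_rng(1)

# toy water-quality table: 33 sites x 6 parameters driven by 2 latent factors
n_sites = 33
f1 = rng.standard_normal(n_sites)          # e.g. a "mineralization" factor
f2 = rng.standard_normal(n_sites)          # e.g. a "nutrient" factor
X = np.column_stack([
    f1 + 0.2 * rng.standard_normal(n_sites),   # conductivity
    f1 + 0.2 * rng.standard_normal(n_sites),   # hardness
    f1 + 0.3 * rng.standard_normal(n_sites),   # chloride
    f2 + 0.2 * rng.standard_normal(n_sites),   # nitrate
    f2 + 0.3 * rng.standard_normal(n_sites),   # phosphate
    rng.standard_normal(n_sites),              # unrelated parameter
])

# PCA on the correlation matrix (standardized variables)
R = np.corrcoef(X, rowvar=False)
eigval, eigvec = np.linalg.eigh(R)
order = np.argsort(eigval)[::-1]           # eigh returns ascending order
eigval, eigvec = eigval[order], eigvec[:, order]

# loadings of the variables on each principal factor
loadings = eigvec * np.sqrt(eigval)

def category(l):
    a = abs(l)
    return "strong" if a > 0.75 else "moderate" if a > 0.50 else "weak"

n_factors = int(np.sum(eigval > 1.0))      # Eigenvalue > 1 retention rule
explained = eigval[:n_factors].sum() / eigval.sum()
```

Variables built from the same latent factor load strongly on the same principal factor, reproducing the strong/moderate categorization used in the record.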
Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.
1999-01-01
Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc, at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and for MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier
Liu, Wei; Ding, Jinhui
2018-04-01
The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.
... and Statistics Plague in the United States Plague was first introduced ... them at higher risk. Reported Cases of Human Plague - United States, 1970-2016 Since the mid–20th ...
Directory of Open Access Journals (Sweden)
Anna Siri
2016-12-01
Examining the data broken down by gender, the correlations were higher and statistically significant in males than in females. GT-based data for drop-out were best modeled by an ARMA(1,0) model. Considering the cross-correlation of the Canadian regions, all of them were statistically significant at lag 0, apart from New Brunswick, Newfoundland and Labrador, and Prince Edward Island. A number of cross-correlations were statistically significant also at lag −1 (namely, Alberta, Manitoba, New Brunswick and Saskatchewan).
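Cross-correlation at lag 0 and at nonzero lags, as used in this record, pairs one series with a shifted copy of the other; a minimal sketch on synthetic series, where one series is constructed to lead the other by one step:

```python
import numpy as np

def cross_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag] (lag may be negative)."""
    if lag < 0:
        x, y, lag = y, x, -lag
    x0 = x[: len(x) - lag] - np.mean(x[: len(x) - lag])
    y0 = y[lag:] - np.mean(y[lag:])
    return float(x0 @ y0 / np.sqrt((x0 @ x0) * (y0 @ y0)))

rng = np.random.default_rng(2)
n = 200
driver = rng.standard_normal(n + 1)
a = driver[1:]                                           # leading series
b = 0.8 * driver[:-1] + 0.6 * rng.standard_normal(n)     # echoes a one step later

r0 = cross_corr(a, b, 0)   # contemporaneous: near zero here
r1 = cross_corr(a, b, 1)   # a[t] vs b[t + 1]: strong
```

A significant correlation at a nonzero lag, as reported for several Canadian regions, indicates that one series leads the other rather than moving with it.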
Education for Sustainable Development in Higher Education Institutions
Directory of Open Access Journals (Sweden)
César Tapia-Fonllem
2017-01-01
Full Text Available The role that higher education plays in the promotion of sustainable development stands out in the declarations on Education for Sustainable Development (ESD), besides being a research priority in higher education. However, few studies exist that evaluate sustainable lifestyles among university students. The aim of this study was to analyze the mission and vision, processes and actions undertaken to promote sustainability in higher education institutions, and to compare the pro-sustainability orientation (PSO) reported by 360 students in their first or last semester at college. The study was intended to evaluate the influence that four higher education institutions in Sonora, Mexico, have on students' PSO. Results of the study indicate that a coherent PSO factor emerges from the interrelations among pro-environmental dispositional and behavioral variables reported by students. However, university programs and actions do not produce statistically significant differences between freshmen and senior students. Possible reasons explaining the lack of positive influence of those universities on students' PSO are discussed.
A general solution strategy of modified power method for higher mode solutions
International Nuclear Information System (INIS)
Zhang, Peng; Lee, Hyunsuk; Lee, Deokjung
2016-01-01
A general solution strategy of the modified power iteration method for calculating higher eigenmodes has been developed and applied in continuous energy Monte Carlo simulation. The new approach adopts four features: 1) the eigen decomposition of the transfer matrix, 2) weight cancellation for higher modes, 3) population control with higher mode weights, and 4) stabilization of statistical fluctuations using multi-cycle accumulations. The numerical tests of neutron transport eigenvalue problems successfully demonstrate that the new strategy can significantly accelerate the fission source convergence, with stable convergence behavior, while obtaining multiple higher eigenmodes at the same time. The advantages of the new strategy can be summarized as 1) the replacement of the cumbersome solution step of high order polynomial equations required by Booth's original method with a simple matrix eigen decomposition, 2) faster fission source convergence in inactive cycles, 3) more stable behavior in both inactive and active cycles, and 4) smaller variances in active cycles. Advantages 3 and 4 can be attributed to the lower sensitivity of the new strategy to statistical fluctuations due to the multi-cycle accumulations. The application of the modified power method to continuous energy Monte Carlo simulation and the higher eigenmodes up to 4th order are reported for the first time in this paper. -- Highlights: •The modified power method is applied to continuous energy Monte Carlo simulation. •A transfer matrix is introduced to generalize the modified power method. •All-mode-based population control is applied to obtain the higher eigenmodes. •Statistical fluctuations can be greatly reduced using accumulated tally results. •Fission source convergence is accelerated with higher mode solutions.
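In deterministic matrix form, the simplest route to higher eigenmodes is power iteration plus deflation; the sketch below uses a toy symmetric operator as a stand-in for the fission transfer matrix and is not the paper's Monte Carlo machinery:

```python
import numpy as np

def power_iteration(A, n_iter=2000, seed=0):
    """Plain power method: dominant eigenpair of a symmetric matrix A."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    for _ in range(n_iter):
        v = A @ v
        v /= np.linalg.norm(v)
    return v @ A @ v, v            # Rayleigh quotient, eigenvector

# toy symmetric positive semi-definite operator
rng = np.random.default_rng(3)
M = rng.standard_normal((8, 8))
A = M @ M.T

lam1, v1 = power_iteration(A)

# deflation: subtract the fundamental mode so the 2nd mode becomes dominant
A2 = A - lam1 * np.outer(v1, v1)
lam2, _ = power_iteration(A2)

ref = np.sort(np.linalg.eigvalsh(A))[::-1]   # exact spectrum, descending
```

The Monte Carlo difficulty the paper addresses is that higher-mode weights can be negative, which is why it needs weight cancellation and mode-wise population control on top of this basic idea.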
Productivity effects of higher education human capital in selected countries of Sub-Saharan Africa
Directory of Open Access Journals (Sweden)
Koye Gerry Bokana
2017-06-01
Full Text Available This study aimed to analyse the productivity effects of higher education enrolment (HEE), higher education output (HEO) and the associated productivity gap (GP) on selected countries in Sub-Saharan Africa (SSA) over the period between 1981 and 2014. It was hypothesized in the study that HEE and HEO had a statistically significant positive impact on productivity in the selected Sub-Saharan Africa countries over the stated period. Fixed effect Least Square Dummy Variable (LSDV) and a robust version of System Generalized Method of Moments (SYSGMM) were adopted as model estimating techniques. Results from the LSDV model indicated that HEE had no statistically significant positive impact on productivity growth in the twenty-one SSA countries. This non-significance was corrected in the dynamic model, but with negative effects on the growth rate of total factor productivity (TFP). The study further compared the worldwide technological frontier with those of the SSA countries under investigation and discovered that countries like Gabon, Mauritius and Swaziland ranked high, while Burundi needs to improve on its productivity determinants. The major conclusion of this study is therefore that higher education human capital should be supported with strong policy implementation, as this can have a positive impact on productivity growth.
Statistical-Based Insights in Spence’s Theory of Honest Signaling
Directory of Open Access Journals (Sweden)
Mihaela Grecu
2015-09-01
Full Text Available Since Michael Spence revealed the secrets of (dis)honest signalling on the labour market, an increasing body of literature in various fields has struggled to find the best way to solve the game under imperfect information that describes the interaction between the employer and the employee. Despite the value of the signal originally acknowledged by Spence, the university degree, a recent increasing trend in the unemployment rate among graduates of higher education suggests that between higher education and the labour market there may be a less significant connection than universities claim, potentially resulting in a decreasing power of the signal consisting of a university diploma. The aim of this study is to provide statistical evidence of the connection between higher education and the labour market in Romania and to discuss some of the factors that potentially cause young people to choose a particular study program. Based on statistical analysis, we investigate the gap between the number of graduates in Law and the labour market capacity in the field, and draw conclusions regarding the accuracy of the mechanism that leads to equilibrium between supply and demand on the university market.
Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo
The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique: adaptive statistical iterative reconstruction V (ASIR-V). Fifty consecutive oncologic patients acted as case controls, undergoing during their follow-up a computed tomography scan both with ASIR and with ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower with ASIR-V. Adaptive statistical iterative reconstruction V had a higher performance for the subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), the other parameters (image sharpness, diagnostic acceptability, and overall image quality) being similar (P > 0.05). Adaptive statistical iterative reconstruction V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.
Directory of Open Access Journals (Sweden)
Svetlana V. Smirnova
2013-01-01
Full Text Available The features of using information technologies in applied statistics in psychology are considered in the article. Requirements for the statistical preparation of psychology students in the conditions of the information society are analyzed.
Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M
2015-03-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.
Clinicopathological significance of c-MYC in esophageal squamous cell carcinoma.
Lian, Yu; Niu, Xiangdong; Cai, Hui; Yang, Xiaojun; Ma, Haizhong; Ma, Shixun; Zhang, Yupeng; Chen, Yifeng
2017-07-01
Esophageal squamous cell carcinoma is one of the most common malignant tumors. The oncogene c-MYC is thought to be important in the initiation, promotion, and therapy resistance of cancer. In this study, we aim to investigate the clinicopathologic roles of c-MYC in esophageal squamous cell carcinoma tissue. This study is aimed at discovering and analyzing c-MYC expression in a series of human esophageal tissues. A total of 95 esophageal squamous cell carcinoma samples were analyzed by western blotting and immunohistochemistry. Then, the correlation of c-MYC expression with the clinicopathological features of esophageal squamous cell carcinoma patients was statistically analyzed. In most esophageal squamous cell carcinoma cases, c-MYC expression was positive in tumor tissues. The positive rate of c-MYC expression in tumor tissues was 61.05%, obviously higher than in the adjacent normal tissues (8.42%, 8/92) and atypical hyperplasia tissues (19.75%, 16/95). There was a statistically significant difference among adjacent normal tissues, atypical hyperplasia tissues, and tumor tissues. Overexpression of c-MYC was detected in 61.05% (58/95) of esophageal squamous cell carcinomas, which was significantly correlated with the degree of differentiation (p = 0.004). The positive rate of c-MYC expression was 40.0% in well-differentiated esophageal tissues, a statistically significant difference (p = 0.004). The positive rate of c-MYC was 41.5% in T1 + T2 esophageal tissues and 74.1% in T3 + T4 esophageal tissues, a statistically significant difference (p = 0.001). The positive rate of c-MYC was 45.0% in I + II esophageal tissues and 72.2% in III + IV esophageal tissues, a statistically significant difference (p = 0.011). The c-MYC expression strongly correlated with clinical staging (p = 0.011), differentiation degree (p = 0.004), lymph node metastasis (p = 0.003), and invasion depth (p = 0.001) of patients with esophageal squamous cell carcinoma. The c-MYC was
Di Florio, Adriano
2017-10-01
In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented both in the ROOT/RooFit and GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service and a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks theorem may or may not apply because its regularity conditions are not satisfied.
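The boundary case in which the Wilks theorem fails can be reproduced with a few lines of toy Monte Carlo; the sketch below uses a Gaussian signal-strength model with mu >= 0 (purely illustrative, not the CMS fit):

```python
import numpy as np

rng = np.random.default_rng(7)

# toy model: 25 unit-variance Gaussian measurements, signal strength mu >= 0
n, n_toys = 25, 20000
xbar = rng.standard_normal((n_toys, n)).mean(axis=1)  # toys under the null mu = 0
mu_hat = np.clip(xbar, 0.0, None)                     # MLE restricted to mu >= 0
q = n * mu_hat ** 2                                   # -2 log likelihood ratio

# Wilks would predict chi2(1); on the boundary we get 0.5*delta(0) + 0.5*chi2(1)
frac_zero = np.mean(q == 0.0)    # ~0.5 of the toys sit exactly at q = 0
tail = np.mean(q > 2.706)        # chi2(1) 10% point: observed tail is ~5%, not 10%
```

Because the null value mu = 0 sits on the boundary of the parameter space, half the toys give q = 0 and the tail probabilities are half the chi2(1) values, which is exactly the kind of regularity-condition failure the record alludes to.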
Death Certification Errors and the Effect on Mortality Statistics.
McGivern, Lauri; Shulman, Leanne; Carney, Jan K; Shapiro, Steven; Bundock, Elizabeth
Errors in cause and manner of death on death certificates are common and affect families, mortality statistics, and public health research. The primary objective of this study was to characterize errors in the cause and manner of death on death certificates completed by non-Medical Examiners. A secondary objective was to determine the effects of errors on national mortality statistics. We retrospectively compared 601 death certificates completed between July 1, 2015, and January 31, 2016, from the Vermont Electronic Death Registration System with clinical summaries from medical records. Medical Examiners, blinded to the original certificates, reviewed the summaries, generated mock certificates, and compared the mock certificates with the originals. They then graded errors using a scale from 1 to 4 (higher numbers indicating increased impact on interpretation of the cause) to determine the prevalence of minor and major errors. They also compared International Classification of Diseases, 10th Revision (ICD-10) codes on the original certificates with those on the mock certificates. Of 601 original death certificates, 319 (53%) had errors; 305 (51%) had major errors; and 59 (10%) had minor errors. We found no significant differences by certifier type (physician vs nonphysician). We did find significant differences in major errors by place of death, and such errors carry through into national mortality statistics. Surveillance and certifier education must expand beyond local and state efforts. Simplifying and standardizing the underlying literal text for cause of death may improve accuracy, decrease coding errors, and improve national mortality statistics.
STATISTICAL ANALYSIS OF THE HEAVY NEUTRAL ATOMS MEASURED BY IBEX
International Nuclear Information System (INIS)
Park, Jeewoo; Kucharek, Harald; Möbius, Eberhard; Galli, André; Livadiotis, George; Fuselier, Steve A.; McComas, David J.
2015-01-01
We investigate the directional distribution of heavy neutral atoms in the heliosphere by using heavy neutral maps generated with the IBEX-Lo instrument over three years, from 2009 to 2011. The interstellar neutral (ISN) O and Ne gas flow was found in the first-year heavy neutral map at 601 keV, and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps to get a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to exclude background from areas where the heavy neutral signal is statistically significant, and they allow the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O and Ne gas flow. We call this emission the extended tail. It may be an imprint of the secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.
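The signal-to-noise filter is the simplest of the three methods to sketch. A toy version follows; the map size, background level and injected excess are all invented for illustration, not taken from the IBEX data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sky map: Poisson background of 10 counts/pixel plus an
# injected "extended tail" of 30 extra counts in a small patch of pixels.
background = 10.0
counts = rng.poisson(background, size=(18, 36)).astype(float)
counts[5:8, 10:13] += 30

# Signal-to-noise filter: keep pixels where the excess over the expected
# background exceeds k standard deviations of the Poisson fluctuation.
snr = (counts - background) / np.sqrt(background)
significant = snr > 3.0
print(int(significant.sum()))
```

At a 3σ cut the nine injected pixels survive while almost all pure-background pixels are excluded; a confidence-limit or cluster-analysis step would then refine this pixel-level selection.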
DEFF Research Database (Denmark)
Madsen, Tobias
2017-01-01
In the present thesis I develop, implement and apply statistical methods for detecting genomic elements implicated in cancer development and progression. This is done in two separate bodies of work. The first uses the somatic mutation burden to distinguish cancer driver mutations from passenger m...
Significance evaluation in factor graphs
DEFF Research Database (Denmark)
Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet
2017-01-01
…in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. We here address the problem of evaluating statistical significance of observations from factor graph models. Results Two novel numerical approximations for evaluation of statistical significance are presented: first, a method using importance sampling; second, a saddlepoint approximation based method. We develop algorithms to efficiently compute the approximations and compare them to naive sampling and the normal approximation. The individual merits of the methods are analysed both from … Conclusions The applicability of saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially improve computational cost without compromising accuracy. This contribution allows analyses of large datasets…
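The importance-sampling idea can be sketched outside the factor-graph setting. The toy below estimates a small normal tail probability, the kind of p-value naive sampling essentially cannot reach, by drawing from a proposal shifted into the tail and reweighting by the likelihood ratio (all numbers illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
t = 4.0          # observed test statistic
n = 20000

# Importance sampling: estimate the tail probability P(X > t) under a
# standard normal null by drawing from a proposal centred at t, where
# naive sampling would almost never land.
x = rng.normal(t, 1.0, size=n)
w = stats.norm.pdf(x) / stats.norm.pdf(x, loc=t)   # likelihood ratio weights
p_hat = np.mean(w * (x > t))

exact = stats.norm.sf(t)                           # ~3.17e-5
print(p_hat, exact)
```

With 20,000 draws the estimate lands within a few per cent of the exact value, whereas naive sampling would need millions of draws to see even a handful of exceedances.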
Statistical assessment of coal charge effect on metallurgical coke quality
Directory of Open Access Journals (Sweden)
Pavlína Pustějovská
2016-06-01
Full Text Available The paper studies coke quality. Blast furnace research has focused on the iron ore charge, while coke received little attention because, under previous conditions, it seemed to be good enough. Nowadays, requirements for blast furnace coke have risen, especially requirements for coke reactivity. The level of the reactivity parameter is determined primarily by the composition and properties of the coal mixtures used for coking. The paper deals with a statistical analysis of the tightness and characteristics of the relationship between selected properties of the coal mixture and coke reactivity. The Statgraphics software, using both simple and multiple linear regression, was used for the calculations. The obtained regression equations provide a statistically significant prediction of the reactivity of coke (CRI), or of its strength after reaction with CO2 (CSR), and thus allow their subsequent management by changes in the composition and properties of the coal mixture. The CSR/CRI indexes were determined for the coke. Fifty-four results were acquired in the experimental part, where the correlation between the CRI index and coal components was studied. For the simple linear regression, the coefficient of determination was 55.0204%; between the parameters CRI and inertinite it was 21.5873%. For the regression between CRI and coal components it was 31.03%. For the multiple linear regression between CRI and 3 feedstock components, the coefficient of determination was 34.0691%. The final correlations showed that higher ash decreases the final coke reactivity, a higher content of volatile combustible matter in the coal increases the total coke reactivity, and a higher amount of inertinite in the coal increases the reactivity. Generally, coke quality is significantly affected by coal processing, carbonization and the maceral content of the coal mixture.
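The type of regression used can be sketched as follows; the coal properties, coefficients and noise level below are invented for illustration and are not the paper's data.

```python
import numpy as np

# Illustrative multiple linear regression of a coke reactivity index (CRI)
# on hypothetical coal-mixture properties (ash, volatile matter, inertinite).
rng = np.random.default_rng(3)
n = 54                                   # the study used 54 experimental results
ash = rng.uniform(5, 12, n)
volatile = rng.uniform(18, 30, n)
inertinite = rng.uniform(10, 35, n)
cri = 20 - 0.8 * ash + 0.5 * volatile + 0.3 * inertinite + rng.normal(0, 1, n)

# Least-squares fit with an intercept column
X = np.column_stack([np.ones(n), ash, volatile, inertinite])
beta, *_ = np.linalg.lstsq(X, cri, rcond=None)

# Coefficient of determination R^2 of the fit
resid = cri - X @ beta
r2 = 1 - resid.var() / cri.var()
print(beta.round(2), round(r2, 3))
```

The recovered coefficients and R² play the same role as the paper's coefficients of determination: they quantify how tightly the coal-mixture properties predict reactivity.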
Statistical analysis of traversal behavior under different types of traffic lights
Wang, Boran; Wang, Ziyang; Li, Zhiyin
2017-12-01
According to video observation, it is found that the type of traffic signal has a significant effect on the illegal crossing behavior of pedestrians at intersections. Through statistical analysis and analysis of variance, the differences in violation rate and in the waiting position of pedestrians under different signal types are compared, and the influence of traffic signal type on pedestrian crossing behavior is evaluated. The results show that the violation rate at intersections with static pedestrian lights is significantly higher than at those with countdown signal lights. There are also significant differences in the waiting positions at intersections with different signal lights.
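A standard way to compare two violation rates is a two-proportion z-test; the counts below are hypothetical, not the study's observations.

```python
import math
from scipy import stats

# Hypothetical counts: pedestrians observed and violations recorded at
# static-light vs countdown-light crossings (invented for illustration).
viol_static, n_static = 90, 300
viol_count, n_count = 45, 300

p1, p2 = viol_static / n_static, viol_count / n_count
p = (viol_static + viol_count) / (n_static + n_count)   # pooled rate

# Two-proportion z statistic and two-sided p-value
z = (p1 - p2) / math.sqrt(p * (1 - p) * (1 / n_static + 1 / n_count))
p_value = 2 * stats.norm.sf(abs(z))
print(round(z, 2), p_value < 0.05)
```

A z value this large would support the reported conclusion that static lights see significantly more violations than countdown lights.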
STATISTICAL INSIGHT INTO THE BINDING REGIONS IN DISORDERED HUMAN PROTEOME
Directory of Open Access Journals (Sweden)
Uttam Pal
2016-03-01
Full Text Available The human proteome contains a significant number of intrinsically disordered proteins (IDPs). They show unusual structural features that enable them to participate in diverse cellular functions and play significant roles in cell signaling and reorganization processes. In addition, the actions of IDPs, their functional cooperativity, conformational alterations and folding often accompany binding to a target macromolecule. Applying bioinformatics approaches and with the aid of statistical methodologies, we investigated the statistical parameters of binding regions (BRs) found in the disordered human proteome. In this report, we detail the bioinformatics analysis of binding regions found in the IDPs. Statistical models for the occurrence of BRs, their length distribution and their percent occupancy in the parent proteins are shown. The frequency of BRs followed a Poisson distribution pattern, with expectancy increasing with the degree of disorderedness. The length of the individual BRs also followed a Poisson distribution, with a mean of 6 residues, whereas the percentage of residues in BRs showed a normal distribution pattern. We also explored physicochemical properties such as the grand average of hydropathy (GRAVY) and the theoretical isoelectric points (pIs). The theoretical pIs of the BRs followed a bimodal distribution, as in the parent proteins. However, the mean acidic/basic pIs were significantly lower/higher than those of the proteins, respectively. We further showed that the amino acid composition of BRs was enriched in hydrophobic residues such as Ala, Val, Ile, Leu and Phe compared to the average sequence content of the proteins. Sequences in a BR showed conformational adaptability, mostly towards flexible coil structure followed by helix; however, ordered secondary structural conformation was significantly lower in BRs than in the proteins. Combining and comparing this statistical information on BRs with other methods may be useful for high
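The reported length distribution (Poisson with a mean of about 6 residues) can be illustrated with a quick goodness-of-fit sketch on simulated lengths; the sample size is invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Simulate binding-region lengths from the reported model (Poisson, mean ~6)
lengths = rng.poisson(6, size=2000)
lam_hat = lengths.mean()                 # maximum-likelihood estimate of the mean

# Compare observed and expected counts for lengths 0..14
observed = np.bincount(lengths, minlength=15)[:15]
expected = 2000 * stats.poisson.pmf(np.arange(15), lam_hat)
chi2_stat = ((observed - expected) ** 2 / expected).sum()
print(round(lam_hat, 2), round(chi2_stat, 1))
```

A chi-squared statistic near the number of length bins, as here, is what a well-fitting Poisson model should produce; the same comparison applied to real BR lengths is the substance of the paper's distributional claim.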
Statistical analysis of disruptions in JET
International Nuclear Information System (INIS)
De Vries, P.C.; Johnson, M.F.; Segui, I.
2009-01-01
The disruption rate (the percentage of discharges that disrupt) in JET was found to drop steadily over the years. Recent campaigns (2005-2007) show a yearly averaged disruption rate of only 6%, while from 1991 to 1995 it was often higher than 20%. Besides the disruption rate, the so-called disruptivity, or the likelihood of a disruption depending on the plasma parameters, has been determined. The disruptivity of plasmas was found to be significantly higher close to the three main operational boundaries for tokamaks: the low-q, high-density and β-limits. The frequency at which JET operated close to the density limit increased sixfold over the last decade; however, only a small reduction in disruptivity was found. Similarly, the disruptivity close to the low-q and β-limits was found to be unchanged. The most significant reduction in disruptivity was found far from the operational boundaries, leading to the conclusion that the improved disruption rate is due to a better technical capability of operating JET, rather than safer operation close to the physics limits. The statistics showed that a simple protection system was able to mitigate the forces of a large fraction of disruptions, although it has at present proved more difficult to ameliorate the heat flux.
A Synergy Cropland of China by Fusing Multiple Existing Maps and Statistics.
Lu, Miao; Wu, Wenbin; You, Liangzhi; Chen, Di; Zhang, Li; Yang, Peng; Tang, Huajun
2017-07-12
Accurate information on cropland extent is critical for scientific research and resource management. Several cropland products from remotely sensed datasets are available. Nevertheless, significant inconsistency exists among these products and the cropland areas estimated from these products differ considerably from statistics. In this study, we propose a hierarchical optimization synergy approach (HOSA) to develop a hybrid cropland map of China, circa 2010, by fusing five existing cropland products, i.e., GlobeLand30, Climate Change Initiative Land Cover (CCI-LC), GlobCover 2009, MODIS Collection 5 (MODIS C5), and MODIS Cropland, and sub-national statistics of cropland area. HOSA simplifies the widely used method of score assignment into two steps, including determination of optimal agreement level and identification of the best product combination. The accuracy assessment indicates that the synergy map has higher accuracy of spatial locations and better consistency with statistics than the five existing datasets individually. This suggests that the synergy approach can improve the accuracy of cropland mapping and enhance consistency with statistics.
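A toy version of the agreement-level step can be sketched as follows; the five maps, cropland fraction and target area are all invented, and the real HOSA works per administrative unit with sub-national statistics.

```python
import numpy as np

rng = np.random.default_rng(5)

# Five mock binary cropland products vote per pixel; pixels are admitted
# starting from the highest agreement level until the cumulative area
# first reaches the cropland area reported by statistics.
maps = rng.random((5, 100, 100)) < 0.4     # five hypothetical products
votes = maps.sum(axis=0)                   # per-pixel agreement, 0..5
target_area = 3500                         # cropland pixels from statistics

level = 5
while level > 1 and (votes >= level).sum() < target_area:
    level -= 1                             # relax the agreement requirement
synergy = votes >= level
print(level, int(synergy.sum()))
```

The loop finds the optimal agreement level of the first step; the second step of HOSA (choosing the best product combination) would then refine which products' votes count.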
Change and clinical significance of serum PG in patients with chronic gastritis
Directory of Open Access Journals (Sweden)
Wei-Hua Huan
2017-06-01
Full Text Available Objective: To observe the change and clinical significance of serum PG in patients with chronic atrophic gastritis (CAG). Methods: ELISA was used to detect the peripheral blood PG levels in patients with confirmed CAG, gastric polyps, and gastric cancer who were admitted to our hospital from January 2015 to January 2016. Normal individuals who came for physical examinations served as the control group. The peripheral blood PG levels in patients with the various gastric diseases were observed. Results: Serum PG I expression and PG I/PG II in the gastritis group were significantly lower than those in the gastric polyps group and the control group, but significantly higher than those in the gastric cancer group, while PG II expression was significantly higher than in the gastric polyps group and the control group, but significantly lower than in the gastric cancer group. PG I expression and PG I/PG II in the gastric polyps group were significantly higher than those in the gastritis group and the gastric cancer group, while PG II expression was significantly lower than in the gastritis group and the gastric cancer group. PG I expression and PG I/PG II in the gastric cancer group were significantly lower than those in the other three groups, while PG II expression was significantly higher than in the other three groups. Serum PG I expression in patients with positive HP infection in the gastritis group and the gastric cancer group was significantly higher than in patients with negative HP infection, but the comparison of PG I/PG II was not statistically significant. Serum PG I expression and PG I/PG II in patients with negative and positive HP infection in the gastritis group were significantly higher than in the corresponding patients in the gastric cancer group, while PG II expression was significantly lower than in the gastric cancer group
Statistical Lamb wave localization based on extreme value theory
Harley, Joel B.
2018-04-01
Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of such an image typically represents an estimated damage location. Yet it is often unclear whether this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches to assess the statistical significance of guided wave localization images. As a result, we present statistical delay-and-sum and statistical matched field processing localization methods to create statistically significant images of damage. Our framework uses constant false alarm rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75 cm diameter holes. Our results show an expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
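A hedged sketch of the extreme-value step: fit a Gumbel distribution to the maxima of noise-only images and place the detection threshold at a chosen false-alarm rate. The noise model and the damage peak value below are invented, not taken from the plate experiment.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Maxima of 500 damage-free localization "images" of 1000 pixels each;
# per-pixel noise is modelled as the magnitude of a standard normal.
noise_maxima = np.abs(rng.normal(0, 1, size=(500, 1000))).max(axis=1)

# Extreme value theory: block maxima are approximately Gumbel distributed.
loc, scale = stats.gumbel_r.fit(noise_maxima)
threshold = stats.gumbel_r.ppf(0.99, loc, scale)   # 1% false-alarm rate

# A new image whose peak exceeds the threshold is declared significant.
damage_peak = 5.5
print(round(threshold, 2), damage_peak > threshold)
```

The threshold is set entirely from noise statistics, which is what lets the method flag damage "with little prior information" about the damage itself.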
Mask effects on cosmological studies with weak-lensing peak statistics
International Nuclear Information System (INIS)
Liu, Xiangkun; Pan, Chuzhong; Fan, Zuhui; Wang, Qiao
2014-01-01
With numerical simulations, we analyze in detail how the bad data removal, i.e., the mask effect, can influence the peak statistics of the weak-lensing convergence field reconstructed from the shear measurement of background galaxies. It is found that high peak fractions are systematically enhanced because of the presence of masks; the larger the masked area is, the higher the enhancement is. In the case where the total masked area is about 13% of the survey area, the fraction of peaks with signal-to-noise ratio ν ≥ 3 is ∼11% of the total number of peaks, compared with ∼7% of the mask-free case in our considered cosmological model. This can have significant effects on cosmological studies with weak-lensing convergence peak statistics, inducing a large bias in the parameter constraints if the effects are not taken into account properly. Even for a survey area of 9 deg², the bias in (Ωₘ, σ₈) is already intolerably large and close to 3σ. It is noted that most of the affected peaks are close to the masked regions. Therefore, excluding peaks in those regions in the peak statistics can reduce the bias effect but at the expense of losing usable survey areas. Further investigations find that the enhancement of the number of high peaks around the masked regions can be largely attributed to the smaller number of galaxies usable in the weak-lensing convergence reconstruction, leading to higher noise than that of the areas away from the masks. We thus develop a model in which we exclude only those very large masks with radius larger than 3' but keep all the other masked regions in peak counting statistics. For the remaining part, we treat the areas close to and away from the masked regions separately with different noise levels. It is shown that this two-noise-level model can account for the mask effect on peak statistics very well, and the bias in cosmological parameters is significantly reduced if this model is applied in the parameter fitting.
Comparing higher order models for the EORTC QLQ-C30
DEFF Research Database (Denmark)
Gundy, Chad M; Fayers, Peter M; Grønvold, Mogens
2012-01-01
To investigate the statistical fit of alternative higher order models for summarizing the health-related quality of life profile generated by the EORTC QLQ-C30 questionnaire.
Who Is Missing from Higher Education?
Gorard, Stephen
2008-01-01
This paper discusses the difficulties of establishing a clear count of UK higher education students in terms of the categories used for widening participation, such as occupational background or ethnicity. Using some of the best and most complete data available, such as the annual figures from the Higher Education Statistics Agency, the paper then…
Significance of postoperative irradiation for breast cancer
International Nuclear Information System (INIS)
Murai, Nobuko; Ogami, Koji; Nishikawa, Kiyoshi; Koga, Kenji; Waki, Norio; Higashi, Hidefumi; Hayashi, Asami; Shibata, Koichiro; Watanabe, Katsuji
1986-01-01
From 1978 through 1983, 27 patients were treated with surgery followed by irradiation (irradiated group) and 29 with surgery alone (non-irradiated group). In the irradiated group, 10 had stage II and 17 stage III; in the non-irradiated group, 25 had stage II and 4 stage III. The most common histology was medullary tubular carcinoma (MTC). There was no significant difference in survival at 3 years and 5 years between the groups. Similarly, no significant difference was seen among stage II patients. Patients with MTC tended to have worse survival in the irradiated group than in the non-irradiated group, with no statistically significant difference. Among stage II patients, no major differences in local recurrence were seen between the groups; the incidence of distant metastases tended to be high in the irradiated group. The incidence of both local recurrence and distant metastases for stage III patients showed a tendency to be higher in the irradiated group than in the non-irradiated group. The results indicated no apparent benefit of postoperative irradiation for breast cancer. A randomized clinical trial is needed for the evaluation of postoperative irradiation for breast cancer. (Namekawa, K.)
Dong, Jian-Jun; Li, Qing-Liang; Yin, Hua; Zhong, Cheng; Hao, Jun-Guang; Yang, Pan-Fei; Tian, Yu-Hong; Jia, Shi-Ru
2014-10-15
Sensory evaluation is regarded as a necessary procedure to ensure a reproducible quality of beer. Meanwhile, high-throughput analytical methods provide a powerful tool to analyse various flavour compounds, such as higher alcohols and esters. In this study, the relationship between flavour compounds and sensory evaluation was established by non-linear models such as partial least squares (PLS), genetic algorithm back-propagation neural network (GA-BP), and support vector machine (SVM). It was shown that SVM with a radial basis function (RBF) kernel had better prediction accuracy for both the calibration set (94.3%) and the validation set (96.2%) than the other models. Relatively lower prediction abilities were observed for GA-BP (52.1%) and PLS (31.7%). In addition, the kernel function of SVM played an essential role in model training: the prediction accuracy of SVM with a polynomial kernel function was only 32.9%. As a powerful multivariate statistical method, SVM holds great potential for assessing beer quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
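The modelling choice can be sketched with scikit-learn's SVR; the data below are synthetic stand-ins, not the study's flavour measurements, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(7)

# Hypothetical setup: predict a sensory score from measured flavour
# compounds (e.g. higher alcohols, esters) with an RBF-kernel support
# vector regression, mirroring the modelling choice in the abstract.
X = rng.uniform(0, 1, size=(120, 4))             # mock compound concentrations
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.05, 120)

# Calibration on the first 90 samples, validation on the remaining 30
model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X[:90], y[:90])
r2 = model.score(X[90:], y[90:])                 # validation R^2
print(round(r2, 3))
```

The split into calibration and validation sets mirrors the accuracies quoted in the abstract; swapping `kernel="rbf"` for `kernel="poly"` is how one would reproduce the reported sensitivity to the kernel choice.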
Nguyen, Thuyuyen H.; Newby, Michael; Skordi, Panayiotis G.
2015-01-01
Statistics is a required subject of study in many academic disciplines, including business, education and psychology, that causes problems for many students. This has long been recognised and there have been a number of studies into students' attitudes towards statistics, particularly statistical anxiety. However, none of these studies…
Mirosław Mrozkowiak; Hanna Żukowska
2015-01-01
Mrozkowiak Mirosław, Żukowska Hanna. Znaczenie Dobrego Krzesła, jako elementu szkolnego i domowego środowiska ucznia, w profilaktyce zaburzeń statyki postawy ciała = The significance of Good Chair as part of children’s school and home environment in the preventive treatment of body statistics distortions. Journal of Education, Health and Sport. 2015;5(7):179-215. ISSN 2391-8306. DOI 10.5281/zenodo.19832 http://ojs.ukw.edu.pl/index.php/johs/article/view/2015%3B5%287%29%3A179-215 https:...
Can a significance test be genuinely Bayesian?
Pereira, Carlos A. de B.; Stern, Julio Michael; Wechsler, Sergio
2008-01-01
The Full Bayesian Significance Test, FBST, is extensively reviewed. Its test statistic, a genuine Bayesian measure of evidence, is discussed in detail. Its behavior in some problems of statistical inference like testing for independence in contingency tables is discussed.
Directory of Open Access Journals (Sweden)
Т. М. Кравчук
2015-06-01
Full Text Available Research objective: to determine the health-improving potential of dancing exercises used in the physical education of female students of higher educational institutions. Research methods: study and analysis of pedagogical, scientific and methodological literature on the subject matter of the research; observations, questionnaires, functional tests; statistical methods of data reduction. Conclusions: As part of the study, the use of dancing exercises in the physical education of female students of higher educational institutions proved to contribute to a significant increase in the level of their physical health in general and to the improvement of some of its indicators, including the strength and life indices and heart rate recovery time after 20 squats. Dancing exercises also boost spirits and improve the health and activity of the female students, as the study confirmed statistically.
Research and development statistics 2001
2002-01-01
This publication provides recent basic statistics on the resources devoted to R&D in OECD countries. The statistical series are presented for the last seven years for which data are available and cover expenditure by source of funds and type of costs; personnel by occupation and/or level of qualification; both at the national level by performance sector, for enterprises by industry, and for higher education by field of science. The publication also provides information on the output of science and technology (S&T) activities relating to the technology balance of payments.
Institute of Scientific and Technical Information of China (English)
Jian-Gang Dai; Yong-Feng Wu; Mei Li
2017-01-01
Objective: To study the concentrations of serum tumor markers, immunoglobulins, TNF-α and hs-CRP in breast cancer at different pathological stages, and to analyze their clinical significance for the early diagnosis of breast cancer. Methods: A total of 130 patients with breast cancer were divided into stages I, II, III and IV according to clinical pathology. In addition, 40 patients with benign breast disease and 35 healthy subjects were selected as the benign breast disease group and the control group. Serum tumor markers, immunoglobulins, TNF-α and hs-CRP concentrations were measured and compared for all subjects. Results: There were no significant differences in serum tumor markers, immunoglobulins or inflammatory factors between the control group and the benign breast disease group. The level of serum tumor markers in the breast cancer group was significantly higher than that in the control group and the benign breast disease group. The levels of serum CA125, CA153 and CEA increased gradually with severity from stage I to IV of breast cancer, and the difference was statistically significant. The level of serum immunoglobulins in the breast cancer group was significantly higher than that in the control group and the benign breast disease group. The levels of serum IgG and IgM increased gradually with severity from stage I to IV, and the difference was statistically significant. The levels of serum TNF-α and hs-CRP in the breast cancer group were significantly higher than those in the control group and the benign breast disease group, and increased gradually with severity from stage I to IV; the difference was statistically significant. Conclusion: The level of serum tumor markers in breast cancer patients is increased. Humoral and inflammatory responses are activated to varying degrees and increase with the aggravation of disease. They may be involved in regulating the occurrence and metastasis of breast
Significance of specificity of Tinetti B-POMA test and fall risk factor in third age of life.
Avdić, Dijana; Pecar, Dzemal
2006-02-01
In the third age, the psychophysical abilities of humans gradually decrease, and the ability to adapt to endogenous and exogenous burdens declines. In 1987, Harada et al. (1) found that 9.5 million persons in the USA have difficulties performing daily activities, and that 59% of them (5.6 million) are older than 65 years. The study encompassed 77 subjects of both sexes with an average age of 71.73 ± 5.63 years (range 65-90 years), chosen by random sampling. Each patient was interviewed in his/her own home and made familiar with the methodology and aims of the questionnaire. Women made up 64.94% of those questioned (50 patients) and men 35.06% (27 patients). For the risk factor score obtained from the questionnaire and the B-POMA test, there are statistically significant differences between men and women, as well as between patients who fell and those who never did. Regarding the way of life (alone or in a community), there are no statistically significant differences. The average B-POMA test results in this study are statistically significantly higher in men and in patients who did not report falling, while there was no statistically significant difference by way of life. In relation to the percentage of the maximum number of positive answers to particular questions, regarding gender, way of life and reported falls, there were no statistically significant differences between the value of the B-POMA test and the risk factor score (the questionnaire).
Statistics for experimentalists
Cooper, B E
2014-01-01
Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and search approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...
Fractal dimension and image statistics of anal intraepithelial neoplasia
International Nuclear Information System (INIS)
Ahammer, H.; Kroepfl, J.M.; Hackl, Ch.; Sedivy, R.
2011-01-01
Research Highlights: → Human papillomaviruses cause anal intraepithelial neoplasia (AIN). → Digital image processing was carried out to classify the grades of AIN quantitatively. → The fractal dimension as well as grey value statistics was calculated. → Higher grades of AIN yielded higher values of the fractal dimension. → An automatic detection system is feasible. - Abstract: It is well known that human papillomaviruses (HPV) induce a variety of tumorous lesions of the skin. HPV subtypes also cause premalignant lesions which are termed anal intraepithelial neoplasia (AIN). The clinical classification of AIN is of growing interest in clinical practice, due to increasing HPV infection rates throughout the human population. The common classification approach is based on subjective inspection of histological slices of anal tissues, with all the drawbacks of depending on the status and individual variances of the trained pathologists. Therefore, a nonlinear quantitative classification method, including the calculation of the fractal dimension and of first-order as well as second-order image statistical parameters, was developed. The absolute values of these quantitative parameters reflected the distinct grades of AIN very well. The quantitative approach has the potential to decrease classification errors significantly, and it could be used as a widely applied screening technique.
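A standard box-counting estimate of the fractal dimension of a binary image can be sketched as follows (checked here on shapes of known dimension; a grading pipeline would also need segmentation and the grey-value statistics mentioned above):

```python
import numpy as np

def box_counting_dimension(image, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary image by box counting:
    count occupied boxes N(s) at several box sizes s and fit the slope of
    log N(s) against log(1/s)."""
    counts = []
    for s in sizes:
        h, w = image.shape
        # Tile the image into s x s boxes and count those containing pixels
        boxes = image[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity checks on shapes of known dimension
square = np.ones((128, 128), dtype=bool)         # a filled plane, dimension 2
line = np.zeros((128, 128), dtype=bool)
line[64, :] = True                               # a straight line, dimension 1
print(round(box_counting_dimension(square), 2),
      round(box_counting_dimension(line), 2))
```

Applied to a segmented lesion boundary or chromatin pattern, the same slope is the quantitative parameter whose value was found to rise with the AIN grade.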
On two methods of statistical image analysis
Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.
1999-01-01
The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,
Weighted statistical parameters for irregularly sampled time series
Rimoldini, Lorenzo
2014-01-01
Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
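A minimal version of such a weighting scheme, assuming linear interpolation in time is acceptable: each observation receives half the gap to its neighbours, so clumped epochs no longer dominate the mean. The data are synthetic and the scheme is a simplified stand-in for the adaptive one in the paper.

```python
import numpy as np

rng = np.random.default_rng(8)

# Clumped sampling of a slow sinusoid: 200 epochs packed into a short
# interval near the crest, plus 60 spread over the full range.
t = np.sort(np.concatenate([rng.uniform(2.0, 2.5, 200),
                            rng.uniform(0.0, 10.0, 60)]))
y = np.sin(2 * np.pi * t / 10.0)       # true mean over a full period: 0

# Trapezoidal weights: half the gap to each neighbour
gaps = np.diff(t)
w = np.empty_like(t)
w[0], w[-1] = gaps[0] / 2, gaps[-1] / 2
w[1:-1] = (gaps[:-1] + gaps[1:]) / 2
weighted_mean = np.average(y, weights=w)

print(round(y.mean(), 3), round(weighted_mean, 3))
```

The unweighted mean is pulled far from zero by the clump, while the gap-weighted mean stays near the true value; the same effect carries over to variance, skewness and the other descriptive statistics used for classification.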
Statistics Using Just One Formula
Rosenthal, Jeffrey S.
2018-01-01
This article advocates that introductory statistics be taught by basing all calculations on a single simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc) from that one formula. It is argued that this approach will…
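A sketch of the approach under the common simple choice MoE = 2·sd/√n (the numbers are invented; the article's exact formula may differ):

```python
import math

# The "one formula" idea: build the interval estimate and the significance
# call from a single margin-of-error expression.
def margin_of_error(sd, n):
    return 2 * sd / math.sqrt(n)

sample_mean, sample_sd, n = 52.3, 8.0, 100
moe = margin_of_error(sample_sd, n)
ci = (sample_mean - moe, sample_mean + moe)              # ~95% interval

# Significance test as a corollary: is the hypothesised value outside?
significant_vs_50 = not (ci[0] <= 50 <= ci[1])
print(ci, significant_vs_50)
```

Comparisons of two means or two proportions follow the same pattern, with the margin of error computed for the difference, which is the pedagogical economy the article argues for.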
Statistics Anxiety among Postgraduate Students
Koh, Denise; Zawi, Mohd Khairi
2014-01-01
Most postgraduate programmes, that have research components, require students to take at least one course of research statistics. Not all postgraduate programmes are science based, there are a significant number of postgraduate students who are from the social sciences that will be taking statistics courses, as they try to complete their…
The significance of age and sex for the absence of immune response to hepatitis B vaccination
Directory of Open Access Journals (Sweden)
Rosić Ilija
2008-01-01
Full Text Available INTRODUCTION Seroepidemiological investigations after the administration of hepatitis B vaccine have shown that as many as 15% of vaccinated healthy persons do not generate an immune response to the vaccines currently in use. OBJECTIVE The aim of the research was to test the immunogenicity of hepatitis B vaccine in different age groups on a sample of the adult vaccinated population in Serbia. METHOD The tested general population sample consisted of 154 adult subjects. Immunization was done using the recombinant yeast vaccine obtained by genetic engineering (Euvax B vaccine, manufacturer LG, distributor Sanofi Pasteur). All tested subjects received 1 ml of hepatitis B vaccine administered intramuscularly into the deltoid muscle on a 0, 1, 6 month schedule. RESULTS In the tested sample, 3.13% of persons aged up to 29 years, 6.25% of those aged 30-39 years and 19.23% of those aged 40 years and older had no immune response. The relative risk of a "no response" finding was twice as high in the group aged 30-39 as in the population aged up to 29 years. The risk was six times higher for the population aged 40 years and older in comparison with the population aged up to 29 years. Likewise, the relative risk of a "no response" finding for the population aged 40 years and older was more than three times higher than for the group aged 30-39. The absence of immune response was also higher in male subjects. CONCLUSION The rates of the "no response" finding were as follows: 3.13% in the group aged up to 29 years, 6.25% in the group aged 30-39, and 19.23% in the group aged 40 years and older. Immune response differed statistically significantly across age groups (p<0.001), and there was a statistically significant correlation (C=0.473; p<0.001) between the age of the subjects and the immune response. In relation to sex, the "no response" finding was higher in males, but without any statistically significant difference.
Directory of Open Access Journals (Sweden)
Carlos Lago-Peñas
2010-06-01
Full Text Available The aim of the present study was to analyze men's football competitions, trying to identify which game-related statistics discriminate between winning, drawing and losing teams. The sample corresponded to 380 games from the 2008-2009 season of the Spanish Men's Professional League. The game-related statistics gathered were: total shots, shots on goal, effectiveness, assists, crosses, offsides committed and received, corners, ball possession, crosses against, fouls committed and received, corners against, yellow and red cards, and venue. A univariate (t-test) and multivariate (discriminant) analysis of the data was done. The results showed that winning teams had significantly higher averages for the following game statistics: total shots (p < 0.001), shots on goal (p < 0.01), effectiveness (p < 0.01), assists (p < 0.01), offsides committed (p < 0.01) and crosses against (p < 0.01). Losing teams had significantly higher averages in the variables crosses (p < 0.01), offsides received (p < 0.01) and red cards (p < 0.01). The discriminant analysis led to the following conclusion: the variables that discriminate between winning, drawing and losing teams are total shots, shots on goal, crosses, crosses against, ball possession and venue. Coaches and players should be aware of these different profiles in order to increase knowledge about the game's cognitive and motor demands and, therefore, to evaluate specificity in practice and game planning.
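A univariate comparison of the kind described can be sketched with Welch's t statistic; the shot counts below are hypothetical, not the study's data.

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

winners = [15, 17, 14, 18, 16]  # hypothetical total shots per game
losers = [9, 11, 10, 8, 12]
t_stat = welch_t(winners, losers)  # 6.0 for these made-up numbers
```

A large positive t here would correspond to the paper's finding that winning teams take significantly more shots; the discriminant analysis then asks the multivariate version of the same question.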
Lower serotonin level and higher rate of fibromyalgia syndrome with advancing pregnancy.
Atasever, Melahat; Namlı Kalem, Muberra; Sönmez, Çiğdem; Seval, Mehmet Murat; Yüce, Tuncay; Sahin Aker, Seda; Koç, Acar; Genc, Hakan
2017-09-01
The aim of the study was to investigate the relationship between changes in serotonin levels during pregnancy and fibromyalgia syndrome (FS), and the relationships between FS and the physical/psychological state and the biochemical and hormonal parameters that may be related to the musculoskeletal system. This was a prospective case-control study conducted with 277 pregnant women at the obstetric unit of Ankara University Faculty of Medicine between January and June 2015. FS was determined based on the presence or absence of the 2010 ACR diagnostic criteria, and all the volunteers were asked to answer questionnaires comprising the Fibromyalgia Impact Questionnaire (FIQ), Widespread Pain Index (WPI), Symptom Severity Scale (SS), Beck Depression Inventory and Visual Analog Scale (VAS). Biochemical and hormonal markers relating to muscle and bone metabolism (glucose, TSH, T4, Ca (calcium), P (phosphate), PTH (parathyroid hormone) and serotonin levels) were measured. In the presence of fibromyalgia, the physical and psychological parameters were negatively affected. Decreasing serotonin levels may contribute to the development of fibromyalgia, but this was not statistically significant. The Beck Depression Inventory statistically showed that increasing scores also increase the risk of fibromyalgia. We found that serotonin levels in women with FS are lower than in the control group and that serotonin levels decrease as pregnancy progresses. Anxiety and depression in pregnant women with FS are higher than in the control group. The presence of depression increases the likelihood of developing FS at a statistically significant level. Serotonin impairment also increases the chance of developing FS, but this correlation has not been shown to be statistically significant.
Statistical learning of music- and language-like sequences and tolerance for spectral shifts.
Daikoku, Tatsuya; Yatomi, Yutaka; Yumoto, Masato
2015-02-01
In our previous study (Daikoku, Yatomi, & Yumoto, 2014), we demonstrated that the N1m response could be a marker for the statistical learning process of pitch sequence, in which each tone was ordered by a Markov stochastic model. The aim of the present study was to investigate how the statistical learning of music- and language-like auditory sequences is reflected in the N1m responses based on the assumption that both language and music share domain generality. By using vowel sounds generated by a formant synthesizer, we devised music- and language-like auditory sequences in which higher-ordered transitional rules were embedded according to a Markov stochastic model by controlling fundamental (F0) and/or formant frequencies (F1-F2). In each sequence, F0 and/or F1-F2 were spectrally shifted in the last one-third of the tone sequence. Neuromagnetic responses to the tone sequences were recorded from 14 right-handed normal volunteers. In the music- and language-like sequences with pitch change, the N1m responses to the tones that appeared with higher transitional probability were significantly decreased compared with the responses to the tones that appeared with lower transitional probability within the first two-thirds of each sequence. Moreover, the amplitude difference was even retained within the last one-third of the sequence after the spectral shifts. However, in the language-like sequence without pitch change, no significant difference could be detected. The pitch change may facilitate the statistical learning in language and music. Statistically acquired knowledge may be appropriated to process altered auditory sequences with spectral shifts. The relative processing of spectral sequences may be a domain-general auditory mechanism that is innate to humans. Copyright © 2014 Elsevier Inc. All rights reserved.
On the Statistical Dependency of Identity Theft on Demographics
di Crescenzo, Giovanni
An improved understanding of the identity theft problem is widely agreed to be necessary to succeed in counter-theft efforts in legislative, financial and research institutions. In this paper we report on a statistical study about the existence of relationships between identity theft and area demographics in the US. The identity theft data chosen was the number of citizen complaints to the Federal Trade Commission in a large number of US municipalities. The list of demographics used for any such municipality included: estimated population, median resident age, estimated median household income, percentage of citizens with a high school or higher degree, percentage of unemployed residents, percentage of married residents, percentage of foreign born residents, percentage of residents living in poverty, density of law enforcement employees, crime index, and political orientation according to the 2004 presidential election. Our study findings, based on linear regression techniques, include statistically significant relationships between the number of identity theft complaints and a non-trivial subset of these demographics.
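The regression analysis described can be illustrated with a minimal ordinary-least-squares fit for a single demographic predictor; the data, variable names, and units below are hypothetical.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical: municipality population (thousands) vs. FTC complaint counts
population = [50, 120, 300, 800]
complaints = [40, 95, 260, 650]
slope, intercept = linear_fit(population, complaints)
```

In the actual study, such fits would be run (typically with multiple predictors at once) and the slopes tested for statistical significance before a demographic is declared related to identity theft complaints.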
Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.
Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M
2014-01-01
Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
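A consistency check of the kind the automated procedure performs can be sketched by recomputing a two-sided p-value from a reported t statistic and its degrees of freedom; the numerical-integration approach and the 0.005 tolerance are assumptions of this sketch, not the authors' implementation.

```python
import math

def t_pdf(x, df):
    """Density of Student's t distribution with df degrees of freedom."""
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return c * (1 + x * x / df) ** (-(df + 1) / 2)

def t_two_sided_p(t, df, steps=20000, upper=60.0):
    """Two-sided p-value via trapezoidal integration of the upper tail."""
    t = abs(t)
    h = (upper - t) / steps
    area = 0.5 * (t_pdf(t, df) + t_pdf(upper, df))
    area += sum(t_pdf(t + i * h, df) for i in range(1, steps))
    return 2 * area * h

def consistent(t, df, reported_p, tol=0.005):
    """Flag whether a reported p-value matches the reported test statistic."""
    return abs(t_two_sided_p(t, df) - reported_p) < tol

p = t_two_sided_p(2.0, 30)  # about 0.055 for t(30) = 2.0
```

A reported "t(30) = 2.0, p = .02" would be flagged as inconsistent by this check, and in this case the inconsistency also crosses the .05 significance boundary, the more serious error category the survey counts.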
Directory of Open Access Journals (Sweden)
Dominic Beaulieu-Prévost
2006-03-01
Full Text Available For the last 50 years of research in quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on the rejection or not of the null hypothesis. However, more than 300 articles demonstrated that this method was problematic. In summary, null hypothesis testing (NHT) is unfalsifiable, its results depend directly on sample size, and the null hypothesis is both improbable and not plausible. Consequently, alternatives to NHT such as confidence intervals (CI) and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to a NHT approach and can often improve the scientific and contextual relevance of the statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power by an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect, and probabilistic undetermination). The demonstration includes a complete example.
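The three-value logic can be sketched directly; the function name and the example minimal substantial effect delta are illustrative assumptions.

```python
def ci_verdict(ci_low, ci_high, delta):
    """Three-value verdict from a confidence interval and a minimal
    substantial effect size delta (hypothetical values)."""
    if ci_low >= delta or ci_high <= -delta:
        return "substantial effect probably present"
    if -delta <= ci_low and ci_high <= delta:
        return "substantial effect probably absent"
    return "undetermined"

# With delta = 0.2: an interval entirely beyond delta shows presence,
# one entirely inside (-delta, delta) shows absence, anything else is open.
verdict = ci_verdict(0.3, 0.8, 0.2)
```

Note that "undetermined" replaces the usual non-rejection of the null: instead of a vague "not significant", the interval says explicitly that the data cannot yet distinguish a substantial effect from a negligible one.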
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
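The sensitivity, specificity, accuracy, and likelihood-ratio definitions reviewed above can be computed from a 2x2 confusion matrix; the counts below are hypothetical.

```python
def diagnostic_summary(tp, fp, fn, tn):
    """Standard diagnostic-test measures from a 2x2 confusion matrix:
    tp/fp = true/false positives, fn/tn = false/true negatives."""
    sens = tp / (tp + fn)          # probability of a positive test in disease
    spec = tn / (tn + fp)          # probability of a negative test in health
    return {
        "sensitivity": sens,
        "specificity": spec,
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "lr_positive": sens / (1 - spec),   # LR+ = sens / (1 - spec)
        "lr_negative": (1 - sens) / spec,   # LR- = (1 - sens) / spec
    }

# Hypothetical screening study: 100 diseased, 100 healthy subjects
summary = diagnostic_summary(tp=90, fp=20, fn=10, tn=80)
```

For these made-up counts the test is 90% sensitive and 80% specific, giving LR+ = 4.5 and LR- = 0.125, the quantities the review recommends for updating pre-test to post-test probability.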
Preventing statistical errors in scientific journals.
Nuijten, M.B.
2016-01-01
There is evidence for a high prevalence of statistical reporting errors in psychology and other scientific fields. These errors display a systematic preference for statistically significant results, distorting the scientific literature. There are several possible causes for this systematic error
Assessment of Problem-Based Learning in the Undergraduate Statistics Course
Karpiak, Christie P.
2011-01-01
Undergraduate psychology majors (N = 51) at a mid-sized private university took a statistics examination on the first day of the research methods course, a course for which a grade of "C" or higher in statistics is a prerequisite. Students who had taken a problem-based learning (PBL) section of the statistics course (n = 15) were compared to those…
A Simple Statistical Thermodynamics Experiment
LoPresto, Michael C.
2010-01-01
Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
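The dice enumeration behind this experiment can be reproduced directly: each ordered roll is a microstate and each total is a macrostate; the function name is an assumption of this sketch.

```python
from itertools import product
from collections import Counter

def macrostate_multiplicity(n_dice, sides=6):
    """Multiplicity of each macrostate (total) over all ordered microstates."""
    return Counter(sum(roll) for roll in product(range(1, sides + 1), repeat=n_dice))

two = macrostate_multiplicity(2)    # 36 microstates; the total 7 has multiplicity 6
three = macrostate_multiplicity(3)  # 216 microstates; totals 10 and 11 are most probable
```

The middle totals have the highest multiplicity and hence the highest probability, which is exactly the sense in which "disordered" macrostates (those realized by many microstates) dominate, i.e., entropy as a measure of multiplicity.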
Müller-Kirsten, Harald J W
2013-01-01
Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
Significance of blood pressure variability in patients with sepsis.
Pandey, Nishant Raj; Bian, Yu-Yao; Shou, Song-Tao
2014-01-01
This study was undertaken to observe the characteristics of blood pressure variability (BPV) in sepsis and to investigate changes in blood pressure and their value for assessing the severity of illness in patients with sepsis. Blood parameters, APACHE II score, and 24-hour ambulatory BP were analyzed in 89 patients with sepsis. In patients with APACHE II score >19, the values of systolic blood pressure variability (SBPV), diastolic blood pressure variability (DBPV), non-dipper percentage, cortisol (COR), lactate (LAC), platelet count (PLT) and glucose (GLU) were significantly higher than in those with APACHE II score ≤19, whereas white blood cell count (WBC), creatinine (Cr), PaO2, C-reactive protein (CRP), adrenocorticotropic hormone (ACTH) and tumor necrosis factor α (TNF-α) did not differ statistically significantly (P>0.05). Correlation analysis showed that APACHE II scores correlated significantly with SBPV and DBPV (P<0.05). Logistic regression analysis of SBPV, DBPV, APACHE II score, and LAC was used to predict prognosis in terms of survival and non-survival rates. The receiver operating characteristic (ROC) curve showed that DBPV was a better predictor of survival rate, with an AUC value of 0.890, whereas the AUCs of SBPV, APACHE II score, and LAC were 0.746, 0.831 and 0.915, respectively. The values of SBPV, DBPV and non-dipper percentage are higher in patients with sepsis. DBPV and SBPV can be used to predict the survival rate of patients with sepsis.
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore it satisfies the requirements of the equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations, as those stemming from the Boltzmann-Gibbs statistics in this limit.
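For orientation, the Rényi entropy underlying this statistics (in units of $k_B$) and the Boltzmann-Gibbs/Shannon form it reduces to in the limit $q \to 1$ can be written as:

$$
S_q \;=\; \frac{1}{1-q}\,\ln \sum_i p_i^{\,q},
\qquad
\lim_{q \to 1} S_q \;=\; -\sum_i p_i \ln p_i ,
$$

the second expression following from l'Hôpital's rule applied to the first; the equivalence results quoted in the abstract concern the thermodynamics built on $S_q$ in the microcanonical ensemble and in the canonical thermodynamic limit.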
Higher-order aberrations and visual acuity after LASEK.
Urgancioglu, Berrak; Bilgihan, Kamil; Ozturk, Sertac
2008-08-01
To determine ocular higher-order aberrations (HOAs) in eyes with supernormal vision after myopic astigmatic laser subepithelial keratomileusis (LASEK) and to compare the findings with those in eyes with natural supernormal vision. Ocular HOAs were measured after LASEK in 20 eyes of 12 myopic astigmatic patients with postoperative uncorrected visual acuity (UCVA) of >20/16 (group 1). Patients who were included in the study had no visual symptoms like glare, halo or double vision. The measurements were taken 8.3 +/- 3 months after LASEK surgery. In group 2 ocular HOAs were examined in 20 eyes of 10 subjects with natural UCVA of >20/16 as a control. Measurements were taken across a pupil with a diameter of 4.0 mm and 6.0 mm. Root-mean-square (RMS) values of HOAs, Z(3)-1, Z(3)1, Z(4)0, Z(5)-1, Z(5)1 and Z(6)0 were analyzed. The mean RMS values for each order were higher in group 1 when compared with group 2 at 4.0 mm and 6.0 mm pupil diameters. There was no statistically significant difference between groups in spherical and coma aberrations (P > 0.05). Mean RMS values for total HOAs were 0.187 +/- 0.09 microm at 4.0 mm and 0.438 +/- 0.178 microm at 6.0 mm pupil in group 1 and 0.120 +/- 0.049 microm at 4.0 mm and 0.344 +/- 0.083 microm at 6.0 mm pupil in group 2. The difference between groups in total HOAs was statistically significant at 4.0 mm and 6.0 mm pupil diameters (P < 0.05). Ocular HOAs exist in eyes with supernormal vision. After LASEK, the amount of HOAs of the eye increases under both mesopic and photopic conditions. However the amount of HOA increase does not seem to be consistent with visual symptoms.
Personal dosimetry statistics and specifics of low dose evaluation
International Nuclear Information System (INIS)
Avila, R.E.; Gómez Salinas, R.A.; Oyarzún Cortés, C.H.
2015-01-01
The dose statistics of a personal dosimetry service, comprising more than 35,000 readings, display a sharp peak at low dose (below 0.5 mSv) with skewness towards higher values. As a measure of the dispersion, approximately 65% of the doses fall below the average plus 2 standard deviations, an observation which may prove helpful to radiation protection agencies. When the doses are categorized by the concomitant use of a finger ring dosimeter, the skewness is larger in both the whole-body and ring dosimeters. The use of Harshaw 5500 readers at high gain leads to frequent glow-curve values that are judged to be spurious, i.e. values not belonging to the roughly normal noise over the curve. A statistical criterion is presented for identifying those anomalous values and replacing them with the local behavior, as fit by a cubic polynomial. As a result, the doses above 0.05 mSv that are affected by more than 2% comprise over 10% of the database. The low-dose peak of the statistics has focused our attention on the evaluation of LiF(Mg,Ti) dosimeters exposed at low dose and read with Harshaw 5500 readers. The standard linear procedure, via an overall reader calibration factor, is observed to fail at low dose in detailed calibrations from 0.02 mSv to 1 Sv. A significant improvement is achieved with a piecewise-polynomial calibration curve: a cubic at low dose is matched, at ∼10 mSv, in value and first derivative to a linear dependence at higher doses. This improvement is particularly noticeable below 2 mSv, where over 60% of the evaluated dosimeters are found. (author)
Energy Technology Data Exchange (ETDEWEB)
Shin, Dong Seok; Kim, Dong Su; Kim, Tae Ho; Kim, Kyeong Hyeon; Yoon, Do Kun; Suh, Tae Suk [The Catholic University of Korea, Seoul (Korea, Republic of); Kang, Seong Hee [Seoul National University Hospital, Seoul (Korea, Republic of); Cho, Min Seok [Asan Medical Center, Seoul (Korea, Republic of); Noh, Yu Yoon [Eulji University Hospital, Daejeon (Korea, Republic of)
2017-04-15
Three-dimensional dose (3D dose) can account for coverage of a moving target, but it is difficult for it to capture the dosimetric effects caused by respiratory motion. Four-dimensional dose (4D dose), which uses a deformable image registration (DIR) algorithm on four-dimensional computed tomography (4DCT) images, can account for the dosimetric effects of respiratory motion. The dose difference between 3D dose and 4D dose can vary according to the geometrical relationship between the planning target volume (PTV) and an organ at risk (OAR). The purpose of this study was to evaluate the correlation between the overlap volume histogram (OVH), which quantitatively represents the geometrical relationship between the PTV and OAR, and the dose differences. In conclusion, no statistically significant correlation was found between the OVH and the dose differences. However, it was confirmed that a larger difference between the 3D and 4D doses could occur in cases with smaller OVH values.
Integer Set Compression and Statistical Modeling
DEFF Research Database (Denmark)
Larsson, N. Jesper
2014-01-01
Compression of integer sets and sequences has been extensively studied for settings where elements follow a uniform probability distribution. In addition, methods exist that exploit clustering of elements in order to achieve higher compression performance. In this work, we address the case where enumeration of elements may be arbitrary or random, but where statistics is kept in order to estimate probabilities of elements. We present a recursive subset-size encoding method that is able to benefit from statistics, and explore the effects of permuting the enumeration order based on element probabilities…
Statistics: a Bayesian perspective
National Research Council Canada - National Science Library
Berry, Donald A
1996-01-01
...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the scientific process, it develops ideas...
Statistical Analysis and Evaluation of the Depth of the Ruts on Lithuanian State Significance Roads
Directory of Open Access Journals (Sweden)
Erinijus Getautis
2011-04-01
Full Text Available The aim of this work is to gather information about rut depth on national flexible-pavement roads, to determine its statistical dispersion indices and to assess its compliance with the applicable requirements. An analysis of scientific works on the appearance of ruts in asphalt and their influence on driving is presented, as are dynamical models of ruts in asphalt. Experimental data on the dispersion of rut depth on the Lithuanian national highway Vilnius–Kaunas are prepared. Conclusions are formulated and presented. Article in Lithuanian
How to construct the statistic network? An association network of herbaceous
Directory of Open Access Journals (Sweden)
WenJun Zhang
2012-06-01
Full Text Available In the present study I define a new type of network, the statistic network. The statistic network is a weighted and non-deterministic network. In the statistic network, a connection value, i.e., connection weight, represents the connection strength and connection likelihood between two nodes, and its absolute value falls in the interval (0,1]. The connection value is expressed as a statistical measure such as a correlation coefficient, association coefficient, or Jaccard coefficient, etc. In addition, all connections of the statistic network can be statistically tested for their validity. A connection is true if the connection value is statistically significant. If none of the connection values of a node is statistically significant, it is an isolated node. An isolated node has no connection to other nodes in the statistic network. Positive and negative connection values denote distinct connection types (positive or negative association or interaction). In the statistic network, two nodes with a greater connection value will show a more similar trend in the change of their states. At any time we can obtain a sample network of the statistic network. A sample network is a non-weighted and deterministic network. The statistic network, in particular the plant association network constructed from field sampling, is mostly an information network. Most of the interspecific relationships in a plant community are competition and cooperation. Therefore, in comparison to animal networks, the methodology of the statistic network is more suitable for constructing plant association networks. Some conclusions were drawn from this study: (1) in the plant association network, most connections are weak and positive interactions. The association network constructed from Spearman rank correlation has the most connections, and isolated taxa are fewer. From net linear correlation, linear correlation, to Spearman rank correlation, the practical number of connections and connectance in the
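A minimal sketch of building such a statistic network from Spearman rank correlations follows; using a fixed critical value in place of a proper significance test, and ignoring ties in the ranking, are simplifying assumptions, as are the taxon names and abundances.

```python
def ranks(xs):
    """1-based ranks of a sequence (ties ignored in this sketch)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank + 1.0
    return r

def spearman(x, y):
    """Spearman rho: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx = my = (n + 1) / 2.0
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def association_network(samples, rho_crit=0.6):
    """Edges between taxa whose |Spearman rho| exceeds a critical value,
    standing in for the statistical validity test of each connection."""
    taxa = list(samples)
    edges = {}
    for i, a in enumerate(taxa):
        for b in taxa[i + 1:]:
            rho = spearman(samples[a], samples[b])
            if abs(rho) >= rho_crit:
                edges[(a, b)] = rho
    return edges

# Hypothetical abundances of three taxa across five quadrats
net = association_network({"A": [1, 2, 3, 4, 5],
                           "B": [2, 4, 6, 8, 10],
                           "C": [5, 4, 3, 2, 1]})
```

The signed rho plays the role of the connection value: positive edges (A-B) are cooperative associations, negative edges (A-C, B-C) competitive ones, and taxa with no surviving edge would be isolated nodes.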
Leung, Felix W; Koo, Malcolm; Cadoni, Sergio; Falt, Premysl; Hsieh, Yu-Hsi; Amato, Arnaldo; Erriu, Matteo; Fojtik, Petr; Gallittu, Paolo; Hu, Chi-Tan; Leung, Joseph W; Liggi, Mauro; Paggi, Silvia; Radaelli, Franco; Rondonotti, Emanuele; Smajstrla, Vit; Tseng, Chih-Wei; Urban, Ondrej
2018-03-02
To test the hypothesis that water exchange (WE) significantly increases adenoma detection rates (ADR) compared with water immersion (WI). Low ADR was linked to increased risk for interval colorectal cancers and related deaths. Two recent randomized controlled trials of head-to-head comparison of WE, WI, and traditional air insufflation (AI) each showed that WE achieved significantly higher ADR than AI, but not WI. The data were pooled from these 2 studies to test the above hypothesis. Two trials (5 sites, 14 colonoscopists) that randomized 1875 patients 1:1:1 to AI, WI, or WE were pooled and analyzed with ADR as the primary outcome. The ADR of AI (39.5%) and WI (42.4%) were comparable, and significantly lower than that of WE (49.6%) (vs. AI P=0.001; vs. WI P=0.033). WE insertion time was 3 minutes longer than that of AI, and WE showed a higher detection rate (vs. AI) of the >10 mm advanced adenomas. Right colon combined advanced and sessile serrated ADR of AI (3.4%) and WI (5%) were comparable and significantly lower than that of WE (8.5%) (vs. AI P<0.001; vs. WI P=0.039). Compared with AI and WI, the superior ADR of WE offsets the drawback of a significantly longer insertion time. For quality improvement focused on increasing adenoma detection, WE is preferred over WI. The hypothesis that WE could lower the risk of interval colorectal cancers and related deaths should be tested.
Trajkovski, Vladimir; Petlichkovski, Aleksandar; Efinska-Mladenovska, Olivija; Trajkov, Dejan; Arsov, Todor; Strezova, Ana; Ajdinski, Ljubomir; Spiroski, Mirko
2008-01-01
Specific IgA, IgG, and IgE antibodies to food antigens in 35 participants with autistic disorder and 21 of their siblings in the Republic of Macedonia were examined. Statistically significant higher plasma concentration of IgA antibodies against alpha-lactalbumin, beta-lactoglobulin, casein, and gliadin were found in the children with autistic…
Journal data sharing policies and statistical reporting inconsistencies in psychology.
Nuijten, M.B.; Borghuis, J.; Veldkamp, C.L.S.; Dominguez Alvarez, L.; van Assen, M.A.L.M.; Wicherts, J.M.
2018-01-01
In this paper, we present three retrospective observational studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of
Educating Grade 6 students for higher-order thinking and its influence on creativity
Directory of Open Access Journals (Sweden)
Wajeeh Daher
2017-08-01
Full Text Available Educating students for higher-order thinking provides them with tools that turn them into more critical thinkers. This supports them in overcoming problems that they encounter in life, as well as in becoming an integral part of society. Such education is promoted by educational organisations that emphasise the positive consequences of educating students for higher-order thinking, including creative thinking. One way to do so is through educational programmes that educate for higher-order thinking. One such programme is the Cognitive Research Trust (CoRT) thinking programme. The present research examined the effect of the participation of Grade 6 students in a CoRT programme on their creative thinking. Fifty-three students participated in the research; 27 participated in a CoRT programme, while 26 did not. The ANCOVA test showed that the students who participated in the CoRT programme significantly outperformed, in creative thinking, the students who did not. Moreover, the students in the CoRT programme whose achievement scores were between 86 and 100 significantly outperformed the other achievement groups of students. Furthermore, students with reported high ability significantly outperformed the other ability groups. The results did not show statistically significant differences in students' creativity attributed to gender.
Primary and secondary cases in Escherichia coli O157 outbreaks: a statistical analysis.
LENUS (Irish Health Repository)
Snedeker, Kate G
2009-01-01
BACKGROUND: Within outbreaks of Escherichia coli O157 (E. coli O157), at least 10-15% of cases are thought to have been acquired by secondary transmission. However, there has been little systematic quantification or characterisation of secondary outbreak cases worldwide. The aim of this study was to characterise secondary outbreak cases, estimate the overall proportion of outbreak cases that were the result of secondary transmission and to analyse the relationships between primary and secondary outbreak cases by mode of transmission, country and median age. METHODS: Published data was obtained from 90 confirmed Escherichia coli O157 outbreaks in Great Britain, Ireland, Scandinavia, Canada, the United States and Japan, and the outbreaks were described in terms of modes of primary and secondary transmission, country, case numbers and median case age. Outbreaks were tested for statistically significant differences in the number of ill, confirmed, primary and secondary cases (analysis of variance and Kruskal-Wallis) and in the rate of secondary cases between these variables (Generalised Linear Models). RESULTS: The outbreaks had a median of 13.5 confirmed cases, and mean proportion of 0.195 secondary cases. There were statistically significant differences in the numbers of ill, confirmed, primary and secondary cases between modes of primary transmission (p < 0.021), and in primary and secondary cases between median age categories (p < 0.039) and modes of secondary transmission (p < 0.001).Secondary case rates differed statistically significantly between modes of secondary and primary transmission and median age categories (all p < 0.001), but not between countries (p = 0.23). Statistically significantly higher rates of secondary transmission were found in outbreaks with a median age <6 years and those with secondary transmission via person to person spread in nurseries. No statistically significant interactions were found between country, mode of transmission and age
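The outbreak analysis above tests for differences in case counts across groups with a Kruskal-Wallis test. A minimal sketch of that comparison, using illustrative secondary-case counts by primary transmission mode (these numbers are invented, not the study's data):

```python
# Hedged sketch: Kruskal-Wallis test across transmission modes.
# The outbreak counts below are illustrative only.
from scipy.stats import kruskal

foodborne = [2, 5, 1, 8, 3, 0, 4]        # secondary cases per outbreak
waterborne = [10, 12, 7, 15, 9]
animal_contact = [1, 0, 2, 1, 3, 0]

stat, p = kruskal(foodborne, waterborne, animal_contact)
print(f"H = {stat:.2f}, p = {p:.4f}")
```

A small p-value here, as in the study, would indicate that secondary-case counts differ across modes of primary transmission.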
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Directory of Open Access Journals (Sweden)
Sreeram V Ramagopalan
2015-04-01
Full Text Available Background: We and others have shown a significant proportion of interventional trials registered on ClinicalTrials.gov have their primary outcomes altered after the listed study start and completion dates. The objectives of this study were to investigate whether changes made to primary outcomes are associated with the likelihood of reporting a statistically significant primary outcome on ClinicalTrials.gov. Methods: A cross-sectional analysis of all interventional clinical trials registered on ClinicalTrials.gov as of 20 November 2014 was performed. The main outcome was any change made to the initially listed primary outcome and the time of the change in relation to the trial start and end date. Findings: 13,238 completed interventional trials were registered with ClinicalTrials.gov that also had study results posted on the website. 2555 (19.3%) had one or more statistically significant primary outcomes. Statistical analysis showed that registration year, funding source and primary outcome change after trial completion were associated with reporting a statistically significant primary outcome. Conclusions: Funding source and primary outcome change after trial completion are associated with a statistically significant primary outcome report on ClinicalTrials.gov.
McGinnigle, Samantha; Eperjesi, Frank; Naroo, Shehzad A
2014-04-01
To study the effects of ocular lubricants on higher order aberrations in normal and self-diagnosed dry eyes. Unpreserved hypromellose drops, Tears Again™ liposome spray and a combination of both were administered to the right eye of 24 normal and 24 dry eye subjects following classification according to a 5-point questionnaire. Total ocular higher order aberrations, coma, spherical aberration and Strehl ratios for higher order aberrations were measured using the Nidek OPD-Scan III (Nidek Technologies, Gamagori, Japan) at baseline, immediately after application and after 60 min. The aberration data were analyzed over a 5 mm natural pupil using Zernike polynomials. Each intervention was assessed on a separate day and comfort levels were recorded before and after application. Corneal staining was assessed and product preference recorded after the final measurement for each intervention. Hypromellose drops caused an increase in total higher order aberrations and a reduction in Strehl ratio in both normal and dry eyes (dry eyes: p=0.01) immediately after instillation. There were no significant differences between normal and self-diagnosed dry eyes in response to intervention and no improvement in visual quality or reduction in higher order aberrations after 60 min. Differences in comfort levels failed to reach statistical significance. Combining treatments does not offer any benefit over individual treatments in self-diagnosed dry eyes and no individual intervention reached statistical significance. Symptomatic subjects with dry eye and no corneal staining reported an improvement in comfort after using lubricants. Copyright © 2013 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.
Statistical analysis and data management
International Nuclear Information System (INIS)
Anon.
1981-01-01
This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship between variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine was found.
Directory of Open Access Journals (Sweden)
Mashhood Ahmed Sheikh
2017-08-01
mediate the association between childhood adversity and ADS in adulthood. However, when education was excluded as a mediator-response confounding variable, the indirect effect of childhood adversity on ADS in adulthood was statistically significant (p < 0.05). This study shows that a careful inclusion of potential confounding variables is important when assessing mediation.
Zhang, P; Jones, R M
2014-01-01
Beam-excited higher order modes (HOMs) can be used to provide beam diagnostics. Here we focus on 3.9 GHz superconducting accelerating cavities. In particular we study dipole mode excitation and its application to beam position determinations. In order to extract beam position information, linear regression can be used. Due to the large number of sampling points in the waveforms, statistical methods such as singular value decomposition (SVD) and k-means clustering are used to effectively reduce the dimension of the system. These are compared with direct linear regression (DLR) on the entire waveforms. A cross-validation technique is used to study the sample-independent precisions of the position predictions given by these three methods. An RMS prediction error in the beam position of approximately 50 μm can be achieved by DLR and SVD, while k-means clustering suggests 70 μm.
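The SVD-based pipeline described above can be sketched as follows: reduce the high-dimensional HOM waveforms to a few singular-vector coordinates, then linearly regress beam position on those coordinates. The waveforms, offsets, and noise level below are all simulated; only the overall approach follows the abstract.

```python
# Hedged sketch: SVD dimension reduction + linear regression for beam
# position from simulated HOM waveforms (all data here is synthetic).
import numpy as np

rng = np.random.default_rng(1)
n_bunches, n_samples = 200, 1000
positions = rng.uniform(-1, 1, n_bunches)           # "true" beam offsets (mm)
basis = rng.normal(size=(3, n_samples))             # latent waveform shapes
coeffs = np.c_[positions, np.ones(n_bunches), rng.normal(size=n_bunches)]
waveforms = coeffs @ basis + 0.01 * rng.normal(size=(n_bunches, n_samples))

# SVD: keep the leading singular vectors as a low-dimensional description
U, s, Vt = np.linalg.svd(waveforms - waveforms.mean(0), full_matrices=False)
reduced = U[:, :3] * s[:3]                          # n_bunches x 3 coordinates

# Linear regression from reduced coordinates to beam position
A = np.c_[reduced, np.ones(n_bunches)]
w, *_ = np.linalg.lstsq(A, positions, rcond=None)
rms_error = np.sqrt(np.mean((A @ w - positions) ** 2))
print(f"RMS prediction error: {rms_error:.4f} mm")
```

In practice the prediction error would be validated on held-out bunches (the cross-validation mentioned in the abstract) rather than on the training data as in this sketch.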
[Big data in official statistics].
Zwick, Markus
2015-08-01
The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Before big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have agreed on a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.
Student Drop-Out from German Higher Education Institutions
Heublein, Ulrich
2014-01-01
28% of students of any one year currently give up their studies in bachelor degree programmes at German higher education institutions. Drop-out is to be understood as the definite termination in the higher education system without obtaining an academic degree. The drop-out rate is thereby calculated with the help of statistical estimation…
Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P
1999-01-01
Functional neuroimaging (FNI) provides experimental access to the intact living brain, making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on the assumptions and limitations of the methods reviewed. Several methods are available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the methods available, it is important to know their limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149
A perceptual space of local image statistics.
Victor, Jonathan D; Thengone, Daniel J; Rizvi, Syed M; Conte, Mary M
2015-12-01
Local image statistics are important for visual analysis of textures, surfaces, and form. There are many kinds of local statistics, including those that capture luminance distributions, spatial contrast, oriented segments, and corners. While sensitivity to each of these kinds of statistics has been well studied, much less is known about visual processing when multiple kinds of statistics are relevant, in large part because the dimensionality of the problem is high and different kinds of statistics interact. To approach this problem, we focused on binary images on a square lattice, a reduced set of stimuli which nevertheless taps many kinds of local statistics. In this 10-parameter space, we determined psychophysical thresholds to each kind of statistic (16 observers) and all of their pairwise combinations (4 observers). Sensitivities and isodiscrimination contours were consistent across observers. Isodiscrimination contours were elliptical, implying a quadratic interaction rule, which in turn determined ellipsoidal isodiscrimination surfaces in the full 10-dimensional space and made predictions for sensitivities to complex combinations of statistics. These predictions, including the prediction of a combination of statistics that was metameric to random, were verified experimentally. Finally, check size had only a mild effect on sensitivities over the range from 2.8 to 14 min, but sensitivities to second- and higher-order statistics were substantially lower at 1.4 min. In sum, local image statistics form a perceptual space that is highly stereotyped across observers, in which different kinds of statistics interact according to simple rules. Copyright © 2015 Elsevier Ltd. All rights reserved.
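The quadratic (elliptical) combination rule described above can be sketched in two dimensions: given thresholds along two single-statistic axes and an interaction term, the elliptical isodiscrimination contour predicts the threshold in any mixed direction. The threshold values and interaction term below are illustrative, not the paper's measurements.

```python
# Hedged sketch: elliptical isodiscrimination contour from a quadratic form.
# t1, t2 and rho are illustrative values, not measured data.
import numpy as np

t1, t2 = 0.2, 0.4        # thresholds along each single-statistic axis
rho = -0.3               # interaction term (off-diagonal of the quadratic form)

# Quadratic form Q such that Q(x) = x^T Q x = 1 on the contour
Q = np.array([[1 / t1**2, rho / (t1 * t2)],
              [rho / (t1 * t2), 1 / t2**2]])

def threshold(direction):
    """Distance along `direction` at which the contour is reached."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    return 1 / np.sqrt(d @ Q @ d)

print(threshold([1, 0]))   # recovers t1 = 0.2 along the first axis
print(threshold([1, 1]))   # predicted threshold for an equal mixture
```

Fitting such a quadratic form to pairwise thresholds is what lets the contours in two dimensions predict sensitivities to arbitrary combinations in the full space.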
Dong, Y; Zhou, M; Ba, X J; Si, J W; Li, W T; Wang, Y; Li, D; Li, T
2016-10-18
To determine the clinicopathological significance of DNA methyltransferase 3B (DNMT3B) overexpression in endometrial carcinomas and to evaluate its correlation with hormone receptor status. Immunohistochemistry was performed to assess the expression of DNMT3B and hormone receptors in 104 endometrial carcinomas. DNMT3B overexpression occurred more frequently in endometrioid carcinoma (EC, 54.8%) than in nonendometrioid carcinoma (NEC, 30.0%), with statistical significance (P=0.028). Furthermore, there was a trend that EC with worse clinicopathological variables and shorter survival had higher DNMT3B expression, and the correlation between DNMT3B and tumor grade reached statistical significance (P=0.019). A negative correlation between DNMT3B and estrogen receptor (ER) or progesterone receptor (PR) expression was found in EC. DNMT3B overexpression occurred more frequently in the ER- or PR-negative subgroups (78.9%, 86.7%) than in the positive subgroups (47.7%, 47.8%), with statistical significance (P=0.016, P=0.006). In addition, DNMT3B overexpression increased in tumors with both ER- and PR-negative expression (92.9%, P=0.002). However, no such correlation was found in NEC (P>0.05). Sequence analyses demonstrated multiple ER and PR binding sites in the promoter regions of the DNMT3B gene. This study showed that the expression of DNMT3B in EC and NEC was different. DNMT3B overexpression in EC was associated with worse clinicopathological variables and might have predictive value. The methylation status of EC and NEC may be different. In addition, in EC, DNMT3B overexpression negatively correlated with ER or PR expression. In NEC, the correlation between DNMT3B and ER or PR status was not present.
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
Thyroid Autoimmunity and Behçet’s Disease: Is There a Significant Association?
Directory of Open Access Journals (Sweden)
Filiz Cebeci
2013-01-01
Full Text Available Background. Behçet's disease (BD) could be regarded as an autoimmune disease in many aspects. Autoimmune thyroid disease (ATD) is frequently accompanied by various other autoimmune diseases. Nevertheless, there are still not enough data showing an association between BD and ATD. In addition, no controlled study in PubMed evaluates thyroid autoimmunity using antithyroid peroxidase antibody in a large series of patients with BD. Methods. We aimed to investigate the frequency of ATD in patients with BD. The study included 124 patients with BD and 99 age- and sex-matched healthy volunteers. Results. Autoimmune thyroiditis was noted in 21 cases (16.9%) with BD. In the control group, 22 cases (22.2%) were diagnosed as autoimmune thyroiditis. There was no difference between the groups in respect to thyroid autoantibodies. There were no statistically significant differences between baseline TSH levels of the BD patients and of the controls. Statistically, the mean serum free T4 levels of the patients with BD were higher than those of the controls. Conclusions. No association could be found between BD and ATD. Therefore, it is not of significance to investigate thyroid autoimmunity in BD.
Official Statistics and Statistics Education: Bridging the Gap
Directory of Open Access Journals (Sweden)
Gal Iddo
2017-03-01
Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.
Quantifying the Clinical Significance of Cannabis Withdrawal
Allsop, David J.; Copeland, Jan; Norberg, Melissa M.; Fu, Shanlin; Molnar, Anna; Lewis, John; Budney, Alan J.
2012-01-01
Background and Aims Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. Methods and Results A volunteer sample of 49 non-treatment seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis with a one month follow up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p = 0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p = 0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one month follow up (p = 0.001). Conclusions Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small and the use of a non-treatment seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes. PMID:23049760
Quantifying the clinical significance of cannabis withdrawal.
Directory of Open Access Journals (Sweden)
David J Allsop
Full Text Available Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis-induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. A volunteer sample of 49 non-treatment seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis with a one month follow up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p=0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p=0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one month follow up (p=0.001). Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small and the use of a non-treatment seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes.
Muthiah, Saravanan; Singh, R C; Pathak, B D; Avasthi, Piyush Kumar; Kumar, Rishikesh; Kumar, Anil; Srivastava, A K; Dhar, Ajay
2018-01-25
The limited thermoelectric performance of p-type Higher Manganese Silicides (HMS) in terms of their low figure-of-merit (ZT), which is far below unity, is the main bottleneck for realising an efficient HMS-based thermoelectric generator. HMS has been recognized as the most promising material for harnessing waste heat in the mid-temperature range, owing to its thermal stability and the earth-abundant, environmentally friendly nature of its constituent elements. We report a significant enhancement in the thermoelectric performance of nanostructured HMS synthesized using rapid solidification, by optimizing the cooling rates during melt-spinning followed by spark plasma sintering of the resulting melt-spun ribbons. By employing this experimental strategy, an unprecedented ZT ∼ 0.82 at 800 K was realized in spark plasma sintered 5 at% Al-doped MnSi1.73 HMS, melt-spun at an optimized high cooling rate of ∼2 × 10⁷ K s⁻¹. This enhancement in ZT represents a ∼25% increase over the best values reported thus far for HMS and primarily originates from a nano-crystalline microstructure consisting of a HMS matrix (20-40 nm) with excess Si (3-9 nm) uniformly distributed in it. This nanostructure, resulting from the high cooling rates employed during the melt-spinning of HMS, introduces a high density of nano-crystallite boundaries across a wide spectrum of nano-scale dimensions, which scatter the low-to-mid-wavelength heat-carrying phonons. This abundant phonon scattering results in a significantly reduced thermal conductivity of ∼1.5 W m⁻¹ K⁻¹ at 800 K, which primarily contributes to the enhancement in ZT.
Trowler, Paul, Ed.; Saunders, Murray, Ed.; Bamber, Veronica, Ed.
2012-01-01
The "tribes and territories" metaphor for the cultures of academic disciplines and their roots in different knowledge characteristics has been used by those interested in university life and work since the early 1990s. This book draws together research, data and theory to show how higher education has gone through major change since then…
Directory of Open Access Journals (Sweden)
Jelte M Wicherts
Full Text Available BACKGROUND: The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically. METHODS AND FINDINGS: We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence (against the null hypothesis of no effect and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance. CONCLUSIONS: Our findings on the basis of psychological papers suggest that statistical results are particularly hard to verify when reanalysis is more likely to lead to contrasting conclusions. This highlights the importance of establishing mandatory data archiving policies.
Energy Technology Data Exchange (ETDEWEB)
Zhang, Pei, E-mail: pei.zhang@desy.de [School of Physics and Astronomy, The University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Deutsches Elektronen-Synchrotron (DESY), Notkestraße 85, D-22607 Hamburg (Germany); Cockcroft Institute of Science and Technology, Daresbury WA4 4AD (United Kingdom); Baboi, Nicoleta [Deutsches Elektronen-Synchrotron (DESY), Notkestraße 85, D-22607 Hamburg (Germany); Jones, Roger M. [School of Physics and Astronomy, The University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Cockcroft Institute of Science and Technology, Daresbury WA4 4AD (United Kingdom)
2014-01-11
Beam-excited higher order modes (HOMs) can be used to provide beam diagnostics. Here we focus on 3.9 GHz superconducting accelerating cavities. In particular we study dipole mode excitation and its application to beam position determinations. In order to extract beam position information, linear regression can be used. Due to a large number of sampling points in the waveforms, statistical methods are used to effectively reduce the dimension of the system, such as singular value decomposition (SVD) and k-means clustering. These are compared with the direct linear regression (DLR) on the entire waveforms. A cross-validation technique is used to study the sample independent precisions of the position predictions given by these three methods. A RMS prediction error in the beam position of approximately 50 μm can be achieved by DLR and SVD, while k-means clustering suggests 70 μm.
International Nuclear Information System (INIS)
Shakespeare, T.P.; Mukherjee, R.K.; Gebski, V.J.
2003-01-01
Confidence levels, clinical significance curves, and risk-benefit contours are tools that improve the analysis of clinical studies and minimize misinterpretation of published results; however, no software has been available for their calculation. The objective was to develop software to help clinicians utilize these tools. Excel 2000 spreadsheets were designed using only built-in functions, without macros. The workbook was protected and encrypted so that users can modify only input cells. The workbook has 4 spreadsheets for use in studies comparing two patient groups. Sheet 1 comprises instructions and graphic examples for use. Sheet 2 allows the user to input the main study results (e.g. survival rates) into a 2-by-2 table. Confidence intervals (95%), the p-value and the confidence level for Treatment A being better than Treatment B are automatically generated. An additional input cell allows the user to determine the confidence associated with a specified level of benefit. For example, if the user wishes to know the confidence that Treatment A is at least 10% better than B, 10% is entered. Sheet 2 automatically displays clinical significance curves, graphically illustrating confidence levels for all possible benefits of one treatment over the other. Sheet 3 allows input of toxicity data, and calculates the confidence that one treatment is more toxic than the other. It also determines the confidence that the relative toxicity of the most effective arm does not exceed user-defined tolerability. Sheet 4 automatically calculates risk-benefit contours, displaying the confidence associated with a specified scenario of minimum benefit and maximum risk of one treatment arm over the other. The spreadsheet is freely downloadable at www.ontumor.com/professional/statistics.htm. A simple, self-explanatory, freely available spreadsheet calculator was developed using Excel 2000. The incorporated decision-making tools can be used for data analysis and improve the reporting of results of any
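The core "confidence level" calculation the spreadsheet above describes can be sketched as follows: from a 2-by-2 table of outcomes per arm, estimate the confidence that Treatment A is better than B, and that it is better by at least a specified margin. This uses a normal approximation to the difference of two proportions; the abstract does not state the spreadsheet's exact formula, so this is an assumed implementation, and the counts are illustrative.

```python
# Hedged sketch: confidence that Treatment A beats Treatment B (by at
# least `margin`), via a normal approximation to a difference of
# proportions. Assumed method; example counts are invented.
from math import sqrt, erf

def confidence_a_better(success_a, n_a, success_b, n_b, margin=0.0):
    """P(true benefit of A over B exceeds `margin`), approximately."""
    pa, pb = success_a / n_a, success_b / n_b
    se = sqrt(pa * (1 - pa) / n_a + pb * (1 - pb) / n_b)
    z = (pa - pb - margin) / se
    return 0.5 * (1 + erf(z / sqrt(2)))   # standard normal CDF Phi(z)

# e.g. 40/60 survivors on A vs. 30/60 on B:
conf = confidence_a_better(40, 60, 30, 60)
conf_10pct = confidence_a_better(40, 60, 30, 60, margin=0.10)
print(conf, conf_10pct)
```

Sweeping `margin` over a range of benefits and plotting the resulting confidences would reproduce the "clinical significance curve" idea described for Sheet 2.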
International Nuclear Information System (INIS)
Zhao Yiming; He Yang; Xu Haiyan; Ruan Changgeng
2010-01-01
Objective: To investigate the clinical significance of detecting soluble platelet glycocalicin (sGC) and thrombopoietin (TPO) in the differential diagnosis of idiopathic thrombocytopenic purpura (ITP) and aplastic anemia (AA). Methods: Plasma sGC and serum TPO in 83 patients with ITP, 47 patients with AA and 50 normal individuals were detected by immunoradiometric assay (IRMA) and enzyme-linked immunosorbent assay (ELISA), respectively. Statistical analysis was performed using the Q test with a significance level of P=0.05. Serum TPO level in the AA group was significantly higher than that in the ITP and normal groups: (857.43 ± 228.43) ng/L vs (90.32 ± 39.43) ng/L and (70.29 ± 25.16) ng/L; these differences were statistically significant (Q=24.45 and 18.25, both P < 0.01). Conclusion: Detecting plasma sGC and serum TPO might be helpful for differentiating ITP and AA and for understanding the pathophysiology of thrombocytopenia. (authors)
Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology
Directory of Open Access Journals (Sweden)
Michèle B. Nuijten
2017-12-01
Full Text Available In this paper, we present three retrospective observational studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of statistical significance (Wicherts, Bakker, & Molenaar, 2011). We therefore hypothesized that journal policies about data sharing and data sharing itself would reduce these inconsistencies. In Study 1, we compared the prevalence of reporting inconsistencies in two similar journals on decision making with different data sharing policies. In Study 2, we compared reporting inconsistencies in psychology articles published in PLOS journals (with a data sharing policy) and Frontiers in Psychology (without a stipulated data sharing policy). In Study 3, we looked at papers published in the journal Psychological Science to check whether papers with or without an Open Practice Badge differed in the prevalence of reporting errors. Overall, we found no relationship between data sharing and reporting inconsistencies. We did find that journal policies on data sharing seem extremely effective in promoting data sharing. We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.
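A "reporting inconsistency" in the sense used above is a reported p-value that does not match its own test statistic and degrees of freedom. A minimal sketch of that check (in the spirit of tools like statcheck, though this is not the authors' code) for a reported t-test:

```python
# Hedged sketch: recompute a two-tailed p-value from a reported t statistic
# and df, and flag a mismatch. The example reports below are invented.
from scipy import stats

def check_t_report(t_value, df, reported_p, tol=0.002, two_tailed=True):
    """Return (recomputed_p, consistent?) for a reported t-test result."""
    p = stats.t.sf(abs(t_value), df)
    if two_tailed:
        p *= 2
    return p, abs(p - reported_p) <= tol

# Consistent report: t(28) = 2.20, p = .036
print(check_t_report(2.20, 28, 0.036))
# Inconsistent report: t(28) = 1.45, p = .03 (recomputed p is ~.16)
print(check_t_report(1.45, 28, 0.03))
```

The tolerance here is a crude stand-in for the rounding rules real checkers apply; inconsistencies that flip a result across the .05 boundary are the ones the studies above single out.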
Gryko, Anna; Głowińska-Olszewska, Barbara; Płudowska, Katarzyna; Smithson, W Henry; Owłasiuk, Anna; Żelazowska-Rutkowska, Beata; Wojtkielewicz, Katarzyna; Milewski, Robert; Chlabicz, Sławomir
2017-01-01
In recent years, alterations in carbohydrate metabolism, including insulin resistance, have been considered risk factors in the development of hypertension and its complications at a young age. Hypertension is associated with significant cardiovascular morbidity and mortality. The onset of the pathology responsible for the development of hypertension, as well as the levels of biomarkers specific for early stages of atherosclerosis, are poorly understood. To compare a group of children whose parents have a history of hypertension (study group) with a group of children with normotensive parents (reference group), with consideration of typical risk factors for atherosclerosis, parameters of lipid and carbohydrate metabolism, anthropometric data and new biomarkers of early cardiovascular disease (hsCRP, adiponectin, sICAM-1). The study population consists of 84 children. Of these, 40 children (mean age 13.6±2.7 years) had a parental history of hypertension, and 44 aged 13.1±3.7 years were children of normotensive parents. Anthropometric measurements were taken, and measurements of blood pressure, lipid profile, glucose and insulin levels were carried out. The insulin resistance index (HOMA IR) was calculated. Levels of hsCRP, soluble cell adhesion molecules (sICAM) and adiponectin were measured. There were no statistically significant differences in anthropometric parameters (body mass, SDS BMI, skin folds) between groups. Values of systolic blood pressure were statistically significantly higher in the study group (Me 108 vs. 100 mmHg, p=0.031), as were glycaemia (Me 80 vs. 67 mg/dl) and HOMA IR (higher in children of hypertensive parents: Me 1.68 vs. 0.80 mmol/l × mU/l, p=0.007). Lower adiponectin levels (Me 13959.45 vs. 16822 ng/ml, p=0.020) were found in children with a family history of hypertension. No significant differences were found in the levels of sICAM, hsCRP, and parameters of lipid metabolism. Family history of hypertension is correlated with higher values of systolic blood
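The HOMA IR index used in the study above is conventionally computed as fasting insulin (mU/l) × fasting glucose (mmol/l) / 22.5. A minimal sketch, taking the study group's median glucose of 80 mg/dl and an assumed (hypothetical) fasting insulin value:

```python
# Hedged sketch of the conventional HOMA-IR formula. The glucose value
# matches the abstract's study-group median; the insulin value is assumed.
def homa_ir(glucose_mg_dl: float, insulin_mu_l: float) -> float:
    """HOMA-IR from fasting glucose (mg/dl) and fasting insulin (mU/l)."""
    glucose_mmol_l = glucose_mg_dl / 18.0   # convert mg/dl -> mmol/l
    return glucose_mmol_l * insulin_mu_l / 22.5

# Glucose 80 mg/dl with an assumed fasting insulin of 8.5 mU/l:
print(round(homa_ir(80, 8.5), 2))  # → 1.68
```

Note the abstract reports HOMA IR in "mmol/l × mU/l", i.e. the raw glucose-insulin product; whether the authors applied the /22.5 normalisation is not stated, so the formula above is the standard convention rather than a confirmed reproduction.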
Clinical significance of intramammary arterial calcifications in diabetic women
Directory of Open Access Journals (Sweden)
Milošević Zorica
2004-01-01
Full Text Available Background. It is well known that intramammary arterial calcifications diagnosed by mammography, as a part of generalized diabetic macroangiopathy, may be an indirect sign of diabetes mellitus. Hence, the aim of this study was to determine the incidence of intramammary arterial calcifications and the patients' age when the calcifications occur, as well as to observe the influence of diabetic polyneuropathy, type, and duration of diabetes on the onset of calcifications, in comparison with nondiabetic women. Methods. Mammographic findings of 113 diabetic female patients (21 with type 1 diabetes and 92 with type 2), as well as of 208 nondiabetic women (the control group), were analyzed in the prospective study. Data about the type of diabetes, its duration, and polyneuropathy were obtained using a questionnaire. Statistical differences were determined by the Mann-Whitney test. Results. Intramammary arterial calcifications were identified in 33.3% of the women with type 1 diabetes, in 40.2% with type 2, and in 8.2% of the women from the control group, respectively. The differences comparing the women with type 1, as well as type 2, diabetes and the controls were statistically significant (p=0.0001). Women with intramammary arterial calcifications and type 1 diabetes were younger compared to the control group (median age 52 vs. 67 years, p=0.001), while there was no statistically significant difference in age between the women with calcifications and type 2 diabetes (61 years) and the control group (p=0.176). The incidence of polyneuropathy in diabetic women was higher in the group with intramammary arterial calcifications (52.3%) than in the group without calcifications (26.1%; p=0.005). No association between intramammary arterial calcifications and the duration of diabetes was found. Conclusion. The obtained results support the theory that intramammary arterial calcifications detected by mammography may be an indirect sign of generalized diabetic macroangiopathy.
An application of an optimal statistic for characterizing relative orientations
Jow, Dylan L.; Hill, Ryley; Scott, Douglas; Soler, J. D.; Martin, P. G.; Devlin, M. J.; Fissel, L. M.; Poidevin, F.
2018-02-01
We present the projected Rayleigh statistic (PRS), a modification of the classic Rayleigh statistic, as a test for non-uniform relative orientation between two pseudo-vector fields. In the application here, this gives an effective way of investigating whether polarization pseudo-vectors (spin-2 quantities) are preferentially parallel or perpendicular to filaments in the interstellar medium. There are also other potential applications in astrophysics, e.g. when comparing small-scale orientations with larger-scale shear patterns. We compare the efficiency of the PRS against histogram-binning methods that have previously been used for characterizing the relative orientations of gas column density structures with the magnetic field projected on the plane of the sky. We examine data for the Vela C molecular cloud, where the column density is inferred from Herschel submillimetre observations, and the magnetic field from observations by the Balloon-borne Large-Aperture Submillimetre Telescope in the 250-, 350- and 500-μm wavelength bands. We find that the PRS has greater statistical power than approaches that bin the relative orientation angles, as it makes more efficient use of the information contained in the data. In particular, the use of the PRS to test for preferential alignment results in a higher statistical significance, in each of the four Vela C regions, with the greatest increase being by a factor of 1.3 in the South-Nest region in the 250-μm band.
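For relative-orientation angles θ defined on [0, π), the PRS takes the form Z = Σ cos(2θ) / √(n/2): strongly positive values indicate preferentially parallel alignment, strongly negative values preferentially perpendicular alignment. A minimal sketch (our own illustration, not the authors' code):

```python
import numpy as np


def projected_rayleigh_statistic(theta: np.ndarray) -> float:
    """Projected Rayleigh statistic for relative-orientation angles.

    theta: relative orientations in radians on [0, pi), appropriate
    for pseudo-vectors such as polarization orientations. The factor
    of 2 maps the half-circle onto the full circle; the sqrt(n/2)
    normalization makes Z approximately standard normal under the
    null hypothesis of uniform relative orientation.
    """
    theta = np.asarray(theta, dtype=float)
    n = theta.size
    return float(np.sum(np.cos(2.0 * theta)) / np.sqrt(n / 2.0))
```

With 50 perfectly parallel angles (θ = 0) the statistic is √(2n) = 10; with 50 perpendicular angles (θ = π/2) it is −10.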
Review of the Statistical Techniques in Medical Sciences | Okeh ...
African Journals Online (AJOL)
... medical researcher in selecting the appropriate statistical techniques. Of course, all statistical techniques have certain underlying assumptions, which must be checked before the technique is applied. Keywords: Variable, Prospective Studies, Retrospective Studies, Statistical significance. Bio-Research Vol. 6 (1) 2008: pp.
Statistical analysis of brake squeal noise
Oberst, S.; Lai, J. C. S.
2011-06-01
Despite substantial research efforts applied to the prediction of brake squeal noise since the early 20th century, the mechanisms behind its generation are still not fully understood. Squealing brakes are of significant concern to the automobile industry, mainly because of the costs associated with warranty claims. In order to remedy the problems inherent in designing quieter brakes and, therefore, to understand the mechanisms, a design of experiments study, using a noise dynamometer, was performed by a brake system manufacturer to determine the influence of geometrical parameters (namely, the number and location of slots) of brake pads on brake squeal noise. The experimental results were evaluated with a noise index and ranked for warm and cold brake stops. These data are analysed here using statistical descriptors based on population distributions, and a correlation analysis, to gain greater insight into the functional dependency between the time-averaged friction coefficient as the input and the peak sound pressure level data as the output quantity. The correlation analysis between the time-averaged friction coefficient and peak sound pressure data is performed by applying a semblance analysis and a joint recurrence quantification analysis. Linear measures are compared with complexity measures (nonlinear) based on statistics from the underlying joint recurrence plots. Results show that linear measures cannot be used to rank the noise performance of the four test pad configurations. On the other hand, the ranking of the noise performance of the test pad configurations based on the noise index agrees with that based on nonlinear measures: the higher the nonlinearity between the time-averaged friction coefficient and peak sound pressure, the worse the squeal. These results highlight the nonlinear character of brake squeal and indicate the potential of using nonlinear statistical analysis tools to analyse disc brake squeal.
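The joint recurrence analysis used above for the friction-noise dependency can be illustrated with a toy scalar version (no time-delay embedding; the function names and thresholds are our own illustrative choices, not the study's implementation):

```python
import numpy as np


def recurrence_matrix(x: np.ndarray, eps: float) -> np.ndarray:
    """R[i, j] = 1 when states x[i] and x[j] are closer than eps."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)


def joint_recurrence_rate(x: np.ndarray, y: np.ndarray,
                          eps_x: float, eps_y: float) -> float:
    """Fraction of time pairs at which BOTH series recur.

    The joint recurrence plot is the elementwise product of the two
    individual recurrence matrices; its mean (the joint recurrence
    rate) is one of the simplest joint recurrence quantification
    measures and is sensitive to nonlinear dependencies that a
    linear correlation can miss.
    """
    jrp = recurrence_matrix(x, eps_x) * recurrence_matrix(y, eps_y)
    return float(jrp.mean())
```

When y tracks x exactly, the joint rate equals the individual recurrence rate; for unrelated series it drops toward the product of the two rates.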
The efficacy of adult Christian support groups in coping with the death of a significant loved one.
Goodman, Herbert; Stone, Mark H
2009-09-01
Psychologists sometimes minimize important resources, such as religion and spiritual beliefs, for coping with bereavement. The alienation of therapeutic psychology from religious values contrasts with professional and public interest in religious experience and commitment. A more supportive viewpoint has come about partially as a result of recognizing important values which clinicians have found absent in many of their clients. Until spiritual belief systems become integrated into the work of clinicians, clients may not be able to fully integrate them in coping with loss. The key finding of this study was that individuals who participated in Christian and secular support groups showed no statistically significant difference in their mean endorsement of negative criteria on the BHS, and no statistically significant difference in their mean endorsement of positive criteria on the RCOPE. Thus, a Christian-oriented approach was no less effective than a psychologically oriented one. In both groups, a spiritual connection to a specific or generalized higher power was frequently identified, which clients credited with facilitating the management of their coping.
The Playground Game: Inquiry‐Based Learning About Research Methods and Statistics
Westera, Wim; Slootmaker, Aad; Kurvers, Hub
2014-01-01
The Playground Game is a web-based game that was developed for teaching research methods and statistics to nursing and social sciences students in higher education and vocational training. The complexity and abstract nature of research methods and statistics poses many challenges for students. The
Implementation of an adaptive training and tracking game in statistics teaching
Groeneveld, C.M.; Kalz, M.; Ras, E.
2014-01-01
Statistics teaching in higher education has a number of challenges. An adaptive training, tracking and teaching tool in a gaming environment aims to address problems inherent in statistics teaching. This paper discusses the implementation of this tool in a large first year university programme and
Statistical distribution of resonance parameters for inelastic scattering of fast neutrons
International Nuclear Information System (INIS)
Radunovic, J.
1973-01-01
This paper deals with the application of statistical methods to the analysis of nuclear reactions involving complex nuclei. It is shown that inelastic neutron scattering, which proceeds through the creation of a compound nucleus in the higher energy range, can be treated by a statistical approach.
Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.
2012-12-01
Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) it explicitly adjusts the statistical significance to any bias introduced by variance reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric, computer-intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both of the programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is in the public domain, provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/). Different examples (with simulated and
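The permutation approach to significance described above can be sketched with SciPy's `lombscargle` (a simplified illustration of the general idea, not the paper's Fortran programs; note that `lombscargle` takes angular frequencies):

```python
import numpy as np
from scipy.signal import lombscargle


def ls_peak_significance(t, y, freqs, n_perm=200, seed=0):
    """Permutation p-value for the highest Lomb-Scargle peak.

    Shuffling y against the fixed, uneven sampling times t destroys
    any periodicity while preserving the sampling pattern, yielding
    a null distribution for the maximum periodogram ordinate.
    """
    rng = np.random.default_rng(seed)
    y = y - y.mean()
    observed = lombscargle(t, y, freqs).max()
    exceed = sum(
        lombscargle(t, rng.permutation(y), freqs).max() >= observed
        for _ in range(n_perm)
    )
    # +1 correction keeps the p-value strictly positive
    return observed, (exceed + 1) / (n_perm + 1)


# Uneven sampling of a pure sinusoid: the peak should be highly significant.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 100.0, 120))
y = np.sin(0.7 * t)
peak, p = ls_peak_significance(t, y, freqs=np.linspace(0.1, 2.0, 80))
```

The permutation test makes no parametric assumption about the periodogram's null distribution, which is the advantage highlighted in the abstract when neighbouring frequencies are correlated.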
Occupational stress and organisational commitment of employees at higher educational institution
Directory of Open Access Journals (Sweden)
Simbarashe Zhuwao
2015-12-01
Full Text Available The objective of this study was to determine the relationship between occupational stress and the organisational commitment of employees at a higher education institution. A random sample (N=30) was chosen from academic staff within the university. The study used a quantitative design. The Organisational Stress Screening Tool (ASSET) and Allen and Meyer's Organisational Commitment Tool (OCT) were administered. The study revealed that a statistically significant relationship exists between occupational stress and the organisational commitment of employees. The study also showed that academic staff overall experienced average levels of occupational stress and organisational commitment. Job characteristics and work relationships were found to be the major sources of occupational stress. It is recommended that higher education institutions improve employee participation in decision making to reduce employees' stress as a result of unmanageable workloads and overload.
Yilmaz, Ferkan
2012-06-01
The exact analysis of the higher-order statistics of the channel capacity (i.e., higher-order ergodic capacity) often leads to complicated expressions involving advanced special functions. In this paper, we provide a generic framework for the computation of the higher-order statistics of the channel capacity over generalized fading channels. As such, this novel framework for the higher-order statistics results in simple, closed-form expressions which are shown to be asymptotically tight bounds in the high signal-to-noise ratio (SNR) regime for a variety of fading environments. In addition, it reveals the existence of differences (i.e., constant capacity gaps in the log-domain) among different fading environments. By an asymptotically tight bound we mean that the high-SNR limit of the difference between the actual higher-order statistics of the channel capacity and its asymptotic bound (i.e., lower bound) tends to zero. The mathematical formalism is illustrated with some selected numerical examples that validate the correctness of our newly derived results. © 2012 IEEE.
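The n-th order statistic of the capacity is the moment E[(ln(1+γ))ⁿ] of the instantaneous capacity over the fading distribution of the SNR γ. As a hedged illustration, here is a brute-force Monte Carlo estimate for the Rayleigh-fading special case (a numerical cross-check of the quantity being bounded, not the paper's closed-form framework):

```python
import numpy as np


def capacity_moments(snr_db: float, order: int,
                     n_samples: int = 200_000, seed: int = 0) -> np.ndarray:
    """Monte Carlo estimate of E[(ln(1+gamma))^k], k = 1..order.

    Rayleigh fading: gamma = avg_snr * |h|^2 with h ~ CN(0, 1), so
    |h|^2 is exponentially distributed with unit mean. Capacity is
    measured in nats per channel use.
    """
    rng = np.random.default_rng(seed)
    avg_snr = 10.0 ** (snr_db / 10.0)
    gamma = avg_snr * rng.exponential(1.0, n_samples)
    c = np.log1p(gamma)
    return np.array([np.mean(c ** k) for k in range(1, order + 1)])
```

At high SNR the first moment approaches ln(avg SNR) minus the Euler-Mascheroni constant (≈0.5772) for Rayleigh fading, which is one instance of the constant log-domain capacity gap the abstract mentions.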
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, the statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and non-ideal lattice models, imperfect-gas theory applied to liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of macromolecular systems, and quantum statistics.
Costs and risks of the import of RES statistics by the Dutch government
Energy Technology Data Exchange (ETDEWEB)
Klessmann, C.; De Jager, D.; Gephart, M.; Winkel, T.
2012-11-15
This paper presents a first estimate of the costs and risks of a potential import of renewable energy statistics by the Dutch government in order to meet the binding renewable energy (RE) target of 14% by 2020. Recently, the new government announced that it will increase the ambition from 14% to 16%. Progress so far has been slow, however, and meeting these targets requires near-maximum realisable deployment rates of all relevant technologies. This points to the necessity of increasing national policy measures (spatial, political, financial, etc.) for all renewable energy technologies or, alternatively, of applying the cooperation mechanisms and/or importing RES statistics from other countries. It is generally assumed that imported RE statistics, through the cooperation mechanisms of the European RES Directive, will have lower costs than supporting the potentially more expensive domestic technologies that would be needed to meet the targets fully by domestic production. This paper shows that this assumption is questionable, and that the risks of pursuing an import strategy may be significant. The analysis shows that the use of statistical transfers, which in principle may be a viable option for realising part of the Dutch RE target, is linked to high uncertainties. Important aspects contributing to these uncertainties are: the effectiveness and efficiency of policies in the European Member States to meet domestic RE targets by and up to 2020, and hence the related surplus/shortfall of RE production and the resulting market prices for statistical transfers; and the price-setting mechanisms that will be established between Member States, including the anticipated cost of infringement in case of non-fulfilment of the 2020 targets. Imports will likely be charged at the market prices for (statistical) transfers, not at the cost prices of RE technologies. The price of statistical transfers can be expected to be higher in the case of a clear 'buyer market' in which
Is higher risk sex common among male or female youths?
Berhan, Yifru; Berhan, Asres
2015-01-01
There are several studies showing the high prevalence of high-risk sexual behaviours among youths, but little is known about how the proportion practicing higher-risk sex compares between male and female youths. A meta-analysis was done using Demographic and Health Survey data from 26 countries in and outside Africa to compare higher-risk sex among the most vulnerable groups of male and female youths. A random-effects analytic model was applied and the pooled odds ratios were determined using the Mantel-Haenszel statistical method. In this meta-analysis, 19,148 male and 65,094 female youths who reported having had sexual intercourse in a 12-month period were included. The overall OR demonstrated that higher-risk sex was ten times more prevalent in male youths than in female youths. The practice of higher-risk sex by male youths aged 15-19 years was more than 27-fold higher than that of their female counterparts. Similarly, male youths who lived in urban areas, belonged to families with middle to highest wealth index, or were educated to secondary level and above were at more than ninefold, eightfold and sixfold higher risk, respectively, of practicing higher-risk sex than their female counterparts. In conclusion, this meta-analysis demonstrated that the practice of risky sexual intercourse was markedly more common among male youths than among female youths. Future protective interventions against risky sex should be tailored to male youths in urban areas educated to secondary level and above.
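The Mantel-Haenszel method used above pools stratum-specific 2×2 tables into one odds ratio. A minimal sketch (the table values in the usage comment are made up for illustration, not the survey data):

```python
def mantel_haenszel_or(strata):
    """Pooled Mantel-Haenszel odds ratio across 2x2 tables.

    Each stratum is a tuple (a, b, c, d):
      a = exposed cases,    b = exposed non-cases,
      c = unexposed cases,  d = unexposed non-cases.
    OR_MH = sum(a*d/n) / sum(b*c/n), with n = a+b+c+d per stratum,
    which weights each stratum's odds ratio by its precision.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den


# A single stratum (10, 20, 5, 40) has OR = (10*40)/(20*5) = 4.0;
# pooling identical strata leaves the estimate unchanged.
```

Unlike simply collapsing the tables, this pooling avoids Simpson's-paradox artefacts when the strata (here, country surveys) differ in exposure prevalence.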
Federal Policies and Higher Education in the United States.
Prisco, Anne; Hurley, Alicia D.; Carton, Thomas C; Richardson, Richard C., Jr.
The purpose of this report is to describe U.S. federal policies that have helped to shape the context within which state systems of higher education operated during the past decade. It also presents descriptive statistics about the higher education enterprise in the United States, including available performance data. The report is based on the…
Irrigated Area Maps and Statistics of India Using Remote Sensing and National Statistics
Directory of Open Access Journals (Sweden)
Prasad S. Thenkabail
2009-04-01
Full Text Available The goal of this research was to compare remote-sensing derived irrigated areas with census-derived statistics reported in the national system. India, which has nearly 30% of global annualized irrigated areas (AIAs) and is the leading irrigated-area country in the world, along with China, was chosen for the study. Irrigated areas were derived for the nominal year 2000 using time-series remote sensing at two spatial resolutions: (a) 10-km Advanced Very High Resolution Radiometer (AVHRR) and (b) 500-m Moderate Resolution Imaging Spectroradiometer (MODIS). These areas were compared with the Indian national statistical data on irrigated areas reported by: (a) the Directorate of Economics and Statistics (DES) of the Ministry of Agriculture (MOA), and (b) the Ministry of Water Resources (MoWR). A state-by-state comparison of remote-sensing derived irrigated areas with the MoWR-derived irrigation potential utilized (IPU), an equivalent of AIA, provided a high degree of correlation, with R2 values of: (a) 0.79 with 10-km, and (b) 0.85 with MODIS 500-m. However, the remote-sensing derived irrigated-area estimates for India were consistently higher than the irrigated areas reported by the national statistics. The remote-sensing derived total area available for irrigation (TAAI), which does not consider intensity of irrigation, was 101 million hectares (Mha) using 10-km and 113 Mha using 500-m. The AIA, which considers intensity of irrigation, was 132 Mha using 10-km and 146 Mha using 500-m. In contrast, the IPU, an equivalent of AIA, as reported by MoWR was 83 Mha. There are “large variations” in irrigated-area statistics reported, even between two ministries (e.g., the Directorate of Statistics of the Ministry of Agriculture and the Ministry of Water Resources) of the same national system. The causes include: (a) reluctance on the part of the states to furnish irrigated-area data in view of their vested interests in the sharing of water, and (b) reporting of large volumes of data
Topography and Higher Order Corneal Aberrations of the Fellow Eye in Unilateral Keratoconus.
Aksoy, Sibel; Akkaya, Sezen; Özkurt, Yelda; Kurna, Sevda; Açıkalın, Banu; Şengör, Tomris
2017-10-01
Comparison of topography and corneal higher-order aberration (HOA) data of the fellow normal eyes of unilateral keratoconus patients with keratoconus eyes and a control group. The records of 196 patients with keratoconus were reviewed. Twenty patients were identified as having unilateral keratoconus. The best corrected visual acuity (BCVA), topography and aberration data of the unilateral keratoconus patients' normal eyes were compared with their contralateral keratoconus eyes and with control group eyes. For statistical analysis, flat and steep keratometry values, average corneal power, cylindrical power, surface regularity index (SRI), surface asymmetry index (SAI), inferior-superior ratio (I-S), keratoconus prediction index, and elevation-depression power (EDP) and diameter (EDD) topography indices were selected. Mean age of the unilateral keratoconus patients was 26.05±4.73 years and that of the control group was 23.6±8.53 years (p>0.05). There was no statistically significant difference in BCVA between normal and control eyes (p=0.108), whereas BCVA values were significantly lower in eyes with keratoconus (p=0.001). Comparison of quantitative topographic indices between the groups showed that all indices except the I-S ratio were significantly higher in the normal group than in the control group (p<0.05). The most obvious differences were in the SRI, SAI, EDP, and EDD values. All topographic indices were higher in the keratoconus eyes compared to the normal fellow eyes. There was no difference between normal eyes and the control group in terms of spherical aberration, while coma, trefoil, irregular astigmatism, and total HOA values were higher in the normal eyes of unilateral keratoconus patients (p<0.05). All HOA values were higher in keratoconus eyes than in the control group. According to our study, SRI, SAI, EDP, and EDD values, and HOA other than spherical aberration, were higher in the clinically and topographically normal fellow eyes of unilateral keratoconus patients when compared with the control group.
[Statistics for statistics?--Thoughts about psychological tools].
Berger, Uwe; Stöbel-Richter, Yve
2007-12-01
Statistical methods take a prominent place in psychologists' educational programs. Known as difficult to understand and hard to learn, this content is feared by students. Those who do not aspire to a research career at a university quickly forget the drilled content. Furthermore, because at first glance it does not apply to work with patients and other target groups, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education seems to make sense only for commanding respect from other professions, namely physicians. For their own practice, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. To this end, we analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 per cent were directly based on statistics. Being able to write and critically read original articles, as the backbone of research, presumes a high degree of statistical education. To ignore statistics means to ignore research and, ultimately, to expose one's own professional work to arbitrariness.
Higher order aberrations in amblyopic children and their role in refractory amblyopia
Directory of Open Access Journals (Sweden)
Arnaldo Dias-Santos
2014-12-01
Full Text Available Objective: Some studies have hypothesized that an unfavourable higher-order aberrometric profile could act as an amblyogenic mechanism and may be responsible for some amblyopic cases that are refractory to conventional treatment, or for cases of “idiopathic” amblyopia. This study compared the aberrometric profile in amblyopic children to that of children with normal visual development, and compared the aberrometric profile in corrected amblyopic eyes and refractory amblyopic eyes with that of healthy eyes. Methods: Cross-sectional study with three groups of children: the CA group (22 eyes of 11 children with unilateral corrected amblyopia), the RA group (24 eyes of 13 children with unilateral refractory amblyopia) and the C group (28 eyes of 14 children with normal visual development). Higher-order aberrations were evaluated using an OPD-Scan III (NIDEK). Comparisons of the aberrometric profile were made between these groups as well as between the amblyopic and healthy eyes within the CA and RA groups. Results: Higher-order aberrations with greater impact on visual quality were not significantly higher in the CA and RA groups when compared with the C group. Moreover, there were no statistically significant differences in the higher-order aberrometric profile between the amblyopic and healthy eyes within the CA and RA groups. Conclusions: Contrary to lower-order aberrations (e.g., myopia, hyperopia, primary astigmatism), higher-order aberrations do not seem to be involved in the etiopathogenesis of amblyopia. Therefore, they are likely not the cause of most cases of refractory amblyopia.
On a curvature-statistics theorem
International Nuclear Information System (INIS)
Calixto, M; Aldaya, V
2008-01-01
The spin-statistics theorem in quantum field theory relates the spin of a particle to the statistics obeyed by that particle. Here we investigate an interesting correspondence or connection between curvature (κ = ±1) and quantum statistics (Fermi-Dirac and Bose-Einstein, respectively). The interrelation between the two concepts is established through vacuum coherent configurations of zero modes in quantum field theory on the compact O(3) and noncompact O(2,1) (spatial) isometry subgroups of de Sitter and anti-de Sitter spaces, respectively. The high-frequency limit is retrieved as a (zero-curvature) group contraction to the Newton-Hooke (harmonic oscillator) group. We also make some comments on the physical significance of the vacuum energy density and the cosmological constant problem.
On a curvature-statistics theorem
Energy Technology Data Exchange (ETDEWEB)
Calixto, M [Departamento de Matematica Aplicada y Estadistica, Universidad Politecnica de Cartagena, Paseo Alfonso XIII 56, 30203 Cartagena (Spain); Aldaya, V [Instituto de Astrofisica de Andalucia, Apartado Postal 3004, 18080 Granada (Spain)], E-mail: Manuel.Calixto@upct.es
2008-08-15
The spin-statistics theorem in quantum field theory relates the spin of a particle to the statistics obeyed by that particle. Here we investigate an interesting correspondence or connection between curvature (κ = ±1) and quantum statistics (Fermi-Dirac and Bose-Einstein, respectively). The interrelation between the two concepts is established through vacuum coherent configurations of zero modes in quantum field theory on the compact O(3) and noncompact O(2,1) (spatial) isometry subgroups of de Sitter and anti-de Sitter spaces, respectively. The high-frequency limit is retrieved as a (zero-curvature) group contraction to the Newton-Hooke (harmonic oscillator) group. We also make some comments on the physical significance of the vacuum energy density and the cosmological constant problem.
Wu, Johnny C; Gardner, David P; Ozer, Stuart; Gutell, Robin R; Ren, Pengyu
2009-08-28
The accurate prediction of the secondary and tertiary structure of an RNA with different folding algorithms is dependent on several factors, including the energy functions. However, an RNA higher-order structure cannot be predicted accurately from its sequence based on a limited set of energy parameters. The inter- and intramolecular forces between this RNA and other small molecules and macromolecules, in addition to other factors in the cell such as pH, ionic strength, and temperature, influence the complex dynamics associated with the transition of a single-stranded RNA to its secondary and tertiary structure. Since all of the factors that affect the formation of an RNA's 3D structure cannot be determined experimentally, statistically derived potential energy has been used in the prediction of protein structure. In the current work, we evaluate the statistical free energy of various secondary structure motifs, including base-pair stacks, hairpin loops, and internal loops, using their statistical frequency obtained from the comparative analysis of more than 50,000 RNA sequences stored in the RNA Comparative Analysis Database (rCAD) at the Comparative RNA Web (CRW) Site. Statistical energy was computed from the structural statistics for several datasets. While the statistical energy for a base-pair stack correlates with experimentally derived free energy values, suggesting a Boltzmann-like distribution, variation is observed between different molecules and their location on the phylogenetic tree of life. Our statistical energy values calculated for several structural elements were utilized in the Mfold RNA-folding algorithm. The combined statistical energy values for base-pair stacks, hairpins and internal loop flanks result in a significant improvement in the accuracy of secondary structure prediction; the hairpin flanks contribute the most.
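A Boltzmann-like statistical energy of the kind evaluated above has the generic knowledge-based form ΔG ≈ −RT·ln(f_obs/f_ref). A minimal sketch (our own illustration; the RT value and the frequencies in the comment are assumptions, not the paper's parameters):

```python
import math


def statistical_energy(f_obs: float, f_ref: float, rt: float = 0.616) -> float:
    """Knowledge-based ('statistical') free energy of a motif.

    Assumes a Boltzmann-like distribution over observed structures:
    dG = -RT * ln(f_obs / f_ref), where f_obs is the motif's
    frequency in the database of aligned sequences and f_ref its
    frequency under a background model. RT ~ 0.616 kcal/mol at 310 K.
    """
    return -rt * math.log(f_obs / f_ref)


# A motif observed twice as often as the background expectation
# (e.g. f_obs = 0.02 vs. f_ref = 0.01) gets a negative (favourable)
# statistical energy; equal frequencies give exactly zero.
```

The correlation the abstract reports between such statistical energies and experimentally derived stacking free energies is precisely what the Boltzmann-distribution assumption predicts.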
Directory of Open Access Journals (Sweden)
Jose H. Guardiola
2010-01-01
Full Text Available This paper compares the academic performance of students in three similar elementary statistics courses taught by the same instructor, but with the lab component differing among the three. One course is traditionally taught without a lab component; the second has a lab component using scenarios and an extensive use of technology, but without explicit coordination between lab and lecture; and the third uses a lab component with an extensive use of technology that carefully coordinates the lab with the lecture. Extensive use of technology means, in this context, using Minitab software in the lab section, doing homework and quizzes using MyMathLab©, and emphasizing interpretation of computer output during lectures. Initially, an online instrument based on Gardner's multiple intelligences theory is given to students to try to identify students' learning styles and intelligence types as covariates. An analysis of covariance is performed in order to compare differences in achievement. In this study there is no attempt to measure differences in student performance across the different treatments; rather, the purpose is to find indications of associations among variables that support the claim that statistics labs could be associated with superior academic achievement in one of these three instructional environments. This study also tries to identify individual student characteristics that could be associated with superior academic performance, but did not find evidence of any such characteristics. The response variable was computed as the percentage of correct answers for the three exams during the semester added together. The results of this study indicate a significant difference across these three instructional methods, showing significantly higher mean scores for the response variable for students taking the lab component that was carefully coordinated with the lecture.
Expression and clinical significance of Pax6 gene in retinoblastoma
Directory of Open Access Journals (Sweden)
Hai-Dong Huang
2013-07-01
Full Text Available AIM: To discuss the expression and clinical significance of the Pax6 gene in retinoblastoma (Rb). METHODS: Fifteen fresh Rb tissue specimens were selected as the observation group and 15 normal retinal tissue specimens as the control group. Western blot and reverse transcriptase polymerase chain reaction (RT-PCR) methods were used to detect Pax6 protein and Pax6 mRNA expression in the normal retinal tissues and Rb tissues. At the same time, the Western blot method was used to detect protein-level expression of the Pax6 downstream differentiation genes MATH5 and BRN3b. After comparison between the two groups, the expression and clinical significance of the Pax6 gene in Rb were discussed. RESULTS: In the observation group, the average value of Pax6 mRNA expression was 0.99±0.03; the average value of Pax6 protein expression was 2.07±0.15; the average value of BRN3b protein expression was 0.195±0.016; the average value of MATH5 protein expression was 0.190±0.031. These were significantly higher than in the control group, and the differences were statistically significant. CONCLUSION: Abnormal expression of the Pax6 gene is likely to accelerate the occurrence of Rb.
MIDAS: Regionally linear multivariate discriminative statistical mapping.
Varol, Erdem; Sotiras, Aristeidis; Davatzikos, Christos
2018-07-01
statistical significance of the derived statistic by analytically approximating its null distribution without the need for computationally expensive permutation tests. The proposed framework was extensively validated using simulated atrophy in structural magnetic resonance imaging (MRI) and further tested using data from a task-based functional MRI study as well as a structural MRI study of cognitive performance. The performance of the proposed framework was evaluated against standard voxel-wise general linear models and other information mapping methods. The experimental results showed that MIDAS achieves relatively higher sensitivity and specificity in detecting group differences. Together, our results demonstrate the potential of the proposed approach to efficiently map effects of interest in both structural and functional data. Copyright © 2018. Published by Elsevier Inc.
Expression and significance of HMGB1, TLR4 and NF-κB p65 in human epidermal tumors
International Nuclear Information System (INIS)
Weng, Hui; Deng, Yunhua; Xie, Yuyan; Liu, Hongbo; Gong, Feili
2013-01-01
High mobility group protein box 1 (HMGB1) is a DNA-binding protein located in the nucleus. It is released into extracellular fluid, where it acts as a novel proinflammatory cytokine that interacts with Toll-like receptor 4 (TLR4) to activate nuclear factor-κB (NF-κB). This sequence of events is involved in tumor growth and progression. However, the effects of HMGB1, TLR4 and NF-κB on epidermal tumors remain unclear. Human epidermal tumor specimens were obtained from 96 patients. Immunohistochemistry was used to detect expression of HMGB1, TLR4 and NF-κB p65 in human epidermal tumor and normal skin specimens. Western blot analysis was used to detect the expression of NF-κB p65 in epithelial cell nuclei in human epidermal tumor and normal tissues. Immunohistochemistry and western blot analysis indicated a progressive and statistically significant increase in p65 expression in epithelial nuclei in benign seborrheic keratosis (SK), precancerous lesions (PCL), low-malignancy basal cell carcinoma (BCC) and high-malignancy squamous cell carcinoma (SCC) (P <0.01). The level of extracellular HMGB1 in SK was significantly higher than in normal skin (NS) (P <0.01), and was higher than in SCC but without statistical significance. The level of TLR4 on epithelial membranes of SCC cells was significantly higher than in SK, PCL, BCC and NS (P <0.01). There was a significant positive correlation between p65 expression in the epithelial nuclei and TLR4 expression on the epithelial cell membranes (r = 0.3212, P <0.01). These findings indicate that inflammation is intensified in parallel with increasing malignancy. They also indicate that the TLR4 signaling pathway, rather than HMGB1, may be the principal mediator of inflammation in high-grade malignant epidermal tumors. Combined detection of p65 in the epithelial nuclei and TLR4 on the epithelial membranes may assist the accurate diagnosis of malignant epidermal tumors.
Statistical physics of medical ultrasonic images
International Nuclear Information System (INIS)
Wagner, R.F.; Insana, M.F.; Brown, D.G.; Smith, S.W.
1987-01-01
The physical and statistical properties of backscattered signals in medical ultrasonic imaging are reviewed in terms of: 1) the radiofrequency signal; 2) the envelope (video or magnitude) signal; and 3) the density of samples in simple and in compounded images. There is a wealth of physical information in backscattered signals in medical ultrasound. This information is contained in the radiofrequency spectrum - which is not typically displayed to the viewer - as well as in the higher statistical moments of the envelope or video signal - which are not readily accessed by the human viewer of typical B-scans. This information may be extracted from the detected backscattered signals by straightforward signal processing techniques at low resolution
Testing statistical hypotheses of equivalence
Wellek, Stefan
2010-01-01
Equivalence testing has grown significantly in importance over the last two decades, especially as its relevance to a variety of applications has become understood. Yet published work on the general methodology remains scattered in specialists' journals, and for the most part, it focuses on the relatively narrow topic of bioequivalence assessment. With a far broader perspective, Testing Statistical Hypotheses of Equivalence provides the first comprehensive treatment of statistical equivalence testing. The author addresses a spectrum of specific, two-sided equivalence testing problems, from the
Statistical theory applications and associated computer codes
International Nuclear Information System (INIS)
Prince, A.
1980-01-01
The general format is along the same lines as that used in the O.M. Session, i.e. an introduction to the nature of the physical problems and methods of solution based on the statistical model of the nucleus. Both binary and higher multiple reactions are considered. The computer codes used in this session are a combination of optical model and statistical theory. As with the O.M. sessions, the preparation of input and analysis of output are thoroughly examined. Again, comparison with experimental data serves to demonstrate the validity of the results and possible areas for improvement. (author)
Xu, Lihua; Tan, Huo; Liu, Ruiming; Huang, Qungai; Zhang, Nana; Li, Xi; Wang, Jiani
2017-11-01
The cytoskeleton regulatory protein Mena is reportedly overexpressed in breast cancer; however, data regarding its expression level and clinical significance in gastric carcinoma (GC) are limited. The aim of the present study was to investigate Mena expression levels and prognostic significance in GC. Mena mRNA expression level was determined by reverse transcription-quantitative polymerase chain reaction in 10 paired GC and adjacent normal tissues. The Mena protein expression level was analyzed in paraffin-embedded GC samples and adjacent normal tissues by immunohistochemistry. Statistical analyses were also performed to evaluate the clinicopathological significance of Mena. The results revealed that the mRNA expression level of Mena was significantly higher in GC tissues compared with in adjacent normal tissues from the 10 paired samples. In the paraffin-embedded tissue samples, the protein expression level of Mena was higher in GC tissues compared with in adjacent normal tissues. Compared with adjacent normal tissues, Mena overexpression was observed in 52.83% (56/106) of patients. The overexpression of Mena was significantly associated with the T stage (P=0.033) and tumor-node-metastasis (TNM) stage, and the Mena expression level was an independent prognostic factor for overall survival time. In conclusion, Mena was overexpressed in GC tissues and significantly associated with the T stage, TNM stage and overall survival time. Mena may therefore be suitable as a prognostic indicator for patients with GC.
Students' Perspectives of Using Cooperative Learning in a Flipped Statistics Classroom
Chen, Liwen; Chen, Tung-Liang; Chen, Nian-Shing
2015-01-01
Statistics has been recognised as one of the most anxiety-provoking subjects to learn in the higher education context. Educators have continuously endeavoured to find ways to integrate digital technologies and innovative pedagogies in the classroom to eliminate the fear of statistics. The purpose of this study is to systematically identify…
International Nuclear Information System (INIS)
Dai, Wu-Sheng; Xie, Mi
2013-01-01
In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose–Einstein and Fermi–Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers which are determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. Highlights: ► A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► An argument that many results on q-deformation distributions in the literature are inaccurate or incomplete.
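The Gentile distribution referred to here has a standard closed form; as a sketch (a textbook result, not reproduced from this paper), the mean occupation of a single-particle level of energy ε with maximum occupation number n is

```latex
\langle N \rangle \;=\; \frac{1}{e^{\beta(\varepsilon-\mu)}-1} \;-\; \frac{n+1}{e^{(n+1)\beta(\varepsilon-\mu)}-1}
```

Setting n = 1 recovers the Fermi–Dirac distribution 1/(e^{β(ε−μ)} + 1), and letting n → ∞ (for ε > μ) recovers Bose–Einstein, which is why schemes of intermediate statistics interpolate between the two limiting cases.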
Statistical Dependence of Pipe Breaks on Explanatory Variables
Directory of Open Access Journals (Sweden)
Patricia Gómez-Martínez
2017-02-01
Full Text Available Aging infrastructure is the main challenge currently faced by water suppliers. Estimation of asset lifetimes requires reliable criteria to plan asset repair and renewal strategies. To do so, pipe break prediction is one of the most important inputs. This paper analyzes the statistical dependence of pipe breaks on explanatory variables, determining their optimal combination and quantifying their influence on failure prediction accuracy. A large set of registered data from the Madrid water supply network, managed by Canal de Isabel II, has been filtered, classified and studied. Several statistical Bayesian models have been built and validated from the available information with a technique that combines reference periods of time as well as geographical location. Statistical models of increasing complexity are built from zero up to five explanatory variables following two approaches: a set of independent variables or a combination of two joint variables plus an additional number of independent variables. With the aim of finding the variable combination that provides the most accurate prediction, models are compared following an objective validation procedure based on the model skill to predict the number of pipe breaks in a large set of geographical locations. As expected, model performance improves as the number of explanatory variables increases. However, the rate of improvement is not constant. Performance metrics improve significantly up to three variables, but the tendency is softened for higher order models, especially in trunk mains where performance is reduced. Slight differences are found between trunk mains and distribution lines when selecting the most influential variables and models.
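The kind of model comparison this abstract describes, scoring explanatory variables by how well they predict held-out break counts, can be sketched with a maximum-likelihood Poisson baseline. The paper's actual models are Bayesian and considerably richer; the variable names and data below are purely illustrative.

```python
import math

def poisson_loglik(counts, rates, exposure=1.0):
    # Sum of Poisson log-likelihoods; rates must be positive
    return sum(c * math.log(r * exposure) - r * exposure - math.lgamma(c + 1)
               for c, r in zip(counts, rates))

def fit_rates(train, key):
    # Maximum-likelihood Poisson break rate per category of one explanatory variable
    totals, n = {}, {}
    for rec in train:
        k = rec[key]
        totals[k] = totals.get(k, 0) + rec["breaks"]
        n[k] = n.get(k, 0) + 1
    return {k: totals[k] / n[k] for k in totals}

# Toy data: plastic pipes break rarely, cast-iron pipes often
train = ([{"material": "plastic", "breaks": b} for b in (1, 1, 0, 2)] +
         [{"material": "iron", "breaks": b} for b in (5, 4, 6, 5)])
rates = fit_rates(train, "material")
global_rate = sum(r["breaks"] for r in train) / len(train)

# Score both models on held-out pipes: the material-aware model should win
held = [("plastic", 1), ("iron", 5)]
ll_material = poisson_loglik([c for _, c in held], [rates[m] for m, _ in held])
ll_global = poisson_loglik([c for _, c in held], [global_rate] * len(held))
```

Adding a genuinely informative variable raises the held-out log-likelihood; as the abstract notes, the gain flattens once the most influential variables are already in the model.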
Directory of Open Access Journals (Sweden)
Rui-Feng Liu
2017-09-01
Full Text Available Objective: To investigate the changes and significance of serum inflammatory factors, neuron-specific enolase (NSE), S100 protein and stress hormone levels in patients with brain diseases. Methods: A total of 115 patients with craniocerebral injury were selected as the observation group; according to the Glasgow Coma Scale (GCS), they were divided into a mild group (n=38), a moderate group (n=40) and a severe group (n=37). At the same time, 120 healthy subjects were selected as the control group. The levels of serum inflammatory cytokines [tumor necrosis factor alpha (TNF-α) and procalcitonin (PCT)], neuron-specific enolase (NSE), S100 protein and the stress hormones [cortisol (COR), adrenocorticotropic hormone (ACTH) and β-endorphin (β-EP)] of both groups were compared. Results: The levels of TNF-α, PCT, NSE, S100, COR, ACTH and β-EP in the observation group were (145.73±19.24) ng/L, (2.41±0.64) ng/mL, (38.11±12.28) ng/mL, (0.87±0.32) μg/L, (818.87±121.14) nmol/L, (107.38±13.94) ng/L and (126.74±39.04) ng/mL, which were significantly higher than in the control group; the differences were statistically significant. Among the observation subgroups, the TNF-α, PCT, NSE, S100, COR, ACTH and β-EP levels in the moderate and severe groups were significantly higher than those in the mild group, and the levels in the severe group were significantly higher than those in the moderate group; the differences were statistically significant. Conclusion: The levels of serum inflammatory factors, NSE, S100 protein and stress hormones were significantly increased in patients with craniocerebral injury, and the levels were related to the degree of traumatic brain injury, so they could be used as an important indicator to assess the severity of the disease.
Challenges for statistics teaching and teacher’s training in Mexico
Directory of Open Access Journals (Sweden)
Sergio Hernández González
2013-08-01
Full Text Available This work will cover the problems that are found in teacher training and professional development in Probability and Statistics in higher education in Mexico. It will be approached through four focuses: (a) the characterization and training of teachers that drive the development and implementation of curriculum reforms in the teaching of Statistics; (b) challenges of teachers in the instruction of university-level Statistics; (c) new curricular reforms with respect to the instruction of Statistics that propose the development of learning based on projects through the use of appropriate statistical software; and (d) educational innovation as a body of knowledge in development, by which the shaping of networks consisting of professors who favor the emergence of real innovation is brought about. Starting from these perspectives, the challenges confronted in the teaching and training of Statistics professors will be proposed.
Gendy, Hoda El; Madkour, Bothina; Abdelaty, Sara; Essawy, Fayza; Khattab, Dina; Hammam, Olfat; Nour, Hani H.
2014-01-01
Background Galectins are a group of proteins found in the cytoplasm, nucleus, cell surface and extracellular matrix. Galectin 3 (Gal-3) displays pathological expression in a variety of processes such as tumorigenesis. Patients and Method 70 patients, classified into the control group, cystitis group, transitional cell carcinoma group, and squamous cell carcinoma group, were enrolled in this study, which aimed to detect the serum level and the intensity of tissue expression of Gal-3. Results Both serum level and tissue expression of Gal-3 were statistically higher in bladder cancer patients compared to the other groups. Gal-3 level and expression increased from low- to high-grade urothelial tumors, with a statistically significant increase of its level and expression between muscle-invasive and non-muscle-invasive Ta urothelial tumors. Conclusion The serum Gal-3 level is sensitive and specific for the diagnosis of bladder cancer. The prognostic significance of tissue expression is to be confirmed. PMID:26195948
Statistical and theoretical research
International Nuclear Information System (INIS)
Anon.
1983-01-01
Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where and how much of a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology.
Significance analysis of lexical bias in microarray data
Directory of Open Access Journals (Sweden)
Falkow Stanley
2003-04-01
Full Text Available Abstract Background Genes that are determined to be significantly differentially regulated in microarray analyses often appear to have functional commonalities, such as being components of the same biochemical pathway. This results in certain words being under- or overrepresented in the list of genes. Distinguishing between biologically meaningful trends and artifacts of annotation and analysis procedures is of the utmost importance, as only true biological trends are of interest for further experimentation. A number of sophisticated methods for identification of significant lexical trends are currently available, but these methods are generally too cumbersome for practical use by most microarray users. Results We have developed a tool, LACK, for calculating the statistical significance of apparent lexical bias in microarray datasets. The frequency of a user-specified list of search terms in a list of genes which are differentially regulated is assessed for statistical significance by comparison to randomly generated datasets. The simplicity of the input files and user interface targets the average microarray user who wishes to have a statistical measure of apparent lexical trends in analyzed datasets without the need for bioinformatics skills. The software is available as Perl source or a Windows executable. Conclusion We have used LACK in our laboratory to generate biological hypotheses based on our microarray data. We demonstrate the program's utility using an example in which we confirm significant upregulation of SPI-2 pathogenicity island of Salmonella enterica serovar Typhimurium by the cation chelator dipyridyl.
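The comparison to randomly generated gene lists that this abstract describes is, at its core, a permutation test on term frequency. A minimal sketch of that idea (not LACK's actual implementation; gene names and annotations below are invented):

```python
import random

def lexical_bias_pvalue(all_genes, de_genes, term, n_perm=2000, seed=1):
    """Empirical p-value that `term` appears in the annotations of the
    differentially expressed (DE) genes more often than expected by chance.
    all_genes: dict mapping gene name -> annotation string."""
    observed = sum(term in all_genes[g] for g in de_genes)
    names = list(all_genes)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_perm):
        # Draw a random gene list of the same size and count term occurrences
        sample = rng.sample(names, len(de_genes))
        if sum(term in all_genes[g] for g in sample) >= observed:
            hits += 1
    # Add-one correction keeps the estimate away from an impossible p = 0
    return (hits + 1) / (n_perm + 1)
```

A small p-value for a term such as "secretion" would suggest a genuine lexical trend in the regulated gene list rather than an annotation artifact.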
Pierre Joubert; Christo van Wyk; Sebastiaan Rothmann
2011-01-01
This article aims to investigate the perceptions of academic staff relating to the incidence of sexual harassment at higher education institutions in South Africa. The results show a relatively low incidence level of sexual harassment, with gender harassment being more prevalent than unwanted sexual attention and quid pro quo harassment. No statistically significant effect of gender, age, population group or years of service was found on the perceptions of the incidence of sexual harassment. ...
Neuroimaging of post-traumatic higher brain dysfunction using 123I-Iomazenil (IMZ) SPECT
International Nuclear Information System (INIS)
Nakagawara, Jyoji; Kamiyama, Kenji; Takahashi, Masaaki; Nakamura, Hirohiko
2010-01-01
In patients with mild traumatic brain injury (MTBI), higher brain dysfunctions, which consist of cognitive impairments such as memory, attention and performance deficits and social behavioral disturbances, may not be readily apparent. However, higher brain dysfunctions should be identified by neuropsychological tests and supported by social welfare for handicapped patients. Acknowledgement of higher brain dysfunctions after MTBI without obvious brain damage on morphological neuroimaging remains a controversial social issue. Imaging of cortical neuron damage in patients with higher brain dysfunctions after MTBI was studied by functional neuroimaging using 123I-Iomazenil (IMZ) single photon emission computed tomography (SPECT). Statistical imaging analyses using 3-dimensional stereotactic surface projections (3D-SSP) for 123I-IMZ SPECT and 123I-IMP SPECT as cerebral blood flow (CBF) studies were performed in 11 patients with higher brain dysfunctions after MTBI. In all patients with higher brain dysfunctions defined by neuropsychological tests, cortical neuron damage was observed in the bilateral medial frontal lobes, but reduction of CBF in the bilateral medial frontal lobes was shown less obviously in 8 patients (apparent in 3 and little in 5). Group comparison of 3D-SSP of 123I-IMZ SPECT between the 11 patients and 18 normal controls demonstrated significant selective loss of cortical neurons in the bilateral medial frontal gyrus (MFG). The extent of abnormal pixels on each cortical gyrus using stereotactic extraction estimation (SEE) for 3D-SSP of 123I-IMZ SPECT confirmed that 8 patients had an abnormal pixel extent >10% in the bilateral MFG and 5 patients had an abnormal pixel extent >10% in the bilateral anterior cingulate gyrus. In patients with MTBI, higher brain dysfunctions seem to correlate with selective loss of cortical neurons within the bilateral MFG, which could be caused by Wallerian degeneration as a secondary phenomenon after diffuse axonal injury within the corpus callosum. Statistical
Li, Xiangyu; Cai, Hao; Wang, Xianlong; Ao, Lu; Guo, You; He, Jun; Gu, Yunyan; Qi, Lishuang; Guan, Qingzhou; Lin, Xu; Guo, Zheng
2017-10-13
To detect differentially expressed genes (DEGs) in small-scale cell line experiments, usually with only two or three technical replicates for each state, the commonly used statistical methods such as significance analysis of microarrays (SAM), limma and RankProd (RP) lack statistical power, while the fold change method lacks any statistical control. In this study, we demonstrated that the within-sample relative expression orderings (REOs) of gene pairs were highly stable among technical replicates of a cell line but often widely disrupted after certain treatments such as gene knockdown, gene transfection and drug treatment. Based on this finding, we customized the RankComp algorithm, previously designed for individualized differential expression analysis through REO comparison, to identify DEGs with certain statistical control for small-scale cell line data. In both simulated and real data, the new algorithm, named CellComp, exhibited high precision with much higher sensitivity than the original RankComp, SAM, limma and RP methods. Therefore, CellComp provides an efficient tool for analyzing small-scale cell line data. © The Author 2017. Published by Oxford University Press.
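The within-sample REO idea can be sketched directly: a gene pair's ordering counts as stable only if it is identical across all replicates, and a treatment effect shows up as pairs that are stable in both states but point in opposite directions. This is a simplified sketch of the REO concept, not the CellComp algorithm itself; the data are invented.

```python
from itertools import combinations

def stable_pairs(replicates):
    """Gene pairs (a, b) with expression[a] > expression[b] in every replicate.
    replicates: list of dicts mapping gene -> expression value."""
    genes = sorted(replicates[0])
    stable = set()
    for a, b in combinations(genes, 2):
        if all(r[a] > r[b] for r in replicates):
            stable.add((a, b))
        elif all(r[a] < r[b] for r in replicates):
            stable.add((b, a))
    return stable

def reversed_pairs(control_reps, treated_reps):
    # REOs that are stable in both states but reversed by the treatment
    c, t = stable_pairs(control_reps), stable_pairs(treated_reps)
    return {(a, b) for (a, b) in c if (b, a) in t}

# Two control replicates agree that g1 > g2 > g3; the treatment flips g1 vs g2
control = [{"g1": 10, "g2": 5, "g3": 1}, {"g1": 9, "g2": 6, "g3": 2}]
treated = [{"g1": 4, "g2": 8, "g3": 1}, {"g1": 5, "g2": 9, "g3": 2}]
```

Because REOs are compared within each sample, this kind of analysis is robust to between-replicate scaling differences, which is what makes it attractive for two- or three-replicate designs.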
Whither Statistics Education Research?
Watson, Jane
2016-01-01
This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…
A STATISTICAL STUDY OF TRANSVERSE OSCILLATIONS IN A QUIESCENT PROMINENCE
Energy Technology Data Exchange (ETDEWEB)
Hillier, A. [Kwasan and Hida Observatories, Kyoto University, Kyoto 607-8471 (Japan); Morton, R. J. [Mathematics and Information Science, Northumbria University, Pandon Building, Camden Street, Newcastle upon Tyne NE1 8ST (United Kingdom); Erdélyi, R., E-mail: andrew@kwasan.kyoto-u.ac.jp [Solar Physics and Space Plasma Research Centre (SP2RC), University of Sheffield, Hicks Building, Hounsfield Road, Sheffield S3 7RH (United Kingdom)
2013-12-20
The launch of the Hinode satellite has allowed for seeing-free observations at high resolution and high cadence, making it well suited to study the dynamics of quiescent prominences. In recent years it has become clear that quiescent prominences support small-amplitude transverse oscillations; however, sample sizes are usually too small for general conclusions to be drawn. We remedy this by providing a statistical study of transverse oscillations in vertical prominence threads. Over a 4 hr period of observations it was possible to measure the properties of 3436 waves, finding periods from 50 to 6000 s with typical velocity amplitudes ranging between 0.2 and 23 km s⁻¹. The large number of observed waves allows the determination of the frequency dependence of the wave properties and derivation of the velocity power spectrum for the transverse waves. For frequencies less than 7 mHz, the frequency dependence of the velocity power is consistent with the velocity power spectra generated from observations of the horizontal motions of magnetic elements in the photosphere, suggesting that the prominence transverse waves are driven by photospheric motions. However, at higher frequencies the two distributions significantly diverge, with relatively more power found at higher frequencies in the prominence oscillations. These results highlight that waves over a large frequency range are ubiquitous in prominences, and that a significant amount of the wave energy is found at higher frequency.
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Lee, Romeo B; Maria, Madelene Sta; Estanislao, Susana; Rodriguez, Cristina
2013-11-01
Over the years, the number of international university students has been increasing in the Philippines. Depression tends to be common among this demographic sector, because of the varying challenges and expectations associated with studying abroad. Depression can be prevented if its symptoms, particularly those at higher levels, are identified and addressed early and effectively. This survey examined the social and demographic factors that are significantly associated with higher levels of depressive symptoms. One hundred twenty-six international university students were interviewed using the University Students Depression Inventory. Of the 13 factors analyzed, 3 were found to have statistically significant associations with more intense levels of depressive symptoms. These factors were: level of satisfaction with one's financial condition, level of closeness with parents, and level of closeness with peers. In identifying international students at greater risk for depression, characteristics related to their financial condition and primary group relationships can be considered. There is a need to carry out more studies to confirm this initial evidence. The findings can help guide further discourse, research and programs to benefit international students with higher levels of depressive symptoms.
Identifying clusters of active transportation using spatial scan statistics.
Huang, Lan; Stinchcomb, David G; Pickle, Linda W; Dill, Jennifer; Berrigan, David
2009-08-01
There is an intense interest in the possibility that neighborhood characteristics influence active transportation such as walking or biking. The purpose of this paper is to illustrate how a spatial cluster identification method can evaluate the geographic variation of active transportation and identify neighborhoods with unusually high/low levels of active transportation. Self-reported walking/biking prevalence, demographic characteristics, street connectivity variables, and neighborhood socioeconomic data were collected from respondents to the 2001 California Health Interview Survey (CHIS; N=10,688) in Los Angeles County (LAC) and San Diego County (SDC). Spatial scan statistics were used to identify clusters of high or low prevalence (with and without age-adjustment) and the quantity of time spent walking and biking. The data, a subset from the 2001 CHIS, were analyzed in 2007-2008. Geographic clusters of significantly high or low prevalence of walking and biking were detected in LAC and SDC. Structural variables such as street connectivity and shorter block lengths are consistently associated with higher levels of active transportation, but associations between active transportation and socioeconomic variables at the individual and neighborhood levels are mixed. Only one cluster with less time spent walking and biking among walkers/bikers was detected in LAC, and this was of borderline significance. Age-adjustment affects the clustering pattern of walking/biking prevalence in LAC, but not in SDC. The use of spatial scan statistics to identify significant clustering of health behaviors such as active transportation adds to the more traditional regression analysis that examines associations between behavior and environmental factors by identifying specific geographic areas with unusual levels of the behavior independent of predefined administrative units.
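A minimal circular-scan sketch of the Kulldorff-style statistic used in this kind of analysis (Bernoulli model, scanning only for high-rate clusters; the full method also assesses significance by Monte Carlo replication, which is omitted here, and the coordinates below are invented):

```python
import math

def llr(c_in, n_in, c_tot, n_tot):
    # Bernoulli scan-statistic log-likelihood ratio for one candidate zone
    c_out, n_out = c_tot - c_in, n_tot - n_in
    p, q, g = c_in / n_in, c_out / n_out, c_tot / n_tot
    if p <= q:
        return 0.0  # only scan for high-rate clusters
    def ll(c, n, r):
        if r in (0.0, 1.0):
            return 0.0  # limiting value of the binomial log-likelihood
        return c * math.log(r) + (n - c) * math.log(1 - r)
    return ll(c_in, n_in, p) + ll(c_out, n_out, q) - ll(c_tot, n_tot, g)

def best_cluster(points):
    """points: list of (x, y, is_case). Scan circles centred on each point
    with radius reaching each other point; return (centre, radius, LLR)."""
    c_tot = sum(p[2] for p in points)
    n_tot = len(points)
    best = (None, 0.0, 0.0)
    for cx, cy, _ in points:
        dists = sorted(((x - cx) ** 2 + (y - cy) ** 2, case)
                       for x, y, case in points)
        c_in = n_in = 0
        for d2, case in dists:
            n_in += 1
            c_in += case
            if n_in < n_tot:  # skip the degenerate whole-region "cluster"
                score = llr(c_in, n_in, c_tot, n_tot)
                if score > best[2]:
                    best = ((cx, cy), math.sqrt(d2), score)
    return best
```

In the walking/biking application, "cases" would be survey respondents reporting active transportation and the zones would be built from neighbourhood centroids; the zone maximizing the LLR is the most likely cluster.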
Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus
2013-06-01
The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on the image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, a significantly lower tube current was used compared with the FBP group: 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively. The radiation dose was also significantly lower in the ASIR group compared with the FBP group: 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively. ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively). ASIR reduces the radiation dose of CCTA without affecting the image quality. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.
Registered nurses' perceptions of rewarding and its significance.
Seitovirta, Jaana; Lehtimäki, Aku-Ville; Vehviläinen-Julkunen, Katri; Mitronen, Lasse; Kvist, Tarja
2017-11-07
To examine reward type preferences and their relationships with the significance of rewarding perceived by registered nurses in Finland. Previous studies have found relationships between nurses' rewarding and their motivation at work, job satisfaction and organisational commitment. Data were collected in a cross-sectional, descriptive, questionnaire survey from 402 registered nurses using the Registered Nurses' Perceptions of Rewarding Scale in 2015, and analysed with descriptive and multivariate statistical methods. Registered nurses assigned slightly higher values to several non-financial than to financial rewards. The non-financial reward types appreciation and feedback from work community, worktime arrangements, work content, and opportunity to develop, influence and participate were highly related to the significance of rewarding. We identified various rewards that registered nurses value, and indications that providing an appropriate array of rewards, particularly non-financial rewards, is a highly beneficial element of nursing management. It is important to understand the value of rewards for nursing management. Nurse managers should offer diverse rewards to their registered nurses to promote excellent performance and to help efforts to secure and maintain high-quality, safe patient care. The use of appropriate rewards is especially crucial to improving registered nurses' reward satisfaction and job satisfaction globally in the nursing profession. © 2017 John Wiley & Sons Ltd.
Rapalino, O; Kamalian, Shervin; Kamalian, Shahmir; Payabvash, S; Souza, L C S; Zhang, D; Mukta, J; Sahani, D V; Lev, M H; Pomerantz, S R
2012-04-01
To safeguard patient health, there is great interest in CT radiation-dose reduction. The purpose of this study was to evaluate the impact of an iterative-reconstruction algorithm, ASIR, on image-quality measures in reduced-dose head CT scans for adult patients. Using a 64-section scanner, we analyzed 100 reduced-dose adult head CT scans at 6 predefined levels of ASIR blended with FBP reconstruction. These scans were compared with 50 CT scans previously obtained at a higher routine dose without ASIR reconstruction. SNR and CNR were computed from Hounsfield unit measurements of normal GM and WM of brain parenchyma. A blinded qualitative analysis was performed in 10 lower-dose CT datasets compared with higher-dose ones without ASIR. Phantom data analysis was also performed. Lower-dose scans without ASIR had significantly lower mean GM and WM SNR (P = .003) and similar GM-WM CNR values compared with higher routine-dose scans. However, at ASIR levels of 20%-40% there was no statistically significant difference in SNR, and at ASIR levels of ≥60% the SNR values of the reduced-dose scans were significantly higher, with significant improvements in the remaining quality measures at ASIR levels of ≥40% and ≥60%. The use of ASIR in adult head CT scans reduces image noise and increases low-contrast resolution, while allowing lower radiation doses without affecting spatial resolution.
Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)
International Nuclear Information System (INIS)
2003-01-01
This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas.
Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)
International Nuclear Information System (INIS)
2004-01-01
This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas.
Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)
International Nuclear Information System (INIS)
2002-01-01
This is the thirty-fourth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas.
Directory of Open Access Journals (Sweden)
Lutz Bornmann
Full Text Available Using the InCites tool of Thomson Reuters, this study compares normalized citation impact values calculated for China, Japan, France, Germany, the United States, and the UK over the period from 1981 to 2010. InCites offers a unique opportunity to study the normalized citation impacts of countries using (i) a long publication window (1981 to 2010), (ii) a differentiation into broad or narrower subject areas, and (iii) statistical procedures that allow an insightful investigation of national citation trends across the years. Using four broad categories, our results show significantly increasing trends in citation impact values for France, the UK, and especially Germany across the last thirty years in all areas. The citation impact of papers from China is still at a relatively low level (mostly below the world average), but the country follows an increasing trend line. The USA exhibits a stable pattern of high citation impact values across the years. With small impact differences between the publication years, the US trend is increasing in engineering and technology but decreasing in medical and health sciences as well as in agricultural sciences. Similar to the USA, Japan follows increasing as well as decreasing trends in different subject areas, but the variability across the years is small. In most years, papers from Japan perform below or approximately at the world average in each subject area.
International Nuclear Information System (INIS)
Miao Datong
2001-01-01
Objective: To investigate the damage to blood vessel endothelium and kidney function in patients with essential hypertension complicated by diabetes mellitus. Methods: Plasma levels of endothelin (ET) and 6-keto-PGF1α (6-K-PGF1α) as well as urine albumin content were measured by radioimmunoassay in 75 patients with essential hypertension (EH), 34 of whom were complicated with DM; 35 controls were included in the experiment. Results: The plasma level of ET and the urine Alb content were significantly higher in the patients than in the controls; the levels of 6-K-PGF1α were also higher, but without statistical significance. Conclusion: The results suggest that EH patients with DM suffer more serious damage to kidney function.
Rostas, Jack W; Lively, Timothy B; Brevard, Sidney B; Simmons, Jon D; Frotan, Mohammad A; Gonzalez, Richard P
2017-04-01
The purpose of this study was to identify patients with rib injuries who were at risk for solid organ injury. A retrospective chart review was performed of all blunt trauma patients with rib fractures during the period from July 2007 to July 2012. Data were analyzed for association of rib fractures and solid organ injury. In all, 1,103 rib fracture patients were identified; 142 patients had liver injuries, 109 (77%) of which had associated right rib fractures. The right-sided rib fractures with the highest sensitivity for liver injury were the middle segment (ribs 5 to 8) and the lower segment (ribs 9 to 12), with liver injury sensitivities of 68% and 43%, respectively. Splenic injuries were likewise associated with left-sided rib fractures: left middle segment and lower segment rib fractures had sensitivities of 80% and 63% for splenic injury, respectively. Rib fractures higher in the thoracic cage have a significant association with solid organ injury. Using rib fractures from the middle plus lower segments as an indication for abdominal screening will significantly improve rib fracture sensitivity for identification of solid organ injury. Copyright © 2016 Elsevier Inc. All rights reserved.
Leadership: Underrepresentation of Women in Higher Education
Krause, Susan Faye
2017-01-01
In 2014, statisticians at the Bureau of Labor Statistics found that women constitute 45% of the workforce. Women's participation in high-level organizational leadership roles remains low. In higher education, women's representation in top-ranking leadership roles is less than one-third at colleges and universities. The conceptual framework for…
Delphi Decision Methods in Higher Education Administration.
Judd, Robert C.
This document describes and comments on the extent of use of the Delphi method in higher education decision making. Delphi is characterized by: (1) anonymity of response; (2) multiple iterations; (3) convergence of the distribution of answers; and (4) statistical group response (median, interquartile range) preserving intact a distribution that…
Snow, Brian A.; Thro, William E.
2001-01-01
Asserts that from the perspective of America's public institutions of higher education, Blackstone's greatest legacy is his understanding of sovereign immunity. Explores the similarities between Blackstone's understanding of sovereign immunity and the current jurisprudence of the U.S. Supreme Court. (EV)
Encounter Probability of Significant Wave Height
DEFF Research Database (Denmark)
Liu, Z.; Burcharth, H. F.
The determination of the design wave height (often given as the significant wave height) is usually based on statistical analysis of long-term extreme wave height measurement or hindcast. The result of such extreme wave height analysis is often given as the design wave height corresponding to a c...
Less Physician Practice Competition Is Associated With Higher Prices Paid For Common Procedures.
Austin, Daniel R; Baker, Laurence C
2015-10-01
Concentration among physician groups has been steadily increasing, which may affect prices for physician services. We assessed the relationship in 2010 between physician competition and prices paid by private preferred provider organizations for fifteen common, high-cost procedures, to understand whether higher concentration of physician practices and the accompanying increased market power were associated with higher prices for services. Using county-level measures of the concentration of physician practices and county average prices, and statistically controlling for a range of other regional characteristics, we found that physician practice concentration and prices were significantly associated for twelve of the fifteen procedures we studied. For these procedures, counties with the highest average physician concentrations had prices 8-26 percent higher than prices in the lowest-concentration counties. We concluded that the level of physician competition is frequently associated with the prices paid for services. Policies that would influence physician practice organization should take this into consideration. Project HOPE—The People-to-People Health Foundation, Inc.
Comparison of Measures of Organizational Effectiveness in U.K. Higher Education.
Lysons, Art; Hatherly, David; Mitchell, David A.
1998-01-01
Research on the organizational effectiveness of higher education institutions in the United Kingdom and Australia is compared with research on United States higher education. Focus is on identification of and statistical discrimination between institution types, based on faculty and administrator perceptions and values. (MSE)
Targeted search for continuous gravitational waves: Bayesian versus maximum-likelihood statistics
International Nuclear Information System (INIS)
Prix, Reinhard; Krishnan, Badri
2009-01-01
We investigate the Bayesian framework for detection of continuous gravitational waves (GWs) in the context of targeted searches, where the phase evolution of the GW signal is assumed to be known, while the four amplitude parameters are unknown. We show that the orthodox maximum-likelihood statistic (known as F-statistic) can be rediscovered as a Bayes factor with an unphysical prior in amplitude parameter space. We introduce an alternative detection statistic ('B-statistic') using the Bayes factor with a more natural amplitude prior, namely an isotropic probability distribution for the orientation of GW sources. Monte Carlo simulations of targeted searches show that the resulting Bayesian B-statistic is more powerful in the Neyman-Pearson sense (i.e., has a higher expected detection probability at equal false-alarm probability) than the frequentist F-statistic.
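In symbols (a sketch using generic notation rather than the paper's exact conventions, with $x$ the data, $A$ the four amplitude parameters, and $\mathcal{H}_s$, $\mathcal{H}_n$ the signal and noise hypotheses), the Bayes factor marginalizes the likelihood ratio over the amplitude prior:

```latex
B(x) \;=\; \frac{P(x \mid \mathcal{H}_s)}{P(x \mid \mathcal{H}_n)}
     \;=\; \int \frac{P(x \mid A, \mathcal{H}_s)}{P(x \mid \mathcal{H}_n)}\,
            P(A \mid \mathcal{H}_s)\, \mathrm{d}A .
```

Under this reading, the F-statistic corresponds (up to a monotone transformation) to a flat, physically unmotivated prior $P(A \mid \mathcal{H}_s)$ over the amplitude coordinates, while the B-statistic replaces it with an isotropic prior on source orientation; the Neyman-Pearson comparison in the abstract is then a comparison of these two prior choices.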
Higher Education in Non-Standard Wage Contracts
Rosti, Luisa; Chelli, Francesco
2012-01-01
Purpose: The purpose of this paper is to verify whether higher education increases the likelihood of young Italian workers moving from non-standard to standard wage contracts. Design/methodology/approach: The authors exploit a data set on labour market flows, produced by the Italian National Statistical Office, by interviewing about 85,000…
[Comment on] Statistical discrimination
Chinn, Douglas
In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.
International Nuclear Information System (INIS)
Eliazar, Iddo
2017-01-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
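Simulating the harmonic Poisson process on a truncated interval makes its scale invariance concrete: the logarithms of the points are uniform, so every decade carries the same expected count. A minimal sketch, where the rate and interval endpoints are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def harmonic_poisson(c, a, b, rng):
    """Sample a Poisson process on [a, b] with harmonic intensity c / x.

    The mean measure of [a, b] is c * ln(b / a); given the count, the
    points are i.i.d. with density proportional to 1/x, i.e. their
    logarithms are uniform on [ln a, ln b].
    """
    n = rng.poisson(c * np.log(b / a))
    return a * (b / a) ** rng.uniform(size=n)

pts = harmonic_poisson(c=100.0, a=1.0, b=1000.0, rng=rng)

# Scale invariance: the counts in [1, 10), [10, 100) and [100, 1000) all
# have the same mean, c * ln(10), because the intensity is harmonic.
counts = [int(np.sum((lo <= pts) & (pts < lo * 10))) for lo in (1, 10, 100)]
print(len(pts), counts)
```

The same inverse-CDF trick extends to any truncation of the positive half-line, which is why the process has no normalizable law on the half-line itself, the point made in the abstract.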
International Nuclear Information System (INIS)
2000-10-01
Denmark's gross energy consumption increased in 1999 by almost 0.5%, while CO2 emissions decreased by 1.4%. Energy Statistics 1999 shows that energy consumption in households and the production industries was the same as the year before, while consumption in the trade and service sectors and for transportation increased. The Danish production of petroleum, natural gas and renewable energy increased in 1999 to 1000 PJ, an increase of 17% compared to 1998. The degree of self-supply increased to 118%, which means that energy production was 18% higher than energy consumption in 1999. This was primarily due to a very large increase, 26%, in the production of petroleum. (LN)
E-Learning in Croatian Higher Education: An Analysis of Students' Perceptions
Dukić, Darko; Andrijanić, Goran
2010-06-01
In recent years, e-learning has taken an important role in Croatian higher education as a result of the strategies defined and the measures undertaken. Nonetheless, in comparison with developed countries, the achievements in e-learning implementation are still unsatisfactory. The efforts to advance e-learning within Croatian higher education therefore need to be intensified, and ongoing work is needed to solve problems in the functioning of e-learning systems, which requires the development of adequate evaluation instruments and methods. One of the key steps in this process is examining and analyzing users' attitudes. This paper presents a study of Croatian students' perceptions of certain aspects of e-learning usage. Given the character of this research, adequate statistical methods were required for the data processing. The results of the analysis indicate that, for the most part, Croatian students have positive perceptions of e-learning, particularly as support for traditional forms of teaching; however, they are not prepared to give up the traditional classroom completely. Using factor analysis, we identified four underlying factors in a collection of variables related to students' perceptions of e-learning, and a number of statistically significant differences in student attitudes were confirmed in terms of gender and year of study. Discriminant analysis was used to determine the discriminant functions that distinguish the defined groups of students. With this research we alleviate, to a certain degree, the current lack of data on e-learning evaluation among Croatian students. Since this type of learning is gaining in importance within higher education, such analyses have to be conducted continuously.
Online neural monitoring of statistical learning.
Batterink, Laura J; Paller, Ken A
2017-05-01
The extraction of patterns in the environment plays a critical role in many types of human learning, from motor skills to language acquisition. This process is known as statistical learning. Here we propose that statistical learning has two dissociable components: (1) perceptual binding of individual stimulus units into integrated composites and (2) storing those integrated representations for later use. Statistical learning is typically assessed using post-learning tasks, such that the two components are conflated. Our goal was to characterize the online perceptual component of statistical learning. Participants were exposed to a structured stream of repeating trisyllabic nonsense words and a random syllable stream. Online learning was indexed by an EEG-based measure that quantified neural entrainment at the frequency of the repeating words relative to that of individual syllables. Statistical learning was subsequently assessed using conventional measures in an explicit rating task and a reaction-time task. In the structured stream, neural entrainment to trisyllabic words was higher than in the random stream, increased as a function of exposure to track the progression of learning, and predicted performance on the reaction time (RT) task. These results demonstrate that monitoring this critical component of learning via rhythmic EEG entrainment reveals a gradual acquisition of knowledge whereby novel stimulus sequences are transformed into familiar composites. This online perceptual transformation is a critical component of learning. Copyright © 2017 Elsevier Ltd. All rights reserved.
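The entrainment measure can be illustrated on synthetic data. The sketch below (hypothetical sampling rate, duration, and frequencies; not the study's actual stimulus parameters) contrasts spectral power at a word rate of 1.1 Hz against a syllable rate of 3.3 Hz for a "structured" and a "random" stream:

```python
import numpy as np

rng = np.random.default_rng(2)
fs, dur = 250.0, 60.0                       # hypothetical sampling rate / duration
t = np.arange(0.0, dur, 1.0 / fs)

def entrainment_index(x, fs, f_word=1.1, f_syll=3.3):
    """Spectral power at the word rate divided by power at the syllable rate."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    pick = lambda f: power[np.argmin(np.abs(freqs - f))]
    return pick(f_word) / pick(f_syll)

syll_resp = np.sin(2 * np.pi * 3.3 * t)     # syllable-rate response, both streams
noise = rng.normal(0.0, 1.0, t.size)

# Structured stream: syllables group into trisyllabic words, so a word-rate
# (1.1 Hz) component emerges on top of the syllable-rate response
structured = 0.8 * np.sin(2 * np.pi * 1.1 * t) + syll_resp + noise
# Random stream: no word-level regularity, syllable response only
random_stream = syll_resp + noise

ei_structured = entrainment_index(structured, fs)
ei_random = entrainment_index(random_stream, fs)
print(ei_structured, ei_random)
```

Tracking this index over successive exposure windows is the study's online learning curve: the word-rate peak grows only insofar as syllables are perceptually bound into words.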
Trajectories in higher education: ProUni in focus
Felicetti,Vera Lucia; Cabrera,Alberto F.
2017-01-01
Abstract Trajectories in higher education and the University for All Program (ProUni) are the central theme of this paper. The research question was: to what extent were factors experienced during university associated with difficulties in the academic trajectory of ProUni and non-ProUni graduates? The approach was quantitative with an explanatory goal. Descriptive and inferential statistics were used in the data analysis. The research subjects were 197 higher education graduates from a Southern Brazil ...
Directory of Open Access Journals (Sweden)
Yun-Jie Zhang
2015-01-01
Full Text Available Objective: To investigate the value of lysophosphatidic acid (LPA) in the diagnosis of ovarian cancer. Materials and Methods: We first performed a hospital-based, case-control study involving 123 ovarian cancer patients and 101 benign ovarian tumor patients, and then conducted a meta-analysis of 19 case-control studies to assess the correlation between ovarian cancer and plasma LPA levels. Results: The case-control study demonstrated that ovarian cancer patients have increased LPA and cancer antigen (CA-125) levels compared to patients with benign ovarian tumors (LPA: ovarian cancer vs. benign ovarian tumor: 5.28 ± 1.52 vs. 1.82 ± 0.77 μmol/L; CA-125: ovarian cancer vs. benign ovarian tumor: 87.17 ± 45.81 vs. 14.03 ± 10.14 U/mL), both differences being statistically significant (both P < 0.05). LPA, with superior sensitivity, specificity, positive predictive value, negative predictive value, and diagnostic accuracy, outperformed CA-125 in the diagnosis of ovarian cancer (both P < 0.05). The areas under the receiver operating characteristic (ROC) curve in the diagnosis of ovarian cancer (LPA: 0.983; CA-125: 0.910) were statistically significant compared with the reference (both P < 0.001), and the difference between the areas under the ROC curve for LPA and CA-125 was also statistically significant (P < 0.05). The meta-analysis results suggested that plasma LPA levels were higher in ovarian cancer than in benign tissues (standardized mean difference (SMD) = 2.36, 95% confidence interval (CI): 1.61-3.11, P < 0.001) and normal tissues (SMD = 2.32, 95% CI: 1.77-2.87, P < 0.001). Conclusion: LPA shows greater value than CA-125 in the diagnosis of ovarian cancer and may be employed as a biological index for its diagnosis.
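The ROC comparison rests on the Mann-Whitney identity: the AUC equals the probability that a randomly chosen case scores above a randomly chosen control. The sketch below recomputes AUCs on simulated marker values whose means and SDs echo the abstract's summary statistics but are otherwise illustrative (normality is an assumption of the simulation, not of the original study):

```python
import numpy as np

rng = np.random.default_rng(3)

def auc(cases, controls):
    """Mann-Whitney form of the area under the ROC curve:
    P(case score > control score) + 0.5 * P(tie)."""
    c = np.asarray(cases, dtype=float)[:, None]
    k = np.asarray(controls, dtype=float)[None, :]
    return (c > k).mean() + 0.5 * (c == k).mean()

# Illustrative draws loosely matching the reported group summaries
lpa_cancer   = rng.normal(5.28, 1.52, 123)
lpa_benign   = rng.normal(1.82, 0.77, 101)
ca125_cancer = rng.normal(87.17, 45.81, 123)
ca125_benign = rng.normal(14.03, 10.14, 101)

auc_lpa   = auc(lpa_cancer, lpa_benign)
auc_ca125 = auc(ca125_cancer, ca125_benign)
print(auc_lpa, auc_ca125)
```

Because LPA separates the groups by roughly two pooled standard deviations and CA-125 by about one and a half, the simulated AUCs land in the same region as the reported 0.983 and 0.910.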
Statistical mechanics for a class of quantum statistics
International Nuclear Information System (INIS)
Isakov, S.B.
1994-01-01
Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived
Ademoglu, E; Berberoglu, Z; Carlioglu, A; Dellal, F; Gorar, S; Alphan, Z; Uysal, S; Karakurt, F
2014-12-01
The aim of this paper was to compare serum chemerin levels in nonobese and overweight/obese patients with polycystic ovary syndrome (PCOS) with those of lean controls. Seventy women with newly diagnosed or untreated PCOS and 38 age-matched nonobese healthy controls were enrolled in the present study. Participants with PCOS were categorized by Body Mass Index (BMI) as nonobese or overweight/obese. Serum chemerin levels were higher in the overweight/obese PCOS group than in nonobese PCOS women, but the difference did not reach statistical significance. Nonobese healthy controls had significantly lower chemerin levels than either PCOS group. Two further parameters were higher in overweight/obese women with PCOS than in the other two groups, and were also higher in lean patients with PCOS than in healthy controls. Serum chemerin was thus elevated not only in overweight/obese but also in nonobese PCOS women. The physiological significance of elevated serum chemerin in PCOS remains unclear.
Cicinelli, Ettore; Trojano, Giuseppe; Mastromauro, Marcella; Vimercati, Antonella; Marinaccio, Marco; Mitola, Paola Carmela; Resta, Leonardo; de Ziegler, Dominique
2017-08-01
To evaluate the association between endometriosis and chronic endometritis (CE) diagnosed by hysteroscopy, conventional histology, and immunohistochemistry. Case-control study. University hospital. Women with and without endometriosis who had undergone hysterectomy. Retrospective evaluation of 78 women who had undergone hysterectomy and were affected by endometriosis and 78 women without endometriosis. CE was diagnosed based on conventional histology and immunohistochemistry with anti-syndecan-1 antibodies to identify CD138 cells. The prevalence of CE was statistically significantly higher in the women with endometriosis than in the women without endometriosis (33 of 78, 42.3% vs. 12 of 78, 15.4% according to hysteroscopy; and 30 of 78, 38.5% vs. 11 of 78, 14.1% according to histology). The women were divided into two groups: 115 patients without CE and 41 patients with CE. In univariate analysis, parity was associated with a lower risk of CE, and endometriosis was associated with a statistically significantly elevated risk of CE. In multivariate analysis, parity continued to be associated with a lower incidence of CE, whereas endometriosis was associated with a 2.7-fold higher risk. The diagnosis of CE is more frequent in women with endometriosis. Although no etiologic relationship between CE and endometriosis can be established, this study suggests that CE should be considered and, if necessary, ruled out in women with endometriosis, particularly if they have abnormal uterine bleeding. Identification and appropriate treatment of CE may avoid unnecessary surgery. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Data-driven inference for the spatial scan statistic.
Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C
2011-08-02
Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is done, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based in this inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
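The modified inference question can be sketched as a conditional Monte Carlo: compare the observed statistic only against null replications whose most likely cluster has the same size k. The one-dimensional toy below (hypothetical data; candidate clusters are intervals of consecutive areas) contrasts the usual and the size-conditioned p-values:

```python
import numpy as np

rng = np.random.default_rng(4)

def llr(c, C, e):
    """Poisson log-likelihood ratio for a candidate cluster."""
    if c <= e:
        return 0.0
    out = (C - c) * np.log((C - c) / (C - e)) if c < C else 0.0
    return c * np.log(c / e) + out

def best_interval(cases, pop):
    """Most likely cluster among intervals of consecutive areas: (LLR, size)."""
    C, P = cases.sum(), pop.sum()
    best = (0.0, 0)
    for i in range(len(cases)):
        c = e = 0.0
        for j in range(i, len(cases)):
            c += cases[j]
            e += C * pop[j] / P
            best = max(best, (llr(c, C, e), j - i + 1))
    return best

pop = np.full(30, 1000)
cases = rng.poisson(4, 30)
cases[10:13] += 12                       # planted three-area hot spot

obs_llr, k = best_interval(cases, pop)

# Null replications: redistribute the same number of cases by population
sims = [best_interval(rng.multinomial(cases.sum(), pop / pop.sum()), pop)
        for _ in range(499)]

# Usual p-value: rank against the max LLR of every replication
p_usual = (1 + sum(l >= obs_llr for l, _ in sims)) / (1 + len(sims))

# Modified p-value: rank only against replications whose most likely
# cluster has the same size k as the observed one
same_k = [l for l, s in sims if s == k]
p_cond = (1 + sum(l >= obs_llr for l in same_k)) / (1 + len(same_k))

print(k, p_usual, p_cond)
```

The two p-values diverge most when null clusters of size k are rare, which is exactly the uneven-adjustment regime the paper targets.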
Medical Statistics – Mathematics or Oracle? Farewell Lecture
Directory of Open Access Journals (Sweden)
Gaus, Wilhelm
2005-06-01
Full Text Available Certainty is rare in medicine. This is a direct consequence of the individuality of each and every human being and the reason why we need medical statistics. However, statistics have their pitfalls, too. Fig. 1 shows that the suicide rate peaks in youth, while in Fig. 2 the rate is highest in midlife and Fig. 3 in old age. Which of these contradictory messages is right? After an introduction to the principles of statistical testing, this lecture examines the probability with which statistical test results are correct. For this purpose the level of significance and the power of the test are compared with the sensitivity and specificity of a diagnostic procedure. The probability of obtaining correct statistical test results is the same as that for the positive and negative correctness of a diagnostic procedure and therefore depends on prevalence. The focus then shifts to the problem of multiple statistical testing. The lecture demonstrates that for each data set of reasonable size at least one test result proves to be significant - even if the data set is produced by a random number generator. It is extremely important that a hypothesis is generated independently from the data used for its testing. These considerations enable us to understand the gradation of "lame excuses, lies and statistics" and the difference between pure truth and the full truth. Finally, two historical oracles are cited.
Statistical moments of the Strehl ratio
Yaitskova, Natalia; Esselborn, Michael; Gladysz, Szymon
2012-07-01
Knowledge of the statistical characteristics of the Strehl ratio is essential for the performance assessment of existing and future adaptive optics systems. For a full assessment not only the mean value of the Strehl ratio but also its higher statistical moments are important. The variance is related to the stability of an image, and the skewness reflects the chance of having, in a set of short-exposure images, more or fewer images whose quality exceeds the mean. Skewness is a central parameter in the domain of lucky imaging. We present a rigorous theory for the calculation of the mean value, the variance and the skewness of the Strehl ratio. In our approach we represent the residual wavefront as being formed by independent cells. The level of the adaptive optics correction defines the number of cells and the variance of the cells, which are the two main parameters of our theory. The deliverables are the values of the three moments as functions of the correction level. We make no further assumptions except for the statistical independence of the cells.
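The independent-cell picture lends itself to a quick Monte Carlo check. The sketch below is a simplified stand-in for the paper's analytical theory: it treats the residual wavefront as N independent Gaussian phase patches and estimates the three moments of the instantaneous Strehl ratio numerically.

```python
import numpy as np

def strehl_moments(n_cells, sigma2, n_samples=20000, seed=1):
    """Monte Carlo moments of the instantaneous Strehl ratio under the
    independent-cell model: n_cells patches with Gaussian phase of
    variance sigma2 (rad^2)."""
    rng = np.random.default_rng(seed)
    phases = rng.normal(0.0, np.sqrt(sigma2), size=(n_samples, n_cells))
    field = np.exp(1j * phases).mean(axis=1)   # average complex amplitude
    s = np.abs(field) ** 2                     # instantaneous Strehl ratio
    mean, var = s.mean(), s.var()
    skew = ((s - mean) ** 3).mean() / var ** 1.5
    return mean, var, skew

mean_lo, var_lo, skew_lo = strehl_moments(n_cells=50, sigma2=0.2)
mean_hi, _, _ = strehl_moments(n_cells=50, sigma2=0.05)
# better correction (smaller residual phase variance) raises the mean Strehl
assert 0.0 < mean_lo < mean_hi < 1.0
```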
Australian Indigenous Higher Education: Politics, Policy and Representation
Wilson, Katie; Wilks, Judith
2015-01-01
The growth of Aboriginal and Torres Strait Islander participation in Australian higher education from 1959 to the present is notable statistically, but below population parity. Distinct patterns in government policy-making and programme development, inconsistent funding and political influences, together with Indigenous representation during the…
Higher moments method for generalized Pareto distribution in flood frequency analysis
Zhou, C. R.; Chen, Y. F.; Huang, Q.; Gu, S. H.
2017-08-01
The generalized Pareto distribution (GPD) has proven to be an ideal distribution for fitting peaks-over-threshold series in flood frequency analysis. Several moments-based estimators are applied to estimate the parameters of the GPD. Higher-order linear moments (LH moments) and higher probability weighted moments (HPWM) are linear combinations of probability weighted moments (PWM). In this study, the relationship between them is explored. A series of statistical experiments and a case study are used to compare their performances. The results show that if the same PWM are used in the LH-moments and HPWM methods, the parameter estimates produced by the two methods are unbiased. In particular, when the same PWM are used, the PWM method (or the HPWM method when the order equals 0) gives results identical to those of the linear moments (L-moments) method in parameter estimation. This equivalence also holds when PWM of the same order r ≥ 1 are used in the HPWM and LH-moments methods.
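As a concrete illustration, the L-moment estimator for the two-parameter GPD (with the threshold already subtracted) takes only a few lines; the relations l1 = α/(1+k) and l2 = α/((1+k)(2+k)) follow Hosking's parameterisation. This is a generic sketch, not the study's HPWM/LH-moments implementation.

```python
import numpy as np

def gpd_fit_lmoments(excesses):
    """Method-of-L-moments fit of a two-parameter GPD to threshold
    excesses, using l1 = alpha/(1+k) and l2 = alpha/((1+k)(2+k))
    (Hosking's parameterisation)."""
    x = np.sort(np.asarray(excesses, dtype=float))
    n = x.size
    b0 = x.mean()                                   # PWM beta_0
    b1 = np.sum(np.arange(n) / (n - 1) * x) / n     # PWM beta_1
    l1, l2 = b0, 2.0 * b1 - b0                      # sample L-moments
    k = l1 / l2 - 2.0                               # shape
    alpha = l1 * (1.0 + k)                          # scale
    return k, alpha

# sanity check: exponential data correspond to a GPD with shape k = 0
rng = np.random.default_rng(2)
k_hat, alpha_hat = gpd_fit_lmoments(rng.exponential(3.0, size=50000))
assert abs(k_hat) < 0.1 and abs(alpha_hat - 3.0) < 0.2
```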
Testing for significance of phase synchronisation dynamics in the EEG.
Daly, Ian; Sweeney-Reed, Catherine M; Nasuto, Slawomir J
2013-06-01
A number of tests exist to check for statistical significance of phase synchronisation within the Electroencephalogram (EEG); however, the majority suffer from a lack of generality and applicability. They may also fail to account for temporal dynamics in the phase synchronisation, regarding synchronisation as a constant state instead of a dynamical process. Therefore, a novel test is developed for identifying the statistical significance of phase synchronisation based upon a combination of work characterising temporal dynamics of multivariate time-series and Markov modelling. We show how this method is better able to assess the significance of phase synchronisation than a range of commonly used significance tests. We also show how the method may be applied to identify and classify significantly different phase synchronisation dynamics in both univariate and multivariate datasets.
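For contrast with the dynamical Markov-model test proposed in the paper, the classical static measure of phase synchronisation, the phase-locking value obtained via the Hilbert transform, can be computed as follows (a baseline sketch; the function name and toy signals are illustrative).

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Classical phase-locking value: mean resultant length of the
    instantaneous phase difference obtained via the Hilbert transform."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * dphi))))

rng = np.random.default_rng(3)
t = np.linspace(0.0, 10.0, 2000)
common = np.sin(2 * np.pi * 6.0 * t)           # shared 6 Hz oscillation
x = common + 0.2 * rng.standard_normal(t.size)
y = common + 0.2 * rng.standard_normal(t.size)
plv_sync = phase_locking_value(x, y)
plv_noise = phase_locking_value(rng.standard_normal(t.size),
                                rng.standard_normal(t.size))
assert plv_noise < 0.3 < plv_sync
```

Treating this single number as the degree of synchronisation is exactly the "constant state" view the paper argues against; the sketch is included only as the common point of comparison.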
International Nuclear Information System (INIS)
Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus
2013-01-01
The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with the FBP group, 550 mA (450–600) vs. 650 mA (500–711.25) (median (interquartile range)), respectively, P < 0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84–6.02) vs. 5.84 mSv (3.88–8.39) (median (interquartile range)), respectively, P < 0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P < 0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.
Conversion factors and oil statistics
International Nuclear Information System (INIS)
Karbuz, Sohbet
2004-01-01
World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention directed at statistical caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues, mass to volume equivalencies (barrels to tonnes) and broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers.
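A toy calculation shows how sensitive aggregated figures are to the chosen factor; the barrels-per-tonne values below are illustrative assumptions, not official conversion factors.

```python
# The same crude volume converted with two plausible barrels-per-tonne
# factors (illustrative assumptions, not official figures).
volume_barrels = 10_000_000          # hypothetical annual output
bbl_per_tonne_light = 7.6            # a lighter crude
bbl_per_tonne_heavy = 6.8            # a heavier crude

tonnes_light = volume_barrels / bbl_per_tonne_light
tonnes_heavy = volume_barrels / bbl_per_tonne_heavy
spread_pct = 100.0 * (tonnes_heavy - tonnes_light) / tonnes_light
print(f"{tonnes_light:,.0f} t vs {tonnes_heavy:,.0f} t ({spread_pct:.1f}% apart)")
# → 1,315,789 t vs 1,470,588 t (11.8% apart)
```

An 11.8% swing in a national mass balance from a seemingly minor factor choice is exactly the kind of discrepancy the paper warns about.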
Statistics with JMP graphs, descriptive statistics and probability
Goos, Peter
2015-01-01
Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering, and University of Antwerp, Faculty of Applied Economics, Belgium; David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. A thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic…
DEFF Research Database (Denmark)
Schneider, Jesper Wiborg
2012-01-01
In this paper we discuss and question the use of statistical significance tests in relation to university rankings as recently suggested. We outline the assumptions behind and interpretations of statistical significance tests and relate this to examples from the recent SCImago Institutions Rankin...
Serum endocan level and its prognostic significance in breast cancer patients
Directory of Open Access Journals (Sweden)
Ozturk Ates
2018-04-01
Full Text Available Background: Endocan, also known as endothelial cell specific molecule (ESM), is a novel endothelial dysfunction marker. The aim of this study is to examine the plasma endocan level and its prognostic significance in newly diagnosed breast cancer patients. Methods: A total of 84 patients were enrolled in the study. Plasma endocan level was measured by a specific enzyme-linked immunosorbent assay (ELISA) kit. Ethical approval and informed consent were obtained. Results: At the time of diagnosis, 33 patients had stage 4 disease. The median plasma endocan level was 619.9 ng/L (range 259.9–2813.2), and its level was significantly higher in the metastatic breast cancer group compared with the non-metastatic group. Across molecular subtypes of breast cancer there was no statistically significant difference in plasma endocan level, although levels were higher in patients with Her-2-amplified and triple negative breast cancer (TNBC). Median follow-up time was 11 (1-30) months. Event-free survival (EFS) was 15 months in patients with plasma endocan level lower than 620, versus 4 months in patients with endocan level greater than 620 (p = 0.016). There was no difference between groups in terms of hypertension, age, lymphovascular invasion (LVI), extracapsular extension (ECE), body mass index (BMI), white blood cell (WBC) count, platelet count and plasma endocan level. Conclusion: Plasma endocan levels were higher in metastatic than in non-metastatic breast cancer, and patients with high plasma endocan levels had shorter EFS. Further studies would be useful to assess endocan level as a prognostic factor in breast cancer. Keywords: Endothelial cell specific molecule, Breast cancer, Prognosis
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
On the inflationary perturbations of massive higher-spin fields
Energy Technology Data Exchange (ETDEWEB)
Kehagias, Alex [Physics Division, National Technical University of Athens, 15780 Zografou Campus, Athens (Greece); Riotto, Antonio, E-mail: kehagias@central.ntua.gr, E-mail: Antonio.Riotto@unige.ch [Department of Theoretical Physics and Center for Astroparticle Physics (CAP), 24 quai E. Ansermet, CH-1211 Geneva 4 (Switzerland)
2017-07-01
Cosmological perturbations of massive higher-spin fields are generated during inflation, but they decay on scales larger than the Hubble radius as a consequence of the Higuchi bound. By introducing suitable couplings to the inflaton field, we show that one can obtain statistical correlators of massive higher-spin fields which remain constant or decay very slowly outside the Hubble radius. This opens up the possibility of new observational signatures from inflation.
Kleibergen, F.R.
2003-01-01
We show that the sensitivity of the limit distribution of commonly used GMM statistics to weak and many instruments results from superfluous elements in the higher order expansion of these statistics. When the instruments are strong and their number is small, these elements are of higher order and
Competitiveness - higher education
Directory of Open Access Journals (Sweden)
Labas Istvan
2016-03-01
Full Text Available The European Union plays an important role in the areas of education and training alike. The member states are themselves responsible for organizing and operating their education and training systems, while EU policy aims to support their efforts and to find solutions to the common challenges that arise. The key to a maximally sustainable future lies in education: a highly qualified workforce drives the development, advancement and innovation of the world. Nowadays, the competitiveness of higher education institutions is increasingly appreciated within the national economy. In recent years, the frameworks within which higher education systems operate have gone through a complete transformation. The number of applicants is continuously decreasing in some European countries, so only those institutions that can minimize the loss of students will “survive” this shortfall. In this process, the factors shaping the competitiveness of these budgetary institutions play an important role from the point of view of survival. The more competitive a higher education institution is, the greater the chance that students will want to continue their studies there, and thus the greater its chance of survival compared with institutions lagging behind in the competition. The aim of our treatise is to present the current situation and the main data on EU higher education, and to examine the performance of higher education: to what extent it fulfils the strategy for smart, sustainable and inclusive growth worded in the framework of the Europe 2020 programme. The treatise is based on an analysis of statistical data.
Attitude of teaching faculty towards statistics at a medical university in Karachi, Pakistan.
Khan, Nazeer; Mumtaz, Yasmin
2009-01-01
Statistics is mainly used in biological research to verify clinicians' and researchers' findings and impressions, and gives scientific validity to their inferences. In Pakistan, the educational curriculum is structured such that students who intend to enter the field of biological sciences do not study mathematics after grade 10. Owing to this fragile background in mathematical skills, Pakistani medical professionals feel that they lack an adequate base to understand the basic concepts of statistical techniques when they try to use them in their research or read a scientific article. The aim of the study was to assess the attitude of medical faculty towards statistics. A questionnaire containing 42 close-ended and 4 open-ended questions, related to the attitude and knowledge of statistics, was distributed among the teaching faculty of Dow University of Health Sciences (DUHS). One hundred and sixty-seven filled questionnaires were returned from 374 faculty members (response rate 44.7%). Forty-three percent of the respondents claimed that they had taken 'introductive'-level statistics courses, 63% strongly agreed that a good researcher must have some training in statistics, and 82% of the faculty were in favour (strongly agreed or agreed) of the view that statistics is really useful for research. Only 17% correctly stated that statistics is the science of uncertainty. Half of the respondents accepted that they have problems writing the statistical section of an article. Sixty-four percent indicated that statistical teaching methods were the main reason for the impression of its difficulty, and 53% indicated that co-authorship for a statistician should depend upon his/her contribution to the study. Gender did not show any significant difference among the responses. However, senior faculty showed higher levels of importance for the use of statistics and difficulties of writing the result section of
Statistical process control in nursing research.
Polit, Denise F; Chaboyer, Wendy
2012-02-01
In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
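A minimal individuals (XmR) control chart illustrates the idea: limits are set from a baseline period of common-cause variation, and points beyond them signal special-cause variation such as an intervention effect. This is a generic SPC sketch, not the article's nursing example, and the data are synthetic.

```python
import numpy as np

def shewhart_limits(baseline):
    """Individuals chart limits: centre line +/- 3 sigma, with sigma
    estimated from the average moving range (d2 = 1.128 for n = 2)."""
    baseline = np.asarray(baseline, dtype=float)
    centre = baseline.mean()
    sigma = np.abs(np.diff(baseline)).mean() / 1.128
    return centre - 3.0 * sigma, centre, centre + 3.0 * sigma

rng = np.random.default_rng(4)
pre = rng.normal(50.0, 2.0, size=30)    # baseline: common-cause variation
post = rng.normal(58.0, 2.0, size=10)   # after intervention: shifted mean
lcl, centre, ucl = shewhart_limits(pre)
# baseline stays (mostly) inside the limits; the shifted process breaches them
assert np.mean((pre > lcl) & (pre < ucl)) > 0.9
assert np.sum(post > ucl) >= 1
```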
Jerez, José M; Molina, Ignacio; García-Laencina, Pedro J; Alba, Emilio; Ribelles, Nuria; Martín, Miguel; Franco, Leonardo
2010-10-01
Missing data imputation is an important task in cases where it is crucial to use all available data and not discard records with missing values. This work evaluates the performance of several statistical and machine learning imputation methods that were used to predict recurrence in patients in an extensive real breast cancer data set. Imputation methods based on statistical techniques, e.g., mean, hot-deck and multiple imputation, and machine learning techniques, e.g., multi-layer perceptron (MLP), self-organisation maps (SOM) and k-nearest neighbour (KNN), were applied to data collected through the "El Álamo-I" project, and the results were then compared to those obtained from the listwise deletion (LD) imputation method. The database includes demographic, therapeutic and recurrence-survival information from 3679 women with operable invasive breast cancer diagnosed in 32 different hospitals belonging to the Spanish Breast Cancer Research Group (GEICAM). The accuracies of predictions on early cancer relapse were measured using artificial neural networks (ANNs), in which different ANNs were estimated using the data sets with imputed missing values. The imputation methods based on machine learning algorithms outperformed imputation statistical methods in the prediction of patient outcome. Friedman's test revealed a significant difference (p=0.0091) in the observed area under the ROC curve (AUC) values, and the pairwise comparison test showed that the AUCs for MLP, KNN and SOM were significantly higher (p=0.0053, p=0.0048 and p=0.0071, respectively) than the AUC from the LD-based prognosis model. The methods based on machine learning techniques were the most suited for the imputation of missing values and led to a significant enhancement of prognosis accuracy compared to imputation methods based on statistical procedures. Copyright © 2010 Elsevier B.V. All rights reserved.
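A minimal k-nearest-neighbour imputer conveys the flavour of the machine learning approaches compared in the study; this sketch is not the GEICAM analysis pipeline, and the toy data are invented.

```python
import numpy as np

def knn_impute(X, k=3):
    """Replace each missing entry by the mean of that feature over the
    k rows closest (in the observed features) to the incomplete row."""
    X = np.array(X, dtype=float)
    out = X.copy()
    for i, row in enumerate(X):
        miss = np.isnan(row)
        if not miss.any():
            continue
        obs = ~miss
        # candidate donors: other rows observing all features missing here
        donors = np.where(~np.isnan(X[:, miss]).any(axis=1)
                          & (np.arange(len(X)) != i))[0]
        dist = np.sqrt(np.nanmean((X[donors][:, obs] - row[obs]) ** 2, axis=1))
        nearest = donors[np.argsort(dist)[:k]]
        out[i, miss] = X[nearest][:, miss].mean(axis=0)
    return out

X = [[1.0, 2.0], [1.1, 2.1], [0.9, 1.9], [1.0, np.nan], [5.0, 9.0]]
Xi = knn_impute(X, k=3)
assert not np.isnan(Xi).any()
assert abs(Xi[3, 1] - 2.0) < 0.2   # neighbours 0-2 give mean 2.0
```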
Statistics Anxiety and Business Statistics: The International Student
Bell, James A.
2008-01-01
Does the international student suffer from statistics anxiety? To investigate this, the Statistics Anxiety Rating Scale (STARS) was administered to sixty-six beginning statistics students, including twelve international students and fifty-four domestic students. Due to the small number of international students, nonparametric methods were used to…
Higher fuel prices are associated with lower air pollution levels.
Barnett, Adrian G; Knibbs, Luke D
2014-05-01
Air pollution is a persistent problem in urban areas, and traffic emissions are a major cause of poor air quality. Policies to curb pollution levels often involve raising the price of using private vehicles, for example, congestion charges. We were interested in whether higher fuel prices were associated with decreased air pollution levels. We examined an association between diesel and petrol prices and four traffic-related pollutants in Brisbane from 2010 to 2013. We used a regression model and examined pollution levels up to 16 days after the price change. Higher diesel prices were associated with statistically significant short-term reductions in carbon monoxide and nitrogen oxides. Changes in petrol prices had no impact on air pollution. Raising diesel taxes in Australia could be justified as a public health measure. As raising taxes is politically unpopular, an alternative political approach would be to remove schemes that put a downward pressure on fuel prices, such as industry subsidies and shopping vouchers that give fuel discounts. Copyright © 2014 Elsevier Ltd. All rights reserved.
Applying Statistical Mechanics to pixel detectors
International Nuclear Information System (INIS)
Pindo, Massimiliano
2002-01-01
Pixel detectors, being made of a large number of active cells of the same kind, can be considered as significant sets to which Statistical Mechanics variables and methods can be applied. By properly redefining well known statistical parameters in order to let them match the ones that actually characterize pixel detectors, an analysis of the way they work can be performed in a totally new perspective. A deeper understanding of pixel detectors is attained, helping in the evaluation and comparison of their intrinsic characteristics and performance
Introductory statistics for the behavioral sciences
Welkowitz, Joan; Cohen, Jacob
1971-01-01
Introductory Statistics for the Behavioral Sciences provides an introduction to statistical concepts and principles. This book emphasizes the robustness of parametric procedures wherein such significant tests as t and F yield accurate results even if such assumptions as equal population variances and normal population distributions are not well met.Organized into three parts encompassing 16 chapters, this book begins with an overview of the rationale upon which much of behavioral science research is based, namely, drawing inferences about a population based on data obtained from a samp
Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.
2014-03-01
The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which reduces the chance of false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.
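The core idea, keeping only basis terms with significant coefficients instead of truncating at a fixed frequency, can be sketched with any orthonormal basis, where the lasso solution reduces to soft-thresholding the projection coefficients. The random orthonormal basis below is a stand-in for the LB eigenfunctions, and the threshold is an illustrative choice.

```python
import numpy as np

def sparse_basis_fit(signal, basis, lam):
    """Project onto an orthonormal basis and soft-threshold the
    coefficients (the lasso solution for an orthonormal design), so
    only significant terms survive rather than a fixed low-pass cut."""
    coeff = basis.T @ signal
    shrunk = np.sign(coeff) * np.maximum(np.abs(coeff) - lam, 0.0)
    return shrunk, basis @ shrunk

rng = np.random.default_rng(5)
n = 128
basis, _ = np.linalg.qr(rng.standard_normal((n, n)))  # stand-in basis
true = np.zeros(n)
true[[2, 40]] = [5.0, 4.0]                            # sparse coefficients
y = basis @ true + 0.1 * rng.standard_normal(n)
shrunk, recon = sparse_basis_fit(y, basis, lam=0.5)
assert set(np.flatnonzero(np.abs(shrunk) > 1.0)) == {2, 40}
assert np.count_nonzero(shrunk) < 10
```

Note that the surviving terms include index 40, a term a fixed low-order truncation at, say, 10 terms would have discarded; that is precisely the motivation given above.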
Spreadsheets as tools for statistical computing and statistics education
Neuwirth, Erich
2000-01-01
Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education on various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.
Directory of Open Access Journals (Sweden)
Rossi Hassad
2018-01-01
Full Text Available Students' attitude, including perceived usefulness, is generally associated with academic success. The related research in statistics education has focused almost exclusively on the role of attitude in explaining and predicting academic learning outcomes, hence there is a paucity of research evidence on how attitude (particularly perceived usefulness) impacts students' intentions to use and stay engaged in statistics beyond the introductory course. This study explored the relationship between college students' perception of the usefulness of an introductory statistics course, their beliefs about where statistics will be most useful, and their intentions to take another statistics course. A cross-sectional study of 106 students was conducted. The mean rating for usefulness was 4.7 (out of 7), with no statistically significant differences based on gender and age. Sixty-four percent reported that they would consider taking another statistics course, and this subgroup rated the course as more useful (p = .01). The majority (67%) reported that statistics would be most useful for either graduate school or research, whereas 14% indicated their job, and 19% were undecided. The 'undecided' students had the lowest mean rating for usefulness of the course (p = .001). Addressing data, in the context of real-world problem-solving and decision-making, could facilitate students to better appreciate the usefulness and practicality of statistics. Qualitative research methods could help to elucidate these findings.
Prognostic significance of macrophage invasion in hilar cholangiocarcinoma
International Nuclear Information System (INIS)
Atanasov, Georgi; Hau, Hans-Michael; Dietel, Corinna; Benzing, Christian; Krenzien, Felix; Brandl, Andreas; Wiltberger, Georg; Matia, Ivan; Prager, Isabel; Schierle, Katrin; Robson, Simon C.; Reutzel-Selke, Anja; Pratschke, Johann; Schmelzle, Moritz; Jonas, Sven
2015-01-01
Tumor-associated macrophages (TAMs) promote tumor progression and have an effect on survival in human cancer. However, little is known regarding their influence on tumor progression and prognosis in human hilar cholangiocarcinoma. We analyzed surgically resected tumor specimens of hilar cholangiocarcinoma (n = 47) for distribution and localization of TAMs, as defined by expression of CD68. Abundance of TAMs was correlated with clinicopathologic characteristics, tumor recurrence and patients’ survival. Statistical analysis was performed using SPSS software. Patients with high density of TAMs in the tumor invasive front (TIF) showed significantly higher local and overall tumor recurrence (both p < 0.05). Furthermore, high density of TAMs was associated with decreased overall (one-year 83.6 % vs. 75.1 %; three-year 61.3 % vs. 42.4 %; both p < 0.05) and recurrence-free survival (one-year 93.9 % vs. 57.4 %; three-year 59.8 % vs. 26.2 %; both p < 0.05). TAMs in TIF and tumor recurrence were confirmed as the only independent prognostic variables in the multivariate survival analysis (all p < 0.05). Overall survival and recurrence-free survival of patients with hilar cholangiocarcinoma significantly improved in patients with low levels of TAMs in the area of TIF, when compared to those with a high density of TAMs. These observations suggest their utilization as valuable prognostic markers in routine histopathologic evaluation, and might indicate future therapeutic approaches by targeting TAMs
Hamed, Moath; Schraml, Frank; Wilson, Jeffrey; Galvin, James; Sabbagh, Marwan N
2018-01-01
To determine whether occipital and cingulate hypometabolism is being under-reported or missed on 18-fluorodeoxyglucose positron emission tomography (FDG-PET) CT scans in patients with Dementia with Lewy Bodies (DLB). Recent studies have reported higher sensitivity and specificity for occipital and cingulate hypometabolism on FDG-PET of DLB patients. This retrospective chart review looked at regions of interest (ROIs) in FDG-PET CT scan reports in 35 consecutive patients with a clinical diagnosis of probable, possible, or definite DLB as defined by the latest DLB Consortium Report. ROIs consisting of glucose hypometabolism in frontal, parietal, temporal, occipital, and cingulate areas were tabulated and charted separately by the authors from the reports. A blinded nuclear medicine physician read the images independently and marked ROIs separately. A Cohen's Kappa coefficient statistic was calculated to determine agreement between the reports and the blinded reads. On the radiology reports, 25.71% and 17.14% of patients were reported as showing occipital and cingulate hypometabolism, respectively. Independent reads demonstrated significant disagreement with the proportions reported on initial reads: 91.43% and 85.71%, respectively. Cohen's Kappa statistic determinations demonstrated significant agreement only with parietal hypometabolism. Occipital and cingulate hypometabolism is under-reported and frequently missed on clinical interpretations of FDG-PET scans of patients with DLB, and the frequency of hypometabolism is even higher than previously reported. Further studies with more statistical power and receiver operating characteristic analyses are needed to delineate the sensitivity and specificity of these in vivo biomarkers.
Data-driven inference for the spatial scan statistic
Directory of Open Access Journals (Sweden)
Duczmal Luiz H
2011-08-01
Full Text Available Abstract Background Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. Results A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is proposed, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based on this inference. Conclusions A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
Register-based statistics statistical methods for administrative data
Wallgren, Anders
2014-01-01
This book provides a comprehensive and up-to-date treatment of theory and practical implementation in register-based statistics. It begins by defining the area before explaining how to structure such systems and detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and how to use IT systems for register-based statistics. Clear details are also given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking.
Energy Technology Data Exchange (ETDEWEB)
Fhager, V
2000-01-01
In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator-driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We proceed towards the requested results in two steps. First, models of the number-dependent energy distributions of the neutrons that are ejected in the spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number-dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the distribution of energy of the spallation neutrons. The other model utilizes Monte Carlo methods for tracking the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogeneous mixture of core material. Our results indicate that the number dependence of the energy
Milic, Natasa M.; Trajkovic, Goran Z.; Bukumiric, Zoran M.; Cirkovic, Andja; Nikolic, Ivan M.; Milin, Jelena S.; Milic, Nikola V.; Savic, Marko D.; Corac, Aleksandar M.; Marinkovic, Jelena M.; Stanisavljevic, Dejana M.
2016-01-01
Background Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. Methods This was a prospective study conducted with third-year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013-14. Students' statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Results Mean exam scores for the blended learning group were higher than for the on-site group for both the final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and the knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023), with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality. Conclusions Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology-assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics. PMID:26859832
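The "medium effect size" claim can be checked from the summary statistics quoted in the abstract (89.36±6.60 vs. 86.06±8.48). This sketch assumes equal group sizes, which the abstract does not state, so the pooled standard deviation is an approximation.

```python
import math

def cohens_d(m1, s1, m2, s2):
    """Cohen's d with an equal-n pooled standard deviation (an assumption here)."""
    pooled = math.sqrt((s1 ** 2 + s2 ** 2) / 2)
    return (m1 - m2) / pooled

d = cohens_d(89.36, 6.60, 86.06, 8.48)
print(round(d, 2))   # 0.43: conventionally a small-to-medium effect
```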
Worm, Bjarne Skjødt; Jensen, Kenneth
2013-01-01
Background and aims The fast development of e-learning and social forums demands that we update our understanding of e-learning and peer learning. We aimed to investigate whether higher, pre-defined levels of e-learning or social interaction in web forums improved students' learning ability. Methods One hundred and twenty Danish medical students were randomized to six groups of 20 students each (eCases level 1, eCases level 2, eCases level 2+, eTextbook level 1, eTextbook level 2, and eTextbook level 2+). All students participated in a pre-test; Group 1 participated in an interactive case-based e-learning program, while Group 2 was presented with textbook material electronically. The 2+ groups were able to discuss the material among themselves in a web forum. The subject was head injury and the associated treatment and observation guidelines in the emergency room. Following the e-learning, all students completed a post-test. Pre- and post-tests both consisted of 25 questions randomly chosen from a pool of 50 different questions. Results All students concluded the study with comparable pre-test results. Students at level 2 (in both groups) improved statistically significantly compared to students at level 1 (p<0.05). There was no statistically significant difference between level 2 and level 2+. However, level 2+ was associated with statistically significantly greater student satisfaction than in the rest of the students (p<0.05). Conclusions This study applies a new way of comparing different types of e-learning using a pre-defined level division and the possibility of peer learning. Our findings show that higher levels of e-learning do in fact provide better results when compared with the same type of e-learning at lower levels. While social interaction in web forums increases student satisfaction, learning ability does not seem to change. Both findings are relevant when designing new e-learning materials.
Software Used to Generate Cancer Statistics - SEER Cancer Statistics
Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.
Understanding Statistics and Statistics Education: A Chinese Perspective
Shi, Ning-Zhong; He, Xuming; Tao, Jian
2009-01-01
In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome.
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Maric, M.; de Haan, M.; Hogendoorn, S.M.; Wolters, L.H.; Huizenga, H.M.
2015-01-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a
Statistical inference and visualization in scale-space for spatially dependent images
Vaughan, Amy
2012-03-01
SiZer (SIgnificant ZERo crossing of the derivatives) is a graphical scale-space visualization tool that allows for statistical inferences. In this paper we develop a spatial SiZer for finding significant features and conducting goodness-of-fit tests for spatially dependent images. The spatial SiZer utilizes a family of kernel estimates of the image and provides not only exploratory data analysis but also statistical inference with spatial correlation taken into account. It is also capable of comparing the observed image with a specific null model being tested by adjusting the statistical inference using an assumed covariance structure. Pixel locations having statistically significant differences between the image and a given null model are highlighted by arrows. The spatial SiZer is compared with the existing independent SiZer via the analysis of simulated data with and without signal on both planar and spherical domains. We apply the spatial SiZer method to the decadal temperature change over some regions of the Earth. © 2011 The Korean Statistical Society.
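The core SiZer idea can be sketched in one dimension (the paper's spatial SiZer extends this to dependent images and adjusts for spatial correlation): smooth the data at several bandwidths with a local linear fit and flag locations where the kernel-weighted slope differs from zero by more than about two standard errors. The standard-error formula below assumes independent noise, which is exactly the simplification the spatial SiZer removes.

```python
import numpy as np

def sizer_map(x, y, bandwidths, grid):
    """Flag the sign of significant kernel-weighted slopes (+1/-1/0)
    at each grid point, for each bandwidth."""
    flags = []
    for h in bandwidths:
        row = []
        for g in grid:
            w = np.exp(-0.5 * ((x - g) / h) ** 2)      # Gaussian kernel weights
            X = np.stack([np.ones_like(x), x - g], axis=1)
            A = X.T @ (w[:, None] * X)                 # weighted normal equations
            beta = np.linalg.solve(A, X.T @ (w * y))   # local linear fit
            resid = y - X @ beta
            sigma2 = np.sum(w * resid ** 2) / np.sum(w)
            se = np.sqrt(sigma2 * np.linalg.inv(A)[1, 1])   # naive independence SE
            row.append(1 if beta[1] > 2 * se else (-1 if beta[1] < -2 * se else 0))
        flags.append(row)
    return np.array(flags)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(200)
m = sizer_map(x, y, bandwidths=[0.05, 0.2], grid=[0.1, 0.4])
print(m[0].tolist())   # fine bandwidth: significantly rising at 0.1, falling at 0.4
```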
Higher order antibunching in intermediate states
International Nuclear Information System (INIS)
Verma, Amit; Sharma, Navneet K.; Pathak, Anirban
2008-01-01
Since the introduction of the binomial state as an intermediate state, different intermediate states have been proposed, and different nonclassical effects have been reported in them. But until now, higher-order antibunching has been predicted in only one type of intermediate state, known as the shadowed negative binomial state. Recently we have shown that higher-order antibunching is not a rare phenomenon [P. Gupta, P. Pandey, A. Pathak, J. Phys. B 39 (2006) 1137]. To establish our earlier claim further, here we show that higher-order antibunching can be seen in different intermediate states, such as the binomial state, reciprocal binomial state, hypergeometric state, generalized binomial state, negative binomial state and photon-added coherent state. We have studied the possibility of observing higher-order sub-Poissonian photon statistics in different limits of intermediate states. The effects of different control parameters on the depth of nonclassicality have also been studied in this connection, and it has been shown that the depth of nonclassicality can be tuned by controlling various physical parameters.
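One common form of the l-th order sub-Poissonian criterion used in this literature is d(l) = <n(n-1)...(n-l)> - <n>^(l+1) < 0. For a binomial state the photon number is binomially distributed, so the factorial moments are available in closed form; the parameters below (M = 10, η = 0.3) are illustrative, and the exact sign convention of the criterion varies between papers.

```python
import math

def factorial_moment(probs, order):
    """<n(n-1)...(n-order+1)> for a photon-number distribution."""
    return sum(p * math.perm(n, order) for n, p in enumerate(probs))

def binomial_state_probs(M, eta):
    """Photon-number distribution of a binomial state."""
    return [math.comb(M, n) * eta ** n * (1 - eta) ** (M - n) for n in range(M + 1)]

probs = binomial_state_probs(M=10, eta=0.3)
mean_n = factorial_moment(probs, 1)            # <n> = M * eta = 3.0
for l in (1, 2, 3):
    d = factorial_moment(probs, l + 1) - mean_n ** (l + 1)
    print(d < 0)   # negative at every order: higher-order sub-Poissonian
```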
The Role of Statistics in Business and Industry
Hahn, Gerald J
2011-01-01
An insightful guide to the use of statistics for solving key problems in modern-day business and industry This book has been awarded the Technometrics Ziegel Prize for the best book reviewed by the journal in 2010. Technometrics is a journal of statistics for the physical, chemical and engineering sciences, published jointly by the American Society for Quality and the American Statistical Association. Criteria for the award include that the book brings together in one volume a body of material previously only available in scattered research articles and having the potential to significantly im
Implementing statistical equating for MRCP(UK) Parts 1 and 2.
McManus, I C; Chis, Liliana; Fox, Ray; Waller, Derek; Tang, Peter
2014-09-26
The MRCP(UK) exam, in 2008 and 2010, changed the standard-setting of its Part 1 and Part 2 examinations from a hybrid Angoff/Hofstee method to statistical equating using Item Response Theory, with UK graduates as the reference group. The present paper considers the implementation of the change, the question of whether the pass rate increased amongst non-UK candidates, any possible role of Differential Item Functioning (DIF), and changes in examination predictive validity after the change. Analysis of data of the MRCP(UK) Part 1 exam from 2003 to 2013 and the Part 2 exam from 2005 to 2013. Inspection suggested that Part 1 pass rates were stable after the introduction of statistical equating, but showed greater annual variation, probably due to stronger candidates taking the examination earlier. Pass rates seemed to have increased in non-UK graduates after equating was introduced, but this was not associated with any changes in DIF after statistical equating. Statistical modelling of the pass rates for non-UK graduates found that pass rates, in both Part 1 and Part 2, were increasing year on year, with the changes probably beginning before the introduction of equating. The predictive validity of Part 1 for Part 2 was higher with statistical equating than with the previous hybrid Angoff/Hofstee method, confirming the utility of IRT-based statistical equating. Statistical equating was successfully introduced into the MRCP(UK) Part 1 and Part 2 written examinations, resulting in higher predictive validity than the previous Angoff/Hofstee standard setting. Concerns about an artefactual increase in pass rates for non-UK candidates after equating were shown not to be well-founded. Most likely the changes resulted from a genuine increase in candidate ability, albeit for reasons which remain unclear, coupled with a cognitive illusion giving the impression of a step-change immediately after equating began. Statistical equating provides a robust standard-setting method, with a better
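The mechanism behind IRT-based equating can be sketched with a Rasch model: the pass mark is fixed on the latent ability scale, so the equated raw-score cut adjusts automatically to the difficulty of each paper. The item difficulties below are invented for illustration and are not MRCP(UK) values.

```python
import math

def expected_raw_score(theta, difficulties):
    """Expected Rasch raw score at ability theta for a paper's items."""
    return sum(1.0 / (1.0 + math.exp(-(theta - b))) for b in difficulties)

theta_cut = 0.0                          # pass mark fixed on the ability scale
easy_paper = [-1.0, -0.5, 0.0, 0.5]      # invented item difficulties (logits)
hard_paper = [-0.5, 0.0, 0.5, 1.0]

# The raw-score pass mark is higher on the easier paper, as equating requires:
print(round(expected_raw_score(theta_cut, easy_paper), 2))   # 2.23
print(round(expected_raw_score(theta_cut, hard_paper), 2))   # 1.77
```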
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.
Statistical Symbolic Execution with Informed Sampling
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
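The Monte Carlo plus Bayesian-estimation part of the approach can be sketched without any symbolic machinery: sample inputs, count hits on a target event, and maintain a Beta posterior over the reach probability. The program under test here is a geometric stand-in, not anything from Symbolic PathFinder.

```python
import random

def target_event(x, y):
    # Stand-in for "a sampled program path reaches the target event"
    return x * x + y * y < 0.25

random.seed(42)
alpha, beta = 1, 1                       # uniform Beta(1, 1) prior on the hit rate
for _ in range(10_000):
    if target_event(random.random(), random.random()):
        alpha += 1
    else:
        beta += 1

posterior_mean = alpha / (alpha + beta)  # Bayesian estimate of the reach probability
print(round(posterior_mean, 2))          # close to the true value pi/16 ~ 0.196
```

Informed sampling would additionally prune exactly-analyzed high-probability paths and renormalize the posterior over the remainder, which is what yields the provable convergence improvement.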
The use and misuse of statistical methodologies in pharmacology research.
Marino, Michael J
2014-01-01
Descriptive, exploratory, and inferential statistics are necessary components of hypothesis-driven biomedical research. Despite the ubiquitous need for these tools, the emphasis on statistical methods in pharmacology has become dominated by inferential methods, often chosen more by the availability of user-friendly software than by any understanding of the data set or the critical assumptions of the statistical tests. Such frank misuse of statistical methodology and the quest to reach the mystical α threshold reflect deficiencies in statistical training. Perhaps more critically, a poor understanding of statistical tools limits the conclusions that may be drawn from a study by divorcing the investigator from their own data. The net result is a decrease in quality and confidence in research findings, fueling recent controversies over the reproducibility of high-profile findings and effects that appear to diminish over time. The recent development of "omics" approaches, leading to the production of massive higher-dimensional data sets, has amplified these issues, making it clear that new approaches are needed to appropriately and effectively mine this type of data. Unfortunately, statistical education in the field has not kept pace. This commentary provides a foundation for an intuitive understanding of statistics that fosters an exploratory approach and an appreciation for the assumptions of various statistical tests, in the hope of increasing the correct use of statistics, the application of exploratory data analysis, and the use of statistical study design, with the goal of increasing reproducibility and confidence in the literature. Copyright © 2013. Published by Elsevier Inc.
Statistical shape analysis with applications in R
Dryden, Ian L
2016-01-01
A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia; investigating protein molecules in bioinformatics; and describing growth of organisms in biology. This book is a significant update of the highly-regarded `Statistical Shape Analysis’ by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while reta...
Monte Carlo testing in spatial statistics, with applications to spatial residuals
DEFF Research Database (Denmark)
Mrkvička, Tomáš; Soubeyrand, Samuel; Myllymäki, Mari
2016-01-01
This paper reviews recent advances in testing in spatial statistics, discussed at the Spatial Statistics conference in Avignon 2015. The rank and directional quantile envelope tests are discussed and practical rules for their use are provided. These tests are global envelope tests with an appropriate type I error probability. Two novel examples are given of their usage. First, in addition to the test based on a classical one-dimensional summary function, the goodness-of-fit of a point process model is evaluated by means of the test based on a higher-dimensional functional statistic, namely...
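The idea of a global envelope test can be sketched with a toy version: simulate summary-function curves under the null, take pointwise extremes over the simulations as a (liberal) global envelope, and reject if the observed curve exits it anywhere. The rank envelope tests reviewed in the paper refine this construction so that the global type I error is exact; everything below is an invented illustration.

```python
import random

random.seed(3)
T = 20                                   # grid of the summary-function argument

def null_curve():
    """One summary-function curve simulated under the null model."""
    return [random.gauss(0.0, 1.0) for _ in range(T)]

sims = [null_curve() for _ in range(199)]
lo = [min(s[t] for s in sims) for t in range(T)]   # pointwise envelope bounds
hi = [max(s[t] for s in sims) for t in range(T)]

def inside(curve):
    return all(lo[t] <= curve[t] <= hi[t] for t in range(T))

shifted = [v + 5.0 for v in null_curve()]    # a clearly deviating "observed" curve
print(inside(sims[0]), inside(shifted))      # a member curve is inside by construction
```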
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in GDP, energy consumption and electricity consumption; carbon dioxide emissions from fossil fuel use; coal consumption; consumption of natural gas; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices in heat production; fuel prices in electricity production; price of electricity by type of consumer; average monthly spot prices at the Nord Pool power exchange; total energy consumption by source and CO2 emissions; supplies and total consumption of electricity (GWh); energy imports by country of origin in January-June 2003; energy exports by recipient country in January-June 2003; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; price of natural gas by type of consumer; price of electricity by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and energy taxes, precautionary stock fees and oil pollution fees.
Clinical significance of quantitative analysis of facial nerve enhancement on MRI in Bell's palsy.
Song, Mee Hyun; Kim, Jinna; Jeon, Ju Hyun; Cho, Chang Il; Yoo, Eun Hye; Lee, Won-Sang; Lee, Ho-Ki
2008-11-01
Quantitative analysis of the facial nerve on the lesion side as well as the normal side, which allowed for more accurate measurement of facial nerve enhancement in patients with facial palsy, showed statistically significant correlation with the initial severity of facial nerve inflammation, although little prognostic significance was shown. This study investigated the clinical significance of quantitative measurement of facial nerve enhancement in patients with Bell's palsy by analyzing the enhancement pattern and correlating MRI findings with initial severity of facial palsy and clinical outcome. Facial nerve enhancement was measured quantitatively by using the region of interest on pre- and postcontrast T1-weighted images in 44 patients diagnosed with Bell's palsy. The signal intensity increase on the lesion side was first compared with that of the contralateral side and then correlated with the initial degree of facial palsy and prognosis. The lesion side showed significantly higher signal intensity increase compared with the normal side in all of the segments except for the mastoid segment. Signal intensity increase at the internal auditory canal and labyrinthine segments showed correlation with the initial degree of facial palsy but no significant difference was found between different prognostic groups.
International Nuclear Information System (INIS)
Garcia, Francisco; Palacio, Carlos; Garcia, Uriel
2012-01-01
Multivariate statistical techniques were used to investigate the temporal and spatial variations of water quality in the Santa Marta coastal area, where a submarine outfall discharges 1 m³/s of domestic wastewater. Two-way analysis of variance (ANOVA), cluster and principal component analysis, and Kriging interpolation were considered for this report. Temporal variation showed two heterogeneous periods: from December to April, and July, when the concentrations of the water quality parameters are higher; in the rest of the year (May, June, August-November) they were significantly lower. The spatial variation revealed two areas where the water quality differs; this difference is related to the proximity to the submarine outfall discharge.
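One step of such a workflow, principal component analysis on a standardized (samples x parameters) water-quality matrix, can be sketched as follows. The matrix below is invented: rows 0-2 mimic a high-concentration season and rows 3-5 a low one, so PC1 separates the two periods.

```python
import numpy as np

X = np.array([
    [8.1, 120.0, 3.9],   # hypothetical parameters (e.g. BOD, coliforms, nutrients)
    [7.9, 130.0, 4.2],
    [8.4, 115.0, 3.7],
    [4.0,  60.0, 1.9],
    [4.3,  55.0, 2.1],
    [3.8,  65.0, 1.8],
])
Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each parameter
U, s, Vt = np.linalg.svd(Z, full_matrices=False)   # PCA via SVD
scores = U * s                                # principal-component scores
explained = s ** 2 / np.sum(s ** 2)
print(explained[0] > 0.9)                     # PC1 dominates this toy matrix
print(bool(np.all(np.sign(scores[:3, 0]) != np.sign(scores[3:, 0]))))  # seasons split
```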
Colon-Berlingeri, Migdalisel; Burrowes, Patricia A
2011-01-01
Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, the addition of statistics and precalculus as prerequisites to core science courses, and the incorporation of interdisciplinary (math-biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We evaluated the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology.
Higher education, Graduate unemployment, Poverty and Economic growth in Tunisia, 1990-2013
Directory of Open Access Journals (Sweden)
Haifa Mefteh
2016-07-01
Full Text Available This paper examines the relationship between economic growth, higher education, unemployment and poverty using properties of time series variables while applying the Ordinary Least Squares (OLS) method. Our study thus contributes to the existing literature by giving the first integrated approach to examining these four-way linkages in the Tunisian context over the period 1990-2013. This paper holds that higher education can affect unemployment, and that graduate unemployment causes poverty, which in turn affects economic growth. Our empirical results show that there is a bi-directional causal relationship between per capita gross domestic product (GDP) and the poverty rate (POV), and also between the number of graduate students (GRA) and school enrollment in tertiary education (ENR), besides unidirectional causal relationships running from the number of graduate students to unemployment with tertiary education (UNP), from higher education expenditure (EXP) to the poverty rate, and from unemployment with tertiary education to the poverty rate. Our empirical results also verify the existence of a positive effect of ENR, GRA and POV on economic growth, while UNP and EXP have a negative influence on economic growth, with only GRA statistically significant. These empirical insights are of particular interest to policy makers as they help build sound economic policies to sustain economic development and improve higher educational quality.
Effects of quantum coherence on work statistics
Xu, Bao-Ming; Zou, Jian; Guo, Li-Sha; Kong, Xiang-Mu
2018-05-01
In the conventional two-point measurement scheme of quantum thermodynamics, quantum coherence is destroyed by the first measurement. Yet coherence plays an important role in quantum thermodynamic processes, and how to describe the work statistics of a quantum coherent process is still an open question. In this paper, we use the full counting statistics method to investigate the effects of quantum coherence on work statistics. First, we give a general discussion and show that for a quantum coherent process, the work statistics differ markedly from those of the two-point measurement scheme: the average work can be increased or decreased, and the work fluctuation can be decreased, by quantum coherence, depending strongly on the relative phase, the energy level structure, and the external protocol. Then, we concretely consider a quenched one-dimensional transverse Ising model and show that quantum coherence has a more significant influence on work statistics in the ferromagnetic regime than in the paramagnetic regime, so that in the presence of quantum coherence the work statistics can exhibit the critical phenomenon even at high temperature.
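The two-point measurement (TPM) baseline against which the coherent work statistics is compared can be sketched numerically. The qubit quench below (sigma_z to sigma_x) is an illustrative stand-in, not the paper's transverse Ising model:

```python
import numpy as np

def tpm_work_distribution(H0, H1, beta):
    """Two-point measurement work distribution for a sudden quench H0 -> H1,
    starting from a thermal state of H0 at inverse temperature beta.
    Returns arrays (work_values, probabilities)."""
    E0, V0 = np.linalg.eigh(H0)   # initial eigenbasis
    E1, V1 = np.linalg.eigh(H1)   # final eigenbasis
    p0 = np.exp(-beta * E0)
    p0 /= p0.sum()                # thermal populations of H0
    # Transition probabilities |<m|n>|^2 between the two eigenbases
    # (sudden quench: the state does not evolve between the measurements)
    T = np.abs(V1.conj().T @ V0) ** 2
    works, probs = [], []
    for n in range(len(E0)):
        for m in range(len(E1)):
            works.append(E1[m] - E0[n])
            probs.append(p0[n] * T[m, n])
    return np.array(works), np.array(probs)

# Qubit quench: H0 = sigma_z, H1 = sigma_x (arbitrary illustrative choice)
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
w, p = tpm_work_distribution(sz, sx, beta=1.0)
avg_work = np.sum(w * p)
print(avg_work)  # equals Tr[H1 rho0] - Tr[H0 rho0] for a sudden quench
```

Coherence effects enter precisely where this scheme fails: the first projective measurement erases the off-diagonal elements of the initial state, which is what the full counting statistics approach avoids.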
A Statistical Study of Serum Cholesterol Level by Gender and Race.
Tharu, Bhikhari Prasad; Tsokos, Chris P
2017-07-25
Cholesterol level (CL) is a growing health concern because it is considered one of the causes of heart disease. A study of cholesterol levels can provide insight into their nature and characteristics. In this cross-sectional study, data were drawn from the National Health and Nutrition Examination Survey (NHANES) II, conducted on a probability sample of approximately 28,000 persons in the USA, with cholesterol levels obtained from laboratory results. Samples were selected so that certain population groups thought to be at high risk of malnutrition were represented. The study included 11,864 persons with CL measurements, 9,602 males and 2,262 females, across three racial groups: whites, blacks, and others. Non-parametric statistical tests and goodness-of-fit tests were used to identify probability distributions. The study concludes that cholesterol level exhibits significant racial and gender differences in terms of probability distributions, which were identified as lognormal and gamma. White individuals appear to be at higher risk than black individuals of having borderline and high-risk cholesterol levels, although black males typically have higher cholesterol, and the variation in cholesterol level differs between females and males.
Shared Leadership Transforms Higher Education IT
Duin, Ann Hill; Cawley, Steve; Gulachek, Bernard; O'Sullivan, Douglas M.; Wollner, Diane
2011-01-01
Globalization, immersive research and learning environments, unlimited access to information and analytics, and fiscal realities continue to impact higher education--and higher education IT. Although IT organizations face immense pressure to meet significantly greater expectations at significantly less cost, with such pressure comes the…
Energy Technology Data Exchange (ETDEWEB)
Teodoro, Rodrigo; Dias, Carla R.B.R.; Osso Junior, Joao A., E-mail: jaosso@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Fernandez Nunez, Eutimio Gustavo [Universidade de Sao Paulo (EP/USP), SP (Brazil). Escola Politecnica. Dept. de Engenharia Quimica
2011-07-01
Precipitation of {sup 99}Mo by {alpha}-benzoin oxime ({alpha}-Bz) is a standard precipitation method for molybdenum due to the high selectivity of this agent. Nowadays, statistical analysis tools are employed in analytical systems to prove their efficiency and feasibility. IPEN has a project aiming at the production of {sup 99}Mo by the {sup 235}U fission route. The first step of the processing is the precipitation of {sup 99}Mo with {alpha}-Bz, which involves several key reaction parameters. The aim of this work is to develop the known acidic route to produce {sup 99}Mo and to optimize the reaction parameters using statistical tools. In order to simulate {sup 99}Mo precipitation, the study was conducted in acidic media using HNO{sub 3}, with {alpha}-Bz as precipitant agent and NaOH/1% H{sub 2}O{sub 2} as dissolver solution. A Mo carrier, KMnO{sub 4} solutions and a {sup 99}Mo tracer were then added to the reaction flask. The reaction parameters ({alpha}-Bz/Mo ratio, Mo carrier, reaction time and temperature, and cooling time before filtration) were evaluated under a fractional factorial design of resolution V. The best value of each reaction parameter was determined by response surface statistical planning. The precipitation and recovery yields of {sup 99}Mo were measured using an HPGe detector. Statistical analysis of the experimental data suggested that the {alpha}-Bz/Mo ratio, reaction time and temperature have a significant impact on {sup 99}Mo precipitation. The optimization showed that higher {alpha}-Bz/Mo ratios, room temperature, and shorter reaction times lead to higher {sup 99}Mo yields. (author)
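A resolution V fractional factorial for five two-level factors, as used here, can be generated with the standard generator E = ABCD (defining relation I = ABCDE). The factor names below are illustrative stand-ins for the five reaction parameters:

```python
from itertools import product

# 2^(5-1) fractional factorial of resolution V for five two-level factors.
# With E = ABCD, no main effect or two-factor interaction is aliased with
# another main effect or two-factor interaction.
factors = ["aBz_Mo_ratio", "Mo_carrier", "reaction_time", "temperature", "cooling_time"]
runs = []
for a, b, c, d in product([-1, 1], repeat=4):
    e = a * b * c * d                  # generator: E = ABCD
    runs.append(dict(zip(factors, (a, b, c, d, e))))

print(len(runs))  # 16 runs instead of 2**5 = 32 for the full factorial
```

Each run assigns a low (-1) or high (+1) coded level to every parameter; the halved run count is what makes screening five parameters experimentally practical.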
Krasnikov, G V; Tiurina, M Ĭ; Tankanag, A V; Piskunova, G M; Cheremis, N K
2014-01-01
The effect of deep breathing, controlled in both rate and amplitude, on heart rate variability (HRV) and on respiration-dependent blood flow oscillations of forearm and finger-pad skin was studied in 29 young healthy volunteers aged 18 to 25 years. To reveal the effect of the branches of the autonomic nervous system on the amplitudes of HRV and respiration-dependent oscillations of skin blood flow, we estimated the parameters of the cardiovascular system in two groups of participants: those with formally high and low sympathovagal balance values. The sympathovagal balance was judged by the magnitude of the LF/HF power ratio calculated for each participant from the spontaneous breathing rhythmogram. It was found that participants with predominant parasympathetic tone had statistically significantly higher amplitudes of HRV and skin blood flow oscillations at breathing rates below 4 cycles per minute than subjects with predominant sympathetic tone. In the forearm skin, where the density of sympathetic innervation is low compared with that in the finger skin, no statistically significant differences in the amplitude of respiratory skin blood flow oscillations were found between the two groups of participants.
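A minimal sketch of the LF/HF power ratio used to judge sympathovagal balance, assuming an evenly resampled RR-interval series and the conventional HRV band edges (the study's exact preprocessing is not specified in this abstract):

```python
import numpy as np
from scipy.signal import welch

def lf_hf_ratio(rr_ms, fs=4.0):
    """LF/HF spectral power ratio from an evenly resampled RR-interval series.
    Conventional HRV bands are assumed: LF 0.04-0.15 Hz, HF 0.15-0.40 Hz."""
    x = np.asarray(rr_ms, float) - np.mean(rr_ms)
    f, pxx = welch(x, fs=fs, nperseg=min(256, len(x)))
    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
    return lf / hf

# Synthetic rhythmogram: a strong 0.25 Hz respiratory component dominates a
# weak 0.10 Hz component, mimicking parasympathetic (HF) predominance.
fs = 4.0
t = np.arange(0, 300, 1 / fs)
rr = 800 + 50 * np.sin(2 * np.pi * 0.25 * t) + 5 * np.sin(2 * np.pi * 0.10 * t)
print(lf_hf_ratio(rr, fs))
```

A ratio well below 1 classifies the subject as parasympathetically dominant; values above 1 indicate sympathetic predominance.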
Ueno, Tamio; Matuda, Junichi; Yamane, Nobuhisa
2013-03-01
To evaluate the occurrence of out-of-acceptable-range results and the accuracy of antimicrobial susceptibility tests, we applied a new statistical tool to the Inter-Laboratory Quality Control Program established by the Kyushu Quality Control Research Group. First, we defined acceptable ranges of minimum inhibitory concentration (MIC) for broth microdilution tests and of inhibitory zone diameter for disk diffusion tests on the basis of Clinical and Laboratory Standards Institute (CLSI) document M100-S21. In the analysis, more than two out-of-acceptable-range results in the 20 tests were considered not allowable according to the CLSI document. Of the 90 participating laboratories, 46 (51%) experienced one or more out-of-acceptable-range results. A binomial test was then applied to each participating laboratory. The results indicated that the occurrences of out-of-acceptable-range results in 11 laboratories were significantly higher than the allowable rate recommended by CLSI. The mean deviation of each of these laboratories was then statistically compared with zero using Student's t-test. The results revealed that 5 of the 11 laboratories reported erroneous test results that systematically drifted to the side of resistance. In conclusion, our statistical approach enabled us to detect significantly higher occurrences and sources of interpretive errors in antimicrobial susceptibility tests; it can therefore provide additional information to improve the accuracy of test results in clinical microbiology laboratories.
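The per-laboratory binomial test can be sketched as follows, assuming the CLSI allowable rate corresponds to a 5% per-test error probability (an assumption for illustration; the paper's exact rate is not given in this abstract):

```python
from scipy import stats

def flag_laboratory(n_out_of_range, n_tests=20, allowable_rate=0.05, alpha=0.05):
    """One-sided binomial test: is this laboratory's count of
    out-of-acceptable-range results improbably high under the allowable rate?"""
    res = stats.binomtest(n_out_of_range, n_tests, allowable_rate,
                          alternative="greater")
    return res.pvalue, res.pvalue < alpha

p, flagged = flag_laboratory(4)   # 4 out-of-range results in 20 tests
print(p, flagged)
```

With these assumed numbers, 4 or more out-of-range results in 20 tests is significant at the 5% level, while 1 or 2 is not, which mirrors the "more than two results is not allowable" rule.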
Statistical reporting inconsistencies in experimental philosophy.
Colombo, Matteo; Duev, Georgi; Nuijten, Michèle B; Sprenger, Jan
2018-01-01
Experimental philosophy (x-phi) is a young field of research at the intersection of philosophy and psychology. It aims to make progress on philosophical questions by using experimental methods traditionally associated with the psychological and behavioral sciences, such as null hypothesis significance testing (NHST). Motivated by recent discussions about a methodological crisis in the behavioral sciences, questions have been raised about the methodological standards of x-phi. Here, we focus on one aspect of this question, namely the rate of inconsistencies in statistical reporting. Previous research has examined the extent to which published articles in psychology and other behavioral sciences present statistical inconsistencies in reporting the results of NHST. In this study, we used the R package statcheck to detect statistical inconsistencies in x-phi, and compared rates of inconsistencies in psychology and philosophy. We found that rates of inconsistencies in x-phi are lower than in the psychological and behavioral sciences. From the point of view of statistical reporting consistency, x-phi seems to do no worse, and perhaps even better, than psychological science. PMID:29649220
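statcheck itself is an R package, but its core consistency check, recomputing the p-value implied by a reported test statistic and degrees of freedom and comparing it with the reported p, can be sketched in Python:

```python
from scipy import stats

def consistent_t(t_value, df, reported_p, two_tailed=True, tol=0.005):
    """statcheck-style check for a reported t-test: recompute the p-value
    from the t statistic and degrees of freedom, then compare it with the
    reported p (tol absorbs rounding in the report)."""
    p = stats.t.sf(abs(t_value), df)
    if two_tailed:
        p *= 2
    return abs(p - reported_p) <= tol

print(consistent_t(2.20, 58, 0.032))   # recomputed p matches the report
print(consistent_t(2.20, 58, 0.20))    # reported p is inconsistent
```

The real statcheck additionally parses APA-formatted result strings out of article text and distinguishes mere inconsistencies from "decision errors" that flip significance.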
Lv, L-F; Jia, H-Y; Zhang, H-F; Hu, Y-X
2017-10-01
To investigate the expression levels and clinical significance of IL-2 (interleukin-2), IL-6 (interleukin-6) and TGF-β (transforming growth factor-β) in elderly patients with goiter and hyperthyroidism. Gender, age, course of disease, BMI (body mass index), serum FT3 (free triiodothyronine), FT4 (free thyroxine), TT3 (total triiodothyronine), TT4 (total thyroxine), TSH (thyroid stimulating hormone), clinical manifestations on admission, and other general clinical data and laboratory examination results were collected and statistically analyzed for the case group of 128 elderly patients with goiter and hyperthyroidism. An additional 128 patients over 60 years old with hyperthyroidism were selected as the control group. The thyroid tissue of the patients in both groups was examined by fine needle aspiration biopsy. The expression of IL-2, IL-6 and TGF-β in the thyroid tissue of all patients was detected by immunohistochemistry, qRT-PCR (real-time quantitative polymerase chain reaction) and Western blot, respectively, and statistically analyzed. The expression levels of IL-2, IL-6 and TGF-β were significantly higher in the patients with goiter and hyperthyroidism (p < 0.05). In patients with goiter, hyperthyroidism, and symptoms of exophthalmos, the expression level of IL-6 was significantly higher than in patients without exophthalmos (p < 0.05), whereas IL-2 and TGF-β expression levels did not differ between patients with and without exophthalmos (p > 0.05). In conclusion, the expression levels of IL-2, IL-6, and TGF-β were significantly increased in patients with senile goiter and hyperthyroidism, but only IL-6 levels were significantly higher in those with exophthalmos symptoms than in those without. The use of IL-2, IL-6, and TGF-β is of great significance in the diagnosis of goiter with hyperthyroidism, especially for elderly patients with atypical clinical symptoms of hyperthyroidism.
Statistical Analysis of Big Data on Pharmacogenomics
Fan, Jianqing; Liu, Han
2013-01-01
This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods: estimating a large covariance matrix to understand correlation structure, estimating an inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differentially expressed genes, proteins and genetic markers for complex diseases, and high-dimensional variable selection for identifying important molecules and understanding molecular mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
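A canonical tool for the large-scale simultaneous testing problem mentioned here is the Benjamini-Hochberg false discovery rate procedure; a minimal sketch (the specific procedures the paper reviews may differ):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean mask of
    discoveries, controlling the false discovery rate at level q."""
    p = np.asarray(pvals, float)
    order = np.argsort(p)
    m = len(p)
    thresholds = q * np.arange(1, m + 1) / m
    below = p[order] <= thresholds
    mask = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.where(below)[0])   # largest i with p_(i) <= q*i/m
        mask[order[: k + 1]] = True      # reject all hypotheses up to rank k
    return mask

# Ten hypothetical gene-level p-values
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(pvals, q=0.05))
```

Unlike a Bonferroni correction, which controls the family-wise error rate and rejects very little at genomic scale, the FDR criterion tolerates a controlled fraction of false discoveries and retains far more power.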
Dispersal of potato cyst nematodes measured using historical and spatial statistical analyses.
Banks, N C; Hodda, M; Singh, S K; Matveeva, E M
2012-06-01
Rates and modes of dispersal of potato cyst nematodes (PCNs) were investigated. Analysis of records from eight countries suggested that PCNs spread a mean distance of 5.3 km/year radially from the site of first detection, and spread 212 km over ≈40 years before detection. Data from four countries with more detailed histories of invasion were analyzed further, using distance from first detection, distance from previous detection, distance from nearest detection, straight line distance, and road distance. Linear distance from first detection was significantly related to the time since the first detection. Estimated rate of spread was 5.7 km/year, and did not differ statistically between countries. Time between the first detection and estimated introduction date varied between 0 and 20 years, and differed among countries. Road distances from nearest and first detection were statistically significantly related to time, and gave slightly higher estimates for rate of spread of 6.0 and 7.9 km/year, respectively. These results indicate that the original site of introduction of PCNs may act as a source for subsequent spread and that this may occur at a relatively constant rate over time regardless of whether this distance is measured by road or by a straight line. The implications of this constant radial rate of dispersal for biosecurity and pest management are discussed, along with the effects of control strategies.
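Estimating a radial spread rate as the slope of distance-from-first-detection on time can be sketched as follows; the detection records here are synthetic, chosen only to illustrate the regression (they are not the paper's data):

```python
import numpy as np
from scipy import stats

# Hypothetical detection records: years since first detection and straight-line
# distance (km) of each new detection from the first detection site.
years = np.array([0, 3, 7, 12, 18, 25, 31, 40])
dist_km = np.array([0, 18, 41, 66, 101, 144, 172, 228])

res = stats.linregress(years, dist_km)
print(round(res.slope, 1), round(res.rvalue ** 2, 3))  # slope ~5.7 km/year here
```

The slope is the radial spread rate in km/year; a near-zero intercept and high R-squared are consistent with spread at a roughly constant rate from the original introduction site, as the study reports.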
Analysis of statistical misconception in terms of statistical reasoning
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the globalization era, because every person must be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. Developing this skill can be done at various levels of education. However, the skill is often weak because many people, students included, assume that statistics is merely counting and applying formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by examining the effect of students' misconceptions on statistical reasoning skill. The sample was 32 students of a mathematics education department who had taken the descriptive statistics course. The mean score on the misconception test was 49.7 (standard deviation 10.6), whereas the mean score on the statistical reasoning skill test was 51.8 (standard deviation 8.5). If 65 is the minimum score for standard achievement of course competence, the students' mean scores fall below the standard. The misconception results indicate which subtopics need attention: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
Statistical Inference at Work: Statistical Process Control as an Example
Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia
2008-01-01
To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
What can we learn from noise? - Mesoscopic nonequilibrium statistical physics.
Kobayashi, Kensuke
2016-01-01
Mesoscopic systems - small electric circuits working in the quantum regime - offer a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from current noise measurements in mesoscopic systems. As an important application of noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of the FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics.
A Statistical Primer: Understanding Descriptive and Inferential Statistics
Gillian Byrne
2007-01-01
As libraries and librarians move more towards evidence-based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence-based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi-square, co...
Solution of the statistical bootstrap with Bose statistics
International Nuclear Information System (INIS)
Engels, J.; Fabricius, K.; Schilling, K.
1977-01-01
A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density
Trends In Funding Higher Education In Romania And EU
Directory of Open Access Journals (Sweden)
Raluca Mariana Dragoescu
2014-05-01
Full Text Available Education is one of the determinants of economic growth in any state, and education funding is thus a very important aspect of public policy. In this article we present the general principles of funding higher education in Romania and how it evolved over the last decade, stressing that public higher education has been consistently underfunded. We also present an overview of the evolution of the main statistical indicators that characterize higher education in Romania - the number of universities and faculties, the number of students, and the number of teachers - revealing discrepancies between their evolution and the evolution of funding. We compared the funding of higher education in Romania and in EU countries, highlighting the fact that Romania should pay special attention to higher education to achieve the performance of other EU member countries.
Effects of SMILE and Trans-PRK on corneal higher order aberrations after myopic correction
Directory of Open Access Journals (Sweden)
Jiao Zhao
2018-02-01
Full Text Available AIM: To observe the effects of small incision lenticule extraction (SMILE) and trans-epithelial photorefractive keratectomy (Trans-PRK) on corneal horizontal coma, vertical coma, spherical aberration and total higher order aberrations after refractive correction for myopia. METHODS: This was a prospective non-randomized cohort study. The cohort included 40 patients (80 eyes) with myopia, who received refraction correction surgery from December 2016 to February 2017 in Leshan Ophthalmic Center. Twenty patients (40 eyes) received SMILE surgery and the other 20 patients (40 eyes) received Trans-PRK surgery. Corneal aberrations were determined by a high-resolution Pentacam Scheimpflug camera before the surgery and at 1 and 3mo after the operation. Statistical analyses were performed using analysis of variance for repeated measures. RESULTS: At 1 and 3mo post-operation, the uncorrected visual acuity in both groups was better than or equal to the preoperative best corrected visual acuity. The preoperative corneal aberrations showed no significant difference between the two groups (P>0.05). Significantly higher aberrations were found after the surgery in both groups (P<0.05), with no significant difference between the two groups (P>0.05). Post-operation, horizontal and vertical coma showed no significant difference between the two groups (P>0.05), while the SMILE group showed lower spherical aberration and lower total higher order aberration than the Trans-PRK group (P<0.05). CONCLUSION: Both SMILE and Trans-PRK increase corneal aberration, and their effects on horizontal and vertical coma are similar. However, SMILE has a smaller influence on spherical aberration and total higher order aberration than Trans-PRK.
A statistical evaluation of asbestos air concentrations
International Nuclear Information System (INIS)
Lange, J.H.
1999-01-01
Both area and personal air samples collected during an asbestos abatement project were matched and statistically analysed. Among the many parameters studied were fibre concentrations and their variability. Mean values for area and personal samples were 0.005 and 0.024 f cm-3 of air, respectively. Summary values for area and personal samples suggest that exposures are low, with no single exposure value exceeding the current OSHA TWA value of 0.1 f cm-3 of air. Within- and between-worker analysis suggests that these data are homogeneous. Comparison of within- and between-worker values suggests that the exposure source and variability for abatement are related more to the process than to individual practices. This supports the importance of control measures for abatement. Study results also suggest that area and personal samples are not statistically related; that is, no association is observed between these two sampling methods when data are analysed by correlation or regression analysis. Personal samples were statistically higher in concentration than area samples. Area sampling cannot be used as a surrogate for asbestos abatement worker exposure. (author)
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was reported as standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The six most common tests (Student's t-test (53%), analysis of variance/covariance (33%), chi-square test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A significance level was specified in 43 (88%) and exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals
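The six most common tests identified in the survey are all available in scipy.stats; a sketch on synthetic data (group values and the 2x2 counts are invented purely for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(50, 10, 40)   # e.g. outcome scores, treatment group (synthetic)
b = rng.normal(55, 10, 40)   # control group (synthetic)

t, p_t = stats.ttest_ind(a, b)                     # Student's t-test
u, p_u = stats.mannwhitneyu(a, b)                  # Mann-Whitney (rank-based)
f, p_f = stats.f_oneway(a, b, rng.normal(52, 10, 40))  # one-way ANOVA
table = np.array([[20, 10], [12, 18]])             # 2x2 categorical outcome
chi2, p_c, dof, _ = stats.chi2_contingency(table)  # chi-square test
odds, p_e = stats.fisher_exact(table)              # Fisher's exact test
print(p_t, p_u, p_f, p_c, p_e)
```

The parametric/non-parametric pairs (t-test vs. Mann-Whitney, chi-square vs. Fisher's exact) are the choices burn care professionals most often have to make, depending on normality and expected cell counts.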
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Rammohan, Anu; Awofeso, Niyi; Fernandez, Renae C
2012-05-08
Despite increased funding of measles vaccination programs by national governments and international aid agencies, structural factors encumber the attainment of childhood measles immunisation at levels that may guarantee herd immunity. One such factor is parental education status. Research on the links between parental education and vaccination has typically focused on the influence of maternal education status. This study aims to demonstrate the independent influence of paternal education status on measles immunisation. Comparable nationally representative survey data were obtained from the six countries with the highest numbers of children missing the measles vaccine in 2008. Logistic regression analysis was applied to examine the influence of paternal education on uptake of the first dose of measles vaccination, independent of maternal education, while controlling for confounding factors such as respondent's age, urban/rural residence, province/state of residence, religion, wealth and occupation. The results show that even if a mother is illiterate, having a father with secondary (high school) schooling or above is statistically significantly and positively associated with the likelihood of a child being vaccinated for measles in the six countries analysed. Paternal education of secondary or higher level was significantly and independently correlated with measles immunisation uptake after controlling for all potential confounders. The influence of paternal education status on measles immunisation uptake was thus found to be statistically significant in the six nations with the biggest gaps in measles immunisation coverage in 2008. This study underscores the imperative of using both maternal and paternal education as screening variables to prospectively identify children at risk of missing measles vaccination.
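A minimal logistic-regression sketch of this modelling approach, fitted by gradient ascent on synthetic data; the variable coding and coefficients are illustrative assumptions, not the study's estimates:

```python
import numpy as np

def fit_logistic(X, y, lr=1.0, steps=5000):
    """Plain gradient ascent on the logistic log-likelihood (intercept added)."""
    X = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return w

# Synthetic survey: P(vaccinated) depends on paternal and maternal education.
rng = np.random.default_rng(1)
n = 2000
paternal = rng.integers(0, 2, n)   # 1 = secondary schooling or above
maternal = rng.integers(0, 2, n)   # 1 = literate
logit = -0.5 + 0.8 * paternal + 0.6 * maternal
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

w = fit_logistic(np.column_stack([paternal, maternal]), y)
print(np.exp(w[1]))  # odds ratio for paternal education, near the true exp(0.8)
```

The exponentiated coefficient on paternal education is the adjusted odds ratio the study reports on: the change in odds of vaccination associated with paternal secondary schooling, holding maternal education fixed.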
New Graphical Methods and Test Statistics for Testing Composite Normality
Directory of Open Access Journals (Sweden)
Marc S. Paolella
2015-07-01
Full Text Available Several graphical methods for testing univariate composite normality from an i.i.d. sample are presented. They are endowed with correct simultaneous error bounds and yield size-correct tests. As all are based on the empirical CDF, they are also consistent for all alternatives. For one test, called the modified stabilized probability test, or MSP, a highly simplified computational method is derived, which delivers the test statistic and also a highly accurate p-value approximation, essentially instantaneously. The MSP test is demonstrated to have higher power against asymmetric alternatives than the well-known and powerful Jarque-Bera test. A further size-correct test, based on combining two test statistics, is shown to have yet higher power. The methodology employed is fully general and can be applied to any i.i.d. univariate continuous distribution setting.
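The Jarque-Bera test used as the power benchmark here is available in scipy.stats; a quick sketch contrasting a symmetric sample with an asymmetric alternative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
normal_sample = rng.normal(size=1000)
skewed_sample = rng.exponential(size=1000)   # asymmetric alternative

jb_n, p_n = stats.jarque_bera(normal_sample)
jb_s, p_s = stats.jarque_bera(skewed_sample)
print(p_n, p_s)
```

Jarque-Bera combines sample skewness and excess kurtosis into one statistic, so a strongly skewed sample yields a vanishingly small p-value; the MSP test described above is claimed to beat it specifically against such asymmetric alternatives.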
Milic, Natasa M; Trajkovic, Goran Z; Bukumiric, Zoran M; Cirkovic, Andja; Nikolic, Ivan M; Milin, Jelena S; Milic, Nikola V; Savic, Marko D; Corac, Aleksandar M; Marinkovic, Jelena M; Stanisavljevic, Dejana M
2016-01-01
Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013-14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p < 0.001). This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics.
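The "medium effect size" reported for the exam-score comparison can be reproduced from the published summary statistics alone. A small sketch, assuming roughly equal group sizes (the blended vs. on-site split is not given in the abstract):

```python
import math

def cohens_d(mean1, sd1, mean2, sd2):
    """Cohen's d using a pooled SD; assumes roughly equal group sizes."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2.0)
    return (mean1 - mean2) / pooled_sd

# Final statistics scores reported in the abstract:
# blended 89.36 +/- 6.60 vs. on-site 86.06 +/- 8.48
d = cohens_d(89.36, 6.60, 86.06, 8.48)
print(round(d, 2))  # 0.43, a medium effect by Cohen's conventions
```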
Corruption Significantly Increases the Capital Cost of Power Plants in Developing Contexts
Directory of Open Access Journals (Sweden)
Kumar Biswajit Debnath
2018-03-01
Full Text Available Emerging economies with rapidly growing populations and energy demand own some of the most expensive power plants in the world. We hypothesized that corruption has a relationship with the capital cost of power plants in developing countries such as Bangladesh. For this study, we analyzed the capital cost of 61 operational and planned power plants in Bangladesh. An initial comparison study revealed that the mean capital cost of a power plant in Bangladesh is twice that of the global average. Statistical analysis then revealed a significant correlation between corruption and the cost of power plants, indicating that higher corruption leads to greater capital cost. The high up-front cost can be a significant burden on the economy, at present and in the future, as most plants are financed through international loans with extended repayment terms. There is, therefore, an urgent need for a review of the procurement and due diligence processes for establishing power plants, and for the implementation of a more transparent system to mitigate the adverse effects of corruption on megaprojects.
Doctoral production in South Africa: Statistics, challenges and ...
African Journals Online (AJOL)
The past few years have witnessed new interest in doctoral production in South Africa. In the first section of the article, it is argued that this new interest has its roots in various higher education policy documents over the past decade. The second part of the article presents some of the most recent statistics on various aspects ...
All of statistics: a concise course in statistical inference
Wasserman, Larry
2004-01-01
This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...
Equilibrium statistical mechanics of lattice models
Lavis, David A
2015-01-01
Most interesting and difficult problems in equilibrium statistical mechanics concern models which exhibit phase transitions. For graduate students and more experienced researchers this book provides an invaluable reference source of approximate and exact solutions for a comprehensive range of such models. Part I contains background material on classical thermodynamics and statistical mechanics, together with a classification and survey of lattice models. The geometry of phase transitions is described and scaling theory is used to introduce critical exponents and scaling laws. An introduction is given to finite-size scaling, conformal invariance and Schramm-Loewner evolution. Part II contains accounts of classical mean-field methods. The parallels between Landau expansions and catastrophe theory are discussed and Ginzburg-Landau theory is introduced. The extension of mean-field theory to higher orders is explored using the Kikuchi-Hijmans-De Boer hierarchy of approximations. In Part III the use of alge...
Southard, Rodney E.
2013-01-01
located in Region 1, 120 were located in Region 2, and 10 were located in Region 3. Streamgages located outside of Missouri were selected to extend the range of data used for the independent variables in the regression analyses. Streamgages included in the regression analyses had 10 or more years of record and were considered to be affected minimally by anthropogenic activities or trends. Regional regression analyses identified three characteristics as statistically significant for the development of regional equations. For Region 1, drainage area, longest flow path, and streamflow-variability index were statistically significant. The range in the standard error of estimate for Region 1 is 79.6 to 94.2 percent. For Region 2, drainage area and streamflow-variability index were statistically significant, and the range in the standard error of estimate is 48.2 to 72.1 percent. For Region 3, drainage area and streamflow-variability index also were statistically significant, with a range in the standard error of estimate of 48.1 to 96.2 percent. Limitations on the methods for estimating low-flow frequency statistics at ungaged locations depend on the method used. The first method outlined for use in Missouri, power curve equations, was developed to estimate the selected statistics for ungaged locations on 28 selected streams with multiple streamgages located on the same stream. A second method uses a drainage-area ratio to compute statistics at an ungaged location using data from a single streamgage on the same stream with 10 or more years of record. Ungaged locations on these streams may use the ratio of the drainage area at an ungaged location to the drainage area at a streamgage location to scale the selected statistic value from the streamgage location to the ungaged location. This method can be used if the drainage area of the ungaged location is within 40 to 150 percent of the streamgage drainage area.
The third method is the use of the regional regression equations.
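The drainage-area-ratio method described above lends itself to a one-line computation with a guard for the report's 40 to 150 percent applicability bound. A hedged sketch (function name and numbers are illustrative, not from the report):

```python
def scale_by_drainage_area(stat_gaged, area_gaged, area_ungaged):
    """Drainage-area-ratio method: scale a streamflow statistic from a gaged
    site to an ungaged site on the same stream. Per the report, applicable
    only when the ungaged area is within 40-150% of the gaged area."""
    ratio = area_ungaged / area_gaged
    if not 0.40 <= ratio <= 1.50:
        raise ValueError("drainage-area ratio %.2f outside the 0.40-1.50 "
                         "applicability range" % ratio)
    return stat_gaged * ratio

# Illustrative numbers only: a 12.0 cfs low-flow statistic at the gage,
# scaled to an ungaged site with 80% of the gage's drainage area.
print(round(scale_by_drainage_area(12.0, 250.0, 200.0), 2))  # 9.6
```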
Mammography Quality Standards Act and Program: MQSA National Statistics (archived scorecard statistics available for 2016, 2017 and 2018).
Inverse Statistics in the Foreign Exchange Market
Jensen, M. H.; Johansen, A.; Petroni, F.; Simonsen, I.
2004-01-01
We investigate intra-day foreign exchange (FX) time series using the inverse statistics analysis developed in [1,2]. Specifically, we study the time-averaged distributions of waiting times needed to obtain a certain increase (decrease) $\rho$ in the price of an investment. The analysis is performed for the Deutsche mark (DM) against the US$ for the full year of 1998, but similar results are obtained for the Japanese yen against the US$. With high statistical significance, the presence of "reson...
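The inverse-statistics quantity described above, the waiting time until an investment first gains a log-price level rho, can be sketched in a few lines (toy price series, not FX data):

```python
import math

def waiting_times(prices, rho):
    """Inverse statistics: for each starting point t, the number of steps
    until the log-return log(p[k]/p[t]) first reaches the gain level rho.
    Starting points that never reach rho are skipped."""
    logp = [math.log(p) for p in prices]
    times = []
    for t in range(len(logp) - 1):
        for k in range(t + 1, len(logp)):
            if logp[k] - logp[t] >= rho:
                times.append(k - t)
                break
    return times

# Toy price series, not real data; rho = 2% log-gain
prices = [100, 101, 99, 103, 104, 102, 106]
print(waiting_times(prices, 0.02))  # [3, 3, 1, 3, 1]
```

The histogram of such waiting times over a long series is the time-averaged distribution the abstract refers to; the decrease case is the same computation with the inequality reversed.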
Ba, Djibril Marie; Sow, Mamadou Saidou; Diack, Aminata; Dia, Khadidiatou; Mboup, Mouhamed Cherif; Fall, Pape Diadie; Fall, Moussa Daouda
2017-12-01
Since the discovery of the ABO blood group system by Karl Landsteiner in 1901, several reports have suggested an important involvement of the ABO blood group system in susceptibility to thrombosis: non-O blood groups, in particular the A blood group, confer a higher risk of venous and arterial thrombosis than group O. Epidemiologic data are typically not available for all racial and ethnic groups. The purpose of this pilot study was to identify a link between ABO blood group and ischemic disease (ID) in Africans, and to analyze whether A blood group individuals were at higher risk of ischemic disease or not. A total of 299 medical records of patients admitted over a three-year period to the cardiology and internal medicine departments of the military hospital of Ouakam in Senegal were reviewed. We studied data on age, gender, past history of hypertension, diabetes, smoking, sedentarism, obesity, hyperlipidemia, use of estrogen-progestin contraceptives and blood group distribution. In each blood group type, we evaluated the prevalence of ischemic and non-ischemic cardiovascular disease. The medical records were then stratified into two categories to evaluate the incidence of ischemic disease: Group 1, patients carrying blood group A; and Group 2, patients carrying a non-A blood group (O, AB or B). Of the 299 patients whose medical records were reviewed, 92 (30.8%) were carrying blood group A, 175 (58.5%) had blood group O, 13 (4.3%) had blood group B, and 19 (6.4%) had blood group AB. The diagnosis of ischemic disease (ID) was more frequent in patients with blood group A (61.2%) than in other blood groups, and the diagnosis of non-ischemic disease (NID) was more frequent in patients with blood group O (73.6%) compared to other groups. In patients with blood group B or AB compared to non-B or non-AB, respectively, there was no statistically significant difference in ID incidence. The main risk factors for ID were smoking (56.5%), hypertension (18.4%) and diabetes (14.3%). In our study
Prognostic significance of MCM2, Ki-67 and gelsolin in non-small cell lung cancer
International Nuclear Information System (INIS)
Yang, Jun; Tan, Dongfeng; Ramnath, Nithya; Moysich, Kirsten B; Asch, Harold L; Swede, Helen; Alrawi, Sadir J; Huberman, Joel; Geradts, Joseph; Brooks, John SJ
2006-01-01
Uncontrolled proliferation and increased motility are hallmarks of neoplastic cells; therefore, markers of proliferation and motility may be valuable in assessing tumor progression and prognosis. MCM2 is a member of the minichromosome maintenance (MCM) protein family. It plays critical roles in the initiation of DNA replication and in replication fork movement, and is intimately related to cell proliferation. Ki-67 is a proliferation antigen that is expressed during all phases of the cell cycle except G0. Gelsolin is an actin-binding protein that regulates the integrity of the actin cytoskeletal structure and facilitates cell motility. In this study, we assessed the prognostic significance of MCM2 and Ki-67, two markers of proliferation, and gelsolin, a marker of motility, in non-small cell lung cancer (NSCLC). A total of 128 patients with pathologically confirmed, resectable NSCLC (stage I-IIIA) were included. Immunohistochemistry was utilized to measure the expression of these markers in formalin-fixed, paraffin-embedded tumor tissues. Staining and scoring of MCM2, Ki-67 and gelsolin were performed independently. Analyses were performed to evaluate the prognostic significance of single expression of each marker, as well as the prognostic significance of composite expression of MCM2 and gelsolin. Cox regression and Kaplan-Meier survival analysis were used for statistical analysis. Of the three markers, higher levels of gelsolin were significantly associated with an increased risk of death (adjusted RR = 1.89, 95% CI = 1.17–3.05, p = 0.01), and higher levels of MCM2 were associated with a non-significant increased risk of death (adjusted RR = 1.36, 95% CI = 0.84–2.20, p = 0.22). Combined, adjusted analyses revealed a significantly poor prognostic effect for higher expression of MCM2 and gelsolin compared to low expression of both biomarkers (RR = 2.32, 95% CI = 1.21–4.45, p = 0.01). Ki-67 did not display an apparent prognostic effect in this study sample. The results suggest
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In this paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single-particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics.
Directory of Open Access Journals (Sweden)
Ozgur Arikan
2015-12-01
Full Text Available Objective: We aimed to compare serum and urinary HER2/neu levels between a healthy control group and patients with non-muscle invasive bladder cancer. Additionally, we evaluated the relationship of HER2/neu levels with tumor stage, grade, recurrence and progression. Materials and Methods: Forty-four patients with primary non-muscle invasive bladder tumors (Group 2) and 40 healthy controls (Group 1) were included in the study. Blood and urinary samples were collected from all patients and HER2/neu levels were measured by the ELISA method. Blood and urinary HER2/neu levels and, additionally, the ratio of urinary HER2/neu levels to urinary creatinine levels were recorded. Demographic data and tumor characteristics were recorded. Results: Mean serum HER2/neu levels were similar between the two groups, with no statistically significant difference. Urinary HER2/neu levels were significantly higher in Group 2 than in Group 1. The ratio of urinary HER2/neu to urinary creatinine was significantly higher in Group 2 than in Group 1 (p=0.021). Serum and urinary HER2/neu levels were not associated with tumor stage, grade, recurrence or progression, while the ratio of urinary HER2/neu to urinary creatinine was significantly higher in high-grade tumors. For urinary HER2/neu, the sensitivity of the test was found to be 20.5% and the specificity 97.5%; for the urinary HER2/neu/urinary creatinine ratio, the sensitivity and specificity of the test were found to be 31.8% and 87.5%, respectively. Conclusions: Urinary HER2/neu and the ratio of urinary HER2/neu to urinary creatinine were significantly higher in patients with bladder cancer compared to healthy subjects. Large series and controlled studies are needed before use as a tumor marker.
Xiang, Mingli; Guo, Li; Ma, Yan; Li, Yi
2016-11-01
To investigate the expression of T helper 17 (Th17) cells and CD4+CD25+Foxp3+ regulatory T cells (Treg) in the peripheral blood (PB) of patients with acute leukemia (AL), and to explore their relationship with disease prognosis. Forty patients diagnosed with acute leukemia at The First Affiliated Hospital of Zhengzhou University from July 2012 to August 2014 were selected as the observation group; meanwhile, 40 healthy people were taken as the control group. Flow cytometry (FCM) was used to detect the levels of Th17 cells and CD4+CD25+Foxp3+ cells in the peripheral blood of the two groups, and the enzyme-linked immunosorbent assay (ELISA) method was used to test the levels of IL-17 and TGF-β in the peripheral blood of the two groups; reverse transcription-polymerase chain reaction (RT-PCR) was adopted to analyze the mRNA levels of RORγT and Foxp3 in peripheral blood. In addition, we examined the levels of Th17 and CD4+CD25+Foxp3+ cells and associated factor levels in patients in remission after AL chemotherapy. The Th17 cells (CD3+CD4+IL-17+) in acute leukemia patients accounted for (1.51±0.27)%, which was significantly higher than that of the control group, (0.36±0.23)%, with statistical significance (t=20.51, Pcells in AL patients was (3.37±0.48)%, which was significantly higher than that of the control group, (1.26±0.27)%, with statistical significance (t=24.23, Pt=7.83, Pt=7.83, Pt=12.27, Pt=7.89, Pcells and CD4+CD25+Foxp3+ cells, and the serum levels of IL-17 and TGF-β in acute leukemia patients all decreased significantly after 6 months of chemotherapy, and the difference was statistically significant (Pcells, CD4+CD25+Foxp3+ cells and their secretory proteins IL-17 and TGF-β and transcription factors were significantly increased in AL patients. Therefore, regular detection of peripheral blood Th17 and Treg cells, as well as their secretory proteins, is useful for monitoring the immune status and prognosis of patients.
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
Lensing corrections to the Eg(z) statistics from large scale structure
Moradinezhad Dizgah, Azadeh; Durrer, Ruth
2016-09-01
We study the impact of the often neglected lensing contribution to galaxy number counts on the Eg statistic, which is used to constrain deviations from GR. This contribution affects both the galaxy-galaxy and the convergence-galaxy spectra, and is larger for the latter. At the higher redshifts probed by upcoming surveys, for instance at z = 1.5, neglecting this term induces an error of (25-40)% in the spectra and therefore in the Eg statistic, which is constructed from the combination of the two. Moreover, including it renders the Eg statistic scale- and bias-dependent, and hence puts its very objective into question.
Statistical conditional sampling for variable-resolution video compression.
Directory of Open Access Journals (Sweden)
Alexander Wong
Full Text Available In this study, we investigate a variable-resolution approach to video compression based on conditional random field (CRF) modeling and statistical conditional sampling in order to further improve compression rate while maintaining high-quality video. In the proposed approach, representative key-frames within a video shot are identified and stored at full resolution. The remaining frames within the video shot are stored and compressed at a reduced resolution. At the decompression stage, a region-based dictionary is constructed from the key-frames and used to restore the reduced-resolution frames to the original resolution via statistical conditional sampling. The sampling approach is based on the conditional probability of the CRF model, by use of the constructed dictionary. Experimental results show that the proposed variable-resolution approach via statistical conditional sampling has potential for improving compression rates when compared to compressing the video at full resolution, while achieving higher video quality when compared to compressing the video at reduced resolution.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
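The measures of location and spread discussed in this chapter summary can be illustrated with Python's stdlib `statistics` module; note how a single outlier pulls the mean and standard deviation while leaving the median nearly untouched (the data values are invented for illustration):

```python
import statistics

# Invented sample with one outlier (9.8) to show robustness of the median
data = [2.1, 2.4, 2.4, 3.0, 3.3, 3.9, 4.1, 4.6, 5.0, 9.8]

mean = statistics.mean(data)      # measure of location, outlier-sensitive
median = statistics.median(data)  # measure of location, robust
sd = statistics.stdev(data)       # measure of spread (sample SD)
print(round(mean, 2), round(median, 2), round(sd, 2))  # 4.06 3.6 2.24
```

A log or rank transformation, as mentioned in the chapter, is one common response when such skew distorts the mean-based summaries.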
Statistical Power in Plant Pathology Research.
Gent, David H; Esker, Paul D; Kriss, Alissa B
2018-01-01
In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
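The relationship described above, power = 1 - P(Type II error) as a function of effect size, sample size, and α, can be sketched with the standard normal approximation for a two-sample comparison (a planning approximation at α = 0.05, not an exact noncentral-t power calculation):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def power_two_sample(d, n_per_group, z_crit=1.959963984540054):
    """Approximate power of a two-sided two-sample comparison at alpha = 0.05
    for standardized effect size d (Cohen's d), via the normal approximation
    power ~= Phi(d * sqrt(n/2) - z_crit)."""
    return norm_cdf(d * math.sqrt(n_per_group / 2.0) - z_crit)

def n_per_group_for_power(d, target=0.8):
    """Smallest per-group sample size reaching the target power."""
    n = 2
    while power_two_sample(d, n) < target:
        n += 1
    return n

# A medium effect (d = 0.5) needs roughly 63 subjects per group
# to reach the conventional 0.8 power threshold under this approximation.
print(n_per_group_for_power(0.5), round(power_two_sample(0.5, 63), 3))
```

This makes the trade-offs listed in the abstract concrete: raising d, raising n, or relaxing α all increase the computed power.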
Statistical inference for template aging
Schuckers, Michael E.
2006-04-01
A change in classification error rates for a biometric device is often referred to as template aging. Here we offer two methods for determining whether the effect of time is statistically significant. The first of these is the use of a generalized linear model to determine if these error rates change linearly over time. This approach generalizes previous work assessing the impact of covariates using generalized linear models. The second approach uses likelihood ratio test methodology. The focus here is on statistical methods for estimation, not on the underlying cause of the change in error rates over time. These methodologies are applied to data from the National Institute of Standards and Technology Biometric Score Set Release 1. The results of these applications are discussed.
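The likelihood ratio approach mentioned above can be illustrated in its simplest form: testing whether a classification error rate differs between two time periods. This is a hedged two-period sketch, not the paper's generalized linear model over many time points, and the counts are invented:

```python
import math

def binom_llik(k, n, p):
    """Binomial log-likelihood, dropping the constant binomial coefficient."""
    if p <= 0.0 or p >= 1.0:
        return 0.0 if k in (0, n) else float("-inf")
    return k * math.log(p) + (n - k) * math.log(1.0 - p)

def lr_test_error_change(k1, n1, k2, n2):
    """Likelihood ratio test that an error rate differs between two time
    periods (k errors out of n attempts in each). Under H0 (no change,
    i.e. no template-aging effect), G is asymptotically chi-square(1)."""
    p_pool = (k1 + k2) / (n1 + n2)
    G = 2.0 * (binom_llik(k1, n1, k1 / n1) + binom_llik(k2, n2, k2 / n2)
               - binom_llik(k1, n1, p_pool) - binom_llik(k2, n2, p_pool))
    p_value = math.erfc(math.sqrt(G / 2.0))  # chi-square(1) survival function
    return G, p_value

# Invented counts: 3% error rate early vs. 6% later
G, p = lr_test_error_change(30, 1000, 60, 1000)
print(round(G, 2), round(p, 4))
```

A significant G here would indicate that the error rate changed between the two collection periods, which is the template-aging question the paper addresses with richer time-indexed models.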
Testing statistical self-similarity in the topology of river networks
Troutman, Brent M.; Mantilla, Ricardo; Gupta, Vijay K.
2010-01-01
Recent work has demonstrated that the topological properties of real river networks deviate significantly from predictions of Shreve's random model. At the same time the property of mean self-similarity postulated by Tokunaga's model is well supported by data. Recently, a new class of network model called random self-similar networks (RSN) that combines self-similarity and randomness has been introduced to replicate important topological features observed in real river networks. We investigate if the hypothesis of statistical self-similarity in the RSN model is supported by data on a set of 30 basins located across the continental United States that encompass a wide range of hydroclimatic variability. We demonstrate that the generators of the RSN model obey a geometric distribution, and self-similarity holds in a statistical sense in 26 of these 30 basins. The parameters describing the distribution of interior and exterior generators are tested to be statistically different and the difference is shown to produce the well-known Hack's law. The inter-basin variability of RSN parameters is found to be statistically significant. We also test generator dependence on two climatic indices, mean annual precipitation and radiative index of dryness. Some indication of climatic influence on the generators is detected, but this influence is not statistically significant with the sample size available. Finally, two key applications of the RSN model to hydrology and geomorphology are briefly discussed.
Directory of Open Access Journals (Sweden)
Natasa M Milic
Full Text Available Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013-14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p < 0.001). This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics.
Cvikl, Barbara; Haubenberger-Praml, Gertraud; Drabo, Petra; Hagmann, Michael; Gruber, Reinhard; Moritz, Andreas; Nell, Andrea
2014-05-09
A low level of education and the migration background of parents are associated with the development of caries in children. The aim of this study was to evaluate whether a higher educational level of parents can overcome risks for the development of caries in immigrants in Vienna, Austria. The educational level of the parents, the school type, and the caries status of 736 randomly selected twelve-year-old children with and without migration background were determined in this cross-sectional study. In children attending school in Vienna, the decayed, missing, and filled teeth (DMFT) index was determined. For statistical analysis, a mixed negative-binomial model was used. The caries status of the children with migration background was significantly worse compared to that of the native Viennese population. A significant interaction was found between migration background and the educational level of the parents (p = 0.045). No interaction was found between the school type and either the migration background (p = 0.220) or the education level of the parents (p = 0.08). In parents with a higher scholarly education level, migration background (p education level, however, migration background and school type had no significant association with DMFT values. These data indicate that children with a migration background are at higher risk of acquiring caries than other Viennese children, even when the parents have received a higher education.
Gene cluster statistics with gene families.
Raghupathy, Narayanan; Durand, Dannie
2009-05-01
Identifying genomic regions that descended from a common ancestor is important for understanding the function and evolution of genomes. In distantly related genomes, clusters of homologous gene pairs are evidence of candidate homologous regions. Demonstrating the statistical significance of such "gene clusters" is an essential component of comparative genomic analyses. However, currently there are no practical statistical tests for gene clusters that model the influence of the number of homologs in each gene family on cluster significance. In this work, we demonstrate empirically that failure to incorporate gene family size in gene cluster statistics results in overestimation of significance, leading to incorrect conclusions. We further present novel analytical methods for estimating gene cluster significance that take gene family size into account. Our methods do not require complete genome data and are suitable for testing individual clusters found in local regions, such as contigs in an unfinished assembly. We consider pairs of regions drawn from the same genome (paralogous clusters), as well as regions drawn from two different genomes (orthologous clusters). Determining cluster significance under general models of gene family size is computationally intractable. By assuming that all gene families are of equal size, we obtain analytical expressions that allow fast approximation of cluster probabilities. We evaluate the accuracy of this approximation by comparing the resulting gene cluster probabilities with cluster probabilities obtained by simulating a realistic, power-law distributed model of gene family size, with parameters inferred from genomic data. Surprisingly, despite the simplicity of the underlying assumption, our method accurately approximates the true cluster probabilities. It slightly overestimates these probabilities, yielding a conservative test. We present additional simulation results indicating the best choice of parameter values for data
Estimation of measurement variance in the context of environment statistics
Maiti, Pulakesh
2015-02-01
The objective of environment statistics is to provide information on the environment and on its most important changes over time and across locations, and to identify the main factors that influence them. Ultimately, environment statistics are required to produce high-quality statistical information, for which timely, reliable and comparable data are needed. The lack of proper and uniform definitions and of unambiguous classifications poses serious problems for procuring good-quality data, and these cause measurement errors. We consider the problem of estimating measurement variance so that measures may be adopted to improve the quality of data on environmental goods and services and on value statements in economic terms. The measurement technique considered here is that of employing personal interviewers, and the sampling design considered is two-stage sampling.
Stepanova, Ekaterina V; Kondrashin, Anatoly V; Sergiev, Vladimir P; Morozova, Lola F; Turbabina, Natalia A; Maksimova, Maria S; Brazhnikov, Alexey I; Shevchenko, Sergei B; Morozov, Evgeny N
2017-01-01
Studies carried out in Moscow residents have revealed that the prevalence of chronic toxoplasmosis is very close to that in the countries of Eastern and Central Europe. Our findings also demonstrated a statistically significant relationship between the rate of traffic accidents and the seroprevalence of chronic toxoplasmosis in drivers who were held responsible for accidents: seroprevalence was 2.37 times higher in drivers who were involved in road accidents than in control groups. These results suggest that the consequences of chronic toxoplasmosis (particularly slower reaction times and decreased concentration) might contribute to the peculiarities of the epidemiology of road traffic accidents in the Russian Federation and might interfere with the successful implementation of the Federal Programme named "Increase road traffic safety". Suggestions for how to overcome this problem are discussed in this paper.
Directory of Open Access Journals (Sweden)
Justin London
2010-01-01
Full Text Available In “National Metrical Types in Nineteenth Century Art Song” Leigh Van Handel gives a sympathetic critique of William Rothstein’s claim that in western classical music of the late 18th and 19th centuries there are discernible differences in the phrasing and metrical practice of German versus French and Italian composers. This commentary (a) examines just what Rothstein means in terms of his proposed metrical typology, (b) questions Van Handel on how she has applied it to a purely melodic framework, (c) amplifies Van Handel’s critique of Rothstein, and then (d) concludes with a rumination on the reach of quantitative (i.e., statistically driven) versus qualitative claims regarding such things as “national metrical types.”
Statistical characterization report for Single-Shell Tank 241-T-107
International Nuclear Information System (INIS)
Cromar, R.D.; Wilmarth, S.R.; Jensen, L.
1994-01-01
This report contains the results of the statistical analysis of data from three core samples obtained from single-shell tank 241-T-107 (T-107). Four specific topics are addressed; they are summarized below. Section 3.0 contains mean concentration estimates of analytes found in T-107. The estimates of "error" associated with the concentration estimates are given as 95% confidence intervals (CI) on the mean. The results given are based on three types of samples: core composite samples, core segment samples, and drainable liquid samples. Section 4.0 contains estimates of the spatial variability (variability between cores and between segments) and the analytical variability (variability between the primary and the duplicate analysis). Statistical tests were performed to test the hypothesis that the between-cores and the between-segments spatial variability is zero. The results of the tests are as follows. Based on the core composite data, the between-cores variance is significantly different from zero for 35 out of 74 analytes; i.e., for 53% of the analytes there is no statistically significant difference between the concentration means for two cores. Based on core segment data, the between-segments variance is significantly different from zero for 22 out of 24 analytes and the between-cores variance is significantly different from zero for 4 out of 24 analytes; i.e., for 8% of the analytes there is no statistically significant difference between segment means, and for 83% of the analytes there is no difference between the means from the three cores. Section 5.0 contains the results of the application of multiple comparison methods to the core composite data, the core segment data, and the drainable liquid data. Section 6.0 contains the results of a statistical test conducted to determine the 222-S Analytical Laboratory's ability to homogenize solid core segments.
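A 95% confidence interval on a mean analyte concentration, of the kind reported in Section 3.0, can be sketched as follows. The function, the three measurement values, the units, and the critical value are illustrative assumptions, not data from the report:

```python
import math
import statistics

def mean_ci(samples, t_crit):
    # Two-sided 95% CI on the mean: mean +/- t_crit * (s / sqrt(n)),
    # where t_crit is the 97.5th percentile of the t distribution
    # with len(samples) - 1 degrees of freedom.
    m = statistics.mean(samples)
    se = statistics.stdev(samples) / math.sqrt(len(samples))
    return m - t_crit * se, m + t_crit * se

# three hypothetical core-composite measurements of one analyte (ug/g);
# 4.303 is the two-sided 95% t critical value for df = 2
lo, hi = mean_ci([104.0, 98.5, 101.2], t_crit=4.303)
```

With only three cores per analyte, the t-based interval is wide, which is consistent with the report's use of 95% CIs to express "error" on the mean.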
Statistics in Schools: Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data. Explore the site for standards-aligned, classroom-ready activities.
Jafarzadeh, Abdollah; Nemati, Maryam; Rezayati, Mohammad Taghi; Nabizadeh, Mansooreh; Ebrahimi, Medhi
2013-07-01
H. pylori infection has been associated with some autoimmune disorders. The aim of this study was to evaluate the serum concentrations of rheumatoid factor and anti-nuclear antibodies in H. pylori-infected peptic ulcer patients, H. pylori-infected asymptomatic carriers, and a healthy control group. A total of 100 H. pylori-infected peptic ulcer patients, 65 asymptomatic carriers, and 30 healthy H. pylori-negative subjects (as a control group) were enrolled into the study. Serum samples of participants were tested for the levels of rheumatoid factor and anti-nuclear antibodies by ELISA. The mean serum levels of rheumatoid factor and anti-nuclear antibodies in the peptic ulcer group were significantly higher in comparison to the control group (ppeptic ulcer patients and asymptomatic carriers groups regarding the mean serum levels of rheumatoid factor and anti-nuclear antibodies. The mean serum level of rheumatoid factor in men with peptic ulcer was significantly higher compared to the group of healthy men (ppeptic ulcer patients or asymptomatic carriers groups, the mean serum levels of rheumatoid factor were higher than that in healthy women, but the differences were not statistically significant. Also, no significant differences were observed between men and women of the peptic ulcer, asymptomatic carrier, and control groups based on the serum levels of anti-nuclear antibodies. The results showed higher serum levels of rheumatoid factor and anti-nuclear antibodies in H. pylori-infected patients with peptic ulcer disease, which may represent H. pylori-related immune disturbance in these patients. Additional follow-up studies are necessary to clarify the clinical significance of these autoantibodies in patients with H. pylori infection.
Directory of Open Access Journals (Sweden)
Luis F. Moncada Mora
2011-12-01
Full Text Available In this article we present the significant determinants of academic performance of new students enrolled in the higher distance education system of Ecuador. A description and correlation of the variables were undertaken to formalize the probabilistic model that confirms the positive, negative, individual and global effects.
Ing, Alex; Schwarzbauer, Christian
2014-01-01
Functional connectivity has become an increasingly important area of research in recent years. At a typical spatial resolution, approximately 300 million connections link each voxel in the brain with every other. This pattern of connectivity is known as the functional connectome. Connectivity is often compared between experimental groups and conditions. Standard methods used to control the type 1 error rate are likely to be insensitive when comparisons are carried out across the whole connectome, due to the huge number of statistical tests involved. To address this problem, two new cluster based methods--the cluster size statistic (CSS) and cluster mass statistic (CMS)--are introduced to control the family wise error rate across all connectivity values. These methods operate within a statistical framework similar to the cluster based methods used in conventional task based fMRI. Both methods are data driven, permutation based and require minimal statistical assumptions. Here, the performance of each procedure is evaluated in a receiver operator characteristic (ROC) analysis, utilising a simulated dataset. The relative sensitivity of each method is also tested on real data: BOLD (blood oxygen level dependent) fMRI scans were carried out on twelve subjects under normal conditions and during the hypercapnic state (induced through the inhalation of 6% CO2 in 21% O2 and 73% N2). Both CSS and CMS detected significant changes in connectivity between normal and hypercapnic states. A family wise error correction carried out at the individual connection level exhibited no significant changes in connectivity.
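The permutation framework underlying CSS and CMS can be sketched as follows. This is a minimal illustration of family-wise error control via the permutation max-statistic on per-connection mean differences; the function name and the synthetic data are assumptions, not the authors' implementation:

```python
import random

def max_stat_threshold(group_a, group_b, n_perm=2000, alpha=0.05, seed=7):
    # Permutation max-statistic: the (1 - alpha) quantile of the maximum
    # absolute group-mean difference across all connections gives a
    # threshold controlling the family-wise error rate over the connectome.
    rng = random.Random(seed)
    n_conn = len(group_a[0])
    n_a = len(group_a)
    pooled = list(group_a) + list(group_b)
    maxima = []
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # relabel subjects
        perm_a, perm_b = pooled[:n_a], pooled[n_a:]
        diffs = (abs(sum(s[c] for s in perm_a) / len(perm_a)
                     - sum(s[c] for s in perm_b) / len(perm_b))
                 for c in range(n_conn))
        maxima.append(max(diffs))
    maxima.sort()
    return maxima[int((1 - alpha) * n_perm) - 1]

# 12 subjects per group, 50 synthetic connectivity values each (illustrative)
rng = random.Random(0)
a = [[rng.gauss(0, 1) for _ in range(50)] for _ in range(12)]
b = [[rng.gauss(0, 1) for _ in range(50)] for _ in range(12)]
thr = max_stat_threshold(a, b, n_perm=500)
```

Cluster-based variants replace the per-connection statistic with the size or mass of supra-threshold clusters before taking the permutation maximum.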
The clinic-statistic study of osteoporosis
Directory of Open Access Journals (Sweden)
Florin MARCU
2008-05-01
Full Text Available Osteoporosis is the most common metabolic bone disease and is characterized by a reduction in bone mass and deterioration of bone quality, conferring a higher risk of fractures and injuries. Osteoporosis reaches clinical attention when it is severe enough to induce microfractures and collapse of vertebral bodies, manifesting as back pain or predisposition to other bone fractures. The aim of the study was to establish a statistical (numeric) ratio between women and men among subjects diagnosed with osteoporosis through DEXA who present with clinical symptomatology. We studied a group of male and female subjects diagnosed with osteoporosis through DEXA at the EURORAD clinic in Oradea from 01.01.2007 to the present time. The result of the study was that the symptomatology of osteoporosis, with pain and even cases of fractures, is more evident in female subjects than in male patients; statistically, a woman/man ratio of 6.1/1 was established.
Access to Higher Education: The Case of Young Indigenous People of Oaxaca and Guerrero
Directory of Open Access Journals (Sweden)
Noemi López-Santiago
2017-04-01
Full Text Available This literature review article aims to present the current state of higher education in Guerrero and Oaxaca, Mexico, in order to show the limited access for the young indigenous population and its relationship to poverty. The issue being addressed is of significance since both Mexican states have an average of schooling below the national average: the human development index in Guerrero is 0.679, and in Oaxaca it is 0.681, two of the lowest indexes in the country. By implementing some tools of descriptive statistics and information about the latest school cycles (2013-2014 and 2014-2015), we found that half of the youth population in these two states lives in rural areas, one-fourth are speakers of an indigenous language, and only slightly more than ten per cent of indigenous young people over 18 years old belong to the total enrollment of higher education.
Statistical Damage Detection of Civil Engineering Structures using ARMAV Models
DEFF Research Database (Denmark)
Andersen, P.; Kirkegaard, Poul Henning
In this paper a statistically based damage detection of a lattice steel mast is performed. By estimation of the modal parameters and their uncertainties it is possible to detect whether some of the modal parameters have changed with a statistical significance. The estimation of the uncertainties ...
Energy Technology Data Exchange (ETDEWEB)
Paul, D. [SSBB and Senior Member-ASQ, Kolkata (India); Mandal, S.N. [Kalyani Govt Engg College, Kalyani (India); Mukherjee, D.; Bhadra Chaudhuri, S.R. [Dept of E. and T. C. Engg, B.E.S.U., Shibpur (India)
2010-10-15
System efficiency and payback time are yet to attain a commercially viable level for solar photovoltaic energy projects. Despite major advances in the prediction of solar radiation data, there is a gap in the extraction of pertinent information from such data, so the available data cannot be effectively utilized for engineering application. This acts as a barrier for the emerging technology. For making accurate engineering and financial calculations regarding any solar energy project, it is crucial to identify and optimize the most significant statistic(s) representing the insolation available to the photovoltaic setup at the installation site. The Quality Function Deployment (QFD) technique has been applied for identifying the statistic(s) that are of high significance from a project designer's point of view. A MATLAB trademark program has been used to build the annual frequency distribution of hourly insolation over any module plane at a given location. Descriptive statistical analysis of such distributions is done through MINITAB trademark. For a Building Integrated Photo Voltaic (BIPV) installation, similar statistical analysis has been carried out for the composite frequency distribution, which is formed by weighted summation of the insolation distributions for the different module planes used in the installation. The most influential statistic(s) of the composite distribution have been optimized through Artificial Neural Network computation. This approach is expected to open up a new horizon in BIPV system design. (author)
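The weighted summation of per-plane insolation distributions described above can be sketched as follows (the function name, bin counts, weights, and histogram values are illustrative assumptions, not the study's data):

```python
def composite_distribution(plane_hists, weights):
    # Composite insolation frequency distribution for a BIPV setup:
    # a weighted sum of the per-module-plane histograms, normalized by
    # the total weight (weights could reflect module areas or capacities).
    total_w = sum(weights)
    n_bins = len(plane_hists[0])
    return [sum(w * h[b] for w, h in zip(weights, plane_hists)) / total_w
            for b in range(n_bins)]

# two module planes, two insolation bins, plane 2 weighted 3x (hypothetical)
comp = composite_distribution([[0.2, 0.8], [0.6, 0.4]], [1, 3])
```

Descriptive statistics (mean, skewness, percentiles) would then be computed on this composite histogram rather than on any single plane's distribution.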
PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual
International Nuclear Information System (INIS)
2013-01-01
The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
International Nuclear Information System (INIS)
Qin Fang; Wen Wen; Chen Ji-Sheng
2014-01-01
The thermal and electrical transport properties of an ideal anyon gas within fractional exclusion statistics are studied. By solving the Boltzmann equation with the relaxation-time approximation, the analytical expressions for the thermal and electrical conductivities of a three-dimensional ideal anyon gas are given. The low-temperature expressions for the two conductivities are obtained by using the Sommerfeld expansion. It is found that the Wiedemann-Franz law should be modified by the higher-order temperature terms, which depend on the statistical parameter g for a charged anyon gas. Neglecting the higher-order terms of temperature, the Wiedemann-Franz law is respected, which gives the Lorenz number. The Lorenz number is a function of the statistical parameter g. (condensed matter: electronic structure, electrical, magnetic, and optical properties)
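For reference, the unmodified lowest-order relation referred to above is the standard Wiedemann-Franz law,

```latex
\frac{\kappa}{\sigma T} \;=\; L_0 \;=\; \frac{\pi^2}{3}\left(\frac{k_B}{e}\right)^2 ,
```

where $\kappa$ is the thermal conductivity, $\sigma$ the electrical conductivity, and $L_0$ the Lorenz number. According to the abstract, for a charged anyon gas this relation acquires higher-order temperature corrections that depend on the statistical parameter g.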
Rethlefsen, Melissa L; Farrell, Ann M; Osterhaus Trzasko, Leah C; Brigham, Tara J
2015-06-01
To determine whether librarian and information specialist authorship was associated with better reported systematic review (SR) search quality. SRs from high-impact general internal medicine journals were reviewed for search quality characteristics and reporting quality by independent reviewers using three instruments, including a checklist of Institute of Medicine Recommended Standards for the Search Process and a scored modification of the Peer Review of Electronic Search Strategies instrument. The level of librarian and information specialist participation was significantly associated with search reproducibility from reported search strategies (χ² = 23.5; P Librarian co-authored SRs had significantly higher odds of meeting 8 of 13 analyzed search standards than those with no librarian participation and six more than those with mentioned librarian participation. One-way ANOVA showed that differences in total search quality scores between all three groups were statistically significant (F(2,267) = 10.1233; P librarian or information specialist co-authors are correlated with significantly higher quality reported search strategies. To minimize bias in SRs, authors and editors could encourage librarian engagement in SRs including authorship as a potential way to help improve documentation of the search strategy. Copyright © 2015 Elsevier Inc. All rights reserved.
Behavioral investment strategy matters: a statistical arbitrage approach
Sun, David; Tsai, Shih-Chuan; Wang, Wei
2011-01-01
In this study, we employ a statistical arbitrage approach to demonstrate that momentum investment strategies tend to work better in periods longer than six months, a result different from findings in the past literature. Compared with standard parametric tests, the statistical arbitrage method shows more clearly that momentum strategies work only over longer formation and holding periods. They also yield positive significant returns in an up market, but negative yet insignificant returns in a down...
Energy Technology Data Exchange (ETDEWEB)
Kyle, Jennifer E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Casey, Cameron P. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Stratton, Kelly G. [National Security Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zink, Erika M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Kim, Young-Mo [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zheng, Xueyun [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Monroe, Matthew E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Weitz, Karl K. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Bloodsworth, Kent J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Orton, Daniel J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Ibrahim, Yehia M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Moore, Ronald J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Lee, Christine G. [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Research Service, Portland Veterans Affairs Medical Center, Portland OR USA; Pedersen, Catherine [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Orwoll, Eric [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Smith, Richard D. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Burnum-Johnson, Kristin E. 
[Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Baker, Erin S. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA
2017-02-05
The use of dried blood spots (DBS) has many advantages over traditional plasma and serum samples, such as the smaller blood volume required, storage at room temperature, and ability for sampling in remote locations. However, understanding the robustness of different analytes in DBS samples is essential, especially in older samples collected for longitudinal studies. Here we analyzed DBS samples collected in 2000-2001 and stored at room temperature and compared them to matched serum samples stored at -80°C to determine if they could be effectively used as specific time points in a longitudinal study following metabolic disease. Four hundred small molecules were identified in both the serum and DBS samples using gas chromatography-mass spectrometry (GC-MS), liquid chromatography-MS (LC-MS) and LC-ion mobility spectrometry-MS (LC-IMS-MS). The identified polar metabolites overlapped well between the sample types, though only one statistically significant polar metabolite in a case-control study was conserved, indicating that degradation occurs in the DBS samples, affecting quantitation. Differences in the lipid identifications indicated that some oxidation occurs in the DBS samples. However, thirty-six statistically significant lipids correlated in both sample types, indicating that lipid quantitation was more stable across the sample types.
Directory of Open Access Journals (Sweden)
K. Strydom
2018-04-01
Full Text Available Managing diversity is one of the major challenges in higher education institutions in South Africa. Additionally, effective strategy implementation is vital for an institution to be successful and sustainable. Questionnaires were distributed to the management of Walter Sisulu University, South Africa, to investigate the relationship between diversity factors and effective strategy implementation. The questionnaires interrogated the effect of the acculturation process, the degree of structural integration, the degree of informal integration, institutional bias and intergroup conflict, and how these factors influence strategy implementation. Structural equation modelling (SEM) was employed as the statistical tool to confirm the hypothetical model. Results of this study revealed that there is no statistically significant relationship between diversity and strategy implementation at the institution, implying that diversity among staff does not impact the successful achievement of strategic objectives in the institution. The findings of the study are contrary to empirical evidence from other studies. Keywords: Education, Sociology, Political science, Psychology
Reed, Donovan S; Apsey, Douglas; Steigleman, Walter; Townley, James; Caldwell, Matthew
2017-11-01
In an attempt to maximize treatment outcomes, refractive surgery techniques are being directed toward customized ablations to correct not only lower-order aberrations but also higher-order aberrations specific to the individual eye. Measurement of the entirety of ocular aberrations is the most definitive means to establish the true effect of refractive surgery on image quality and visual performance. Whether or not there is a statistically significant difference in induced higher-order corneal aberrations between the VISX Star S4 (Abbott Medical Optics, Santa Ana, California) and the WaveLight EX500 (Alcon, Fort Worth, Texas) lasers was examined. A retrospective analysis was performed to investigate the difference in root-mean-square (RMS) value of the higher-order corneal aberrations postoperatively between two currently available laser platforms, the VISX Star S4 and the WaveLight EX500 lasers. The RMS is a compilation of higher-order corneal aberrations. Data from 240 total eyes of active duty military or Department of Defense beneficiaries who completed photorefractive keratectomy (PRK) or laser in situ keratomileusis (LASIK) refractive surgery at the Wilford Hall Ambulatory Surgical Center Joint Warfighter Refractive Surgery Center were examined. Using SPSS statistics software (IBM Corp., Armonk, New York), the mean changes in RMS values between the two lasers and refractive surgery procedures were determined. A Student t test was performed to compare the RMS of the higher-order aberrations of the subjects' corneas from the lasers being studied. A regression analysis was performed to adjust for preoperative spherical equivalent. The study and a waiver of informed consent have been approved by the Clinical Research Division of the 59th Medical Wing Institutional Review Board (Protocol Number: 20150093H). The mean change in RMS value for PRK using the VISX laser was 0.00122, with a standard deviation of 0.02583. The mean change in RMS value for PRK using the
International Nuclear Information System (INIS)
Venkataraman, G.
1992-01-01
Treating the radiation gas as a classical gas, Einstein derived Planck's law of radiation by considering the dynamic equilibrium between atoms and radiation. Dissatisfied with this treatment, S.N. Bose derived Planck's law in another, original way. He treated the problem in generality: he counted how many cells were available for the photon gas in phase space and distributed the photons into these cells. In this manner of distribution, there were three radically new ideas: the indistinguishability of particles, the spin of the photon (with only two possible orientations) and the non-conservation of photon number. This gave rise to a new discipline of quantum statistical mechanics. The physics underlying Bose's discovery, its significance and its role in the development of the concept of the ideal gas, the spin-statistics theorem and spin particles are described. The book has been written in a simple and direct language in an informal style, aiming to stimulate the curiosity of the reader. (M.G.B.)
Expression of Rab25 in non-small cell lung cancer and its clinical significance
Directory of Open Access Journals (Sweden)
Pu ZHOU
2014-03-01
Full Text Available Objective To assess the expression of Rab25 protein in non-small cell lung cancer (NSCLC), and explore the correlation of its expression with tumor proliferation and metastasis. Methods Sixty-one NSCLC specimens (31 cases of squamous cell carcinoma, 26 cases of adenocarcinoma, and 4 cases of adenosquamous carcinoma) from patients who underwent surgical treatment, and 40 specimens of adjacent normal lung tissues, were obtained from Jan. 2009 to Jun. 2010 at Xingqiao Hospital of Third Military Medical University. The MaxVision immunohistochemical method was used to detect the expression of Rab25 in the specimens, and the correlation of the expression with the clinicopathological parameters (patients' sex, age, smoking history, tumor type, differentiation, volume, TNM stage, lymph node metastasis, etc.) was analyzed using the statistical software SPSS 21.0. Results Rab25 protein was mainly expressed in the cytoplasm and cell membrane. The positive rate of Rab25 in NSCLC was 93.4%, which was significantly higher than that in adjacent normal tissues (27.5%, P<0.01). The expression of Rab25 protein was significantly associated with TNM stage and tumor size (P<0.05). Conclusions The expression of Rab25 is obviously higher in NSCLC than in the adjacent normal tissues, and the expression is associated with TNM stage and tumor size: the later the NSCLC stage and the larger the tumor, the higher the Rab25 expression in the NSCLC tissue. DOI: 10.11855/j.issn.0577-7402.2014.02.16
Statistical trend analysis methodology for rare failures in changing technical systems
International Nuclear Information System (INIS)
Ott, K.O.; Hoffmann, H.J.
1983-07-01
A methodology for a statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical trend analysis and probabilistic risk assessment (PRA) are discussed in terms of categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
High-Throughput Nanoindentation for Statistical and Spatial Property Determination
Hintsala, Eric D.; Hangen, Ude; Stauffer, Douglas D.
2018-04-01
Standard nanoindentation tests are "high throughput" compared to nearly all other mechanical tests, such as tension or compression. However, the typical rates of tens of tests per hour can be significantly improved. These higher testing rates enable otherwise impractical studies requiring several thousand indents, such as high-resolution property mapping and detailed statistical studies. However, care must be taken to avoid systematic errors in the measurement, including the choice of indentation depth/spacing to avoid overlap of plastic zones, pileup, and the influence of neighboring microstructural features in the material being tested. Furthermore, since fast loading rates are required, the strain rate sensitivity must also be considered. A review of these effects is given, with the emphasis placed on making complementary standard nanoindentation measurements to address these issues. Experimental applications of the technique, including mapping of welds, microstructures, and composites with varying length scales, along with studying the effect of surface roughness on nominally homogeneous specimens, will be presented.
Statistical methodology for exploring elevational differences in precipitation chemistry
International Nuclear Information System (INIS)
Warren, W.G.; Boehm, M.; Link, D.
1992-01-01
A statistical methodology for exploring the relationships between elevation and precipitation chemistry is outlined and illustrated. The methodology utilizes maximum likelihood estimates and likelihood ratio tests with contour ellipses of assumed bivariate lognormal distributions used to assist in interpretation. The approach was illustrated using 12 NADP/NTN sites located in six study areas in the Wyoming and Colorado Rockies. These sites are part of the Rocky Mountain Deposition Monitoring Project (RMDMP), which was initiated in 1986 to investigate the relationships between elevation and the chemistry of precipitation. The results indicate differences in sulfate concentrations between airsheds, between snow and rain, and between higher and lower elevations. In general, sulfate concentrations in snow are greater at lower elevations and this difference is independent of concentration. A similar relationship for rain was not well established. In addition there is evidence that, overall, the sulfate concentrations differed between the six study areas, although pairwise differences were not always significant
Statistical distribution for generalized ideal gas of fractional-statistics particles
International Nuclear Information System (INIS)
Wu, Y.
1994-01-01
We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
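In the no-mutual-statistics case, the interpolating occupation-number distribution from this line of work (a standard statement of Wu's result, given here for context) takes the form

```latex
n(\epsilon) \;=\; \frac{1}{w(\zeta) + g},
\qquad
w^{\,g}\,(1+w)^{1-g} \;=\; \zeta \equiv e^{(\epsilon-\mu)/k_B T},
```

where g is the statistical parameter. Setting g = 0 gives $w = \zeta - 1$ and recovers Bose-Einstein statistics, while g = 1 gives $w = \zeta$ and recovers Fermi-Dirac statistics, consistent with the interpolation and the fractional exclusion principle described in the abstract.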
Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates.
Luo, Li; Zhu, Yun; Xiong, Momiao
2012-06-01
Genome-wide association studies (GWAS) designed for next-generation sequencing data involve testing association of genomic variants, including common, low-frequency, and rare variants. The current strategies for association studies are well developed for identifying association of common variants with common diseases, but may be ill-suited when large amounts of allelic heterogeneity are present in sequence data. Recently, group tests that analyze collective frequency differences between cases and controls have shifted the variant-by-variant analysis paradigm of GWAS for common variants toward the collective testing of multiple variants in the association analysis of rare variants. However, group tests ignore differences in genetic effects among SNPs at different genomic locations. As an alternative to group tests, we developed a novel genome-information-content-based statistic for testing association of the entire allele frequency spectrum of genomic variation with disease. To evaluate the performance of the proposed statistic, we use large-scale simulations based on whole-genome low-coverage pilot data in the 1000 Genomes Project to calculate the type 1 error rates and power of seven alternative statistics: the genome-information-content-based statistic, the generalized T², the collapsing method, the combined multivariate and collapsing (CMC) method, the individual χ² test, the weighted-sum statistic, and the variable threshold statistic. Finally, we apply the seven statistics to a published resequencing dataset from the ANGPTL3, ANGPTL4, ANGPTL5, and ANGPTL6 genes in the Dallas Heart Study. We report that the genome-information-content-based statistic has significantly improved type 1 error rates and higher power than the other six statistics in both simulated and empirical datasets.
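The collapsing idea behind group tests such as the CMC method can be sketched minimally as follows: subjects are coded by whether they carry any rare variant in the region, and a 2x2 chi-square (1 df, no continuity correction) compares carrier frequencies between cases and controls. The counts are illustrative and this is not the authors' implementation:

```python
def collapsing_test(case_carriers, n_cases, control_carriers, n_controls):
    # Pearson chi-square on the 2x2 table of (carrier, non-carrier)
    # counts for cases vs. controls.
    table = [[case_carriers, n_cases - case_carriers],
             [control_carriers, n_controls - control_carriers]]
    total = n_cases + n_controls
    chi2 = 0.0
    for i in range(2):
        row = sum(table[i])
        for j in range(2):
            col = table[0][j] + table[1][j]
            expected = row * col / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# hypothetical counts: 30/500 case carriers vs. 12/500 control carriers
chi2 = collapsing_test(30, 500, 12, 500)
```

As the abstract notes, such collapsing discards any differences in effect among the individual SNPs, which is what the information-content-based statistic is designed to retain.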
The Status of Native American Women in Higher Education.
Kidwell, Clara Sue
A study of the status of Native American women in higher education obtained questionnaires from 61 undergraduate women at 4 colleges and 9 women with advanced degrees, interviewed 6 women in or about to enter graduate programs, and reviewed previous research and available statistical data. Results indicated that: relatively few Native American…
Critical analysis of adsorption data statistically
Kaushal, Achla; Singh, S. K.
2017-10-01
Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test, and chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment, and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of calculated and tabulated values of t and χ² showed the results in favour of the data collected from the experiment, and this has been shown on probability charts. The K value obtained for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725, both for adsorption of zinc ions onto mango leaf powder.
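The hypothesis tests named above (t test, paired t test, chi-square) can be sketched with SciPy; all numbers below are hypothetical illustrations, not the study's data, and SciPy is assumed to be available:

```python
import numpy as np
from scipy import stats

# Hypothetical % zinc removal measured at the candidate optimum pH
removal = np.array([88.2, 90.1, 87.5, 91.0, 89.4, 88.8])

# One-sample t test: does mean removal differ from a nominal 85%?
t_stat, t_p = stats.ttest_1samp(removal, popmean=85.0)

# Paired t test: removal before vs. after increasing the adsorbent dose
before = np.array([70.3, 68.9, 72.1, 69.5])
after = np.array([88.0, 86.4, 89.2, 87.1])
pair_t, pair_p = stats.ttest_rel(after, before)

# Chi-square goodness of fit: observed vs. expected counts in removal bins
observed = np.array([18, 22, 20])
expected = np.array([20, 20, 20])
chi2, chi2_p = stats.chisquare(observed, expected)

print(t_stat > 0, pair_p < 0.05, 0 <= chi2_p <= 1)
```

The calculated statistic is then compared against the tabulated critical value (equivalently, the p-value against the chosen significance level), as done in the abstract.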
A statistical evaluation of asbestos air concentrations
Energy Technology Data Exchange (ETDEWEB)
Lange, J.H. [Envirosafe Training and Consultants, Pittsburgh, PA (United States)
1999-07-01
Both area and personal air samples collected during an asbestos abatement project were matched and statistically analysed. Among the many parameters studied were fibre concentrations and their variability. Mean values for area and personal samples were 0.005 and 0.024 f cm⁻³ of air, respectively. Summary values for area and personal samples suggest that exposures are low, with no single exposure value exceeding the current OSHA TWA value of 0.1 f cm⁻³ of air. Within- and between-worker analysis suggests that these data are homogeneous. Comparison of within- and between-worker values suggests that the exposure source and variability for abatement are more related to the process than to individual practices. This supports the importance of control measures for abatement. Study results also suggest that area and personal samples are not statistically related; that is, there is no association observed for these two sampling methods when data are analysed by correlation or regression analysis. Personal samples were statistically higher in concentration than area samples. Area sampling cannot be used as a surrogate exposure measure for asbestos abatement workers. (author)
Can Money Buy Happiness? A Statistical Analysis of Predictors for User Satisfaction
Hunter, Ben; Perret, Robert
2011-01-01
2007 data from LibQUAL+[TM] and the ACRL Library Trends and Statistics database were analyzed to determine if there is a statistically significant correlation between library expenditures and usage statistics and library patron satisfaction across 73 universities. The results show that users of larger, better funded libraries have higher…
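A correlation analysis of the kind described can be sketched as follows; the expenditure and satisfaction values are invented for illustration (not the LibQUAL+ data), and SciPy is assumed to be available:

```python
from scipy.stats import pearsonr

# Hypothetical per-university figures: expenditures (millions USD)
# and mean patron satisfaction (LibQUAL-style 1-9 score)
expenditures = [1.2, 2.5, 3.1, 4.8, 5.0, 6.7, 7.3]
satisfaction = [6.1, 6.4, 6.8, 7.2, 7.1, 7.9, 8.0]

# Pearson correlation: is satisfaction linearly related to spending?
r, p = pearsonr(expenditures, satisfaction)
print(r > 0.9, p < 0.05)
```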
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
"Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference." -Eugenia Stoimenova, Journal of Applied Statistics, June 2012. "… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students." -Biometrics, 67, September 2011. This excellently presente
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Audiology patient fall statistics and risk factors compared to non-audiology patients.
Criter, Robin E; Honaker, Julie A
2016-10-01
To compare fall statistics (e.g. incidence, prevalence), fall risks, and characteristics of patients who seek hearing healthcare from an audiologist with those of individuals who have not sought such services. Case-control study. Two groups of community-dwelling older adult patients: 25 audiology patients aged 60 years or older (M age: 69.2 years, SD: 4.5, range: 61-77) and a control group (gender- and age-matched ±2 years) of 25 non-audiology patients (M age: 69.6, SD: 4.7, range: 60-77). Annual incidence of falls (most recent 12 months) was higher in audiology patients (68.0%) than non-audiology patients (28.0%; p = .005). Audiology patients reported a higher incidence of multiple recent falls (p = .025) and more chronic health conditions (p = .028) than non-audiology patients. Significantly more audiology patients fall on an annual basis than non-audiology patients, suggesting that falls are a pervasive issue in general hearing clinics. Further action on the part of healthcare professionals providing audiologic services may be necessary to identify individuals at risk for falling.
Disruptive Technologies in Higher Education
Flavin, Michael
2012-01-01
This paper analyses the role of "disruptive" innovative technologies in higher education. In this country and elsewhere, Higher Education Institutions (HEIs) have invested significant sums in learning technologies, with Virtual Learning Environments (VLEs) being more or less universal, but these technologies have not been universally…
Chorny, Joseph A; Frye, Teresa C; Fisher, Beth L; Remmers, Carol L
2018-03-23
The primary high-risk human papillomavirus (hrHPV) assays in the United States are the cobas (Roche) and the Aptima (Hologic). The cobas assay detects hrHPV by DNA analysis, while the Aptima detects messenger RNA (mRNA) oncogenic transcripts. As the Aptima assay identifies oncogenic expression, it should have a lower rate of hrHPV and genotype detection. The Kaiser Permanente Regional Reference Laboratory in Denver, Colorado, changed its hrHPV assay from the cobas to the Aptima assay. The rates of hrHPV detection and genotyping were compared over successive six-month periods. The overall hrHPV detection rates of the two platforms were similar (9.5% versus 9.1%) and not statistically different. For genotyping, the HPV 16 rate by the cobas was 1.6% and by the Aptima it was 1.1%. These differences were statistically significant, with the Aptima detecting nearly one-third fewer HPV 16 infections. With HPV 18 and HPV 18/45, there was a slightly higher detection rate of HPV 18/45 by the Aptima platform (0.5% versus 0.9%), and this was statistically significant. While HPV 16 represents a low percentage of hrHPV infections, it was detected significantly less often by the Aptima assay than by the cobas assay. This has been previously reported, although not highlighted. Given the test methodologies, one would expect the Aptima to detect less HPV 16. This difference appears to be mainly due to a significantly increased number of non-oncogenic HPV 16 infections detected by the cobas test, as there were no differences in HPV 16 detection rates in the high-grade squamous intraepithelial lesions, indicating that the two tests have similar sensitivities for oncogenic HPV 16. © 2018 Wiley Periodicals, Inc.
Directory of Open Access Journals (Sweden)
Rammohan Anu
2012-07-01
Full Text Available Abstract Background Despite increased funding of measles vaccination programs by national governments and international aid agencies, structural factors encumber the attainment of childhood measles immunisation at levels which may guarantee herd immunity. One such factor is parental education status. Research on the links between parental education and vaccination has typically focused on the influence of maternal education status. This study aims to demonstrate the independent influence of paternal education status on measles immunisation. Methods Comparable nationally representative survey data were obtained from the six countries with the highest numbers of children missing the measles vaccine in 2008. Logistic regression analysis was applied to examine the influence of paternal education on uptake of the first dose of measles vaccination, independent of maternal education, whilst controlling for confounding factors such as respondent's age, urban/rural residence, province/state of residence, religion, wealth and occupation. Results The results of the analysis show that even if a mother is illiterate, having a father with secondary (high school) schooling or above is statistically significantly and positively correlated with the likelihood of a child being vaccinated for measles in the six countries analysed. Paternal education of secondary or higher level was significantly and independently correlated with measles immunisation uptake after controlling for all potential confounders. Conclusions The influence of paternal education status on measles immunisation uptake was investigated and found to be statistically significant in six nations with the biggest gaps in measles immunisation coverage in 2008. This study underscores the imperative of utilising both maternal and paternal education as screening variables to identify children at risk of missing measles vaccination prospectively.
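With a single binary predictor such as paternal education, the logistic regression coefficient equals the log of the crude odds ratio, which can be computed directly from a 2x2 table. The counts below are hypothetical illustrations, not the survey's data:

```python
import math

# Hypothetical 2x2 table: paternal education vs. child's measles vaccination
#                   vaccinated   not vaccinated
# secondary+             320             80
# below secondary        210            190
a, b, c, d = 320, 80, 210, 190

odds_ratio = (a * d) / (b * c)    # crude OR of vaccination for secondary+
log_or = math.log(odds_ratio)     # = univariate logistic regression coefficient

# Woolf's approximate standard error of log(OR) and a 95% confidence interval
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci = (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))
print(round(odds_ratio, 2))  # → 3.62
```

The study's actual models additionally adjust for maternal education and the listed confounders, which a multivariable logistic regression would handle.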
A Nineteenth Century Statistical Society that Abandoned Statistics
Stamhuis, I.H.
2007-01-01
In 1857, a Statistical Society was founded in the Netherlands. Within this society, statistics was considered a systematic, quantitative, and qualitative description of society. In the course of time, the society attracted a wide and diverse membership, although the number of physicians on its rolls
Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben
2017-09-15
Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure statistical power of identifying these variants with small effects. However, it is often the case that a research group can only get approval for the access to individual-level genotype data with a limited sample size (e.g. a few hundreds or thousands). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available. The sample sizes associated with the summary statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, to increasing statistical power of identifying risk variants and improving accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over the methods which take either individual-level data or summary statistics data as input. We applied IGESS to perform integrative analysis of Crohn's disease from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240 000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS . zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Active Learning with Rationales for Identifying Operationally Significant Anomalies in Aviation
Sharma, Manali; Das, Kamalika; Bilgic, Mustafa; Matthews, Bryan; Nielsen, David Lynn; Oza, Nikunj C.
2016-01-01
A major focus of the commercial aviation community is discovery of unknown safety events in flight operations data. Data-driven unsupervised anomaly detection methods are better at capturing unknown safety events compared to rule-based methods which only look for known violations. However, not all statistical anomalies that are discovered by these unsupervised anomaly detection methods are operationally significant (e.g., represent a safety concern). Subject Matter Experts (SMEs) have to spend significant time reviewing these statistical anomalies individually to identify a few operationally significant ones. In this paper we propose an active learning algorithm that incorporates SME feedback in the form of rationales to build a classifier that can distinguish between uninteresting and operationally significant anomalies. Experimental evaluation on real aviation data shows that our approach improves detection of operationally significant events by as much as 75% compared to the state-of-the-art. The learnt classifier also generalizes well to additional validation data sets.
Prevalence of significant bacteriuria among symptomatic and ...
African Journals Online (AJOL)
Data were analyzed using the Statistical Package for Social Sciences (SPSS) version 16.0 (SPSS, Inc., Chicago, Ill). Results: A total of 100 consenting participants were recruited into the study. The mean age was: 23.42 ± 8.31 years and a range of 14‑50 years. Only 9% (9/100) had significant bacteriuria while 44.4% (4/9) ...
Uehleke, Bernhard; Hopfenmueller, Werner; Stange, Rainer; Saller, Reinhard
2012-01-01
Ancient and medieval herbal books are often believed to describe the same claims still in use today. Medieval herbal books, however, provide long lists of claims for each herb, most of which are not approved today, while the herb's modern use is often missing. So the hypothesis arises that a medieval author could have randomly hit on 'correct' claims among his many 'wrong' ones. We developed a statistical procedure based on a simple probability model. We applied our procedure to the herbal books of Hildegard von Bingen (1098-1179) as an example of its usefulness. Claim attributions for a certain herb were classified as 'correct' if approximately the same as indicated in actual monographs. The number of 'correct' claim attributions was significantly higher than it could have been by pure chance, even though the vast majority of Hildegard von Bingen's claims were not 'correct'. The hypothesis that Hildegard would have achieved her 'correct' claims purely by chance can be clearly rejected. The finding that medical claims provided by a medieval author are significantly related to modern herbal use supports the importance of traditional medicinal systems as an empirical source. However, since many traditional claims are not in accordance with modern applications, they should be used carefully and analyzed in a systematic, statistics-based manner. Our statistical approach can be used for further systematic comparison of herbal claims of traditional sources as well as in the fields of ethnobotany and ethnopharmacology. Copyright © 2012 S. Karger AG, Basel.
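The chance-hit argument can be made concrete with a one-sided binomial test: given n claim attributions and a per-claim probability p of matching a modern monograph by chance, what is the probability of observing k or more matches? The numbers below are hypothetical, not the counts from the Hildegard analysis:

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more hits at random."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Hypothetical setup: 200 claim attributions, 10% chance that a random
# attribution matches a modern monograph, 35 'correct' matches observed
p_value = binom_sf(35, 200, 0.10)
print(p_value < 0.01)  # far more hits than chance alone would explain
```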
Expression and Its Clinical Significance of SLC22A18 in Non-small Cell Lung Cancer
Directory of Open Access Journals (Sweden)
Ming LEI
2012-01-01
Full Text Available Background and objective It has been proven that multidrug resistance (MDR) is the main cause of chemotherapy failure in lung cancer. Research on the emergence mechanisms of MDR has great clinical significance for improving the curative efficiency of lung cancer chemotherapy. Proteins encoded by the SLC22A18 gene, which is similar to a transmembrane transporter, may influence the sensitivity of chemotherapeutics as well as the metabolism and growth of cells. In addition, these proteins probably have some effect on the development of lung cancer MDR. The aim of the present study is to investigate the expression of SLC22A18 protein in non-small cell lung cancer (NSCLC) as well as in corresponding normal lung tissue. Furthermore, the relationship between SLC22A18 expression and pathological grade and TNM stage is analyzed. Methods The expression of SLC22A18 was detected by EnVision in 96 cases of NSCLC and in corresponding normal lung tissue. Statistical analysis was performed using SPSS 17.0 statistical software. Results SLC22A18 was mainly located in the cell membrane and cytoplasm. The expression level of SLC22A18 in NSCLC was significantly higher than that in normal tissue (P<0.01). The positive rates in squamous cell lung cancer and lung adenocarcinoma were 68% and 78.2%, respectively (P<0.05). Moreover, higher expression of SLC22A18 was associated with lower histological grade and later TNM stage (P<0.05). Conclusion SLC22A18 protein is overexpressed in NSCLC, and its expression is correlated with pathological grade and TNM stage. These findings provide an experimental basis for investigating the role of SLC22A18 in tumor development and chemoresistance.
Response statistics of rotating shaft with non-linear elastic restoring forces by path integration
Gaidai, Oleg; Naess, Arvid; Dimentberg, Michael
2017-07-01
Extreme statistics of random vibrations is studied for a Jeffcott rotor under uniaxial white noise excitation. The restoring force is modelled as elastic and non-linear; a comparison is made with a linearized restoring force to see the effect of force non-linearity on the response statistics. While analytical solutions and stability conditions are available for the linear model, this is generally not the case for the non-linear system, except in some special cases. The statistics of the non-linear case are studied by applying the path integration (PI) method, which is based on the Markov property of the coupled dynamic system. The Jeffcott rotor response statistics can be obtained by solving the Fokker-Planck (FP) equation of the 4D dynamic system. An efficient implementation of the PI algorithm is applied; namely, the fast Fourier transform (FFT) is used to simulate the dynamic system's additive noise. The latter allows a significant reduction in computational time compared to classical PI. Excitation is modelled as Gaussian white noise; however, white noise with any distribution can be implemented with the same PI technique. Multidirectional Markov noise can also be modelled with PI in the same way as unidirectional noise. PI is accelerated by using a Monte Carlo (MC) estimated joint probability density function (PDF) as the initial input. Symmetry of the dynamic system was utilized to afford higher mesh resolution. Both internal (rotating) and external damping are included in the mechanical model of the rotor. The main advantage of using PI rather than MC is that PI offers high accuracy in the probability distribution tail. The latter is of critical importance for, e.g., extreme value statistics, system reliability, and first passage probability.
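The role of Monte Carlo as a cross-check for PI can be illustrated on a 1D linear system whose stationary statistics are known in closed form. This is a deliberately simplified stand-in (an Ornstein-Uhlenbeck process integrated by Euler-Maruyama), not the 4D Jeffcott rotor model:

```python
import numpy as np

# Euler-Maruyama Monte Carlo for dX = -gamma*X dt + sigma dW.
# The stationary variance sigma^2 / (2*gamma) is known analytically,
# giving a benchmark any PI / FP solver must reproduce.
rng = np.random.default_rng(0)
gamma, sigma, dt = 1.0, 0.5, 1e-3
n_steps, n_paths = 10_000, 2_000   # T = 10 relaxation times, 2000 samples

x = np.zeros(n_paths)              # all paths start at the origin
for _ in range(n_steps):
    x += -gamma * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)

target = sigma**2 / (2 * gamma)    # analytic stationary variance = 0.125
print(abs(x.var() - target) < 0.02)
```

As the abstract notes, MC estimates like this are cheap for the bulk of the PDF but poorly resolved in the tails, which is exactly where PI retains its accuracy.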
Investigating spousal concordance of diabetes through statistical analysis and data mining.
Directory of Open Access Journals (Sweden)
Jong-Yi Wang
Full Text Available Spousal clustering of diabetes merits attention. Whether old-age vulnerability or a shared family environment determines the concordance of diabetes is also uncertain. This study investigated the spousal concordance of diabetes and compared the risk of diabetes concordance between couples and noncouples by using nationally representative data. A total of 22,572 individuals identified from the 2002-2013 National Health Insurance Research Database of Taiwan constituted 5,643 couples and 5,643 noncouples through 1:1 dual propensity score matching (PSM). Factors associated with concordance in both spouses with diabetes were analyzed at the individual level. The risk of diabetes concordance between couples and noncouples was compared at the couple level. Logistic regression was the main statistical method. Statistical data were analyzed using SAS 9.4. The C&RT and Apriori data-mining methods, conducted in IBM SPSS Modeler 13, served as a supplement to the statistics. High odds of spousal concordance of diabetes were associated with old age, middle levels of urbanization, and high comorbidities (all P < 0.05). The dual PSM analysis revealed that the risk of diabetes concordance was significantly higher in couples (5.19%) than in noncouples (0.09%; OR = 61.743, P < 0.0001). A high concordance rate of diabetes in couples may indicate the influences of assortative mating and a shared environment. Diabetes in a spouse implicates its risk in the partner. Family-based diabetes care that emphasizes the screening of couples at risk of diabetes by using the identified risk factors is suggested in prospective clinical practice interventions.
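The 1:1 matching step can be sketched as a greedy nearest-neighbour pairing on precomputed propensity scores; this is a minimal illustration under invented scores, not the dual PSM procedure actually used in the study:

```python
def match_pairs(treated, controls):
    """Greedy 1:1 nearest-neighbour matching on propensity score.

    treated, controls: dicts mapping subject id -> propensity score
    (in practice the scores come from a logistic regression on covariates).
    Each control is used at most once.
    """
    available = dict(controls)
    pairs = []
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        # pick the unused control whose score is closest to this subject's
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        pairs.append((t_id, c_id))
        del available[c_id]
    return pairs

treated = {"T1": 0.30, "T2": 0.62, "T3": 0.55}
controls = {"C1": 0.28, "C2": 0.60, "C3": 0.90, "C4": 0.50}
print(match_pairs(treated, controls))
```

Real PSM implementations typically add a caliper (maximum allowed score distance) and balance diagnostics after matching.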
Understanding and forecasting polar stratospheric variability with statistical models
Directory of Open Access Journals (Sweden)
C. Blume
2012-07-01
Full Text Available The variability of the north-polar stratospheric vortex is a prominent aspect of the middle atmosphere. This work investigates a wide class of statistical models with respect to their ability to model geopotential and temperature anomalies, representing variability in the polar stratosphere. Four partly nonstationary, nonlinear models are assessed: linear discriminant analysis (LDA); a cluster method based on finite elements (FEM-VARX); a neural network, namely the multi-layer perceptron (MLP); and support vector regression (SVR). These methods model time series by incorporating all significant external factors simultaneously, including ENSO, QBO, the solar cycle, and volcanoes, and then quantify their statistical importance. We show that variability in reanalysis data from 1980 to 2005 is successfully modeled. The period from 2005 to 2011 can be hindcasted to a certain extent, where MLP performs significantly better than the remaining models. However, variability remains that cannot be statistically hindcasted within the current framework, such as the unexpected major warming in January 2009. Finally, the statistical model with the best generalization performance is used to predict a winter 2011/12 with warm and weak vortex conditions. A vortex breakdown is predicted for late January to early February 2012.
Statistical data analysis using SAS intermediate statistical methods
Marasinghe, Mervyn G
2018-01-01
The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...
Institute of Scientific and Technical Information of China (English)
Wei Gao; Jing Wang; Chao Zhang; Ping Qin
2017-01-01
Objective: To determine the serum inflammatory cytokines and oxidative stress parameters of diabetic retinopathy (DR) patients, to explore their possible role in DR. Methods: 116 type 2 diabetic patients were selected from June 2015 to June 2016 in our hospital as research subjects, divided into diabetes without retinopathy (NDR group, n = 63) and diabetes with retinopathy (DR group, n = 53). In addition, 60 healthy check-up subjects from the same period in our hospital's medical center were selected as the normal control (NC) group. The VEGF, IL-6, TNF-α, MDA and SOD levels of the three groups were measured. Results: The IL-6, TNF-α, VEGF, and malondialdehyde (MDA) levels increased progressively across the NC, NDR and DR groups, and the differences were statistically significant (P<0.05). The superoxide dismutase (SOD) levels decreased progressively across the NC, NDR and DR groups, and the difference was statistically significant (P<0.05). Conclusions: DR patients express high levels of IL-6, TNF-α and VEGF, and significant oxidative stress exists in DR, which shows that inflammation and the oxidative stress state play an important role in the development of DR.
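A progressive trend of group means like the one reported can be checked with a one-way ANOVA; the IL-6 values below are invented for illustration (not the study's measurements), and SciPy is assumed to be available:

```python
import numpy as np
from scipy import stats

# Hypothetical serum IL-6 levels (pg/ml) in the three groups
nc = np.array([4.1, 3.8, 4.5, 4.0, 4.3])    # normal controls
ndr = np.array([6.2, 6.8, 5.9, 6.5, 6.1])   # diabetes without retinopathy
dr = np.array([9.1, 8.7, 9.5, 9.0, 8.8])    # diabetes with retinopathy

# One-way ANOVA: do the three group means differ?
f_stat, p_val = stats.f_oneway(nc, ndr, dr)
print(p_val < 0.05, nc.mean() < ndr.mean() < dr.mean())
```

A significant ANOVA would normally be followed by pairwise post-hoc tests to confirm each between-group difference.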
The significance test controversy revisited the fiducial Bayesian alternative
Lecoutre, Bruno
2014-01-01
The purpose of this book is not only to revisit the "significance test controversy," but also to provide a conceptually sounder alternative. As such, it presents a Bayesian framework for a new approach to analyzing and interpreting experimental data. It also prepares students and researchers for reporting on experimental results. Normative aspects: The main views of statistical tests are revisited and the philosophies of Fisher, Neyman-Pearson and Jeffreys are discussed in detail. Descriptive aspects: The misuses of Null Hypothesis Significance Tests are reconsidered in light of Jeffreys' Bayesian conceptions concerning the role of statistical inference in experimental investigations. Prescriptive aspects: The current effect size and confidence interval reporting practices are presented and seriously questioned. Methodological aspects are carefully discussed and fiducial Bayesian methods are proposed as a more suitable alternative for reporting on experimental results. In closing, basic routine procedures...
Higher vs. lower fluid volume for septic shock
DEFF Research Database (Denmark)
Smith, Søren H; Perner, Anders
2012-01-01
Characteristics between these groups were compared using non-parametric and Chi-square statistics. RESULTS: The 164 included patients received a median of 4.0 l (IQR 2.3-6.3) of fluid during the first day of septic shock. Patients receiving higher volumes (> 4.0 l) on day 1 had higher p-lactate (3.4 (2.2-5.5) vs. 2.0 (1.6-3.0) mmol l-1), whereas severity score (… vs. 54 (45-67), P = 0.73), sequential organ failure assessment (SOFA) score (11 (9-13) vs. 11 (9-13), P = 0.78) and 90-day mortality (48 vs. 53%, P = 0.27) did not differ between groups. The 95 patients who still had shock on day 3 had received 7.5 l (4.3-10.8) of fluid by the end of day 3. Patients receiving higher volumes (> 7.5 l) had higher p-lactate (2.6 (1.7-3.4) vs. 1.9 (1.6-2.4) mmol l-1).
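The non-parametric and chi-square comparisons described can be sketched as follows; all values are hypothetical (not the trial's data), and SciPy is assumed to be available:

```python
import numpy as np
from scipy import stats

# Hypothetical p-lactate (mmol/l) in higher- vs. lower-volume groups
higher = np.array([3.4, 2.9, 4.1, 3.0, 5.2, 2.6, 3.8])
lower = np.array([2.0, 1.8, 2.4, 1.6, 2.1, 1.9, 2.6])

# Mann-Whitney U: non-parametric comparison of the two distributions
u_stat, u_p = stats.mannwhitneyu(higher, lower, alternative="two-sided")

# Chi-square test on a hypothetical 90-day mortality table
# rows: higher / lower volume; columns: died / survived
table = np.array([[10, 11],
                  [11, 10]])
chi2, chi2_p, dof, _ = stats.chi2_contingency(table)
print(u_p < 0.05, chi2_p > 0.05)
```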
The SACE Review Panel's Final Report: Significant Flaws in the Analysis of Statistical Data
Gregory, Kelvin
2006-01-01
The South Australian Certificate of Education (SACE) is a credential and formal qualification within the Australian Qualifications Framework. A recent review of the SACE outlined a number of recommendations for significant changes to this certificate. These recommendations were the result of a process that began with the review panel…
Sadeghi, Fatemeh; Nasseri, Simin; Mosaferi, Mohammad; Nabizadeh, Ramin; Yunesian, Masud; Mesdaghinia, Alireza
2017-05-01
In this research, probable arsenic contamination in drinking water in the city of Ardabil was studied in 163 samples during four seasons. In each season, sampling was carried out randomly in the study area. Results were analyzed statistically applying SPSS 19 software, and the data was also modeled by Arc GIS 10.1 software. The maximum permissible arsenic concentration in drinking water defined by the World Health Organization and Iranian national standard is 10 μg/L. Statistical analysis showed 75, 88, 47, and 69% of samples in autumn, winter, spring, and summer, respectively, had concentrations higher than the national standard. The mean concentrations of arsenic in autumn, winter, spring, and summer were 19.89, 15.9, 10.87, and 14.6 μg/L, respectively, and the overall average in all samples through the year was 15.32 μg/L. Although GIS outputs indicated that the concentration distribution profiles changed in four consecutive seasons, variance analysis of the results showed that statistically there is no significant difference in arsenic levels in four seasons.
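The abstract's figures can be checked with simple arithmetic: the reported annual average should be close to the mean of the four seasonal means, assuming roughly equal sample counts per season (an assumption not stated in the abstract).

```python
# Arithmetic check on the abstract's figures: the annual mean should be
# (approximately) the average of the four seasonal means, assuming roughly
# equal sample counts per season.
WHO_LIMIT = 10.0  # maximum permissible arsenic in drinking water, ug/L

seasonal_means = {"autumn": 19.89, "winter": 15.9, "spring": 10.87, "summer": 14.6}
overall_mean = sum(seasonal_means.values()) / len(seasonal_means)
exceedance = {season: m > WHO_LIMIT for season, m in seasonal_means.items()}

print(f"{overall_mean:.2f}")     # close to the reported 15.32 ug/L
print(all(exceedance.values()))  # True: every seasonal mean exceeds the limit
```

The average of the four seasonal means reproduces the reported annual average of 15.32 μg/L.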
Role of quantum statistics in multi-particle decay dynamics
Marchewka, Avi; Granot, Er'el
2015-04-01
The role of quantum statistics in the decay dynamics of a multi-particle state, which is suddenly released from a confining potential, is investigated. For an initially confined two-particle state, the exact dynamics is presented for both bosons and fermions. The time-evolution of the probability of measuring the two particles is evaluated and some counterintuitive features are discussed. For instance, it is shown that although there is a higher chance of finding the two bosons (as opposed to fermions, and even distinguishable particles) at the initial trap region, there is also a higher chance (higher than for fermions) of finding them on two opposite sides of the trap, as if the repulsion between bosons were stronger than the repulsion between fermions. The results are demonstrated by numerical simulations and are calculated analytically in the short-time approximation. Furthermore, experimental validation is suggested.
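The (anti)symmetrisation at the heart of this comparison can be sketched in a toy calculation. This is not the paper's setup: it uses two static harmonic-oscillator states rather than the post-release dynamics, and all parameters are illustrative; it only shows how boson and fermion joint detection probabilities are built and integrated over spatial regions.

```python
import numpy as np

# Toy sketch: two particles in the two lowest harmonic-oscillator states.
# The joint detection density is built by symmetrising (bosons) or
# antisymmetrising (fermions) the two-particle wavefunction.
x = np.linspace(-8.0, 8.0, 600)
dx = x[1] - x[0]

phi0 = np.exp(-x**2 / 2)
phi0 /= np.sqrt(np.sum(phi0**2) * dx)          # normalised ground state (even)
phi1 = x * np.exp(-x**2 / 2)
phi1 /= np.sqrt(np.sum(phi1**2) * dx)          # first excited state (odd), orthogonal to phi0

# two-particle amplitudes: bosons (+), fermions (-)
psi_b = (np.outer(phi0, phi1) + np.outer(phi1, phi0)) / np.sqrt(2)
psi_f = (np.outer(phi0, phi1) - np.outer(phi1, phi0)) / np.sqrt(2)
P_b, P_f = psi_b**2, psi_f**2                  # joint densities P(x, y)

# probability of finding the particles on opposite sides of the origin
left, right = x < 0, x >= 0
opp_b = 2 * np.sum(P_b[np.ix_(left, right)]) * dx**2
opp_f = 2 * np.sum(P_f[np.ix_(left, right)]) * dx**2
print(round(np.sum(P_b) * dx**2, 3), round(np.sum(P_f) * dx**2, 3))  # both ~1.0
```

The paper's counterintuitive orderings arise in the dynamics after release; this static sketch only demonstrates the machinery of region probabilities for each statistics.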
Role of quantum statistics in multi-particle decay dynamics
International Nuclear Information System (INIS)
Marchewka, Avi; Granot, Er’el
2015-01-01
The role of quantum statistics in the decay dynamics of a multi-particle state, which is suddenly released from a confining potential, is investigated. For an initially confined two-particle state, the exact dynamics is presented for both bosons and fermions. The time-evolution of the probability of measuring the two particles is evaluated and some counterintuitive features are discussed. For instance, it is shown that although there is a higher chance of finding the two bosons (as opposed to fermions, and even distinguishable particles) at the initial trap region, there is also a higher chance (higher than for fermions) of finding them on two opposite sides of the trap, as if the repulsion between bosons were stronger than the repulsion between fermions. The results are demonstrated by numerical simulations and are calculated analytically in the short-time approximation. Furthermore, experimental validation is suggested.
Role of quantum statistics in multi-particle decay dynamics
Energy Technology Data Exchange (ETDEWEB)
Marchewka, Avi, E-mail: avi.marchewka@gmail.com [Galei Tchelet St 8 Herzliya (Israel); Granot, Er’el [Department of Electrical and Electronics Engineering, Ariel University, Ariel (Israel)
2015-04-15
The role of quantum statistics in the decay dynamics of a multi-particle state, which is suddenly released from a confining potential, is investigated. For an initially confined two-particle state, the exact dynamics is presented for both bosons and fermions. The time-evolution of the probability of measuring the two particles is evaluated and some counterintuitive features are discussed. For instance, it is shown that although there is a higher chance of finding the two bosons (as opposed to fermions, and even distinguishable particles) at the initial trap region, there is also a higher chance (higher than for fermions) of finding them on two opposite sides of the trap, as if the repulsion between bosons were stronger than the repulsion between fermions. The results are demonstrated by numerical simulations and are calculated analytically in the short-time approximation. Furthermore, experimental validation is suggested.
MedlinePlus Statistics: quarterly usage statistics for MedlinePlus (https://medlineplus.gov/usestatistics.html), reporting page views and unique visitors per quarter since Oct-Dec 1998.
Second Language Experience Facilitates Statistical Learning of Novel Linguistic Materials.
Potter, Christine E; Wang, Tianlin; Saffran, Jenny R
2017-04-01
Recent research has begun to explore individual differences in statistical learning, and how those differences may be related to other cognitive abilities, particularly their effects on language learning. In this research, we explored a different type of relationship between language learning and statistical learning: the possibility that learning a new language may also influence statistical learning by changing the regularities to which learners are sensitive. We tested two groups of participants, Mandarin Learners and Naïve Controls, at two time points, 6 months apart. At each time point, participants performed two different statistical learning tasks: an artificial tonal language statistical learning task and a visual statistical learning task. Only the Mandarin-learning group showed significant improvement on the linguistic task, whereas both groups improved equally on the visual task. These results support the view that there are multiple influences on statistical learning. Domain-relevant experiences may affect the regularities that learners can discover when presented with novel stimuli. Copyright © 2016 Cognitive Science Society, Inc.
Adaptive approximation of higher order posterior statistics
Lee, Wonjung
2014-02-01
Filtering is an approach for incorporating observed data into time-evolving systems. Instead of a family of Dirac delta masses that is widely used in Monte Carlo methods, we here use the Wiener chaos expansion for the parametrization of the conditioned probability distribution to solve the nonlinear filtering problem. The Wiener chaos expansion is not the best method for uncertainty propagation without observations. Nevertheless, the projection of the system variables in a fixed polynomial basis spanning the probability space might be a competitive representation in the presence of relatively frequent observations because the Wiener chaos approach not only leads to an accurate and efficient prediction for short time uncertainty quantification, but it also allows to apply several data assimilation methods that can be used to yield a better approximate filtering solution. The aim of the present paper is to investigate this hypothesis. We answer in the affirmative for the (stochastic) Lorenz-63 system based on numerical simulations in which the uncertainty quantification method and the data assimilation method are adaptively selected by whether the dynamics is driven by Brownian motion and the near-Gaussianity of the measure to be updated, respectively. © 2013 Elsevier Inc.
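The (stochastic) Lorenz-63 test bed mentioned above can be sketched with a simple Euler-Maruyama integrator. The classic parameters σ = 10, ρ = 28, β = 8/3 are standard; the additive noise amplitude below is an assumed illustrative value, not taken from the paper, and this sketch covers only the forward model, not the Wiener chaos filter itself.

```python
import numpy as np

def lorenz63_em(x0, dt=0.01, steps=1000, noise=0.0, seed=0):
    """Euler-Maruyama integration of the (stochastic) Lorenz-63 system.
    Classic parameters sigma=10, rho=28, beta=8/3; `noise` is an assumed
    additive diffusion amplitude (0 recovers the deterministic model)."""
    rng = np.random.default_rng(seed)
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
    traj = np.empty((steps + 1, 3))
    traj[0] = x0
    for k in range(steps):
        x, y, z = traj[k]
        drift = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
        traj[k + 1] = traj[k] + dt * drift + noise * np.sqrt(dt) * rng.standard_normal(3)
    return traj

traj = lorenz63_em(np.array([1.0, 1.0, 1.0]), noise=0.5)
print(traj.shape)  # (1001, 3)
```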
Adaptive approximation of higher order posterior statistics
Lee, Wonjung
2014-01-01
Filtering is an approach for incorporating observed data into time-evolving systems. Instead of a family of Dirac delta masses that is widely used in Monte Carlo methods, we here use the Wiener chaos expansion for the parametrization
Measuring radioactive half-lives via statistical sampling in practice
Lorusso, G.; Collins, S. M.; Jagan, K.; Hitt, G. W.; Sadek, A. M.; Aitken-Smith, P. M.; Bridi, D.; Keightley, J. D.
2017-10-01
The statistical sampling method for the measurement of radioactive decay half-lives exhibits intriguing features such as that the half-life is approximately the median of a distribution closely resembling a Cauchy distribution. Whilst initial theoretical considerations suggested that in certain cases the method could have significant advantages, accurate measurements by statistical sampling have proven difficult, for they require an exercise in non-standard statistical analysis. As a consequence, no half-life measurement using this method has yet been reported and no comparison with traditional methods has ever been made. We used a Monte Carlo approach to address these analysis difficulties, and present the first experimental measurement of a radioisotope half-life (211Pb) by statistical sampling in good agreement with the literature recommended value. Our work also focused on the comparison between statistical sampling and exponential regression analysis, and concluded that exponential regression achieves generally the highest accuracy.
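The statistical feature the method rests on, that the half-life is (for exponential decay, exactly) the median of the individual decay-time distribution, can be illustrated in a few lines. The 211Pb half-life of roughly 36.1 min is used only as an illustrative input, not as the paper's measured value.

```python
import numpy as np

# Core fact behind the statistical-sampling method: for exponential decay the
# median decay time of individual nuclei equals the half-life, since
# T_1/2 = ln(2)/lambda is the median of an Exp(lambda) distribution.
rng = np.random.default_rng(42)
half_life = 36.1                       # minutes (illustrative 211Pb value)
lam = np.log(2) / half_life

decay_times = rng.exponential(scale=1 / lam, size=200_000)
estimate = np.median(decay_times)      # statistical-sampling estimator of T_1/2
print(estimate)                        # close to 36.1 for a sample this large
```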
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co…
Statistical analogues of thermodynamic extremum principles
Ramshaw, John D.
2018-05-01
As shown by Jaynes, the canonical and grand canonical probability distributions of equilibrium statistical mechanics can be simply derived from the principle of maximum entropy, in which the statistical entropy S = −k_B Σ_i p_i log p_i is maximised subject to constraints on the mean values of the energy E and/or number of particles N in a system of fixed volume V. The Lagrange multipliers associated with those constraints are then found to be simply related to the temperature T and chemical potential μ. Here we show that the constrained maximisation of S is equivalent to, and can therefore be replaced by, the essentially unconstrained minimisation of the obvious statistical analogues of the Helmholtz free energy F = E − TS and the grand potential J = F − μN. Those minimisations are more easily performed than the maximisation of S because they formally eliminate the constraints on the mean values of E and N and their associated Lagrange multipliers. This procedure significantly simplifies the derivation of the canonical and grand canonical probability distributions, and shows that the well known extremum principles for the various thermodynamic potentials possess natural statistical analogues which are equivalent to the constrained maximisation of S.
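The canonical case of the equivalence can be sketched in a few lines (standard maximum-entropy manipulation, with notation matching the abstract):

```latex
S[p] = -k_{\mathrm{B}} \sum_i p_i \ln p_i , \qquad
F[p] = \sum_i p_i E_i - T\,S[p] .
```

Minimising F[p] subject only to normalisation, via the stationarity condition ∂/∂p_i (F[p] + α Σ_j p_j) = E_i + k_B T (ln p_i + 1) + α = 0, gives p_i ∝ exp(−E_i / k_B T), the canonical distribution, with no Lagrange multiplier needed for the mean energy, which is the simplification the abstract describes.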
Statistics without Tears: Complex Statistics with Simple Arithmetic
Smith, Brian
2011-01-01
One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
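A time series decomposition of the kind the article advocates really can be done with simple arithmetic. A minimal additive sketch on noiseless toy data, using the standard "classical decomposition" steps (centred moving average for the trend, then period-wise means of the detrended series):

```python
import numpy as np

# Minimal additive decomposition (trend + seasonal) with simple arithmetic.
period = 4
t = np.arange(40)
seasonal_true = np.tile([2.0, -1.0, -3.0, 2.0], 10)
series = 0.5 * t + seasonal_true            # noiseless toy series

# centred moving average over one full period (even period -> half-weight ends)
kernel = np.r_[0.5, np.ones(period - 1), 0.5] / period
trend = np.convolve(series, kernel, mode="valid")        # length 40 - 4 = 36

detrended = series[period // 2 : period // 2 + trend.size] - trend
seasonal = np.array([detrended[k::period].mean() for k in range(period)])
seasonal -= seasonal.mean()                               # centre to sum ~ 0
print(np.round(seasonal, 6))
```

The recovered seasonal component equals the true one up to a cyclic shift of half a period, which comes from the moving-average alignment.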
International Nuclear Information System (INIS)
Nemnes, G A; Anghel, D V
2010-01-01
We present a stochastic method for the simulation of the time evolution in systems which obey generalized statistics, namely fractional exclusion statistics and Gentile's statistics. The transition rates are derived in the framework of canonical ensembles. This approach introduces a tool for describing interacting fermionic and bosonic systems in non-equilibrium as ideal FES systems, in a computationally efficient manner. The two types of statistics are analyzed comparatively, indicating their intrinsic thermodynamic differences and revealing key aspects related to the species size.
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance
Kramer, Karen L.; Veile, Amanda; Otárola-Castillo, Erik
2016-01-01
Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger s...
Principles of Statistics: What the Sports Medicine Professional Needs to Know.
Riemann, Bryan L; Lininger, Monica R
2018-07-01
Understanding the results and statistics reported in original research remains a large challenge for many sports medicine practitioners and, in turn, may be one of the biggest barriers to integrating research into sports medicine practice. The purpose of this article is to provide the minimal essentials a sports medicine practitioner needs to know about interpreting statistics and research results to facilitate the incorporation of the latest evidence into practice. Topics covered include the difference between statistical significance and clinical meaningfulness; effect sizes and confidence intervals; reliability statistics, including the minimal detectable difference and minimal important difference; and statistical power. Copyright © 2018 Elsevier Inc. All rights reserved.
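Two of the quantities the article covers can be made concrete in a few lines: Cohen's d as an effect size, and the minimal detectable change derived from a reliability coefficient (MDC95 = 1.96 · √2 · SEM, with SEM = SD · √(1 − ICC)). All input values below are hypothetical.

```python
import math

# Effect size (Cohen's d) vs. a reliability-based threshold (MDC95) for
# judging whether an individual change exceeds measurement error.
def cohens_d(mean1, mean2, sd_pooled):
    return (mean1 - mean2) / sd_pooled

def mdc95(sd, icc):
    sem = sd * math.sqrt(1.0 - icc)        # standard error of measurement
    return 1.96 * math.sqrt(2.0) * sem     # 95% minimal detectable change

d = cohens_d(24.0, 20.0, 5.0)              # hypothetical group means / pooled SD
print(d)                                   # 0.8, conventionally a "large" effect
print(round(mdc95(5.0, 0.9), 2))           # change scores below this are within error
```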
Higher positive identification of malignant CSF cells using the cytocentrifuge than the Suta chamber
Directory of Open Access Journals (Sweden)
Sérgio Monteiro de Almeida
Full Text Available ABSTRACT Objective To define how to best handle cerebrospinal fluid (CSF) specimens to obtain the highest positivity rate for the diagnosis of malignancy, comparing two different methods of cell concentration, sedimentation and cytocentrifugation. Methods A retrospective analysis of 411 CSF reports. Results This is a descriptive comparative study. The positive identification of malignant CSF cells was higher using the cytocentrifuge than using the Suta chamber (27.8% vs. 19.0%, respectively; p = 0.038). The cytocentrifuge identified higher numbers of malignant cells in samples with a normal white blood cell (WBC) concentration (< 5 cells/mm3) and in samples with more than 200 cells/mm3, although this was not statistically significant. There was no lymphocyte loss using either method. Conclusions Cytocentrifugation identified a greater number of malignant cells in the CSF than cytosedimentation with the Suta chamber. However, there was no difference between the methods when the WBC counts were within the normal range.
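The headline comparison (27.8% vs. 19.0% positivity) can be checked with a standard two-proportion z-test. The group sizes are not given in the abstract, so the counts below are hypothetical values chosen only to approximate the reported percentages within the 411 reports.

```python
import math

# Two-proportion z-test sketch for positivity rates like those reported.
def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))
    return z, p_two_sided

z, pval = two_proportion_z(57, 205, 39, 206)        # hypothetical counts, ~27.8% vs ~18.9%
print(round(z, 2), round(pval, 3))
```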
Doppke, Max George
This non-experimental, quantitative exploratory study examined the relationship between gender, student residency status, acculturation, worldviews, and motivation towards science education for a group of 291 undergraduate students in the United States. As all demographic variables were nominal and all survey variables were ordinal, associations and differences were tested with non-parametric statistical procedures. The overall design was descriptive, comparative, and correlational. Spearman's rho indicated a moderate positive correlation between the total scores on the Worldview Analysis Scale (WAS) and the total scores on the Science Motivation Questionnaire-II (SMQ-II; rs = .393, p …). Mann-Whitney U tests were conducted on the WAS and the SMQ-II to determine whether scores differed by gender. The WAS score was statistically significantly higher in males (Median = 180.00) than in females (Median = 164.00, U = 8521.500, z = -2.840, p = .005). The SMQ-II score was higher in males (Median = 152.56) than in females (Median = 140.08), but not statistically significantly so (U = 9652.500, z = -1.263, p = .207). In following the fundamental dictates of social research, this study offered a thorough description of a situation that ultimately provokes various possible explanations as necessary conclusions to intellectually stimulating thought, without the burden of propagating dubious inferences through unwarranted deterministic or probabilistic causality. Recommendations for future work include mixed-method studies with interviews, longitudinal studies, instructor-student studies, and gender vs. sexual orientation studies.
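The Mann-Whitney U statistic used in the study has a simple pairwise definition that can be computed directly (the scores below are hypothetical, not the study's data):

```python
# Minimal Mann-Whitney U statistic from its pairwise definition.
def mann_whitney_u(a, b):
    """U for sample `a`: number of pairs (x, y) with x > y, ties counted as 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            u += 1.0 if x > y else (0.5 if x == y else 0.0)
    return u

males = [180, 175, 168, 190]       # hypothetical scores
females = [164, 170, 158, 161]
u = mann_whitney_u(males, females)
print(u)   # 15.0 of a possible len(males) * len(females) = 16
```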
Park, Ji Young; Ramachandran, Gurumurthy; Raynor, Peter C; Eberly, Lynn E; Olson, Greg
2010-10-01
Recently, the appropriateness of using the 'mass concentration' metric for ultrafine particles has been questioned and surface area (SA) or number concentration metrics have been proposed as alternatives. To assess the abilities of various exposure metrics to distinguish between different exposure zones in workplaces with nanoparticle aerosols, exposure concentrations were measured in preassigned 'high-' and 'low-'exposure zones in a restaurant, an aluminum die-casting factory, and a diesel engine laboratory using SA, number, and mass concentration metrics. Predetermined exposure classifications were compared by each metric using statistical parameters and concentration ratios that were calculated from the different exposure concentrations. In the restaurant, SA and fine particle number concentrations showed significant differences between the high- and low-exposure zones and they had higher contrast (the ratio of between-zone variance to the sum of the between-zone and within-zone variances) than mass concentrations. Mass concentrations did not show significant differences. In the die cast facility, concentrations of all metrics were significantly greater in the high zone than in the low zone. SA and fine particle number concentrations showed larger concentration ratios between the high and low zones and higher contrast than mass concentrations. None of the metrics were significantly different between the high- and low-exposure zones in the diesel engine laboratory. The SA and fine particle number concentrations appeared to be better at differentiating exposure zones and finding the particle generation sources in workplaces generating nanoparticles. Because the choice of an exposure metric has significant implications for epidemiologic studies and industrial hygiene practice, a multimetric sampling approach is recommended for nanoparticle exposure assessment.
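The contrast measure used to compare metrics (the ratio of between-zone variance to the sum of between-zone and within-zone variances) is straightforward to compute. A sketch with hypothetical concentrations for one high- and one low-exposure zone:

```python
import numpy as np

# Contrast = between-zone variance / (between + within variance); values
# near 1 mean the metric separates the zones well. Data are hypothetical.
def contrast(high, low):
    grand = np.mean(np.r_[high, low])
    n_h, n_l = len(high), len(low)
    between = (n_h * (np.mean(high) - grand) ** 2 +
               n_l * (np.mean(low) - grand) ** 2) / (n_h + n_l)
    within = (n_h * np.var(high) + n_l * np.var(low)) / (n_h + n_l)
    return between / (between + within)

high = np.array([5.1, 4.8, 5.4, 5.0])   # hypothetical high-zone concentrations
low = np.array([3.9, 4.1, 3.8, 4.2])    # hypothetical low-zone concentrations
c = contrast(high, low)
print(round(c, 2))   # ~0.89: the zones are well separated by this metric
```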
Więckowska, Barbara; Marcinkowska, Justyna
2017-11-06
When searching for epidemiological clusters, an important tool can be to carry out one's own research with the incidence rate from the literature as the reference level. Values exceeding this level may indicate the presence of a cluster in that location. This paper presents a method of searching for clusters that have significantly higher incidence rates than those specified by the investigator. The proposed method uses the classic binomial exact test for one proportion and an algorithm that joins areas with potential clusters while reducing the number of multiple comparisons needed. The sensitivity and specificity are preserved by this new method, while avoiding the Monte Carlo approach and still delivering results comparable to the commonly used Kulldorff's scan statistics and other similar methods of localising clusters. A strong contributing factor afforded by the statistical software that makes this possible is that it allows analysis and presentation of the results cartographically.
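The classic binomial exact test for one proportion that the method builds on can be sketched directly; the reference rate and counts below are hypothetical, not taken from the paper.

```python
import math

# One-proportion exact binomial test, as used to flag areas whose incidence
# significantly exceeds a literature reference rate.
def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the one-sided exact p-value."""
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

reference_rate = 0.01       # hypothetical incidence rate from the literature
cases, population = 18, 1000
pval = binom_sf(cases, population, reference_rate)
print(pval < 0.05)          # True -> candidate cluster at this location
```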
Immunization with avian metapneumovirus harboring chicken Fc induces higher immune responses.
Paudel, Sarita; Easwaran, Maheswaran; Jang, Hyun; Jung, Ho-Kyoung; Kim, Joo-Hun; Shin, Hyun-Jin
2016-07-15
In this study, we evaluated the immune responses to avian metapneumovirus harboring the chicken Fc molecule. Stable Vero cells expressing chicken Fc chimera on their surface (Vero-cFc) were established, and we confirmed that aMPV grown in Vero-cFc incorporated host-derived chimeric Fc into the aMPV virions. Immunization of chickens with aMPV-cFc induced higher levels of antibodies and inflammatory cytokines (Interferon (IFN)-γ and Interleukin (IL)-1β) compared to those of aMPV. The increases in antibodies and inflammatory cytokines in chickens immunized with aMPV-cFc were statistically significant (p<0.05) relative to the aMPV and control groups. The aMPV-cFc group also generated the highest neutralizing antibody response. After challenge, chickens immunized with aMPV-cFc showed much less pathological change in the nasal turbinates and trachea, confirming that aMPV-cFc induced higher protection than aMPV. The greater immunogenicity of aMPV harboring chicken Fc relative to aMPV presents it as a possible vaccine candidate. Copyright © 2016 Elsevier B.V. All rights reserved.
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved…
Academic Computing Facilities and Services in Higher Education--A Survey.
Warlick, Charles H.
1986-01-01
Presents statistics about academic computing facilities based on data collected over the past six years from 1,753 institutions in the United States, Canada, Mexico, and Puerto Rico for the "Directory of Computing Facilities in Higher Education." Organizational, functional, and financial characteristics are examined as well as types of…