Statistically significant relational data mining
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor such models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Statistical significance versus clinical relevance.
van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G
2017-04-01
In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
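The repetition-based reading of the P-value above lends itself to a small simulation sketch (an illustration of the concept only, not code from the article; the observed difference of 0.5 and the group size of 30 are arbitrary assumptions):

```python
import random
import statistics

def null_extreme_fraction(observed_diff=0.5, n=30, trials=20000, seed=1):
    """Fraction of repeated studies, under a TRUE null (both groups share
    the same mean), whose mean difference is at least as extreme as
    observed_diff. This fraction is what a P-value estimates:
    P(data at least this extreme | null), not P(null | data)."""
    rng = random.Random(seed)
    extreme = 0
    for _ in range(trials):
        a = [rng.gauss(0, 1) for _ in range(n)]
        b = [rng.gauss(0, 1) for _ in range(n)]
        if abs(statistics.mean(a) - statistics.mean(b)) >= observed_diff:
            extreme += 1
    return extreme / trials
```

For these particular settings the analytic two-sided P-value is about 0.05, and the simulated fraction lands close to it; the point is only that the number describes the data under the null, not the truth of the null.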
Common pitfalls in statistical analysis: Clinical versus statistical significance
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2015-01-01
In clinical research, study results that are statistically significant are often interpreted as being clinically important. While statistical significance indicates the reliability of the study results, clinical significance reflects their impact on clinical practice. The third article in this series exploring pitfalls in statistical analysis clarifies the importance of differentiating between statistical significance and clinical significance. PMID:26229754
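The gap between the two kinds of significance can be made concrete with a hedged numerical sketch (the blood-pressure figures and the known-variance z-test are illustrative assumptions, not taken from the article): a clinically trivial effect becomes "statistically significant" once the sample is large enough.

```python
import math
from statistics import NormalDist

def two_sample_z_p(mean_diff, sd, n_per_group):
    """Two-sided p-value for a difference in means, assuming a known
    common sd and equal group sizes (a simple z-test sketch)."""
    se = sd * math.sqrt(2.0 / n_per_group)
    z = mean_diff / se
    return 2.0 * (1.0 - NormalDist().cdf(abs(z)))

# A 0.5 mmHg blood-pressure difference (sd assumed 10 mmHg) is clinically
# trivial, yet its p-value collapses as n grows:
p_small = two_sample_z_p(0.5, 10.0, 100)      # ~0.72, "not significant"
p_large = two_sample_z_p(0.5, 10.0, 100_000)  # far below 0.001
```

The effect estimate (0.5 mmHg) is identical in both cases; only the precision changed, which is exactly why the authors point readers toward confidence intervals and clinical relevance.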
Statistical significance of cis-regulatory modules
Smith, Andrew D.
2007-01-01
Abstract Background It is becoming increasingly important for researchers to be able to scan through large genomic regions for transcription factor binding sites or clusters of binding sites forming cis-regulatory modules. Correspondingly, there has been a push to develop algorithms for the rapid detection and assessment of cis-regulatory modules. While various algorithms for this purpose have been introduced, most are not well suited for rapid, genome scale scanning. Results We introduce methods designed for the detection and statistical evaluation of cis-regulatory modules, modeled as either clusters of individual binding sites or as combinations of sites with constrained organization. In order to determine the statistical significance of module sites, we first need a method to determine the statistical significance of single transcription factor binding site matches. We introduce a straightforward method of estimating the statistical significance of single site matches using a database of known promoters to produce data structures that can be used to estimate p-values for binding site matches. We next introduce a technique to calculate the statistical significance of the arrangement of binding sites within a module using a max-gap model. If the module scanned for has defined organizational parameters, the probability of the module is corrected to account for organizational constraints. The statistical significance of single site matches and the architecture of sites within the module can be combined to provide an overall estimation of statistical significance of cis-regulatory module sites. Conclusion The methods introduced in this paper allow for the detection and statistical evaluation of single transcription factor binding sites and cis-regulatory modules. The features described are implemented in the Search Tool for Occurrences of Regulatory Motifs (STORM) and MODSTORM software.
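The single-site significance step can be sketched as an empirical p-value against a background score distribution (a minimal stand-in for the paper's promoter-database data structures; the add-one correction is a common convention assumed here, not necessarily the authors' choice):

```python
import bisect

def empirical_site_p(score, background_scores):
    """Empirical p-value of a binding-site match score: the (add-one
    corrected) fraction of background scores at least as large."""
    bg = sorted(background_scores)
    n_at_least = len(bg) - bisect.bisect_left(bg, score)
    return (n_at_least + 1) / (len(bg) + 1)
```

In practice the background would come from scoring the motif over known promoters, so that high scores are rare and the p-value is small.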
The thresholds for statistical and clinical significance
Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per
2014-01-01
threshold if the trial is stopped early or if interim analyses have been conducted; (4) adjust the confidence intervals and the P-values for multiplicity due to number of outcome comparisons; and (5) assess clinical significance of the trial results. CONCLUSIONS: If the proposed five-step procedure...... not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore......, assessment of intervention effects in randomised clinical trials deserves more rigour in order to become more valid. METHODS: Several methodologies for assessing the statistical and clinical significance of intervention effects in randomised clinical trials were considered. Balancing simplicity...
Social significance of community structure: statistical view.
Li, Hui-Jia; Daniels, Jasmine J
2015-01-01
Community structure analysis is a powerful tool for social networks that can simplify their topological and functional analysis considerably. However, since community detection methods have random factors and real social networks obtained from complex systems always contain error edges, evaluating the significance of a partitioned community structure is an urgent and important question. In this paper, integrating the specific characteristics of real society, we present a framework to analyze the significance of a social community. The dynamics of social interactions are modeled by identifying social leaders and corresponding hierarchical structures. Instead of a direct comparison with the average outcome of a random model, we compute the similarity of a given node with the leader by the number of common neighbors. To determine the membership vector, an efficient community detection algorithm is proposed based on the position of the nodes and their corresponding leaders. Then, using a log-likelihood score, the tightness of the community can be derived. Based on the distribution of community tightness, we establish a connection between p-value theory and network analysis, and then we obtain a significance measure of statistical form. Finally, the framework is applied to both benchmark networks and real social networks. Experimental results show that our work can be used in many fields, such as determining the optimal number of communities, analyzing the social significance of a given community, comparing the performance among various algorithms, etc.
Assessing statistical significance in causal graphs
Chindelevitch, Leonid
2012-02-01
Abstract Background Causal graphs are an increasingly popular tool for the analysis of biological datasets. In particular, signed causal graphs--directed graphs whose edges additionally have a sign denoting upregulation or downregulation--can be used to model regulatory networks within a cell. Such models allow prediction of downstream effects of regulation of biological entities; conversely, they also enable inference of causative agents behind observed expression changes. However, due to their complex nature, signed causal graph models present special challenges with respect to assessing statistical significance. In this paper we frame and solve two fundamental computational problems that arise in practice when computing appropriate null distributions for hypothesis testing. Results First, we show how to compute a p-value for agreement between observed and model-predicted classifications of gene transcripts as upregulated, downregulated, or neither. Specifically, how likely are the classifications to agree to the same extent under the null distribution of the observed classification being randomized? This problem, which we call "Ternary Dot Product Distribution" owing to its mathematical form, can be viewed as a generalization of Fisher's exact test to ternary variables. We present two computationally efficient algorithms for computing the Ternary Dot Product Distribution and investigate its combinatorial structure analytically and numerically to establish computational complexity bounds. Second, we develop an algorithm for efficiently performing random sampling of causal graphs. This enables p-value computation under a different, equally important null distribution obtained by randomizing the graph topology but keeping fixed its basic structure: connectedness and the positive and negative in- and out-degrees of each vertex. We provide an algorithm for sampling a graph from this distribution uniformly at random. We also highlight theoretical
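A Monte Carlo permutation test gives a rough, easily coded stand-in for the exact Ternary Dot Product Distribution described above (the paper's efficient exact algorithms are not reproduced here; this sketch only estimates the same kind of p-value by shuffling the observed calls):

```python
import random

def agreement(pred, obs):
    # positions where model and observation give the same nonzero call
    # (+1 up, -1 down); "neither" (0) matches are deliberately not counted
    return sum(p == o and p != 0 for p, o in zip(pred, obs))

def ternary_agreement_p(pred, obs, trials=10000, seed=0):
    """Monte Carlo p-value: how often does a random relabelling of the
    observed ternary calls agree with the predictions at least as well?"""
    rng = random.Random(seed)
    observed = agreement(pred, obs)
    shuffled = list(obs)
    hits = 0
    for _ in range(trials):
        rng.shuffle(shuffled)
        if agreement(pred, shuffled) >= observed:
            hits += 1
    return (hits + 1) / (trials + 1)
```

Excluding 0-0 matches from the agreement count is one possible design choice; the paper's exact formulation should be consulted for the precise statistic.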
Significant Statistics: Viewed with a Contextual Lens
Tait-McCutcheon, Sandi
2010-01-01
This paper examines the pedagogical and organisational changes three lead teachers made to their statistics teaching and learning programs. The lead teachers posed the research question: What would the effect of contextually integrating statistical investigations and literacies into other curriculum areas be on student achievement? By finding the…
Caveats for using statistical significance tests in research assessments
Schneider, Jesper W
2011-01-01
This paper raises concerns about the advantages of using statistical significance tests in research assessments as has recently been suggested in the debate about proper normalization procedures for citation indicators. Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We argue that applying statistical significance tests and mechanically adhering to their results is highly problematic and detrimental to critical thinking...
Significance analysis and statistical mechanics: an application to clustering.
Łuksza, Marta; Lässig, Michael; Berg, Johannes
2010-11-26
This Letter addresses the statistical significance of structures in random data: given a set of vectors and a measure of mutual similarity, how likely is it that a subset of these vectors forms a cluster with enhanced similarity among its elements? The computation of this cluster p value for randomly distributed vectors is mapped onto a well-defined problem of statistical mechanics. We solve this problem analytically, establishing a connection between the physics of quenched disorder and multiple-testing statistics in clustering and related problems. In an application to gene expression data, we find a remarkable link between the statistical significance of a cluster and the functional relationships between its genes.
Mass spectrometry based protein identification with accurate statistical significance assignment
Alves, Gelio; Yu, Yi-Kuo
2014-01-01
Motivation: Assigning statistical significance accurately has become increasingly important as meta data of many types, often assembled in hierarchies, are constructed and combined for further biological analyses. Statistical inaccuracy of meta data at any level may propagate to downstream analyses, undermining the validity of scientific conclusions thus drawn. From the perspective of mass spectrometry based proteomics, even though accurate statistics for peptide identification can now be ach...
Significance and importance: some common misapprehensions about statistics
Currey, John; Baxter, Paul D.; Pitchford, Jonathan W.
2009-01-01
Abstract This paper attempts to discuss, in a readily understandable way, some very common misapprehensions that occur in laboratory-based scientists' thinking about statistics. We deal mainly with three issues: 1) P-values are best thought of as merely guides to action: are your experimental data consistent with your null hypothesis, or not? 2) When confronted with statistically non-significant results, you should also think about the power of the statistical test...
The Use of Meta-Analytic Statistical Significance Testing
Polanin, Joshua R.; Pigott, Terri D.
2015-01-01
Meta-analysis multiplicity, the concept of conducting multiple tests of statistical significance within one review, is an underdeveloped literature. We address this issue by considering how Type I errors can impact meta-analytic results, suggest how statistical power may be affected through the use of multiplicity corrections, and propose how…
Yang, Jing; Zammit, Christian; Dudley, Bruce
2017-04-01
The phenomenon of losing and gaining in rivers normally takes place in lowland areas where there are often various, sometimes conflicting uses for water resources, e.g., agriculture, industry, recreation, and maintenance of ecosystem function. To better support water allocation decisions, it is crucial to understand the location and seasonal dynamics of these losses and gains. We present a statistical methodology to predict losing and gaining river reaches in New Zealand based on 1) information surveys with surface water and groundwater experts from regional government, 2) a collection of river/watershed characteristics, including climate, soil and hydrogeologic information, and 3) the random forests technique. The surveys on losing and gaining reaches were conducted face-to-face at 16 New Zealand regional government authorities, and climate, soil, river geometry, and hydrogeologic data from various sources were collected and compiled to represent river/watershed characteristics. The random forests technique was used to build up the statistical relationship between river reach status (gain and loss) and river/watershed characteristics, and then to predict for river reaches at Strahler order one without prior losing and gaining information. Results show that the model has a classification error of around 10% for "gain" and "loss". The results will assist further research, and water allocation decisions in lowland New Zealand.
Caveats for using statistical significance tests in research assessments
Schneider, Jesper Wiborg
2013-01-01
This article raises concerns about the advantages of using statistical significance tests in research assessments as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly...... controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice...... of such tests, their dichotomous application in decision making, the difference between statistical and substantive significance, the implausibility of most null hypotheses, the crucial assumption of randomness, as well as the utility of standard errors and confidence intervals for inferential purposes. We...
The questioned p value: clinical, practical and statistical significance.
Jiménez-Paneque, Rosa
2016-09-09
The use of the p-value and statistical significance has been questioned since the early 1980s. Much has been discussed about it in the field of statistics and its applications, especially in Epidemiology and Public Health. As a matter of fact, the p-value and its equivalent, statistical significance, are difficult concepts to grasp for the many health professionals who are in some way involved in research applied to their work areas. However, its meaning should be clear in intuitive terms even though it is based on theoretical concepts from the field of statistics. This paper attempts to present the p-value as a concept that applies to everyday life and is therefore intuitively simple, but whose proper use cannot be separated from theoretical and methodological elements of inherent complexity. The reasons behind the criticism received by the p-value and its isolated use are explained intuitively, mainly the need to demarcate statistical significance from clinical significance, and some of the recommended remedies for these problems are discussed as well. The paper finally refers to the current trend to vindicate the p-value, appealing to the convenience of its use in certain situations, and to the recent statement of the American Statistical Association in this regard.
Statistical significance test for transition matrices of atmospheric Markov chains
Vautard, Robert; Mo, Kingtse C.; Ghil, Michael
1990-01-01
Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
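The Monte Carlo idea can be sketched for a single matrix element (a toy stand-in, not the authors' cluster-based procedure): shuffle the regime sequence, preserving regime frequencies, and ask how often the a→b transition count is matched or exceeded by chance.

```python
import random

def transition_count(seq, a, b):
    # number of adjacent pairs (a, b) in the regime sequence
    return sum(1 for x, y in zip(seq, seq[1:]) if (x, y) == (a, b))

def transition_significance(seq, a, b, trials=5000, seed=0):
    """Monte Carlo p-value for the a->b transition being unusually
    frequent, under a null that shuffles the regime labels (so regime
    frequencies are preserved but temporal order is destroyed)."""
    rng = random.Random(seed)
    observed = transition_count(seq, a, b)
    pool = list(seq)
    hits = 0
    for _ in range(trials):
        rng.shuffle(pool)
        if transition_count(pool, a, b) >= observed:
            hits += 1
    return (hits + 1) / (trials + 1)
```

A persistent regime (long runs of the same state) yields a tiny p-value for its self-transition, while a rapidly alternating sequence does not.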
On detection and assessment of statistical significance of Genomic Islands
Chaudhuri, Probal
2008-04-01
Abstract Background Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to Salmonella typhi CT18 genome leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements confirming their horizontal acquirement. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2015-01-01
In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper. PMID:25878958
Systematic reviews of anesthesiologic interventions reported as statistically significant
Imberger, Georgina; Gluud, Christian; Boylan, John
2015-01-01
statistically significant meta-analyses of anesthesiologic interventions, we used TSA to estimate power and imprecision in the context of sparse data and repeated updates. METHODS: We conducted a search to identify all systematic reviews with meta-analyses that investigated an intervention that may......: From 11,870 titles, we found 682 systematic reviews that investigated anesthesiologic interventions. In the 50 sampled meta-analyses, the median number of trials included was 8 (interquartile range [IQR], 5-14), the median number of participants was 964 (IQR, 523-1736), and the median number...
Your Chi-Square Test Is Statistically Significant: Now What?
Sharpe, Donald
2015-04-01
Applied researchers have employed chi-square tests for more than one hundred years. This paper addresses the question of how one should follow a statistically significant chi-square test result in order to determine the source of that result. Four approaches were evaluated: calculating residuals, comparing cells, ransacking, and partitioning. Data from two recent journal articles were used to illustrate these approaches. A call is made for greater consideration of foundational techniques such as the chi-square tests.
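Of the four follow-up approaches, calculating residuals is the easiest to sketch (a generic textbook computation, not code from the paper); adjusted residuals beyond roughly ±2 point to the cells driving the overall significant result.

```python
import math

def adjusted_residuals(table):
    """Pearson adjusted (standardized) residuals for a two-way count
    table, approximately standard normal under independence."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    total = sum(row_tot)
    out = []
    for i, row in enumerate(table):
        out.append([])
        for j, obs in enumerate(row):
            expected = row_tot[i] * col_tot[j] / total
            denom = math.sqrt(expected
                              * (1 - row_tot[i] / total)
                              * (1 - col_tot[j] / total))
            out[i].append((obs - expected) / denom)
    return out

# Example: a strong diagonal association; the corner residuals are large
res = adjusted_residuals([[30, 10], [10, 30]])
```

Here every expected count is 20, so the diagonal cells sit far above expectation and the off-diagonal cells far below, which is exactly what the residuals localize.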
Lexical Co-occurrence, Statistical Significance, and Word Association
Chaudhari, Dipak; Laxman, Srivatsan
2010-01-01
Lexical co-occurrence is an important cue for detecting word associations. We present a theoretical framework for discovering statistically significant lexical co-occurrences from a given corpus. In contrast with the prevalent practice of giving weightage to unigram frequencies, we focus only on the documents containing both the terms (of a candidate bigram). We detect biases in span distributions of associated words, while being agnostic to variations in global unigram frequencies. Our framework has the fidelity to distinguish different classes of lexical co-occurrences, based on strengths of the document and corpus-level cues of co-occurrence in the data. We perform extensive experiments on benchmark data sets to study the performance of various co-occurrence measures that are currently known in the literature. We find that a relatively obscure measure called Ochiai, and a newly introduced measure CSA capture the notion of lexical co-occurrence best, followed next by LLR, Dice, and TTest, while another popular m...
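The document-level view (ignoring global unigram frequencies) can be sketched with a hypergeometric tail test, a generic formulation rather than the paper's span-distribution machinery or its CSA measure:

```python
from math import comb

def doc_cooccurrence_p(n_docs, n_a, n_b, n_ab):
    """P(X >= n_ab): if the n_b documents containing term B were spread
    at random over n_docs documents, how likely is an overlap of at
    least n_ab with the n_a documents containing term A?
    (hypergeometric upper tail)"""
    total = comb(n_docs, n_b)
    tail = sum(comb(n_a, k) * comb(n_docs - n_a, n_b - k)
               for k in range(n_ab, min(n_a, n_b) + 1))
    return tail / total
```

A small tail probability suggests the two terms share documents more often than chance alone would produce, the basic signal behind measures such as Fisher's exact test.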
Fostering Students' Statistical Literacy through Significant Learning Experience
Krishnan, Saras
2015-01-01
A major objective of statistics education is to develop students' statistical literacy that enables them to be educated users of data in context. Teaching statistics in today's educational settings is not an easy feat because teachers have a huge task in keeping up with the demands of the new generation of learners. The present day students have…
A tutorial on hunting statistical significance by chasing N
Denes Szucs
2016-09-01
There is increasing concern about the replicability of studies in psychology and cognitive neuroscience. Hidden data dredging (also called p-hacking) is a major contributor to this crisis because it substantially increases Type I error, resulting in a much larger proportion of false positive findings than the usually expected 5%. In order to build better intuition to avoid, detect and criticise some typical problems, here I systematically illustrate the large impact of some easy-to-implement and therefore perhaps frequent data dredging techniques on boosting false positive findings. I illustrate several forms of two special cases of data dredging. First, researchers may violate the data collection stopping rules of null hypothesis significance testing by repeatedly checking for statistical significance with various numbers of participants. Second, researchers may group participants post hoc along potential but unplanned independent grouping variables. The first approach 'hacks' the number of participants in studies; the second approach 'hacks' the number of variables in the analysis. I demonstrate the high amount of false positive findings generated by these techniques with data from true null distributions. I also illustrate that it is extremely easy to introduce strong bias into data by very mild selection and re-testing. Similar, usually undocumented data dredging steps can easily lead to having 20-50%, or more, false positives.
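The first technique, repeatedly checking significance while collecting data, can be illustrated with a small simulation (a one-sample z-test with known variance is assumed for simplicity; the checkpoint sizes are arbitrary and not taken from the paper):

```python
import random
from statistics import NormalDist, mean

def optional_stopping_fpr(checks=(10, 20, 30, 40, 50), sims=2000, seed=0):
    """False positive rate when a true-null study is tested at several
    interim sample sizes and declared significant if ANY check yields
    p < .05 (two-sided z-test, sd known to be 1)."""
    rng = random.Random(seed)
    crit = NormalDist().inv_cdf(0.975)  # two-sided 5% criterion
    false_positives = 0
    for _ in range(sims):
        data = [rng.gauss(0.0, 1.0) for _ in range(max(checks))]
        if any(abs(mean(data[:n]) * n ** 0.5) > crit for n in checks):
            false_positives += 1
    return false_positives / sims
```

With five interim looks the realized Type I error rate lands well above the nominal 5%, consistent with the inflation the abstract describes.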
Statistical significance of spectral lag transition in GRB 160625B
Ganguly, Shalini; Desai, Shantanu
2017-09-01
Recently, Wei et al. [1] have found evidence for a transition from positive time lags to negative time lags in the spectral lag data of GRB 160625B. They have fit these observed lags to a sum of two components: an assumed functional form for the intrinsic time lag due to astrophysical mechanisms, and an energy-dependent speed of light due to quadratic and linear Lorentz invariance violation (LIV) models. Here, we examine the statistical significance of the evidence for a transition to negative time lags. Such a transition, even if present in GRB 160625B, cannot be due to an energy-dependent speed of light, as this would contradict previous limits by some 3-4 orders of magnitude, and must therefore be of intrinsic astrophysical origin. We use three different model comparison techniques: a frequentist test and two information-based criteria (AIC and BIC). From the frequentist model comparison test, we find that the evidence for a transition in the spectral lag data is favored at 3.05σ and 3.74σ for the linear and quadratic models respectively. We find that ΔAIC and ΔBIC have values ≳ 10 for the spectral lag transition that was motivated as being due to the quadratic Lorentz invariance violating model, pointing to "decisive evidence". We note however that none of the three models (including the model of intrinsic astrophysical emission) provide a good fit to the data.
Tipping points in the arctic: eyeballing or statistical significance?
Carstensen, Jacob; Weydmann, Agata
2012-02-01
Arctic ecosystems have experienced and are projected to experience continued large increases in temperature and declines in sea ice cover. It has been hypothesized that small changes in ecosystem drivers can fundamentally alter ecosystem functioning, and that this might be particularly pronounced for Arctic ecosystems. We present a suite of simple statistical analyses to identify changes in the statistical properties of data, emphasizing that changes in the standard error should be considered in addition to changes in mean properties. The methods are exemplified using sea ice extent, and suggest that the loss rate of sea ice accelerated by a factor of ~5 in 1996, as reported in other studies, but that increases in random fluctuations, as an early warning signal, were already observed in 1990. We recommend employing the proposed methods more systematically for analyzing tipping points to document effects of climate change in the Arctic.
Statistical downscaling rainfall using artificial neural network: significantly wetter Bangkok?
Vu, Minh Tue; Aribarg, Thannob; Supratid, Siriporn; Raghavan, Srivatsan V.; Liong, Shie-Yui
2016-11-01
Artificial neural network (ANN) is an established technique with a flexible mathematical structure that is capable of identifying complex nonlinear relationships between input and output data. The present study utilizes ANN as a method of statistically downscaling global climate models (GCMs) during the rainy season at meteorological site locations in Bangkok, Thailand. The study illustrates the application of a feed-forward back-propagation network using large-scale predictor variables derived from both ERA-Interim reanalysis data and present-day/future GCM data. The predictors are first selected over different grid boxes surrounding the Bangkok region and then screened using principal component analysis (PCA) to filter the best-correlated predictors for ANN training. The reanalysis-downscaled results for the present-day climate show good agreement with station precipitation, with a correlation coefficient of 0.8 and a Nash-Sutcliffe efficiency of 0.65. The final downscaled results for four GCMs show an increasing trend of precipitation in the rainy season over Bangkok by the end of the twenty-first century. The extreme values of precipitation determined using statistical indices show strong increases in wetness. These findings will be useful for policy makers in weighing adaptation measures for flooding, such as whether the current drainage network system is sufficient to meet the changing climate, and in planning a range of related adaptation/mitigation measures.
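The two skill scores cited in this abstract, the correlation coefficient and the Nash-Sutcliffe efficiency, can be computed directly. The sketch below uses made-up rainfall values, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of obs about its mean."""
    mo = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    svar = sum((o - mo) ** 2 for o in obs)
    return 1.0 - sse / svar

# hypothetical station rainfall (mm/day) vs. downscaled values
obs = [1.0, 2.0, 3.0, 4.0, 5.0]
sim = [1.1, 1.9, 3.2, 3.8, 5.1]
r, e = pearson_r(obs, sim), nse(obs, sim)
print(round(r, 3), round(e, 3))  # -> 0.995 0.989
```

An NSE of 1 means a perfect match, 0 means the model is no better than the observed mean, which is why values such as the 0.65 reported above indicate useful but imperfect skill.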
Statistically significant data base of rock properties for geothermal use
Koch, A.; Jorand, R.; Clauser, C.
2009-04-01
The high risk of failure due to the unknown properties of the target rocks at depth is a major obstacle for the exploration of geothermal energy. In general, the ranges of thermal and hydraulic properties given in compilations of rock properties are too large to be useful to constrain properties at a specific site. To overcome this problem, we study the thermal and hydraulic rock properties of the main rock types in Germany in a statistical approach. An important aspect is the use of data from exploration wells that are largely untapped for the purpose of geothermal exploration. In the current project stage, we have been analyzing mostly Devonian and Carboniferous drill cores from 20 deep boreholes in the region of the Lower Rhine Embayment and the Ruhr area (western North Rhine Westphalia). In total, we selected 230 core samples with a length of up to 30 cm from the core archive of the State Geological Survey. The use of core scanning technology allowed the rapid measurement of thermal conductivity, sonic velocity, and gamma density under dry and water saturated conditions with high resolution for a large number of samples. In addition, we measured porosity, bulk density, and matrix density based on Archimedes' principle and pycnometer analysis. As first results we present arithmetic means, medians and standard deviations characterizing the petrophysical properties and their variability for specific lithostratigraphic units. Bi- and multimodal frequency distributions correspond to the occurrence of different lithologies such as shale, limestone, dolomite, sandstone, siltstone, marlstone, and quartz-schist. In a next step, the data set will be combined with logging data and complementary mineralogical analyses to derive the variation of thermal conductivity with depth. As a final result, this may be used to infer thermal conductivity for boreholes without appropriate core data which were drilled in similar geological settings.
Statistical significance of seasonal warming/cooling trends
Ludescher, Josef; Bunde, Armin; Schellnhuber, Hans Joachim
2017-04-01
The question whether a seasonal climate trend (e.g., the increase of summer temperatures in Antarctica in the last decades) is of anthropogenic or natural origin is of great importance for mitigation and adaptation measures alike. The conventional significance analysis assumes that (i) the seasonal climate trends can be quantified by linear regression, (ii) the different seasonal records can be treated as independent records, and (iii) the persistence in each of these seasonal records can be characterized by short-term memory described by an autoregressive process of first order. Here we show that assumption (ii) is not valid, due to strong intraannual correlations by which different seasons are correlated. We also show that, even in the absence of correlations, for Gaussian white noise, the conventional analysis leads to a strong overestimation of the significance of the seasonal trends, because multiple testing has not been taken into account. In addition, when the data exhibit long-term memory (which is the case in most climate records), assumption (iii) leads to a further overestimation of the trend significance. Combining Monte Carlo simulations with the Holm-Bonferroni method, we demonstrate how to obtain reliable estimates of the significance of the seasonal climate trends in long-term correlated records. For an illustration, we apply our method to representative temperature records from West Antarctica, which is one of the fastest-warming places on Earth and belongs to the crucial tipping elements in the Earth system.
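The multiple-testing effect described here can be illustrated with a short simulation. The sketch below is not the authors' procedure: it tests linear trends in four synthetic white-noise "seasonal" records, uses a normal approximation to the slope t statistic, and compares the uncorrected any-season rejection rate against the first (Bonferroni) step of the Holm-Bonferroni correction:

```python
import math
import random

def trend_p_value(y):
    """Approximate two-sided p-value under the null of zero linear trend:
    OLS slope with a normal approximation to its t statistic (n >= 30)."""
    n = len(y)
    mx, my = (n - 1) / 2, sum(y) / n
    sxx = sum((i - mx) ** 2 for i in range(n))
    slope = sum((i - mx) * (yi - my) for i, yi in enumerate(y)) / sxx
    resid = [yi - my - slope * (i - mx) for i, yi in enumerate(y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return math.erfc(abs(slope) / se / math.sqrt(2))

rng = random.Random(1)
n_sim, seasons, years = 1000, 4, 60
naive = corrected = 0
for _ in range(n_sim):
    pvals = [trend_p_value([rng.gauss(0, 1) for _ in range(years)])
             for _ in range(seasons)]
    naive += min(pvals) < 0.05             # "any season trends", uncorrected
    corrected += min(pvals) < 0.05 / seasons  # Holm's first step (Bonferroni)
naive_rate, corrected_rate = naive / n_sim, corrected / n_sim
print(naive_rate, corrected_rate)
```

Even with no long-term memory at all, checking four seasons separately roughly quadruples the chance of a spurious "significant" trend, while the corrected rate stays near the nominal 5%.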
Wilkinson, Michael
2014-03-01
Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson (N-P) significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by conducting an a priori sample size calculation. The N-P approach does not provide the probability of hypotheses or indicate the strength of support for hypotheses in light of data, yet many scientists believe it does. Outcomes of analyses allow conclusions only about the existence of non-zero effects, and provide no information about the likely size of true effects or their practical/clinical value. Bayesian inference can show how much support data provide for different hypotheses, and how personal convictions should be altered in light of data, but the approach is complicated by formulating probability distributions about prior subjective estimates of population effects. A pragmatic solution is magnitude-based inference, which allows scientists to estimate the true magnitude of population effects and how likely they are to exceed an effect magnitude of practical/clinical importance, thereby integrating elements of subjective Bayesian-style thinking. While this approach is gaining acceptance, progress might be hastened if scientists appreciate the shortcomings of traditional N-P null hypothesis significance testing.
Changing Statistical Significance with the Amount of Information: The Adaptive α Significance Level.
Pérez, María-Eglée; Pericchi, Luis Raúl
2014-02-01
We put forward an adaptive alpha which changes with the amount of sample information. This calibration may be interpreted as a Bayes/non-Bayes compromise and leads to statistical consistency. The calibration can also be used to produce confidence intervals whose sizes take into consideration the amount of observed information.
Lies, damned lies and statistics: Clinical importance versus statistical significance in research.
Mellis, Craig
2017-02-28
Correctly performed and interpreted statistics play a crucial role both for those who 'produce' clinical research and for those who 'consume' it. Unfortunately, however, there are many misunderstandings and misinterpretations of statistics by both groups. In particular, there is a widespread lack of appreciation of the severe limitations of p values. This is a particular problem with small sample sizes and low event rates - common features of many published clinical trials. These issues have resulted in increasing numbers of false positive clinical trials (false 'discoveries'), and the well-publicised inability to replicate many of the findings. While chance clearly plays a role in these errors, many more are due to either poorly performed or badly misinterpreted statistics. Consequently, it is essential that whenever p values appear, they are accompanied by both 95% confidence limits and effect sizes. These will enable readers to immediately assess the plausible range of results, and whether or not the effect is clinically meaningful.
Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.
Breunig, Nancy A.
Despite increasing criticism of statistical significance testing by researchers, particularly since the publication of the 1994 American Psychological Association style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
Hayek Lee-Ann C.
2005-01-01
Several analytic techniques have been used to determine sexual dimorphism in vertebrate morphological measurement data, with no emergent consensus on which technique is superior. A further confounding problem for frog data is the existence of considerable measurement error. To determine dimorphism, we examine a single hypothesis (H0: equal means for the two groups, females and males). We demonstrate that frog measurement data meet assumptions for clearly defined statistical hypothesis testing with statistical linear models rather than those of exploratory multivariate techniques such as principal components, correlation or correspondence analysis. In order to distinguish biological from statistical significance of hypotheses, we propose a new protocol that incorporates measurement error and effect size. Measurement error is evaluated with a novel measurement error index. Effect size, widely used in the behavioral sciences and in meta-analysis studies in biology, proves to be the most useful single metric to evaluate whether statistically significant results are biologically meaningful. Definitions for a range of small, medium, and large effect sizes specifically for frog measurement data are provided. Examples with measurement data for species of the frog genus Leptodactylus are presented. The new protocol is recommended not only for evaluating sexual dimorphism in frog data but for any animal measurement data for which the measurement error index and observed or a priori effect sizes can be calculated.
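The effect-size idea in this abstract can be illustrated with the standard Cohen's d; the paper's frog-specific effect-size conventions and its measurement error index are not reproduced here, and the snout-vent lengths below are invented for illustration:

```python
import math

def cohens_d(a, b):
    """Standardized mean difference (Cohen's d) using a pooled SD."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

# hypothetical snout-vent lengths (mm) for female and male frogs
females = [41.2, 43.5, 40.8, 44.1, 42.6, 43.0]
males = [38.9, 40.1, 39.4, 41.0, 38.2, 40.5]
d = cohens_d(females, males)
print(round(d, 2))  # -> 2.42
```

Unlike a p-value, d does not shrink or grow with sample size, which is why the authors find it the most useful single metric for judging whether a statistically significant difference is biologically meaningful.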
Zhao, Guixiang
2017-04-01
Based on hourly TBB data and cloud images from FY-2E, meteorological observations, and NCEP reanalysis data with 1° × 1° spatial resolution from May to October during 2005-2014, the climatic characteristics of mesoscale convective systems (MCS) over the middle reaches of the Yellow River were analyzed, including mesoscale convective complexes (MCC), persistent elongated convective systems (PECS), meso-β-scale MCC (MβCCS) and meso-β-scale PECS (MβECS). The results are as follows: (1) MCS tended to occur over central and southern Gansu, central and southern Shanxi, central and northern Shaanxi, and the border region of Shanxi, Shaanxi and Inner Mongolia. MCS over the middle reaches of the Yellow River formed from May to October and tended to reach maturity in summer. MCC and MβECS were the main MCS causing precipitation in summer. (2) The daily variation of MCS was obvious: they usually formed and matured in the afternoon and from the evening to the early morning of the next day. Most MCS formed quickly and dissipated slowly, moving mainly eastward and southeastward; round-shaped MCS moved less than elongated ones. (3) The average TBB of round-shaped MCS was lower than that of elongated MCS. MCC developed most vigorously, peaking in August, whereas MβECS was not obviously influenced by seasonal change. The average eccentricity of mature MCC and PECS over the middle reaches of the Yellow River was greater than that in the USA; the former was also greater than in the lower reaches of the Yellow River, while the latter was smaller. (4) The characteristics of rainfall caused by MCS over the middle reaches of the Yellow River were complex, with obvious regional differences. Precipitation was wider, stronger and longer-lasting when multiple MCS merged. Rainfall in the center of a cloud area was obviously greater than in other regions of the cloud area. The
Engsted, Tom
I comment on the controversy between McCloskey & Ziliak and Hoover & Siegler on statistical versus economic significance in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey & Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays a relatively minor role in model ... reliable estimates, and I argue that significance tests are useful tools in those cases where a statistical model serves as input in the quantification of an economic model. Finally, I provide a specific example from economics - asset return predictability - where the distinction between statistical ...
Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza
2014-01-01
This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
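The proposed procedure, a distance statistic between summary histograms plus a bootstrap significance level, can be sketched as follows. This is an illustrative permutation-style resampling on synthetic histograms, not the authors' cloud-object data or their exact bootstrap scheme; the Euclidean distance is used since the abstract identifies it as most suitable:

```python
import math
import random

def summary_hist(hists):
    """Bin-wise sum over individual histograms, normalized to frequencies."""
    total = [sum(h[i] for h in hists) for i in range(len(hists[0]))]
    s = sum(total)
    return [t / s for t in total]

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def bootstrap_p(group_a, group_b, n_boot=1000, seed=0):
    """Reassign the individual histograms to the two groups at random and
    count how often the summary-histogram distance is at least as large
    as the observed one."""
    rng = random.Random(seed)
    obs = euclidean(summary_hist(group_a), summary_hist(group_b))
    pooled = group_a + group_b
    na = len(group_a)
    hits = 0
    for _ in range(n_boot):
        rng.shuffle(pooled)
        hits += euclidean(summary_hist(pooled[:na]),
                          summary_hist(pooled[na:])) >= obs
    return hits / n_boot

def rand_hist(rng, shift, bins=5, n=50):
    """One individual histogram: n draws binned into `bins` classes."""
    h = [0] * bins
    for _ in range(n):
        v = min(bins - 1, max(0, round(rng.gauss(2 + shift, 1))))
        h[v] += 1
    return h

rng = random.Random(7)
group_a = [rand_hist(rng, 0.0) for _ in range(25)]
group_b = [rand_hist(rng, 0.8) for _ in range(25)]
p = bootstrap_p(group_a, group_b)
print(p)  # small: the two summary histograms differ significantly
```

Because resampling is done at the level of whole individual histograms, the procedure makes few assumptions about the underlying distribution, which is the appeal of the bootstrap noted in the abstract.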
The statistical significance of the N-S asymmetry of solar activity revisited
Carbonell, M; Oliver, R; Ballester, J L
2007-01-01
The main aim of this study is to point out the difficulties found when trying to assess the statistical significance of the North-South asymmetry (hereafter SSNSA) of the most usually considered time series of solar activity. First of all, we distinguish between solar activity time series composed of integer or non-integer and dimensionless data, or composed of non-integer and dimensional data. For each of these cases, we discuss the most suitable statistical tests which can be applied and highlight the difficulties in obtaining valid information about the statistical significance of solar activity time series. Our results suggest that, apart from the need to apply suitable statistical tests, other effects such as the data binning, the considered units and the need, in some tests, to consider groups of data, substantially affect the determination of the statistical significance of the asymmetry. Our main conclusion is that the assessment of the statistical significance of the N-S asymmetry of solar activity ...
Zhang, Zhang
2012-03-22
Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure--Codon Deviation Coefficient (CDC)--that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions.
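The bootstrapping idea can be sketched without the authors' exact CDC formula, which is not reproduced here. The simplified score below measures the Euclidean deviation of observed codon frequencies from those implied by position-specific nucleotide composition, and the significance is the fraction of random sequences with the same positional composition that score at least as high; all details are illustrative assumptions:

```python
import math
import random

def positional_freqs(seq):
    """Nucleotide frequencies at codon positions 1-3."""
    freqs = [{}, {}, {}]
    for i in range(0, len(seq) - len(seq) % 3, 3):
        for j in range(3):
            b = seq[i + j]
            freqs[j][b] = freqs[j].get(b, 0) + 1
    n = sum(freqs[0].values())
    return [{b: c / n for b, c in f.items()} for f in freqs]

def cub_score(seq):
    """Euclidean deviation of observed codon frequencies from those implied
    by position-specific nucleotide composition (a simplified CUB measure)."""
    pf = positional_freqs(seq)
    counts, n = {}, 0
    for i in range(0, len(seq) - len(seq) % 3, 3):
        counts[seq[i:i + 3]] = counts.get(seq[i:i + 3], 0) + 1
        n += 1
    score = 0.0
    for b1 in "ACGT":
        for b2 in "ACGT":
            for b3 in "ACGT":
                exp = pf[0].get(b1, 0) * pf[1].get(b2, 0) * pf[2].get(b3, 0)
                score += (counts.get(b1 + b2 + b3, 0) / n - exp) ** 2
    return math.sqrt(score)

def cub_bootstrap_p(seq, n_boot=500, seed=0):
    """P(random sequence with the same positional composition scores >= observed)."""
    rng = random.Random(seed)
    pf = positional_freqs(seq)
    bases = [list(f.keys()) for f in pf]
    weights = [list(f.values()) for f in pf]
    n_codons = len(seq) // 3
    obs = cub_score(seq)
    hits = 0
    for _ in range(n_boot):
        s = "".join(rng.choices(bases[j], weights[j])[0]
                    for _ in range(n_codons) for j in range(3))
        hits += cub_score(s) >= obs
    return hits / n_boot

# a strongly biased toy sequence: only two of eight composition-compatible codons
seq = "GCATGC" * 100
p = cub_bootstrap_p(seq)
print(p)  # near zero: the bias is far beyond compositional expectation
```

The key point this mirrors from the abstract is that the null model conditions on nucleotide composition at each codon position, so "bias" means deviation beyond what composition alone would produce.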
Coulson, Melissa; Healey, Michelle; Fidler, Fiona; Cumming, Geoff
2010-01-01
A statistically significant result and a non-significant result may differ little, although significance status may tempt an interpretation of difference. Two studies are reported that compared interpretation of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioral neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs but, for full benefit, such highly desirable statistical reform requires also that researchers interpret CIs without recourse to NHST.
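The scenario described here, two nearly identical effects with different significance status, can be made concrete with normal-approximation confidence intervals; all numbers below are invented for illustration:

```python
import math

def ci95(mean_diff, sd, n):
    """Normal-approximation 95% CI for a mean difference
    (sd is the standard deviation of the difference)."""
    se = sd / math.sqrt(n)
    return (mean_diff - 1.96 * se, mean_diff + 1.96 * se)

# two hypothetical studies with nearly identical effects
study_a = ci95(0.50, 2.0, 100)  # CI excludes 0 -> "significant"
study_b = ci95(0.45, 2.0, 60)   # CI includes 0 -> "non-significant"
print(study_a, study_b)
# The intervals overlap almost entirely, so the two results are
# consistent even though their significance status differs.
```

Read as intervals, the two studies plainly agree; read as a significant/non-significant dichotomy, they appear to conflict, which is precisely the misinterpretation the abstract documents.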
Does Statistical Significance Help to Evaluate Predictive Performance of Competing Models?
Levent Bulut
2016-04-01
In a Monte Carlo experiment with simulated data, we show that as a point forecast criterion, Clark and West's (2006) unconditional test of mean squared prediction errors does not reflect the relative performance of a superior model over a relatively weaker one. The simulation results show that even though the mean squared prediction errors of a constructed superior model are far below those of a weaker alternative, the Clark-West test does not reflect this in its test statistics. Therefore, studies that use this statistic in testing the predictive accuracy of alternative exchange rate models, stock return predictability, inflation forecasting, and unemployment forecasting should not put too much weight on the magnitude of statistically significant Clark-West test statistics.
EasyGene – a prokaryotic gene finder that ranks ORFs by statistical significance
Larsen, Thomas Schou; Krogh, Anders Stærmose
2003-06-01
Background: Contrary to other areas of sequence analysis, a measure of statistical significance of a putative gene has not been devised to help in discriminating real genes from the masses of random Open Reading Frames (ORFs) in prokaryotic genomes. Therefore, many genomes have too many short ORFs annotated as genes. Results: In this paper, we present a new automated gene-finding method, EasyGene, which estimates the statistical significance of a predicted gene. The gene finder is based on a hidden Markov model (HMM) that is automatically estimated for a new genome. Using extensions of similarities in Swiss-Prot, a high-quality training set of genes is automatically extracted from the genome and used to estimate the HMM. Putative genes are then scored with the HMM, and based on the score and length of an ORF, the statistical significance is calculated. The measure of statistical significance for an ORF is the expected number of ORFs in one megabase of random sequence at the same significance level or better, where the random sequence has the same statistics as the genome in the sense of a third-order Markov chain. Conclusions: The result is a flexible gene finder whose overall performance matches or exceeds other methods. The entire pipeline of computer processing, from the raw input of a genome or set of contigs to a list of putative genes with significance, is automated, making it easy to apply EasyGene to newly sequenced organisms. EasyGene with pre-trained models can be accessed at http://www.cbs.dtu.dk/services/EasyGene.
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
Recent Literature on Whether Statistical Significance Tests Should or Should Not Be Banned.
Deegear, James
This paper summarizes the literature regarding statistical significance testing, with an emphasis on recent literature in various disciplines and literature exploring why researchers have demonstrably failed to be influenced by the American Psychological Association publication manual's encouragement to report effect sizes. Also considered are…
Jones, Allan; Sommerlund, Bo
2007-01-01
The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power-analysis in estimating...
[Tests of statistical significance in three biomedical journals: a critical review].
Sarria Castro, Madelaine; Silva Ayçaguer, Luis Carlos
2004-05-01
To describe the use of conventional tests of statistical significance and the current trends shown by their use in three biomedical journals read in Spanish-speaking countries. All descriptive or explanatory original articles published in the five-year period of 1996 through 2000 were reviewed in three journals: Revista Cubana de Medicina General Integral [Cuban Journal of Comprehensive General Medicine], Revista Panamericana de Salud Pública/Pan American Journal of Public Health, and Medicina Clínica [Clinical Medicine] (which is published in Spain). In the three journals that were reviewed various shortcomings were found in their use of hypothesis tests based on P values and in the limited use of new tools that have been suggested for use in their place: confidence intervals (CIs) and Bayesian inference. The basic findings of our research were: minimal use of CIs, as either a complement to significance tests or as the only statistical tool; mentions of a small sample size as a possible explanation for the lack of statistical significance; a predominant use of rigid alpha values; a lack of uniformity in the presentation of results; and improper reference in the research conclusions to the results of hypothesis tests. Our results indicate the lack of compliance by authors and editors with accepted standards for the use of tests of statistical significance. The findings also highlight that the stagnant use of these tests continues to be a common practice in the scientific literature.
Statistical significance of trends in monthly heavy precipitation over the US
Mahajan, Salil
2011-05-11
Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upward trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation, as some GCM simulations show a downward trend even in the twenty-first century projections when the greenhouse gas forcings are strong.
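A minimal Monte Carlo trend test in the spirit described above can be sketched as follows; this is a simple permutation variant on synthetic data, not the paper's parametric/non-parametric bootstrap on precipitation records:

```python
import random

def ols_slope(y):
    """Least-squares slope of y against the time index 0..n-1."""
    n = len(y)
    mx, my = (n - 1) / 2, sum(y) / n
    sxx = sum((i - mx) ** 2 for i in range(n))
    return sum((i - mx) * (yi - my) for i, yi in enumerate(y)) / sxx

def perm_p_value(y, n_perm=2000, seed=0):
    """Monte Carlo p-value for a trend: shuffle the series (null of
    exchangeable years) and compare shuffled slopes to the observed one."""
    rng = random.Random(seed)
    obs = abs(ols_slope(y))
    yy = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(yy)
        hits += abs(ols_slope(yy)) >= obs
    return hits / n_perm

rng = random.Random(3)
trended = [0.1 * i + rng.gauss(0, 1) for i in range(40)]
p = perm_p_value(trended)
print(p)  # a clear upward trend: p well below 0.05
```

Like the bootstrap approaches in the abstract, the permutation test builds the null distribution of the trend statistic by resampling rather than assuming a parametric sampling distribution.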
Krapivin, Vladimir F.; Varotsos, Costas A.; Soldatov, Vladimir Yu.
2017-01-01
This paper presents the results obtained from the study of the sustainable state between nature and human society on a global scale, focusing on the most critical interactions between the natural and anthropogenic processes. Apart from the conventional global models, the basic tool employed herein is the newly proposed complex model entitled “nature-society system (NSS) model”, through which a reliable modeling of the processes taking place in the global climate-nature-society system (CNSS) is achieved. This universal tool is mainly based on the information technology that allows the adaptive conformance of the parametric and functional space of this model. The structure of this model includes the global biogeochemical cycles, the hydrological cycle, the demographic processes and a simple climate model. In this model, the survivability indicator is used as a criterion for the survival of humanity, which defines a trend in the dynamics of the total biomass of the biosphere, taking into account the trends of the biocomplexity dynamics of the land and hydrosphere ecosystems. It should be stressed that there are no other complex global models comparable to those of the CNSS model developed here. The potential of this global model is demonstrated through specific examples in which the classification of the terrestrial ecosystem is accomplished by separating 30 soil-plant formations for geographic pixels 4° × 5°. In addition, humanity is considered to be represented by three groups of economic development status (high, transition, developing) and the World Ocean is parameterized by three latitude zones (low, middle, high). The modelling results obtained show the dynamics of the CNSS at the beginning of the 23rd century, according to which the world population can reach the level of 14 billion without the occurrence of major negative impacts. PMID:28783136
Jones, Allan; Sommerlund, Bo
2007-01-01
The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis…
Statistical significance of variables driving systematic variation in high-dimensional data
Chung, Neo Christopher; Storey, John D.
2015-01-01
Motivation: There are a number of well-established methods such as principal component analysis (PCA) for automatically capturing systematic variation due to latent variables in large-scale genomic data. PCA and related methods may directly provide a quantitative characterization of a complex biological variable that is otherwise difficult to precisely define or model. An unsolved problem in this context is how to systematically identify the genomic variables that are drivers of systematic variation captured by PCA. Principal components (PCs) (and other estimates of systematic variation) are directly constructed from the genomic variables themselves, making measures of statistical significance artificially inflated when using conventional methods due to over-fitting. Results: We introduce a new approach called the jackstraw that allows one to accurately identify genomic variables that are statistically significantly associated with any subset or linear combination of PCs. The proposed method can greatly simplify complex significance testing problems encountered in genomics and can be used to identify the genomic variables significantly associated with latent variables. Using simulation, we demonstrate that our method attains accurate measures of statistical significance over a range of relevant scenarios. We consider yeast cell-cycle gene expression data, and show that the proposed method can be used to straightforwardly identify genes that are cell-cycle regulated with an accurate measure of statistical significance. We also analyze gene expression data from post-trauma patients, allowing the gene expression data to provide a molecularly driven phenotype. Using our method, we find a greater enrichment for inflammatory-related gene sets compared to the original analysis that uses a clinically defined, although likely imprecise, phenotype. The proposed method provides a useful bridge between large-scale quantifications of systematic variation and gene…
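The core jackstraw idea can be illustrated with a toy sketch. This is not Chung and Storey's exact estimator, just the central trick: because the PC is fit to the same data, naive row-vs-PC correlations are inflated, so the null is built by permuting a few rows at a time and re-fitting the PC, using those permuted rows' associations as the null distribution.

```python
import numpy as np

def pc1(X):
    """First right singular vector of the row-centered data matrix."""
    Xc = X - X.mean(axis=1, keepdims=True)
    return np.linalg.svd(Xc, full_matrices=False)[2][0]

def jackstraw_pvalues(X, n_perm=300, s=2, seed=0):
    """Toy jackstraw: permute s rows at a time, re-fit the PC, and use
    the permuted rows' squared correlations as the null distribution."""
    rng = np.random.default_rng(seed)
    v = pc1(X)
    obs = np.array([np.corrcoef(row, v)[0, 1] ** 2 for row in X])
    null = []
    for _ in range(n_perm):
        Xp = X.copy()
        rows = rng.choice(X.shape[0], size=s, replace=False)
        for r in rows:
            Xp[r] = rng.permutation(Xp[r])
        vp = pc1(Xp)
        null.extend(np.corrcoef(Xp[r], vp)[0, 1] ** 2 for r in rows)
    null = np.array(null)
    return np.array([(null >= o).mean() for o in obs])

# 5 variables driven by a shared latent factor, 15 pure-noise variables.
rng = np.random.default_rng(42)
z = rng.normal(size=60)
X = np.vstack([np.array([3 * z + rng.normal(size=60) for _ in range(5)]),
               rng.normal(size=(15, 60))])
pvals = jackstraw_pvalues(X)
```

The driven rows come out with small p-values while the noise rows do not, despite all rows being used to construct the PC.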
Stern, J M
2010-01-01
This book presents our case in defense of a constructivist epistemological framework and the use of compatible statistical theory and inference tools. The basic metaphor of decision theory is the maximization of a gambler's expected fortune, according to his own subjective utility, prior beliefs and learned experiences. This metaphor has proven to be very useful, leading the development of Bayesian statistics since its twentieth-century revival, rooted in the work of de Finetti, Savage and others. The basic metaphor presented in this text, as a foundation for cognitive constructivism, is that of an eigen-solution, and the verification of its objective epistemic status. The FBST - Full Bayesian Significance Test - is the cornerstone of a set of statistical tools conceived to assess the epistemic value of such eigen-solutions, according to their four essential attributes, namely, sharpness, stability, separability and composability. We believe that this alternative perspective, complementary to the one offered by dec...
McClure, Mark; Chiu, Kitkwan; Ranganath, Rajesh
2016-01-01
In this study, we develop a statistical method for identifying induced seismicity from large datasets and apply the method to decades of wastewater disposal and seismicity data in California and Oklahoma. The method is robust against a variety of potential pitfalls. The study regions are divided into gridblocks. We use a longitudinal study design, seeking associations between seismicity and wastewater injection along time-series within each gridblock. The longitudinal design helps control for non-random application of wastewater injection. We define a statistical model that is flexible enough to describe the seismicity observations, which have temporal correlation and high kurtosis. In each gridblock, we find the maximum likelihood estimate for a model parameter that relates induced seismicity hazard to total volume of wastewater injected each year. To assess significance, we compute likelihood ratio test statistics in each gridblock and each state, California and Oklahoma. Resampling is used to empirically d...
Jakobsen, Janus Christian; Wetterslev, Jørn; Winkel, Per;
2014-01-01
BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS: Methodologies for assessing statistical and clinical significance of intervention effects in systematic reviews were considered. Balancing simplicity and comprehensiveness, an operational procedure was developed, based mainly on The Cochrane Collaboration methodology and the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews: (1) Obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most…
Tornetta Paul
2008-01-01
Background: Evidence-based medicine posits that health care research is founded upon clinically important differences in patient-centered outcomes. Statistically significant differences between two treatments may not necessarily reflect a clinically important difference. We aimed to quantify the sample sizes and magnitude of treatment effects in a review of orthopaedic randomized trials with statistically significant findings. Methods: We conducted a comprehensive search (PubMed, Cochrane) for all randomized controlled trials between 1/1/95 and 12/31/04. Eligible studies included those that focused upon orthopaedic trauma. Baseline characteristics and treatment effects were abstracted by two reviewers. Briefly, for continuous outcome measures (i.e., functional scores), we calculated effect sizes (mean difference/standard deviation). Dichotomous variables (i.e., infection, nonunion) were summarized as absolute risk differences and relative risk reductions (RRR). Effect sizes >0.80 and RRRs >50% were defined as large effects. Using regression analysis we examined the association between the total number of outcome events and treatment effect (dichotomous outcomes). Results: Our search yielded 433 randomized controlled trials (RCTs), of which 76 RCTs with statistically significant findings on 184 outcomes (122 continuous/62 dichotomous outcomes) met study eligibility criteria. The mean effect size across studies with continuous outcome variables was 1.7 (95% confidence interval: 1.43–1.97). For dichotomous outcomes, the mean risk difference was 30% (95% confidence interval: 24%–36%) and the mean relative risk reduction was 61% (95% confidence interval: 55%–66%; range: 0%–97%). Fewer total outcome events in studies was strongly correlated with increasing magnitude of the treatment effect (Pearson's R = -0.70, p …). Conclusion: Our review suggests that statistically significant results in orthopaedic trials have the following implications: (1) On average…
Xia, Li C; Ai, Dongmei; Cram, Jacob; Fuhrman, Jed A; Sun, Fengzhu
2013-01-15
Local similarity analysis of biological time series data helps elucidate the varying dynamics of biological systems. However, its applications to large scale high-throughput data are limited by slow permutation procedures for statistical significance evaluation. We developed a theoretical approach to approximate the statistical significance of local similarity analysis based on the approximate tail distribution of the maximum partial sum of independent identically distributed (i.i.d.) random variables. Simulations show that the derived formula approximates the tail distribution reasonably well (starting at time points > 10 with no delay and > 20 with delay) and provides P-values comparable with those from permutations. The new approach enables efficient calculation of statistical significance for pairwise local similarity analysis, making possible all-to-all local association studies otherwise prohibitive. As a demonstration, local similarity analysis of human microbiome time series shows that core operational taxonomic units (OTUs) are highly synergetic and some of the associations are body-site specific across samples. The new approach is implemented in our eLSA package, which now provides pipelines for faster local similarity analysis of time series data. The tool is freely available from eLSA's website: http://meta.usc.edu/softs/lsa. Supplementary data are available at Bioinformatics online. fsun@usc.edu.
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo
2016-02-01
Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple `fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
Rudd, James; Moore, Jason H; Urbanowicz, Ryan J
2013-11-01
Permutation-based statistics for evaluating the significance of class prediction, predictive attributes, and patterns of association have only appeared within the learning classifier system (LCS) literature since 2012. While still not widely utilized by the LCS research community, formal evaluations of test statistic confidence are imperative to large and complex real world applications such as genetic epidemiology where it is standard practice to quantify the likelihood that a seemingly meaningful statistic could have been obtained purely by chance. LCS algorithms are relatively computationally expensive on their own. The compounding requirements for generating permutation-based statistics may be a limiting factor for some researchers interested in applying LCS algorithms to real world problems. Technology has made LCS parallelization strategies more accessible and thus more popular in recent years. In the present study we examine the benefits of externally parallelizing a series of independent LCS runs such that permutation testing with cross validation becomes more feasible to complete on a single multi-core workstation. We test our python implementation of this strategy in the context of a simulated complex genetic epidemiological data mining problem. Our evaluations indicate that as long as the number of concurrent processes does not exceed the number of CPU cores, the speedup achieved is approximately linear.
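The parallelization strategy described above, farming out independent permutation runs, can be sketched generically. The toy statistic below (difference of group means under shuffled labels) stands in for a full LCS run; none of this is the authors' actual implementation.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def one_permutation_run(seed):
    """Stand-in for one independent run on permuted labels: shuffle the
    class labels and return a toy test statistic (difference of group
    means). A real LCS run would train the classifier system here."""
    rng = random.Random(seed)
    labels = [0] * 50 + [1] * 50
    rng.shuffle(labels)
    values = list(range(100))
    g1 = [v for v, lab in zip(values, labels) if lab]
    g0 = [v for v, lab in zip(values, labels) if not lab]
    return abs(sum(g1) / len(g1) - sum(g0) / len(g0))

# Farm 200 independent permutation runs out to a worker pool.
with ThreadPoolExecutor(max_workers=4) as pool:
    null_stats = list(pool.map(one_permutation_run, range(200)))

# Observed statistic with the true (unshuffled) labels: first 50 vs last 50.
observed = abs(sum(range(50, 100)) / 50 - sum(range(50)) / 50)
p_value = (1 + sum(s >= observed for s in null_stats)) / (1 + len(null_stats))
```

Threads suffice for this tiny statistic; CPU-bound LCS runs would instead use `ProcessPoolExecutor` or separate jobs, keeping the number of concurrent processes at or below the number of CPU cores, as the study recommends.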
Madsen, Tobias
2017-01-01
…are used to scale the aforementioned driver detection methods to a dataset consisting of more than 2,000 cancer genomes. The sizes and dimensionalities of genomic data sets, be it a large number of genes or multiple heterogeneous data sources, pose both great statistical opportunities and challenges. … This distribution can be learned across the entire set of genes and then be used to improve inference on the level of the individual gene. A practical way to implement this insight is using empirical Bayes. This idea is one of the main statistical underpinnings of the present work. The thesis consists of three main manuscripts as well as two supplementary manuscripts. In the first manuscript we explore efficient significance evaluation for models defined with factor graphs. Factor graphs are a class of graphical models encompassing both Bayesian networks and Markov models. We specifically develop a saddle…
Statistically Non-significant Papers in Environmental Health Studies included more Outcome Variables
Pentti Nieminen; Khaled Abass; Kirsi Vähäkangas; Arja Rautio
2015-01-01
Objective The number of analyzed outcome variables is important in the statistical analysis and interpretation of research findings. This study investigated published papers in the field of environmental health studies. We aimed to examine whether differences in the number of reported outcome variables exist between papers with non-significant findings compared to those with significant findings. Articles on the maternal exposure to mercury and child development were used as examples. Methods Articles published between 1995 and 2013 focusing on the relationships between maternal exposure to mercury and child development were collected from Medline and Scopus. Results Of 87 extracted papers, 73 used statistical significance testing and 38 (43.7%) of these reported ‘non-significant’ (P>0.05) findings. The median number of child development outcome variables in papers reporting ‘significant’ (n=35) and ‘non-significant’ (n=38) results was 4 versus 7, respectively (Mann-Whitney test P-value=0.014). An elevated number of outcome variables was especially found in papers reporting non-significant associations between maternal mercury and outcomes when mercury was the only analyzed exposure variable. Conclusion Authors often report analyzed health outcome variables based on their P-values rather than on stated primary research questions. Such a practice probably skews the research evidence.
How to get statistically significant effects in any ERP experiment (and why you shouldn't).
Luck, Steven J; Gaspelin, Nicholas
2017-01-01
ERP experiments generate massive datasets, often containing thousands of values for each participant, even after averaging. The richness of these datasets can be very useful in testing sophisticated hypotheses, but this richness also creates many opportunities to obtain effects that are statistically significant but do not reflect true differences among groups or conditions (bogus effects). The purpose of this paper is to demonstrate how common and seemingly innocuous methods for quantifying and analyzing ERP effects can lead to very high rates of significant but bogus effects, with the likelihood of obtaining at least one such bogus effect exceeding 50% in many experiments. We focus on two specific problems: using the grand-averaged data to select the time windows and electrode sites for quantifying component amplitudes and latencies, and using one or more multifactor statistical analyses. Reanalyses of prior data and simulations of typical experimental designs are used to show how these problems can greatly increase the likelihood of significant but bogus results. Several strategies are described for avoiding these problems and for increasing the likelihood that significant effects actually reflect true differences among groups or conditions.
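The headline claim, that testing many windows or sites inflates the chance of at least one bogus significant effect past 50%, is easy to verify by simulation. The sketch below uses 20 independent tests with no true effect anywhere; it is a simplified illustration, not the authors' reanalysis (real ERP windows are correlated, which changes the exact rate but not the qualitative point).

```python
import math
import random

def two_sample_p(x, y):
    """Two-sided p-value for a difference in means, using a normal
    approximation (adequate for this demonstration)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    z = (mx - my) / math.sqrt(vx / nx + vy / ny)
    return math.erfc(abs(z) / math.sqrt(2))

def familywise_error_rate(n_windows=20, n_sub=30, n_sims=500, seed=0):
    """Fraction of simulated null experiments (no true effect anywhere)
    in which at least one of n_windows tests is 'significant' at 0.05."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        for _ in range(n_windows):
            x = [rng.gauss(0, 1) for _ in range(n_sub)]
            y = [rng.gauss(0, 1) for _ in range(n_sub)]
            if two_sample_p(x, y) < 0.05:
                hits += 1
                break
    return hits / n_sims

fwe = familywise_error_rate()  # analytically, near 1 - 0.95**20 ≈ 0.64
```

With 20 chances at alpha = 0.05, well over half of null experiments produce at least one "significant" effect.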
Statistical significance estimation of a signal within the GooFit framework on GPUs
Cristella, Leonardo; Di Florio, Adriano; Pompili, Alexis
2017-03-01
In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data-analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides a striking speed-up with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service with a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks theorem may or may not apply because its regularity conditions are not satisfied.
Deriving statistical significance maps for SVM based image classification and group comparisons.
Gaonkar, Bilwaj; Davatzikos, Christos
2012-01-01
Population based pattern analysis and classification for quantifying structural and functional differences between diverse groups has been shown to be a powerful tool for the study of a number of diseases, and is quite commonly used especially in neuroimaging. The alternative to these pattern analysis methods, namely mass univariate methods such as voxel based analysis and all related methods, cannot detect multivariate patterns associated with group differences, and are not particularly suitable for developing individual-based diagnostic and prognostic biomarkers. A commonly used pattern analysis tool is the support vector machine (SVM). Unlike univariate statistical frameworks for morphometry, analytical tools for statistical inference are unavailable for the SVM. In this paper, we show that null distributions ordinarily obtained by permutation tests using SVMs can be analytically approximated from the data. The analytical computation takes a small fraction of the time it takes to do an actual permutation test, thereby rendering it possible to quickly create statistical significance maps derived from SVMs. Such maps are critical for understanding imaging patterns of group differences and interpreting which anatomical regions are important in determining the classifier's decision.
RT-PSM, a real-time program for peptide-spectrum matching with statistical significance.
Wu, Fang-Xiang; Gagné, Pierre; Droit, Arnaud; Poirier, Guy G
2006-01-01
The analysis of complex biological peptide mixtures by tandem mass spectrometry (MS/MS) produces a huge body of collision-induced dissociation (CID) MS/MS spectra. Several methods have been developed for identifying peptide-spectrum matches (PSMs) by assigning MS/MS spectra to peptides in a database. However, most of these methods either do not give the statistical significance of PSMs (e.g., SEQUEST) or employ time-consuming computational methods to estimate the statistical significance (e.g., PeptideProphet). In this paper, we describe a new algorithm, RT-PSM, which can be used to identify PSMs and estimate their accuracy statistically in real time. RT-PSM first computes PSM scores between an MS/MS spectrum and a set of candidate peptides whose masses are within a preset tolerance of the MS/MS precursor ion mass. Then the computed PSM scores of all candidate peptides are employed to fit the expectation value distribution of the scores into a second-degree polynomial function in PSM score. The statistical significance of the best PSM is estimated by extrapolating the fitting polynomial function to the best PSM score. RT-PSM was tested on two pairs of MS/MS spectrum datasets and protein databases to investigate its performance. The MS/MS spectra were acquired using an ion trap mass spectrometer equipped with a nano-electrospray ionization source. The results show that RT-PSM has good sensitivity and specificity. Using a 55,577-entry protein database and running on a standard Pentium-4, 2.8-GHz CPU personal computer, RT-PSM can process peptide spectra on a sequential, one-by-one basis in 0.047 s on average, compared to more than 7 s per spectrum on average for Sequest and X!Tandem, in their current batch-mode processing implementations. RT-PSM is clearly shown to be fast enough for real-time PSM assignment of MS/MS spectra generated every 3 s or so by a 3D ion trap or by a QqTOF instrument.
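The fit-and-extrapolate step described above can be sketched as follows. This is an illustration of the idea only: the empirical "expectation value" (number of candidate peptides scoring at least s) is fit with a quadratic in the score, excluding the top hit, and the fit is extrapolated to the best score. RT-PSM's actual scoring function and fitting details differ.

```python
import numpy as np

def best_psm_evalue(scores):
    """Fit log(expectation value) -- the number of candidates scoring
    >= s -- with a quadratic in s, then extrapolate to the best score
    to estimate its statistical significance."""
    scores = np.sort(np.asarray(scores, dtype=float))
    n = len(scores)
    surv = np.arange(n, 0, -1)  # candidates scoring >= each sorted value
    # Fit on everything but the top hit, which is what we want to judge.
    coeffs = np.polyfit(scores[:-1], np.log(surv[:-1]), 2)
    return float(np.exp(np.polyval(coeffs, scores[-1])))

rng = np.random.default_rng(0)
candidate_scores = np.append(rng.exponential(size=499), 20.0)  # planted top hit
e_value = best_psm_evalue(candidate_scores)
```

A top score far above the bulk of candidates extrapolates to a small expectation value, flagging it as unlikely to arise by chance.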
Jefferson, L; Cooper, E; Hewitt, C; Torgerson, T; Cook, L; Tharmanathan, P; Cockayne, S; Torgerson, D
2016-01-01
Objective Time-lag from study completion to publication is a potential source of publication bias in randomised controlled trials. This study sought to update the evidence base by identifying the effect of the statistical significance of research findings on time to publication of trial results. Design Literature searches were carried out in four general medical journals from June 2013 to June 2014 inclusive (BMJ, JAMA, the Lancet and the New England Journal of Medicine). Setting Methodological review of four general medical journals. Participants Original research articles presenting the primary analyses from phase 2, 3 and 4 parallel-group randomised controlled trials were included. Main outcome measures Time from trial completion to publication. Results The median time from trial completion to publication was 431 days (n = 208, interquartile range 278–618). A multivariable adjusted Cox model found no statistically significant difference in time to publication for trials reporting positive or negative results (hazard ratio: 0.86, 95% CI 0.64 to 1.16, p = 0.32). Conclusion In contrast to previous studies, this review did not demonstrate the presence of time-lag bias in time to publication. This may be a result of these articles being published in four high-impact general medical journals that may be more inclined to publish rapidly, whatever the findings. Further research is needed to explore the presence of time-lag bias in lower quality studies and lower impact journals. PMID:27757242
Hu, R; Hu, Rui; Wang, Bin
2000-01-01
Finding statistically significant words in DNA and protein sequences forms the basis for many genetic studies. By applying the maximal entropy principle, we give one systematic way to study the non-random occurrence of words in DNA or protein sequences. Through comparison with experimental results, it was shown that patterns of regulatory binding sites in Saccharomyces cerevisiae (yeast) genomes tend to occur significantly in the promoter regions. We studied two correlated gene families of yeast. The method successfully extracts the binding sites verified by experiments in each family. Many putative regulatory sites in the upstream regions are proposed. The study also suggested that some regulatory sites are active in both directions, while others show directional preference.
Crow, C.J.
1985-01-01
Middle Ordovician age Chickamauga Group carbonates crop out along the Birmingham and Murphrees Valley anticlines in central Alabama. The macrofossil contents on exposed surfaces of seven bioherms have been counted to determine their various paleontologic characteristics. Twelve groups of organisms are present in these bioherms. Dominant organisms include bryozoans, algae, brachiopods, sponges, pelmatozoans, stromatoporoids and corals. Minor accessory fauna include predators, scavengers and grazers such as gastropods, ostracods, trilobites, cephalopods and pelecypods. Vertical and horizontal niche zonation has been detected for some of the bioherm dwelling fauna. No one bioherm of those studied exhibits all 12 groups of organisms; rather, individual bioherms display various subsets of the total diversity. Statistical treatment (G-test) of the diversity data indicates a lack of statistical homogeneity of the bioherms, both within and between localities. Between-locality population heterogeneity can be ascribed to differences in biologic responses to such gross environmental factors as water depth and clarity, and energy levels. At any one locality, gross aspects of the paleoenvironments are assumed to have been more uniform. Significant differences among bioherms at any one locality may have resulted from patchy distribution of species populations, differential preservation and other factors.
Mining Statistically Significant Substrings Based on the Chi-Square Measure
Bhattacharya, Sourav Dutta Arnab
2010-01-01
Given the vast reservoirs of data stored worldwide, efficient mining of data from a large information store has emerged as a great challenge. Many databases, like those of intrusion detection systems, web-click records, player statistics, texts, proteins, etc., store strings or sequences. Searching for an unusual pattern within such long strings of data has emerged as a requirement for diverse applications. Given a string, the problem then is to identify the substrings that differ the most from the expected or normal behavior, i.e., the substrings that are statistically significant. In other words, these substrings are less likely to occur due to chance alone and may point to some interesting information or phenomenon that warrants further exploration. To this end, we use the chi-square measure. We propose two heuristics for retrieving the top-k substrings with the largest chi-square measure. We show that the algorithms outperform other competing algorithms in runtime, while maintaining a high approximation...
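The chi-square measure of a substring can be made concrete with a small sketch: compare the substring's symbol counts against what the background probabilities predict. The brute-force scan below is exactly what the paper's heuristics are designed to avoid; it is included only to define the objective.

```python
from collections import Counter

def chi_square(sub, probs):
    """Chi-square of a substring's symbol counts against background
    probabilities: sum of (observed - expected)^2 / expected."""
    n = len(sub)
    counts = Counter(sub)
    return sum((counts.get(c, 0) - n * p) ** 2 / (n * p)
               for c, p in probs.items())

def most_significant_substring(s, probs, min_len=3):
    """Exhaustive scan over all substrings; the paper's contribution
    is heuristics that avoid this quadratic enumeration."""
    best, best_chi = None, -1.0
    for i in range(len(s)):
        for j in range(i + min_len, len(s) + 1):
            chi = chi_square(s[i:j], probs)
            if chi > best_chi:
                best, best_chi = s[i:j], chi
    return best, best_chi

# Toy data: an unusual run of 'a's inside an otherwise alternating string.
s = "ab" * 10 + "aaaaaa" + "ba" * 10
best, best_chi = most_significant_substring(s, {"a": 0.5, "b": 0.5})
# best == "aaaaaa", best_chi == 6.0: the run deviates most from 50/50.
```

For a two-symbol alphabet with equal probabilities, the measure reduces to d²/n where d is the count imbalance, so the pure run of six 'a's scores (6²)/6 = 6.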
Zhang, Pan
2014-01-01
Modularity is a popular measure of community structure. However, maximizing the modularity can lead to many competing partitions with almost the same modularity that are poorly correlated to each other; it can also overfit, producing illusory "communities" in random graphs where none exist. We address this problem by using the modularity as a Hamiltonian, and computing the marginals of the resulting Gibbs distribution. If we assign each node to its most-likely community under these marginals, we claim that, unlike the ground state, the resulting partition is a good measure of statistically-significant community structure. We propose an efficient Belief Propagation (BP) algorithm to compute these marginals. In random networks with no true communities, the system has two phases as we vary the temperature: a paramagnetic phase where all marginals are equal, and a spin glass phase where BP fails to converge. In networks with real community structure, there is an additional retrieval phase where BP converges, and ...
Statistical Significance of Non-Reproducibility of Cross Sections in Dissipative Reactions
王琦; 董玉川; 李松林; 田文栋; 李志常; 路秀琴; 赵葵; 符长波; 刘建成; 姜华; 胡桂青
2003-01-01
Two independent excitation function measurements have been performed on the reaction system 19F + 93Nb using two target foils of the same nominal thickness. We measured the dissipative reaction products at incident energies of 102 through 108 MeV in steps of 250 keV. The variance of the energy autocorrelation functions of the reaction products was found to be three times that originating from the randomized counting rates. By analysing the probability distributions of the deviations in the measured cross sections, we found that about 20% of all the deviations exceed three standard deviations. This indicates that the non-reproducibility of the cross sections in the two independent measurements is statistically significant and does not originate from random fluctuations of the counting rates.
Henderson, Douglas
2010-01-01
Henry Eyring was, and still is, a towering figure in science. Some aspects of his life and science, beginning in Mexico and continuing in Arizona, California, Wisconsin, Germany, Princeton, and finally Utah, are reviewed here. Eyring moved gradually from quantum theory toward statistical mechanics and the theory of liquids, motivated in part by his desire to understand reactions in condensed matter. Significant structure theory, while not as successful as Eyring thought, is better than his critics realize. Eyring won many awards. However, most chemists are surprised, if not shocked, that he was never awarded a Nobel Prize. He joined Lise Meitner, Rosalind Franklin, John Slater, and others, in an even more select group, those who should have received a Nobel Prize but did not.
A Network-Based Method to Assess the Statistical Significance of Mild Co-Regulation Effects
Horvát, Emőke-Ágnes; Zhang, Jitao David; Uhlmann, Stefan; Sahin, Özgür; Zweig, Katharina Anna
2013-01-01
Recent development of high-throughput, multiplexing technology has initiated projects that systematically investigate interactions between two types of components in biological networks, for instance transcription factors and promoter sequences, or microRNAs (miRNAs) and mRNAs. In terms of network biology, such screening approaches primarily attempt to elucidate relations between biological components of two distinct types, which can be represented as edges between nodes in a bipartite graph. However, it is often desirable not only to determine regulatory relationships between nodes of different types, but also to understand the connection patterns of nodes of the same type. Especially interesting is the co-occurrence of two nodes of the same type, i.e., the number of their common neighbours, which current high-throughput screening analysis fails to address. The co-occurrence gives the number of circumstances under which both of the biological components are influenced in the same way. Here we present SICORE, a novel network-based method to detect pairs of nodes with a statistically significant co-occurrence. We first show the stability of the proposed method on artificial data sets: when randomly adding and deleting observations we obtain reliable results even with noise exceeding the expected level in large-scale experiments. Subsequently, we illustrate the viability of the method based on the analysis of a proteomic screening data set to reveal regulatory patterns of human microRNAs targeting proteins in the EGFR-driven cell cycle signalling system. Since statistically significant co-occurrence may indicate functional synergy and the mechanisms underlying canalization, and thus hold promise in drug target identification and therapeutic development, we provide a platform-independent implementation of SICORE with a graphical user interface as a novel tool in the arsenal of high-throughput screening analysis. PMID:24039936
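The co-occurrence the abstract defines — the number of common neighbours of two same-type nodes in a bipartite graph — can be counted directly. A minimal sketch (the edge-list format and names are our own assumptions; SICORE's actual significance computation is not reproduced here):

```python
from itertools import combinations

def co_occurrence(bipartite_edges):
    """Common-neighbour counts for each pair of same-type nodes in a
    bipartite graph given as (regulator, target) edge tuples."""
    neighbours = {}
    for regulator, target in bipartite_edges:
        neighbours.setdefault(regulator, set()).add(target)
    return {
        (a, b): len(neighbours[a] & neighbours[b])
        for a, b in combinations(sorted(neighbours), 2)
    }

# Hypothetical miRNA -> target-protein edges:
edges = [("miR-1", "p1"), ("miR-1", "p2"), ("miR-2", "p2"), ("miR-2", "p3")]
print(co_occurrence(edges))  # {('miR-1', 'miR-2'): 1}
```

SICORE's contribution is then assessing which of these counts are statistically significant rather than expected by chance.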
Kellerer-Pirklbauer, Andreas
2016-04-01
Longer data series (e.g. >10 a) of ground temperatures in alpine regions are helpful to improve the understanding of the effects of present climate change on the distribution and thermal characteristics of seasonal frost- and permafrost-affected areas. Beginning in 2004 - and more intensively since 2006 - a permafrost and seasonal frost monitoring network was established in Central and Eastern Austria by the University of Graz. This network consists of c. 60 ground temperature (surface and near-surface) monitoring sites, located at 1922-3002 m a.s.l., at latitude 46°55'-47°22'N and longitude 12°44'-14°41'E. These data allow conclusions about general ground thermal conditions, potential permafrost occurrence, trends during the observation period, and regional patterns of change. Calculations and analyses of several different temperature-related parameters were accomplished. At the annual scale, a region-wide statistically significant warming during the observation period was revealed by, e.g., an increase in mean annual temperature values (mean, maximum) and a significant lowering of the surface frost number (F+). At the seasonal scale, in most cases no significant trend of any temperature-related parameter was revealed for spring (MAM) or autumn (SON). Winter (DJF) shows only a weak warming. In contrast, the summer (JJA) season in general reveals a significant warming, as confirmed by several different temperature-related parameters such as mean seasonal temperature, number of thawing degree days, number of freezing degree days, or days without night frost. On a monthly basis, August shows the statistically most robust and strongest warming of all months, although regional differences occur. Despite the fact that the general ground temperature warming during the last decade is confirmed by the field data in the study region, complications in trend analyses arise from temperature anomalies (e.g. the warm winter 2006/07) or substantial variations in the winter
Carr, J.R.; Roberts, K.P.
1989-02-01
Universal kriging is compared with ordinary kriging for estimation of earthquake ground motion. Ordinary kriging is based on a stationary random function model; universal kriging is based on a nonstationary random function model representing first-order drift. The accuracy of universal kriging is compared with that of ordinary kriging, with cross-validation used as the basis for comparison. Hypothesis testing on these results shows that the accuracy obtained using universal kriging is not significantly different from that obtained using ordinary kriging. Tests based on normal distribution assumptions are applied to errors measured in the cross-validation procedure; t and F tests reveal no evidence to suggest universal and ordinary kriging differ for estimation of earthquake ground motion. Nonparametric hypothesis tests applied to these errors and jackknife statistics yield the same conclusion: universal and ordinary kriging are not significantly different for this application as determined by a cross-validation procedure. These results are based on application to four independent data sets (four different seismic events).
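The comparison can be sketched as a paired t statistic on the two estimators' cross-validation errors at the same points. The data below are invented for illustration; the paper's actual ground-motion data and test thresholds are not reproduced.

```python
import math
import statistics

def paired_t_statistic(errors_a, errors_b):
    """Paired t statistic for two estimators' cross-validation errors."""
    diffs = [a - b for a, b in zip(errors_a, errors_b)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)
    return mean_d / (sd_d / math.sqrt(len(diffs)))

# Hypothetical absolute cross-validation errors at the same 6 points:
universal = [0.12, 0.08, 0.15, 0.10, 0.09, 0.14]
ordinary  = [0.11, 0.09, 0.14, 0.10, 0.10, 0.15]
t = paired_t_statistic(universal, ordinary)
print(f"t = {t:.3f}")  # small |t| -> no evidence the methods differ
```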
Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior
2011-09-23
Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs), regions inside exons that are enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases, determined by codon usage and constraints, that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
Hojat, Mohammadreza; Xu, Gang
2004-01-01
Effect Sizes (ES) are an increasingly important index used to quantify the degree of practical significance of study results. This paper gives an introduction to the computation and interpretation of effect sizes from the perspective of the consumer of the research literature. The key points made are: 1. The ES is a useful indicator of the practical (clinical) importance of research results that can be operationally defined as ranging from "negligible" to "moderate" to "important". 2. The ES has two advantages over statistical significance testing: (a) it is independent of the size of the sample; (b) it is a scale-free index. Therefore, the ES can be uniformly interpreted in different studies regardless of the sample size and the original scales of the variables. 3. Calculations of the ES are illustrated using examples of comparisons between two means, correlation coefficients, chi-square tests and two proportions, along with the appropriate formulas. 4. Operational definitions for the ESs are given, along with numerical examples for the purpose of illustration.
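A minimal sketch of the most common ES for a comparison of two means, Cohen's d with a pooled standard deviation. The numbers are invented for illustration; the example also demonstrates point 2(a), independence from sample size.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d for two independent groups, using the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# A 5-point difference on a scale with SD 15 is the same effect size
# whether each group has 50 or 5000 subjects:
print(round(cohens_d(105, 15, 50, 100, 15, 50), 2))      # 0.33
print(round(cohens_d(105, 15, 5000, 100, 15, 5000), 2))  # 0.33
```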
Sassenhagen, Jona; Alday, Phillip M
2016-11-01
Experimental research on behavior and cognition frequently rests on stimulus or subject selection where not all characteristics can be fully controlled, even when attempting strict matching. For example, when contrasting patients to controls, variables such as intelligence or socioeconomic status are often correlated with patient status. Similarly, when presenting word stimuli, variables such as word frequency are often correlated with primary variables of interest. One procedure very commonly employed to control for such nuisance effects is conducting inferential tests on confounding stimulus or subject characteristics. For example, if word length is not significantly different for two stimulus sets, they are considered as matched for word length. Such a test has high error rates and is conceptually misguided. It reflects a common misunderstanding of statistical tests: interpreting significance as referring not to inference about a particular population parameter, but to (1) the sample in question, or (2) the practical relevance of a sample difference (so that a nonsignificant test is taken to indicate evidence for the absence of relevant differences). We show inferential testing for assessing nuisance effects to be inappropriate both pragmatically and philosophically, present a survey showing its high prevalence, and briefly discuss an alternative in the form of regression including nuisance variables.
Homeopathy: statistical significance versus the sample size in experiments with Toxoplasma gondii
Ana Lúcia Falavigna Guilherme
2011-09-01
Introduction: Toxoplasmosis is a zoonosis that represents a serious public health problem, caused by Toxoplasma gondii, which affects 20-90% of the world's human population [1,2]. It is a particularly serious problem when considering congenital transmission, due to congenital sequelae. Treatment with highly diluted substances is one of the alternative/complementary medicines most employed in the world [3,4]. The current ethical rules regarding the number of animals used in animal experimental protocols, together with the use of more conservative statistical methods [5], cannot enhance the biological effects of highly diluted substances observed by the experience of the researcher. Aim: To evaluate the minimum number of animals per group needed to achieve a significant difference between the groups of animals treated with the biotherapic T. gondii and infected with the protozoan, regarding the number of cysts observed in the brain. Material and methods: A blind randomized controlled trial was performed using eleven Swiss male mice, aged 57 days, divided into two groups: BIOT-200DH, treated with the biotherapic (n=6), and CONTROL, treated with 7% hydroalcoholic solution (n=7). The animals of the BIOT-200DH group were treated for 3 consecutive days with a single dose of 0.1 ml/dose/day and were then orally infected with 20 cysts of ME49-T. gondii. The animals of the control group were treated with 7% cereal alcohol (n=7) for 3 consecutive days and then were infected orally with 20 cysts of ME49-T. gondii. The biotherapic 200DH T. gondii was prepared from homogenized mouse brain with 20 cysts of T. gondii/100 µL, according to the Brazilian Homeopathic Pharmacopoeia [6], in laminar flow. Sixty days post-infection the animals were killed in a chamber saturated with halothane, and the brains were homogenized and resuspended in 1 ml of saline solution. Cysts were counted in 25 ml of this suspension, covered with a 24x24 mm coverglass
WANG Hanjie; SHI Weilai; CHEN Xiaohong
2006-01-01
The West Development Policy being implemented in China is causing significant land use and land cover (LULC) changes in West China. With the up-to-date satellite database of the Global Land Cover Characteristics Database (GLCCD) that characterizes the lower boundary conditions, the regional climate model RIEMS-TEA is used to simulate possible impacts of the significant LULC variation. The model was run for five continuous three-month periods from 1 June to 1 September of 1993, 1994, 1995, 1996, and 1997, and the results of the five groups are examined by means of a Student's t-test to identify the statistical significance of regional climate variation. The main results are: (1) The regional climate is affected by the LULC variation because the equilibrium of water and heat transfer in the air-vegetation interface is changed. (2) The integrated impact of the LULC variation on regional climate is not limited to West China, where the LULC varies, but extends to some areas in the model domain where the LULC does not vary at all. (3) The East Asian monsoon system and its vertical structure are adjusted by the large-scale LULC variation in western China, where the consequences are the enhancement of the westward water vapor transfer from the east coast and the relevant increase of wet-hydrostatic energy in the middle-upper atmospheric layers. (4) The ecological engineering in West China significantly affects the regional climate in Northwest China, North China and the middle-lower reaches of the Yangtze River; there are obvious effects in South, Northeast, and Southwest China, but minor effects in Tibet.
Blalock Eric M
2007-07-01
Background: Researchers using RNA expression microarrays in experimental designs with more than two treatment groups often identify statistically significant genes with ANOVA approaches. However, the ANOVA test does not discriminate which of the multiple treatment groups differ from one another. Thus, post hoc tests, such as linear contrasts, template correlations, and pairwise comparisons, are used. Linear contrasts and template correlations work extremely well, especially when the researcher has a priori information pointing to a particular pattern/template among the different treatment groups. Further, all pairwise comparisons can be used to identify particular, treatment group-dependent patterns of gene expression. However, these approaches are biased by the researcher's assumptions, and some treatment-based patterns may fail to be detected using them. Finally, different patterns may have different probabilities of occurring by chance, importantly influencing researchers' conclusions about a pattern and its constituent genes. Results: We developed a four-step, post hoc pattern matching (PPM) algorithm to automate single-channel gene expression pattern identification/significance. First, 1-way analysis of variance (ANOVA), coupled with post hoc 'all pairwise' comparisons, is calculated for all genes. Second, for each ANOVA-significant gene, all pairwise contrast results are encoded to create unique pattern ID numbers. The number of genes found in each pattern in the data is identified as that pattern's 'actual' frequency. Third, using Monte Carlo simulations, those patterns' frequencies are estimated in random data (the 'random' gene pattern frequency). Fourth, a Z-score for overrepresentation of the pattern is calculated ('actual' against 'random' gene pattern frequencies). We wrote a Visual Basic program (StatiGen) that automates the PPM procedure, constructs an Excel workbook with standardized graphs of overrepresented patterns, and lists of
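The fourth step — a Z-score comparing a pattern's 'actual' frequency against its Monte Carlo 'random' frequencies — can be sketched as follows. The names and numbers are our own illustration (StatiGen itself is a Visual Basic program).

```python
import statistics

def pattern_zscore(actual_count, random_counts):
    """Z-score of the 'actual' pattern frequency against the
    Monte Carlo ('random') pattern frequencies."""
    mu = statistics.mean(random_counts)
    sigma = statistics.stdev(random_counts)
    return (actual_count - mu) / sigma

# A pattern seen in 14 genes, versus counts from simulated random data:
random_counts = [8, 10, 12, 10, 10]
print(round(pattern_zscore(14, random_counts), 2))  # 2.83
```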
Helmut Kern
2012-03-01
Aging is a multifactorial process characterized by a decline in muscle mass and performance. Several factors, including reduced exercise, poor nutrition and modified hormonal metabolism, are responsible for changes in the rates of protein synthesis and degradation that drive skeletal muscle mass reduction, with a consequent decline in force generation and mobility performance. Seniors with a normal lifestyle were enrolled: two groups in Vienna (n=32) and two groups in Bratislava (n=19). All subjects were healthy and declared not to have any specific physical/disease problems. The two Vienna groups of seniors exercised for 10 weeks with two different types of training (leg press at the hospital or home-based functional electrical stimulation, h-b FES). Demographic data (age, height and weight) were recorded, and before and after the training period the patients underwent mobility functional analyses and muscle biopsies. The mobility functional analyses were: 1. gait speed (10 m test at fastest speed, in m/s); 2. time needed to rise from a chair five times (5x Chair-Rise, in s); 3. Timed-Up-and-Go Test, in s; 4. Stair-Test, in s; 5. isometric measurement of quadriceps force (torque/kg, in Nm/kg); and 6. dynamic balance, in mm. Preliminary analyses of quadriceps muscle biopsies from some of the Vienna and Bratislava patients show morphometric results consistent with their functional behaviour. The statistically significant improvements in functional testing reported here demonstrate the effectiveness of h-b FES and strongly support h-b FES as a safe home-based method to improve the contractility and performance of ageing muscles.
Vujović Svetlana R.
2013-01-01
This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and for the identification of pollution sources/factors, with a view to obtaining better information about water quality and the design of monitoring networks for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and for the interpretation of a water quality data set of natural water bodies, obtained during the 2010 monitoring year for 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors, which are not directly observable. Factor analysis is applied to physico-chemical parameters of natural water bodies with the aim of classification and data summation, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, together explaining more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor, F1, accounting for 28% of the total variance, represents the hydrochemical dimension of water quality. The second factor, F2, accounting for 18% of the total variance, may be taken as a factor of water eutrophication. The third factor, F3, accounting for 17% of the total variance, represents the influence of point sources of pollution on water quality. The fourth factor, F4, accounting for 13% of the total variance, may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
Fhager, V
2000-01-01
In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator-driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We proceed towards the requested results in two steps. First, models of the number-dependent energy distributions of the neutrons that are ejected in the spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number-dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only the slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the distribution of energy of the spallation neutrons. The other model utilizes Monte Carlo methods for tracking the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogeneous mixture of core material. Our results indicate that the number dependence of the energy
Testing statistical significance scores of sequence comparison methods with structure similarity
Hulsen, T.; Vlieg, J. de; Leunissen, J.A.M.; Groenen, P.M.
2006-01-01
BACKGROUND: In the past years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search is not only determined by the algorithm but also by the statistical s
Testing statistical significance scores of sequence comparison methods with structure similarity
Hulsen, T.; Vlieg, de J.; Leunissen, J.A.M.; Groenen, P.
2006-01-01
Background - In the past years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search is not only determined by the algorithm but also by the statistical
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Alves, Gelio
After the sequencing of many complete genomes, we are in a post-genomic era in which the most important task has changed from gathering genetic information to organizing the mass of data as well as understanding how components interact with each other. The former is usually undertaken using bioinformatics methods, while the latter task is generally termed proteomics. Success in both parts demands correct statistical significance assignments for the results found. In my dissertation, I study two concrete examples: global sequence alignment statistics and peptide sequencing/identification using mass spectrometry. High-performance liquid chromatography coupled to a mass spectrometer (HPLC/MS/MS), enabling peptide identifications and thus protein identifications, has become the tool of choice in large-scale proteomics experiments. Peptide identification is usually done by database search methods. The lack of robust statistical significance assignment among current methods motivated the development of a novel de novo algorithm, RAId, whose score statistics then provide statistical significance for high-scoring peptides found in our custom, enzyme-digested peptide library. The ease of incorporating post-translational modifications is another important feature of RAId. To organize the massive protein/DNA data accumulated, biologists often cluster proteins according to their similarity via tools such as sequence alignment. Homologous proteins share similar domains. To assess the similarity of two domains usually requires alignment from head to toe, i.e., a global alignment. Good alignment score statistics with an appropriate null model enable us to distinguish biologically meaningful similarity from chance similarity. There has been much progress in local alignment statistics, which characterize score statistics when alignments tend to appear as a short segment of the whole sequence. For global alignment, which is useful in domain alignment, there is still much room for
Dominic Beaulieu-Prévost
2006-03-01
For the last 50 years of research in the quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on rejection or non-rejection of the null hypothesis. However, more than 300 articles have demonstrated that this method is problematic. In summary, null hypothesis testing (NHT) is unfalsifiable, its results depend directly on sample size, and the null hypothesis is both improbable and implausible. Consequently, alternatives to NHT such as confidence intervals (CI) and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to an NHT approach and can often improve the scientific and contextual relevance of the statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power with an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect, and probabilistic undetermination). The demonstration includes a complete example.
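The three-value logic the abstract describes can be sketched as follows, assuming a normal-approximation 95% CI and a caller-chosen minimal substantial effect; the function and threshold names are our own, not the article's.

```python
import math

def ci_verdict(mean, sd, n, min_effect, z=1.96):
    """Three-value verdict from a 95% CI versus a minimal substantial effect."""
    half_width = z * sd / math.sqrt(n)
    lo, hi = mean - half_width, mean + half_width
    if lo > min_effect:
        return "probable presence of a substantial effect"
    if hi < min_effect:
        return "probable absence of a substantial effect"
    return "probabilistic undetermination"

# Effect estimates (in SD units) from three hypothetical studies, n=100,
# with 0.3 taken as the smallest effect worth caring about:
for m in (0.8, 0.1, 0.3):
    print(m, "->", ci_verdict(m, 1.0, 100, 0.3))
```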
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
The Hall current system revealed as a statistical significant pattern during fast flows
K. Snekvik
2008-11-01
We have examined the dawn-dusk component of the magnetic field, B_Y, in the night-side current sheet during fast flows in the neutral sheet. 237 h of Cluster data from the plasma sheet between 2 August 2002 and 2 October 2002 have been analysed. The spatial pattern of B_Y as a function of the distance from the centre of the current sheet has been estimated by using a Harris current sheet model. We have used the average slopes of these patterns to estimate earthward and tailward currents. For earthward fast flows there is, on average, a tailward current in the inner central plasma sheet and an earthward current in the outer central plasma sheet. For tailward fast flows the currents are oppositely directed. These observations are interpreted as signatures of Hall currents in the reconnection region, or as field-aligned currents connected with them. Although fast flows are often associated with a dawn-dusk current wedge, we believe that we have managed to filter out such currents from our statistical patterns.
Krumbholz, Aniko; Anielski, Patricia; Gfrerer, Lena; Graw, Matthias; Geyer, Hans; Schänzer, Wilhelm; Dvorak, Jiri; Thieme, Detlef
2014-01-01
Clenbuterol is a well-established β2-agonist, which is prohibited in sports and strictly regulated for use in the livestock industry. During the last few years, clenbuterol-positive results in doping controls and in samples from residents of, or travellers from, high-risk countries were suspected to be related to the illegal use of clenbuterol for fattening. A sensitive liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed to detect low clenbuterol residues in hair, with a detection limit of 0.02 pg/mg. A sub-therapeutic application study and a field study with volunteers who have a high risk of contamination were performed. For the application study, a total dosage of 30 µg clenbuterol was applied to 20 healthy volunteers on 5 subsequent days. One month after the beginning of the application, clenbuterol was detected in the proximal hair segment (0-1 cm) in concentrations between 0.43 and 4.76 pg/mg. For the second part, samples from 66 Mexican soccer players were analyzed. In 89% of these volunteers, clenbuterol was detectable in their hair at concentrations between 0.02 and 1.90 pg/mg. A comparison of both parts showed no statistical difference between sub-therapeutic application and contamination. In contrast, discrimination from a typical abuse of clenbuterol is apparently possible. Based on these findings, results of real doping control samples can be evaluated. Copyright © 2014 John Wiley & Sons, Ltd.
Thompson, Bruce; Snyder, Patricia A.
1998-01-01
Investigates two aspects of research analyses in quantitative research studies reported in the 1996 issues of "Journal of Counseling & Development" (JCD). Acceptable methodological practice regarding significance testing and evaluation of score reliability has evolved considerably. Contemporary thinking on these issues is described; practice as…
Hojat, Mohammadreza; Xu, Gang
2004-01-01
Effect Sizes (ES) are an increasingly important index used to quantify the degree of practical significance of study results. This paper gives an introduction to the computation and interpretation of effect sizes from the perspective of the consumer of the research literature. The key points made are: (1) "ES" is a useful indicator of the…
Massey, J. L.
1976-01-01
The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
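Massey's extension of the confidence-interval notion is not given in the abstract. As a point of comparison, here is a minimal stdlib sketch of the standard exact (Clopper-Pearson) upper confidence bound, which addresses the same few-observed-errors situation: with only k decoding errors in n trials, the bound is found by bisection on the binomial CDF rather than by a normal approximation.

```python
import math

def log_binom_pmf(i, n, p):
    """Log of C(n, i) * p^i * (1-p)^(n-i), computed in log space for stability."""
    return (math.lgamma(n + 1) - math.lgamma(i + 1) - math.lgamma(n - i + 1)
            + i * math.log(p) + (n - i) * math.log1p(-p))

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p); k is assumed small, so the sum is cheap."""
    return sum(math.exp(log_binom_pmf(i, n, p)) for i in range(k + 1))

def clopper_pearson_upper(k, n, alpha=0.05):
    """Upper one-sided (1 - alpha) confidence bound on the error probability,
    given k observed errors in n trials, by bisection on the binomial CDF."""
    lo, hi = 1e-12, 0.5
    for _ in range(200):
        mid = (lo + hi) / 2
        if binom_cdf(k, n, mid) > alpha:
            lo = mid       # true bound lies above mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For example, zero errors in a million trials already bounds the error probability below about 3e-6 at 95% confidence, and two errors push the bound only to about 6.3e-6, illustrating the abstract's point about the significance of a handful of errors.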
WISCOD: A Statistical Web-Enabled Tool for the Identification of Significant Protein Coding Regions
Mireia Vilardell
2014-01-01
Full Text Available Classically, gene prediction programs are based on detecting signals such as boundary sites (splice sites, starts, and stops) and coding regions in the DNA sequence in order to build potential exons and join them into a gene structure. Although nowadays it is possible to improve their performance with additional information from related species or/and cDNA databases, further improvement at any step could help to obtain better predictions. Here, we present WISCOD, a web-enabled tool for the identification of significant protein coding regions, a novel software tool that tackles the exon prediction problem in eukaryotic genomes. WISCOD has the capacity to detect real exons from large lists of potential exons, and it provides an easy-to-use global P-value, called the expected probability of being a false exon (EPFE), that is useful for ranking potential exons in a probabilistic framework, without additional computational costs. The advantage of our approach is that it significantly increases the specificity and sensitivity (both between 80% and 90%) in comparison to other ab initio methods (where they are in the range of 70–75%). WISCOD is written in JAVA and R and is available to download and to run in a local mode on Linux and Windows platforms.
Oshima, T. C.; Raju, Nambury S.; Nanda, Alice O.
2006-01-01
A new item parameter replication method is proposed for assessing the statistical significance of the noncompensatory differential item functioning (NCDIF) index associated with the differential functioning of items and tests framework. In this new method, a cutoff score for each item is determined by obtaining a (1-alpha ) percentile rank score…
Saha, Ranajit; Pan, Sudip; Chattaraj, Pratim K
2016-11-05
The validity of the maximum hardness principle (MHP) is tested in the cases of 50 chemical reactions, most of which are organic in nature and exhibit anomeric effect. To explore the effect of the level of theory on the validity of MHP in an exothermic reaction, B3LYP/6-311++G(2df,3pd) and LC-BLYP/6-311++G(2df,3pd) (def2-QZVP for iodine and mercury) levels are employed. Different approximations like the geometric mean of hardness and combined hardness are considered in case there are multiple reactants and/or products. It is observed that, based on the geometric mean of hardness, while 82% of the studied reactions obey the MHP at the B3LYP level, 84% of the reactions follow this rule at the LC-BLYP level. Most of the reactions possess the hardest species on the product side. A 50% null hypothesis is rejected at a 1% level of significance.
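The rejection of the 50% null hypothesis at the 1% level can be checked with an exact one-sided binomial test. A stdlib sketch, assuming the B3LYP figure of 82% of 50 reactions corresponds to 41 successes (an inference from the stated percentages, not a number given explicitly in the abstract):

```python
from math import comb

def binom_test_one_sided(k, n, p0=0.5):
    """Exact one-sided binomial P-value: P(X >= k) for X ~ Binomial(n, p0)."""
    return sum(comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(k, n + 1))

# 82% of 50 reactions = 41 obeying the MHP; null: each obeys with probability 0.5
p = binom_test_one_sided(41, 50)
```

The resulting P-value is on the order of 1e-6, far below the 1% threshold, consistent with the abstract's conclusion.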
Justin London
2010-01-01
Full Text Available In “National Metrical Types in Nineteenth Century Art Song” Leigh Van Handel gives a sympathetic critique of William Rothstein’s claim that in western classical music of the late 18th and 19th centuries there are discernable differences in the phrasing and metrical practice of German versus French and Italian composers. This commentary (a) examines just what Rothstein means in terms of his proposed metrical typology, (b) questions Van Handel on how she has applied it to a purely melodic framework, (c) amplifies Van Handel’s critique of Rothstein, and then (d) concludes with a rumination on the reach of quantitative (i.e., statistically driven) versus qualitative claims regarding such things as “national metrical types.”
Adams, James; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen
2017-01-01
Introduction A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. Methods In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum and neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. “Leave-one-out” cross-validation was used to ensure statistical independence of results. Results and Discussion Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. However, ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate
Adams, James; Howsmon, Daniel P; Kruger, Uwe; Geis, Elizabeth; Gehn, Eva; Fimbres, Valeria; Pollard, Elena; Mitchell, Jessica; Ingram, Julie; Hellmers, Robert; Quig, David; Hahn, Juergen
2017-01-01
A number of previous studies examined a possible association of toxic metals and autism, and over half of those studies suggest that toxic metal levels are different in individuals with Autism Spectrum Disorders (ASD). Additionally, several studies found that those levels correlate with the severity of ASD. In order to further investigate these points, this paper performs the most detailed statistical analysis to date of a data set in this field. First morning urine samples were collected from 67 children and adults with ASD and 50 neurotypical controls of similar age and gender. The samples were analyzed to determine the levels of 10 urinary toxic metals (UTM). Autism-related symptoms were assessed with eleven behavioral measures. Statistical analysis was used to distinguish participants on the ASD spectrum and neurotypical participants based upon the UTM data alone. The analysis also included examining the association of autism severity with toxic metal excretion data using linear and nonlinear analysis. "Leave-one-out" cross-validation was used to ensure statistical independence of results. Average excretion levels of several toxic metals (lead, tin, thallium, antimony) were significantly higher in the ASD group. However, ASD classification using univariate statistics proved difficult due to large variability, but nonlinear multivariate statistical analysis significantly improved ASD classification with Type I/II errors of 15% and 18%, respectively. These results clearly indicate that the urinary toxic metal excretion profiles of participants in the ASD group were significantly different from those of the neurotypical participants. Similarly, nonlinear methods determined a significantly stronger association between the behavioral measures and toxic metal excretion. The association was strongest for the Aberrant Behavior Checklist (including subscales on Irritability, Stereotypy, Hyperactivity, and Inappropriate Speech), but significant associations were found
Ernesto eIacucci
2012-02-01
Full Text Available High-throughput molecular biology studies, such as microarray assays of gene expression, two-hybrid experiments for detecting protein interactions, or ChIP-Seq experiments for transcription factor binding, often result in an interesting set of genes—say, genes that are co-expressed or bound by the same factor. One way of understanding the biological meaning of such a set is to consider what processes or functions, as defined in an ontology, are over-represented (enriched) or under-represented (depleted) among genes in the set. Usually, the significance of enrichment or depletion scores is based on simple statistical models and on the membership of genes in different classifications. We consider the more general problem of computing p-values for arbitrary integer additive statistics, or weighted membership functions. Such membership functions can be used to represent, for example, prior knowledge on the role of certain genes or classifications, differential importance of different classifications or genes to the experimenter, hierarchical relationships between classifications, or different degrees of interestingness or evidence for specific genes. We describe a generic dynamic programming algorithm that can compute exact p-values for arbitrary integer additive statistics. We also describe several optimizations for important special cases, which can provide orders-of-magnitude speed up in the computations. We apply our methods to datasets describing oxidative phosphorylation and parturition and compare p-values based on computations of several different statistics for measuring enrichment. We find major differences between p-values resulting from these statistics, and that some statistics recover gold standard annotations of the data better than others. Our work establishes a theoretical and algorithmic basis for far richer notions of enrichment or depletion of gene sets with respect to gene ontologies than has previously been available.
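The paper's generic algorithm and its optimizations are not reproduced in the abstract. The core dynamic-programming idea can be sketched for one simple case: the exact null distribution of an integer-weighted sum over uniformly random k-subsets of genes, counted with a subset-sum table (a minimal illustration, not the authors' implementation).

```python
from math import comb

def exact_pvalue(weights, k, observed):
    """Exact P(sum over a uniformly random k-subset >= observed) for
    nonnegative integer weights, via DP over subset counts."""
    n, total = len(weights), sum(weights)
    # count[j][s] = number of j-subsets (of items seen so far) with weight sum s
    count = [[0] * (total + 1) for _ in range(k + 1)]
    count[0][0] = 1
    for w in weights:
        for j in range(k, 0, -1):              # descending: each item used at most once
            for s in range(total, w - 1, -1):
                count[j][s] += count[j - 1][s - w]
    favorable = sum(count[k][s] for s in range(observed, total + 1))
    return favorable / comb(n, k)
```

The table has O(k · total) entries updated once per gene, so the p-value is exact at polynomial cost, in contrast to enumerating all C(n, k) subsets.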
Le, S.Y.; Chen, J H; Maizel, J. V.
1989-01-01
RNA stem-loop structures situated just 3' to the frameshift sites of the retroviral gag-pol or gag-pro and pro-pol regions may make important contributions to frame-shifting in retroviruses. In this study, the thermodynamic stability and statistical significance of such secondary structural features relative to others in the sequence have been assessed using a newly developed method that combines calculations of the lowest free energy of formation of RNA secondary structures and Monte Carlo simulations.
Sung-Min Kim
2017-06-01
Full Text Available To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1–4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
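The Getis-Ord Gi* z-score can be sketched on a toy grid. This is a minimal stdlib illustration using binary queen-contiguity weights that include the focal cell itself (the "*" variant); the choice of neighborhood and weights here is an assumption, not the study's configuration.

```python
import math

def getis_ord_gi_star(grid):
    """Gi* z-score for every cell of a 2-D grid, with binary queen-contiguity
    weights including the focal cell. For binary weights, sum(w) == sum(w^2)."""
    rows, cols = len(grid), len(grid[0])
    vals = [v for row in grid for v in row]
    n = len(vals)
    mean = sum(vals) / n
    s = math.sqrt(sum(v * v for v in vals) / n - mean * mean)  # global std dev
    z = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            neigh = [grid[rr][cc]
                     for rr in range(max(0, r - 1), min(rows, r + 2))
                     for cc in range(max(0, c - 1), min(cols, c + 2))]
            w = len(neigh)                       # sum of binary weights
            num = sum(neigh) - mean * w
            den = s * math.sqrt((n * w - w * w) / (n - 1))
            z[r][c] = num / den
    return z
```

A cell surrounded entirely by high values gets a large positive z-score (a hot spot), while an isolated low cell in a low neighborhood gets a negative one, which is exactly the "high value surrounded by high values" criterion the abstract describes.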
Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.
2009-01-01
In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) dat...
ZHENG Yinghua; WU Yongqiu; LI Sen; TAN Lihua; GOU Shiwei; ZHANG Hongyan
2009-01-01
Widespread aeolian sediments have been found in the middle reaches of the Yarlung Zangbo River, China. The grain-size characteristics of sediments from Cha'er Section in the area were analyzed. The results show that the section includes one stratum of paleo-mobile dunes, four strata of paleo-semi-fixed dunes, two strata of paleo-fixed dunes, and one stratum of sandy immature soils. The paleo-mobile and paleo-semi-fixed dune sand in this section are similar to modern aeolian sand in either grain-size composition or Mz and σ distribution. Comparing the above dune types with each other, the content of sand substance decreases, while the content of silt and clay increases for paleo-fixed dunes and sandy immature soils. Combined with age data for each stratum, the analysis shows that these strata are the products of climate changes and the evolution of aeolian landforms. The evolutionary sequence of the paleoclimate and of aeolian activities in the valley since 8600 yr B.P. reveals four stages: 8600-5700 yr B.P., when the paleoclimate was cold and dry, with strong winds, thereby activating dunes; 5700-3600 yr B.P., when it was warm and wet, with weak winds, causing dunes to undergo soil-forming processes; 3600-1900 yr B.P., when climate shifted from cold-dry with strong winds to warm-wet with weak winds, and activated dunes were fixed again; and 1900 yr B.P.-present, when the climate became fine, with weak winds, fixing dunes again.
Jiahui Fan
2016-06-01
Full Text Available Land use profoundly changes the terrestrial ecosystem and landscape patterns, and these changes reveal the extent and scope of the ecological influence of land use on the terrestrial ecosystem. The study area selected for this research was the middle reaches of the Heihe River. Based on land use data (1986, 2000, and 2014), we proposed an ecological risk index of land use by combining a landscape disturbance index with a landscape fragility index. An exponential model was selected to perform kriging interpolation, as well as spatial autocorrelations and semivariance analyses which could reveal the spatial aggregation patterns. The results indicated that the ecological risk of the middle reaches of the Heihe River was generally high, and higher in the northwest. The high values of the ecological risk index (ERI) tended to decrease, and the low ERI values tended to increase. Positive spatial autocorrelations and a prominent scale-dependence were observed among the ERI values. The main hot areas with high-high local autocorrelations were located in the north, and the cold areas with low-low local autocorrelations were primarily located in the middle corridor plain and Qilian Mountains. From 1986 to 2014, low and relatively low ecological risk areas decreased while relatively high risk areas expanded. A middle level of ecological risk was observed in Ganzhou and Minle counties. Shandan County presented a serious polarization, with high ecological risk areas observed in the north and low ecological risk areas observed in the southern Shandan horse farm. In order to lower the ecological risk and achieve the sustainability of land use, these results suggest policies to strictly control the oasis expansion and the occupation of farmland for urbanization. Some inefficient farmland should be converted into grassland where appropriate.
Perneger, Thomas V; Combescure, Christophe
2017-07-01
Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant (P < 0.001), with the remainder falling in the P ≥ 0.001 to < 0.01, P ≥ 0.01 to < 0.05, and P ≥ 0.05 categories. We noted three irregularities: (1) a high proportion of very small P-values, (2) an excess of P-values equal to 1, and (3) about twice as many P-values less than 0.05 compared with those more than 0.05. The latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results. Copyright © 2017 Elsevier Inc. All rights reserved.
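The two reference shapes the study compares against — a uniform P-value distribution under the null and a decreasing one under the alternative — can be illustrated with a small simulation (a sketch, not the paper's method). Two-sided Z-test P-values are drawn under a zero effect and under an assumed effect of 3 standard errors.

```python
import math
import random

def two_sided_p(z):
    """Two-sided P-value of a Z statistic under the standard normal null."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

random.seed(42)
N = 20000
null_p = [two_sided_p(random.gauss(0, 1)) for _ in range(N)]   # null: effect = 0
alt_p = [two_sided_p(random.gauss(3, 1)) for _ in range(N)]    # alternative: effect = 3 SEs

frac_null_sig = sum(p < 0.05 for p in null_p) / N  # ~ alpha = 0.05 (uniform P-values)
frac_alt_sig = sum(p < 0.05 for p in alt_p) / N    # ~ 0.85 (power at 3 SEs)
```

Under the null about 5% of P-values fall below 0.05 by construction; under the alternative the distribution piles up near zero, which is why a published excess of small P-values carries information about the mix of true and false nulls and about selective reporting.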
Leitner Dietmar
2005-04-01
Full Text Available Abstract Background A reliable prediction of the Xaa-Pro peptide bond conformation would be a useful tool for many protein structure calculation methods. We have analyzed the Protein Data Bank and show that the combined use of sequential and structural information has a predictive value for the assessment of the cis versus trans peptide bond conformation of Xaa-Pro within proteins. For the analysis of the data sets different statistical methods such as the calculation of the Chou-Fasman parameters and occurrence matrices were used. Furthermore we analyzed the relationship between the relative solvent accessibility and the relative occurrence of prolines in the cis and in the trans conformation. Results One of the main results of the statistical investigations is the ranking of the secondary structure and sequence information with respect to the prediction of the Xaa-Pro peptide bond conformation. We observed a significant impact of secondary structure information on the occurrence of the Xaa-Pro peptide bond conformation, while the sequence information of amino acids neighboring proline is of little predictive value for the conformation of this bond. Conclusion In this work, we present an extensive analysis of the occurrence of the cis and trans proline conformation in proteins. Based on the data set, we derived patterns and rules for a possible prediction of the proline conformation. Upon adoption of the Chou-Fasman parameters, we are able to derive statistically relevant correlations between the secondary structure of amino acid fragments and the Xaa-Pro peptide bond conformation.
Coleman, R. N.
1977-01-01
In the broad band neutral beam at Fermilab, a search for photoproduction of charmed D mesons was done using photons of 100 to 300 GeV. The reaction considered was γ + Be → DD̄ + X, with D → leptons + ..., K⁰_S nπ±. No statistically significant evidence for D production is observed based on the K⁰_S nπ± mass spectrum. The sensitivity of the search is commensurate with theoretical estimates of σ(γp → DD̄ + X) ≈ 500 nb; however, this is dependent on branching ratios and photoproduction models. Data are given on a similar search for semileptonic decays of charmed baryons. 48 references.
Østvand, Lene; Rypdal, Martin
2013-01-01
Various interpretations of the notion of a trend in the context of global warming are discussed, contrasting the difference between viewing a trend as the deterministic response to an external forcing and viewing it as a slow variation which can be separated from the background spectral continuum of long-range persistent climate noise. The emphasis in this paper is on the latter notion, and a general scheme is presented for testing a multi-parameter trend model against a null hypothesis which models the observed climate record as an autocorrelated noise. The scheme is applied to the instrumental global sea-surface temperature record and the global land-temperature record. A trend model comprising a linear plus an oscillatory trend with period of approximately 60 yr, and the statistical significance of the trends, are tested against three different null models: first-order autoregressive process, fractional Gaussian noise, and fractional Brownian motion. The linear trend is significant in all cases, but the o...
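The paper's full scheme, with three null models, is not reproduced in the abstract. A minimal Monte Carlo sketch of the idea for the first-order autoregressive (AR(1)) null: compare the least-squares trend slope of an observed series against the slope distribution generated by a trendless AR(1) process (the series length, AR coefficient, and trend size below are illustrative assumptions).

```python
import random

def ols_slope(y):
    """Least-squares slope of y against the time index 0..n-1."""
    n = len(y)
    tbar = (n - 1) / 2
    ybar = sum(y) / n
    sxy = sum((t - tbar) * (v - ybar) for t, v in enumerate(y))
    sxx = sum((t - tbar) ** 2 for t in range(n))
    return sxy / sxx

def ar1(n, phi, sigma, rng):
    """Trendless AR(1) series: x_t = phi * x_{t-1} + white noise."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0, sigma)
        out.append(x)
    return out

rng = random.Random(0)
n, phi = 200, 0.7                      # assumed AR(1) null parameters
observed = [0.05 * t + e for t, e in enumerate(ar1(n, phi, 1.0, rng))]
b_obs = ols_slope(observed)

# Monte Carlo null distribution of |slope| under the trendless AR(1) model
null_slopes = [abs(ols_slope(ar1(n, phi, 1.0, rng))) for _ in range(500)]
p = sum(b >= abs(b_obs) for b in null_slopes) / 500
```

The essential point of such tests is that autocorrelated noise produces spurious slopes far larger than white noise would, so the null distribution must come from the fitted noise model, not from i.i.d. assumptions.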
Wang, Jie; Liu, Guijian; Liu, Houqi; Lam, Paul K S
2017-04-01
A total of 211 water samples were collected from 53 key sampling points from 5-10th July 2013 at four different depths (0 m, 2 m, 4 m, 8 m) and at different sites in the Huaihe River, Anhui, China. These points were monitored for 18 parameters (water temperature, pH, TN, TP, TOC, Cu, Pb, Zn, Ni, Co, Cr, Cd, Mn, B, Fe, Al, Mg, and Ba). The spatial variability, contamination sources and health risk of trace elements as well as the river water quality were investigated. Our results were compared with national (CSEPA) and international (WHO, USEPA) drinking water guidelines, revealing that Zn, Cd and Pb were the dominant pollutants in the water body. Application of different multivariate statistical approaches, including correlation matrix and factor/principal component analysis (FA/PCA), to assess the origins of the elements in the Huaihe River, identified three source types that accounted for 79.31% of the total variance. Anthropogenic activities were considered to contribute much of the Zn, Cd, Pb, Ni, Co, and Mn via industrial waste, coal combustion, and vehicle exhaust; Ba, B, Cr and Cu were controlled by mixed anthropogenic and natural sources, and Mg, Fe and Al had natural origins from weathered rocks and crustal materials. Cluster analysis (CA) was used to classify the 53 sample points into three groups of water pollution: high pollution, moderate pollution, and low pollution, reflecting influences from tributaries, power plants and vehicle exhaust, and agricultural activities, respectively. The results of the water quality index (WQI) indicate that water in the Huaihe River is heavily polluted by trace elements, so approximately 96% of the water in the Huaihe River is unsuitable for drinking. A health risk assessment using the hazard quotient and index (HQ/HI) recommended by the USEPA suggests that Co, Cd and Pb in the river could cause non-carcinogenic harm to human health.
Lutz Bornmann
Full Text Available Using the InCites tool of Thomson Reuters, this study compares normalized citation impact values calculated for China, Japan, France, Germany, United States, and the UK throughout the time period from 1981 to 2010. InCites offers a unique opportunity to study the normalized citation impacts of countries using (i) a long publication window (1981 to 2010), (ii) a differentiation in (broad or more narrow) subject areas, and (iii) allowing for the use of statistical procedures in order to obtain an insightful investigation of national citation trends across the years. Using four broad categories, our results show significantly increasing trends in citation impact values for France, the UK, and especially Germany across the last thirty years in all areas. The citation impact of papers from China is still at a relatively low level (mostly below the world average), but the country follows an increasing trend line. The USA exhibits a stable pattern of high citation impact values across the years. With small impact differences between the publication years, the US trend is increasing in engineering and technology but decreasing in medical and health sciences as well as in agricultural sciences. Similar to the USA, Japan follows increasing as well as decreasing trends in different subject areas, but the variability across the years is small. In most of the years, papers from Japan perform below or approximately at the world average in each subject area.
Aaron Fisher
2014-10-01
Full Text Available Scatterplots are the most common way for statisticians, scientists, and the public to visually detect relationships between measured variables. At the same time, and despite widely publicized controversy, P-values remain the most commonly used measure to statistically justify relationships identified between variables. Here we measure the ability to detect statistically significant relationships from scatterplots in a randomized trial of 2,039 students in a statistics massive open online course (MOOC). Each subject was shown a random set of scatterplots and asked to visually determine if the underlying relationships were statistically significant at the P < 0.05 level. Subjects correctly classified only 47.4% (95% CI [45.1%–49.7%]) of statistically significant relationships, and 74.6% (95% CI [72.5%–76.6%]) of non-significant relationships. Adding visual aids such as a best fit line or scatterplot smooth increased the probability a relationship was called significant, regardless of whether the relationship was actually significant. Classification of statistically significant relationships improved on repeat attempts of the survey, although classification of non-significant relationships did not. Our results suggest: (1) that evidence-based data analysis can be used to identify weaknesses in theoretical procedures in the hands of average users, (2) data analysts can be trained to improve detection of statistically significant results with practice, but (3) data analysts have incorrect intuition about what statistically significant relationships look like, particularly for small effects. We have built a web tool for people to compare scatterplots with their corresponding p-values which is available here: http://glimmer.rstudio.com/afisher/EDA/.
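What "statistically significant at the P < 0.05 level" means for a scatterplot relationship can be made concrete with a permutation test on the Pearson correlation (a stdlib sketch for illustration, not the procedure used in the MOOC study; the sample data below are invented).

```python
import random

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def perm_pvalue(x, y, n_perm=2000, seed=1):
    """Two-sided permutation P-value for the correlation of x and y:
    the fraction of label shuffles with |r| at least as large as observed."""
    rng = random.Random(seed)
    r_obs = abs(pearson_r(x, y))
    y = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y)
        if abs(pearson_r(x, y)) >= r_obs:
            hits += 1
    return hits / n_perm

rng = random.Random(7)
x = list(range(30))
y_linked = [xi + rng.gauss(0, 3) for xi in x]   # genuine linear relationship
y_flat = [(-1) ** i for i in range(30)]         # essentially uncorrelated with x
```

A relationship like `y_linked` yields a small permutation P-value, while `y_flat` does not; the study's finding is that visual judgment of exactly this distinction is unreliable, especially for small effects.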
Fisher, Aaron; Anderson, G Brooke; Peng, Roger; Leek, Jeff
2014-01-01
Scatterplots are the most common way for statisticians, scientists, and the public to visually detect relationships between measured variables. At the same time, and despite widely publicized controversy, P-values remain the most commonly used measure to statistically justify relationships identified between variables. Here we measure the ability to detect statistically significant relationships from scatterplots in a randomized trial of 2,039 students in a statistics massive open online course (MOOC). Each subject was shown a random set of scatterplots and asked to visually determine if the underlying relationships were statistically significant at the P < 0.05 level. Subjects correctly classified only 47.4% (95% CI [45.1%-49.7%]) of statistically significant relationships, and 74.6% (95% CI [72.5%-76.6%]) of non-significant relationships. Adding visual aids such as a best fit line or scatterplot smooth increased the probability a relationship was called significant, regardless of whether the relationship was actually significant. Classification of statistically significant relationships improved on repeat attempts of the survey, although classification of non-significant relationships did not. Our results suggest: (1) that evidence-based data analysis can be used to identify weaknesses in theoretical procedures in the hands of average users, (2) data analysts can be trained to improve detection of statistically significant results with practice, but (3) data analysts have incorrect intuition about what statistically significant relationships look like, particularly for small effects. We have built a web tool for people to compare scatterplots with their corresponding p-values which is available here: http://glimmer.rstudio.com/afisher/EDA/.
Ioan Catalin VLAD
2012-11-01
Purpose: In recent studies perineural invasion (PNI) is associated with poor survival rates in rectal cancer, but the impact of PNI is still controversial. We assessed PNI as a potential prognostic factor in rectal cancer. Patients and Methods: We analyzed 317 patients with rectal cancer resected at the Oncology Institute "Prof. Dr. Ion Chiricuţă", Cluj-Napoca, between January 2000 and December 2008. Tumors were reviewed for PNI by a pathologist. Patients' data were reviewed and entered into a comprehensive database. The statistical analysis in our study was carried out in the R environment for statistical computing and graphics, version 1.15.1. Overall and disease-free survival were determined using the Kaplan-Meier method, and multivariate analysis was performed using the Cox proportional hazards model. Results were compared using the log-rank test. Results: In our study PNI was identified in 19% of tumors. The 5-year disease-free survival rate was higher for patients with PNI-negative tumors versus those with PNI-positive tumors (57.31% vs. 36.99%, p=0.009). The 5-year overall survival rate was 59.15% for PNI-negative tumors versus 39.19% for PNI-positive tumors (p=0.014). On multivariate analysis, PNI was an independent prognostic factor for overall survival (hazard ratio = 0.6; 95% CI = 0.41 to 0.87; p = 0.0082). Conclusions: PNI can be considered an independent prognostic factor of outcome in patients with rectal cancer. PNI should be taken into account when selecting patients for adjuvant treatment. The R environment for statistical computing and graphics is complex yet easy-to-use software that has proven to be efficient in our clinical study.
DONG Yu-Chuan; JIANG Hua; HU Gui-Qing; WANG Qi; LI Song-Lin; TIAN Wen-Dong; LI Zhi-Chang; LU Xiu-Qin; ZHAO Kui; FU Chang-Bo; LIU Jian-Cheng
2004-01-01
Two independent measurements of cross sections for the 19F+93Nb dissipative heavy-ion collision (DHIC) have been performed at incident energies from 100 to 108 MeV in steps of 250 keV. Two independently prepared targets were used respectively with all other experimental conditions being identical in both experiments. The data indicate non-reproducibility of the non-self-averaging oscillation yields in the two measurements. The statistical analysis of this non-reproducibility supports recent theoretical predictions of spontaneous coherence, slow phase randomization and extreme sensitivity in highly excited quantum many-body systems.
Garcia, Juan A L; Bartumeus, Frederic; Roche, David; Giraldo, Jesús; Stanley, H Eugene; Casamayor, Emilio O
2008-06-01
We combined genometric (DNA walks) and statistical (detrended fluctuation analysis) methods on 456 prokaryotic chromosomes from 309 different bacterial and archaeal species to look for specific patterns and long-range correlations along the genome and relate them to ecological lifestyles. The position of each nucleotide along the complete genome sequence was plotted on an orthogonal plane (DNA landscape), and fluctuation analysis applied to the DNA walk series showed a long-range correlation in contrast to the lack of correlation for artificially generated genomes. Different features in the DNA landscapes among genomes from different ecological and metabolic groups of prokaryotes appeared with the combined analysis. Transition from hyperthermophilic to psychrophilic environments could have been related to more complex structural adaptations in microbial genomes, whereas for other environmental factors such as pH and salinity this effect would have been smaller. Prokaryotes with domain-specific metabolisms, such as photoautotrophy in Bacteria and methanogenesis in Archaea, showed consistent differences in genome correlation structure. Overall, we show that, beyond the relative proportion of nucleotides, correlation properties derived from their sequential position within the genome hide relevant phylogenetic and ecological information. This can be studied by combining genometric and statistical physics methods, leading to a reduction of genome complexity to a few useful descriptors.
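The combined genometric-statistical approach can be sketched as follows, assuming the common purine/pyrimidine ±1 mapping for the DNA walk; the paper's exact mapping, DFA order, and scale choices may differ, and the sequence here is a random stand-in, not a real genome.

```python
import numpy as np

# One common DNA-walk convention: purines step +1, pyrimidines -1
# (an assumption; the paper may use a different mapping).
STEP = {"A": 1, "G": 1, "C": -1, "T": -1}

def dna_walk(seq):
    """Cumulative walk (profile) of the nucleotide steps."""
    return np.cumsum([STEP[b] for b in seq])

def dfa(profile, scales=(4, 8, 16, 32, 64)):
    """First-order detrended fluctuation analysis of an integrated series."""
    y = np.asarray(profile, dtype=float)
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)  # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    return np.array(scales), np.array(flucts)

# Artificial (uncorrelated) genome, as in the paper's null comparison
rng = np.random.default_rng(0)
seq = "".join(rng.choice(list("ACGT"), size=4096))
scales, F = dfa(dna_walk(seq))
# Slope of log F vs log s estimates the scaling exponent alpha
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"alpha = {alpha:.2f} (about 0.5 for an uncorrelated sequence)")
```

Long-range correlation in a real genome would show up as alpha substantially above 0.5.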
(No author listed)
2007-01-01
In order to meet the demand for nowcasting convective storms in Beijing, the climatological characteristics of convective storms in Beijing and its vicinity were analyzed based on infrared (IR) black-body temperature (TBB) data during May–August of 1997–2004. The climatological probabilities, the diurnal cycle and the spatial distribution of convective storms are given respectively in this paper. The results show that the climatological characteristics of convective storms denoted by TBB≤-52℃ are consistent with those of statistical studies based on surface and lightning observations. Furthermore, the climatological characteristics of May and June are very different from those of July and August, showing that there are two types of convective storms in this region. One occurs in the transient polar air mass on the midlatitude continent during late spring and early summer; this type of convection arises with thunder, strong wind gusts and hail over the mountainous area in the northern part of the region from afternoon to nightfall. The other occurs with heavy rainfall in the warm and moist air mass over the North China Plain and the vicinity of the Bohai Sea. This study also shows that the long-term IR TBB data observed by geostationary satellite can complement the temporal and spatial limitations of weather radar and surface observations.
Yokoyama, Shozo; Takenaka, Naomi
2005-04-01
Red-green color vision is strongly suspected to enhance the survival of its possessors. Despite being red-green color blind, however, many species have successfully competed in nature, which brings into question the evolutionary advantage of achieving red-green color vision. Here, we propose a new method of identifying positive selection at individual amino acid sites with the premise that if positive Darwinian selection has driven the evolution of the protein under consideration, then it should be found mostly at the branches in the phylogenetic tree where its function had changed. The statistical and molecular methods have been applied to 29 visual pigments with the wavelengths of maximal absorption at approximately 510-540 nm (green- or middle wavelength-sensitive [MWS] pigments) and at approximately 560 nm (red- or long wavelength-sensitive [LWS] pigments), which are sampled from a diverse range of vertebrate species. The results show that the MWS pigments are positively selected through amino acid replacements S180A, Y277F, and T285A and that the LWS pigments have been subjected to strong evolutionary conservation. The fact that these positively selected M/LWS pigments are found not only in animals with red-green color vision but also in those with red-green color blindness strongly suggests that both red-green color vision and color blindness have undergone adaptive evolution independently in different species.
Gaonkar, Bilwaj; Davatzikos, Christos
2013-09-01
Multivariate pattern analysis (MVPA) methods such as support vector machines (SVMs) have been increasingly applied to fMRI and sMRI analyses, enabling the detection of distinctive imaging patterns. However, identifying brain regions that significantly contribute to the classification/group separation requires computationally expensive permutation testing. In this paper we show that the results of SVM-permutation testing can be analytically approximated. This approximation leads to more than a thousandfold speedup of the permutation testing procedure, thereby rendering it feasible to perform such tests on standard computers. The speedup achieved makes SVM based group difference analysis competitive with standard univariate group difference analysis methods.
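The expensive baseline that the paper's analytic approximation replaces can be sketched as a naive permutation test over SVM weights. The data below are a synthetic stand-in for imaging features, and the permutation count is arbitrary; this illustrates the procedure being sped up, not the paper's approximation itself.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Toy data standing in for subjects x imaging features (hypothetical)
X, y = make_classification(n_samples=80, n_features=20, random_state=0)

def svm_weights(X, y):
    """Fit a linear SVM and return its weight vector over features."""
    return LinearSVC(C=1.0, dual=False, max_iter=5000).fit(X, y).coef_.ravel()

w_obs = svm_weights(X, y)

# Naive permutation null: refit the SVM many times under shuffled labels
rng = np.random.default_rng(0)
n_perm = 200
null = np.array([svm_weights(X, rng.permutation(y)) for _ in range(n_perm)])

# Two-sided permutation p-value per feature -- the step the paper
# shows can be approximated analytically instead of brute-forced
p_vals = (np.abs(null) >= np.abs(w_obs)).mean(axis=0)
print("smallest per-feature p-value:", p_vals.min())
```

The cost grows linearly in the number of permutations, which is why an analytic shortcut yields the thousandfold speedup the abstract reports.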
Kurtz, S.E.; Fields, D.E.
1983-10-01
This report describes a version of the TERPED/P computer code that is very useful for small data sets. A new algorithm for determining the Kolmogorov-Smirnov (KS) statistics is used to extend program applicability. The TERPED/P code facilitates the analysis of experimental data and assists the user in determining its probability distribution function. Graphical and numerical tests are performed interactively in accordance with the user's assumption of normally or log-normally distributed data. Statistical analysis options include computation of the chi-square statistic and the KS one-sample test statistic and the corresponding significance levels. Cumulative probability plots of the user's data are generated either via a local graphics terminal, a local line printer or character-oriented terminal, or a remote high-resolution graphics device such as the FR80 film plotter or the Calcomp paper plotter. Several useful computer methodologies suffer from limitations of their implementations of the KS nonparametric test. This test is one of the more powerful analysis tools for examining the validity of an assumption about the probability distribution of a set of data. KS algorithms are found in other analysis codes, including the Statistical Analysis Subroutine (SAS) package and earlier versions of TERPED. The inability of these algorithms to generate significance levels for sample sizes less than 50 has limited their usefulness. The release of the TERPED code described herein contains algorithms to allow computation of the KS statistic and significance level for data sets of, if the user wishes, as few as three points. Values computed for the KS statistic are within 3% of the correct value for all data set sizes.
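The normal/log-normal KS test that TERPED/P performs can be sketched with standard tools. This is not the TERPED/P code, and the classical p-value reported here is only approximate (conservative) when the distribution's parameters are estimated from the same small sample; a Lilliefors-type correction would be more exact.

```python
import numpy as np
from scipy import stats

# A small data set, of the size TERPED/P targets
data = np.array([4.1, 5.0, 5.9, 4.7, 5.3])

# One-sample KS test against a normal with the sample's own mean and sd
d_norm, p_norm = stats.kstest(data, "norm",
                              args=(data.mean(), data.std(ddof=1)))

# Same test under a log-normal assumption, via the log-transformed data
logd = np.log(data)
d_lognorm, p_lognorm = stats.kstest(logd, "norm",
                                    args=(logd.mean(), logd.std(ddof=1)))

print(f"normal:     D = {d_norm:.3f}, significance level = {p_norm:.3f}")
print(f"log-normal: D = {d_lognorm:.3f}, significance level = {p_lognorm:.3f}")
```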
Ahmed, Sheehan H; Christensen, Charlotte R
2016-01-01
We investigate whether the inclusion of baryonic physics influences the formation of thin, coherently rotating planes of satellites such as those seen around the Milky Way and Andromeda. For four Milky Way-mass simulations, each run both as dark matter-only and with baryons included, we are able to identify a planar configuration that significantly maximizes the number of plane satellite members. The maximum plane member satellites are consistently different between the dark matter-only and baryonic versions of the same run due to the fact that satellites are both more likely to be destroyed and to infall later in the baryonic runs. Hence, studying satellite planes in dark matter-only simulations is misleading, because they will be composed of different satellite members than those that would exist if baryons were included. Additionally, the destruction of satellites in the baryonic runs leads to less radially concentrated satellite distributions, a result that is critical to making planes that are statistica...
Fang, Yongxiang; Wit, Ernst
2008-01-01
Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher’s statistic is more sensitive to smaller p-values than to larger p-values, and a small p-value may overrule the other p-values a
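Fisher's method is simple to state in code. The sketch below, with arbitrary example p-values, illustrates the sensitivity the abstract describes: one small p-value can carry the combined test despite several large ones.

```python
import math
from scipy import stats

def fisher_combined(p_values):
    """Fisher's method: X = -2 * sum(ln p_i) ~ chi-square with 2k d.f. under H0."""
    k = len(p_values)
    x = -2.0 * sum(math.log(p) for p in p_values)
    return x, stats.chi2.sf(x, df=2 * k)

# One very small p-value dominates three unremarkable ones
x, p = fisher_combined([0.001, 0.6, 0.7, 0.8])
print(f"X = {x:.2f}, combined p = {p:.4f}")
```

Here the combined p-value falls below 0.05 even though three of the four component tests are far from significant. (SciPy also ships this as `scipy.stats.combine_pvalues`.)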
Kania, M.J.; Homan, F.J.; Mehner, A.W.
1982-08-01
Two methods for measuring failure fraction on irradiated coated-particle fuels have been developed, one in the United States (the IMGA system - Irradiated-Microsphere Gamma Analyzer) and one in the Federal Republic of Germany (FRG) (the PIAA procedure - Postirradiation Annealing and Beta Autoradiography). A comparison of the two methods on two standardized sets of irradiated particles was undertaken to evaluate the accuracy, operational procedures, and expense of each method in obtaining statistically significant results. From the comparison, the postirradiation examination method employing the IMGA system was found to be superior to the PIAA procedure for measuring statistically significant failure fractions. Both methods require that the irradiated fuel be in the form of loose particles, each requires extensive remote hot-cell facilities, and each is capable of physically separating failed particles from unfailed particles. Important differences noted in the comparison are described.
Liu, Wei; Ding, Jinhui
2016-05-25
The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.
U.S. Environmental Protection Agency — The Reach Address Database (RAD) stores the reach address of each Water Program feature that has been linked to the underlying surface water features (streams,...
Karen Larwin
2014-02-01
The present study examined students' statistics-related self-efficacy, as measured with the Current Statistics Self-Efficacy (CSSE) inventory developed by Finney and Schraw (2003). Structural equation modeling (SEM) was used to confirm the one-dimensional factor structure of the CSSE. Once confirmed, this factor was used to test whether a significant link to prior mathematics experiences exists. Additionally, a new post-SEM application was employed to compute error-free latent variable scores for the CSSE in an effort to examine the ancillary effects of gender, age, ethnicity, department, degree level, hours completed, expected course grade, number of college-level math classes, and current GPA on students' CSSE scores. Results support the one-dimensional construct and, as expected, the model demonstrated a significant link between prior mathematics experiences and CSSE scores. Additionally, the students' department, expected grade, and number of prior math classes were found to have a significant effect on students' CSSE scores.
Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M
2015-03-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.
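A bare-bones version of the phase comparison for univariate single-case data can be sketched outside SPSS as a regression on a phase dummy, whose coefficient is the mean change between baseline and intervention. The scores below are hypothetical, and unlike the article's method this sketch ignores serial dependence between sessions.

```python
import numpy as np
from scipy import stats

# Hypothetical session-by-session symptom scores (not the case study's data)
baseline     = np.array([22, 21, 23, 22, 24, 23])
intervention = np.array([20, 18, 17, 15, 14, 12])

# Regress score on a phase dummy (0 = baseline, 1 = intervention);
# with a binary predictor, the slope equals the difference in phase means.
y = np.concatenate([baseline, intervention])
phase = np.concatenate([np.zeros_like(baseline), np.ones_like(intervention)])
slope, intercept, r, p, se = stats.linregress(phase, y)

print(f"mean change = {slope:.1f} points, p = {p:.4f}")
```

Whether such a drop is *clinically* significant is a separate judgment, as the article stresses.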
Nhu, Nguyen Van; Singh, Mahendra; Leonhard, Kai
2008-05-08
We have computed molecular descriptors for sizes, shapes, charge distributions, and dispersion interactions for 67 compounds using quantum chemical ab initio and density functional theory methods. For the same compounds, we have fitted the three perturbed-chain polar statistical associating fluid theory (PCP-SAFT) equation of state (EOS) parameters to experimental data and have performed a statistical analysis for relations between the descriptors and the EOS parameters. On this basis, an analysis of the physical significance of the parameters, the limits of the present descriptors, and the PCP-SAFT EOS has been performed. The result is a method that can be used to estimate the vapor pressure curve including the normal boiling point, the liquid volume, the enthalpy of vaporization, the critical data, mixture properties, and so on. When only two of the three parameters are predicted and one is adjusted to experimental normal boiling point data, excellent predictions of all investigated pure compound and mixture properties are obtained. We are convinced that the methodology presented in this work will lead to new EOS applications as well as improved EOS models whose predictive performance is likely to surpass that of most present quantum chemically based, quantitative structure-property relationship, and group contribution methods for a broad range of chemical substances.
2016-09-01
Popular culture reflects both the interests of and the issues affecting the general public. As concerns regarding climate change and its impacts grow, is it permeating into popular culture and reaching that global audience?
Teratology testing under REACH.
Barton, Steve
2013-01-01
REACH guidelines may require teratology testing for new and existing chemicals. This chapter discusses procedures to assess the need for teratology testing and the conduct and interpretation of teratology tests where required.
Reaching affects saccade trajectories.
Tipper, S P; Howard, L A; Paul, M A
2001-01-01
The pre-motor theory suggests that, when attention is oriented to a location, the motor systems that are involved in achieving current behavioural goals are activated. For example, when a task requires accurate reaching, attention to a location activates the motor circuits controlling saccades and manual reaches. These actions involve separate neural systems for the control of eye and hand, but we believe that the selection processes acting on neural population codes within these systems are similar and can affect each other. The attentional effect can be revealed in the subsequent movement. The present study shows that the path the eye takes as it saccades to a target is affected by whether a reach to the target is also produced. This effect is interpreted as the influence of a hand-centred frame used in reaching on the spatial frame of reference required for the saccade.
Kim, Yee Suk; Lee, Sungin; Zong, Nansu; Kahng, Jimin
2017-06-01
The present study aimed to investigate differences in prognosis based on human papillomavirus (HPV) infection, persistent infection and genotype variations for patients exhibiting atypical squamous cells of undetermined significance (ASCUS) in their initial Papanicolaou (PAP) test results. A latent Dirichlet allocation (LDA)-based tool was developed that may facilitate communication during patient-doctor consultations. The present study assessed 491 patients (139 HPV-positive and 352 HPV-negative cases) with a PAP test result of ASCUS and a follow-up period ≥2 years. Patients underwent PAP and HPV DNA chip tests between January 2006 and January 2009. The HPV-positive subjects were followed up with at least 2 instances of PAP and HPV DNA chip tests. The most common genotypes observed were HPV-16 (25.9%, 36/139), HPV-52 (14.4%, 20/139), HPV-58 (13.7%, 19/139), HPV-56 (11.5%, 16/139), HPV-51 (9.4%, 13/139) and HPV-18 (8.6%, 12/139). A total of 33.3% (12/36) of patients positive for HPV-16 had cervical intraepithelial neoplasia (CIN)2 or a worse result, which was significantly higher than the prevalence of CIN2 of 1.8% (8/455) in patients negative for HPV-16 (P<0.001), while no significant association was identified for other genotypes in terms of genotype and clinical progress. There was a significant association between clearance and good prognosis (P<0.001). Persistent infection was higher in patients aged ≥51 years (38.7%) than in those aged ≤50 years (20.4%; P=0.036). Progression from persistent infection to CIN2 or worse (19/34, 55.9%) was higher than from clearance (0/105, 0.0%; P<0.001). In the LDA analysis, symmetric Dirichlet priors α=0.1 and β=0.01 with k=5 or 10 clusters provided the most meaningful groupings. Statistical and LDA analyses produced consistent results regarding the association between persistent infection of HPV-16, old age and long infection period with a clinical progression of CIN2 or worse.
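The LDA clustering step can be sketched with a generic implementation using the priors the abstract reports. The count matrix below is a random stand-in for the study's coded patient records, and scikit-learn's variational LDA is not necessarily the estimator the authors used.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical patients x coded-findings count matrix (random stand-in)
rng = np.random.default_rng(0)
X = rng.poisson(1.0, size=(100, 12))

# Symmetric Dirichlet priors as in the study: alpha = 0.1, beta = 0.01, k = 5
lda = LatentDirichletAllocation(
    n_components=5,          # k
    doc_topic_prior=0.1,     # alpha
    topic_word_prior=0.01,   # beta
    random_state=0,
).fit(X)

# Assign each patient to their dominant latent group
groups = lda.transform(X).argmax(axis=1)
print("cluster sizes:", np.bincount(groups, minlength=5))
```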
Terry, Dorothy Givens
2012-01-01
Dr. Mae Jemison is the world's first woman astronaut of color who continues to reach for the stars. Jemison was recently successful in leading a team that has secured a $500,000 federal grant to make interstellar space travel a reality. The Dorothy Jemison Foundation for Excellence (named after Jemison's mother) was selected in June by the Defense…
REACH. Air Conditioning Units.
Garrison, Joe; And Others
As a part of the REACH (Refrigeration, Electro-Mechanical, Air-Conditioning, Heating) electromechanical cluster, this student manual contains individualized instructional units in the area of air conditioning. The instructional units focus on air conditioning fundamentals, window air conditioning, system and installation, troubleshooting and…
Reaching into Pictorial Spaces
Volcic, Robert; Vishwanath, Dhanraj; Domini, Fulvio
2014-02-01
While binocular viewing of 2D pictures generates an impression of 3D objects and space, viewing a picture monocularly through an aperture produces a more compelling impression of depth and the feeling that the objects are "out there", almost touchable. Here, we asked observers to actually reach into pictorial space under both binocular- and monocular-aperture viewing. Images of natural scenes were presented at different physical distances via a mirror-system and their retinal size was kept constant. Targets that observers had to reach for in physical space were marked on the image plane, but at different pictorial depths. We measured the 3D position of the index finger at the end of each reach-to-point movement. Observers found the task intuitive. Reaching responses varied as a function of both pictorial depth and physical distance. Under binocular viewing, responses were mainly modulated by the different physical distances. Instead, under monocular viewing, responses were modulated by the different pictorial depths. Importantly, individual variations over time were minor, that is, observers conformed to a consistent pictorial space. Monocular viewing of 2D pictures thus produces a compelling experience of an immersive space and tangible solid objects that can be easily explored through motor actions.
Snow, Rufus; And Others
As a part of the REACH (Refrigeration, Electro-Mechanical, Air-Conditioning, Heating) electromechanical cluster, this student manual contains individualized instructional units in the area of refrigeration. The instructional units focus on refrigeration fundamentals, tubing and pipe, refrigerants, troubleshooting, window air conditioning, and…
L. Bornmann; L. Leydesdorff
2011-01-01
The methods presented in this paper allow for a statistical analysis revealing centers of excellence around the world using programs that are freely available. Based on Web of Science data (a fee-based database), field-specific excellence can be identified in cities where highly cited papers were pu
Solar Hydrogen Reaching Maturity
Rongé Jan
2015-09-01
Increasingly vast research efforts are devoted to the development of materials and processes for solar hydrogen production by light-driven dissociation of water into oxygen and hydrogen. Storage of solar energy in chemical bonds resolves the issues associated with the intermittent nature of sunlight, by decoupling energy generation and consumption. This paper investigates recent advances and prospects in solar hydrogen processes that are reaching market readiness. Future energy scenarios involving solar hydrogen are proposed and a case is made for systems producing hydrogen from water vapor present in air, supported by advanced modeling.
Reach capacity in older women submitted to flexibility training
Elciana de Paiva Lima Vieira
2015-11-01
The aim of this study was to analyze the effect of flexibility training on the maximum range of motion and reach capacity of older women practicing aquatic exercises in the Prev-Quedas project. Participants were divided into two groups: intervention (IG, n = 25), which underwent the flexibility training program, and control (CG, n = 21), in which the older women participated only in aquatic exercises. Flexibility training lasted three months with a weekly frequency of two days, consisting of stretching exercises involving the trunk and lower limbs performed after the aquatic exercises. The stretching method used was passive static. Assessment consisted of the functional reach, lateral reach and goniometric tests. Statistical analysis was performed using the Shapiro-Wilk normality test, ANCOVA, and Pearson and Spearman correlations. Significant gains in maximum range of motion were found for the IG in the right hip joint (p = 0.0025); however, the same result was not observed in the other joints assessed, and there was no improvement in functional or lateral reach capacity in either group. Significant correlations between reach capacity and range of motion in the trunk, hip and ankle were not observed. Therefore, flexibility training associated with the practice of aquatic exercises increased maximum range of motion only for the hip joint; improvement in reach capacity was not observed. The practice of aquatic exercises alone did not show significant results.
Westar reaches critical crossroads
1992-06-01
Westar Mining Ltd. has applied for court protection until September 30, 1992 to gain time to draw up a final reorganization plan. The Companies' Creditors Arrangement Act is a federal statute that allows a business to restructure financially without having to declare bankruptcy. Normal trade terms with suppliers are usually maintained during this period. The company is struggling under the effects of falling coal prices, a high Canadian dollar and a high debt burden. Changes in work practices at the company's Balmer mine are a major part of the restructuring. An agreement must be reached with the United Mineworkers of America and other stakeholders or the Balmer mine will close permanently. Employees have been locked out since May 1, 1992 when union members rejected the company's final offer.
Reaching Fleming's discrimination bound
Gruebl, Gebhard
2012-01-01
Any rule for identifying a quantum system's state within a set of two non-orthogonal pure states by a single measurement is flawed. It has a non-zero probability of either yielding the wrong result or leaving the query undecided. This also holds if the measurement of an observable $A$ is repeated on a finite sample of $n$ state copies. We formulate a state identification rule for such a sample. This rule's probability of giving the wrong result turns out to be bounded from above by $1/n\delta_{A}^{2}$ with $\delta_{A}=|\langle A\rangle_{1}-\langle A\rangle_{2}|/(\Delta_{1}A+\Delta_{2}A)$. A larger $\delta_{A}$ results in a smaller upper bound. Yet, according to Fleming, $\delta_{A}$ cannot exceed $\tan\theta$ with $\theta\in(0,\pi/2)$ being the angle between the pure states under consideration. We demonstrate that there exist observables $A$ which reach the bound $\tan\theta$ and we determine all of them.
2001-01-01
The creation of the world's largest sandstone cavern, not a small feat! At the bottom, cave-in preventing steel mesh can be seen clinging to the top of the tunnel. The digging of UX-15, the cavern that will house ATLAS, reached the upper ceiling of LEP on October 10th. The breakthrough which took place nearly 100 metres underground occurred precisely on schedule and exactly as planned. But much caution was taken beforehand to make the LEP breakthrough clean and safe. To prevent the possibility of cave-ins in the side tunnels that will eventually be attached to the completed UX-15 cavern, reinforcing steel mesh was fixed into the walls with bolts. Obviously no people were allowed in the LEP tunnels below UX-15 as the breakthrough occurred. The area was completely evacuated and fences were put into place to keep all personnel out. However, while personnel were being kept out of the tunnels below, this has been anything but the case for the work taking place up above. With the creation of the world's largest...
Marina Yepifanova; Boris Zhelezovsky
2012-01-01
The possibility of maintaining continuity in the formation of a socially significant hierarchy of learning motives in senior pupils by means of multimedia is demonstrated theoretically and experimentally...
Mexican agencies reach teenagers.
Brito Lemus, R; Beamish, J
1992-08-01
The Gente Joven project of the Mexican Foundation for Family Planning (MEXFAM) trains young volunteers in 19 cities to spread messages about sexually transmitted diseases and population growth to their peers. They also distribute condoms and spermicides, and the project uses films and materials to spread its messages. The project would like to influence young men's behavior, but the Latin image of machismo poses a big challenge; it would like young men to become more responsible toward pregnancy prevention. About 50% of adolescents have sexual intercourse, but few use contraceptives, resulting in a high adolescent pregnancy rate. Many of these pregnant teenagers choose not to marry. Adolescent pregnancy leads to girls leaving school, few marketable skills, and rearing children alone. Besides, women who began childbearing as teenagers have 1.5 times more children than other women. Male involvement in pregnancy prevention should improve these statistics. As late as 1973, the Health Code banned promotion and sales of contraceptives, but by 1992 about 50% of women of reproductive age used contraceptives. The Center for the Orientation of Adolescents has organized 8 Young Men's Clubs in Mexico City to involve male teenagers more in family planning and to develop self-confidence. It uses a holistic approach to their development through discussions with their peers. A MEXFAM study shows that young men are not close with their fathers, who tend to exude a machismo attitude; thus the young men do not have a role model for responsible sexual behavior. MEXFAM's work is cut out for it, however, since the same study indicates that 50% of the young men believe it is fine to have 1 girlfriend and 33% think women should earn more than men. A teenage volunteer reports, however, that in 1992 more boys than girls came to him for contraception and information, while in other years girls outnumbered the boys.
Policy Analysis Reaches Midlife
Beryl A. Radin
2013-07-01
The field of policy analysis that exists in the 21st century is quite different from that found in earlier phases. The world of the 1960s that gave rise to this field in the US often seems unrelated to the world we experience today. These shifts have occurred as a result of a range of developments: technological changes, changes in the structure and processes of government both internally and globally, new expectations about accountability and transparency, economic and fiscal problems, and increased political and ideological conflict. It is clear that globalization has had a significant impact on the field. Shifts in the type of decision-making have also created challenges for policy analysts, since analysts are now clearly in every nook and cranny of the decision-making world. Thus it is relevant to look at the work that they do, the skills that they require, and the background experience that is relevant to them.
Impact of the REACH II and REACH VA Dementia Caregiver Interventions on Healthcare Costs.
Nichols, Linda O; Martindale-Adams, Jennifer; Zhu, Carolyn W; Kaplan, Erin K; Zuber, Jeffrey K; Waters, Teresa M
2017-05-01
Examine caregiver and care recipient healthcare costs associated with caregivers' participation in Resources for Enhancing Alzheimer's Caregivers Health (REACH II or REACH VA) behavioral interventions to improve coping skills and care recipient management. RCT (REACH II); propensity-score matched, retrospective cohort study (REACH VA). Five community sites (REACH II); 24 VA facilities (REACH VA). Care recipients with Alzheimer's disease and related dementias (ADRD) and their caregivers who participated in REACH II study (analysis sample of 110 caregivers and 197 care recipients); care recipients whose caregivers participated in REACH VA and a propensity matched control group (analysis sample of 491). Previously collected data plus Medicare expenditures (REACH II) and VA costs plus Medicare expenditures (REACH VA). There was no increase in VA or Medicare expenditures for care recipients or their caregivers who participated in either REACH intervention. For VA care recipients, REACH was associated with significantly lower total VA costs of care (33.6%). VA caregiver cost data was not available. In previous research, both REACH II and REACH VA have been shown to provide benefit for dementia caregivers at a cost of less than $5/day; however, concerns about additional healthcare costs may have hindered REACH's widespread adoption. Neither REACH intervention was associated with additional healthcare costs for caregivers or patients; in fact, for VA patients, there were significantly lower healthcare costs. The VA costs savings may be related to the addition of a structured format for addressing the caregiver's role in managing complex ADRD care to an existing, integrated care system. These findings suggest that behavioral interventions are a viable mechanism to support burdened dementia caregivers without additional healthcare costs. © 2017, Copyright the Authors Journal compilation © 2017, The American Geriatrics Society.
Bornmann, Lutz
2011-01-01
The methods presented in this paper allow for a spatial analysis revealing centers of excellence around the world using programs that are freely available. Based on Web of Science data, field-specific excellence can be identified in cities where highly cited papers were published. Compared with the mapping approaches published hitherto, our approach is more analytically oriented, allowing the assessment of an observed number of excellent papers for a city against the expected number. With this feature, the approach can identify not only the top performers in output but also the "true jewels": cities home to authors who publish significantly more top-cited papers than can be expected. As the examples in this paper show for physics, chemistry, and psychology, these cities do not necessarily have a high output of excellent papers.
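The observed-versus-expected assessment can be sketched as a one-sided Poisson test, assuming the count of a city's highly cited papers is modeled as Poisson with the field's expected rate (the city figures below are invented for illustration; the paper's exact method may differ):

```python
from math import exp, factorial

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam): the chance of observing k or more
    top-cited papers if the city only performed at the expected rate."""
    return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k))

# Hypothetical city: 12 highly cited papers observed, 5 expected.
p_city = poisson_sf(12, 5.0)
print(p_city)  # small p: the city exceeds expectation significantly
```

A city with a small tail probability is a "true jewel" in the paper's sense: its excess of excellent papers is unlikely to be a fluctuation around the expected count.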
Reach Envelope of Human Extremities
YANG Jingzhou(杨景周); ZHANG Yunqing(张云清); CHEN Liping(陈立平); ABDEL-MALEK Karim
2004-01-01
Significant attention in recent years has been given to obtaining a better understanding of human joint ranges, measurement, and functionality, especially in conjunction with commands issued by the central nervous system. While researchers have studied the motor commands needed to drive a limb along a path trajectory, various computer algorithms have been reported that provide adequate analysis of limb modeling and motion. This paper uses a rigorous mathematical formulation to model human limbs, understand their reach envelope, delineate barriers therein where a trajectory becomes difficult to control, and help visualize these barriers. Workspaces of a typical forearm with 9 degrees of freedom, a typical finger modeled as a 4-degree-of-freedom system, and a lower extremity with 4 degrees of freedom are discussed. The results show that, using the proposed formulation, joint limits play an important role in distinguishing the barriers.
William L Bigbee
2005-01-01
source of patient-specific information with high potential impact on the early detection and classification of cancer and other diseases. The new profiling technology comes, however, with numerous challenges and concerns. Particularly important are concerns about the reproducibility of classification results and their significance. In this work we describe a computational validation framework, called PACE (Permutation-Achieved Classification Error), that lets us assess, for a given classification model, the significance of the Achieved Classification Error (ACE) on the profile data. The framework compares the performance statistic of the classifier on true data samples and checks whether these are consistent with the behavior of the classifier on the same data with randomly reassigned class labels. A statistically significant ACE increases our belief that a discriminative signal was found in the data. The advantage of PACE analysis is that it can be easily combined with any classification model and is relatively easy to interpret. PACE analysis does not protect researchers against confounding in the experimental design, or other sources of systematic or random error. We use PACE analysis to assess the significance of classification results we have achieved on a number of published data sets. The results show that many of these datasets indeed possess a signal that leads to a statistically significant ACE.
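The core of the PACE idea, comparing the classifier's error on true labels against its error on randomly reassigned labels, can be sketched with a stand-in model (a leave-one-out nearest-centroid classifier on synthetic data; the framework itself is classifier-agnostic):

```python
import numpy as np

rng = np.random.default_rng(0)

def centroid_error(X, y):
    """Leave-one-out error of a nearest-centroid classifier, a stand-in
    for whatever classification model is plugged into PACE."""
    errors = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        c0 = X[mask & (y == 0)].mean(axis=0)
        c1 = X[mask & (y == 1)].mean(axis=0)
        pred = 0 if np.linalg.norm(X[i] - c0) < np.linalg.norm(X[i] - c1) else 1
        errors += pred != y[i]
    return errors / len(y)

def pace_p_value(X, y, n_perm=200):
    """Fraction of label permutations achieving an error <= the true ACE."""
    ace = centroid_error(X, y)
    null = [centroid_error(X, rng.permutation(y)) for _ in range(n_perm)]
    return ace, sum(e <= ace for e in null) / n_perm

# Synthetic two-class "profile" data carrying a real discriminative signal.
X = np.vstack([rng.normal(0.0, 1, (20, 5)), rng.normal(1.5, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
ace, p = pace_p_value(X, y)
print(ace, p)  # low ACE, small p: signal unlikely under permuted labels
```

A significant result here means the achieved error is inconsistent with the classifier's behavior on label-shuffled data, exactly the evidence of signal the abstract describes.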
Tonello, Lucio; Conway de Macario, Everly; Marino Gammazza, Antonella; Cocchi, Massimo; Gabrielli, Fabio; Zummo, Giovanni; Cappello, Francesco; Macario, Alberto J L
2015-03-01
The pathogenesis of Hashimoto's thyroiditis includes autoimmunity involving thyroid antigens, autoantibodies, and possibly cytokines. It is unclear what role Hsp60 plays, but our recent data indicate that it may contribute to pathogenesis as an autoantigen. Its role in the induction of cytokine production, pro- or anti-inflammatory, had not been elucidated, except that we found that peripheral blood mononucleated cells (PBMC) from patients or from healthy controls did not respond to Hsp60 stimulation in vitro with cytokine-production patterns that would differentiate patients from controls with statistical significance. This "negative" outcome appeared when the data were pooled and analyzed with conventional statistical methods. We re-analyzed our data with non-conventional statistical methods based on data mining, using the classification and regression tree learning algorithm and clustering methodology. The results indicate that by focusing on IFN-γ and IL-2 levels before and after Hsp60 stimulation of PBMC in each patient, it is possible to differentiate patients from controls. A major general conclusion is that when trying to identify disease markers such as levels of cytokines and Hsp60, reference to standards obtained from pooled data from many patients may be misleading. The chosen biomarker, e.g., production of IFN-γ and IL-2 by PBMC upon stimulation with Hsp60, must be assessed before and after stimulation and the results compared within each patient and analyzed with both conventional and data mining statistical methods.
Greene, Nicholas
2012-01-01
ABOUT THE BOOK Halo Reach is the latest installment, and goes back to Halo's roots in more ways than one. Set around one of the most frequently referenced events in the Haloverse-The Fall of Reach-Reach puts you in the shoes of Noble 6, an unnamed Spartan, fighting a doomed battle to save the planet. Dual-wielding's gone, health is back, and equipment now takes the form of different "classes," with different weapon loadouts and special abilities (such as sprinting, cloaking, or flight). If you're reading this guide, you're either new to the Halo franchise and looking to get a leg up on all
Lunar Probe Reaches Deep Space
2011-01-01
China's second lunar probe, Chang'e-2, has reached an orbit 1.5 million kilometers from Earth for an additional mission of deep space exploration, the State Administration for Science, Technology and Industry for National Defense announced.
Cosmic Statistics of Statistics
Szapudi, I.; Colombi, S.; Bernardeau, F.
1999-01-01
The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...
Meneghetti, M; Dahle, H; Limousin, M
2013-01-01
The existence of an arc statistics problem has been at the center of a strong debate over the last fifteen years. With the aim of clarifying whether the optical depth for giant gravitational arcs produced by galaxy clusters in the so-called concordance model is compatible with observations, several studies were carried out that helped to significantly improve our knowledge of strong-lensing clusters, unveiling their extremely complex internal structure. In particular, the abundance and frequency of strong-lensing events such as gravitational arcs turned out to be a potentially very powerful tool for tracing structure formation. However, given the limited size of observational and theoretical data sets, the power of arc statistics as a cosmological tool has so far been only minimally exploited. On the other hand, recent years have been characterized by significant advancements in the field, and several cluster surveys that are ongoing or planned for the near future seem to have the potential to make arc statistics a competitive cosmo...
J. de Haan; W.P. Knulst
2000-01-01
Original title: Het bereik van de kunsten. The reach of the arts (Het bereik van de kunsten) is the fourth study in a series which periodically analyses the status of cultural participation, reading and use of other media. The series, Support for culture (Het culturele draagvlak) is sponsored by th
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo
Freudenburg, William R.
2006-01-01
Rather than seeking ivory-tower isolation, members of the Rural Sociological Society have always been distinguished by a willingness to work with specialists from a broad range of disciplines, and to work on some of the world's most challenging problems. What is less commonly recognized is that the willingness to reach beyond disciplinary…
Sampling hard to reach populations.
Faugier, J; Sargeant, M
1997-10-01
Studies on 'hidden populations', such as homeless people, prostitutes and drug addicts, raise a number of specific methodological questions usually absent from research involving known populations and less sensitive subjects. This paper examines the advantages and limitations of nonrandom methods of data collection such as snowball sampling. It reviews the currently available literature on sampling hard to reach populations and highlights the dearth of material currently available on this subject. The paper also assesses the potential for using these methods in nursing research. The sampling methodology used by Faugier (1996) in her study of prostitutes, HIV and drugs is used as a current example within this context.
How to reach library users who cannot reach libraries?
Dragana Ljuić
2002-01-01
The article discusses ways of bringing library services closer to individuals or groups of users who have difficulty visiting, or cannot visit, the library themselves. The author presents the services offered by the Maribor Public Library and discusses how one of the basic human rights – the right of access to cultural goods, knowledge and information – is exercised also through library activities. By enabling access to library material and information, public libraries help to fulfill basic human rights and thus raise the quality of living in a social environment. The following forms of library activity are presented in the article: the »distance library« – borrowing books at home or in hospital, a bibliobus station for disabled users, and »mobile collections« in institutions whose users, due to age or illness, have difficulty accessing or cannot reach library materials and information by themselves.
Hunt, N C; Ghosh, K M; Blain, A P; Rushton, S P; Longstaff, L M; Deehan, D J
2015-05-01
The aim of this study was to compare the maximum laxity conferred by the cruciate-retaining (CR) and posterior-stabilised (PS) Triathlon single-radius total knee arthroplasty (TKA) for anterior drawer, varus-valgus opening and rotation in eight cadaver knees through a defined arc of flexion (0º to 110º). The null hypothesis was that the limits of laxity of CR- and PS-TKAs are not significantly different. The investigation was undertaken in eight loaded cadaver knees undergoing subjective stress testing using a measurement rig. First the native knee was tested, prior to preparation for CR-TKA, and subsequently for PS-TKA implantation. Surgical navigation was used to track maximal displacements/rotations at 0º, 30º, 60º, 90º and 110° of flexion. Mixed-effects modelling was used to define the behaviour of the TKAs. The laxity measured for the CR- and PS-TKAs revealed no statistically significant differences over the studied flexion arc for the two versions of TKA. Compared with the native knee, both TKAs exhibited slightly increased anterior drawer and decreased varus-valgus and internal-external rotational laxities. We believe further study is required to define the clinical states for which the additional constraint offered by a PS-TKA implant may be beneficial.
Norén, Patrik
2013-01-01
Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges, and therefore new algebraic results often need to be developed. This interplay between algebra and statistics fertilizes both disciplines. Algebraic statistics is a relativ...
张文军; 齐艳红; et al.
2002-01-01
Diversity and evenness indices are widely used in community ecology and biodiversity research. However, the lack of statistical tests on these indices restricts their reliability, and developing such tests is one of the focuses of biodiversity research. In the present study, randomization tests are presented for the statistical significance of diversity and evenness indices, for confidence intervals of diversity and evenness, and for the statistical significance of between-community differences. The Shannon-Wiener, Simpson, McIntosh, Berger-Parker, Hurlbert and Brillouin diversity indices, and the corresponding evenness indices, are included in the randomization test procedure. Web-based computational software for the statistical tests, BiodiversityTest, comprising seven Java classes and an HTML file, was developed. It can be run on various operating systems and Java-enabled web browsers, and may read ODBC-linked databases such as MS Access, Excel, FoxPro and dBASE. Rice arthropod diversity (15 sampling sites, 125 arthropod species, 17 functional groups) was recorded in September 1996 at the IRRI rice farm using a RiceVac apparatus and bucket enclosure. The data were analysed using BiodiversityTest with the Shannon-Wiener index and the Berger-Parker index, and the results showed that changes in diversity and evenness can be effectively detected by these tests. The randomization tests will correct the possibly wrong conclusions arising from the direct comparison of arthropod diversity used in most research up to now. The development of randomization tests on biodiversity will provide a quantitative tool for stricter statistical comparison of biodiversity between communities and present an absolute criterion for diversity measurement. BiodiversityTest makes the computation realistic and accessible on the Internet.
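One form such a randomization test can take is to pool the individuals of two communities and reshuffle them to build a null distribution for the difference in Shannon-Wiener diversity (a sketch of the general idea; the exact resampling scheme used by BiodiversityTest may differ):

```python
import numpy as np

rng = np.random.default_rng(1)

def shannon(counts):
    """Shannon-Wiener diversity H' = -sum(p_i * ln p_i) from abundance counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

def randomization_test(a, b, n_perm=5000):
    """Significance of |H'(a) - H'(b)| under random reallocation of the
    pooled individuals (species identities preserved) to the two communities."""
    observed = abs(shannon(a) - shannon(b))
    pooled = np.concatenate([np.repeat(np.arange(len(a)), a),
                             np.repeat(np.arange(len(b)), b)])
    n_a, count = sum(a), 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        ca = np.bincount(pooled[:n_a], minlength=len(a))
        cb = np.bincount(pooled[n_a:], minlength=len(b))
        count += abs(shannon(ca) - shannon(cb)) >= observed
    return observed, count / n_perm

even   = [25, 25, 25, 25]   # evenly distributed community
skewed = [85, 5, 5, 5]      # one dominant species
diff, p = randomization_test(even, skewed)
print(diff, p)
```

A small p indicates the diversity difference exceeds what random allocation of the same individuals would produce, which is exactly the between-community significance test the abstract describes.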
Proprioceptive recalibration arises slowly compared to reach adaptation.
Zbib, Basel; Henriques, Denise Y P; Cressman, Erin K
2016-08-01
When subjects reach in a novel visuomotor environment (e.g. while viewing a cursor representing their hand that is rotated from their hand's actual position), they typically adjust their movements (i.e. bring the cursor to the target), thus reducing reaching errors. Additionally, research has shown that reaching with altered visual feedback of the hand results in sensory changes, such that proprioceptive estimates of hand position are shifted in the direction of the visual feedback experienced (Cressman and Henriques in J Neurophysiol 102:3505-3518, 2009). This study looked to establish the time course of these sensory changes. Additionally, the time courses of implicit sensory and motor changes were compared. Subjects reached to a single visual target while seeing a cursor that was either aligned with their hand position (50 trials) or rotated 30° clockwise relative to their hand (150 trials). Reach errors and proprioceptive estimates of felt hand position were assessed following the aligned reach training trials and at seven different times during the rotated reach training trials by having subjects reach to the target without visual feedback, and provide estimates of their hand relative to a visual reference marker, respectively. Results revealed a shift in proprioceptive estimates throughout the rotated reach training trials; however, significant sensory changes were not observed until after 70 trials. In contrast, results showed a greater change in reaches after a limited number of reach training trials with the rotated cursor. These findings suggest that proprioceptive recalibration arises more slowly than reach adaptation.
Statistical Symbolic Execution with Informed Sampling
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have recently been proposed for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
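Stripped of the symbolic-execution machinery, the Bayesian estimation step can be sketched as follows (the toy program, the uniform prior, and the sampling of concrete inputs in place of symbolic paths are all illustrative, not Symbolic PathFinder's implementation):

```python
import random
from math import sqrt

random.seed(42)

def program(x, y):
    """Toy program under analysis: returns True when the target event
    (the 'assert violation') is reached."""
    return x > 0.9 and y > 0.8   # rare failure path, true prob 0.1 * 0.2

# Monte Carlo sampling of inputs stands in for sampling program paths.
hits, n = 0, 20000
for _ in range(n):
    hits += program(random.random(), random.random())

# Bayesian estimation: a Beta(1, 1) prior over the reach probability
# yields a Beta(1 + hits, 1 + n - hits) posterior after n samples.
a, b = 1 + hits, 1 + n - hits
post_mean = a / (a + b)
post_var = a * b / ((a + b) ** 2 * (a + b + 1))
print(post_mean, sqrt(post_var))
```

Hypothesis testing against a threshold (e.g., "is the failure probability below 0.05?") then amounts to integrating this posterior, and informed sampling sharpens it further by replacing the sampled estimate with exact probabilities for the pruned high-likelihood paths.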
新家, 健精
2013-01-01
© 2012 Springer Science+Business Media, LLC. All rights reserved. Article outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography.
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Spiking and LFP activity in PRR during symbolically instructed reaches.
Hwang, Eun Jung; Andersen, Richard A
2012-02-01
The spiking activity in the parietal reach region (PRR) represents the spatial goal of an impending reach when the reach is directed toward or away from a visual object. The local field potentials (LFPs) in this region also represent the reach goal when the reach is directed to a visual object. Thus PRR is a candidate area for reading out a patient's intended reach goals for neural prosthetic applications. For natural behaviors, reach goals are not always based on the location of a visual object, e.g., playing the piano following sheet music or moving following verbal directions. So far it has not been directly tested whether and how PRR represents reach goals in such cognitive, nonlocational conditions, and knowing the encoding properties in various task conditions would help in designing a reach goal decoder for prosthetic applications. To address this issue, we examined the macaque PRR under two reach conditions: reach goal determined by the stimulus location (direct) or shape (symbolic). For the same goal, the spiking activity near reach onset was indistinguishable between the two tasks, and thus a reach goal decoder trained with spiking activity in one task performed perfectly in the other. In contrast, the LFP activity at 20-40 Hz showed small but significantly enhanced reach goal tuning in the symbolic task, but its spatial preference remained the same. Consequently, a decoder trained with LFP activity performed worse in the other task than in the same task. These results suggest that LFP decoders in PRR should take into account the task context (e.g., locational vs. nonlocational) to be accurate, while spike decoders can robustly provide reach goal information regardless of the task context in various prosthetic applications.
Eliazar, Iddo
2017-05-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
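A minimal simulation of the harmonic Poisson process restricted to an interval [a, b]: with intensity λ(t) = 1/t, the point count is Poisson with mean ln(b/a) and point positions have density 1/(t ln(b/a)), so the interval statistics depend only on the ratio b/a, illustrating the scale invariance discussed in the paper (the sampling recipe below is a standard inhomogeneous-Poisson construction, not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(7)

def harmonic_poisson(a, b):
    """One realisation of a Poisson process with harmonic intensity
    lambda(t) = 1/t on [a, b]: Poisson(ln(b/a)) many points, each drawn
    by inverse-CDF sampling from density 1/(t * ln(b/a))."""
    n = rng.poisson(np.log(b / a))
    u = rng.random(n)
    return np.sort(a * (b / a) ** u)

# Scale invariance: [1, 10] and [10, 100] share the same count statistics.
counts1 = [len(harmonic_poisson(1, 10)) for _ in range(20000)]
counts2 = [len(harmonic_poisson(10, 100)) for _ in range(20000)]
print(np.mean(counts1), np.mean(counts2), np.log(10))
```

Both empirical means cluster around ln(10), as they must for any pair of intervals with the same endpoint ratio.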
Olefins and chemical regulation in Europe: REACH.
Penman, Mike; Banton, Marcy; Erler, Steffen; Moore, Nigel; Semmler, Klaus
2015-11-05
REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals) is the European Union's chemical regulation for the management of risk to human health and the environment (European Chemicals Agency, 2006). This regulation entered into force in June 2007 and required manufacturers and importers to register substances produced in annual quantities of 1000 tonnes or more by December 2010, with further deadlines for lower tonnages in 2013 and 2018. Depending on the type of registration, required information included the substance's identification, the hazards of the substance, the potential exposure arising from the manufacture or import, the identified uses of the substance, and the operational conditions and risk management measures applied or recommended to downstream users. Among the content developed to support this information were Derived No-Effect Levels or Derived Minimal Effect Levels (DNELs/DMELs) for human health hazard assessment, Predicted No Effect Concentrations (PNECs) for environmental hazard assessment, and exposure scenarios for exposure and risk assessment. Once registered, substances may undergo evaluation by the European Chemicals Agency (ECHA) or Member State authorities and be subject to requests for additional information or testing as well as additional risk reduction measures. To manage the REACH registration and related activities for the European olefins and aromatics industry, the Lower Olefins and Aromatics REACH Consortium was formed in 2008 with administrative and technical support provided by Penman Consulting. A total of 135 substances are managed by this group including 26 individual chemical registrations (e.g. benzene, 1,3-butadiene) and 13 categories consisting of 5-26 substances. This presentation will describe the content of selected registrations prepared for 2010 in addition to the significant post-2010 activities. Beyond REACH, content of the registrations may also be relevant to other European activities, for
How Do Chinese Enterprises Look at REACH?
Anonymous
2007-01-01
The new European REACH (Registration, Evaluation, Authorization of Chemicals) regulation has come into force. As soon as the REACH white paper was issued, Chinese enterprises started to research the possible impacts of REACH and prepare to cope with them. How then do these Chinese enterprises look at REACH? Following are the views of some Chinese enterprises exporting chemical products to the European Union.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.
2010-01-01
A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Lyons, L
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
ALMA telescope reaches new heights
2009-09-01
of the Array Operations Site. This means surviving strong winds and temperatures between +20 and -20 Celsius whilst being able to point precisely enough that they could pick out a golf ball at a distance of 15 km, and to keep their smooth reflecting surfaces accurate to better than 25 micrometres (less than the typical thickness of a human hair). Once the transporter reached the high plateau it carried the antenna to a concrete pad - a docking station with connections for power and fibre optics - and positioned it with an accuracy of a few millimetres. The transporter is guided by a laser steering system and, just like some cars today, also has ultrasonic collision detectors. These sensors ensure the safety of the state-of-the-art antennas as the transporter drives them across what will soon be a rather crowded plateau. Ultimately, ALMA will have at least 66 antennas distributed over about 200 pads, spread over distances of up to 18.5 km and operating as a single, giant telescope. Even when ALMA is fully operational, the transporters will be used to move the antennas between pads to reconfigure the telescope for different kinds of observations. "Transporting our first antenna to the Chajnantor plateau is an epic feat which exemplifies the exciting times in which ALMA is living. Day after day, our global collaboration brings us closer to the birth of the most ambitious ground-based astronomical observatory in the world", said Thijs de Graauw, ALMA Director. This first ALMA antenna at the high site will soon be joined by others and the ALMA team looks forward to making their first observations from the Chajnantor plateau. They plan to link three antennas by early 2010, and to make the first scientific observations with ALMA in the second half of 2011. ALMA will help astronomers answer important questions about our cosmic origins. The telescope will observe the Universe using light with millimetre and submillimetre wavelengths, between infrared light and radio waves in
Ross, Sheldon M
2005-01-01
In this revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this goal through a coherent mix of mathematical analysis, intuitive discussions and examples. Ross's clear writin
Ross, Sheldon M
2010-01-01
In this revised 3rd edition, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. Concepts are motivated, illustrated and explained in a way that attempts to increase one's intuition. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Wannier, Gregory H
2010-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics. Designed for
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Stream Habitat Reach Summary - NCWAP [ds158]
California Department of Resources — The Stream Habitat - NCWAP - Reach Summary [ds158] shapefile contains in-stream habitat survey data summarized to the stream reach level. It is a derivative of the...
The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
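One simple per-image statistic of the kind described is the Shannon entropy of the intensity histogram (an illustrative choice only; the report does not say which statistics its authors used):

```python
import numpy as np

def image_entropy(pixels: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (in bits) of an image's intensity histogram.

    A generic information measure, not the report's actual method:
    low-entropy images are nearly uniform, high-entropy images busy.
    """
    hist, _ = np.histogram(pixels, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

# A uniform-noise image carries more information than a constant one.
rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, size=(64, 64))
flat = np.zeros((64, 64))
assert image_entropy(noisy) > image_entropy(flat)
```

Ranking a dataset by such a score is one cheap way to surface candidate images of interest before any human inspection.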
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way for the solution of central problems of multivariate statistics which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
Contributions to industrial statistics
2015-01-01
This thesis is about statistics' contributions to industry. It is an article compendium comprising four articles divided in two blocks: (i) two contributions for a water supply company, and (ii) significance of the effects in Design of Experiments. In the first block, great emphasis is placed on how the research design and statistics can be applied to various real problems that a water company raises and it aims to convince water management companies that statistics can be very useful to impr...
Jana, Madhusudan
2015-01-01
This self-sufficient text on statistical mechanics is written in a lucid manner with university examination systems in mind. The need to study the subject and its relation to thermodynamics are discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed gradually and thoroughly. All three types of statistical distribution functions are derived separately, together with their ranges of application and limitations. Non-interacting ideal Bose gases and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena - thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Freund, Rudolf J; Wilson, William J
2010-01-01
Statistical Methods, 3e provides students with a working introduction to statistical methods, offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach, emphasizing concepts and techniques for working out problems and interpreting results. The book includes research projects, real-world case studies, and numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises a
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
NONE
1998-12-31
For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-September 1998, Energy exports by recipient country in January-September 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, and pollution fees on oil products
NONE
1998-12-31
For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-June 1998, Energy exports by recipient country in January-June 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, and pollution fees on oil products
Gallavotti, Giovanni
2011-01-01
C. Cercignani: A sketch of the theory of the Boltzmann equation.- O.E. Lanford: Qualitative and statistical theory of dissipative systems.- E.H. Lieb: many particle Coulomb systems.- B. Tirozzi: Report on renormalization group.- A. Wehrl: Basic properties of entropy in quantum mechanics.
Sreeram V Ramagopalan
2015-04-01
Background: We and others have shown that a significant proportion of interventional trials registered on ClinicalTrials.gov have their primary outcomes altered after the listed study start and completion dates. The objectives of this study were to investigate whether changes made to primary outcomes are associated with the likelihood of reporting a statistically significant primary outcome on ClinicalTrials.gov. Methods: A cross-sectional analysis of all interventional clinical trials registered on ClinicalTrials.gov as of 20 November 2014 was performed. The main outcome was any change made to the initially listed primary outcome and the time of the change in relation to the trial start and end date. Findings: 13,238 completed interventional trials were registered with ClinicalTrials.gov that also had study results posted on the website. 2555 (19.3%) had one or more statistically significant primary outcomes. Statistical analysis showed that registration year, funding source and primary outcome change after trial completion were associated with reporting a statistically significant primary outcome. Conclusions: Funding source and primary outcome change after trial completion are associated with a statistically significant primary outcome report on ClinicalTrials.gov.
Natrella, Mary Gibbons
2005-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Reach preparation enhances visual performance and appearance.
Rolfs, Martin; Lawrence, Bonnie M; Carrasco, Marisa
2013-10-19
We investigated the impact of the preparation of reach movements on visual perception by simultaneously quantifying both an objective measure of visual sensitivity and the subjective experience of apparent contrast. Using a two-by-two alternative forced choice task, observers compared the orientation (clockwise or counterclockwise) and the contrast (higher or lower) of a Standard Gabor and a Test Gabor, the latter of which was presented during reach preparation, at the reach target location or the opposite location. Discrimination performance was better overall at the reach target than at the opposite location. Perceived contrast increased continuously at the target relative to the opposite location during reach preparation, that is, after the onset of the cue indicating the reach target. The finding that performance and appearance do not evolve in parallel during reach preparation points to a distinction with saccade preparation, for which we have shown previously there is a parallel temporal evolution of performance and appearance. Yet akin to saccade preparation, this study reveals that overall reach preparation enhances both visual performance and appearance.
2012-01-01
In 1975 John Tukey proposed a multivariate median which is the 'deepest' point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its 'depth' by the smallest portion of data that are separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more general...
Sheffield, Scott
2009-01-01
In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.
Improving exposure scenario definitions within REACH
Lee, Jihyun; Pizzol, Massimo; Thomsen, Marianne
instruments to support a precautionary chemicals management system and to protect receptor’s health have also been increasing. Since 2007, the European Union adopted REACH (the Regulation on Registration, Evaluation, Authorisation and Restriction of Chemicals): REACH makes industry responsible for assessing...... the different background exposure between two countries allows in fact the definition of a common framework for improving exposure scenarios within REACH system, for monitoring environmental health, and for increasing degree of circularity of resource and substance flows. References 1. European Commission...
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
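The diagnostic-accuracy measures listed in this abstract can be made concrete with a small helper (an illustrative sketch, not code from the article; the function name and example counts are invented):

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Basic diagnostic-test measures from a 2x2 confusion table.

    Counts are true/false positives and negatives of the test
    against a gold-standard diagnosis.
    """
    sensitivity = tp / (tp + fn)               # P(test+ | disease)
    specificity = tn / (tn + fp)               # P(test- | no disease)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
    return {"sensitivity": sensitivity, "specificity": specificity,
            "accuracy": accuracy, "LR+": lr_pos, "LR-": lr_neg}

# Hypothetical study: 100 diseased (90 detected), 100 healthy (80 cleared).
m = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
assert m["sensitivity"] == 0.90 and m["specificity"] == 0.80
```

Here the positive likelihood ratio works out to 0.9 / 0.2 = 4.5: a positive result makes disease 4.5 times more likely than it would be under a random test.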
U.S. Department of Health & Human Services — This reference provides significant summary information about health expenditures and the Centers for Medicare & Medicaid Services' (CMS) programs. The...
Compact muon solenoid magnet reaches full field
2006-01-01
Scientists of the U.S. Department of Energy's Fermilab and collaborators on the US/CMS project announced that the world's largest superconducting solenoid magnet has reached full field in tests at CERN. (1 page)
Hanford Reach - Ringold Russian Knapweed Treatment
US Fish and Wildlife Service, Department of the Interior — Increase the diversity of the seed mix on approximately 250 acres in the Ringold Unit of the Hanford Reach National Monument (Monument) treated with aminopyralid as...
RICHY
Expanded Program on Immunisation (EPI) training in Zambia and critically analyses ... excellence in skills such as sport, music or dance, so it is ... only improve through reaching every child both physically and in .... Non-verbal communication.
Women Reaching Equality in Dubious Habit: Drinking
Females also ... MONDAY, Oct. 24, 2016 (HealthDay News) -- Women have made major strides towards equality with men, ...
Reaching the Overlooked Student in Physical Education
Esslinger, Keri; Esslinger, Travis; Bagshaw, Jarad
2015-01-01
This article describes the use of live action role-playing, or "LARPing," as a non-traditional activity that has the potential to reach students who are not interested in traditional physical education.
Paine, Gregory Harold
1982-03-01
The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better
Parallel explicit and implicit control of reaching.
Pietro Mazzoni
BACKGROUND: Human movement can be guided automatically (implicit control) or attentively (explicit control). Explicit control may be engaged when learning a new movement, while implicit control enables simultaneous execution of multiple actions. Explicit and implicit control can often be assigned arbitrarily: we can simultaneously drive a car and tune the radio, seamlessly allocating implicit or explicit control to either action. This flexibility suggests that sensorimotor signals, including those that encode spatially overlapping perception and behavior, can be accurately segregated to explicit and implicit control processes. METHODOLOGY/PRINCIPAL FINDINGS: We tested human subjects' ability to segregate sensorimotor signals to parallel control processes by requiring dual (explicit and implicit) control of the same reaching movement and testing for interference between these processes. Healthy control subjects were able to engage dual explicit and implicit motor control without degradation of performance compared to explicit or implicit control alone. We then asked whether segregation of explicit and implicit motor control can be selectively disrupted by studying dual-control performance in subjects with no clinically manifest neurologic deficits in the presymptomatic stage of Huntington's disease (HD). These subjects performed successfully under either explicit or implicit control alone, but were impaired in the dual-control condition. CONCLUSION/SIGNIFICANCE: The human nervous system can exert dual control on a single action, and is therefore able to accurately segregate sensorimotor signals to explicit and implicit control. The impairment observed in the presymptomatic stage of HD points to a possible crucial contribution of the striatum to the segregation of sensorimotor signals to multiple control processes.
Statistical Inference and String Theory
Heckman, Jonathan J
2013-01-01
In this note we expose some surprising connections between string theory and statistical inference. We consider a large collective of agents sweeping out a family of nearby statistical models for an M-dimensional manifold of statistical fitting parameters. When the agents making nearby inferences align along a d-dimensional grid, we find that the pooled probability that the collective reaches a correct inference is the partition function of a non-linear sigma model in d dimensions. Stability under perturbations to the original inference scheme requires the agents of the collective to distribute along two dimensions. Conformal invariance of the sigma model corresponds to the condition of a stable inference scheme, directly leading to the Einstein field equations for classical gravity. By summing over all possible arrangements of the agents in the collective, we reach a string theory. We also use this perspective to quantify how much an observer can hope to learn about the internal geometry of a superstring com...
Positive effects of robotic exoskeleton training of upper limb reaching movements after stroke
Frisoli Antonio
2012-06-01
This study, conducted in a group of nine chronic patients with right-side hemiparesis after stroke, investigated the effects of robotic-assisted rehabilitation training with an upper limb robotic exoskeleton for the restoration of motor function in spatial reaching movements. The robotic-assisted rehabilitation training was administered for a period of 6 weeks, including reaching and spatial antigravity movements. To assess the carry-over of the observed improvements in movement during training into improved function, a kinesiologic assessment of the effects of the training was performed by means of motion and dynamic electromyographic analysis of reaching movements performed before and after training. The same kinesiologic measurements were performed in a healthy control group of seven volunteers, to determine a benchmark for the experimental observations in the patients' group. Moreover, the degree of functional impairment at enrolment and discharge was measured by clinical evaluation with the upper limb Fugl-Meyer Assessment scale (FMA, 0–66 points), Modified Ashworth scale (MA, 0–60 pts) and active ranges of motion. The robot-aided training induced, independently of time since stroke, statistically significant improvements of kinesiologic (movement time, smoothness of motion) and clinical (4.6 ± 4.2 increase in FMA, 3.2 ± 2.1 decrease in MA) parameters, as a result of the increased active ranges of motion and improved co-contraction index for shoulder extension/flexion. Kinesiologic parameters correlated significantly with clinical assessment values, and their changes after the training were affected by the direction of motion (inward vs. outward movement) and position of the target to be reached (ipsilateral, central and contralateral peripersonal space). These changes can be explained as a result of the motor recovery induced by the robotic training, in terms of regained ability to execute single joint movements and of improved
Müller-Kirsten, Harald J W
2013-01-01
Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
Paolucci, Teresa; Zangrando, Federico; Piccinini, Giulia; Sciarra, Federico; Pallotta, Rocco; Mannocci, Alice; la Torre, Giuseppe; Bini, Fabiano; Marinozzi, Franco; Gumina, Stefano; Padua, Luca; Saraceni, Vincenzo Maria
2016-01-01
Background. The position sense of the shoulder joint is important during reaching. Objective. To examine the existence of additional competence of the shoulder with regard to the ability to measure extracorporeal space, through a novel approach using the shoulder proprioceptive rehabilitation tool (SPRT), during reaching. Design. Observational case-control study. Methods. We examined 50 subjects: 25 healthy and 25 with impingement syndrome, with mean ages [years] of 64.52 +/- 6.98 and 68.36 +/- 6.54, respectively. Two parameters were evaluated using the SPRT: the integration of visual information and the proprioceptive afferents of the shoulder (Test 1), and the discriminative proprioceptive capacity of the shoulder with the subject blindfolded (Test 2). These tasks assessed the spatial error (in centimeters) made by the shoulder joint in reaching movements on the sagittal plane. Results. The shoulder had proprioceptive features that allowed it to memorize a reaching position and reproduce it (error of 1.22 cm to 1.55 cm in healthy subjects). This ability was lower in the impingement group, with a statistically significant difference compared to the healthy group. Conclusions. The shoulder has specific expertise in the measurement of extracorporeal space during reaching movements, which gradually decreases in impingement syndrome.
Do working environment interventions reach shift workers?
Nabe-Nielsen, Kirsten; Jørgensen, Marie Birk; Garde, Anne Helene
2016-01-01
PURPOSE: Shift workers are exposed to more physical and psychosocial stressors in the working environment as compared to day workers. Despite the need for targeted prevention, it is likely that workplace interventions less frequently reach shift workers. The aim was therefore to investigate whether the reach of workplace interventions varied between shift workers and day workers and whether such differences could be explained by the quality of leadership exhibited at different times of the day. METHODS: We used questionnaire data from 5361 female care workers in the Danish eldercare sector… RESULTS: Shift workers were less likely to be reached by workplace interventions. For example, night workers less frequently reported that they had got more flexibility (OR 0.5; 95% CI 0.3–0.7) or that they had participated in improvements of the working procedures (OR 0.6; 95% CI 0.5–0.8). Quality of leadership…
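The odds ratios quoted in the abstract (e.g., OR 0.5; 95% CI 0.3–0.7) come from standard 2×2-table arithmetic. A minimal sketch using Woolf's log-OR method, with made-up counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Woolf's method: log-OR is approximately normal with
    # standard error sqrt(1/a + 1/b + 1/c + 1/d).
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical counts: 30 of 100 night workers reached by an
# intervention vs. 60 of 120 day workers.
or_, (lo, hi) = odds_ratio_ci(30, 70, 60, 60)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

A confidence interval excluding 1.0, as here, corresponds to a statistically significant difference in reach between the two groups.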
REACH. Analytical characterisation of petroleum UVCB substances
De Graaff, R.; Forbes, S.; Gennart, J.P.; Gimeno Cortes, M.J.; Hovius, H.; King, D.; Kleise, H.; Martinez Martin, C.; Montanari, L.; Pinzuti, M.; Pollack, H.; Ruggieri, P.; Thomas, M.; Walton, A.; Dmytrasz, B.
2012-10-15
The purpose of this report is to summarise the findings of the scientific and technical work undertaken by CONCAWE to assess the feasibility and potential benefit of characterising petroleum UVCB substances (Substances of Unknown or Variable Composition, Complex reaction products or Biological Materials) beyond the recommendations issued by CONCAWE for the substance identification of petroleum substances under REACH. REACH is the European Community Regulation on chemicals and their safe use (EC 1907/2006). It deals with the Registration, Evaluation, Authorisation and Restriction of Chemical substances. The report is based on Member Company experience of the chemical analysis of petroleum UVCB substances, including analysis in support of REACH registrations undertaken in 2010. This report is structured into four main sections, namely: Section 1 which provides an introduction to the subject of petroleum UVCB substance identification including the purpose of the report, regulatory requirements, the nature of petroleum UVCB substances, and CONCAWE's guidance to Member Companies and other potential registrants. Section 2 provides a description of the capabilities of each of the analytical techniques described in the REACH Regulation. This section also includes details on the type of analytical information obtained by each technique and an evaluation of what each technique can provide for the characterisation of petroleum UVCB substances. Section 3 provides a series of case studies for six petroleum substance categories (low boiling point naphthas, kerosene, heavy fuel oils, other lubricant base oils, residual aromatic extracts and bitumens) to illustrate the value of the information derived from each analytical procedure, and provide an explanation for why some techniques are not scientifically necessary. Section 4 provides a summary of the conclusions reached from the technical investigations undertaken by CONCAWE Member Companies, and summarising the
Esposti, Roberto; Bruttini, Carlo; Bolzoni, Francesco; Cavallari, Paolo
2017-02-17
During goal-directed arm movements, the eyes, head, and arm are coordinated to look at and reach the target. We examined whether the expectancy of visual information about the target modifies Anticipatory Postural Adjustments (APAs). Ten standing subjects had to (1) move the eyes, head and arm so as to reach, with both gaze and index finger, a target of known position placed outside their visual field (Gaze-Reach); (2) look at the target while reaching it (Reach in Full Vision); (3) keep the gaze away until having touched it (Reach then Gaze); and (4) just gaze at the target without reaching it (Gaze without Reach). We recorded eye, head, right arm, and acromion kinematics, EMGs from upper- and lower-limb muscles, and forces exerted on the ground. In Gaze-Reach, two coordination strategies were found: gaze preceding arm muscle recruitment (Gaze-first) and the opposite (Reach-first). APAs in acromion kinematics, leg muscles, and ground forces started significantly earlier in Gaze-first vs. Reach-first (mean time advance: 44.3 ± 8.9 ms), as they did in Reach in Full Vision vs. Reach then Gaze (39.5 ± 7.9 ms). The Gaze-first to Reach-first time-shift was similar to that between Reach in Full Vision and Reach then Gaze (p = 0.58). Moreover, the Gaze without Reach data showed that the head-induced postural actions did not affect the APA onset in Gaze-first and Reach-first. In conclusion, in Gaze-first the central control of posture considers visual information while planning the movement, as in Reach in Full Vision, while Reach-first is more similar to Reach then Gaze, where vision is not required.
The statistical significance of hippotherapy for children with psychomotor disabilities
Anca Nicoleta BNLBA
2015-01-01
Topic: The recovery and social integration of children with psychomotor disabilities is an important goal for the integration of Romania into the European Union. Studies conducted in this area reveal that people who practice horse-assisted therapy on the recommendation of professionals benefit from a much faster recovery and at a much higher level. Purpose of study: Identification of results for adaptive areas due to participation in a therapy program with the help of the horse for chil...
Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks
2016-04-26
The questioned p value: clinical, practical and statistical significance
Rosa Jiménez-Paneque
2016-09-01
The use of the p-value and statistical significance has been under question from the early 1980s to the present day. Much has been discussed in this regard in the field of statistics and its applications, particularly in epidemiology and public health. The p-value and its equivalent, statistical significance, are moreover difficult concepts to grasp for the many health professionals involved in some way in research applied to their areas of work. However, its meaning should be clear in intuitive terms even though it is based on theoretical concepts from the field of mathematical statistics. This article attempts to present the p-value as a concept that applies to everyday life and is therefore intuitively simple, but whose proper use cannot be separated from theoretical and methodological elements of intrinsic complexity. The reasons behind the criticisms of the p-value and of its use in isolation are also explained intuitively, mainly the need to distinguish statistical significance from clinical significance, and some of the remedies proposed for these problems are mentioned. The article ends by alluding to the current tendency to vindicate its use, appealing to the convenience of using it in certain situations, and to the recent statement of the American Statistical Association in this regard.
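The article's distinction between statistical and clinical significance is easy to demonstrate numerically. A minimal sketch with illustrative numbers (not from the article): with a large enough sample, a clinically trivial deviation, a coin landing heads 50.5% of the time, becomes highly statistically significant, while the confidence interval shows the effect is tiny.

```python
import math

def two_sided_p_and_ci(successes, n, p0=0.5, z=1.96):
    """Normal-approximation two-sided p-value against H0: p = p0,
    plus a Wald 95% confidence interval for the observed proportion."""
    phat = successes / n
    se0 = math.sqrt(p0 * (1 - p0) / n)        # SE under H0
    zstat = (phat - p0) / se0
    # Two-sided p via the standard normal CDF (erf form).
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(zstat) / math.sqrt(2))))
    se = math.sqrt(phat * (1 - phat) / n)     # SE for the CI
    return p, (phat - z * se, phat + z * se)

p, (lo, hi) = two_sided_p_and_ci(50_500, 100_000)
print(f"p = {p:.4f}, 95% CI ({lo:.4f}, {hi:.4f})")
# Statistically significant (p < 0.05), yet the CI shows the
# proportion is at most ~0.508: a clinically negligible effect.
```

This is exactly the article's point: the p-value alone says nothing about the magnitude of the effect, while the CI does.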
Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks
2015-03-16
…a high level of transitivity are often more stable, balanced and harmonious. For social networks, Granovetter [3] in his work on the "strength of weak ties"… …schedules for the independent teams, relative to the other conferences in the FBS. Applying the proposed clustering algorithm to the FBS network correctly identified all 11 conferences, as well as those teams that belong to those conferences. The "independent" teams were also assigned to a conference.
Zhao, Qing-he; Liu, Qian; Ma, Li-jiao; Ding, Sheng-yan; Lu, Xun-ling; Tang, Qian; Xu, Shan-shan
2015-12-01
Soil and vegetation are the foundation of riparian ecosystem services, and their spatial distribution and variation can determine the effects of ecological functions. In the present study, selecting a typical reach of the middle and lower reaches of the Yellow River as the study area, the spatial distributions of riparian soil physicochemical properties and their response to environmental factors were analyzed using field investigation, experimental analysis, and redundancy analysis (RDA). The results showed that the soil in the study area was dominated by silt; with increasing riparian buffer distance, soil bulk density first increased and then decreased, whereas soil moisture showed the opposite pattern. Changes in total soil phosphorus (TP), available phosphorus (AP), total carbon (TC), total organic carbon (TOC), total nitrogen (TN), ammonium nitrogen (NH₄⁺-N) and nitrate nitrogen (NO₃⁻-N) contents at different riparian buffer distances showed no statistically significant differences. The spatial distribution of soil chemical properties was generally not significantly different between the two vegetation types. Pearson correlation analysis showed close relationships between soil physical and chemical properties; in particular, TOC content in the study area was positively and significantly related to TN, which may accelerate the degradation rate of organic matter in soils. In addition, the results of RDA indicated that TOC and NH₄⁺-N contents increased with increasing height and coverage of the tree layer. Soil TP and NO₃⁻-N contents increased with increasing diameter at breast height (DBH) of the tree layer and coverage of the herb layer. Meanwhile, with increasing elevation gradient, the content of soil NH₄⁺-N presented an increasing trend, indicating that soil properties were significantly influenced by the effects of community
Smith, Kristy Breuhl; Smith, Michael Seth
2016-03-01
Obesity is a chronic disease that is strongly associated with an increase in mortality and morbidity including, certain types of cancer, cardiovascular disease, disability, diabetes mellitus, hypertension, osteoarthritis, and stroke. In adults, overweight is defined as a body mass index (BMI) of 25 kg/m(2) to 29 kg/m(2) and obesity as a BMI of greater than 30 kg/m(2). If current trends continue, it is estimated that, by the year 2030, 38% of the world's adult population will be overweight and another 20% obese. Significant global health strategies must reduce the morbidity and mortality associated with the obesity epidemic.
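The cutoffs quoted above are simple arithmetic on BMI = weight / height². A minimal sketch of the classification described in the abstract:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def adult_category(b):
    """Adult weight categories matching the cutoffs in the abstract:
    overweight 25-29 kg/m^2, obese >= 30 kg/m^2."""
    if b >= 30:
        return "obese"
    if b >= 25:
        return "overweight"
    return "normal or underweight"

print(adult_category(bmi(70, 1.75)))   # ~22.9 kg/m^2
print(adult_category(bmi(95, 1.80)))   # ~29.3 kg/m^2
print(adult_category(bmi(100, 1.70)))  # ~34.6 kg/m^2
```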
THE LONG REACH OF EDUCATION: EARLY RETIREMENT.
Venti, Steven; Wise, David A
2015-12-01
The goal of this paper is to draw attention to the long lasting effect of education on economic outcomes. We use the relationship between education and two routes to early retirement - the receipt of Social Security Disability Insurance (DI) and the early claiming of Social Security retirement benefits - to illustrate the long-lasting influence of education. We find that for both men and women with less than a high school degree the median DI participation rate is 6.6 times the participation rate for those with a college degree or more. Similarly, men and women with less than a high school education are over 25 percentage points more likely to claim Social Security benefits early than those with a college degree or more. We focus on four critical "pathways" through which education may indirectly influence early retirement - health, employment, earnings, and the accumulation of assets. We find that for women health is the dominant pathway through which education influences DI participation. For men, the health, earnings, and wealth pathways are of roughly equal magnitude. For both men and women the principal channel through which education influences early Social Security claiming decisions is the earnings pathway. We also consider the direct effect of education that does not operate through these pathways. The direct effect of education is much greater for early claiming of Social Security benefits than for DI participation, accounting for 72 percent of the effect of education for men and 67 percent for women. For women the direct effect of education on DI participation is not statistically significant, suggesting that the total effect may be through the four pathways.
Polishing Difficult-To-Reach Cavities
Malinzak, R. Michael; Booth, Gary N.
1990-01-01
Springy abrasive tool used to finish surfaces of narrow cavities made by electrical-discharge machining. Robot arm moves vibrator around perimeters of cavities, polishing walls of cavities as it does so. Tool needed because such cavities inaccessible or at least difficult to reach with most surface-finishing tools.
REACH. Electricity Units, Post-Secondary.
Smith, Gene; And Others
As a part of the REACH (Refrigeration, Electro-Mechanical, Air-Conditioning, Heating) electromechanical cluster, this postsecondary student manual contains individualized instructional units in the area of electricity. The instructional units focus on electricity fundamentals, electric motors, electrical components, and controls and installation.…
Reliability of the Advanced REACH Tool (ART)
Schinkel, J.; Fransman, W.; McDonnell, P.E.; Entink, R.K.; Tielemans, E.; Kromhout, H.
2014-01-01
Objectives: The aim of this study was to assess the reliability of the Advanced REACH Tool (ART) by (i) studying interassessor agreement of the resulting exposure estimates generated by the ART mechanistic model, (ii) studying interassessor agreement per model parameters of the ART mechanistic model
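Interassessor agreement of the kind studied here is commonly summarized with chance-corrected statistics. The ART study's actual metrics may differ, but a minimal Cohen's-kappa sketch on hypothetical paired exposure-band ratings looks like:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement between two raters on the same items."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # Expected agreement if the raters assigned labels independently
    # with their observed marginal frequencies.
    expected = sum((c1[l] / n) * (c2[l] / n) for l in set(c1) | set(c2))
    return (observed - expected) / (1 - expected)

# Hypothetical exposure-band assignments by two ART assessors.
r1 = ["low", "low", "med", "high", "med", "low", "high", "med"]
r2 = ["low", "med", "med", "high", "med", "low", "high", "low"]
print(f"kappa = {cohens_kappa(r1, r2):.2f}")
```

Kappa of 1 means perfect agreement; 0 means agreement no better than chance.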
Guiding Warfare to Reach Sustainable Peace
Vestenskov, David; Drewes, Line
The conference report Guiding Warfare to Reach Sustainable Peace constitutes the primary outcome of the conference. It is based on excerpts from the conference presenters and workshop discussions. Furthermore, the report contains policy recommendations and key findings, with the ambition of develo...
ATLAS Barrel Toroid magnet reached nominal field
2006-01-01
On 9 November the barrel toroid magnet reached its nominal field of 4 teslas, with an electrical current of 21,000 amperes (21 kA) passing through the eight superconducting coils, as shown on this graph.
Science Experiments: Reaching Out to Our Users
Nolan, Maureen; Tschirhart, Lori; Wright, Stephanie; Barrett, Laura; Parsons, Matthew; Whang, Linda
2008-01-01
As more users access library services remotely, it has become increasingly important for librarians to reach out to their user communities and promote the value of libraries. Convincing the faculty and students in the sciences of the value of libraries and librarians can be a particularly "hard sell" as more and more of their primary…
The REACH Youth Program Learning Toolkit
Sierra Health Foundation, 2011
2011-01-01
Believing in the value of using video documentaries and data as learning tools, members of the REACH technical assistance team collaborated to develop this toolkit. The learning toolkit was designed using and/or incorporating components of the "Engaging Youth in Community Change: Outcomes and Lessons Learned from Sierra Health Foundation's…
Leonard, Christina M.; Legleiter, Carl; Overstreet, Brandon T.
2017-01-01
This study examined the effects of natural and anthropogenic changes in confining margin width by applying remote sensing techniques – fusing LiDAR topography with image-derived bathymetry – over a large spatial extent: 58 km of the Snake River, Wyoming, USA. Fused digital elevation models from 2007 and 2012 were differenced to quantify changes in the volume of stored sediment, develop morphological sediment budgets, and infer spatial gradients in bed material transport. Our study spanned two similar reaches that were subject to different controls on confining margin width: natural terraces versus artificial levees. Channel planform in reaches with similar slope and confining margin width differed depending on whether the margins were natural or anthropogenic. The effects of tributaries also differed between the two reaches. Generally, the natural reach featured greater confining margin widths and was depositional, whereas artificial lateral constriction in the leveed reach produced a sediment budget that was closer to balanced. Although our remote sensing methods provided topographic data over a large area, net volumetric changes were not statistically significant due to the uncertainty associated with bed elevation estimates. We therefore focused on along-channel spatial differences in bed material transport rather than absolute volumes of sediment. To complement indirect estimates of sediment transport derived by morphological sediment budgeting, we collected field data on bed mobility through a tracer study. Surface and subsurface grain size measurements were combined with bed mobility observations to calculate armoring and dimensionless sediment transport ratios, which indicated that sediment supply exceeded transport capacity in the natural reach and vice versa in the leveed reach. We hypothesize that constriction by levees induced an initial phase of incision and bed armoring. Because levees prevented bank erosion, the channel excavated sediment by
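The morphological sediment budgeting described above (differencing two DEMs and discarding changes below the elevation-uncertainty threshold) reduces to simple grid arithmetic. A minimal sketch with hypothetical elevations and level of detection, not the study's actual values:

```python
def dod_volume(dem_old, dem_new, cell_area, min_lod):
    """Net volume change (m^3) from a DEM of difference.
    Cells whose elevation change is below the level of
    detection (min_lod, m) are treated as no change."""
    net = 0.0
    for row_old, row_new in zip(dem_old, dem_new):
        for z0, z1 in zip(row_old, row_new):
            dz = z1 - z0
            if abs(dz) >= min_lod:
                net += dz * cell_area
    return net  # > 0 net deposition, < 0 net erosion

# Hypothetical 2x2 grids of bed elevations (m) from two surveys.
dem_2007 = [[101.0, 100.5], [100.2, 99.8]]
dem_2012 = [[101.4, 100.4], [100.2, 99.1]]
print(dod_volume(dem_2007, dem_2012, cell_area=1.0, min_lod=0.2))
```

This thresholding is why the study's net volumetric changes were deemed not statistically significant: changes smaller than the bed-elevation uncertainty cannot be distinguished from noise.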
Whither Statistics Education Research?
Watson, Jane
2016-01-01
This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…
Hill, Graham; Maddock, Ian P.; Smolar-Žvanut, Nataša
2007-01-01
This paper examines the effects of flow regulation on the size, spatial distribution and connectivity of channel geomorphic units (CGU) in the Soča River, Slovenia. A river channel survey was completed along three reaches: an unregulated reach (reach 1) and two regulated reaches with lower discharges (reaches 2 and 3). Results demonstrated significant differences in the CGU composition between the unregulated and regulated reaches. Flow regulation in the Soča River alters the dominant ty...
Jenkins, M E; Johnson, A M; Holmes, J D; Stephenson, F F; Spaulding, S J
2010-07-01
Balance problems and falls are a common concern among individuals with Parkinson's disease (PD). Falls frequently occur during daily activities such as reaching into cupboards in the kitchen or bathroom. This study compared the correlations between two standard postural stability tests - the postural stability score on the Unified Parkinson's Disease Rating Scale (UPDRS) and the Functional Reach Test (FRT) - and ecologically valid reaching tasks that correspond to reaching at different cupboard heights, among 20 individuals with PD and 20 age-matched controls. Both the FRT and the UPDRS postural stability tests are quick measures that can be performed during the clinical examination. The FRT, but not the postural stability score, demonstrated a significant correlation with the ecologically valid reaching tasks among individuals with PD. Furthermore, the FRT scores did not correlate with the UPDRS postural stability scores, indicating that these measure different aspects of balance. This study suggests that the FRT score may better predict the risk of postural instability encountered during daily activities among individuals with PD.
Nonparametric statistical methods
Hollander, Myles; Chicken, Eric
2013-01-01
Praise for the Second Edition"This book should be an essential part of the personal library of every practicing statistician."-Technometrics Thoroughly revised and updated, the new edition of Nonparametric Statistical Methods includes additional modern topics and procedures, more practical data sets, and new problems from real-life situations. The book continues to emphasize the importance of nonparametric methods as a significant branch of modern statistics and equips readers with the conceptual and technical skills necessary to select and apply the appropriate procedures for any given sit
Does workplace health promotion reach shift workers?
Nabe-Nielsen, Kirsten; Garde, Anne Helene; Clausen, Thomas;
2015-01-01
OBJECTIVES: One reason for health disparities between shift and day workers may be that workplace health promotion does not reach shift workers to the same extent as it reaches day workers. This study aimed to investigate the association between shift work and the availability of and participation in workplace health promotion. METHODS: We used cross-sectional questionnaire data from a large representative sample of all employed people in Denmark. We obtained information on the availability of and participation in six types of workplace health promotion. We also obtained information on working hours… RESULTS: In the general working population, fixed evening and fixed night workers, and employees working variable shifts including night work, reported a higher availability of health promotion, while employees working variable shifts without night work reported a lower availability of health promotion…
Distance Reached in the Anteromedial Reach Test as a Function of Learning and Leg Length
Bent, Nicholas P.; Rushton, Alison B.; Wright, Chris C.; Batt, Mark E.
2012-01-01
The Anteromedial Reach Test (ART) is a new outcome measure for assessing dynamic knee stability in anterior cruciate ligament-injured patients. The effect of learning and leg length on distance reached in the ART was examined. Thirty-two healthy volunteers performed 15 trials of the ART on each leg. There was a moderate correlation (r = 0.44-0.50)…
Park, Hyekang; Kang, Youngeun; Yoo, Minah; Lee, Bomjin; Yang, Jeongok; Lee, Joongsook; Han, Dongwook; Oh, Taeyoung
2017-01-01
[Purpose] The aim of this study was to investigate the mean velocity and angle of the shoulder joint, and the activation of the tibialis anterior and gastrocnemius, under both-eyes, dominant-eye and non-dominant-eye conditions during a reaching task in normal adults. [Subjects and Methods] We recruited 24 participants (11 male, 13 female) at Silla University. Participants performed a reaching movement under three conditions: both eyes, dominant eye only, and non-dominant eye only. The target was placed in a 45-degree diagonal direction at a distance of 130% of arm length. Kinetic analysis of the upper extremities was performed with the QUALISYS 3-dimensional motion analysis system, and muscle activation was measured by EMG during the reaching tasks. The collected data were statistically processed using SPSS for Windows, version 20.0. [Results] There was a significant difference in shoulder joint velocity of flexion, abduction and internal rotation according to visual field condition during the reaching tasks. There was no significant difference in shoulder joint angle or muscle activation according to visual field condition. [Conclusion] Visual field has an influence on shoulder joint velocity; therefore, the visual field may play an important role in reach performance. PMID:28210047
Worry, Intolerance of Uncertainty, and Statistics Anxiety
Williams, Amanda S.
2013-01-01
Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…
Do Economists Reach a Conclusion on Household and Municipal Recycling?
Matthew Gunter
2007-01-01
Do economists reach a conclusion on household and municipal recycling? I explore the policy judgments of published economists on recycling and find that there is no broad consensus. The mainstream recycling literature is dominated by a guided-market approach; taxes and subsidies are advocated to correct for market failures. There are two less popular but still significant approaches: a minimal government laissez faire approach and a command and control regulatory approach. Laissez faire econo...
Reaching Diverse Audiences through NOAO Education Programs
Pompea, Stephen M.; Sparks, R. T.; Walker, C. E.
2009-01-01
NOAO education programs are designed to reach diverse audiences. Examples described in this poster include the Hands-On Optics Project nationwide, an extension of the Hands-On Optics program at Boys and Girls Clubs in Arizona and in Hawaii, a professional development program for Navajo and Hopi teachers, a number of programs for the Tohono O'odham Nation, and a project collecting and reviewing Spanish language astronomy materials. Additionally NOAO is also involved in several local outreach projects for diverse and underserved audiences.
Can donated media placements reach intended audiences?
Cooper, Crystale Purvis; Gelb, Cynthia A; Chu, Jennifer; Polonec, Lindsey
2013-09-01
Donated media placements for public service announcements (PSAs) can be difficult to secure, and may not always reach intended audiences. Strategies used by the Centers for Disease Control and Prevention's (CDC) Screen for Life: National Colorectal Cancer Action Campaign (SFL) to obtain donated media placements include producing a diverse mix of high-quality PSAs, co-branding with state and tribal health agencies, securing celebrity involvement, monitoring media trends to identify new distribution opportunities, and strategically timing the release of PSAs. To investigate open-ended recall of PSAs promoting colorectal cancer screening, CDC conducted 12 focus groups in three U.S. cities with men and women either nearing age 50 years, when screening is recommended to begin, or aged 50-75 years who were not in compliance with screening guidelines. In most focus groups, multiple participants recalled exposure to PSAs promoting colorectal cancer screening, and most of these individuals reported having seen SFL PSAs on television, in transit stations, or on the sides of public buses. Some participants reported exposure to SFL PSAs without prompting from the moderator, as they explained how they learned about the disease. Several participants reported learning key campaign messages from PSAs, including that colorectal cancer screening should begin at age 50 years and screening can find polyps so they can be removed before becoming cancerous. Donated media placements can reach and educate mass audiences, including millions of U.S. adults who have not been screened appropriately for colorectal cancer.
Extended-reach wells tap outlying reserves
Nazzal, G. (Eastman Teleco, Houston, TX (United States))
1993-03-01
Extended-reach drilling (ERD) is being used to exploit fields and reserves that are located far from existing platforms. Effective wellbore placement from fewer platforms can reduce development costs, maximize production and increase reserve recovery. Six wells drilled offshore in the US, North Sea and Australia illustrate how to get the most economic benefit from available infrastructure. These wells are divided into three categories by depth (shallow, medium and deep). Vertical depths of these wells range from 963 to 12,791 ft TVD and displacements range from 4,871 to 23,917 ft. Important factors for successful extended-reach drilling included: careful, comprehensive pre-planning; adequate cuttings removal in all sections; hole stability in long, exposed intervals; torque and drag modeling of drilling BHAs, casing and liners; buoyancy-assisted casing techniques where appropriate; critical modifications to drilling rig and top drive for medium and deep ERD; modified power swivels for shallow operations; drill pipe rubbers or other casing protection during extended periods of drill string rotation; heavy-wall casing across anticipated high-wear areas; survey accuracy and frequency; and sound drilling practices and creativity to accomplish goals and objectives. This paper reviews the case histories of these wells and records planning and design procedures.
Napa River Restoration Project: Rutherford Reach Completion and Oakville to Oak Knoll Reach
Information about the SFBWQP Napa River Restoration Project: Rutherford Reach Completion/Oakville to Oak Knoll, part of an EPA competitive grant program to improve SF Bay water quality focused on restoring impaired waters and enhancing aquatic resources.
KRM Silveira
2006-12-01
Full Text Available OBJECTIVE: To assess performance in the functional reach test (FR) and lateral reach test (LR) among a sample of healthy individuals aged 20 to 87 years and to verify the influence of gender, age, height, body weight, arm length and foot length. METHOD: A cross-sectional observational study was conducted on 98 people of both genders living in the city of São Paulo and other places in the State of São Paulo. The volunteers had their descriptive measurements recorded and then underwent FR and LR. RESULTS: All the variables had an influence on FR, except arm length (p=0.057), body weight (p=0.746) and the support base used at the time of assessment (p=0.384). The variables exerting greatest influence were the individual's gender (p=0.001), age (p<0.001) and height (p=0.004). This analysis showed that women had less anterior and lateral functional reach than men. There was a substantial positive correlation (r=0.696) between the left and right LR findings. FR had a moderate positive correlation of 0.405 with the left LR and a substantial positive correlation of 0.614 with the right LR. For LR, the height, weight, foot length and arm length
Reach and get capability in a computing environment
Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM
2012-06-05
A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object is copied into the reach location.
Speeded reaching movements around invisible obstacles.
Todd E Hudson
Full Text Available We analyze the problem of obstacle avoidance from a Bayesian decision-theoretic perspective using an experimental task in which reaches around a virtual obstacle were made toward targets on an upright monitor. Subjects received monetary rewards for touching the target and incurred losses for accidentally touching the intervening obstacle. The locations of target-obstacle pairs within the workspace were varied from trial to trial. We compared human performance to that of a Bayesian ideal movement planner (who chooses motor strategies maximizing expected gain) using the Dominance Test employed in Hudson et al. (2007). The ideal movement planner suffers from the same sources of noise as the human, but selects movement plans that maximize expected gain in the presence of that noise. We find good agreement between the predictions of the model and actual performance in most but not all experimental conditions.
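The expected-gain computation behind such an ideal movement planner can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the one-dimensional layout, the reward and penalty values, and the noise level are all assumptions for the example. The planner scores each candidate aim point by the probability that a noisy endpoint lands in the target versus the obstacle, then picks the maximizer.

```python
import math

def norm_cdf(x, mu, sd):
    """Gaussian CDF, used to integrate endpoint scatter over an interval."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def p_land(interval, aim, sd):
    """Probability that an endpoint aimed at `aim` lands inside `interval`."""
    lo, hi = interval
    return norm_cdf(hi, aim, sd) - norm_cdf(lo, aim, sd)

def expected_gain(aim, sd, target, obstacle, reward=2.5, penalty=-12.5):
    """Expected monetary gain of aiming at `aim` (1-D simplification)."""
    return reward * p_land(target, aim, sd) + penalty * p_land(obstacle, aim, sd)

def ideal_aim(sd, target, obstacle):
    """Grid search for the aim point that maximizes expected gain."""
    grid = [i / 100.0 for i in range(-800, 801)]  # candidate aims, in cm
    return max(grid, key=lambda a: expected_gain(a, sd, target, obstacle))
```

With a hypothetical target at (0, 4) cm, an obstacle at (-5, -1) cm and 2 cm of endpoint noise, the maximizer shifts the aim away from the obstacle, past the target's centre — the qualitative signature the study tests human reaches against.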
Priority setting in the REACH system.
Hansson, Sven Ove; Rudén, Christina
2006-04-01
Due to the large number of chemicals for which toxicological and ecotoxicological information is lacking, priority setting for data acquisition is a major concern in chemicals regulation. In the current European system, two administrative priority-setting criteria are used, namely novelty (i.e., time of market introduction) and production volume. In the proposed Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) system, the novelty criterion is no longer used, and production volume will be the main priority-setting criterion for testing requirements, supplemented in some cases with hazard indications obtained from QSAR modelling. This system for priority setting has severe weaknesses. In this paper we propose that a multicriteria system should be developed that includes at least three additional criteria: chemical properties, results from initial testing in a tiered system, and voluntary testing for which efficient incentives can be created. Toxicological and decision-theoretical research is needed to design testing systems with validated priority-setting mechanisms.
Reaching Consensus by Allowing Moments of Indecision
Svenkeson, A.; Swami, A.
2015-10-01
Group decision-making processes often turn into a drawn out and costly battle between two opposing subgroups. Using analytical arguments based on a master equation description of the opinion dynamics occurring in a three-state model of cooperatively interacting units, we show how the capability of a social group to reach consensus can be enhanced when there is an intermediate state for indecisive individuals to pass through. The time spent in the intermediate state must be relatively short compared to that of the two polar states in order to create the beneficial effect. Furthermore, the cooperation between individuals must not be too low, as the benefit to consensus is possible only when the cooperation level exceeds a specific threshold. We also discuss how zealots, agents that remain in one state forever, can affect the consensus among the rest of the population by counteracting the benefit of the intermediate state or making it virtually impossible for an opposition to form.
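A toy agent-based version of such a three-state process can be simulated directly. The sketch below is a hypothetical "constrained voter" variant, not the master-equation model of the paper: a randomly chosen listener moves one step along -1 → 0 → +1 toward the state of a randomly chosen speaker, so opposing agents must pass through the indecisive middle state before switching sides.

```python
import random

def step(states, rng):
    """One interaction: a listener moves one step toward a speaker's state."""
    i, j = rng.sample(range(len(states)), 2)  # listener i, speaker j
    if states[j] > states[i]:
        states[i] += 1
    elif states[j] < states[i]:
        states[i] -= 1

def run_to_consensus(states, max_steps=100000, seed=0):
    """Iterate until all agents share one state; returns (state, steps)."""
    rng = random.Random(seed)
    for t in range(max_steps):
        if len(set(states)) == 1:
            return states[0], t
        step(states, rng)
    return None, max_steps
```

Starting from a near-unanimous group with one undecided agent, the holdout is absorbed quickly; from an even split between the two polar camps, the time to consensus is dominated by traffic through the intermediate state.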
Fluvial geomorphology of the Middle Reach of the Huai River
Bang-yi YU; Peng WU; Jue-yi SUI; Xing-ju YANG; Jin NI
2014-01-01
The Middle Reach of the Huai River (MRHR) flows northeast into the Hongzehu Lake. Before entering the Hongzehu Lake, the Huai River has a braided channel which is shallow and wide, and the riverbed has a negative slope. Based on the characteristics of the MRHR, this river reach can be divided into the following sections: a quasi-straight (or mildly curved) section, a bend section, and a braided section. The majority of the MRHR is quasi-straight. In this paper, several parameters are used to assess the geomorphology of the MRHR. Statistical analyses are performed to establish a relationship between the span length “L” and channel width “B” for different channel patterns. The relationship between the meandering length “S” and bankfull channel width “B” is also derived. Results indicate that the bankfull channel width “B”, the bankfull cross-sectional area “A” and the average flow depth “H” are mainly dependent on the dominant discharge in the channel. A relationship is derived that describes the dependency of the curvature radius “R” on the dominant discharge “Q”, the water surface slope “J” and the turning angle “α”.
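Hydraulic-geometry relationships of this kind are conventionally fitted as power laws, B = aQ^k, by least squares in log-log space. The sketch below shows that machinery on synthetic data; the coefficient values are invented for illustration and are not the ones derived in the paper.

```python
import math

def fit_power_law(q, b):
    """Least-squares fit of log(b) = log(a) + k*log(q); returns (a, k)."""
    xs = [math.log(v) for v in q]
    ys = [math.log(v) for v in b]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - k * mx)
    return a, k

# Synthetic bankfull widths following B = 3.2 * Q^0.5 exactly.
discharges = [10.0, 50.0, 100.0, 500.0, 1000.0]
widths = [3.2 * q ** 0.5 for q in discharges]
a, k = fit_power_law(discharges, widths)
```

Because the synthetic data follow the power law exactly, the fit recovers a = 3.2 and k = 0.5 up to floating-point error; with field measurements the residual scatter would indicate how well a single dominant-discharge relation describes the reach.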
Morphodynamics of a pseudomeandering gravel bar reach
Bartholdy, J.; Billi, P.
2002-01-01
A large number of rivers in Tuscany have channel planforms which are neither straight nor what is usually understood as meandering. In the typical case, they consist of an almost straight, slightly incised main channel fringed with large lateral bars and lunate-shaped embayments eroded into the former flood plain. In the past, these rivers have not been recognised as an individual category and have often been considered to be either braided or meandering. It is suggested here that this type of river planform be termed pseudomeandering. A typical pseudomeandering river (the Cecina River) is described and analysed to investigate the main factors responsible for producing this channel pattern. A study reach (100×300 m) was surveyed in detail and related to data on discharge, channel changes after floods and grain-size distribution of bed sediments. During 18 months of topographic monitoring, the inner lateral bar in the study reach expanded and migrated towards the concave outer bank which, concurrently, retreated by as much as 25 m. A sediment balance was constructed to analyse bar growth and bank retreat in relation to sediment supply and channel morphology. The conditions necessary to maintain the pseudomeandering morphology of these rivers by preventing them from developing a meandering planform are discussed and interpreted as a combination of a few main factors such as the flashy character of floods, sediment supply (influenced by both natural processes and human impact), the morphological effects of discharges with contrasting return intervals and the short duration of flood events. Finally, the channel response to floods with variable sediment transport capacity (represented by bed shear stress) is analysed using a simple model. It is demonstrated that bend migration is associated with moderate floods while major floods are responsible for the development of chute channels, which act to suppress bend growth and maintain the low sinuosity configuration of
Key Design Requirements for Long-Reach Manipulators
Kwon, D.S.
2001-01-01
Long-reach manipulators differ from industrial robots and teleoperators typically used in the nuclear industry in that the aspect ratio (length to diameter) of links is much greater and link flexibility, as well as joint or drive train flexibility, is likely to be significant. Long-reach manipulators will be required for a variety of applications in the Environmental Restoration and Waste Management Program. While each application will present specific functional, kinematic, and performance requirements, an approach for determining the kinematic applicability and performance characteristics is presented, with a focus on waste storage tank remediation. Requirements are identified, kinematic configurations are considered, and a parametric study of link design parameters and their effects on performance characteristics is presented.
Hand preferences in preschool children: Reaching, pointing and symbolic gestures.
Cochet, Hélène; Centelles, Laurie; Jover, Marianne; Plachta, Suzy; Vauclair, Jacques
2015-01-01
Manual asymmetries emerge very early in development and several researchers have reported a significant right-hand bias in toddlers, although this bias fluctuates depending on the nature of the activity being performed. However, little is known about the further development of asymmetries in preschoolers. In this study, patterns of hand preference were assessed in 50 children aged 3-5 years for different activities, including reaching movements, pointing gestures and symbolic gestures. Contrary to what has been reported in children before 3 years of age, we did not observe any difference in the mean handedness indices obtained in each task. Moreover, the asymmetry of reaching was found to correlate with that of pointing gestures, but not with that of symbolic gestures. In relation to the results reported in infants and adults, this study may help decipher the mechanisms controlling the development of handedness by providing measures of manual asymmetries in an age range that has so far been rather neglected.
Statistics Anxiety among Postgraduate Students
Koh, Denise; Zawi, Mohd Khairi
2014-01-01
Most postgraduate programmes that have research components require students to take at least one course in research statistics. Not all postgraduate programmes are science based; a significant number of postgraduate students from the social sciences will be taking statistics courses as they try to complete their…
Statistical Model for Content Extraction
2011-01-01
We present a statistical model for content extraction from HTML documents. The model operates on the Document Object Model (DOM) tree of the corresponding HTML document. It evaluates each tree node and its associated statistical features to predict the significance of the node towards the overall content...
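A minimal version of such DOM-node scoring can be sketched as follows. The feature set and weighting here are assumptions for illustration (the abstract does not give the model's actual features): each DOM-like node is scored by its text volume discounted by link density, and the highest-scoring subtree is taken as the main content.

```python
class Node:
    """A tiny stand-in for a DOM node: tag, own text, children, link flag."""
    def __init__(self, tag, text="", children=None, is_link=False):
        self.tag = tag
        self.text = text
        self.children = children or []
        self.is_link = is_link

def text_len(node):
    return len(node.text) + sum(text_len(c) for c in node.children)

def link_text_len(node):
    if node.is_link:
        return text_len(node)
    return sum(link_text_len(c) for c in node.children)

def score(node):
    """Content score: text volume, penalized quadratically by link density."""
    total = text_len(node)
    if total == 0:
        return 0.0
    link_density = link_text_len(node) / total
    return total * (1.0 - link_density) ** 2

def best_content_node(node):
    """Return the subtree with the highest content score."""
    candidates = [node]
    for c in node.children:
        candidates.append(best_content_node(c))
    return max(candidates, key=score)
```

On a toy page, a navigation block made entirely of links scores near zero, while a paragraph-heavy article block wins, which is the behaviour a statistical extractor is trained to reproduce with richer features.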
Predict! Teaching Statistics Using Informational Statistical Inference
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
Continental reach: The Westcoast Energy story
Newman, P. C.
2002-07-01
A historical account is given of the spectacular success that was Westcoast Energy Inc., a Canadian natural gas giant that charted a wilderness pipeline from natural gas fields in Canada's sub-arctic solitude. The beginning of the company is traced to an event in 1934 when near the bank of the Pouce Coupe River, close to the Alberta-British Columbia border, Frank McMahon, a solitary wildcatter and the eventual founder of the company, first sighted the fiery inferno of a runaway wildcat well, drilled by geologists of the Imperial Oil Company during their original search for the Canadian petroleum basin's motherlode. It was on this occasion in 1934 that McMahon first conceived a geological profile that connected the gas-bearing sandstone of Pouce Coupe with the reservoir rock of the biggest natural gas field of Alberta, and a pipeline from this sandstone storehouse across the rugged heart of British Columbia to Vancouver, and south into the United States. It took the better part of a quarter century to realize the dream of that pipeline which, in due course, turned out to be only the first step towards reaching the top rank of Canadian corporations in operational and financial terms, and becoming one of only a handful whose story became a Canadian corporate legend. By chronicling the lives and contributions of the company's founder and senior officials over the years, the book traces the company's meteoric rise from a gleam in its founder's eye to a cautious regional utility, and to the aggressive Canadian adventurer that went on to burst the boundaries of its Pacific Coast world, until the continental reach of its operations and interests ran from Canada's Pacific shoreline to its Atlantic basins and from Mexico's Campeche Bay to Alaska's Prudhoe Bay. The company's independent existence came to an end in 2002 when Westcoast Energy, by then a $15 billion operation, was acquired by Duke Energy Limited of North
New symmetry of intended curved reaches
Torres Elizabeth B
2010-04-01
Full Text Available Abstract Background Movement regularities are inherently present in automated goal-directed motions of the primate's arm system. They can provide important signatures of intentional behaviours driven by sensory-motor strategies, but it remains unknown if during motor learning new regularities can be uncovered despite high variability in the temporal dynamics of the hand motions. Methods We investigated the conservation and violation of a new movement regularity obtained from the hand motions traced by two untrained monkeys as they learned to reach outwardly towards spatial targets while avoiding obstacles in the dark. The regularity pertains to the transformation from postural to hand paths that aim at visual goals. Results In length-minimizing curves the area enclosed between the Euclidean straight line and the curve up to its point of maximum curvature is 1/2 of the total area. A similar trend is found if one examines the perimeter. This new movement regularity remained robust to striking changes in arm dynamics that gave rise to changes in the speed of the reach, to changes in the hand path curvature, and to changes in the arm's postural paths. The area and perimeter ratios characterizing the regularity co-varied across repeats of randomly presented targets whenever the transformation from posture to hand paths was compliant with the intended goals. To interpret this conservation and the cases in which the regularity was violated and recovered, we provide a geometric model that characterizes arm-to-hand and hand-to-arm motion paths as length-minimizing curves (geodesics) in a non-Euclidean space. Whenever the transformation from one space to the other is distance-metric preserving (isometric), the two symmetric ratios co-vary. Otherwise, the symmetric ratios and their co-variation are violated. As predicted by the model we found empirical evidence for the violation of this movement regularity whenever the intended goals mismatched the actions. This
CURRENT STATUS OF NONPARAMETRIC STATISTICS
Orlov A. I.
2015-02-01
Full Text Available Nonparametric statistics is one of the five points of growth of applied mathematical statistics. Despite the large number of publications on specific issues of nonparametric statistics, the internal structure of this research direction has remained undeveloped. The purpose of this article is to consider its division into regions based on the existing practice of scientific activity in nonparametric statistics and to classify investigations on nonparametric statistical methods. Nonparametric statistics allows one to make statistical inferences, in particular to estimate the characteristics of a distribution and to test statistical hypotheses, without the usually weakly substantiated assumption that the distribution functions of the samples belong to a particular parametric family. For example, there is a widespread belief that statistical data often have a normal distribution. Meanwhile, analysis of observation results, in particular of measurement errors, always leads to the same conclusion: in most cases the actual distribution differs significantly from normal. Uncritical use of the hypothesis of normality often leads to significant errors, for example in the rejection of outlying observations (outliers), in statistical quality control, and in other cases. Therefore, it is advisable to use nonparametric methods, in which only weak requirements are imposed on the distribution functions of the observation results. It is usually assumed only that they are continuous. On the basis of a generalization of numerous studies it can be stated that, to date, nonparametric methods can solve almost the same range of tasks as parametric methods previously did. Certain statements in the literature, that nonparametric methods have less power or require larger sample sizes than parametric methods, are incorrect. Note that in nonparametric statistics, as in mathematical statistics in general, there remain a number of unresolved problems
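The sign test is one of the simplest nonparametric procedures of the kind described above: it tests whether a median equals a hypothesized value using only the signs of the deviations, so nothing beyond continuity of the distribution is assumed. The sketch below is illustrative and not tied to this article.

```python
import math

def sign_test(data, median0=0.0):
    """Exact two-sided sign test for H0: median == median0.

    Only the signs of the deviations are used, so no normality is assumed.
    Returns the p-value from the Binomial(n, 1/2) distribution.
    """
    pos = sum(1 for x in data if x > median0)
    neg = sum(1 for x in data if x < median0)
    n = pos + neg                       # ties with median0 are discarded
    k = max(pos, neg)
    tail = sum(math.comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2.0 * tail)
```

For instance, 8 positive deviations out of 10 give p = 2(45 + 10 + 1)/1024 ≈ 0.11, so the apparent asymmetry is not significant at the 5% level, regardless of the shape of the underlying distribution.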
Nursing student attitudes toward statistics.
Mathew, Lizy; Aktan, Nadine M
2014-04-01
Nursing is guided by evidence-based practice. To understand and apply research to practice, nurses must be knowledgeable in statistics; therefore, it is crucial to promote a positive attitude toward statistics among nursing students. The purpose of this quantitative cross-sectional study was to assess differences in attitudes toward statistics among undergraduate nursing, graduate nursing, and undergraduate non-nursing students. The Survey of Attitudes Toward Statistics Scale-36 (SATS-36) was used to measure student attitudes, with higher scores denoting more positive attitudes. The convenience sample was composed of 175 students from a public university in the northeastern United States. Statistically significant relationships were found among some of the key demographic variables. Graduate nursing students had a significantly lower score on the SATS-36, compared with baccalaureate nursing and non-nursing students. Therefore, an innovative nursing curriculum that incorporates knowledge of student attitudes and key demographic variables may result in favorable outcomes.
Important ATLAS Forward Calorimeter Milestone Reached
Loch, P.
The ATLAS Forward Calorimeter working group has reached an important milestone in the production of their detectors. The mechanical assembly of the first electromagnetic module (FCal1C) has been completed at the University of Arizona on February 25, 2002, only ten days after the originally scheduled date. The photo shows the University of Arizona FCal group in the clean room, together with the assembled FCal1C module. The module consists of a stack of 18 round copper plates, each about one inch thick. Each plate is about 90 cm in diameter, and has 12260 precision-drilled holes in it, to accommodate the tube/rod electrode assembly. The machining of the plates, which was done at the Science Technology Center (STC) at Carleton University, Ottawa, Canada, required high precision to allow for easy insertion of the electrode copper tube. The plates have been carefully cleaned at the University of Arizona, to remove any machining residue and metal flakes. This process alone took about eleven weeks. Exactly 122...
LEP Dismantling Reaches Half-Way Stage
2001-01-01
LEP's last superconducting module leaves its home port... Just seven months into the operation, LEP dismantling is forging ahead. Two of the eight arcs which form the tunnel have already been emptied and the last of the accelerator's radiofrequency (RF) cavities has just been raised to the surface. The 160 people working on LEP dismantling have reason to feel pleased with their progress. All of the accelerator's 72 superconducting RF modules have already been brought to the surface, with the last one being extracted on 2nd May. This represents an important step in the dismantling process, as head of the project, John Poole, explains. 'This was the most delicate part of the project, because the modules are very big and they could only come out at one place', he says. The shaft at point 1.8 through which the RF cavity modules pass is 18 metres in diameter, while each module is 11.5 metres long. Some modules had to travel more than 10 kilometres to reach the shaft.
CAST reaches milestone but keeps on searching
CERN Courier (september 2011 issue)
2011-01-01
After eight years of searching for the emission of a dark matter candidate particle, the axion, from the Sun, the CERN Axion Solar Telescope (CAST) has fulfilled its original physics programme. CAST, the world’s most sensitive axion helioscope, points a recycled prototype LHC dipole magnet at the Sun at dawn and dusk, looking for the conversion of axions to X-rays. It incorporates four state-of-the-art X-ray detectors: three Micromegas detectors and a pn-CCD imaging camera attached to a focusing X-ray telescope that was recovered from the German space programme (see CERN Courier April 2010). Over the years, CAST has operated with the magnet bores - the location of the axion conversion - in different conditions: first in vacuum, covering axion masses up to 20 meV/c2, and then with a buffer gas (4He and later 3He) at various densities, finally reaching the goal of 1.17 eV/c2 on 22 ...
Media perspective - new opportunities for reaching audiences
Haswell, Katy
2007-08-01
The world of media is experiencing a period of extreme and rapid change with the rise of internet television and the download generation. Many young people no longer watch standard TV. Instead, they go on-line, talking to friends and downloading pictures, videos, music clips to put on their own websites and watch/listen to on their laptops and mobile phones. Gone are the days when TV controllers determined what you watched and when you watched it. Now the buzzword is IPTV, Internet Protocol Television, with companies such as JOOST offering hundreds of channels on a wide range of subjects, all of which you can choose to watch when and where you wish, on your high-def widescreen with stereo surround sound at home or on your mobile phone on the train. This media revolution is changing the way organisations get their message out. And it is encouraging companies such as advertising agencies to be creative about new ways of accessing audiences. The good news is that we have fresh opportunities to reach young people through internet-based media and material downloaded through tools such as games machines, as well as through the traditional media. And it is important for Europlanet to make the most of these new and exciting developments.
Limit analysis of extended reach drilling in South China Sea
Gao Deli; Tan Chengjin; Tang Haixiong
2009-01-01
Extended reach wells (ERWs), especially horizontal extended reach wells with a high HD (horizontal displacement) to TVD (true vertical depth) ratio, represent a frontier technology and challenge drilling limitations. Oil and gas reservoirs in beach, lake and offshore settings can be effectively exploited by using extended reach drilling (ERD) technology. This paper focuses on the difficult technological problems encountered during exploitation of the Liuhua 11-1 oil field in the South China Sea, China. Emphasis is on investigating the key subjects, including prediction and control of open-hole limit extension in offshore ERD, prediction of casing wear and its prevention, torque reduction, running φ244.5 mm casing with floating collars to control drag force, and steerable drilling modes. The basic concept of limit extension in ERD is presented and a prediction method for open-hole limit extension is given in this paper. A set of advanced drilling mechanics and control technologies has been established, and its practical results are verified by field cases. All those efforts may be significant for further investigating and practicing ERD limit theory and control technology in the future.
The leading joint hypothesis for spatial reaching arm motions.
Ambike, Satyajit; Schmiedeler, James P
2013-02-01
The leading joint hypothesis (LJH), developed for planar arm reaching, proposes that the interaction torques experienced by the proximal joint are low compared to the corresponding muscle torques. The human central nervous system could potentially ignore these interaction torques at the proximal (leading) joint with little effect on the wrist trajectory, simplifying joint-level control. This paper investigates the extension of the LJH to spatial reaching. In spatial motion, a number of terms in the governing equation (Euler's angular momentum balance) that vanish for planar movements are non-trivial, so their contributions to the joint torque must be classified as net, interaction or muscle torque. This paper applies definitions from the literature to these torque components to establish a general classification for all terms in Euler's equation. This classification is equally applicable to planar and spatial motion. Additionally, a rationale for excluding gravity torques from the torque analysis is provided. Subjects performed point-to-point reaching movements between targets whose locations ensured that the wrist paths lay in various portions of the arm's spatial workspace. Movement kinematics were recorded using electromagnetic sensors located on the subject's arm segments and thorax. The arm was modeled as a three-link kinematic chain with idealized spherical and revolute joints at the shoulder and elbow. Joint torque components were computed using inverse dynamics. Most movements were 'shoulder-led' in that the interaction torque impulse was significantly lower than the muscle torque impulse for the shoulder, but not the elbow. For the few elbow-led movements, the interaction impulse at the elbow was low, while that at the shoulder was high, and these typically involved large elbow and small shoulder displacements. These results support the LJH and extend it to spatial reaching motion.
Access to expert stroke care with telemedicine: REACH MUSC
Abby Swanson Kazley
2012-03-01
Full Text Available Stroke is a leading cause of death and disability, and rtPA can significantly reduce the long-term impact of acute ischemic stroke (AIS) if given within 3 hours of symptom onset. South Carolina is located in the stroke belt and has a high rate of stroke and stroke mortality. Many small rural SC hospitals do not maintain the expertise needed to treat AIS patients with rtPA. MUSC is an academic medical center using REACH MUSC telemedicine to deliver stroke care to 15 hospitals in the state, increasing the likelihood of timely treatment with rtPA. The purpose of this study is to determine the increase in access to rtPA through the use of telemedicine for AIS in the general population and in specific segments of the population based on age, gender, race, ethnicity, education, urban/rural residence, poverty, and stroke mortality. We used a retrospective cross-sectional design examining Census data from 2000 and Geographic Information Systems (GIS) analysis to identify South Carolina residents that live within 30 or 60 minutes of a Primary Stroke Center (PSC) or a REACH MUSC site. We include all South Carolina citizens in our analysis and specifically examine the population’s age, gender, race, ethnicity, education, urban/rural residence, poverty, and stroke mortality. Our sample includes 4,012,012 South Carolinians. The main measure is access to expert stroke care at a Primary Stroke Center (PSC) or a REACH MUSC hospital within 30 or 60 minutes. We find that without REACH MUSC, only 38% of the population has potential access to expert stroke care in SC within sixty minutes, given that most PSCs will maintain expert stroke coverage. REACH MUSC allows 76% of the population to be within sixty minutes of expert stroke care, and 43% of the population to be within a 30-minute drive time of expert stroke care. These increases in access are especially significant for groups that have faced disparities in care and high rates of AIS. The use of telemedicine can
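The access measure used in such studies — the share of the population within a travel-time threshold of the nearest facility — can be sketched with straight-line (haversine) distance as a crude stand-in for drive time. The coordinates, average speed and facility list below are invented for illustration; a real GIS analysis would use road-network travel times.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def covered_fraction(blocks, facilities, minutes, speed_kmh=60.0):
    """Population share within `minutes` of the nearest facility.

    `blocks` is a list of (lat, lon, population); straight-line distance at
    an assumed average speed approximates drive time.
    """
    reach_km = speed_kmh * minutes / 60.0
    total = sum(pop for _, _, pop in blocks)
    covered = sum(
        pop for lat, lon, pop in blocks
        if any(haversine_km(lat, lon, fl, fo) <= reach_km for fl, fo in facilities)
    )
    return covered / total
```

Recomputing the fraction after appending a telemedicine site to the facility list mirrors the paper's PSC-only versus REACH MUSC comparison: each added site can only raise the covered share.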
Planning of the Extended Reach well Dieksand 2; Planung der Extended Reach Bohrung Dieksand 2
Frank, U.; Berners, H. [RWE-DEA AG, Hamburg (Germany). Drilling Team Mittelplate und Dieksand; Hadow, A.; Klop, G.; Sickinger, W. [Wintershall AG Erdoelwerke, Barnstdorf (Germany); Sudron, K.
1998-12-31
The Mittelplate oil field is located 7 km offshore from the town of Friedrichskoog. Reserves are estimated at 30 million tonnes of oil. At a production rate of 2,500 t/d, it will last about 33 years. The transport capacity of the offshore platform is limited, so that attempts were made to enhance production by constructing the extended reach borehole Dieksand 2. Details are presented. (orig.) [German original, translated:] The Mittelplate oil field lies at the southern edge of the Schleswig-Holstein Wadden Sea National Park, about 7,000 m west of the town of Friedrichskoog. Recoverable reserves amount to about 30 million tonnes of oil. At a production capacity of 2,500 t/day, the production period is about 33 years. Owing to the limited transport capacities from the island, additional wells drilled from the artificial island of Mittelplate cannot decisively increase production capacity. Starting in the summer of 1996, the possibility of developing the reservoir from land was investigated for the first time. A drilling team established in Hamburg in May 1997 was tasked with planning and drilling the extended reach well Dieksand 2. The planning phases for the extended reach well Dieksand 2 are presented. The planning parameters important to the success of an extended reach well are explained. Ways are shown in which technical and geological risks can be taken into account in the planning of this project and further managed after drilling begins. (orig.)
Columbia River Estuary Ecosystem Classification Hydrogeomorphic Reach
Cannon, Charles M.; Ramirez, Mary F.; Heatwole, Danelle W.; Burke, Jennifer L.; Simenstad, Charles A.; O'Connor, Jim E.; Marcoe, Keith
2012-01-01
Estuarine ecosystems are controlled by a variety of processes that operate at multiple spatial and temporal scales. Understanding the hierarchical nature of these processes will aid in prioritization of restoration efforts. This hierarchical Columbia River Estuary Ecosystem Classification (henceforth "Classification") of the Columbia River estuary is a spatial database of the tidally-influenced reaches of the lower Columbia River, the tidally affected parts of its tributaries, and the landforms that make up their floodplains for the 230 kilometers between the Pacific Ocean and Bonneville Dam. This work is a collaborative effort between University of Washington School of Aquatic and Fishery Sciences (henceforth "UW"), U.S. Geological Survey (henceforth "USGS"), and the Lower Columbia Estuary Partnership (henceforth "EP"). Consideration of geomorphologic processes will improve the understanding of controlling physical factors that drive ecosystem evolution along the tidal Columbia River. The Classification is organized around six hierarchical levels, progressing from the coarsest, regional scale to the finest, localized scale: (1) Ecosystem Province; (2) Ecoregion; (3) Hydrogeomorphic Reach; (4) Ecosystem Complex; (5) Geomorphic Catena; and (6) Primary Cover Class. For Levels 4 and 5, we mapped landforms within the Holocene floodplain primarily by visual interpretation of Light Detection and Ranging (LiDAR) topography supplemented with aerial photographs, Natural Resources Conservation Service (NRCS) soils data, and historical maps. Mapped landforms are classified as to their current geomorphic function, the inferred process regime that formed them, and anthropogenic modification. Channels were classified primarily by a set of depth-based rules and geometric relationships. Classification Level 5 floodplain landforms ("geomorphic catenae") were further classified based on multivariate analysis of land-cover within the mapped landform area and attributed as "sub
Reaching the top: career anchors and professional development in nursing.
Kaplan, Ruth; Shmulevitz, Carmela; Raviv, Dennie
2009-01-01
This study, based on Schein's conceptual theory of career anchors, examined the relationship between career anchors, professional development, and emerging career patterns for graduates of 12 consecutive two-year second-career programs in nursing (N=231) compared to graduates of concurrent four-year academic programs (N=273). A two-group comparison design was used, and the data collection tools included a demographic profile, a professional profile, and a career anchor questionnaire. Statistically significant differences were found between the groups with regard to career anchors and professional development. The dominant anchors among second-career nurses were specialization and lifestyle, whereas academic graduates chose management, autonomy, and service. Academics displayed a statistically significant preference for administrative specialization (34%) compared to the second-career track (6.5%). The researchers propose that each group develops differently and contributes to the workplace, and stress the importance of both certification and academic incentives to ensure recruitment.
Tütün Yümin, Eylem; Şimşek, Tülay Tarsuslu; Sertel, Meral; Ankaralı, Handan; Yumin, Murat
2017-02-01
The aim of this study was to investigate the effect of manual foot plantar massage (classic and friction massage) on functional mobility level, balance, and functional reach in patients with type II diabetes mellitus (T2DM). A total of 38 subjects diagnosed with T2DM were included in the study; a healthy control group could not be formed. After the subjects' socio-demographic data were obtained, the Timed Up & Go (TUG) test, functional reach test (FRT), one-leg standing test with eyes open and closed, and a Visual Analogue Scale (VAS) for foot pain intensity were administered. The results were also assessed in three groups according to the ages of the individuals (40-54, 55-64, and 65 and over). Statistical analysis showed differences in the values obtained from the TUG, FRT, and one-leg standing tests with eyes open and closed. After the massage, TUG values significantly decreased compared with those before the massage, whereas the values of the FRT and the one-leg standing test with eyes open and closed significantly increased compared with those before the massage. There were also statistical differences according to age group. The results of our study indicated that application of plantar massage to patients with T2DM caused an improvement in balance, functional mobility, and functional reach values. An increase in body balance and functional mobility may explain the improvement in TUG. Foot massage added to the rehabilitation exercise programs of DM patients will be important in improving their balance and mobility.
Phenomena and characteristics of barrier river reaches in the middle and lower Yangtze River, China
You, Xingying; Tang, Jinwu
2017-06-01
Alluvial river self-adjustment describes the mechanism whereby a river that was originally in an equilibrium state of sediment transport encounters some disturbance that destroys the balance and results in responses such as riverbed deformation. A systematic study of historical and recent aerial photographs and topographic maps in the Middle and Lower Reaches of the Yangtze River (MLYR) shows that river self-adjustment has the distinguishing feature of transferring from upstream to downstream, which may affect flood safety, waterway morphology, bank stability, and aquatic environmental safety over relatively long reaches downstream. As a result, it is necessary to take measures to control or block this transfer. Using the relationship between the occurrence times of channel adjustments upstream and downstream, 34 single-thread river reaches in the MLYR were classified into four types: corresponding, basically corresponding, basically not corresponding, and not corresponding. The latter two types, because of their ability to prevent upstream channel adjustment from transferring downstream, are called barrier river reaches in this study. Statistics indicate that barrier river reaches are generally single-thread and slightly curved, with a narrow and deep cross-sectional morphology, and without flow-deflecting nodes in the upper and middle parts of the reach. Moreover, in the MLYR, barrier river reaches have a hydrogeometric coefficient of <1.2‱, a silty clay content of the concave bank >9.5%, and a median diameter of the bed sediment >0.158 mm. The mechanism of barrier river reaches lies in their ability to effectively centralise the planimetric position of the main stream arriving from different upstream directions, meaning that no matter how the upper channel adjusts, the main stream shows little change, providing relatively stable inflow conditions for the lower reaches. Regarding river regulation, it is necessary to optimise the benefits of barrier river reaches; long river
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
Adrenal Gland Tumors: Statistics
Algebraic statistics computational commutative algebra in statistics
Pistone, Giovanni; Wynn, Henry P
2000-01-01
Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model. As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.
Spiking and LFP activity in PRR during symbolically instructed reaches
2011-01-01
The spiking activity in the parietal reach region (PRR) represents the spatial goal of an impending reach when the reach is directed toward or away from a visual object. The local field potentials (LFPs) in this region also represent the reach goal when the reach is directed to a visual object. Thus PRR is a candidate area for reading out a patient's intended reach goals for neural prosthetic applications. For natural behaviors, reach goals are not always based on the location of a visual obj...
Keywords: statistical analysis, probability, reports, information theory, differential equations, statistical processes, stochastic processes, multivariate analysis, distribution theory, decision theory, measure theory, optimization
Reaching remote areas in Latin America.
Jaimes, R
1994-01-01
Poor communities in remote and inaccessible areas tend to not only be cut off from family planning education and services, but they are also deprived of basic primary health care services. Efforts to bring family planning to such communities and populations should therefore be linked with other services. The author presents three examples of programs to bring effective family planning services to remote communities in Central and South America. Outside of the municipal center in the Tuxtlas region of Mexico, education and health levels are low and people live according to ancient customs. Ten years ago with the help of MEXFAM, the IPPF affiliate in Mexico, two social promoters established themselves in the town of Catemaco to develop a community program of family planning and health care offering education and prevention to improve the quality of people's lives. Through their health brigades taking health services to towns without an established health center, the program has influenced an estimated 100,000 people in 50 villages and towns. The program also has a clinic. In Guatemala, the Family Welfare Association (APROFAM) gave bicycles to 240 volunteer health care workers to facilitate their outreach work in rural areas. APROFAM since 1988 has operated an integrated program to treat intestinal parasites and promote family planning in San Lucas de Toliman, an Indian town close to Lake Atitlan. Providing health care to more than 10,000 people, the volunteer staff has covered the entire department of Solola, reaching each family in the area. Field educators travel on motorcycles through the rural areas of Guatemala coordinating with the health volunteers the distribution of contraceptives at the community level. The Integrated Project's Clinic was founded in 1992 and currently carries out pregnancy and Pap tests, as well as general lab tests. Finally, Puna is an island in the middle of the Gulf of Guayaquil, Ecuador. Women on the island typically have 10
Statistical learning and data science
Summa, Mireille Gettler; Goldfarb, Bernard; Murtagh, Fionn; Pardoux, Catherine; Touati, Myriam
2011-01-01
Data analysis is changing fast. Driven by a vast range of application domains and affordable tools, machine learning has become mainstream. Unsupervised data analysis methods, including cluster analysis, factor analysis, and continually updated low-dimensionality mapping methods, have reached new heights of achievement in the incredibly rich data world that we inhabit. Statistical Learning and Data Science is a work of reference in the rapidly evolving context of converging methodologies. It gathers contributions from some of the foundational thinkers in the different fields of data analysis to t
Statistical Study of Visual Binaries
Abdel-Rahman, H I; Elsanhoury, W H
2016-01-01
In this paper, some statistical distributions of wide pairs included in the Double Star Catalogue are investigated. Frequency distributions and hypothesis tests are derived for some basic parameters of visual binaries. The results indicate that the magnitude difference is distributed exponentially, which means that the majority of the components of the selected systems are of the same spectral type. The distribution of the mass ratios is concentrated around 0.7, which agrees with the Salpeter mass function. The distribution of the linear separation also appears to be exponential, which contradicts previous studies of close binaries.
陈艳妮
2014-01-01
Autism spectrum disorder has gradually become widely known, and its meaning is also changing. Comprehending these changes and their significance in a timely and accurate manner is very important for clinical work. This paper reviews the origin and evolution of the term, as well as its changes and significance in the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5).
Acute effects of whole-body cryotherapy on sit-and-reach amplitude in women and men.
De Nardi, Massimo; La Torre, Antonio; Benis, Roberto; Sarabon, Nejc; Fonda, Borut
2015-12-01
Flexibility is an intrinsic property of body tissues which, among other factors, determines the range of motion (ROM). A decrease in neural activation of the muscle has been linked with greater ROM. Cryotherapy is an effective technique to reduce neural activation. Hence, the aim of the present study was to evaluate whether a single session of whole-body cryotherapy (WBC) affects ROM. 60 women and 60 men were divided into two groups (control and experimental). After the initial sit-and-reach test, the experimental group performed a 150 s session of WBC, whereas the control group stayed in a thermo-neutral environment. Immediately afterwards, both groups performed another sit-and-reach test. A three-way analysis of variance revealed statistically significant time × group and time × gender interactions. The experimental groups improved sit-and-reach amplitude to a greater extent than the control group. Our results support the hypothesis that ROM is increased immediately after a single session of WBC.
Statistical Computing in Information Society
Domański Czesław
2015-12-01
In the presence of massive data coming with high heterogeneity we need to change our statistical thinking and statistical education in order to adapt both classical statistics and software developments that address new challenges. Significant developments include open data, big data and data visualisation, and they are changing the nature of the evidence that is available, the ways in which it is presented and the skills needed for its interpretation. The amount of information is not the most important issue – the real challenge is the combination of the amount and the complexity of data. Moreover, a need arises to know how uncertain situations should be dealt with and what decisions should be taken when information is insufficient (which can also be observed for large datasets). In the paper we discuss the idea of computational statistics as a new approach to statistical teaching and we try to answer the question: how can we best prepare the next generation of statisticians?
Dynamic channel adjustments in the Jingjiang Reach of the Middle Yangtze River.
Xia, Junqiang; Deng, Shanshan; Lu, Jinyou; Xu, Quanxi; Zong, Quanli; Tan, Guangming
2016-03-11
Significant channel adjustments have occurred in the Jingjiang Reach of the Middle Yangtze River, because of the operation of the Three Gorges Project (TGP). The Jingjiang Reach is selected as the study area, covering the Upper Jingjiang Reach (UJR) and Lower Jingjiang Reach (LJR). The reach-scale bankfull channel dimensions in the study reach were calculated annually from 2002 to 2013 by means of a reach-averaged approach and surveyed post-flood profiles at 171 sections. We find from the calculated results that: the reach-scale bankfull widths changed slightly in the UJR and LJR, with the corresponding depths increasing by 1.6 m and 1.0 m; the channel adjustments occurred mainly with respect to bankfull depth because of the construction of large-scale bank revetment works, although there were significant bank erosion processes in local regions without the bank protection engineering. The reach-scale bankfull dimensions in the UJR and LJR generally responded to the previous five-year average fluvial erosion intensity during flood seasons, with higher correlations being obtained for the depth and cross-sectional area. It is concluded that these dynamic adjustments of the channel geometry are a direct result of recent human activities such as the TGP operation.
[Comment on] Statistical discrimination
Chinn, Douglas
In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
Purohit, Sudha G; Deshmukh, Shailaja R
2015-01-01
STATISTICS USING R will be useful at different levels, from an undergraduate course in statistics, through graduate courses in biological sciences, engineering, management and so on. The book introduces statistical terminology and defines it for the benefit of a novice. For a practicing statistician, it will serve as a guide to R language for statistical analysis. For a researcher, it is a dual guide, simultaneously explaining appropriate statistical methods for the problems at hand and indicating how these methods can be implemented using the R language. For a software developer, it is a guide in a variety of statistical methods for development of a suite of statistical procedures.
ERF1 -- Enhanced River Reach File 1.2
U.S. Geological Survey, Department of the Interior — U.S. Environmental Protection Agency's River Reach File 1 (RF1) to ensure the hydrologic integrity of the digital reach traces and to quantify the mean water time of...
Yu, Xiaozhi; Ren, Jindong; Zhang, Qian; Liu, Qun; Liu, Honghao
2017-04-01
Reach envelopes are very useful for the design and layout of controls. In building reach envelopes, one of the key problems is to represent the reach limits accurately and conveniently. Spherical harmonics have proved to be an accurate and convenient method for fitting reach capability envelopes. However, extensive study is required on which components of spherical harmonics are needed in fitting the envelope surfaces. For applications in the vehicle industry, an inevitable issue is to construct reach limit surfaces that take the seating positions of the drivers into consideration, and it is desirable to use population envelopes rather than individual envelopes. However, it is relatively inconvenient to acquire reach envelopes via a test considering the seating positions of the drivers. In addition, the acquired envelopes are usually unsuitable for use with other vehicle models because they are dependent on the current cab packaging parameters. Therefore, it is of great significance to construct reach envelopes for real vehicle conditions based on individual capability data considering seating positions. Moreover, traditional reach envelopes provide little information for assessing reach difficulty. The application of reach envelopes will improve design quality by providing difficulty-rating information about reach operations. In this paper, using laboratory data on seated reach with consideration of subjective difficulty ratings, a method of modeling reach envelopes is developed based on spherical harmonics. Surface fitting using spherical harmonics is conducted for circumstances both with and without seat adjustments. For use with an adjustable seat, a seating position model is introduced to re-locate the test data. Surface fitting is conducted for both population and individual reach envelopes, as well as for boundary envelopes. Comparison of the envelopes of the adjustable seat and the SAE J287 control reach envelope shows that the latter
Interaction torque contributes to planar reaching at slow speed
Hoshi Fumihiko
2008-10-01
Background: How the central nervous system (CNS) organizes the joint dynamics for multi-joint movement is a complex problem, because of the passive interaction among segmental movements. Previous studies have demonstrated that the CNS predictively compensates for interaction torque (INT), which arises from the movement of the adjacent joints. However, most of these studies have mainly examined quick movements, presumably because the current belief is that the effects of INT are not significant at slow speeds. The functional contribution of INT to multi-joint movements performed at various speeds is still unclear. The purpose of this study was to examine the contribution of INT to planar reaching over a wide range of motion speeds for healthy subjects. Methods: Subjects performed reaching movements toward five targets under three different speed conditions. Joint position data were recorded using a 3-D motion analysis device (50 Hz). Torque components, namely muscle torque (MUS), interaction torque (INT), gravity torque (G), and net torque (NET), were calculated by solving the dynamic equations for the shoulder and elbow. NET at a joint, which produces the joint kinematics, is an algebraic sum of the torque components: NET = MUS - G - INT. Dynamic muscle torque (DMUS = MUS - G) was also calculated. Contributions of the INT impulse and the DMUS impulse to the NET impulse were examined. Results: The relative contribution of INT to NET was not dependent on speed for either joint at any target. INT was additive (same direction) to DMUS at the shoulder joint, while at the elbow DMUS counteracted (opposed) INT. The trajectory of the reach was linear and the two-joint movements were coordinated with a specific combination at each target, regardless of motion speed. However, DMUS at the elbow was opposed to the direction of elbow movement, and its magnitude varied from trial to trial in order to compensate for the variability of INT. Conclusion: Interaction torque was important at
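The torque decomposition quoted in this abstract (NET = MUS - G - INT, with DMUS = MUS - G) can be sketched numerically. This is an illustrative sketch only, assuming the sign convention stated above; the torque values are invented and do not come from the study.

```python
# Sketch of the torque decomposition stated in the abstract:
# NET = MUS - G - INT, rearranged to MUS = NET + G + INT, and DMUS = MUS - G.
# All torque values are hypothetical (N*m); none come from the study.

def decompose(net, g, inter):
    """Return muscle torque (MUS) and dynamic muscle torque (DMUS)."""
    mus = net + g + inter   # rearranged from NET = MUS - G - INT
    dmus = mus - g          # gravity-free ("dynamic") muscle torque
    return mus, dmus

# Example where DMUS opposes INT, the pattern reported for the elbow joint
net, g, inter = -1.0, 1.0, 0.8
mus, dmus = decompose(net, g, inter)
print(round(mus, 3), round(dmus, 3))  # prints: 0.8 -0.2 (DMUS opposite in sign to INT)
```

With these made-up values, DMUS (-0.2) carries the opposite sign to INT (0.8), which is the counteracting pattern the abstract describes at the elbow.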
Minetti, Andrea; Hurtado, Northan; Grais, Rebecca F; Ferrari, Matthew
2014-01-15
Current mass vaccination campaigns in measles outbreak response are nonselective with respect to the immune status of individuals. However, the heterogeneity in immunity, due to previous vaccination coverage or infection, may lead to potential bias of such campaigns toward those with previous high access to vaccination and may result in a lower-than-expected effective impact. During the 2010 measles outbreak in Malawi, only 3 of the 8 districts where vaccination occurred achieved a measurable effective campaign impact (i.e., a reduction in measles cases in the targeted age groups greater than that observed in nonvaccinated districts). Simulation models suggest that selective campaigns targeting hard-to-reach individuals are of greater benefit, particularly in highly vaccinated populations, even for low target coverage and with late implementation. However, the choice between targeted and nonselective campaigns should be context specific, achieving a reasonable balance of feasibility, cost, and expected impact. In addition, it is critical to develop operational strategies to identify and target hard-to-reach individuals.
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured, along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
K. Geina
2015-03-01
Purpose. To analyze the dynamics of the age structure of pike (Esox lucius Linnaeus, 1758) in the lower reaches of the Dnieper under changing fishing pressure. Methodology. An analysis of the fishing situation was performed based on data from official fishery statistics. Fish sampling was done at control-observation posts of the Institute of Fisheries of the NAAS of Ukraine and directly in the fishery. Juvenile fish yield was determined using a complex of fry fishing gears at a stationary net-station. Field and cameral processing of the material was performed using generally accepted methods. Findings. A retrospective analysis of the situation in the Dnieper-Bug lower reach system clearly indicates a continuous decline in catches of a representative of the native fish fauna, the pike. While the indices of the "yield" of its juveniles were relatively uniform both before the impoundment of the Dnieper flow and at present, its commercial catches have dropped significantly. The dynamics of the current age structure of pike indicate an increase in the relative density of the age groups that form the recruitment of the commercial portion of the population (1-1+) and a decrease in the importance of the right side of the age series. The discrepancy between the observed changes in the age groups and commercial harvest quantities indicates increased human pressure on this species. Originality. For the first time, we analyzed the dynamics of juvenile fish "yield" and the age structure of the commercial pike stock of the Dnieper lower reaches during the transformation of the river flow. Practical value. A decrease in the ichthyomass of piscivorous fishes in the Dnieper lower reaches results in changes in the fish populations of littoral biotopes towards the dominance of coarse species, which leads to a deterioration of forage availability for a number of valuable commercial species. An increase in the number of pike can regulate the strain
Reach/frequency for printed media: Personal probabilities or models
Mortensen, Peter Stendahl
2000-01-01
The author evaluates two different ways of estimating the reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates the parameters of a reach/frequency model. It is concluded...
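The first, simulation-based approach described above can be sketched in a few lines. The media plan, reading probabilities, and sample size below are hypothetical, purely to illustrate the mechanics:

```python
import random

random.seed(1)

# Hypothetical plan: 3 insertions in title "A", 2 in title "B".
insertions = {"A": 3, "B": 2}

# Hypothetical per-respondent reading probabilities for each title.
respondents = [{"A": random.random(), "B": random.random()}
               for _ in range(10_000)]

exposures = []
for person in respondents:
    # Each insertion is an independent chance that this person reads it.
    n = sum(1
            for title, k in insertions.items()
            for _ in range(k)
            if random.random() < person[title])
    exposures.append(n)

reached = [n for n in exposures if n > 0]
reach = len(reached) / len(exposures)        # share exposed at least once
avg_frequency = sum(reached) / len(reached)  # mean exposures among the reached
print(f"reach = {reach:.2f}, average frequency = {avg_frequency:.2f}")
```

The model-based alternative would instead fit a parametric distribution (e.g. beta-binomial) to exposure counts and read reach and frequency off the fitted parameters.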
Kaufman Jay S
2008-07-01
Full Text Available Abstract In 2004, Garcia-Berthou and Alcaraz published "Incongruence between test statistics and P values in medical papers," a critique of statistical errors that received a tremendous amount of attention. One of their observations was that the final reported digit of p-values in articles published in the journal Nature departed substantially from the uniform distribution that they suggested should be expected. In 2006, Jeng critiqued that critique, observing that the statistical analysis of those terminal digits had been based on comparing the actual distribution to a uniform continuous distribution, when digits obviously are discretely distributed. Jeng corrected the calculation and reported statistics that did not so clearly support the claim of a digit preference. However delightful it may be to read a critique of statistical errors in a critique of statistical errors, we nevertheless found several aspects of the whole exchange to be quite troubling, prompting our own meta-critique of the analysis. The previous discussion emphasized statistical significance testing. But there are various reasons to expect departure from the uniform distribution in terminal digits of p-values, so that simply rejecting the null hypothesis is not terribly informative. Much more importantly, Jeng found that the original p-value of 0.043 should have been 0.086, and suggested this represented an important difference because it was on the other side of 0.05. Among the most widely reiterated (though often ignored) tenets of modern quantitative research methods is that we should not treat statistical significance as a bright line test of whether we have observed a phenomenon. Moreover, it sends the wrong message about the role of statistics to suggest that a result should be dismissed because of limited statistical precision when it is so easy to gather more data. In response to these limitations, we gathered more data to improve the statistical precision, and
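The corrected analysis Jeng proposed, a goodness-of-fit test against a uniform *discrete* distribution on the digits 0-9, can be sketched as follows. The p-values here are simulated stand-ins for the digits scraped from published articles:

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical stand-in data: 500 p-values reported to three decimals.
p_values = [round(random.random(), 3) for _ in range(500)]
last_digits = [int(f"{p:.3f}"[-1]) for p in p_values]

# Chi-square goodness-of-fit against a uniform discrete distribution
# over the digits 0-9, rather than a continuous uniform.
counts = Counter(last_digits)
expected = len(last_digits) / 10
chi_sq = sum((counts.get(d, 0) - expected) ** 2 / expected
             for d in range(10))
print(f"chi-square statistic on 9 degrees of freedom: {chi_sq:.2f}")
```

With 9 degrees of freedom the 5% critical value is about 16.9, so only values above that would suggest a digit preference; and as the abstract argues, even then the bright-line reading would be a mistake.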
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
... Standard Drink? Drinking Levels Defined. Alcohol Facts and Statistics: Alcohol Use in the United States ... 1245, 2004. PMID: 15010446. National Center for Statistics and Analysis. 2014 Crash Data Key Findings (Traffic ...
Recreational Boating Statistics 2012
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Mathematical and statistical analysis
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Neuroendocrine Tumor: Statistics
... Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 11/ ... people in the United States are diagnosed with Merkel cell skin cancer each year. Almost all people diagnosed with the ...
Experiment in Elementary Statistics
Fernando, P. C. B.
1976-01-01
Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)
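The laboratory exercise described above can be reproduced in a few lines of simulation; the sample size and the particular checks here are illustrative choices, not the exercise's actual protocol:

```python
import random
import statistics

random.seed(42)

# Draw a large sample from a standard normal and verify textbook
# properties of the Gaussian distribution empirically.
sample = [random.gauss(0.0, 1.0) for _ in range(100_000)]

mean = statistics.fmean(sample)
stdev = statistics.stdev(sample)
within_1sd = sum(abs(x) < 1.0 for x in sample) / len(sample)

print(f"mean = {mean:.3f}, sd = {stdev:.3f}, P(|X| < 1) = {within_1sd:.3f}")
# Theory: mean 0, sd 1, and about 68.3% of draws within one sd.
```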
Overweight and Obesity Statistics
... the full list of resources. About Overweight and Obesity: prevalence of ... adults age 20 and older. Physical Activity Statistics, Adults, Research Findings: research suggests that staying active ...
... Research. AMIGAS: Fighting Cervical Cancer Worldwide. Statistics for Other Kinds of Cancer: Breast, Cervical, Colorectal ( ... Skin, Vaginal and Vulvar Cancer. Uterine Cancer Statistics ...
School Violence: Data & Statistics
... Injury Center, School Violence: Data & Statistics. The first ... fact sheet provides up-to-date data and statistics on youth violence. Data Sources: Indicators of School ...
Recreational Boating Statistics 2013
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Justin Horowitz
2016-01-01
Full Text Available Improvements in human-machine interaction may help overcome the unstable and uncertain environments that cause problems in everyday living. Here we experimentally evaluated intent feedback (IF), which estimates and displays the human operator's underlying intended trajectory in real-time. IF is a filter that combines a model of the arm with position and force data to determine the intended position. Subjects performed targeted reaching motions while seeing either their actual hand position or their estimated intent as a cursor while they experienced white noise forces rendered by a robotic handle. We found significantly better reaching performance during force exposure using the estimated intent. Additionally, in a second set of subjects with a reduced modeled stiffness, IF reduced estimated arm stiffness to about half that without IF, indicating a more relaxed state of operation. While visual distortions typically degrade performance and require an adaptation period to overcome, this particular distortion immediately enhanced performance. In the future, this method could provide novel insights into the nature of control. IF might also be applied in driving and piloting applications to best follow a person's desire in unpredictable or turbulent conditions.
On two methods of statistical image analysis
Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, KL
1999-01-01
The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition, smooth
Ranald Macdonald and statistical inference.
Smith, Philip T
2009-05-01
Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for British Journal of Mathematical and Statistical Psychology and as a participant and organizer at the British Psychological Society's Mathematics, statistics and computing section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information was relevant in interpreting the results of significance tests led him to be a persuasive advocate for the 'Weak Fisherian' form of hypothesis testing.
Significance evaluation in factor graphs
Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet
2017-01-01
Background: Factor graphs provide a flexible and general framework for specifying probability distributions. They can capture a range of popular and recent models for analysis of both genomics data as well as data from other scientific fields. Owing to the ever larger data sets encountered in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. We here address the problem of evaluating the statistical significance of observations from factor graph models. Results: Two novel numerical approximations for evaluation of statistical ... Conclusions: The applicability of saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially improve computational cost without compromising accuracy. This contribution allows analyses of large datasets
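The abstract names importance sampling as one of its two approximations. A minimal, generic sketch of the idea follows; this is not the authors' implementation, and a simple Gaussian tail event stands in for a factor-graph statistic:

```python
import math
import random

random.seed(0)

t = 4.0       # threshold: estimate the tiny tail probability P(Z > 4)
n = 100_000

# Draw from a proposal centred on the threshold, then reweight each draw
# by the likelihood ratio between the target and proposal densities.
total = 0.0
for _ in range(n):
    x = random.gauss(t, 1.0)
    if x > t:
        total += math.exp(-x * x / 2) / math.exp(-(x - t) ** 2 / 2)
estimate = total / n
print(f"importance-sampling estimate of P(Z > 4): {estimate:.2e}")
```

The true value is about 3.17e-5; naive Monte Carlo with the same 100,000 draws would expect only around three exceedances, which is why significance evaluation at genomics-scale multiple-testing thresholds needs this kind of variance reduction.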
Software for Spatial Statistics
Edzer Pebesma
2015-02-01
Full Text Available We give an overview of the papers published in this special issue on spatial statistics, of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.
Software for Spatial Statistics
Edzer Pebesma; Roger Bivand; Paulo Justiniano Ribeiro
2015-01-01
We give an overview of the papers published in this special issue on spatial statistics, of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.
Walking is not like reaching: evidence from periodic mechanical perturbations.
Jooeun Ahn
Full Text Available The control architecture underlying human reaching has been established, at least in broad outline. However, despite extensive research, the control architecture underlying human locomotion remains unclear. Some studies show evidence of high-level control focused on lower-limb trajectories; others suggest that nonlinear oscillators such as lower-level rhythmic central pattern generators (CPGs play a significant role. To resolve this ambiguity, we reasoned that if a nonlinear oscillator contributes to locomotor control, human walking should exhibit dynamic entrainment to periodic mechanical perturbation; entrainment is a distinctive behavior of nonlinear oscillators. Here we present the first behavioral evidence that nonlinear neuro-mechanical oscillators contribute to the production of human walking, albeit weakly. As unimpaired human subjects walked at constant speed, we applied periodic torque pulses to the ankle at periods different from their preferred cadence. The gait period of 18 out of 19 subjects entrained to this mechanical perturbation, converging to match that of the perturbation. Significantly, entrainment occurred only if the perturbation period was close to subjects' preferred walking cadence: it exhibited a narrow basin of entrainment. Further, regardless of the phase within the walking cycle at which perturbation was initiated, subjects' gait synchronized or phase-locked with the mechanical perturbation at a phase of gait where it assisted propulsion. These results were affected neither by auditory feedback nor by a distractor task. However, the convergence to phase-locking was slow. These characteristics indicate that nonlinear neuro-mechanical oscillators make at most a modest contribution to human walking. Our results suggest that human locomotor control is not organized as in reaching to meet a predominantly kinematic specification, but is hierarchically organized with a semi-autonomous peripheral oscillator operating under
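The entrainment behavior this study tests can be illustrated with the classic Adler phase equation for a weakly forced nonlinear oscillator. The coupling strength, periods, and drift criterion below are illustrative choices, not values fitted to the experiment:

```python
import math

def entrains(natural_period, forcing_period, coupling=0.05,
             dt=0.001, settle_s=150.0, test_s=50.0):
    """Return True if a weakly forced phase oscillator phase-locks 1:1
    to periodic forcing (Adler equation for the phase difference)."""
    d_omega = 2 * math.pi / natural_period - 2 * math.pi / forcing_period
    psi = 0.0  # phase difference between oscillator and forcing
    for _ in range(int(settle_s / dt)):       # let transients die out
        psi += (d_omega - coupling * math.sin(psi)) * dt
    psi_settled = psi
    for _ in range(int(test_s / dt)):         # then watch for residual drift
        psi += (d_omega - coupling * math.sin(psi)) * dt
    return abs(psi - psi_settled) < 0.1       # no drift => entrained

# Narrow basin of entrainment: locking occurs only when the forcing
# period is close to the preferred period (|d_omega| < coupling).
print(entrains(1.0, 1.005))  # near preferred cadence -> locks
print(entrains(1.0, 1.05))   # far from preferred cadence -> drifts
```

The weak coupling reproduces the abstract's two signatures: a narrow basin of entrainment and slow convergence to phase-locking.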
Significance analysis of prognostic signatures.
Andrew H Beck
Full Text Available A major goal in translational cancer research is to identify biological signatures driving cancer progression and metastasis. A common technique applied in genomics research is to cluster patients using gene expression data from a candidate prognostic gene set, and if the resulting clusters show statistically significant outcome stratification, to associate the gene set with prognosis, suggesting its biological and clinical importance. Recent work has questioned the validity of this approach by showing in several breast cancer data sets that "random" gene sets tend to cluster patients into prognostically variable subgroups. This work suggests that new rigorous statistical methods are needed to identify biologically informative prognostic gene sets. To address this problem, we developed Significance Analysis of Prognostic Signatures (SAPS), which integrates standard prognostic tests with a new prognostic significance test based on stratifying patients into prognostic subtypes with random gene sets. SAPS ensures that a significant gene set is not only able to stratify patients into prognostically variable groups, but is also enriched for genes showing strong univariate associations with patient prognosis, and performs significantly better than random gene sets. We use SAPS to perform a large meta-analysis (the largest completed to date) of prognostic pathways in breast and ovarian cancer and their molecular subtypes. Our analyses show that only a small subset of the gene sets found statistically significant using standard measures achieve significance by SAPS. We identify new prognostic signatures in breast and ovarian cancer and their corresponding molecular subtypes, and we show that prognostic signatures in ER negative breast cancer are more similar to prognostic signatures in ovarian cancer than to prognostic signatures in ER positive breast cancer. SAPS is a powerful new method for deriving robust prognostic biological signatures from clinically
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Teena Padiyar
2013-10-01
Full Text Available Background & Objective: Cerebrovascular accident is a major disease that leads to an increase in the number of people with motor or sensory impairment or loss of function on one side of the body (hemiplegia). Poor sitting ability is a common problem after stroke. Sitting involves not only the ability to maintain the seated posture, but also the ability to reach for a variety of objects located both within and beyond arm's length. Contoured foam seats (CFS) have been shown in previous studies to improve sitting posture, head control and upper-extremity function in pediatric patients with cerebral palsy. This study was therefore designed to evaluate the effectiveness of a contoured foam seat on sitting posture and multidirectional reaching ability in acute stroke subjects. Methodology: This pilot study sampled ten acute stroke subjects. After informed consent was taken, subjects were seated on a chair and multidirectional reaching distance was measured with and without the CFS, and the obtained data were analyzed. Data were collected by measuring the maximum reaching distance. Result: Reaching ability in the sitting position improved significantly after application of the CFS. Reaching to the unaffected side improved significantly more than forward and affected-side reaching after application of the CFS. Discussion & Conclusion: A contoured foam seat can significantly improve pelvic alignment and provide good postural stability, thereby improving sitting posture and functional reaching ability in acute stroke subjects.
Changes in context and perception of maximum reaching height.
Wagman, Jeffrey B; Day, Brian M
2014-01-01
Successfully performing a given behavior requires flexibility in both perception and behavior. In particular, doing so requires perceiving whether that behavior is possible across the variety of contexts in which it might be performed. Three experiments investigated how (changes in) context (i.e., point of observation and intended reaching task) influenced perception of maximum reaching height. The results of experiment 1 showed that perceived maximum reaching height more closely reflected actual reaching ability when perceivers occupied a point of observation that was compatible with that required for the reaching task. The results of experiments 2 and 3 showed that practice perceiving maximum reaching height from a given point of observation improved perception of maximum reaching height from a different point of observation, regardless of whether such practice occurred at a compatible or incompatible point of observation. In general, such findings show bounded flexibility in perception of affordances and are thus consistent with a description of perceptual systems as smart perceptual devices.
Selling statistics [Statistics in scientific progress]
Bridle, S. [Astrophysics Group, University College London (United Kingdom)]. E-mail: sarah@star.ucl.ac.uk
2006-09-15
From Cosmos to Chaos - Peter Coles, 2006, Oxford University Press, 224pp. To confirm or refute a scientific theory you have to make a measurement. Unfortunately, however, measurements are never perfect: the rest is statistics. Indeed, statistics is at the very heart of scientific progress, but it is often poorly taught and badly received; for many, the very word conjures up half-remembered nightmares of 'null hypotheses' and 'Student's t-tests'. From Cosmos to Chaos by Peter Coles, a cosmologist at Nottingham University, is an approachable antidote that places statistics in a range of catchy contexts. Using this book you will be able to calculate the probabilities in a game of bridge or in a legal trial based on DNA fingerprinting, impress friends by talking confidently about entropy, and stretch your mind thinking about quantum mechanics. (U.K.)
Statistics Essentials For Dummies
Rumsey, Deborah
2010-01-01
Statistics Essentials For Dummies not only provides students enrolled in Statistics I with an excellent high-level overview of key concepts, but it also serves as a reference or refresher for students in upper-level statistics courses. Free of review and ramp-up material, Statistics Essentials For Dummies sticks to the point, with content focused on key course topics only. It provides discrete explanations of essential concepts taught in a typical first semester college-level statistics course, from odds and error margins to confidence intervals and conclusions. This guide is also a perfect re
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Reaching during virtual rotation: context specific compensations for expected coriolis forces.
Cohn, J V; DiZio, P; Lackner, J R
2000-06-01
Subjects who are in an enclosed chamber rotating at constant velocity feel physically stationary but make errors when pointing to targets. Reaching paths and endpoints are deviated in the direction of the transient inertial Coriolis forces generated by their arm movements. By contrast, reaching movements made during natural, voluntary torso rotation seem to be accurate, and subjects are unaware of the Coriolis forces generated by their movements. This pattern suggests that the motor plan for reaching movements uses a representation of body motion to prepare compensations for impending self-generated accelerative loads on the arm. If so, stationary subjects who are experiencing illusory self-rotation should make reaching errors when pointing to a target. These errors should be in the direction opposite the Coriolis accelerations their arm movements would generate if they were actually rotating. To determine whether such compensations exist, we had subjects in four experiments make visually open-loop reaches to targets while they were experiencing compelling illusory self-rotation and displacement induced by rotation of a complex, natural visual scene. The paths and endpoints of their initial reaching movements were significantly displaced leftward during counterclockwise illusory rotary displacement and rightward during clockwise illusory self-displacement. Subjects reached in a curvilinear path to the wrong place. These reaching errors were opposite in direction to the Coriolis forces that would have been generated by their arm movements during actual torso rotation. The magnitude of path curvature and endpoint errors increased as the speed of illusory self-rotation increased. In successive reaches, movement paths became straighter and endpoints more accurate despite the absence of visual error feedback or tactile feedback about target location. When subjects were again presented a stationary scene, their initial reaches were indistinguishable from pre
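The transient inertial Coriolis force the abstract refers to is F = -2m(Ω × v). A small sketch with purely illustrative numbers for arm mass, chamber rotation rate, and hand velocity:

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def coriolis_force(mass, omega, velocity):
    """F = -2 m (omega x v): the transient load on a limb moving with
    velocity v in a frame rotating at angular velocity omega."""
    fx, fy, fz = cross(omega, velocity)
    return (-2 * mass * fx, -2 * mass * fy, -2 * mass * fz)

# Illustrative numbers: a 2 kg arm, chamber rotating counterclockwise at
# 1 rad/s about the vertical z-axis, hand reaching forward (+y) at 1 m/s.
force = coriolis_force(2.0, (0.0, 0.0, 1.0), (0.0, 1.0, 0.0))
print(force)  # 4 N along +x: a rightward push during counterclockwise rotation
```

This is why leftward errors during illusory counterclockwise rotation are "opposite" the Coriolis force: the nervous system is compensating for a rightward load that never materializes.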
Reach adaptation and proprioceptive recalibration following terminal visual feedback of the hand
Victoria Barkley
2014-09-01
Full Text Available We have shown that when subjects reach with continuous, misaligned visual feedback of their hand, their reaches are adapted and proprioceptive sense of hand position is recalibrated to partially match the visual feedback (Salomonczyk et al., 2011). It is unclear if similar changes arise after reaching with visual feedback that is provided only at the end of the reach (i.e., terminal feedback), when there are shorter temporal intervals for subjects to experience concurrent visual and proprioceptive feedback. Subjects reached to targets with an aligned hand-cursor that provided visual feedback at the end of each reach movement across a 99-trial training block, and with a rotated cursor over 3 successive blocks of 99 trials each. After each block, no-cursor reaches, to measure aftereffects, and felt hand positions were measured. Felt hand position was determined by having subjects indicate the position of their unseen hand relative to a reference marker. We found that subjects adapted their reaches following training with rotated terminal visual feedback, yet slightly less (i.e., reach aftereffects were smaller) than subjects from a previous study who experienced continuous visual feedback. Nonetheless, current subjects recalibrated their sense of felt hand position in the direction of the altered visual feedback, but this proprioceptive change increased incrementally over the three rotated training blocks. Final proprioceptive recalibration levels were comparable to our previous studies in which subjects performed the same task with continuous visual feedback. Thus, compared to reach training with continuous, but altered visual feedback, subjects who received terminal altered visual feedback of the hand produced significant but smaller reach aftereffects and similar changes in hand proprioception when given extra training. Taken together, results suggest that terminal feedback of the hand is sufficient to drive motor adaptation, and also
Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze
2014-08-01
Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
Lectures on algebraic statistics
Drton, Mathias; Sullivant, Seth
2009-01-01
How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Estimation and inferential statistics
Sahu, Pradip Kumar; Das, Ajit Kumar
2015-01-01
This book focuses on the meaning of statistical inference and estimation. Statistical inference is concerned with the problems of estimation of population parameters and testing hypotheses. Primarily aimed at undergraduate and postgraduate students of statistics, the book is also useful to professionals and researchers in statistical, medical, social and other disciplines. It discusses current methodological techniques used in statistics and related interdisciplinary areas. Every concept is supported with relevant research examples to help readers to find the most suitable application. Statistical tools have been presented by using real-life examples, removing the “fear factor” usually associated with this complex subject. The book will help readers to discover diverse perspectives of statistical theory followed by relevant worked-out examples. Keeping in mind the needs of readers, as well as constantly changing scenarios, the material is presented in an easy-to-understand form.
Postural control during standing reach in children with Down syndrome.
Chen, Hao-Ling; Yeh, Chun-Fu; Howe, Tsu-Hsin
2015-03-01
The purpose of the present study was to investigate the dynamic postural control of children with Down syndrome (DS). Specifically, we compared postural control and goal-directed reaching performance between children with DS and typically developing children during standing reach. Standing reach performance was analyzed in three main phases using the kinematic and kinetic data collected from a force plate and a motion capture system. Fourteen children with DS, age and gender matched with fourteen typically developing children, were recruited for this study. The results showed that the demand of the standing reach task affected both dynamic postural control and reaching performance in children with DS, especially in the condition of beyond arm's length reaching. More postural adjustment strategies were recruited when reaching distance was beyond arm's length. Children with DS tended to use inefficient and conservative strategies for postural stability and reaching. That is, children with DS performed standing reach with increased reaction and execution times and decreased amplitudes of center-of-pressure displacements. Standing reach resembles the functional balance required in daily activities; it is therefore suggested as part of a strength and balance training program with graded task difficulty.
Baseline Statistics of Linked Statistical Data
Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe
2014-01-01
We are surrounded by an ever-increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: designing data models, developing infrastructures for data sharing, and building tools for data analysis. Statistical datasets curated by National Statistica
Evaluation of Juvenile Fall Chinook Stranding on the Hanford Reach, 1997-1999 Interim Report.
Wagner, Paul; Nugent, John; Price, William (Washington Department of Fish and Wildlife, Olympia, WA)
1999-02-15
Pilot work was conducted in 1997 to aid the development of the study for the 1998 Evaluation of Juvenile Fall Chinook Stranding on the Hanford Reach. The objectives of the 1997 work were to: (1) identify juvenile chinook production and rearing areas..., (2) identify sampling sites and develop the statistical parameters necessary to complete the study, (3) develop a study plan..., and (4) conduct field sampling activities...
Gender Issues in Labour Statistics.
Greenwood, Adriana Mata
1999-01-01
Presents the main features needed for labor statistics to reflect the respective situations for women and men in the labor market. Identifies topics to be covered and detail needed for significant distinctions to emerge. Explains how the choice of measurement method and data presentation can influence the final result. (Author/JOW)
Investigation of PAM-4 for extending reach in data center interconnect applications
Vegas Olmos, Juan José; Teipen, Brian; Eiselt, Nicklas
2015-01-01
Optical four-level pulse amplitude modulation (PAM-4) is being widely studied for various short-reach optical interfaces, motivated by the need to keep cost structure low, and to increase link capacity despite various constraints in component bandwidth. When considering PAM-4 in applications with reach significantly greater than 10 km, such as in extended data center interconnects which require optical amplification, impairments such as chromatic dispersion, optical filtering, and ASE must be controlled. We investigate and report on requirements of PAM-4 for extended-reach, data center...
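As background on the modulation format itself: a PAM-4 symbol carries two bits on one of four amplitude levels, commonly Gray-coded so that adjacent levels differ in a single bit. A minimal sketch (the mapping table and names are ours for illustration; real transceivers may use different level conventions):

```python
# Gray-coded PAM-4: two bits per symbol, four amplitude levels.
# Adjacent levels differ in exactly one bit, limiting bit errors
# caused by confusing neighboring levels to a single bit each.
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_modulate(bits):
    """Map a bit sequence (even length) onto PAM-4 symbol levels."""
    pairs = zip(bits[0::2], bits[1::2])  # consecutive bit pairs
    return [GRAY_PAM4[p] for p in pairs]

print(pam4_modulate([0, 0, 0, 1, 1, 1, 1, 0]))  # [-3, -1, 1, 3]
```

Doubling the bits per symbol relative to on-off keying is what lets PAM-4 raise link capacity without raising component bandwidth, at the cost of reduced level spacing and hence greater sensitivity to the impairments listed above.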
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Reach/frequency for printed media: Personal probabilities or models
Mortensen, Peter Stendahl
2000-01-01
The author evaluates two different ways of estimating reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates parameters to a model for reach/frequency. It is concluded that, in order to prevent bias, ratings per group must be used as reading probabilities. Nevertheless, in most cases, the estimates are still biased compared with panel data, thus overestimating net reach. Models with the same assumptions as with assignments of reading probabilities are presented...
Reach Scale Sediment Balance of Goodwin Creek Watershed, Mississippi
Ran, L.; Garcia, T.; Ye, S.; Harman, C. J.; Hassan, M. A.; Simon, A.
2010-12-01
Several reaches of Goodwin Creek, an experimental watershed within the Mississippi river basin, were analyzed for the period 1977-2007 in terms of long-term trends in sediment gain and loss in each reach, the relation of input and output to within-reach sediment fluxes, and the impacts of land use and bank erosion on reach sediment dynamics. Over the period 1977-2007, degradational and aggradational reaches were identified, indicating slight vertical adjustment along the mainstream. Lateral adjustment was the main response of the channel to changes in flow and sediment regimes. Event-based sediment load was estimated using suspended concentration data, bedload transport rate, and changes in cross-sectional data. Bank erosion was estimated using cross-sectional data and models. The spatial and temporal patterns of within-reach sediment dynamics correspond closely with river morphology and also reflect basin conditions over the last three decades; thus they are conditioned by coeval trends in climate, hydrology, and land use. The sediment exchange within the mainstream was calculated by the development of reach sediment balances that reveal complex spatial and temporal patterns of sediment dynamics. Sediment load during the rising limb of the hydrograph was slightly higher than that estimated for the falling limb, indicating the relative importance of sediment supply on reach sediment dynamics in the basin. Cumulative plots of sediment exchange reveal that major changes in within-reach sediment storage are associated with large floods or major inputs from bank erosion.
Statistical learning and data science
Summa, Mireille Gettler; Goldfarb, Bernard; Murtagh, Fionn; Pardoux, Catherine; Touati, Myriam
2011-01-01
Data analysis is changing fast. Driven by a vast range of application domains and affordable tools, machine learning has become mainstream. Unsupervised data analysis methods, including cluster analysis, factor analysis, and continually updated low-dimensionality mapping techniques, have reached new heights of achievement in the incredibly rich data world that we inhabit. Statistical Learning and Data Science is a work of reference in the rapidly evolving context of converging methodologies. It gathers contributions from some of the foundational thinkers in the different fields of data analysis who contributed the major theoretical results in the domain. On the methodological front, the volume includes conformal prediction and frameworks for assessing confidence in outputs, together with attendant risk. It illustrates a wide range of applications, including semantics, credit risk, energy production, genomics, and ecology. The book also addresses issues of origin and evolution in the unsupervised data analysis arena, and prese...
Phenomena and characteristics of barrier river reaches in the middle and lower Yangtze River, China
Xingying You; Jinwu Tang
2017-06-01
Alluvial river self-adjustment describes the mechanism whereby a river that was originally in an equilibrium state of sediment transport encounters some disturbance that destroys the balance and results in responses such as riverbed deformation. A systematic study of historical and recent aerial photographs and topographic maps in the Middle and Lower Reaches of the Yangtze River (MLYR) shows that river self-adjustment has the distinguishing feature of transferring from upstream to downstream, which may affect flood safety, waterway morphology, bank stability, and aquatic environmental safety over relatively long reaches downstream. As a result, it is necessary to take measures to control or block this transfer. Using the relationship of the occurrence time of channel adjustments between the upstream and downstream, 34 single-thread river reaches in the MLYR were classified into four types: corresponding, basically corresponding, basically not corresponding, and not corresponding. The latter two types, because of their ability to prevent upstream channel adjustment from transferring downstream, are called barrier river reaches in this study. Statistics indicate that barrier river reaches are generally single thread and slightly curved, with a narrow and deep cross-sectional morphology, and without flow-deflecting nodes in the upper and middle parts of reaches. Moreover, in the MLYR, barrier river reaches have a hydrogeometric coefficient of <4, a gradient >1.2‰, a silty clay content of the concave bank >9.5%, and a median diameter of the bed sediment >0.158 mm. The barrier river reach mechanism lies in its ability to effectively centralise the planimetric position of the main stream arriving from different upstream directions, meaning that no matter how the upper channel adjusts, the main stream shows little change, providing relatively stable inflow conditions for the lower reaches. Regarding river regulation, it is necessary to optimise the benefits of barrier river reaches; long
Higher harmonics increase LISA's mass reach for supermassive black holes
Arun, K G; Sathyaprakash, B S; Sinha, Siddhartha
2007-01-01
Current expectations on the signal-to-noise ratios and masses of supermassive black holes which the Laser Interferometer Space Antenna (LISA) can observe are based on using in matched filtering only the dominant harmonic of the inspiral waveform at twice the orbital frequency. Other harmonics will affect the signal-to-noise ratio of systems currently believed to be observable by LISA. More significantly, inclusion of other harmonics in our matched filters would mean that more massive systems that were previously thought to be not visible in LISA should be detectable with reasonable SNRs. Our estimates show that we should be able to significantly increase the mass reach of LISA and observe the more commonly occurring supermassive black holes of masses $\sim 10^8 M_\odot$. More specifically, with the inclusion of all known harmonics LISA will be able to observe even supermassive black hole coalescences with total mass $\sim 10^8 M_\odot$ ($10^9 M_\odot$) (and mass-ratio 0.1) for a low frequency cut-off of $10...
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Campbell, M J
2011-01-01
The new edition of this international bestseller continues to throw light on the world of statistics for health care professionals and medical students. Revised throughout, the 11th edition features new material in the areas of: relative risk, absolute risk and numbers needed to treat; diagnostic tests, sensitivity, specificity and ROC curves; and free statistical software. The popular self-testing exercises at the end of every chapter are strengthened by the addition of new sections on reading and reporting statistics and formula appreciation.
Optimization techniques in statistics
Rustagi, Jagdish S
1994-01-01
Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamentalto statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimiza
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanical is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Mathematical statistics with applications
Wackerly, Dennis D; Scheaffer, Richard L
2008-01-01
In their bestselling MATHEMATICAL STATISTICS WITH APPLICATIONS, premier authors Dennis Wackerly, William Mendenhall, and Richard L. Scheaffer present a solid foundation in statistical theory while conveying the relevance and importance of the theory in solving practical problems in the real world. The authors' use of practical applications and excellent exercises helps you discover the nature of statistics and understand its essential role in scientific research.
Mahalanobis, P C
1965-01-01
Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt
Clades reach highest morphological disparity early in their evolution
Hughes, Martin; Gerber, Sylvain; Albion Wills, Matthew
2013-08-01
There are few putative macroevolutionary trends or rules that withstand scrutiny. Here, we test and verify the purported tendency for animal clades to reach their maximum morphological variety relatively early in their evolutionary histories (early high disparity). We present a meta-analysis of 98 metazoan clades radiating throughout the Phanerozoic. The disparity profiles of groups through time are summarized in terms of their center of gravity (CG), with values above and below 0.50 indicating top- and bottom-heaviness, respectively. Clades that terminate at one of the "big five" mass extinction events tend to have truncated trajectories, with a significantly top-heavy CG distribution overall. The remaining 63 clades show the opposite tendency, with a significantly bottom-heavy mean CG (relatively early high disparity). Resampling tests are used to identify groups with a CG significantly above or below 0.50; clades not terminating at a mass extinction are three times more likely to be significantly bottom-heavy than top-heavy. Overall, there is no clear temporal trend in disparity profile shapes from the Cambrian to the Recent, and early high disparity is the predominant pattern throughout the Phanerozoic. Our results do not allow us to distinguish between ecological and developmental explanations for this phenomenon. To the extent that ecology has a role, however, the paucity of bottom-heavy clades radiating in the immediate wake of mass extinctions suggests that early high disparity more probably results from the evolution of key apomorphies at the base of clades rather than from physical drivers or catastrophic ecospace clearing.
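The center-of-gravity (CG) summary used above can be sketched in a few lines; this is our illustrative reconstruction (function and variable names are ours), not the authors' code:

```python
def center_of_gravity(times, disparities):
    """Disparity-weighted mean of time, with the clade's duration
    rescaled to [0, 1]: CG > 0.5 indicates a top-heavy profile,
    CG < 0.5 a bottom-heavy one (early high disparity)."""
    t0, t1 = times[0], times[-1]
    norm = [(t - t0) / (t1 - t0) for t in times]  # rescale span to [0, 1]
    total = sum(disparities)
    return sum(t * d for t, d in zip(norm, disparities)) / total

# A hypothetical clade whose disparity peaks early in its history:
cg = center_of_gravity([0, 10, 20, 30, 40], [1.0, 4.0, 2.0, 1.0, 0.5])
print(round(cg, 3))  # below 0.5 -> bottom-heavy (early high disparity)
```

The resampling test described in the abstract would then ask whether a clade's CG differs significantly from 0.50 under random reshuffling of its disparity profile.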
Ariwahjoedi, Seramika; Kosasih, Jusak Sali; Rovelli, Carlo; Zen, Freddy Permana
2016-01-01
Following our earlier work, we construct statistical discrete geometry by applying statistical mechanics to discrete (Regge) gravity. We propose a coarse-graining method for discrete geometry under the assumptions of atomism and background independence. To maintain these assumptions, restrictions are given to the theory by introducing cut-offs, both in ultraviolet and infrared regime. Having a well-defined statistical picture of discrete Regge geometry, we take the infinite degrees of freedom (large n) limit. We argue that the correct limit consistent with the restrictions and the background independence concept is not the continuum limit of statistical mechanics, but the thermodynamical limit.
2009-01-01
Ericsson is a global provider of telecommunications systems equipment and related services for mobile and fixed network operators. 3Gsim is a tool used by Ericsson in tests of the 3G RNC node. In order to validate the tests, statistics are constantly gathered within 3Gsim, and users can access them over telnet using system-specific 3Gsim commands. The statistics can be retrieved but are unstructured for the human eye and need parsing and arranging to be readable. The statist...
Annual Statistical Supplement, 2008
Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2004
Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2006
Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2016
Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2010
Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2002
Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2003
Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2011
Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2000
Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2015
Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2009
Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2014
Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2007
Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2005
Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2001
Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Record Statistics and Dynamics
Sibani, Paolo; Jensen, Henrik J.
2009-01-01
The term record statistics covers the statistical properties of records within an ordered series of numerical data obtained from observations or measurements. A record within such series is simply a value larger (or smaller) than all preceding values. The mathematical properties of records strongly... fluctuations of e.g. the energy are able to push the system past some sort of ‘edge of stability’, inducing irreversible configurational changes, whose statistics then closely follows the statistics of record fluctuations...
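The definition of a record given above translates directly into code. A short sketch of ours (names are hypothetical) that also checks the classical result that the expected number of records among n i.i.d. draws is the harmonic number H_n:

```python
import random

def record_times(series):
    """Indices at which a new record (a value larger than all
    preceding values) occurs; index 0 is always a record."""
    records, best = [], float("-inf")
    for i, x in enumerate(series):
        if x > best:
            records.append(i)
            best = x
    return records

# For i.i.d. continuous draws, position i is a record with probability
# 1/(i+1), so the expected record count over n draws is the harmonic
# number H_n (about 7.49 for n = 1000) -- growing only logarithmically.
random.seed(1)
n, trials = 1000, 200
avg = sum(len(record_times([random.random() for _ in range(n)]))
          for _ in range(trials)) / trials
harmonic = sum(1.0 / k for k in range(1, n + 1))
print(avg, harmonic)
```

The logarithmic rarity of records is what makes record-driven dynamics so slow: each new ‘edge of stability’ crossing takes exponentially longer to reach than the last.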
Shasha, Dennis
2010-01-01
Statistics is the activity of inferring results about a population given a sample. Historically, statistics books assume an underlying distribution to the data (typically, the normal distribution) and derive results under that assumption. Unfortunately, in real life, one cannot normally be sure of the underlying distribution. For that reason, this book presents a distribution-independent approach to statistics based on a simple computational counting idea called resampling. This book explains the basic concepts of resampling, then systematically presents the standard statistical measures along
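The counting idea behind resampling can be illustrated with a percentile bootstrap for a confidence interval on the mean. This is a generic sketch of the technique (names and data are ours), not an excerpt from the book:

```python
import random

def bootstrap_ci(sample, stat, n_resamples=2000, alpha=0.05, seed=0):
    """Percentile bootstrap: resample with replacement, recompute the
    statistic each time, and read the interval off the empirical
    distribution -- no assumption about the population distribution."""
    rng = random.Random(seed)
    reps = sorted(
        stat([rng.choice(sample) for _ in sample])
        for _ in range(n_resamples)
    )
    lo = reps[int((alpha / 2) * n_resamples)]
    hi = reps[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

mean = lambda xs: sum(xs) / len(xs)
data = [2.1, 2.4, 1.9, 2.8, 3.0, 2.2, 2.6, 1.7, 2.5, 2.3]
lo, hi = bootstrap_ci(data, mean)
print(lo, hi)  # interval brackets the sample mean of 2.35
```

The same counting machinery, with `stat` swapped out, yields distribution-free intervals for medians, correlations, or any other statistic of the sample.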
A Statistical Programme Assignment Model
Rosholm, Michael; Staghøj, Jonas; Svarer, Michael
assignment mechanism, which is based on the discretionary choice of case workers. This is done in a duration model context, using the timing-of-events framework to identify causal effects. We compare different assignment mechanisms, and the results suggest that a significant reduction in the average...... duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the plementation of such a system, especially the interplay between the statistical model and case workers....
Testing statistical hypotheses of equivalence
Wellek, Stefan
2010-01-01
Equivalence testing has grown significantly in importance over the last two decades, especially as its relevance to a variety of applications has become understood. Yet published work on the general methodology remains scattered in specialists' journals, and for the most part, it focuses on the relatively narrow topic of bioequivalence assessment.With a far broader perspective, Testing Statistical Hypotheses of Equivalence provides the first comprehensive treatment of statistical equivalence testing. The author addresses a spectrum of specific, two-sided equivalence testing problems, from the
Springer, Gregory S.; Wohl, Ellen E.; Foster, Julie A.; Boyer, Douglas G.
2003-11-01
An open question exists as to whether channel geometries and hydraulics are adjusted in bedrock streams with stable, concave profiles in a manner analogous to alluvial rivers. As a test of this problem, a comparison was undertaken of channel geometries and hydraulics among reaches with substrates that are of high mechanical resistance, but of variable chemical resistance. Reaches were selected from Buckeye Creek and Greenbrier River, West Virginia, USA because these streams flow over sandstones, limestones, and shales. The limestones have Selby rock resistance scores similar to those of the sandstones. A total of 13 reaches consisting of between 6 and 26 cross sections were surveyed in the streams. HEC-RAS was used to estimate unit stream power ( ω) and shear stress ( τ) for each reach. The reaches were selected to evaluate the null hypothesis that ω and τ are equal atop soluble versus insoluble bedrock. Hypothesis tests consisted of paired t-tests and simultaneous, multiple comparisons. Geomorphic setting was included for Greenbrier River because previous studies have suggested that bedrock streams are intimately coupled with hillslopes. Holding geomorphic setting constant, three separate comparisons of ω and τ reveal that these variables are lowest atop soluble substrates in Greenbrier River (significance ≤0.05) and that changes in ω and τ are mediated by changes in channel geometry. Similarly, headwater reaches of Buckeye Creek developed atop shale and sandstone boulders are statistically distinguishable from downstream reaches wherein corrosion of limestone is the primary means of incision. However, comparisons in each stream reveal that channel geometries, ω and τ, are not strictly controlled by bed solubility. For constant substrate solubility along the Greenbrier River, ω and τ are consistently higher where a bedrock cutbank is present or coarse, insoluble sediment enters the channel. The latter is also associated with locally high values
Influence of CYP2C9 and VKORC1 polymorphisms on the time required to reach the therapeutic INR
Florentina C. Militaru
2012-12-01
Oral anticoagulation (OAC) is characterized by a narrow therapeutic index and a high interindividual variability, both in terms of pharmacokinetics and pharmacodynamics. We have considered it useful and interesting to research the factors that could play a role in determining the time required for Acenocoumarol to achieve its optimal therapeutic response. Material and method: The present research is a cross-sectional analytic observational study. We included 105 patients treated with an initial dose of 4 mg Acenocoumarol, for one or more of the following clinical situations: 1. Deep venous thrombosis of the lower limbs (DVT) ± pulmonary thromboembolism (PTE); 2. Permanent atrial fibrillation (AF); 3. Prosthetic heart valve. Results and conclusions: The presence of CYP2C9*2 and CYP2C9*3 alleles did not affect the time required to reach a therapeutic INR. The c.-1639G>A polymorphism of the VKORC1 gene statistically significantly influenced the time to reach the target INR. The existence of a supratherapeutic INR during the initial phase of anticoagulant treatment causes a 35% lower probability of reaching a therapeutic INR on the fifth day of anticoagulant treatment.
Statistical Mechanics of Zooplankton.
Hinow, Peter; Nihongi, Ai; Strickler, J Rudi
2015-01-01
Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar "microscopic" quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the "ecological temperature" of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean's swimming behavior.
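The "ecological temperature" proposed above, taken as the average squared velocity of tracked individuals, can be sketched directly. The track coordinates and frame interval below are made-up example values, not the authors' Daphnia pulicaria observations.

```python
# Illustrative sketch (not the authors' code): ecological temperature as the
# mean squared speed over a 2-D track sampled at a fixed frame interval.

def ecological_temperature(track, dt):
    """Mean squared speed over a 2-D track of (x, y) positions, one per frame."""
    v2 = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
        v2.append(vx * vx + vy * vy)
    return sum(v2) / len(v2)

# toy tracks in mm; dt = 0.04 s corresponds to 25 frames per second
calm = [(0.0, 0.0), (0.1, 0.0), (0.1, 0.1), (0.2, 0.1)]
agitated = [(0.0, 0.0), (0.5, 0.2), (0.1, 0.9), (0.8, 0.3)]
dt = 0.04
print(ecological_temperature(agitated, dt) > ecological_temperature(calm, dt))
```

A higher value for the agitated track mirrors the abstract's finding that infested animals under light exposure show a greater ecological temperature.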
Should these potential CMR substances have been registered under REACH?
Wedebye, Eva Bay; Nikolov, Nikolai Georgiev; Dybdahl, Marianne;
2013-01-01
(Q)SAR models were applied to screen around 68,000 REACH pre-registered substances for CMR properties (carcinogenic, mutagenic or toxic to reproduction). Predictions from 14 relevant models were combined to reach overall calls for C, M and R. Combining predictions may reduce “noise” and increase...
Guaranteed performance in reaching mode of sliding mode controlled systems
G K Singh; K E Holé
2004-02-01
Conventionally, the parameters of a sliding mode controller (SMC) are selected so as to reduce the time spent in the reaching mode. Although an upper bound on the time to reach (reaching time) the sliding surface is easily derived, performance guarantee in the state/error space needs more consideration. This paper addresses the design of constant plus proportional rate reaching law-based SMC for second-order nonlinear systems. It is shown that this controller imposes a bounding second-order error-dynamics, and thus guarantees robust performance during the reaching phase. The choice of the controller parameters based on the time to reach a desirable level of output tracking error (OTE), rather than on the reaching time is proposed. Using the Lyapunov theory, it is shown that parameter selections based on the reaching time criterion may need substantially larger time to achieve the OTE. Simulation results are presented for a nonlinear spring-mass-damper system. It is seen that parameter selections based on the proposed OTE criterion result in substantially quicker tracking, while using similar levels of control effort.
A Statistical Programme Assignment Model
Rosholm, Michael; Staghøj, Jonas; Svarer, Michael
When treatment effects of active labour market programmes are heterogeneous in an observable way across the population, the allocation of the unemployed into different programmes becomes a particularly important issue. In this paper, we present a statistical model designed to improve the present assignment mechanism, which is based on the discretionary choice of case workers. This is done in a duration model context, using the timing-of-events framework to identify causal effects. We compare different assignment mechanisms, and the results suggest that a significant reduction in the average duration of unemployment spells may result if a statistical programme assignment model is introduced. We discuss several issues regarding the implementation of such a system, especially the interplay between the statistical model and case workers.
Statistical Engine Knock Control
Stotsky, Alexander A.
2008-01-01
A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency...
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
Statistical Hadronization and Holography
Bechi, Jacopo
2009-01-01
In this paper we consider some issues about the statistical model of the hadronization in a holographic approach. We introduce a Rindler-like horizon in the bulk and we understand the string breaking as a tunneling event under this horizon. We calculate the hadron spectrum and we get a thermal, and so statistical, shape for it.
Handbook of Spatial Statistics
Gelfand, Alan E
2010-01-01
Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.
Practical statistics simply explained
Langley, Dr Russell A
1971-01-01
For those who need to know statistics but shy away from math, this book teaches how to extract truth and draw valid conclusions from numerical data using logic and the philosophy of statistics rather than complex formulae. Lucid discussion of averages and scatter, investigation design, more. Problems with solutions.
Statistical methods in astronomy
Long, James P.; de Souza, Rafael S.
2017-01-01
We present a review of data types and statistical methods often encountered in astronomy. The aim is to provide an introduction to statistical applications in astronomy for statisticians and computer scientists. We highlight the complex, often hierarchical, nature of many astronomy inference problems and advocate for cross-disciplinary collaborations to address these challenges.
Lauritzen, Steffen Lilholt
This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...
Inductive Logic and Statistics
Romeijn, J. -W.
2009-01-01
This chapter concerns inductive logic in relation to mathematical statistics. I start by introducing a general notion of probabilistic inductive inference. Then I introduce Carnapian inductive logic, and I show that it can be related to Bayesian statistical inference via de Finetti's representatio
Statistical mechanics of pluripotency.
MacArthur, Ben D; Lemischka, Ihor R
2013-08-01
Recent reports using single-cell profiling have indicated a remarkably dynamic view of pluripotent stem cell identity. Here, we argue that the pluripotent state is not well defined at the single-cell level but rather is a statistical property of stem cell populations, amenable to analysis using the tools of statistical mechanics and information theory.
Council of Ontario Universities, Toronto.
Summary statistics on application and registration patterns of applicants wishing to pursue full-time study in first-year places in Ontario universities (for the fall of 1987) are given. Data on registrations were received indirectly from the universities as part of their annual submission of USIS/UAR enrollment data to Statistics Canada and MCU.…
Deconstructing Statistical Analysis
Snell, Joel
2014-01-01
Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…
Practical statistics for educators
Ravid, Ruth
2014-01-01
Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.
A comparison of the spine posture among several sit-and-reach test protocols.
Miñarro, Pedro A López; Andújar, Pilar Sáinz de Baranda; García, Pedro L Rodríguez; Toro, Enrique Ortega
2007-12-01
The purpose of the study was to compare the thoracic and lumbar spine posture among different sit-and-reach tests. Fifty-eight men and 47 women were asked to perform three trials of sit-and-reach test (SR), toe-touch test (TT), back-saver sit-and-reach test (BS) right and left, unilateral seated sit-and-reach test (USR) right and left, and V sit-and-reach test (VSR). Thoracic and lumbar angles were assessed with an inclinometer when subjects reached forward maximally. Women had a lower thoracic angle than men on all tests (pspine when compared to other tests (30.5 degrees in men and 32.0 degrees in women). Unilateral seated sit-and-reach test presented the lowest lumbar angle in men (24.2 degrees for right leg and 23.9 degrees for left leg) and women (23.9 degrees in both legs) and there were significant differences with respect to the other tests. Characteristics and administration procedures of tests, such as uni- or bilateral, sitting or standing, measuring with or without box, parallel or V position, and hip position influence thoracic and lumbar postures.
Statistical laws in linguistics
Altmann, Eduardo G
2015-01-01
Zipf's law is just one out of many universal laws proposed to describe statistical regularities in language. Here we review and critically discuss how these laws can be statistically interpreted, fitted, and tested (falsified). The modern availability of large databases of written text allows for tests with an unprecedented statistical accuracy and also a characterization of the fluctuations around the typical behavior. We find that fluctuations are usually much larger than expected based on simplifying statistical assumptions (e.g., independence and lack of correlations between observations). These simplifications appear also in usual statistical tests so that the large fluctuations can be erroneously interpreted as a falsification of the law. Instead, here we argue that linguistic laws are only meaningful (falsifiable) if accompanied by a model for which the fluctuations can be computed (e.g., a generative model of the text). The large fluctuations we report show that the constraints imposed by linguistic laws...
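The fitting step discussed above can be sketched with a least-squares estimate of the Zipf exponent on log-log rank-frequency data. The token counts below are toy values, not a real corpus, and, as the abstract stresses, a serious test would also need a model of the fluctuations.

```python
# Minimal sketch, assuming toy data: estimate the Zipf exponent as the
# negative least-squares slope of log frequency vs. log rank.
import math

def zipf_exponent(freqs):
    """Return the fitted Zipf exponent for a list of word frequencies."""
    pts = [(math.log(r), math.log(f))
           for r, f in enumerate(sorted(freqs, reverse=True), start=1)]
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, _ in pts)
    return -num / den  # Zipf's law predicts an exponent near 1

counts = [1000, 500, 333, 250, 200, 167, 143, 125, 111, 100]  # roughly 1000/rank
print(zipf_exponent(counts))
```

Because the toy counts follow 1000/rank almost exactly, the fitted exponent is close to 1; real text deviates far more than independence-based tests assume.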
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...
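One of the introductory topics listed above, Bayesian inference for a binomial proportion, has a closed-form conjugate update that fits in a few lines. The prior and data values here are arbitrary examples, not from the book.

```python
# Conjugate Beta-Binomial update: Beta(a, b) prior plus binomial data
# yields a Beta posterior. Example values are illustrative assumptions.

def beta_binomial_update(a, b, successes, failures):
    """Return the Beta posterior parameters after observing binomial data."""
    return a + successes, b + failures

# uniform Beta(1, 1) prior, then 7 successes and 3 failures observed
a, b = beta_binomial_update(1, 1, successes=7, failures=3)
posterior_mean = a / (a + b)
print(a, b, posterior_mean)  # Beta(8, 4) posterior, mean 2/3
```

The posterior mean (2/3) sits between the prior mean (1/2) and the sample proportion (0.7), a standard illustration of Bayesian shrinkage.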
Statistical Methods for Astronomy
Feigelson, Eric D
2012-01-01
This review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests and point estimation for use in the analysis of modern astronomical data. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are treated. Resampling methods, particularly the bootstrap, provide valuable procedures when distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered. Applied statistics relevant to astronomical research are briefly discussed: nonparametric methods for use when little is known about the behavior of the astronomical populations or processes; data smoothing with kernel density estimation and nonparametric regression; unsupervised clustering and supervised classification procedures for multivariate problems; survival analysis for astronomical datasets with nondetections; time- and frequency-domain time series analysis for light curves; and spatial statistics to interpret the spati...
Scheen, A J; Schmitt, H; Jiang, H H; Ivanyi, T
2017-02-01
To evaluate factors associated with reaching or not reaching target glycated haemoglobin (HbA1c) levels by analysing the respective contributions of fasting hyperglycaemia (FHG), also referred to as basal hyperglycaemia, vs postprandial hyperglycaemia (PHG) before and after initiation of a basal or premixed insulin regimen in patients with type 2 diabetes. This post-hoc analysis of insulin-naïve patients in the DURABLE study randomised to receive either insulin glargine or insulin lispro mix 25 evaluated the percentages of patients achieving a target HbA1c of reached the target HbA1c. The higher the HbA1c quartile, the greater was the decrease in HbA1c, but also the smaller the percentage of patients achieving the target HbA1c. HbA1c and FHG decreased more in patients reaching the target, resulting in significantly lower values at endpoint in all baseline HbA1c quartiles with either insulin treatment. Patients not achieving the target HbA1c had slightly higher insulin doses, but lower total hypoglycaemia rates. Smaller decreases in FHG were associated with not reaching the target HbA1c, suggesting a need to increase basal or premixed insulin doses to achieve targeted fasting plasma glucose and improve patient response before introducing more intensive prandial insulin regimens. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Freitas, Juliana G; Rivett, Michael O; Roche, Rachel S; Durrant Neé Cleverly, Megan; Walker, Caroline; Tellam, John H
2015-02-01
The typically elevated natural attenuation capacity of riverbed-hyporheic zones is expected to decrease chlorinated hydrocarbon (CHC) groundwater plume discharges to river receptors through dechlorination reactions. The aim of this study was to assess physico-chemical processes controlling field-scale variation in riverbed-hyporheic zone dechlorination of a TCE groundwater plume discharge to an urban river reach. The studied reach, a 50-m-long pool-riffle-glide sequence of the River Tame in Birmingham (UK), is a heterogeneous, high-energy river environment. The shallow riverbed was instrumented with a detailed network of multilevel samplers. Freeze coring revealed a geologically heterogeneous and poorly sorted riverbed. A chlorine number reduction approach provided a quantitative indicator of CHC dechlorination. Three sub-reaches of contrasting behaviour were identified. Greatest dechlorination occurred in the riffle sub-reach that was characterised by hyporheic zone flows, moderate sulphate concentrations and pH, anaerobic conditions, low iron, but elevated manganese concentrations with evidence of sulphate reduction. Transient hyporheic zone flows allowing input to varying riverbed depths of organic matter are anticipated to be a key control. The glide sub-reach displayed negligible dechlorination attributed to the predominant groundwater baseflow discharge condition, absence of hyporheic zone, transition to more oxic conditions and elevated sulphate concentrations expected to locally inhibit dechlorination. The tail-of-pool-riffle sub-reach exhibited patchy dechlorination that was attributed to sub-reach complexities including significant flow bypass of a low permeability, high organic matter, silty unit of high dechlorination potential. A process-based conceptual model of reach-scale dechlorination variability was developed. Key findings of practitioner relevance were: riverbed-hyporheic zone CHC dechlorination may provide only a partial, somewhat patchy barrier to CHC
Decoding Grasping Movements from the Parieto-Frontal Reaching Circuit in the Nonhuman Primate.
Nelissen, Koen; Fiave, Prosper Agbesi; Vanduffel, Wim
2017-02-18
Prehension movements typically include a reaching phase, guiding the hand toward the object, and a grip phase, shaping the hand around it. The dominant view posits that these components rely upon largely independent parieto-frontal circuits: a dorso-medial circuit involved in reaching and a dorso-lateral circuit involved in grasping. However, mounting evidence suggests a more complex arrangement, with dorso-medial areas contributing to both reaching and grasping. To investigate the role of the dorso-medial reaching circuit in grasping, we trained monkeys to reach-and-grasp different objects in the dark and determined if hand configurations could be decoded from functional magnetic resonance imaging (fMRI) responses obtained from the reaching and grasping circuits. Indicative of their established role in grasping, object-specific grasp decoding was found in anterior intraparietal (AIP) area, inferior parietal lobule area PFG and ventral premotor region F5 of the lateral grasping circuit, and primary motor cortex. Importantly, the medial reaching circuit also conveyed robust grasp-specific information, as evidenced by significant decoding in parietal reach regions (in particular V6A) and dorsal premotor region F2. These data support the proposed role of dorso-medial "reach" regions in controlling aspects of grasping and demonstrate the value of complementing univariate with more sensitive multivariate analyses of fMRI data in uncovering information coding in the brain. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Chiara eBegliomini
2014-09-01
Full Text Available Experimental evidence suggests the existence of a sophisticated brain circuit specifically dedicated to reach-to-grasp planning and execution, both in human and non-human primates (Castiello, 2005). Studies accomplished by means of neuroimaging techniques suggest the hypothesis of a dichotomy between a reach-to-grasp circuit, involving the intraparietal area (AIP) and the dorsal and ventral premotor cortices (PMd and PMv; Castiello and Begliomini, 2008; Filimon, 2010), and a reaching circuit involving the medial intraparietal area (mIP) and the Superior Parieto-Occipital Cortex (SPOC; Culham et al., 2006). However, the time course characterizing the involvement of these regions during the planning and execution of these two types of movements has yet to be delineated. A functional magnetic resonance imaging (fMRI) study was conducted, including reach-to-grasp and reaching-only movements, performed towards either a small or a large stimulus, and a Finite Impulse Response model (FIR; Henson, 2003) was adopted to monitor activation patterns from stimulus onset for a time window of 10 seconds duration. Data analysis focused on brain regions belonging either to the reaching or to the grasping network, as suggested by Castiello and Begliomini (2008). Results suggest that the planning and execution of reaching and grasping movements might share a common brain network, providing further confirmation of the idea that the neural underpinnings of reaching and grasping may overlap in both spatial and temporal terms (Verhagen et al., 2013).
Proprioceptive body illusions modulate the visual perception of reaching distance.
Agustin Petroni
Full Text Available The neurobiology of reaching has been extensively studied in human and non-human primates. However, the mechanisms that allow a subject to decide, without engaging in explicit action, whether an object is reachable are not fully understood. Some studies conclude that decisions near the reach limit depend on motor simulations of the reaching movement. Others have shown that the body schema plays a role in explicit and implicit distance estimation, especially after motor practice with a tool. In this study we evaluate the causal role of multisensory body representations in the perception of reachable space. We reasoned that if body schema is used to estimate reach, an illusion of the finger size induced by proprioceptive stimulation should propagate to the perception of reaching distances. To test this hypothesis we induced a proprioceptive illusion of extension or shrinkage of the right index finger while participants judged a series of LEDs as reachable or non-reachable without actual movement. Our results show that reach distance estimation depends on the illusory perceived size of the finger: illusory elongation produced a shift of reaching distance away from the body whereas illusory shrinkage produced the opposite effect. Combining these results with previous findings, we suggest that deciding if a target is reachable requires an integration of body inputs in high order multisensory parietal areas that engage in movement simulations through connections with frontal premotor areas.
The impact of REACH on classification for human health hazards.
Oltmanns, J; Bunke, D; Jenseit, W; Heidorn, C
2014-11-01
The REACH Regulation represents a major piece of chemical legislation in the EU and requires manufacturers and importers of chemicals to assess the safety of their substances. The classification of substances for their hazards is one of the crucial elements in this process. We analysed the effect of REACH on classification for human health endpoints by comparing information from REACH registration dossiers with legally binding, harmonised classifications. The analysis included 142 chemicals produced at very high tonnages in the EU, the majority of which have already been assessed in the past. Of 20 substances lacking a harmonised classification, 12 chemicals were classified in REACH registration dossiers. More importantly, 37 substances with harmonised classifications for human health endpoints had stricter classifications in registration dossiers and 29 of these were classified for at least one additional endpoint not covered by the harmonised classification. Substance-specific analyses suggest that one third of these additional endpoints emerged from experimental studies performed to fulfil information requirements under REACH, while two thirds resulted from a new assessment of pre-REACH studies. We conclude that REACH leads to an improved hazard characterisation even for substances with a potentially good data basis.
HAN Xue; WEI Fengying; Yves M.TOURRE; DONG Wenjie
2008-01-01
The spatio-temporal variability of Northern Hemisphere Sea Level Pressure (SLP) and precipitation over the mid-to-low reaches of the Yangtze River (PMLY) is analyzed jointly using the multi-taper/singular value decomposition method (MTM-SVD). Statistically significant narrow frequency bands are obtained from the local fractional variance (LFV) spectrum. Significant interdecadal (i.e., 16-to-18-year periods) and interannual (i.e., 3-to-6-year periods) signals are identified. Moreover, a significant quasi-biennial signal is identified but only for PMLY data. The spatial joint evolution of patterns obtained for peaks in the LFV spectrum sheds light on relationships between SLP and PMLY: the Arctic Oscillation (AO) modulates the variability of the PMLY while the interannual variability of PMLY is in phase with the Northern Atlantic Oscillation (NAO) and the Northern Pacific Oscillation (NPO).
Statistics a complete introduction
Graham, Alan
2013-01-01
Statistics: A Complete Introduction is the most comprehensive yet easy-to-use introduction to using Statistics. Written by a leading expert, this book will help you if you are studying for an important exam or essay, or if you simply want to improve your knowledge. The book covers all the key areas of Statistics including graphs, data interpretation, spreadsheets, regression, correlation and probability. Everything you will need is here in this one book. Each chapter includes not only an explanation of the knowledge and skills you need, but also worked examples and test questions.
Statistics of football dynamics
Mendes, R S; Anteneodo, C
2007-01-01
We investigate the dynamics of football matches. Our goal is to characterize statistically the temporal sequence of ball movements in this collective sport game, searching for traits of complex behavior. Data were collected over a variety of matches in South American, European and World championships throughout 2005 and 2006. We show that the statistics of ball touches presents power-law tails and can be described by $q$-gamma distributions. To explain such behavior we propose a model that provides information on the characteristics of football dynamics. Furthermore, we discuss the statistics of duration of out-of-play intervals, not directly related to the previous scenario.
Siegel, Andrew
2011-01-01
Practical Business Statistics, Sixth Edition, is a conceptual, realistic, and matter-of-fact approach to managerial statistics that carefully maintains-but does not overemphasize-mathematical correctness. The book offers a deep understanding of how to learn from data and how to deal with uncertainty while promoting the use of practical computer applications. This teaches present and future managers how to use and understand statistics without an overdose of technical detail, enabling them to better understand the concepts at hand and to interpret results. The text uses excellent examples with
Multivariate Statistical Process Control
Kulahci, Murat
2013-01-01
As sensor and computer technology continues to improve, it has become a normal occurrence that we are confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim is to identify the "out-of-control" state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to the Hotelling's T2. For high dimensional data with excessive...
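The Hotelling's T2 statistic mentioned above can be sketched for the two-dimensional case, where the covariance inverse can be written by hand. All numbers below are made-up illustrative values, not from the book.

```python
# Hedged sketch of the T^2 monitoring statistic for 2-D data: the squared
# Mahalanobis distance of an observation from the in-control mean.

def hotelling_t2(x, mean, cov):
    """T^2 = (x - mean)' S^{-1} (x - mean) for a 2x2 covariance matrix S."""
    dx = [x[0] - mean[0], x[1] - mean[1]]
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # 2x2 matrix inverse
    y = [inv[0][0] * dx[0] + inv[0][1] * dx[1],
         inv[1][0] * dx[0] + inv[1][1] * dx[1]]
    return dx[0] * y[0] + dx[1] * y[1]

mean = [10.0, 5.0]                     # in-control process mean (toy values)
cov = [[4.0, 1.0], [1.0, 2.0]]         # in-control covariance (toy values)
in_control = hotelling_t2([10.5, 5.2], mean, cov)
out_of_control = hotelling_t2([16.0, 1.0], mean, cov)
print(in_control < out_of_control)
```

On a control chart, T2 values exceeding a chi-squared or F-based limit would flag the out-of-control state; the limit computation is omitted here.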
Wallis, W Allen
2014-01-01
Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. "W. Allen Wallis and Harry V. Roberts have made statistics fascinating." - The New York Times. The authors have set out, with considerable success, to write a text which would be of interest and value to the student who,
Mauro, John
2013-01-01
Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
Statistical Engine Knock Control
Stotsky, Alexander A.
2008-01-01
A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency. A control algorithm which is used for minimization of the regulation error realizes a simple count-up-count-down logic. A new adaptation algorithm for the knock detection threshold is also developed. A confidence interval method is used as the basis for adaptation. A simple statistical model...
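The hypothesis-test step and the count-up-count-down regulation logic described in this abstract can be sketched roughly as follows; the function names, window averaging, and step size are illustrative assumptions, not taken from the paper.

```python
def knock_decision(amplitudes, threshold):
    """Hypothesis-test step from the abstract: compare the average of
    the maximal knock-sensor amplitudes in a window to the threshold."""
    return sum(amplitudes) / len(amplitudes) > threshold

def count_up_count_down(decisions, step=1, state=0):
    """Minimal count-up/count-down integrator: increment on a knock
    decision, decrement otherwise. The accumulated state would drive
    the regulation (e.g. spark timing); the step size is a placeholder."""
    for knock in decisions:
        state += step if knock else -step
    return state
```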
Liao, Tim Futing
2011-01-01
An incomparably useful examination of statistical methods for comparison. The nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde
Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.
2017-01-01
Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.
Approximating Stationary Statistical Properties
Xiaoming WANG
2009-01-01
It is well-known that physical laws for large chaotic dynamical systems are revealed statistically. Many times these statistical properties of the system must be approximated numerically. The main contribution of this manuscript is to provide simple and natural criteria on numerical methods (temporal and spatial discretization) that are able to capture the stationary statistical properties of the underlying dissipative chaotic dynamical systems asymptotically. The result on temporal approximation is a recent finding of the author, and the result on spatial approximation is a new one. Applications to the infinite Prandtl number model for convection and the barotropic quasi-geostrophic model are also discussed.
Commentary: statistics for biomarkers.
Lovell, David P
2012-05-01
This short commentary discusses Biomarkers' requirements for the reporting of statistical analyses in submitted papers. It is expected that submitters will follow the general instructions of the journal, the more detailed guidance given by the International Committee of Medical Journal Editors, the specific guidelines developed by the EQUATOR network, and those of various specialist groups. Biomarkers expects that the study design and subsequent statistical analyses are clearly reported and that the data reported can be made available for independent assessment. The journal recognizes that there is continuing debate about different approaches to statistical science. Biomarkers appreciates that the field continues to develop rapidly and encourages the use of new methodologies.
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
D'Alessio, Michael
2012-01-01
AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da
50 years sets with positive reach - a survey -
Christoph Thäle
2008-09-01
Full Text Available The purpose of this paper is to summarize results on various aspects of sets with positive reach, which are up to now not available in such a compact form. After recalling briefly the results before 1959, sets with positive reach and their associated curvature measures are introduced. We develop an integral and current representation of these curvature measures and show how the current representation helps to prove integral-geometric formulas, such as the principal kinematic formula. Also random sets with positive reach and random mosaics (or, more generally, random cell-complexes with general cell shape) are considered.
REACH Basics for Chinese Producers of Electric Household Appliances
Dr.Klaus W.Mehl
2008-01-01
The following article explains the EU chemical regulation "REACH", explicates the requirements that Chinese producers are facing, and shows how they can fulfill the requirements and secure their access to the EU market. The consequences of failing to fulfill REACH requirements are given in REACH Article 5: No data, no market: ... substances ... in articles ... shall not be ... placed on the market unless they have been registered. In other words: without registration of chemicals, Chinese producers of electric household appliances may lose their EU market.
On the statistical assessment of classifiers using DNA microarray data
Carella M
2006-08-01
decreases as the training set size increases, reaching its best performances with 35 training examples. In this case, RLS and SVM have error rates of e = 14% (p = 0.027) and e = 11% (p = 0.019). Concerning the number of genes, we found about 6000 genes (p ...) with e = 16% (p ...). Conclusions: The method proposed provides statistically significant answers to precise questions relevant for the diagnosis and prognosis of cancer. We found that, with as few as 15 examples, it is possible to train statistically significant classifiers for colon cancer diagnosis. As for the definition of the number of genes sufficient for a reliable classification of colon cancer, our results suggest that it depends on the accuracy required.
Sachdeva, Kuldeep Singh; Kumar, Ashok; Dewan, Puneet; Kumar, Ajay; Satyanarayana, Srinath
2012-05-01
The Phase II (2006-2012) of the Revised National Tuberculosis Control Programme (RNTCP) has been successful in achieving its objectives. Tuberculosis (TB) disease burden (prevalence and mortality) in India has reduced significantly when compared to 1990 levels, and India is on track to achieve the TB related millennium development goals. Despite significant progress, TB still continues to be one of the major public health problems in the country, and intensified efforts are required to reduce TB transmission and accelerate reductions in TB incidence, particularly in urban areas and difficult terrains. Achieving 'Universal access' is possible and necessary for the country. RNTCP during the 12th Five-Year Plan (2012-2017) aims to achieve 'Universal access' to quality assured TB diagnosis and treatment, and elaborate plans are being made. This requires broad and concerted efforts and support from all stakeholders with substantial enhancement of commitment and financing at all levels. This paper describes the new vision of RNTCP and an overview of how this will be achieved.
Effect of speed manipulation on the control of aperture closure during reach-to-grasp movements.
Rand, Miya K; Squire, Linda M; Stelmach, George E
2006-09-01
This study investigates coordination between hand transport and grasp movement components by examining a hypothesis that the hand location, relative to the object, in which aperture closure is initiated remains relatively constant under a wide range of transport speed. Subjects made reach-to-grasp movements to a dowel under four speed conditions: slow, comfortable, fast but comfortable, and maximum (i.e., as fast as possible). The distance traveled by the wrist after aperture reached its maximum (aperture closure distance) increased with an increase of transport speed across the speed conditions. This finding rejected the hypothesis and suggests that the speed of hand transport is taken into account in aperture closure initiation. Within each speed condition, however, the closure distance exhibited relatively small variability across trials, even though the total distance traveled by the wrist during the entire transport movement varied from trial to trial. The observed stability in aperture closure distance across trials implies that the hand distance to the object plays an important role in the control law governing the initiation of aperture closure. Further analysis showed that the aperture closure distance depended on the amplitude of peak aperture as well as hand velocity and acceleration. To clarify the form of the above control law, we analyzed four different mathematical models, in which a decision to initiate grasp closure is made as soon as a specific movement parameter (wrist distance to target or transport time) crosses a threshold that is either a constant value or a function of the above-mentioned other movement-related parameters. Statistical analysis performed across all movement conditions revealed that the control law model (according to which grasp initiation is made when hand distance to target becomes less than a certain linear function of aperture amplitude, hand velocity, and hand acceleration) produced significantly smaller residual errors
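The best-fitting control-law model described at the end of this abstract — initiate grasp closure once the hand's distance to the target drops below a linear function of aperture amplitude, hand velocity, and hand acceleration — can be sketched as follows. The coefficients here are arbitrary placeholders, not the fitted values from the study.

```python
def closure_initiated(distance, aperture_peak, velocity, acceleration,
                      a=0.5, b=0.1, c=0.01, d=0.02):
    """Control-law model from the abstract: grasp closure starts once
    hand-to-target distance falls below a linear function of peak
    aperture amplitude, hand velocity, and hand acceleration.
    Coefficients a, b, c, d are illustrative, not fitted values."""
    threshold = a * aperture_peak + b * velocity + c * acceleration + d
    return distance < threshold
```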
What can be learnt from an ecotoxicity database in the framework of the REACh regulation?
Henegar, Adina; Mombelli, Enrico [Unite Modeles pour l' Ecotoxicologie et la Toxicologie (METO), INERIS, Parc Technologique Alata, BP2, 60550 Verneuil-en-Halatte (France); Pandard, Pascal [Unite Expertise et Essais en Ecotoxicologie (EXES), INERIS, Parc Technologique Alata, BP2, 60550 Verneuil-en-Halatte (France); Pery, Alexandre R.R., E-mail: alexandre.pery@ineris.fr [Unite Modeles pour l' Ecotoxicologie et la Toxicologie (METO), INERIS, Parc Technologique Alata, BP2, 60550 Verneuil-en-Halatte (France)
2011-01-01
Since REACh applies throughout the EU, special emphasis has been put on the reduction of systematic ecotoxicity testing. In this context, it is important to extract a maximum of information from existing ecotoxicity databases in order to propose alternative methods aimed at replacing and reducing experimental testing. Consequently, we analyzed a database of new chemicals registered in France and Europe during the last twenty years reporting aquatic ecotoxicity data with respect to three trophic levels (i.e., Algae EC50 72 h, Daphnia EC50 48 h and Fish LC50 96 h). In order to ensure the relevance of the comparison between these three experimental tests, we performed a stringent data selection based on the pertinence and quality of available ecotoxicological information. At the end of this selection, less than 5% of the initial number of chemicals was retained for subsequent analysis. Such an analysis showed that fish was the least sensitive trophic level, whereas Daphnia had the highest sensitivity. Moreover, thanks to an analysis of the relative sensitivity of trophic levels, it was possible to establish that respective correction factors of 50 and 10 would be necessary if only one or two test values were available. From a physicochemical point of view, it was possible to characterize two significant correlations relating the sensitivity of the aforementioned trophic levels with the chemical structure of the retained substances. This analysis showed that algae displayed a higher sensitivity towards chemicals containing acid fragments, whereas fish presented a higher sensitivity towards chemicals containing aromatic ether fragments. Overall, our work suggests that statistical analysis of historical data combined with data yielded by the REACh regulation should permit the derivation of robust safety factors, testing strategies and mathematical models. These alternative methods, in turn, could allow a replacement and reduction of ecotoxicological testing.
Breast cancer statistics, 2011.
DeSantis, Carol; Siegel, Rebecca; Bandi, Priti; Jemal, Ahmedin
2011-01-01
In this article, the American Cancer Society provides an overview of female breast cancer statistics in the United States, including trends in incidence, mortality, survival, and screening. Approximately 230,480 new cases of invasive breast cancer and 39,520 breast cancer deaths are expected to occur among US women in 2011. Breast cancer incidence rates were stable among all racial/ethnic groups from 2004 to 2008. Breast cancer death rates have been declining since the early 1990s for all women except American Indians/Alaska Natives, among whom rates have remained stable. Disparities in breast cancer death rates are evident by state, socioeconomic status, and race/ethnicity. While significant declines in mortality rates were observed for 36 states and the District of Columbia over the past 10 years, rates for 14 states remained level. Analyses by county-level poverty rates showed that the decrease in mortality rates began later and was slower among women residing in poor areas. As a result, the highest breast cancer death rates shifted from the affluent areas to the poor areas in the early 1990s. Screening rates continue to be lower in poor women compared with non-poor women, despite much progress in increasing mammography utilization. In 2008, 51.4% of poor women had undergone a screening mammogram in the past 2 years compared with 72.8% of non-poor women. Encouraging patients aged 40 years and older to have annual mammography and a clinical breast examination is the single most important step that clinicians can take to reduce suffering and death from breast cancer. Clinicians should also ensure that patients at high risk of breast cancer are identified and offered appropriate screening and follow-up. Continued progress in the control of breast cancer will require sustained and increased efforts to provide high-quality screening, diagnosis, and treatment to all segments of the population.
Elements of statistical thermodynamics
Nash, Leonard K
2006-01-01
Encompassing essentially all aspects of statistical mechanics that appear in undergraduate texts, this concise, elementary treatment shows how an atomic-molecular perspective yields new insights into macroscopic thermodynamics. 1974 edition.
LBVs and Statistical Inference
Davidson, Kris; Weis, Kerstin
2016-01-01
Smith and Tombleson (2015) asserted that statistical tests disprove the standard view of LBVs, and proposed a far more complex scenario to replace it. But Humphreys et al. (2016) showed that Smith and Tombleson's Magellanic "LBV" sample was a mixture of physically different classes of stars, and genuine LBVs are in fact statistically consistent with the standard view. Smith (2016) recently objected at great length to this result. Here we note that he misrepresented some of the arguments, altered the test criteria, ignored some long-recognized observational facts, and employed inadequate statistical procedures. This case illustrates the dangers of uncareful statistical sampling, as well as the need to be wary of unstated assumptions.
Ehrlichiosis: Statistics and Epidemiology
Harrison, JM; Robbins, JM; 10.1098/rspa.2010.0254
2011-01-01
Quantum graphs are commonly used as models of complex quantum systems, for example molecules, networks of wires, and states of condensed matter. We consider quantum statistics for indistinguishable spinless particles on a graph, concentrating on the simplest case of abelian statistics for two particles. In spite of the fact that graphs are locally one-dimensional, anyon statistics emerge in a generalized form. A given graph may support a family of independent anyon phases associated with topologically inequivalent exchange processes. In addition, for sufficiently complex graphs, there appear new discrete-valued phases. Our analysis is simplified by considering combinatorial rather than metric graphs -- equivalently, a many-particle tight-binding model. The results demonstrate that graphs provide an arena in which to study new manifestations of quantum statistics. Possible applications include topological quantum computing, topological insulators, the fractional quantum Hall effect, superconductivity and molec...
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students' professional skills in statistics with applications in finance. Developed from the authors' courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical ... identify interest rate models, value bonds, estimate parameters, and much more. This textbook will help students understand and manage empirical research in financial engineering. It includes examples of how the statistical tools can be used to improve value-at-risk calculations and other issues...
CMS Statistics Reference Booklet
U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...
Grégoire, G.
2016-05-01
This chapter is devoted to two objectives. The first one is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the given courses. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from courses given in the Astrostatistics School. Thus we give briefly definitions and elementary properties on random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, due to the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors and Gaussian mixture models.
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensible for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
Playing at Statistical Mechanics
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
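The counting techniques behind these distributions can be illustrated with the standard state-occupancy counts; this is a textbook sketch, not code from the article.

```python
from math import comb

def fermi_dirac_count(g, n):
    """Ways to place n indistinguishable fermions in g states
    (at most one particle per state)."""
    return comb(g, n)

def bose_einstein_count(g, n):
    """Ways to place n indistinguishable bosons in g states
    (any occupancy allowed): stars-and-bars counting."""
    return comb(g + n - 1, n)

# e.g. 2 particles in 4 states: 6 fermionic vs. 10 bosonic arrangements
```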
Shepperson, L
1997-12-01
Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...
Testa, Massimo
2015-01-01
Based on the fundamental principles of Relativistic Quantum Mechanics, we give a rigorous, but completely elementary, proof of the relation between fundamental observables of a statistical system when measured relatively to two inertial reference frames, connected by a Lorentz transformation.
Statistics For Neuroscientists
Subbakrishna D.K
2000-01-01
Full Text Available The role statistical methods play in medicine in the interpretation of empirical data is well recognized by researchers. With modern computing facilities and software packages there is little need for familiarity with the computational details of statistical calculations. However, for the researcher to understand whether these calculations are valid and appropriate, it is necessary that the user is aware of the rudiments of the statistical methodology. Also, it needs to be emphasized that no amount of advanced analysis can be a substitute for a properly planned and executed study. An attempt is made in this communication to discuss some of the theoretical issues that are important for the valid analysis and interpretation of precious data that are gathered. The article summarises some of the basic statistical concepts, followed by illustrations from live data generated from various research projects from the department of Neurology of this Institute.
Information theory and statistics
Kullback, Solomon
1997-01-01
Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.
Richfield, Jon; bookfeller
2016-07-01
In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.
Gumbel, E J
2012-01-01
This classic text covers order statistics and their exceedances; exact distribution of extremes; the 1st asymptotic distribution; uses of the 1st, 2nd, and 3rd asymptotes; more. 1958 edition. Includes 44 tables and 97 graphs.
Medicaid Drug Claims Statistics
U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.
U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...
... and Statistics. Plague in the United States: Plague was first introduced ... per year in the United States: 1900-2012. Plague Worldwide: Plague epidemics have occurred in Africa, Asia, ...
Every year, the South African Minister of Police releases the crime statistics in ... prove an invaluable source of information for those who seek to better understand and respond to crime ... of Social Development in the JCPS may suggest a.
U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Downstream hydraulic geometry relationships: Gathering reference reach-scale width values from LiDAR
Sofia, G.; Tarolli, P.; Cazorzi, F.; Dalla Fontana, G.
2015-12-01
This paper examines the ability of LiDAR topography to provide reach-scale width values for the analysis of downstream hydraulic geometry relationships along some streams in the Dolomites (northern Italy). Multiple reach-scale dimensions can provide representative geometries and statistics characterising the longitudinal variability in the channel, improving the understanding of geomorphic processes across networks. Starting from the minimum curvature derived from a LiDAR DTM, the proposed algorithm uses a statistical approach for the identification of the scale of analysis, and for the automatic characterisation of reach-scale bankfull widths. The downstream adjustment in channel morphology is then related to flow parameters (drainage area and stream power). With the correct planning of a LiDAR survey, uncertainties in the procedure are principally due to the resolution of the DTM. The outputs are in general comparable in quality to field survey measurements, and the procedure allows the quick comparison among different watersheds. The proposed automatic approach could improve knowledge about river systems with highly variable widths, and about systems in areas covered by vegetation or inaccessible to field surveys. With proven effectiveness, this research could offer an interesting starting point for the analysis of differences between watersheds, and to improve knowledge about downstream channel adjustment in relation, for example, to scale and landscape forcing (e.g. sediment transport, tectonics, lithology, climate, geomorphology, and anthropic pressure).
2010-01-01
Abstract Background For years the Robert Koch Institute (RKI) has been annually pooling and reviewing the data from the German population-based cancer registries and evaluating them together with the cause-of-death statistics provided by the statistical offices. Traditionally, the RKI periodically estimates the number of new cancer cases in Germany on the basis of the available data from the regional cancer registries in which registration is complete; this figure, in turn, forms the basis fo...
Dominican Republic; Statistical Appendix
International Monetary Fund
2003-01-01
In this paper, statistical data for the Dominican Republic were presented for the real, public, financial, and external sectors. In the real sector, GDP by sector at constant prices, savings, investment, the consumer price index, petroleum statistics, and so on, were outlined. The public sector summarizes operations of the consolidated public sector, central government, and revenues. A summary of the banking system, claims, interest rates, financial indicators, and reserve requirements were described in t...
Introductory statistical inference
Mukhopadhyay, Nitis
2014-01-01
This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist