Statistically significant relational data mining :
Energy Technology Data Exchange (ETDEWEB)
Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.
2014-02-01
This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs: to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second are statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.
Statistical Significance for Hierarchical Clustering
Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.
2017-01-01
Summary Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990
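The Monte Carlo testing idea summarized above can be illustrated with a deliberately simple 1-D sketch (this is an illustration of the general approach, not the authors' procedure; the data, the two-cluster index, and the single-Gaussian null model are all stand-ins):

```python
import random
import statistics

def cluster_index(xs):
    # Two-cluster index for 1-D data: within-cluster sum of squares after a
    # median split, divided by the total sum of squares (smaller = more clustered).
    xs = sorted(xs)
    mid = len(xs) // 2
    def ss(group):
        mu = statistics.fmean(group)
        return sum((x - mu) ** 2 for x in group)
    return (ss(xs[:mid]) + ss(xs[mid:])) / ss(xs)

def monte_carlo_p(xs, n_sim=999, seed=1):
    # Null model: a single Gaussian fitted to the data.  The p-value is the
    # fraction of simulated datasets whose index is as small (as "clustered")
    # as the observed one; the +1 correction keeps the estimate above zero.
    rng = random.Random(seed)
    mu, sd = statistics.fmean(xs), statistics.pstdev(xs)
    obs = cluster_index(xs)
    hits = sum(
        cluster_index([rng.gauss(mu, sd) for _ in xs]) <= obs
        for _ in range(n_sim)
    )
    return (hits + 1) / (n_sim + 1)

rng = random.Random(0)
bimodal = [rng.gauss(-3, 1) for _ in range(40)] + [rng.gauss(3, 1) for _ in range(40)]
unimodal = [rng.gauss(0, 1) for _ in range(80)]
print(monte_carlo_p(bimodal))   # small: evidence of real cluster structure
print(monte_carlo_p(unimodal))  # typically much larger: consistent with one Gaussian
```

The paper's actual procedure additionally applies this kind of test sequentially down the dendrogram, with a correction to control the family-wise error rate across the nested tests.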
Statistical significance versus clinical relevance.
van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G
2017-04-01
In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
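The abstract's point that a CI conveys both the magnitude of an effect and the imprecision of its estimate, while a P-value alone does not, can be seen in a small sketch (normal-approximation formulas; a t-distribution would be more accurate for samples this small, and the data are hypothetical):

```python
from math import sqrt
from statistics import NormalDist, fmean, stdev

def mean_ci_and_p(xs, null_mean=0.0, level=0.95):
    # Normal-approximation confidence interval and two-sided p-value for a
    # sample mean; the CI shows magnitude and imprecision, the p-value does not.
    n = len(xs)
    m = fmean(xs)
    se = stdev(xs) / sqrt(n)
    z = NormalDist().inv_cdf(0.5 + level / 2)
    p = 2 * (1 - NormalDist().cdf(abs(m - null_mean) / se))
    return (m - z * se, m + z * se), p

effect = [1.2, 0.8, 1.5, 0.9, 1.1, 1.4, 0.7, 1.3]  # hypothetical effect measurements
(lo, hi), p = mean_ci_and_p(effect)
print(f"95% CI ({lo:.2f}, {hi:.2f}), p = {p:.2g}")
```

Two studies can share the same tiny p-value yet have CIs implying very different clinical relevance, which is exactly the distinction the authors draw.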
Statistical significance of cis-regulatory modules
Directory of Open Access Journals (Sweden)
Smith Andrew D
2007-01-01
Full Text Available Abstract Background It is becoming increasingly important for researchers to be able to scan through large genomic regions for transcription factor binding sites or clusters of binding sites forming cis-regulatory modules. Correspondingly, there has been a push to develop algorithms for the rapid detection and assessment of cis-regulatory modules. While various algorithms for this purpose have been introduced, most are not well suited for rapid, genome scale scanning. Results We introduce methods designed for the detection and statistical evaluation of cis-regulatory modules, modeled as either clusters of individual binding sites or as combinations of sites with constrained organization. In order to determine the statistical significance of module sites, we first need a method to determine the statistical significance of single transcription factor binding site matches. We introduce a straightforward method of estimating the statistical significance of single site matches using a database of known promoters to produce data structures that can be used to estimate p-values for binding site matches. We next introduce a technique to calculate the statistical significance of the arrangement of binding sites within a module using a max-gap model. If the module scanned for has defined organizational parameters, the probability of the module is corrected to account for organizational constraints. The statistical significance of single site matches and the architecture of sites within the module can be combined to provide an overall estimation of statistical significance of cis-regulatory module sites. Conclusion The methods introduced in this paper allow for the detection and statistical evaluation of single transcription factor binding sites and cis-regulatory modules. The features described are implemented in the Search Tool for Occurrences of Regulatory Motifs (STORM) and MODSTORM software.
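The single-site significance estimation described here, an empirical p-value computed against a database of background match scores, can be sketched as follows (the background scores below are synthetic stand-ins for scores drawn from a promoter database, and the +1 correction keeps the estimate nonzero):

```python
from bisect import bisect_left

def empirical_p(background_scores, score):
    # Empirical p-value for a motif-match score: the fraction of background
    # scores at least as large as the observed score.  Sorting once lets each
    # lookup run in O(log n) via binary search.
    xs = sorted(background_scores)
    n_ge = len(xs) - bisect_left(xs, score)
    return (n_ge + 1) / (len(xs) + 1)

background = [i / 1000 for i in range(1000)]  # stand-in for database match scores
print(empirical_p(background, 0.99))  # strong match: small p
print(empirical_p(background, 0.50))  # mediocre match: large p
```

In the paper these per-site p-values are then combined with the max-gap module model to score whole cis-regulatory modules.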
The thresholds for statistical and clinical significance
DEFF Research Database (Denmark)
Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per
2014-01-01
BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore ... of the probability that a given trial result is compatible with a 'null' effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance ...
The insignificance of statistical significance testing
Johnson, Douglas H.
1999-01-01
Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.
Swiss solar power statistics 2007 - Significant expansion
International Nuclear Information System (INIS)
Hostettler, T.
2008-01-01
This article presents and discusses the 2007 statistics for solar power in Switzerland. A significant number of new installations is noted, as are the high production figures from newer installations. The basics behind the compilation of the Swiss solar power statistics are briefly reviewed and an overview for the period 1989 to 2007 is presented which includes figures on the number of photovoltaic plants in service and installed peak power. Typical production figures in kilowatt-hours (kWh) per installed kilowatt-peak power (kWp) are presented and discussed for installations of various sizes. Increased production after inverter replacement in older installations is noted. Finally, the general political situation in Switzerland as far as solar power is concerned is briefly discussed, as are international developments.
Significant Statistics: Viewed with a Contextual Lens
Tait-McCutcheon, Sandi
2010-01-01
This paper examines the pedagogical and organisational changes three lead teachers made to their statistics teaching and learning programs. The lead teachers posed the research question: What would the effect of contextually integrating statistical investigations and literacies into other curriculum areas be on student achievement? By finding the…
Investigation of the statistical distance to reach stationary distributions
International Nuclear Information System (INIS)
Nicholson, S.B.; Kim, Eun-jin
2015-01-01
The thermodynamic length gives a Riemannian metric to a system's phase space. Here we extend the traditional thermodynamic length to the information length (L) out of equilibrium and examine its properties. We utilise L as a useful methodology for analysing non-equilibrium systems without invoking conventional assumptions such as Gaussian statistics, detailed balance, a priori known constraints, or ergodicity, and numerically examine how L evolves in time for the logistic map in the chaotic regime depending on initial conditions. To this end, we propose a discrete version of L which is mathematically well defined by taking a set theoretic approach. We identify the areas of phase space where the loss of information of the system takes place most rapidly. In particular, we present an interesting result that the unstable fixed points turn out to most efficiently drive the logistic map towards a stationary distribution through L. - Highlights: • Define a set theoretic version of the discrete thermodynamic length. • These sets allow one to analyse systems having zero probabilities in their evolution. • Numerically analyse the logistic map using the thermodynamic length. • Show how the unstable fixed points most efficiently lead the system to equilibrium
Testing the Difference of Correlated Agreement Coefficients for Statistical Significance
Gwet, Kilem L.
2016-01-01
This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…
Significance levels for studies with correlated test statistics.
Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S
2008-07-01
When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
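The permutation approach the authors build on can be sketched as follows (a generic max-statistic permutation test, not the conditional procedure proposed in the paper; the data are simulated, with a shared factor inducing the kind of correlation among features the abstract discusses):

```python
import random
from statistics import fmean

def max_abs_diff(data, labels):
    # Largest absolute difference in group means across all features.
    a = [row for row, g in zip(data, labels) if g == 0]
    b = [row for row, g in zip(data, labels) if g == 1]
    return max(
        abs(fmean(col_a) - fmean(col_b))
        for col_a, col_b in zip(zip(*a), zip(*b))
    )

def global_p(data, labels, n_perm=999, seed=2):
    # Permuting sample labels preserves the correlation among features, so the
    # null distribution of the max statistic reflects that correlation.
    rng = random.Random(seed)
    obs = max_abs_diff(data, labels)
    labs = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(labs)
        if max_abs_diff(data, labs) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Hypothetical data: 5 correlated features, a real group shift on feature 0 only.
rng = random.Random(0)
def subject(shift):
    u = rng.gauss(0, 1)                       # shared factor -> correlated features
    row = [u + rng.gauss(0, 0.5) for _ in range(5)]
    row[0] += shift
    return row

data = [subject(0.0) for _ in range(20)] + [subject(2.0) for _ in range(20)]
labels = [0] * 20 + [1] * 20
print(global_p(data, labels))
```

Efron's observation, and the authors' remedy, concern cases where this plain permutation p-value is misleading; their method further conditions on a measure of the spread of the observed histogram.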
Caveats for using statistical significance tests in research assessments
DEFF Research Database (Denmark)
Schneider, Jesper Wiborg
2013-01-01
This article raises concerns about the advantages of using statistical significance tests in research assessments as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice ... We argue that applying statistical significance tests and mechanically adhering to their results are highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators ...
Yang, Jing; Zammit, Christian; Dudley, Bruce
2017-04-01
The phenomenon of losing and gaining in rivers normally takes place in lowland where often there are various, sometimes conflicting uses for water resources, e.g., agriculture, industry, recreation, and maintenance of ecosystem function. To better support water allocation decisions, it is crucial to understand the location and seasonal dynamics of these losses and gains. We present a statistical methodology to predict losing and gaining river reaches in New Zealand based on 1) information surveys with surface water and groundwater experts from regional government, 2) a collection of river/watershed characteristics, including climate, soil and hydrogeologic information, and 3) the random forests technique. The surveys on losing and gaining reaches were conducted face-to-face at 16 New Zealand regional government authorities, and climate, soil, river geometry, and hydrogeologic data from various sources were collected and compiled to represent river/watershed characteristics. The random forests technique was used to build up the statistical relationship between river reach status (gain and loss) and river/watershed characteristics, and then to predict for river reaches at Strahler order one without prior losing and gaining information. Results show that the model has a classification error of around 10% for "gain" and "loss". The results will assist further research, and water allocation decisions in lowland New Zealand.
On detection and assessment of statistical significance of Genomic Islands
Directory of Open Access Journals (Sweden)
Chaudhuri Probal
2008-04-01
Full Text Available Abstract Background Many of the available methods for detecting Genomic Islands (GIs in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to Salmonella typhi CT18 genome leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements confirming their horizontal acquirement. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
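The core Monte Carlo device, comparing a candidate segment against randomly selected segments of the same chromosome, can be sketched as follows (GC content only; the real Design-Island method also uses codon and amino acid usage and a refinement phase, and the sequences below are synthetic):

```python
import random

def gc_content(seq):
    # Fraction of G/C bases in a sequence.
    return sum(base in "GC" for base in seq) / len(seq)

def segment_p(genome, start, length, n_samples=999, seed=3):
    # Monte Carlo null: GC content of segments of the same length placed at
    # random positions on the chromosome.  The p-value is the fraction with
    # GC content at least as high as the candidate segment's.
    rng = random.Random(seed)
    obs = gc_content(genome[start:start + length])
    hits = 0
    for _ in range(n_samples):
        s = rng.randrange(len(genome) - length)
        if gc_content(genome[s:s + length]) >= obs:
            hits += 1
    return (hits + 1) / (n_samples + 1)

# Synthetic chromosome: AT-rich host with a GC-rich "island" inserted at 4000.
rng = random.Random(0)
background = "".join(rng.choice("ATATGC") for _ in range(10000))  # ~33% GC
island = "".join(rng.choice("GCGCAT") for _ in range(500))        # ~67% GC
genome = background[:4000] + island + background[4000:]
print(segment_p(genome, 4000, 500))   # candidate = the inserted island
print(segment_p(genome, 1000, 500))   # candidate = ordinary host segment
```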
Your Chi-Square Test Is Statistically Significant: Now What?
Sharpe, Donald
2015-01-01
Applied researchers have employed chi-square tests for more than one hundred years. This paper addresses the question of how one should follow a statistically significant chi-square test result in order to determine the source of that result. Four approaches were evaluated: calculating residuals, comparing cells, ransacking, and partitioning. Data…
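Of the four follow-up approaches listed, calculating residuals is the most widely used; a minimal sketch of adjusted standardized residuals for a two-way table follows (the counts are hypothetical; cells with residuals beyond roughly ±2 are the ones driving a significant chi-square):

```python
from math import sqrt

def adjusted_residuals(table):
    # Adjusted standardized residuals for a two-way contingency table:
    # (observed - expected) / sqrt(expected * (1 - row_frac) * (1 - col_frac)).
    n = sum(map(sum, table))
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    res = []
    for i, row in enumerate(table):
        out = []
        for j, obs in enumerate(row):
            exp = row_totals[i] * col_totals[j] / n
            se = sqrt(exp * (1 - row_totals[i] / n) * (1 - col_totals[j] / n))
            out.append((obs - exp) / se)
        res.append(out)
    return res

table = [[30, 70],
         [15, 85]]   # hypothetical outcome-by-group counts
for row in adjusted_residuals(table):
    print([f"{r:+.2f}" for r in row])
```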
Statistical Significance and Effect Size: Two Sides of a Coin.
Fan, Xitao
This paper suggests that statistical significance testing and effect size are two sides of the same coin; they complement each other, but do not substitute for one another. Good research practice requires that both should be taken into consideration to make sound quantitative decisions. A Monte Carlo simulation experiment was conducted, and a…
Reporting effect sizes as a supplement to statistical significance ...
African Journals Online (AJOL)
The purpose of the article is to review the statistical significance reporting practices in reading instruction studies and to provide guidelines for when to calculate and report effect sizes in educational research. A review of six readily accessible (online) and accredited journals publishing research on reading instruction ...
Test for the statistical significance of differences between ROC curves
International Nuclear Information System (INIS)
Metz, C.E.; Kronman, H.B.
1979-01-01
A test for the statistical significance of observed differences between two measured Receiver Operating Characteristic (ROC) curves has been designed and evaluated. The set of observer response data for each ROC curve is assumed to be independent and to arise from a ROC curve having a form which, in the absence of statistical fluctuations in the response data, graphs as a straight line on double normal-deviate axes. To test the significance of an apparent difference between two measured ROC curves, maximum likelihood estimates of the two parameters of each curve and the associated parameter variances and covariance are calculated from the corresponding set of observer response data. An approximate Chi-square statistic with two degrees of freedom is then constructed from the differences between the parameters estimated for each ROC curve and from the variances and covariances of these estimates. This statistic is known to be truly Chi-square distributed only in the limit of large numbers of trials in the observer performance experiments. Performance of the statistic for data arising from a limited number of experimental trials was evaluated. Independent sets of rating scale data arising from the same underlying ROC curve were paired, and the fraction of differences found (falsely) significant was compared to the significance level, α, used with the test. Although test performance was found to be somewhat dependent on both the number of trials in the data and the position of the underlying ROC curve in the ROC space, the results for various significance levels showed the test to be reliable under practical experimental conditions
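The test statistic described above, an approximate chi-square with two degrees of freedom built from the differences of the fitted binormal parameters and their variances and covariances, can be sketched as follows (the parameter estimates and covariance matrices below are hypothetical stand-ins for maximum likelihood fits):

```python
from math import exp

def roc_difference_test(params1, cov1, params2, cov2):
    # Chi-square test (2 df) on the difference of two fitted (a, b) parameter
    # pairs: X2 = d' * inv(cov1 + cov2) * d.  For 2 degrees of freedom the
    # chi-square survival function is simply exp(-X2 / 2).
    da = params1[0] - params2[0]
    db = params1[1] - params2[1]
    s11 = cov1[0][0] + cov2[0][0]
    s22 = cov1[1][1] + cov2[1][1]
    s12 = cov1[0][1] + cov2[0][1]
    det = s11 * s22 - s12 * s12
    x2 = (da * da * s22 - 2.0 * da * db * s12 + db * db * s11) / det
    return x2, exp(-x2 / 2)

# Hypothetical maximum-likelihood estimates and (co)variances for two curves.
x2, p = roc_difference_test((1.5, 0.9), [[0.01, 0.001], [0.001, 0.01]],
                            (1.0, 1.0), [[0.01, 0.001], [0.001, 0.01]])
print(f"X2 = {x2:.2f}, p = {p:.4f}")
```

As the abstract cautions, this statistic is only asymptotically chi-square distributed, so with few trials the nominal significance level may not be exact.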
Increasing the statistical significance of entanglement detection in experiments.
Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei
2010-05-28
Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.
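The paper's central point, that what matters is the violation relative to its statistical error rather than the size of the violation alone, can be sketched with a simple Gaussian error model (the sample values are simulated, not experimental data, and the "classical bound" of 2 is just the usual Bell-type bound used for illustration):

```python
import random
from math import sqrt
from statistics import fmean, stdev

def significance(samples, classical_bound):
    # Violation of the classical bound measured in standard errors:
    # (mean - bound) / (sd / sqrt(N)).  Larger is a more significant test.
    se = stdev(samples) / sqrt(len(samples))
    return (fmean(samples) - classical_bound) / se

# Hypothetical per-run estimates of two inequality operators (bound = 2):
# one with a larger mean violation but larger fluctuations, one with a
# smaller violation but much smaller fluctuations.
rng = random.Random(5)
noisy_big_violation = [rng.gauss(2.8, 1.5) for _ in range(400)]
quiet_small_violation = [rng.gauss(2.5, 0.4) for _ in range(400)]
print(significance(noisy_big_violation, 2.0))
print(significance(quiet_small_violation, 2.0))
```

Under this toy error model, the inequality with the smaller violation is nevertheless the statistically stronger entanglement test, mirroring the Mermin-versus-Ardehali comparison in the experiment.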
Directory of Open Access Journals (Sweden)
Priya Ranganathan
2015-01-01
Full Text Available In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper.
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2015-01-01
In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper. PMID:25878958
Statistical significance of epidemiological data. Seminar: Evaluation of epidemiological studies
International Nuclear Information System (INIS)
Weber, K.H.
1993-01-01
In stochastic damages, the numbers of events, e.g. the persons who are affected by or have died of cancer, and thus the relative frequencies (incidence or mortality) are binomially distributed random variables. Their statistical fluctuations can be characterized by confidence intervals. For epidemiologic questions, especially for the analysis of stochastic damages in the low dose range, the following issues are interesting: - Is a sample (a group of persons) with a definite observed damage frequency part of the whole population? - Is an observed frequency difference between two groups of persons random or statistically significant? - Is an observed increase or decrease of the frequencies with increasing dose random or statistically significant and how large is the regression coefficient (= risk coefficient) in this case? These problems can be solved by statistical tests. So-called distribution-free tests and tests which are not bound to the supposition of normal distribution are of particular interest, such as: - χ²-independence test (test in contingency tables); - Fisher-Yates test; - trend test according to Cochran; - rank correlation test given by Spearman. These tests are explained in terms of selected epidemiologic data, e.g. of leukaemia clusters, of the cancer mortality of the Japanese A-bomb survivors especially in the low dose range as well as on the sample of the cancer mortality in the high background area in Yangjiang (China). (orig.) [de
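The χ²-independence test mentioned first can be sketched for a 2×2 contingency table (the exposure/outcome counts are hypothetical; for one degree of freedom the chi-square p-value reduces to a complementary error function):

```python
from math import erfc, sqrt

def chi2_2x2(a, b, c, d):
    # Pearson chi-square independence test for a 2x2 table [[a, b], [c, d]]:
    # X2 = n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).  With 1 degree of
    # freedom the p-value equals erfc(sqrt(X2 / 2)).
    n = a + b + c + d
    x2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return x2, erfc(sqrt(x2 / 2))

# Hypothetical cohort: 30/100 cases among exposed vs 15/100 among unexposed.
x2, p = chi2_2x2(30, 70, 15, 85)
print(f"X2 = {x2:.2f}, p = {p:.3f}")
```

For small expected counts the Fisher-Yates exact test listed in the abstract is the appropriate replacement for this large-sample approximation.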
Systematic reviews of anesthesiologic interventions reported as statistically significant
DEFF Research Database (Denmark)
Imberger, Georgina; Gluud, Christian; Boylan, John
2015-01-01
... statistically significant meta-analyses of anesthesiologic interventions, we used TSA to estimate power and imprecision in the context of sparse data and repeated updates. METHODS: We conducted a search to identify all systematic reviews with meta-analyses that investigated an intervention that may ... From 11,870 titles, we found 682 systematic reviews that investigated anesthesiologic interventions. In the 50 sampled meta-analyses, the median number of trials included was 8 (interquartile range [IQR], 5-14), the median number of participants was 964 (IQR, 523-1736), and the median number ...
Increasing the statistical significance of entanglement detection in experiments
Energy Technology Data Exchange (ETDEWEB)
Jungnitsch, Bastian; Niekamp, Soenke; Kleinmann, Matthias; Guehne, Otfried [Institut fuer Quantenoptik und Quanteninformation, Innsbruck (Austria); Lu, He; Gao, Wei-Bo; Chen, Zeng-Bing [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Chen, Yu-Ao; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Physikalisches Institut, Universitaet Heidelberg (Germany)
2010-07-01
Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. We show this to be the case for an error model in which the variance of an observable is interpreted as its error and for the standard error model in photonic experiments. Specifically, we demonstrate that the Mermin inequality yields a Bell test which is statistically more significant than the Ardehali inequality in the case of a photonic four-qubit state that is close to a GHZ state. Experimentally, we observe this phenomenon in a four-photon experiment, testing the above inequalities for different levels of noise.
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.
Kramer, Karen L; Veile, Amanda; Otárola-Castillo, Erik
2016-01-01
Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.
Directory of Open Access Journals (Sweden)
Karen L Kramer
Full Text Available Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however, because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.
Ecological significance of riverine gravel bars in regulated river reaches below dams
Ock, G.; Takemon, Y.; Sumi, T.; Kondolf, G. M.
2012-12-01
Gravel bars have been recognized as ecologically significant in that they provide habitat with topographical, hydrological and thermo-chemical diversity, while enhancing material exchanges as interfaces, laterally between aquatic and terrestrial habitats and vertically between surface and subsurface waters. During the past several decades, regulated rivers below dams have lost many of these geomorphological features due to sediment starvation by upstream dams, with a subsequent degradation of their ecological functions. Despite a growing concern for gravel bar management recognizing its importance in recovering riverine ecosystem services, the ecological roles of gravel bars have not been assessed enough from the empirical perspectives of habitat diversity and organic matter interactions. In this study, we investigate the 'natural filtering effects' for reducing lentic plankton and contaminants associated with self-purification, and the 'physicochemical habitat complexity' of gravel bars, focusing on reach-scaled gravel bars in rivers located in three different countries. First is the Uji River in central Japan, where there has been a loss of gravel bars in the downstream reaches since an upstream dam was constructed in 1965; second is the Tagliamento River in northeast Italy, which shows morphologically intact braided bar channels by natural flooding events and sediment supply; third is the Trinity River in the United States (located in northern California), the site of ongoing restoration efforts for creating new gravel bars through gravel augmentation and channel rehabilitation activities. We traced the downstream changes in particulate organic matter (POM) trophic sources (composed of allochthonous terrestrial inputs, autochthonous instream production and lentic plankton from dam outflows) in order to evaluate the roles of the geomorphological features in tailwater ecosystem food-resources shifting. We calculated suspended POM
A tutorial on hunting statistical significance by chasing N
Directory of Open Access Journals (Sweden)
Denes Szucs
2016-09-01
There is increasing concern about the replicability of studies in psychology and cognitive neuroscience. Hidden data dredging (also called p-hacking) is a major contributor to this crisis because it substantially increases the Type I error rate, resulting in a much larger proportion of false positive findings than the usually expected 5%. In order to build better intuition for avoiding, detecting and criticising some typical problems, here I systematically illustrate the large impact of some easy-to-implement, and therefore perhaps frequent, data dredging techniques on boosting false positive findings. I illustrate several forms of two special cases of data dredging. First, researchers may violate the data collection stopping rules of null hypothesis significance testing by repeatedly checking for statistical significance with various numbers of participants. Second, researchers may group participants post hoc along potential but unplanned independent grouping variables. The first approach 'hacks' the number of participants in studies; the second 'hacks' the number of variables in the analysis. I demonstrate the high proportion of false positive findings generated by these techniques with data drawn from true null distributions. I also illustrate that it is extremely easy to introduce strong bias into data by very mild selection and re-testing. Similar, usually undocumented, data dredging steps can easily lead to 20-50% or more false positives.
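The stopping-rule violation described above is easy to reproduce by simulation. The sketch below (our own illustration, not the author's code; all names and parameters are hypothetical) draws true-null data and re-runs the t-test after every batch of participants, stopping at the first "significant" peek:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def false_positive_rate(n_max, check_every, n_sims=2000, alpha=0.05):
    """Fraction of true-null experiments declared 'significant' when the
    test is re-run each time `check_every` participants are added."""
    hits = 0
    for _ in range(n_sims):
        data = rng.standard_normal(n_max)  # true null: population mean is 0
        for n in range(check_every, n_max + 1, check_every):
            if stats.ttest_1samp(data[:n], 0.0).pvalue < alpha:
                hits += 1          # stop at the first 'significant' peek
                break
    return hits / n_sims

fixed_n = false_positive_rate(100, 100)  # one pre-planned look
peeking = false_positive_rate(100, 10)   # a look after every 10 participants
print(f"single look: {fixed_n:.3f}, peeking: {peeking:.3f}")
```

With one pre-planned look the false-positive rate stays near the nominal 5%; checking after every batch of 10 participants inflates it several-fold.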
Conducting tests for statistically significant differences using forest inventory data
James A. Westfall; Scott A. Pugh; John W. Coulston
2013-01-01
Many forest inventory and monitoring programs are based on a sample of ground plots from which estimates of forest resources are derived. In addition to evaluating metrics such as number of trees or amount of cubic wood volume, it is often desirable to make comparisons between resource attributes. To properly conduct statistical tests for differences, it is imperative...
Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks
2016-04-26
Systems, Statistics & Management Science, University of Alabama, USA. Distribution A: approved for public release. [Report front matter: table of contents and list of figures, including a summary, an application to real networks (the 2012 FBS college football schedule network), and a stem plot of degree-ordered vertices versus degree.]
After statistics reform : Should we still teach significance testing?
A. Hak (Tony)
2014-01-01
In the longer term, null hypothesis significance testing (NHST) will disappear because p-values are not informative and not replicable. Should we continue to teach in the future the procedures of these then-abolished routines (i.e., NHST)? Three arguments are discussed for not teaching NHST in
Wilkinson, Michael
2014-03-01
Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson (N-P) significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by conducting an a priori sample size calculation. The N-P approach does not provide the probability of hypotheses or indicate the strength of support for hypotheses in light of data, yet many scientists believe it does. Outcomes of analyses allow conclusions only about the existence of non-zero effects, and provide no information about the likely size of true effects or their practical/clinical value. Bayesian inference can show how much support data provide for different hypotheses, and how personal convictions should be altered in light of data, but the approach is complicated by formulating probability distributions about prior subjective estimates of population effects. A pragmatic solution is magnitude-based inference, which allows scientists to estimate the true magnitude of population effects and how likely they are to exceed an effect magnitude of practical/clinical importance, thereby integrating elements of subjective Bayesian-style thinking. While this approach is gaining acceptance, progress might be hastened if scientists appreciate the shortcomings of traditional N-P null hypothesis significance testing.
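The magnitude-based element of this argument can be sketched numerically. Under the simplifying assumptions of a normal sampling distribution and a flat prior (the usual shortcut behind magnitude-based inference; the numbers and function names below are ours, not the author's), the chances that a true effect is beneficial, trivial, or harmful follow directly from the effect estimate, its standard error, and a smallest worthwhile change:

```python
from scipy import stats

def chances(effect, se, swc):
    """MBI-style chances that the true effect is beneficial (> +swc),
    trivial, or harmful (< -swc), assuming a normal sampling
    distribution and a flat prior."""
    p_benefit = stats.norm.sf(swc, loc=effect, scale=se)
    p_harm = stats.norm.cdf(-swc, loc=effect, scale=se)
    return p_benefit, 1 - p_benefit - p_harm, p_harm

# hypothetical: observed improvement 1.2 s, SE 0.8 s, smallest worthwhile 0.5 s
b, t, h = chances(1.2, 0.8, 0.5)
print(f"beneficial {b:.2f}, trivial {t:.2f}, harmful {h:.2f}")
```

Note that this conveys the likely magnitude of the effect, which a bare reject/retain decision does not.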
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-10-01
To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with those of similar studies in dentistry using random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to other study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this difference did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.
Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.
Breunig, Nancy A.
Despite increasing criticism of statistical significance testing by researchers, reflected in the 1994 edition of the American Psychological Association's publication manual, statistical significance test results remain popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
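A minimal "what if" analysis of the kind the paper describes can be written in a few lines (a sketch in Python rather than Excel or "R"; the effect size and sample sizes are arbitrary): hold the observed standardized effect fixed and recompute the p-value at hypothetical sample sizes:

```python
from scipy import stats

def what_if_p(d, n):
    """Two-tailed one-sample t-test p-value that the same observed
    standardized effect d would yield at sample size n."""
    t = d * n ** 0.5
    return 2 * stats.t.sf(abs(t), df=n - 1)

# one fixed observed effect (Cohen's d = 0.30) at hypothetical sample sizes
for n in (10, 30, 50, 100):
    print(n, round(what_if_p(0.30, n), 4))
```

The identical observed effect moves from "non-significant" to "significant" purely as n grows, which is exactly what such analyses are meant to make students see.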
Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza
2014-01-01
This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…
Brouwer, D.; Meijer, R.R.; Zevalkink, D.J.
2013-01-01
Several researchers have emphasized that item response theory (IRT)-based methods should be preferred over classical approaches in measuring change for individual patients. In the present study we discuss and evaluate the use of IRT-based statistics to measure statistically significant individual
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
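The proposed procedure can be sketched as follows (our own simplified illustration on synthetic multinomial "cloud objects", not the author's code; we use permutation-style resampling of group membership as the significance reference, one simple variant of the resampling idea, and only the Euclidean distance):

```python
import numpy as np

rng = np.random.default_rng(0)

def euclidean(h1, h2):
    # normalise to relative frequencies so unequal totals do not dominate
    f1, f2 = h1 / h1.sum(), h2 / h2.sum()
    return np.sqrt(((f1 - f2) ** 2).sum())

def resampling_pvalue(group_a, group_b, n_resamples=2000):
    """group_a, group_b: arrays of individual histograms (one row per member).
    The summary histograms are the column sums; the observed distance is
    judged against distances under random reassignment of members."""
    observed = euclidean(group_a.sum(0), group_b.sum(0))
    pooled = np.vstack([group_a, group_b])
    n_a = len(group_a)
    exceed = 0
    for _ in range(n_resamples):
        idx = rng.permutation(len(pooled))
        d = euclidean(pooled[idx[:n_a]].sum(0), pooled[idx[n_a:]].sum(0))
        exceed += d >= observed
    return exceed / n_resamples

# synthetic 'cloud objects': group B drawn from a shifted distribution
a = rng.multinomial(100, [0.2, 0.3, 0.3, 0.2], size=40)
b = rng.multinomial(100, [0.1, 0.2, 0.3, 0.4], size=40)
pval = resampling_pvalue(a, b)
print("p =", pval)
```

Swapping in the Jeffries-Matusita or Kuiper distance only requires replacing the distance function.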
Health significance and statistical uncertainty. The value of P-value.
Consonni, Dario; Bertazzi, Pier Alberto
2017-10-27
The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P<0.05" ("statistically significant") and "P>0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. To show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors, we provide examples of distorted use of the P-value and of the negative consequences for science and public health of such a black-and-white vision. The rigid interpretation of the P-value as a dichotomy favors confusion between health relevance and statistical significance, discourages thoughtful thinking, and diverts attention from what really matters: the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios or risk differences) and their confidence intervals (CIs), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not consider the whole interval of the CI but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistically significant or not). When reporting statistical results of scientific research, present effect estimates with their confidence intervals, and do not qualify the P-value as "significant" or "not significant".
DEFF Research Database (Denmark)
Engsted, Tom
I comment on the controversy between McCloskey & Ziliak and Hoover & Siegler on statistical versus economic significance, in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey & Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays a relatively minor role in model…
Zhao, Guixiang
2017-04-01
Based on hourly TBB data and cloud images from FY-2E, meteorological observations, and NCEP reanalysis data with 1°×1° spatial resolution from May to October during 2005-2014, the climatic characteristics of mesoscale convective systems (MCS) over the middle reaches of the Yellow River were analyzed, including mesoscale convective complexes (MCC), persistent elongated convective systems (PECS), meso-β-scale MCC (MβCCS) and meso-β-scale PECS (MβECS). The results are as follows: (1) MCS tended to occur over central and southern Gansu, central and southern Shanxi, central and northern Shaanxi, and the border region of Shanxi, Shaanxi and Inner Mongolia. MCS over the middle reaches of the Yellow River formed from May to October and tended to reach maturity in summer. MCC and MβECS were the main MCS causing precipitation in summer. (2) The diurnal variation of MCS was obvious: they usually formed and matured in the afternoon and from the evening to the early morning of the next day. Most MCS formed quickly and dissipated slowly, and mainly moved eastward and southeastward, though round-shaped MCS moved less than elongated ones. (3) The average TBB of round-shaped MCS was lower than that of elongated MCS. The development of MCC was the most vigorous and strong, peaking in August, while the development of MβECS was not obviously influenced by seasonal change. The average eccentricity of mature MCC and PECS over the middle reaches of the Yellow River was greater than that in the USA; the former was also greater than in the lower reaches of the Yellow River, while the latter was smaller. (4) The characteristics of rainfall caused by MCS over the middle reaches of the Yellow River were complex, with obvious regional differences. Precipitation was wider, stronger and longer-lasting when multiple MCS merged. Rainfall in the center of the cloud area was obviously greater than in other regions of the cloud area. The
Zhang, Zhang
2012-03-22
Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance for making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure, the Codon Deviation Coefficient (CDC), that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions. © 2012 Zhang et al; licensee BioMed Central Ltd.
Directory of Open Access Journals (Sweden)
Melissa Coulson
2010-07-01
A statistically significant result and a non-significant result may differ little, although significance status may tempt an interpretation of difference. Two studies are reported that compared interpretation of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioural neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs; but, for full benefit, such highly desirable statistical reform also requires that researchers interpret CIs without recourse to NHST.
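The paper's central point, that a "significant" and a "non-significant" result can be mutually consistent, is easy to reproduce (fictitious numbers of our own choosing, not the studies used in the survey):

```python
from scipy import stats

def summary(mean_diff, se, df):
    """p-value and 95% CI for a mean difference, given its standard error."""
    t = mean_diff / se
    p = 2 * stats.t.sf(abs(t), df)
    half = stats.t.ppf(0.975, df) * se
    return p, (mean_diff - half, mean_diff + half)

# two fictitious studies with nearly identical effects
p1, ci1 = summary(3.6, 1.5, 40)   # 'significant'
p2, ci2 = summary(3.0, 1.7, 40)   # 'non-significant'
print(f"study 1: p={p1:.3f}, CI=({ci1[0]:.2f}, {ci1[1]:.2f})")
print(f"study 2: p={p2:.3f}, CI=({ci2[0]:.2f}, {ci2[1]:.2f})")
```

The p-values fall on opposite sides of 0.05, yet the two intervals overlap heavily: read as CIs, the studies agree.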
Farrell, Mary Beth
2018-06-01
This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic is, the higher is the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability (P) value. Calculation of the P value is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around a mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being
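The Cohen κ-statistic mentioned above is straightforward to compute from an agreement table (a minimal sketch with hypothetical reader data, not a library implementation):

```python
def cohens_kappa(table):
    """Cohen's kappa. table[i][j] = number of cases reader 1 rated
    category i and reader 2 rated category j."""
    k = len(table)
    total = sum(sum(row) for row in table)
    observed = sum(table[i][i] for i in range(k)) / total      # raw agreement
    row_m = [sum(row) / total for row in table]                # reader 1 marginals
    col_m = [sum(table[i][j] for i in range(k)) / total for j in range(k)]
    expected = sum(r * c for r, c in zip(row_m, col_m))        # chance agreement
    return (observed - expected) / (1 - expected)

# hypothetical 2x2 table: positive/negative scan reads by two readers
table = [[40, 5],
         [10, 45]]
print(round(cohens_kappa(table), 3))
```

Here raw agreement is 85%, but κ = 0.70 once chance agreement is removed, which is why κ is the more robust figure to report.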
Statistical significance of trends in monthly heavy precipitation over the US
Mahajan, Salil
2011-05-11
Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upward trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation, as some GCM simulations show a downward trend even in the twenty-first century projections when the greenhouse gas forcings are strong. © 2011 Springer-Verlag.
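A minimal Monte Carlo trend test in the spirit of the above (our own non-parametric sketch on synthetic data, using a simple least-squares slope rather than the paper's statistics) looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)

def trend_pvalue(series, n_mc=5000):
    """Monte Carlo significance of a trend: compare the observed
    least-squares slope with slopes of randomly shuffled copies."""
    t = np.arange(len(series))
    observed = np.polyfit(t, series, 1)[0]
    shuffled = np.array([np.polyfit(t, rng.permutation(series), 1)[0]
                         for _ in range(n_mc)])
    return float((np.abs(shuffled) >= abs(observed)).mean())  # two-sided

# synthetic 50-year series of monthly heavy-precipitation maxima (upward trend)
years = np.arange(50)
series = 10 + 0.08 * years + rng.normal(0, 1, size=50)
pval = trend_pvalue(series)
print("p =", pval)
```

Shuffling destroys any temporal ordering, so the shuffled slopes form the null distribution against which the observed slope is judged, with no parametric assumption about the data.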
Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.
Kieffer, Kevin M.; Thompson, Bruce
As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
Recent Literature on Whether Statistical Significance Tests Should or Should Not Be Banned.
Deegear, James
This paper summarizes the literature regarding statistical significance testing, with an emphasis on recent literature in various disciplines and on literature exploring why researchers have demonstrably failed to be influenced by the American Psychological Association publication manual's encouragement to report effect sizes. Also considered are…
Sierevelt, Inger N.; van Oldenrijk, Jakob; Poolman, Rudolf W.
2007-01-01
In this paper we describe several issues that influence the reporting of statistical significance in relation to clinical importance, since misinterpretation of p values is a common issue in orthopaedic literature. Orthopaedic research is tormented by the risks of false-positive (type I error) and
Linting, Marielle; van Os, Bart Jan; Meulman, Jacqueline J.
2011-01-01
In this paper, the statistical significance of the contribution of variables to the principal components in principal components analysis (PCA) is assessed nonparametrically by the use of permutation tests. We compare a new strategy to a strategy used in previous research consisting of permuting the columns (variables) of a data matrix…
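A permutation test for the contribution of variables to the first principal component can be sketched as follows (our own simplified strategy of permuting each variable separately, one of the strategies compared in this literature; data, seed, and permutation count are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

def pc1_sq_loadings(M):
    # squared loadings of the first principal component (top eigenvector
    # of the correlation matrix; squaring removes the sign ambiguity)
    vals, vecs = np.linalg.eigh(np.corrcoef(M, rowvar=False))
    return vecs[:, -1] ** 2

def loading_pvalues(X, n_perm=500):
    """Permutation p-value per variable: permute that variable's values
    (destroying its links with the others), redo the PCA, and compare
    its squared PC1 loading with the observed one."""
    obs = pc1_sq_loadings(X)
    pvals = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        exceed = 0
        for _ in range(n_perm):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])
            exceed += pc1_sq_loadings(Xp)[j] >= obs[j]
        pvals[j] = exceed / n_perm
    return pvals

# three correlated variables plus one pure-noise variable
n = 200
z = rng.normal(size=n)
X = np.column_stack([z + 0.3 * rng.normal(size=n) for _ in range(3)]
                    + [rng.normal(size=n)])
pvals = loading_pvalues(X)
print(pvals.round(3))
```

The three correlated variables come out with very small p-values, while the noise variable does not, which is the behaviour such a test is meant to deliver.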
van Tulder, M.W.; Malmivaara, A.; Hayden, J.; Koes, B.
2007-01-01
STUDY DESIGN. Critical appraisal of the literature. OBJECTIVES. The objective of this study was to assess whether the results of back pain trials are statistically significant and clinically important. SUMMARY OF BACKGROUND DATA. There seems to be a discrepancy between conclusions reported by authors and
P-Value, a true test of statistical significance? a cautionary note ...
African Journals Online (AJOL)
While it was not the intention of the founders of significance testing and hypothesis testing to have the two ideas intertwined as if they were complementary, the inconvenient marriage of the two practices into one coherent, convenient, incontrovertible and misinterpreted practice has dotted our standard statistics textbooks and ...
DEFF Research Database (Denmark)
Jones, Allan; Sommerlund, Bo
2007-01-01
The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power analysis in estimating the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement for NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis…
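A power analysis of the kind advocated above can be done by straightforward simulation (a generic sketch, not the authors' procedure; effect size, sample sizes, and simulation count are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def power_two_sample_t(n_per_group, effect_size, n_sims=2000, alpha=0.05):
    """Estimated power of a two-sample t-test: the fraction of simulated
    experiments with a true effect of `effect_size` SDs reaching p < alpha."""
    hits = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n_per_group)
        b = rng.normal(effect_size, 1.0, n_per_group)
        hits += stats.ttest_ind(a, b).pvalue < alpha
    return hits / n_sims

results = {n: power_two_sample_t(n, 0.5) for n in (20, 64, 150)}
for n, p in results.items():
    print(f"n = {n:3d} per group: power = {p:.2f}")
```

The output quantifies exactly the information NHST alone withholds: how likely a study of a given size is to miss a real effect.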
DEFF Research Database (Denmark)
Jakobsen, Janus Christian; Wetterslev, Jorn; Winkel, Per
2014-01-01
BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS: Methodologies for assessing statistical and clinical significance of intervention effects in systematic reviews were considered. Balancing simplicity and comprehensiveness, an operational procedure was developed, based mainly on The Cochrane Collaboration methodology and the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews: (1) obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most…
Testing statistical significance scores of sequence comparison methods with structure similarity
Directory of Open Access Journals (Sweden)
Leunissen Jack AM
2006-10-01
Background: In recent years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search are not only determined by the algorithm but also by the statistical significance testing for an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score as compared to the e-value, using simulated data. We were interested in whether this could be validated when applied to existing, evolutionarily related protein sequences. Results: All experiments were performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics was evaluated, using ROC, CVE and AP measures, with the BLAST and FASTA algorithms as reference. We find that two out of three Smith-Waterman implementations with e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with Z-score. SSEARCH especially has very high scores. Conclusion: The compute-intensive Z-score does not have a clear advantage over the e-value. The Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.
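The Monte-Carlo Z-score idea is compact enough to sketch: score the real pair with Smith-Waterman, re-score against shuffled sequences, and standardize. The following is a toy illustration with arbitrary scoring parameters and a hypothetical sequence, not the CluSTr implementation:

```python
import random

def sw_score(a, b, match=2, mismatch=-1, gap=-2):
    """Best local alignment score (Smith-Waterman, linear gap penalty)."""
    prev = [0] * (len(b) + 1)
    best = 0
    for ca in a:
        curr = [0]
        for j, cb in enumerate(b, 1):
            s = max(0,
                    prev[j - 1] + (match if ca == cb else mismatch),
                    prev[j] + gap,       # gap in b
                    curr[j - 1] + gap)   # gap in a
            curr.append(s)
            best = max(best, s)
        prev = curr
    return best

def z_score(a, b, n_shuffles=200, seed=0):
    """Monte-Carlo Z-score: standardize the real score against scores
    obtained from shuffled versions of b (same length and composition)."""
    rng = random.Random(seed)
    real = sw_score(a, b)
    null = []
    for _ in range(n_shuffles):
        letters = list(b)
        rng.shuffle(letters)
        null.append(sw_score(a, "".join(letters)))
    mean = sum(null) / len(null)
    sd = (sum((x - mean) ** 2 for x in null) / (len(null) - 1)) ** 0.5
    return (real - mean) / (sd or 1.0)

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # hypothetical protein sequence
print("Z =", round(z_score(seq, seq), 1))
```

The shuffling step is what makes the Z-score compute-intensive: every query-subject pair costs hundreds of extra alignments, which is the trade-off the paper weighs against the e-value.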
Diedrich, Alice; Schlegl, Sandra; Greetfeld, Martin; Fumi, Markus; Voderholzer, Ulrich
2018-03-01
This study examines the statistical and clinical significance of symptom changes during an intensive inpatient treatment program with a strong psychotherapeutic focus for individuals with severe bulimia nervosa. 295 consecutively admitted bulimic patients were administered the Structured Interview for Anorexic and Bulimic Syndromes-Self-Rating (SIAB-S), the Eating Disorder Inventory-2 (EDI-2), the Brief Symptom Inventory (BSI), and the Beck Depression Inventory-II (BDI-II) at treatment intake and discharge. Results indicated statistically significant symptom reductions with large effect sizes regarding severity of binge eating and compensatory behavior (SIAB-S), overall eating disorder symptom severity (EDI-2), overall psychopathology (BSI), and depressive symptom severity (BDI-II) even when controlling for antidepressant medication. The majority of patients showed either reliable (EDI-2: 33.7%, BSI: 34.8%, BDI-II: 18.1%) or even clinically significant symptom changes (EDI-2: 43.2%, BSI: 33.9%, BDI-II: 56.9%). Patients with clinically significant improvement were less distressed at intake and less likely to suffer from a comorbid borderline personality disorder when compared with those who did not improve to a clinically significant extent. Findings indicate that intensive psychotherapeutic inpatient treatment may be effective in about 75% of severely affected bulimic patients. For the remaining non-responding patients, inpatient treatment might be improved through an even stronger focus on the reduction of comorbid borderline personality traits.
Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B
2013-03-23
Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. To date, multiple challenges in protein MS data analysis remain: large-scale and complex data set management; MS peak identification and indexing; and high-dimensional peak differential analysis with concurrent statistical tests and the associated false discovery rate (FDR). "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The presented web application supports large-scale MS data uploading and online analysis with a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Statistical significance estimation of a signal within the GooFit framework on GPUs
Directory of Open Access Journals (Sweden)
Cristella Leonardo
2017-01-01
In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data-analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performance with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service with a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood-ratio test statistic in different situations, in which the Wilks theorem may or may not apply because its regularity conditions are not satisfied.
International Nuclear Information System (INIS)
DUDEK, J; SZPAK, B; FORNAL, B; PORQUET, M-G
2011-01-01
In this and the follow-up article we briefly discuss what we believe represents one of the most serious problems in contemporary nuclear structure: the question of the statistical significance of parametrizations of nuclear microscopic Hamiltonians and the implied predictive power of the underlying theories. In the present Part I, we introduce the main lines of reasoning of the so-called Inverse Problem Theory, an important sub-field of contemporary applied mathematics, illustrated here with the example of the Nuclear Mean-Field Approach.
Van Aert, R.C.M.; Van Assen, M.A.L.M.
2018-01-01
The unrealistically high rate of positive results within psychology has increased attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter
A Note on Comparing the Power of Test Statistics at Low Significance Levels.
Morris, Nathan; Elston, Robert
2011-01-01
It is an obvious fact that the power of a test statistic is dependent upon the significance (alpha) level at which the test is performed. It is perhaps a less obvious fact that the relative performance of two statistics in terms of power is also a function of the alpha level. Through numerous personal discussions, we have noted that even some competent statisticians have the mistaken intuition that relative power comparisons at traditional levels such as α = 0.05 will be roughly similar to relative power comparisons at very low levels, such as the level α = 5 × 10⁻⁸, which is commonly used in genome-wide association studies. In this brief note, we demonstrate that this notion is in fact quite wrong, especially with respect to comparing tests with differing degrees of freedom. In fact, at very low alpha levels the cost of additional degrees of freedom is often comparatively low. Thus we recommend that statisticians exercise caution when interpreting the results of power comparison studies which use alpha levels that will not be used in practice.
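The note's point can be made concrete with chi-square tests. Measuring the "cost" of an extra degree of freedom as the additional noncentrality (signal strength) needed to keep 80% power, a scipy sketch shows the relative cost shrinking as alpha drops from 0.05 to the genome-wide level; all numbers here are illustrative, not taken from the note.

```python
from scipy.stats import chi2, ncx2
from scipy.optimize import brentq

def power(alpha, df, lam):
    """Power of a chi-square test with `df` degrees of freedom and
    noncentrality `lam`, performed at significance level `alpha`."""
    return ncx2.sf(chi2.isf(alpha, df), df, lam)

def lam_needed(alpha, df, target=0.8):
    """Noncentrality required to reach `target` power (found by root finding)."""
    return brentq(lambda lam: power(alpha, df, lam) - target, 1e-6, 500.0)

rel_cost = {}
for alpha in (0.05, 5e-8):
    l1, l2 = lam_needed(alpha, 1), lam_needed(alpha, 2)
    rel_cost[alpha] = (l2 - l1) / l1
    print(f"alpha={alpha:g}: lambda(1 df)={l1:.1f}, lambda(2 df)={l2:.1f}, "
          f"relative cost of the extra df = {rel_cost[alpha]:.1%}")
```

The relative penalty for the second degree of freedom is markedly smaller at the very low alpha level, matching the note's conclusion.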
Energy Technology Data Exchange (ETDEWEB)
Crow, C.J.
1985-01-01
Middle Ordovician age Chickamauga Group carbonates crop out along the Birmingham and Murphrees Valley anticlines in central Alabama. The macrofossil contents on exposed surfaces of seven bioherms have been counted to determine their various paleontologic characteristics. Twelve groups of organisms are present in these bioherms. Dominant organisms include bryozoans, algae, brachiopods, sponges, pelmatozoans, stromatoporoids and corals. Minor accessory fauna include predators, scavengers and grazers such as gastropods, ostracods, trilobites, cephalopods and pelecypods. Vertical and horizontal niche zonation has been detected for some of the bioherm dwelling fauna. No one bioherm of those studied exhibits all 12 groups of organisms; rather, individual bioherms display various subsets of the total diversity. Statistical treatment (G-test) of the diversity data indicates a lack of statistical homogeneity of the bioherms, both within and between localities. Between-locality population heterogeneity can be ascribed to differences in biologic responses to such gross environmental factors as water depth and clarity, and energy levels. At any one locality, gross aspects of the paleoenvironments are assumed to have been more uniform. Significant differences among bioherms at any one locality may have resulted from patchy distribution of species populations, differential preservation and other factors.
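The G-test of homogeneity applied to the bioherm diversity data is available in scipy as the log-likelihood-ratio variant of the contingency-table test. The counts below are invented for illustration and are not the paper's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of four faunal groups in three bioherms (rows = bioherms)
counts = np.array([
    [30, 12,  5,  3],
    [10, 25,  8,  2],
    [ 8,  6, 20, 11],
])

# lambda_="log-likelihood" turns the usual Pearson chi-square into the G-test
g, p, dof, expected = chi2_contingency(counts, lambda_="log-likelihood")
print(f"G = {g:.2f}, dof = {dof}, p = {p:.3g}")
```

A small p-value, as here, indicates heterogeneity: the bioherms do not share a common faunal composition.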
Kellerer-Pirklbauer, Andreas
2016-04-01
Longer data series (e.g. >10 a) of ground temperatures in alpine regions are helpful to improve the understanding regarding the effects of present climate change on distribution and thermal characteristics of seasonal frost- and permafrost-affected areas. Beginning in 2004 - and more intensively since 2006 - a permafrost and seasonal frost monitoring network was established in Central and Eastern Austria by the University of Graz. This network consists of c.60 ground temperature (surface and near-surface) monitoring sites which are located at 1922-3002 m a.s.l., at latitude 46°55'-47°22'N and at longitude 12°44'-14°41'E. These data allow conclusions about general ground thermal conditions, potential permafrost occurrence, trend during the observation period, and regional pattern of changes. Calculations and analyses of several different temperature-related parameters were accomplished. At an annual scale, a region-wide statistically significant warming during the observation period was revealed by e.g. an increase in mean annual temperature values (mean, maximum) or the significant lowering of the surface frost number (F+). At a seasonal scale, no significant trend of any temperature-related parameter was revealed in most cases for spring (MAM) and autumn (SON). Winter (DJF) shows only a weak warming. In contrast, the summer (JJA) season reveals in general a significant warming as confirmed by several different temperature-related parameters such as e.g. mean seasonal temperature, number of thawing degree days, number of freezing degree days, or days without night frost. On a monthly basis August shows the statistically most robust and strongest warming of all months, although regional differences occur. Despite the fact that the general ground temperature warming during the last decade is confirmed by the field data in the study region, complications in trend analyses arise by temperature anomalies (e.g. warm winter 2006/07) or substantial variations in the winter
Directory of Open Access Journals (Sweden)
Sadreyev Ruslan I
2004-08-01
Background: Profile-based analysis of multiple sequence alignments (MSA) allows for accurate comparison of protein families. Here, we address the problems of detecting statistically confident dissimilarities between (1) an MSA position and a set of predicted residue frequencies, and (2) two MSA positions. These problems are important for (i) evaluation and optimization of methods predicting residue occurrence at protein positions; (ii) detection of potentially misaligned regions in automatically produced alignments and their further refinement; and (iii) detection of sites that determine functional or structural specificity in two related families. Results: For problems (1) and (2), we propose analytical estimates of the P-value and apply them to the detection of significant positional dissimilarities in various experimental situations. (a) We compare structure-based predictions of residue propensities at a protein position to the actual residue frequencies in the MSA of homologs. (b) We evaluate our method by its ability to detect erroneous position matches produced by an automatic sequence aligner. (c) We compare MSA positions that correspond to residues aligned by automatic structure aligners. (d) We compare MSA positions that are aligned by high-quality manual superposition of structures. Detected dissimilarities reveal shortcomings of the automatic methods for residue frequency prediction and alignment construction. For the high-quality structural alignments, the dissimilarities suggest sites of potential functional or structural importance. Conclusion: The proposed computational method is of significant potential value for the analysis of protein families.
Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior
2011-09-23
Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs) that are regions inside exons enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases determined by codon usage and constraints that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
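The island-selection step above is designed to control the false discovery rate. As a generic, hypothetical illustration of FDR control (not the authors' greedy algorithm), the standard Benjamini-Hochberg step-up procedure over a set of per-region tail probabilities can be sketched as:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Boolean mask of hypotheses rejected by the BH step-up procedure at FDR q."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    thresholds = q * np.arange(1, m + 1) / m        # the BH line: i*q/m
    below = p[order] <= thresholds
    k = below.nonzero()[0].max() + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True                        # reject the k smallest p-values
    return reject

# Hypothetical p-values, e.g. tail probabilities of CpG counts per region
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.500]
mask = benjamini_hochberg(pvals, q=0.05)
print(mask)
```

Only hypotheses whose sorted p-values fall below the line i*q/m (up to the largest such index) are rejected, which bounds the expected fraction of false discoveries by q.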
Indirectional statistics and the significance of an asymmetry discovered by Birch
International Nuclear Information System (INIS)
Kendall, D.G.; Young, G.A.
1984-01-01
Birch (1982, Nature, 298, 451) reported an apparent 'statistical asymmetry of the Universe'. The authors here develop 'indirectional analysis' as a technique for investigating statistical effects of this kind and conclude that the reported effect (whatever may be its origin) is strongly supported by the observations. The estimated pole of the asymmetry is at RA 13h 30m, Dec. −37°. The angular error in its estimation is unlikely to exceed 20-30°. (author)
Directory of Open Access Journals (Sweden)
Mashhood Ahmed Sheikh
2017-08-01
mediate the association between childhood adversity and ADS in adulthood. However, when education was excluded as a mediator-response confounding variable, the indirect effect of childhood adversity on ADS in adulthood was statistically significant (p < 0.05). This study shows that a careful inclusion of potential confounding variables is important when assessing mediation.
Directory of Open Access Journals (Sweden)
Vujović Svetlana R.
2013-01-01
This paper illustrates the utility of multivariate statistical techniques for analysis and interpretation of water quality data sets and identification of pollution sources/factors with a view to get better information about the water quality and design of monitoring network for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and for the interpretation of a water quality data set of the natural water bodies obtained during the 2010 monitoring year of 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of the underlying factors, which are not directly observable. Factor analysis is applied to physico-chemical parameters of natural water bodies with the aim of classification and data summation as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, together explaining more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor F1, accounting for 28% of the total variance, represents the hydrochemical dimension of water quality. The second factor F2, accounting for 18% of the total variance, may be taken as a factor of water eutrophication. The third factor F3, accounting for 17% of the total variance, represents the influence of point sources of pollution on water quality. The fourth factor F4, accounting for 13% of the total variance, may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
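The FA/PCA workflow described above (standardize, extract components with eigenvalues > 1, classify absolute loadings > 0.75 as strong and 0.50-0.75 as moderate) can be sketched on synthetic data. The 33 x 13 matrix below is randomly generated to mimic the study's dimensions and is not the actual water-quality data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 33 sites x 13 water-quality parameters driven by
# four latent "pollution factors" plus measurement noise (illustrative only)
latent = rng.normal(size=(33, 4))
mixing = rng.normal(size=(4, 13))
data = latent @ mixing + 0.5 * rng.normal(size=(33, 13))

z = (data - data.mean(axis=0)) / data.std(axis=0)   # standardize each parameter
eigval, eigvec = np.linalg.eigh(np.corrcoef(z, rowvar=False))
idx = np.argsort(eigval)[::-1]                      # order by decreasing variance
eigval, eigvec = eigval[idx], eigvec[:, idx]

keep = eigval > 1.0                                  # Kaiser criterion
loadings = eigvec[:, keep] * np.sqrt(eigval[keep])   # factor loadings
explained = eigval[keep].sum() / eigval.sum()

strong = int((np.abs(loadings) > 0.75).sum())
moderate = int(((np.abs(loadings) > 0.50) & (np.abs(loadings) <= 0.75)).sum())
print(f"{int(keep.sum())} factors retained, {explained:.0%} of total variance; "
      f"{strong} strong and {moderate} moderate loadings")
```

On real data one would typically also rotate the retained loadings (e.g. varimax) before interpreting each factor as a water-quality dimension.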
Energy Technology Data Exchange (ETDEWEB)
Fhager, V
2000-01-01
In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We proceed towards the requested results in two steps. First, models of the number dependent energy distributions of the neutrons that are ejected in the spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the distribution of energy of the spallation neutrons. The other model utilizes Monte Carlo methods for tracking the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogeneous mixture of core material. Our results indicate that the number dependence of the energy
Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao
2018-04-01
In this paper, a statistical forecast model using the time-scale decomposition method is established for seasonal prediction of the rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). This method decomposes the rainfall over the MLYRV into three time-scale components: the interannual component with a period of less than 8 years, the interdecadal component with a period of 8 to 30 years, and the component with a period longer than 30 years. Then, predictors are selected for the three time-scale components of the FPR through correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict the three time-scale components of the FPR, respectively. The results show that this forecast model can capture the interannual and interdecadal variation of the FPR. The hindcast of the FPR during the 14 years from 2001 to 2014 shows that the FPR can be predicted successfully in 11 out of the 14 years. This forecast model performs better than a model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of the FPR over the MLYRV.
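As a hypothetical sketch of the decomposition step (the abstract does not specify the authors' actual filters), a yearly series can be split into interannual, interdecadal, and longer-term components with simple running means, so that the parts sum back exactly to the original series before each part is modeled by regression. The rainfall series below is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

def split_scales(x, short=8, long_=30):
    """Split a yearly series into interannual (< `short` yr), interdecadal
    (`short`..`long_` yr) and longer-term (> `long_` yr) parts via running means."""
    def running_mean(v, w):
        return np.convolve(v, np.ones(w) / w, mode="same")
    slow = running_mean(x, long_)
    mid = running_mean(x, short)
    return x - mid, mid - slow, slow     # the three parts sum back to x

years = np.arange(1951, 2015)
rain = (500.0 + 0.5 * (years - years[0])            # slow trend
        + 30.0 * np.sin(2 * np.pi * years / 4.0)    # interannual swing
        + 10.0 * rng.normal(size=years.size))       # noise
inter, decadal, longterm = split_scales(rain)
print(np.allclose(inter + decadal + longterm, rain))
```

Each component would then get its own multiple linear regression on suitably chosen predictors, and the three predictions are summed to give the FPR forecast.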
DEFF Research Database (Denmark)
Madsen, Tobias
2017-01-01
In the present thesis I develop, implement and apply statistical methods for detecting genomic elements implicated in cancer development and progression. This is done in two separate bodies of work. The first uses the somatic mutation burden to distinguish cancer driver mutations from passenger m...
Statistical Analysis and Evaluation of the Depth of the Ruts on Lithuanian State Significance Roads
Directory of Open Access Journals (Sweden)
Erinijus Getautis
2011-04-01
The aim of this work is to gather information about rut depth on national flexible-pavement roads, to determine its statistical dispersion indices and to determine their compliance with the applicable requirements. An analysis of scientific work on the appearance of ruts in asphalt and their influence on driving is presented. Dynamic models of ruts in asphalt are presented in the work as well. Experimental data on the dispersion of rut depth on the Lithuanian national highway Vilnius-Kaunas are presented. Conclusions are formulated and presented. Article in Lithuanian
Directory of Open Access Journals (Sweden)
Dominic Beaulieu-Prévost
2006-03-01
For the last 50 years of research in quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on the rejection or not of the null hypothesis. However, more than 300 articles have demonstrated that this method is problematic. In summary, null hypothesis testing (NHT) is unfalsifiable, its results depend directly on sample size, and the null hypothesis is both improbable and not plausible. Consequently, alternatives to NHT such as confidence intervals (CI) and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to a NHT approach and can often improve the scientific and contextual relevance of the statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power by an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect, and probabilistic undetermination). The demonstration includes a complete example.
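The three-value logic proposed in the article can be stated directly: given a confidence interval (lo, hi) for an effect size and a minimal substantial effect delta (a value the analyst must choose), the verdict follows from where the interval lies relative to delta. The function below is an illustrative reading of that rule, not code from the article.

```python
def ci_verdict(lo, hi, delta):
    """Three-value reading of a confidence interval (lo, hi) for an effect size,
    where `delta` is the smallest effect the analyst considers substantial."""
    if lo >= delta or hi <= -delta:
        return "probable presence of a substantial effect"
    if -delta < lo and hi < delta:
        return "probable absence of a substantial effect"
    return "probabilistic undetermination"

print(ci_verdict(0.45, 0.80, 0.30))   # interval entirely beyond delta
print(ci_verdict(-0.10, 0.15, 0.30))  # interval entirely inside (-delta, delta)
print(ci_verdict(0.10, 0.60, 0.30))   # interval straddles delta
```

Unlike a p-value, the third outcome makes indeterminacy explicit instead of collapsing it into "not significant".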
Statistical determination of significant curved I-girder bridge seismic response parameters
Seo, Junwon
2013-06-01
Curved steel bridges are commonly used at interchanges in transportation networks and more of these structures continue to be designed and built in the United States. Though the use of these bridges continues to increase in locations that experience high seismicity, the effects of curvature and other parameters on their seismic behaviors have been neglected in current risk assessment tools. These tools can evaluate the seismic vulnerability of a transportation network using fragility curves. One critical component of fragility curve development for curved steel bridges is the completion of sensitivity analyses that help identify influential parameters related to their seismic response. In this study, an accessible inventory of existing curved steel girder bridges located primarily in the Mid-Atlantic United States (MAUS) was used to establish statistical characteristics used as inputs for a seismic sensitivity study. Critical seismic response quantities were captured using 3D nonlinear finite element models. Influential parameters for these quantities were identified using statistical tools that incorporate an experimental Plackett-Burman Design (PBD), including Pareto optimal plots and prediction profiler techniques. The findings revealed that the influential parameters included the number of spans, radius of curvature, maximum span length, girder spacing, and cross-frame spacing. These parameters showed varying levels of influence on the critical bridge response.
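A Plackett-Burman screening design like the one underlying this sensitivity study can be built by cyclic shifts of a standard generator row. The 12-run, 11-factor design below is the textbook construction, not the study's actual parameter screen: every factor is balanced and any two factor columns are orthogonal, which is what lets main effects be screened in very few runs.

```python
import numpy as np

# Standard Plackett-Burman generator row for N = 12 runs (11 two-level factors)
gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
rows = [np.roll(gen, i) for i in range(11)]          # 11 cyclic shifts...
design = np.vstack(rows + [-np.ones(11, dtype=int)]) # ...plus a closing row of -1s

print(design.shape)          # 12 runs x 11 factors
print(design.sum(axis=0))    # every factor balanced: column sums are all zero
```

In the bridge study each column would correspond to one structural parameter (span count, curvature radius, girder spacing, ...) set to its low or high level per run.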
The SACE Review Panel's Final Report: Significant Flaws in the Analysis of Statistical Data
Gregory, Kelvin
2006-01-01
The South Australian Certificate of Education (SACE) is a credential and formal qualification within the Australian Qualifications Framework. A recent review of the SACE outlined a number of recommendations for significant changes to this certificate. These recommendations were the result of a process that began with the review panel…
Holtzman, Jessica N; Miller, Shefali; Hooshmand, Farnaz; Wang, Po W; Chang, Kiki D; Hill, Shelley J; Rasgon, Natalie L; Ketter, Terence A
2015-07-01
The strengths and limitations of considering childhood- and adolescent-onset bipolar disorder (BD) separately versus together remain to be established. We assessed this issue. BD patients referred to the Stanford Bipolar Disorder Clinic during 2000-2011 were assessed with the Systematic Treatment Enhancement Program for BD Affective Disorders Evaluation. Patients with childhood- and adolescent-onset were compared to those with adult-onset for 7 unfavorable bipolar illness characteristics with replicated associations with early onset. Among 502 BD outpatients, those with childhood- (adolescent- (13-18 years, N=218) onset had significantly higher rates for 4/7 unfavorable illness characteristics, including lifetime comorbid anxiety disorder, at least ten lifetime mood episodes, lifetime alcohol use disorder, and prior suicide attempt, than those with adult-onset (>18 years, N=174). Childhood- but not adolescent-onset BD patients also had significantly higher rates of a first-degree relative with a mood disorder, lifetime substance use disorder, and rapid cycling in the prior year. Patients with pooled childhood/adolescent-onset compared to adult-onset had significantly higher rates for 5/7 of these unfavorable illness characteristics, while patients with childhood- compared to adolescent-onset had significantly higher rates for 4/7 of these unfavorable illness characteristics. The Caucasian, insured, suburban, low-substance-abuse, American specialty clinic-referred sample limits generalizability. Onset age is based on retrospective recall. Childhood- compared to adolescent-onset BD was more robustly related to unfavorable bipolar illness characteristics, so pooling these groups attenuated such relationships. Further study is warranted to determine the extent to which adolescent-onset BD represents an intermediate phenotype between childhood- and adult-onset BD. Copyright © 2015 Elsevier B.V. All rights reserved.
Massey, J. L.
1976-01-01
The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
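A standard way to make such statements precise, offered here as a generic illustration rather than Massey's construction, is the exact one-sided Clopper-Pearson bound: with k observed errors in n trials, the 95% upper confidence bound on the error probability follows from the beta distribution and reduces to the familiar "rule of three" (about 3/n) when k = 0.

```python
from scipy import stats

def upper_bound(errors, trials, conf=0.95):
    """Exact (Clopper-Pearson) one-sided upper confidence bound on an error rate."""
    if errors >= trials:
        return 1.0
    return float(stats.beta.ppf(conf, errors + 1, trials - errors))

n = 10**6  # decoding trials in the simulation
for k in (0, 1, 2):
    print(f"{k} errors in {n} trials -> 95% upper bound {upper_bound(k, n):.3g}")
```

Even zero observed errors thus yields a usable (non-trivial) bound on the true error probability, and two observed errors roughly doubles it, illustrating the "surprisingly great significance" of a couple of decoding errors.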
Hayslett, H T
1991-01-01
Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Saha, Ranajit; Pan, Sudip; Chattaraj, Pratim K
2016-11-05
The validity of the maximum hardness principle (MHP) is tested in the cases of 50 chemical reactions, most of which are organic in nature and exhibit anomeric effect. To explore the effect of the level of theory on the validity of MHP in an exothermic reaction, B3LYP/6-311++G(2df,3pd) and LC-BLYP/6-311++G(2df,3pd) (def2-QZVP for iodine and mercury) levels are employed. Different approximations like the geometric mean of hardness and combined hardness are considered in case there are multiple reactants and/or products. It is observed that, based on the geometric mean of hardness, while 82% of the studied reactions obey the MHP at the B3LYP level, 84% of the reactions follow this rule at the LC-BLYP level. Most of the reactions possess the hardest species on the product side. A 50% null hypothesis is rejected at a 1% level of significance.
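The geometric-mean approximation mentioned above is simple to state: when a reaction side has several species, a reaction is counted as obeying the MHP if the geometric mean of the product hardnesses exceeds that of the reactants. The hardness values below are invented for illustration, not taken from the paper's 50 reactions.

```python
from math import prod

def geometric_mean(values):
    """Geometric mean of a list of positive hardness values."""
    return prod(values) ** (1.0 / len(values))

# Hypothetical hardness values (eV) for a reaction A + B -> C + D
reactant_hardness = [4.2, 3.6]
product_hardness = [4.8, 4.1]
obeys_mhp = geometric_mean(product_hardness) > geometric_mean(reactant_hardness)
print(obeys_mhp)  # here the harder species sit on the product side
```

Repeating this comparison over all reactions gives the fraction obeying the MHP, which the paper reports as 82-84% depending on the level of theory.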
Directory of Open Access Journals (Sweden)
Justin London
2010-01-01
In “National Metrical Types in Nineteenth Century Art Song” Leigh Van Handel gives a sympathetic critique of William Rothstein’s claim that in western classical music of the late 18th and 19th centuries there are discernible differences in the phrasing and metrical practice of German versus French and Italian composers. This commentary (a) examines just what Rothstein means in terms of his proposed metrical typology, (b) questions Van Handel on how she has applied it to a purely melodic framework, (c) amplifies Van Handel’s critique of Rothstein, and then (d) concludes with a rumination on the reach of quantitative (i.e., statistically driven) versus qualitative claims regarding such things as “national metrical types.”
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Hashim, Muhammad Jawad
2010-09-01
Post-hoc secondary data analysis with no prespecified hypotheses has been discouraged by textbook authors and journal editors alike. Unfortunately no single term describes this phenomenon succinctly. I would like to coin the term "sigsearch" to define this practice and bring it within the teaching lexicon of statistics courses. Sigsearch would include any unplanned, post-hoc search for statistical significance using multiple comparisons of subgroups. It would also include data analysis with outcomes other than the prespecified primary outcome measure of a study as well as secondary data analyses of earlier research.
DEFF Research Database (Denmark)
Serviss, Jason T.; Gådin, Jesper R.; Eriksson, Per
2017-01-01
, e.g. genes in a specific pathway, alone can separate samples into these established classes. Despite this, the evaluation of class separations is often subjective and performed via visualization. Here we present the ClusterSignificance package; a set of tools designed to assess the statistical significance of class separations downstream of dimensionality reduction algorithms. In addition, we demonstrate the design and utility of the ClusterSignificance package and utilize it to determine the importance of long non-coding RNA expression in the identity of multiple hematological malignancies.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products.
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products.
Kim, Sung-Min; Choi, Yosoon
2017-06-18
To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1-4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
Directory of Open Access Journals (Sweden)
Sung-Min Kim
2017-06-01
Full Text Available To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1–4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
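The Getis-Ord Gi* z-score that both records above rely on can be sketched in a few lines. The binary weights matrix below (each site's neighbourhood including the site itself, as the starred variant requires) is an illustrative assumption; a real survey would build it from sample coordinates.

```python
import numpy as np

def getis_ord_gi_star(values, weights):
    """Getis-Ord Gi* z-scores, one per site.

    values  : 1-D array of measured concentrations (one per site)
    weights : (n, n) spatial-weights matrix; row i describes site i's
              neighbourhood and must include site i itself (Gi*)
    """
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)          # population SD
    wx = w @ x                                        # weighted local sums
    wsum = w.sum(axis=1)
    w2sum = (w ** 2).sum(axis=1)
    denom = s * np.sqrt((n * w2sum - wsum ** 2) / (n - 1))
    return (wx - xbar * wsum) / denom

# A 1-D chain of five sites: low values on the left, high on the right.
w = np.zeros((5, 5))
for i in range(5):
    w[i, i] = 1.0                     # include self
    if i > 0:
        w[i, i - 1] = 1.0             # left neighbour
    if i < 4:
        w[i, i + 1] = 1.0             # right neighbour
z = getis_ord_gi_star([0, 0, 0, 10, 10], w)
```

A positive z-score flags a site whose high value sits among other high values (an HH candidate in the paper's grouping), a negative one a cold spot.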
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products.
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees.
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products.
Directory of Open Access Journals (Sweden)
E. A. Tatokchin
2017-01-01
Full Text Available The development of modern educational technologies, driven by the broad introduction of computer testing and the growth of distance education, makes a revision of methods for examining pupils necessary. This work shows the need for a transition to mathematical assessment criteria that are free of subjectivity. The article reviews the problems arising in this task and proposes approaches for solving them. The greatest attention is paid to the problem of objectively transforming an expert's rating estimates onto the scale of student estimates. The discussion concludes that the solution to this problem lies in the creation of specialized intelligent systems. The basis for constructing such an intelligent system is a mathematical model of a self-organizing nonequilibrium dissipative system, which here is a group of students. The article assumes that the dissipative character is provided by the constant influx of new test items from the expert, and the nonequilibrium character by the individual psychological characteristics of the students in the group. As a result, the system should self-organize into stable patterns, which make it possible, relying on large amounts of data, to obtain a statistically significant assessment of each student. To justify the proposed approach, the work presents a statistical analysis of the results of testing a large sample of students (> 90). The conclusions from this statistical analysis made it possible to develop an intelligent system for statistically significant examination of student performance. It is based on a data clustering algorithm (k-means) applied to three key parameters. It is shown that this approach makes it possible to track dynamics and to produce an objective expert evaluation.
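The clustering step the abstract names (k-means over key parameters) can be sketched in plain NumPy. The sample points and the number of clusters below are illustrative assumptions, not the study's data.

```python
import numpy as np

def kmeans(points, k, iters=100, seed=0):
    """Plain Lloyd's k-means: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    # initialize centroids from k distinct data points
    centroids = pts[rng.choice(len(pts), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centroid
        d = np.linalg.norm(pts[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        new = np.array([pts[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels

# Two well-separated groups of "students" in a 3-parameter space.
pts = [[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [5.0, 5.0, 5.0], [5.1, 5.0, 5.0]]
centroids, labels = kmeans(pts, k=2)
```

Each centroid then plays the role of a stable pattern against which an individual student's scores can be compared.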
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y; Drake, Steven K; Gucek, Marjan; Sacks, David B; Yu, Yi-Kuo
2018-06-05
Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 MS/MS publicly available data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
Dou, Ming; Zhang, Yan; Zuo, Qiting; Mi, Qingbin
2015-08-01
The construction of sluices creates a strong disturbance in water environmental factors within a river. The change in water pollutant concentrations of sluice-controlled river reaches (SCRRs) is more complex than that of natural river segments. To determine the key factors affecting water pollutant concentration changes in SCRRs, river reaches near the Huaidian Sluice in the Shaying River of China were selected as a case study, and water quality monitoring experiments based on different regulating modes were implemented in 2009 and 2010. To identify the key factors affecting the change rates of the permanganate chemical oxygen demand (CODMn) and ammonia nitrogen (NH3-N) concentrations in the SCRRs of the Huaidian Sluice, partial correlation analysis, principal component analysis and principal factor analysis were used. The results indicate four factors, i.e., the inflow quantity from upper reaches, the opening size of the sluice gates, the water pollutant concentration from upper reaches, and the turbidity before the sluice, which are the common key factors for the CODMn and NH3-N concentration change rates. Moreover, the dissolved oxygen before the sluice is a key factor for the CODMn concentration change rate, and the water depth before the sluice is a key factor for the NH3-N concentration change rate. Multiple linear regressions between the water pollutant concentration change rates and the key factors were established via multiple linear regression analyses, and the quantitative relationship between the CODMn and NH3-N concentration change rates and the key affecting factors was analyzed. Finally, the mechanism of action of the key factors affecting the water pollutant concentration changes was analyzed. The results reveal that the inflow quantity from upper reaches, the opening size of the sluice gates, the CODMn concentration from upper reaches and the dissolved oxygen before the sluice have a negative influence and the turbidity before the sluice has a positive influence.
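The multiple linear regression step described above can be reproduced in outline with ordinary least squares. The toy data below (an exact linear relation between a change rate and two hypothetical factors) is an assumption for illustration, not the study's measurements.

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares with intercept.

    X : (n, k) matrix, one column per explanatory factor
    y : length-n response (e.g. a concentration change rate)
    Returns coefficients [b0, b1, ..., bk] with b0 the intercept.
    """
    X = np.asarray(X, dtype=float)
    A = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return coef

# change_rate = 2 + 3*factor1 - 1*factor2 (synthetic, noise-free)
X = [[1, 0], [0, 1], [1, 1], [2, 3], [0, 0]]
y = [2 + 3 * a - b for a, b in X]
coef = fit_linear(X, y)
```

The sign of each fitted coefficient corresponds to the positive or negative influence the study reports for each key factor.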
Perneger, Thomas V; Combescure, Christophe
2017-07-01
Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant (P < 0.001). Irregularities identified included an excess of P-values equal to 1, and about twice as many P-values less than 0.05 as more than 0.05. The latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results. Copyright © 2017 Elsevier Inc. All rights reserved.
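The two reference distributions the study compares against are easy to simulate: under the null hypothesis a two-sided test's P-value is uniform on (0, 1), while under an alternative it piles up near 0. The effect size of two standard errors below is an illustrative assumption.

```python
import math
import numpy as np

def two_sided_p(z):
    """Two-sided P-value of a standard-normal test statistic z."""
    return math.erfc(abs(z) / math.sqrt(2))

rng = np.random.default_rng(42)

# Null hypothesis: z ~ N(0, 1), so P-values are uniform on (0, 1).
null_p = np.array([two_sided_p(z) for z in rng.normal(0.0, 1.0, 100_000)])

# Alternative (assumed effect of 2 SE): z ~ N(2, 1), P-values skew low.
alt_p = np.array([two_sided_p(z) for z in rng.normal(2.0, 1.0, 100_000)])

print((null_p < 0.05).mean())   # close to 0.05 under the null
print((alt_p < 0.05).mean())    # much larger: the test's power here
```

A published mixture of roughly half null and half alternative tests, as the study estimates, would show a decreasing density plus a uniform floor; a spike just below 0.05 on top of that is the irregularity suggesting selective reporting.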
Directory of Open Access Journals (Sweden)
Leitner Dietmar
2005-04-01
Full Text Available Abstract Background A reliable prediction of the Xaa-Pro peptide bond conformation would be a useful tool for many protein structure calculation methods. We have analyzed the Protein Data Bank and show that the combined use of sequential and structural information has a predictive value for the assessment of the cis versus trans peptide bond conformation of Xaa-Pro within proteins. For the analysis of the data sets different statistical methods such as the calculation of the Chou-Fasman parameters and occurrence matrices were used. Furthermore we analyzed the relationship between the relative solvent accessibility and the relative occurrence of prolines in the cis and in the trans conformation. Results One of the main results of the statistical investigations is the ranking of the secondary structure and sequence information with respect to the prediction of the Xaa-Pro peptide bond conformation. We observed a significant impact of secondary structure information on the occurrence of the Xaa-Pro peptide bond conformation, while the sequence information of amino acids neighboring proline is of little predictive value for the conformation of this bond. Conclusion In this work, we present an extensive analysis of the occurrence of the cis and trans proline conformation in proteins. Based on the data set, we derived patterns and rules for a possible prediction of the proline conformation. Upon adoption of the Chou-Fasman parameters, we are able to derive statistically relevant correlations between the secondary structure of amino acid fragments and the Xaa-Pro peptide bond conformation.
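The Chou-Fasman parameters the abstract adopts are per-residue propensities: the frequency with which a residue occurs in a given conformational state, divided by the overall frequency of that state. A minimal sketch, with toy residue labels as assumptions:

```python
from collections import Counter

def propensities(residues, labels, state):
    """Chou-Fasman-style propensity for each residue type.

    residues : residue names observed at some position (e.g. the Xaa
               preceding a proline)
    labels   : parallel conformational labels, e.g. 'cis' or 'trans'
    state    : the state to score, e.g. 'cis'
    Returns {residue: P(state | residue) / P(state)}; values above 1
    mean the residue favours that state.
    """
    total = len(residues)
    in_state = [r for r, l in zip(residues, labels) if l == state]
    base = len(in_state) / total                 # overall P(state)
    counts = Counter(residues)
    state_counts = Counter(in_state)
    return {r: (state_counts[r] / counts[r]) / base for r in counts}

# Toy data: 'A' always precedes cis, 'G' always precedes trans.
prop = propensities(['A', 'A', 'G', 'G'],
                    ['cis', 'cis', 'trans', 'trans'], 'cis')
```

Computed over secondary-structure fragments rather than single residues, the same ratio gives the structure-conditioned statistics the paper ranks as the more predictive signal.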
Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.
1999-01-01
Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8 and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, correspondingly, when estimated with a normalized product measure of empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier
Directory of Open Access Journals (Sweden)
Jiahui Fan
2016-06-01
Full Text Available Land use profoundly changes the terrestrial ecosystem and landscape patterns, and these changes reveal the extent and scope of the ecological influence of land use on the terrestrial ecosystem. The study area selected for this research was the middle reaches of the Heihe River. Based on land use data (1986, 2000, and 2014), we proposed an ecological risk index of land use by combining a landscape disturbance index with a landscape fragility index. An exponential model was selected to perform kriging interpolation, as well as spatial autocorrelation and semivariance analyses which could reveal the spatial aggregation patterns. The results indicated that the ecological risk of the middle reaches of the Heihe River was generally high, and higher in the northwest. The high values of the ecological risk index (ERI) tended to decrease, and the low ERI values tended to increase. Positive spatial autocorrelations and a prominent scale-dependence were observed among the ERI values. The main hot areas with high-high local autocorrelations were located in the north, and the cold areas with low-low local autocorrelations were primarily located in the middle corridor plain and Qilian Mountains. From 1986 to 2014, low and relatively low ecological risk areas decreased while relatively high risk areas expanded. A middle level of ecological risk was observed in Ganzhou and Minle counties. Shandan County presented a serious polarization, with high ecological risk areas observed in the north and low ecological risk areas observed in the southern Shandan horse farm. In order to lower the eco-risk and achieve the sustainability of land use, these results suggest policies to strictly control the oasis expansion and the occupation of farmland for urbanization. Some inefficient farmland should transform into grassland in appropriate cases.
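Global spatial autocorrelation of the kind reported above is commonly summarized with Moran's I, positive when similar ERI values cluster and negative when they alternate. The abstract does not name its exact statistic, so Moran's I here is an assumed stand-in, and the chain-shaped weights matrix is illustrative.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for spatial autocorrelation.

    values  : 1-D array of attribute values (e.g. ERI per cell)
    weights : (n, n) spatial-weights matrix with zero diagonal
    Positive I: similar values cluster; negative I: values alternate.
    """
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()                              # deviations from the mean
    return (len(x) / w.sum()) * (z @ w @ z) / (z @ z)

# Five cells in a chain, each weighted to its immediate neighbours.
w = np.zeros((5, 5))
for i in range(4):
    w[i, i + 1] = w[i + 1, i] = 1.0

smooth = morans_i([1, 2, 3, 4, 5], w)        # clustered gradient -> I > 0
checker = morans_i([1, -1, 1, -1, 1], w)     # alternating values -> I < 0
```

Local (cell-by-cell) variants of the same quadratic form give the high-high and low-low classifications the study maps.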
Bhiwandi, P; Campbell, M; Potts, M
1994-01-01
The 1994 International Conference on Population and Development proposed increasing contraceptive couple protection from 550 million in 1995 to 880 million in 2015. The task for family planning (FP) programs is to provide access to services for, sometimes, inaccessible rural populations. FP need based on desire for no more children has ranged from under 20% in Senegal to almost 80% in Peru. Socioeconomic development was found not to be a prerequisite for fertility change. Gender inequalities in education and social autonomy must be changed. FP access is very important among women with a disadvantaged background or among women unsure about FP. Bangladesh is a good example of a country with increased contraceptive prevalence despite low income. The rule of thumb is that contraception increases of 15% contribute to a drop in family size of about one child. Program effectiveness is related to a variety of factors: contraceptive availability at many locations, acceptable price of contraception, delivery of the oral contraceptives without prescriptions, and other strategies. FP is a service not a medical treatment. A range of methods must be promoted and available from a range of facilities. Contraceptive use is dependent on the woman's stage in her lifecycle and is dependent on informed choice. Community-based distribution systems are effective, whereas free distribution by poorly-trained field workers is not always very effective because patient payment of all or part of the cost assures quality and freedom of choice. Effective programs for underprivileged groups involve aggressive, easy to manage programs that can be replicated rapidly. FP serves a useful function in depressing maternal mortality among the poor in Africa, who have no access to quality health services. Social marketing is an effective strategy for reaching remote areas. Political will and robust management are necessary commodities.
Directory of Open Access Journals (Sweden)
Lutz Bornmann
Full Text Available Using the InCites tool of Thomson Reuters, this study compares normalized citation impact values calculated for China, Japan, France, Germany, United States, and the UK throughout the time period from 1981 to 2010. InCites offers a unique opportunity to study the normalized citation impacts of countries using (i a long publication window (1981 to 2010, (ii a differentiation in (broad or more narrow subject areas, and (iii allowing for the use of statistical procedures in order to obtain an insightful investigation of national citation trends across the years. Using four broad categories, our results show significantly increasing trends in citation impact values for France, the UK, and especially Germany across the last thirty years in all areas. The citation impact of papers from China is still at a relatively low level (mostly below the world average, but the country follows an increasing trend line. The USA exhibits a stable pattern of high citation impact values across the years. With small impact differences between the publication years, the US trend is increasing in engineering and technology but decreasing in medical and health sciences as well as in agricultural sciences. Similar to the USA, Japan follows increasing as well as decreasing trends in different subject areas, but the variability across the years is small. In most of the years, papers from Japan perform below or approximately at the world average in each subject area.
International Nuclear Information System (INIS)
Shakespeare, T.P.; Mukherjee, R.K.; Gebski, V.J.
2003-01-01
Confidence levels, clinical significance curves, and risk-benefit contours are tools improving analysis of clinical studies and minimizing misinterpretation of published results, however no software has been available for their calculation. The objective was to develop software to help clinicians utilize these tools. Excel 2000 spreadsheets were designed using only built-in functions, without macros. The workbook was protected and encrypted so that users can modify only input cells. The workbook has four spreadsheets for use in studies comparing two patient groups. Sheet 1 comprises instructions and graphic examples for use. Sheet 2 allows the user to input the main study results (e.g. survival rates) into a 2-by-2 table. Confidence intervals (95%), p-value and the confidence level for Treatment A being better than Treatment B are automatically generated. An additional input cell allows the user to determine the confidence associated with a specified level of benefit. For example, if the user wishes to know the confidence that Treatment A is at least 10% better than B, 10% is entered. Sheet 2 automatically displays clinical significance curves, graphically illustrating confidence levels for all possible benefits of one treatment over the other. Sheet 3 allows input of toxicity data, and calculates the confidence that one treatment is more toxic than the other. It also determines the confidence that the relative toxicity of the most effective arm does not exceed user-defined tolerability. Sheet 4 automatically calculates risk-benefit contours, displaying the confidence associated with a specified scenario of minimum benefit and maximum risk of one treatment arm over the other. The spreadsheet is freely downloadable at www.ontumor.com/professional/statistics.htm. A simple, self-explanatory, freely available spreadsheet calculator was developed using Excel 2000. The incorporated decision-making tools can be used for data analysis and improve the reporting of results of any
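The confidence level Sheet 2 reports ("Treatment A at least X% better than B") can be approximated with a normal model for the difference of two proportions. This is a sketch under that assumed model, not the spreadsheet's exact formulas.

```python
import math

def confidence_a_better(s_a, n_a, s_b, n_b, margin=0.0):
    """Confidence (0-1) that treatment A's true success rate exceeds
    treatment B's by more than `margin`, from a 2-by-2 table.

    s_a, n_a : successes and total in arm A
    s_b, n_b : successes and total in arm B
    margin   : minimum benefit of interest (e.g. 0.10 for "at least 10%")
    Uses a normal approximation to the difference of two proportions.
    """
    p_a, p_b = s_a / n_a, s_b / n_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    z = (p_a - p_b - margin) / se
    return 0.5 * math.erfc(-z / math.sqrt(2))   # standard normal CDF at z

even = confidence_a_better(50, 100, 50, 100)    # identical arms -> 0.5
strong = confidence_a_better(80, 100, 50, 100)  # clear benefit -> near 1
```

Sweeping `margin` over a range of benefits and plotting the returned confidence reproduces the clinical significance curve the sheet draws.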
International Nuclear Information System (INIS)
Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus
2013-01-01
The aims of our study were to evaluate the effect of application of Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with the FBP group, 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P<0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93±10.22 vs. 37.63±18.79 (mean ± standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.
Extending the Reach of Statistical Software Testing
National Research Council Canada - National Science Library
Weber, Robert
2004-01-01
.... In particular, as system complexity increases, the matrices required to generate test cases and perform model analysis can grow dramatically, even exponentially, overwhelming the test generation...
Yokoyama, Shozo; Takenaka, Naomi
2005-04-01
Red-green color vision is strongly suspected to enhance the survival of its possessors. Despite being red-green color blind, however, many species have successfully competed in nature, which brings into question the evolutionary advantage of achieving red-green color vision. Here, we propose a new method of identifying positive selection at individual amino acid sites with the premise that if positive Darwinian selection has driven the evolution of the protein under consideration, then it should be found mostly at the branches in the phylogenetic tree where its function had changed. The statistical and molecular methods have been applied to 29 visual pigments with the wavelengths of maximal absorption at approximately 510-540 nm (green- or middle wavelength-sensitive [MWS] pigments) and at approximately 560 nm (red- or long wavelength-sensitive [LWS] pigments), which are sampled from a diverse range of vertebrate species. The results show that the MWS pigments are positively selected through amino acid replacements S180A, Y277F, and T285A and that the LWS pigments have been subjected to strong evolutionary conservation. The fact that these positively selected M/LWS pigments are found not only in animals with red-green color vision but also in those with red-green color blindness strongly suggests that both red-green color vision and color blindness have undergone adaptive evolution independently in different species.
Fang, Yongxiang; Wit, Ernst
2008-01-01
Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is evident that Fisher’s statistic is more sensitive to small p-values than to large ones, and a single small p-value may overrule the other p-values
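Fisher’s method itself is compact: the statistic -2 Σ ln p_i follows a chi-square distribution with 2k degrees of freedom under the null. A minimal pure-Python sketch (the closed-form survival function used below is valid for even degrees of freedom):

```python
import math

def fisher_combined(pvalues):
    """Fisher's combined probability test: X = -2 * sum(ln p_i) is
    chi-square distributed with 2k degrees of freedom under the null."""
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    # Chi-square survival function for even df = 2k (closed form)
    half = x / 2.0
    sf = math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))
    return x, sf

# One tiny p-value dominates: the combined test is highly significant
# even though the three remaining p-values are unremarkable.
stat, p = fisher_combined([1e-6, 0.5, 0.6, 0.7])
print(round(p, 6))
```

The example reproduces exactly the sensitivity the abstract criticizes: one very small p-value decides the combined result despite its unremarkable companions.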
Directory of Open Access Journals (Sweden)
Anita Lindmark
Full Text Available When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations, while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical
Lindmark, Anita; van Rompaye, Bart; Goetghebeur, Els; Glader, Eva-Lotta; Eriksson, Marie
2016-01-01
When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations, while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical confidence when
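The flagging rule described above can be sketched as a one-sided test against an inflated benchmark. This is an illustrative simplification: the paper works with case-mix adjusted risks, whereas the sketch below uses raw proportions, a normal approximation, and made-up counts:

```python
import math

def outlying(deaths, n, population_risk, rel_margin=0.25, confidence=0.95):
    """Flag a hospital as outlying if its observed risk exceeds the
    benchmark (population risk inflated by a clinically relevant
    relative margin) with the required one-sided confidence."""
    benchmark = population_risk * (1.0 + rel_margin)
    p_hat = deaths / n
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    z = (p_hat - benchmark) / se
    conf_level = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # normal CDF
    return conf_level > confidence

# Large hospital with a clear excess vs. a small hospital whose even
# larger observed excess cannot reach the required confidence.
print(outlying(150, 400, 0.25))
print(outlying(12, 30, 0.25))
```

The second call illustrates the small-hospital problem from the abstract: an observed risk of 0.40 against a benchmark of 0.3125 is not flagged, because 30 patients provide too little evidence.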
Liu, Wei; Ding, Jinhui
2018-04-01
The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.
Di Florio, Adriano
2017-10-01
In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performance with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service with a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which Wilks' theorem may or may not apply because its regularity conditions are not satisfied.
Mirosław Mrozkowiak; Hanna Żukowska
2015-01-01
Mrozkowiak Mirosław, Żukowska Hanna. Znaczenie Dobrego Krzesła, jako elementu szkolnego i domowego środowiska ucznia, w profilaktyce zaburzeń statyki postawy ciała = The significance of Good Chair as part of children’s school and home environment in the preventive treatment of body statistics distortions. Journal of Education, Health and Sport. 2015;5(7):179-215. ISSN 2391-8306. DOI 10.5281/zenodo.19832 http://ojs.ukw.edu.pl/index.php/johs/article/view/2015%3B5%287%29%3A179-215 https:...
Fang, Yongxiang; Wit, Ernst
2008-01-01
Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is evident that Fisher’s statistic is more sensitive to small p-values than to large ones, and a single small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...
U.S. Environmental Protection Agency — The Reach Address Database (RAD) stores the reach address of each Water Program feature that has been linked to the underlying surface water features (streams,...
Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu
2015-09-21
Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from the next generation sequencing technology based studies. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
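The slow permutation procedure that the approximation formulae are designed to replace can be sketched as follows. This is an illustrative simplification of local trend analysis, not the eLSA implementation: each series is reduced to trend signs, the score is the best contiguous run of co-changing trends (a maximum-subarray computation), and the data are hypothetical:

```python
import random

def trend(series):
    """Map a time series to its local trend (sign of change) series."""
    return [(a < b) - (a > b) for a, b in zip(series, series[1:])]

def local_trend_score(tx, ty):
    """Highest-scoring contiguous run of co-changing trends
    (Kadane's maximum-subarray over the elementwise products)."""
    best = cur = 0
    for prod in (a * b for a, b in zip(tx, ty)):
        cur = max(0, cur + prod)
        best = max(best, cur)
    return best

def permutation_pvalue(x, y, n_perm=2000, seed=7):
    """Permutation estimate of significance: shuffle one trend series
    and count scores at least as large as the observed one."""
    rng = random.Random(seed)
    tx, ty = trend(x), trend(y)
    obs = local_trend_score(tx, ty)
    hits = 0
    for _ in range(n_perm):
        perm = ty[:]
        rng.shuffle(perm)
        hits += local_trend_score(tx, perm) >= obs
    return (hits + 1) / (n_perm + 1)

x = [1, 2, 3, 2, 4, 5, 4, 6, 7, 6, 8, 9]
y = [2, 3, 4, 3, 5, 6, 5, 7, 8, 7, 9, 10]  # co-changes with x
print(permutation_pvalue(x, y))
```

The cost of the loop above, multiplied across all pairs of an all-versus-all comparison, is what motivates the closed-form tail-probability approximation.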
Valdes, C
1992-01-01
Guatemala's family planning (FP) programs are innovative but contraceptive use is only 23%. Total fertility is 5.3 children/woman, and the 9.5 million population will double in 23 years. The problem is poverty and illiteracy among rural residents removed from health services. 80% live in poverty and 80% are illiterate. Government effort is devoted to combating diseases such as diarrhea so there are few funds for implementing a comprehensive population policy. There is support within the national government but FP lacks priority status. APROFAM's goals are to use innovative marketing methods to inform the rural population who lack access to and knowledge about FP. Service delivery is constrained by the difficulty in reaching remote areas where 4 out of 10 indigenous Guatemalans live. Infant mortality can reach as high as 200/1000 live births. Population growth has slowed, and APROFAM plans to reach 16,000 more in the future. Promotions are conducted in several languages and aired on radio, television, and in the print media. It has been found that market research is the most effective strategy in reaching indigenous families. APROFAM has also been effective in upgrading service facilities through training, client surveys, and setting improved clinic standards. Breastfeeding, training, and voluntary sterilization programs contribute to the primary care effort. The example is given of Paulina Lebron from a very poor area who has learned how to space her children and thus improve the standard of living for her family. Eventually, she convinced herself and her family that sterilization was necessary, and now the couple enjoy the bliss of newlyweds without fear of pregnancy.
Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M
2015-03-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.
Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.
2012-12-01
Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) It explicitly adjusts the statistical significance to any bias introduced by variance reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric computer intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both of the programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is of public domain, provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/). Different examples (with simulated and
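The Lomb-Scargle periodogram at the core of both programs can be sketched in a few lines. This is the classical estimator only, on synthetic data; the smoothing-bias adjustment and permutation-based confidence levels described above are not included:

```python
import math

def lomb_scargle(t, x, freqs):
    """Classical Lomb-Scargle periodogram for an unevenly sampled
    series: per-frequency least-squares fit of a sinusoid, with the
    phase offset tau making the cosine and sine terms orthogonal."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / (n - 1)
    xc = [v - mean for v in x]
    power = []
    for f in freqs:
        w = 2.0 * math.pi * f
        tau = math.atan2(sum(math.sin(2 * w, ) if False else math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
        c = [math.cos(w * (ti - tau)) for ti in t]
        s = [math.sin(w * (ti - tau)) for ti in t]
        xc_c = sum(xv * cv for xv, cv in zip(xc, c))
        xc_s = sum(xv * sv for xv, sv in zip(xc, s))
        power.append((xc_c ** 2 / sum(cv ** 2 for cv in c) +
                      xc_s ** 2 / sum(sv ** 2 for sv in s)) / (2 * var))
    return power

# Unevenly sampled, noiseless 0.1 Hz sinusoid: the peak lands at 0.1 Hz
import random
rng = random.Random(0)
t = sorted(rng.uniform(0, 100) for _ in range(80))
x = [math.sin(2 * math.pi * 0.1 * ti) for ti in t]
freqs = [0.01 * k for k in range(1, 30)]
p = lomb_scargle(t, x, freqs)
print(freqs[p.index(max(p))])
```

The permutation test the programs add on top amounts to recomputing this periodogram many times on shuffled values of `x` and reading confidence levels off the resulting null distribution of peak heights.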
Dubois, Albertine; Hérard, Anne-Sophie; Delatour, Benoît; Hantraye, Philippe; Bonvento, Gilles; Dhenain, Marc; Delzescaux, Thierry
2010-06-01
Biomarkers and technologies similar to those used in humans are essential for the follow-up of Alzheimer's disease (AD) animal models, particularly for the clarification of mechanisms and the screening and validation of new candidate treatments. In humans, changes in brain metabolism can be detected by 2-deoxy-2-[18F]fluoro-D-glucose PET (FDG-PET) and assessed in a user-independent manner with dedicated software, such as Statistical Parametric Mapping (SPM). FDG-PET can be carried out in small animals, but its resolution is low as compared to the size of rodent brain structures. In mouse models of AD, changes in cerebral glucose utilization are usually detected by [(14)C]-2-deoxyglucose (2DG) autoradiography, but this requires prior manual outlining of regions of interest (ROI) on selected sections. Here, we evaluate the feasibility of applying the SPM method to 3D autoradiographic data sets mapping brain metabolic activity in a transgenic mouse model of AD. We report the preliminary results obtained with 4 APP/PS1 (64+/-1 weeks) and 3 PS1 (65+/-2 weeks) mice. We also describe new procedures for the acquisition and use of "blockface" photographs and provide the first demonstration of their value for the 3D reconstruction and spatial normalization of post mortem mouse brain volumes. Despite this limited sample size, our results appear to be meaningful, consistent, and more comprehensive than findings from previously published studies based on conventional ROI-based methods. The establishment of statistical significance at the voxel level, rather than with a user-defined ROI, makes it possible to detect more reliably subtle differences in geometrically complex regions, such as the hippocampus. Our approach is generic and could be easily applied to other biomarkers and extended to other species and applications. Copyright 2010 Elsevier Inc. All rights reserved.
Teratology testing under REACH.
Barton, Steve
2013-01-01
REACH guidelines may require teratology testing for new and existing chemicals. This chapter discusses procedures to assess the need for teratology testing and the conduct and interpretation of teratology tests where required.
2016-09-01
Popular culture reflects both the interests of and the issues affecting the general public. As concerns regarding climate change and its impacts grow, is it permeating into popular culture and reaching that global audience?
Duff, Margaret; Chen, Yinpeng; Cheng, Long; Liu, Sheng-Min; Blake, Paul; Wolf, Steven L; Rikakis, Thanassis
2013-05-01
Adaptive mixed reality rehabilitation (AMRR) is a novel integration of motion capture technology and high-level media computing that provides precise kinematic measurements and engaging multimodal feedback for self-assessment during a therapeutic task. We describe the first proof-of-concept study to compare outcomes of AMRR and traditional upper-extremity physical therapy. Two groups of participants with chronic stroke received either a month of AMRR therapy (n = 11) or matched dosing of traditional repetitive task therapy (n = 10). Participants were right handed, between 35 and 85 years old, and could independently reach to and at least partially grasp an object in front of them. Upper-extremity clinical scale scores and kinematic performances were measured before and after treatment. Both groups showed increased function after therapy, demonstrated by statistically significant improvements in Wolf Motor Function Test and upper-extremity Fugl-Meyer Assessment (FMA) scores, with the traditional therapy group improving significantly more on the FMA. However, only participants who received AMRR therapy showed a consistent improvement in kinematic measurements, both for the trained task of reaching to grasp a cone and the untrained task of reaching to push a lighted button. AMRR may be useful in improving both functionality and the kinematics of reaching. Further study is needed to determine if AMRR therapy induces long-term changes in movement quality that foster better functional recovery.
Maric, M.; de Haan, M.; Hogendoorn, S.M.; Wolters, L.H.; Huizenga, H.M.
2015-01-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a
Maric, Marija; de Haan, Else; Hogendoorn, Sanne M.; Wolters, Lidewij H.; Huizenga, Hilde M.
2015-01-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a
Ariyaratne, A T
1989-01-01
Embodied in the child survival revolution are ideological, methodological, and organizational innovations aimed at radical change in the condition of the world's children as rapidly as possible. In countries such as Sri Lanka, child survival and health for all by the year 2000 often seem to be impossible goals, given the tumultuous socioeconomic and political conditions. In Sri Lanka, the quality of life has been eroded, not enhanced, by the importation of Western technology and managerial capitalism and the destruction of indigenous processes. The chaos and violence that have been brought into the country have made it difficult to reach the poor children, women, and refugees in rural areas with primary health care interventions. Sri Lanka's unreachable--the decision making elites--have blocked access to the unreached--the urban and rural poor. If governments are to reach the unreached, they must remove the obstacles to a people-centered, community development process. It is the people themselves, and the institutions of their creation, that can reach the children amidst them in greatest need. To achieve this task, local communities must be provided with basic human rights, the power to make decisions that affect their lives, necessary resources, and appropriate technologies. Nongovernmental organizations can play a crucial role as bridges between the unreached and the unreachable by promoting community empowerment, aiding in the formation of networks of community organizations, and establishing linkages with government programs. If the ruling elites in developing countries can be persuaded to accommodate the needs and aspirations of those who, to date, have been excluded from the development process, the child survival revolution can be a nonviolent one.
International Nuclear Information System (INIS)
Daziano, C.
2010-01-01
Statistical analysis of trace elements in volcanic rocks allowed two independent populations with the same geochemical environment to be distinguished. For each component they have a variable index of homogeneity, resulting in dissimilar average values that reveal intratelluric geochemical phenomena. On the other hand, the inhomogeneities observed in these rocks, as reflected in their petrochemical characters, could be exacerbated by the remote and dispersed location of their outcrops, their relations with the enclosing rocks, and the ranges of compositional variation due to differences in relative ages.
Solar Hydrogen Reaching Maturity
Directory of Open Access Journals (Sweden)
Rongé Jan
2015-09-01
Full Text Available Increasingly vast research efforts are devoted to the development of materials and processes for solar hydrogen production by light-driven dissociation of water into oxygen and hydrogen. Storage of solar energy in chemical bonds resolves the issues associated with the intermittent nature of sunlight, by decoupling energy generation and consumption. This paper investigates recent advances and prospects in solar hydrogen processes that are reaching market readiness. Future energy scenarios involving solar hydrogen are proposed and a case is made for systems producing hydrogen from water vapor present in air, supported by advanced modeling.
BROOKHAVEN: Proton goal reached
International Nuclear Information System (INIS)
Anon.
1995-01-01
On March 30 the 35-year old Alternating Gradient Synchrotron (AGS) exceeded its updated design goal of 6 x 10^13 protons per pulse (ppp), by accelerating 6.3 x 10^13 ppp, a world record intensity. This goal was set 11 years ago and achieving it called for the construction of a new booster and the reconstruction of much of the AGS. The booster was completed in 1991, and reached its design intensity of 1.5 x 10^13 ppp in 1993. The AGS reconstruction was finished in 1994, and by July of that year the AGS claimed a new US record intensity for a proton synchrotron of 4 x 10^13 ppp, using four booster pulses. Reaching the design intensity was scheduled for 1995. In 1994, the AGS had seemed to be solidly limited to 4 x 10^13 ppp, but in 1995 the operations crew, working on their own in the quiet of the owl shift, steadily improved the intensity, regularly setting new records, much to the bemusement of the machine physicists. The physicists, however, did contribute. A second harmonic radiofrequency cavity in the booster increased the radiofrequency bucket area for capture, raising the booster intensity from 1.7 to 2.1 x 10^13 ppp. In the AGS, new radiofrequency power supplies raised the available voltage from 8 to 13 kV, greatly enhancing the beam loading capabilities of the system. A powerful new transverse damping system successfully controlled instabilities that otherwise would have destroyed the beam in less than a millisecond. Also in the AGS, 35th harmonic octupole resonances were found.
Yin, Ke; Dou, Xiaomin; Ren, Fei; Chan, Wei-Ping; Chang, Victor Wei-Chung
2018-02-15
Bottom ashes generated from municipal solid waste incineration have gained increasing popularity as alternative construction materials; however, they contain elevated heavy metals, posing a challenge for their free usage. Different leaching methods have been developed to quantify the leaching potential of incineration bottom ashes and to guide their environmentally friendly application. Yet IBA applications are diverse and the in situ environment is always complicated, challenging its regulation. In this study, leaching tests were conducted using batch and column leaching methods with seawater, as opposed to deionized water, to unveil the metal leaching potential of IBA subjected to a salty environment, which is commonly encountered when using IBA in land reclamation yet is not well understood. Statistical analysis for the different leaching methods suggested disparate performance between seawater and deionized water, primarily ascribed to ionic strength. The impacts of the leachant are metal-specific, depend on the leaching method, and are a function of the intrinsic characteristics of the incineration bottom ashes. Leaching performances were further compared from additional perspectives, e.g. leaching approach and liquid-to-solid ratio, indicating sophisticated leaching potentials dominated by combined geochemistry. It is necessary to develop application-oriented leaching methods with corresponding leaching criteria to preclude discrimination between different applications, e.g., terrestrial applications vs. land reclamation. Copyright © 2017 Elsevier B.V. All rights reserved.
Hsu, Tung-Shin; McPherron, R. L.
2002-11-01
An outstanding problem in magnetospheric physics is deciding whether substorms are always triggered by external changes in the interplanetary magnetic field (IMF) or solar wind plasma, or whether they sometimes occur spontaneously. Over the past decade, arguments have been made on both sides of this issue. In fact, there is considerable evidence that some substorms are triggered. However, equally persuasive examples of substorms with no obvious trigger have been found. Because of conflicting views on this subject, further work is required to determine whether there is a physical relation between IMF triggers and substorm onset. In the work reported here a list of substorm onsets was created using two independent substorm signatures: sudden changes in the slope of the AL index and the start of a Pi 2 pulsation burst. Possible IMF triggers were determined from ISEE-2 observations. With the ISEE spacecraft near local noon immediately upstream of the bow shock, there can be little question about propagation delay to the magnetopause or whether a particular IMF feature hits the subsolar magnetopause. Thus it eliminates the objections that the calculated arrival time is subject to a large error or that the solar wind monitor missed a potential trigger incident at the subsolar point. Using a less familiar technique, statistics of point process, we find that the time delay between substorm onsets and the propagated arrival time of IMF triggers are clustered around zero. We estimate for independent processes that the probability of this clustering by chance alone is about 10-11. If we take into account the requirement that the IMF must have been southward prior to the onset, then the probability of clustering is higher, ˜10-5, but still extremely unlikely. Thus it is not possible to ascribe the apparent relation between IMF northward turnings and substorm onset to coincidence.
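The point-process argument above, asking whether the onset-trigger delays cluster near zero more tightly than chance would allow, can be sketched with a simple Monte Carlo test. This is an illustrative reconstruction, not the authors' method; the delay values, the ±5-minute clustering window, and the ±60-minute null span are all hypothetical:

```python
import random

def clustering_p_value(delays, window=5.0, span=60.0, n_sim=20_000, seed=1):
    """Monte Carlo p-value for the null hypothesis that onset-trigger
    delays are uniform on [-span, span] (independent processes) rather
    than clustered near zero. Delays are in minutes."""
    observed = sum(abs(d) <= window for d in delays)
    rng = random.Random(seed)
    n = len(delays)
    hits = 0
    for _ in range(n_sim):
        sim = sum(abs(rng.uniform(-span, span)) <= window for _ in range(n))
        if sim >= observed:
            hits += 1
    return hits / n_sim

# Hypothetical delays (minutes) between substorm onset and trigger arrival:
delays = [0.5, -1.2, 2.0, 0.8, -0.3, 1.5, -2.4, 0.1, 3.0, -0.9, 1.1, 0.4]
p = clustering_p_value(delays)  # tiny p: such tight clustering is very unlikely by chance
```

With all twelve delays inside the window, the chance probability per delay is only 10/120 under this null, so the simulated p-value is essentially zero, mirroring the abstract's "extremely unlikely by coincidence" conclusion.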
Baker, Mariah; Rosenthal, L.; Gaughan, A.; Hopkins, E.
2014-01-01
Strawbridge Observatory at Haverford College is home to an undergraduate-led public observing program. Our program holds roughly monthly public events throughout the academic year that take advantage of eyepiece observing on our 16-inch and 12-inch telescopes as well as of the classroom, library, and projection system. These resources allow us to organize a variety of astronomy-related activities that are engaging for individuals of all ages: accessible student talks, current film screenings and even arts and crafts for the families who attend with young children. These events aim to spark curiosity in others about scientific discovery and about the remarkable nature of the world in which we live. In addition to exciting local families about astronomy, this program has excited Haverford students from a range of disciplines about both science and education. Being entirely student led means that we are able to take the initiative in planning, coordinating and running all events, fostering an atmosphere of collaboration, experimentation and commitment amongst our volunteers. Additionally, this program is one of the few at Haverford that regularly reaches beyond the campus walls to promote and build relationships with the outside community. In light of this, our program presents a distinctive and enlightening opportunity for student volunteers: we get to use our scientific backgrounds to educate a general audience, while also learning from them about how to communicate and inspire in others the excitement we feel about the subject of astronomy. The work on this project has been supported by NSF AST-1151462.
Lewis-Fernández, Roberto; Raggio, Greer A.; Gorritz, Magdaliz; Duan, Naihua; Marcus, Sue; Cabassa, Leopoldo J.; Humensky, Jennifer; Becker, Anne E.; Alarcón, Renato D.; Oquendo, María A.; Hansen, Helena; Like, Robert C.; Weiss, Mitchell; Desai, Prakash N.; Jacobsen, Frederick M.; Foulks, Edward F.; Primm, Annelle; Lu, Francis; Kopelowicz, Alex; Hinton, Ladson; Hinton, Devon E.
2015-01-01
Growing awareness of health and health care disparities highlights the importance of including information about race, ethnicity, and culture (REC) in health research. Reporting of REC factors in research publications, however, is notoriously imprecise and unsystematic. This article describes the development of a checklist to assess the comprehensiveness and the applicability of REC factor reporting in psychiatric research publications. The 16-item GAP-REACH© checklist was developed through a rigorous process of expert consensus, empirical content analysis in a sample of publications (N = 1205), and interrater reliability (IRR) assessment (N = 30). The items assess each section in the conventional structure of a health research article. Data from the assessment may be considered on an item-by-item basis or as a total score ranging from 0% to 100%. The final checklist has excellent IRR (κ = 0.91). The GAP-REACH may be used by multiple research stakeholders to assess the scope of REC reporting in a research article. PMID:24080673
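The interrater reliability figure (κ = 0.91) is a Cohen's kappa, which corrects raw agreement for the agreement two raters would reach by chance. A minimal sketch of the computation for two raters coding the same items; the rater data here are hypothetical, not drawn from the GAP-REACH study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning nominal codes to the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement from each rater's marginal code frequencies
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical item-level codes (1 = reported, 0 = not reported)
a = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
b = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
kappa = cohens_kappa(a, b)  # 0.8 raw agreement, 0.5 by chance -> kappa = 0.6
```

Here the raters agree on 8 of 10 items (0.8), but balanced marginals make chance agreement 0.5, giving κ = 0.6; values above roughly 0.8 are conventionally called excellent.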
2001-01-01
The creation of the world's largest sandstone cavern, not a small feat! At the bottom, cave-in-preventing steel mesh can be seen clinging to the top of the tunnel. The digging of UX-15, the cavern that will house ATLAS, reached the upper ceiling of LEP on October 10th. The breakthrough, which took place nearly 100 metres underground, occurred precisely on schedule and exactly as planned. But much caution was taken beforehand to make the LEP breakthrough clean and safe. To prevent the possibility of cave-ins in the side tunnels that will eventually be attached to the completed UX-15 cavern, reinforcing steel mesh was fixed into the walls with bolts. Obviously no people were allowed in the LEP tunnels below UX-15 as the breakthrough occurred. The area was completely evacuated and fences were put into place to keep all personnel out. However, while personnel were being kept out of the tunnels below, this has been anything but the case for the work taking place up above. With the creation of the world's largest...
Harris, Joshua D; Brand, Jefferson C; Cote, Mark P; Dhawan, Aman
2017-08-01
Within the health care environment, there has been a recent and appropriate trend towards emphasizing the value of care provision. Reduced cost and higher quality improve the value of care. Quality is a challenging, heterogeneous, variably defined concept. At the core of quality is the patient's outcome, quantified by a vast assortment of subjective and objective outcome measures. There has been a recent evolution towards evidence-based medicine in health care, clearly elucidating the role of high-quality evidence across groups of patients and studies. Synthetic studies, such as systematic reviews and meta-analyses, are at the top of the evidence-based medicine hierarchy. Thus, these investigations may be the best potential source of guidance for diagnostic, therapeutic, prognostic, and economic medical decision making. Systematic reviews critically appraise and synthesize the best available evidence to provide a conclusion statement (a "take-home point") in response to a specific answerable clinical question. A meta-analysis uses statistical methods to quantitatively combine data from single studies. Meta-analyses should be performed on methodologically homogeneous, high-quality randomized studies (Level I or II evidence) to minimize bias from confounding variables. When it is known that the literature is inadequate, or a recent systematic review has already been performed with a demonstration of insufficient data, then a new systematic review does not add anything meaningful to the literature. PROSPERO registration and the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines assist authors in the design and conduct of systematic reviews and should always be used. Complete transparency of the conduct of the review permits reproducibility and improves the fidelity of the conclusions. Pooling of data from overly dissimilar investigations should be avoided. This particularly applies to Level IV evidence, that is, noncomparative investigations
Kim, Yee Suk; Lee, Sungin; Zong, Nansu; Kahng, Jimin
2017-01-01
The present study aimed to investigate differences in prognosis based on human papillomavirus (HPV) infection, persistent infection and genotype variation in patients exhibiting atypical squamous cells of undetermined significance (ASCUS) in their initial Papanicolaou (PAP) test results. A latent Dirichlet allocation (LDA)-based tool was developed that may offer a facilitated means of communication during patient-doctor consultations. The present study assessed 491 patients (139 HPV-positive and 352 HPV-negative cases) with a PAP test result of ASCUS and a follow-up period of ≥2 years. Patients underwent PAP and HPV DNA chip tests between January 2006 and January 2009. The HPV-positive subjects were followed up with at least 2 further instances of PAP and HPV DNA chip tests. The most common genotypes observed were HPV-16 (25.9%, 36/139), HPV-52 (14.4%, 20/139), HPV-58 (13.7%, 19/139), HPV-56 (11.5%, 16/139), HPV-51 (9.4%, 13/139) and HPV-18 (8.6%, 12/139). A total of 33.3% (12/36) of patients positive for HPV-16 had cervical intraepithelial neoplasia (CIN)2 or a worse result, which was significantly higher than the CIN2 prevalence of 1.8% (8/455) in patients negative for HPV-16. Persistent infection was more frequent in patients aged ≥51 years (38.7%) than in those aged ≤50 years (20.4%; P=0.036). Progression from persistent infection to CIN2 or worse (19/34, 55.9%) was more frequent than clearance (0/105, 0.0%). The LDA analysis associated older age and a long infection period with clinical progression to CIN2 or worse. Therefore, LDA results may be presented as explanatory evidence during time-constrained patient-doctor consultations in order to deliver information regarding the patient's status. PMID:28587376
Peer Support for the Hardly Reached: A Systematic Review.
Sokol, Rebeccah; Fisher, Edwin
2016-07-01
Health disparities are aggravated when prevention and care initiatives fail to reach those they are intended to help. Groups can be classified as hardly reached according to a variety of circumstances that fall into 3 domains: individual (e.g., psychological factors), demographic (e.g., socioeconomic status), and cultural-environmental (e.g., social network). Several reports have indicated that peer support is an effective means of reaching hardly reached individuals. However, no review has explored peer support effectiveness in relation to the circumstances associated with being hardly reached or across diverse health problems. To conduct a systematic review assessing the reach and effectiveness of peer support among hardly reached individuals, as well as peer support strategies used. Three systematic searches conducted in PubMed identified studies that evaluated peer support programs among hardly reached individuals. In aggregate, the searches covered articles published from 2000 to 2015. Eligible interventions provided ongoing support for complex health behaviors, including prioritization of hardly reached populations, assistance in applying behavior change plans, and social-emotional support directed toward disease management or quality of life. Studies were excluded if they addressed temporally isolated behaviors, were limited to protocol group classes, included peer support as the dependent variable, did not include statistical tests of significance, or incorporated comparison conditions that provided appreciable social support. We abstracted data regarding the primary health topic, categorizations of hardly reached groups, program reach, outcomes, and strategies employed. We conducted a 2-sample t test to determine whether reported strategies were related to reach. Forty-seven studies met our inclusion criteria, and these studies represented each of the 3 domains of circumstances assessed (individual, demographic, and cultural-environmental). Interventions
Mexican agencies reach teenagers.
Brito Lemus, R; Beamish, J
1992-08-01
The Gente Joven project of the Mexican Foundation for Family Planning (MEXFAM) trains young volunteers in 19 cities to spread messages about sexually transmitted diseases and population growth to their peers. The volunteers also distribute condoms and spermicides, and the project uses films and printed materials to spread its messages. The project would like to influence young men's behavior and encourage them to become more responsible about pregnancy prevention, but the Latin image of machismo poses a big challenge. About 50% of adolescents have sexual intercourse, but few use contraceptives, resulting in a high adolescent pregnancy rate. Many of these pregnant teenagers choose not to marry. Adolescent pregnancy leads to girls leaving school, few marketable skills, and rearing children alone. Besides, women who began childbearing as teenagers have 1.5 times more children than other women. Male involvement in pregnancy prevention should improve these statistics. As late as 1973, the Health Code banned the promotion and sale of contraceptives, but by 1992 about 50% of women of reproductive age were using contraceptives. The Center for the Orientation of Adolescents has organized 8 Young Men's Clubs in Mexico City to involve male teenagers more in family planning and to develop self-confidence, using a holistic approach to their development through discussions with their peers. A MEXFAM study shows that young men are not close to their fathers, who tend to exude a machismo attitude; thus the young men do not have a role model for responsible sexual behavior. MEXFAM's work is cut out for it, however, since the same study indicates that 50% of the young men believe it is fine to have 1 girlfriend and 33% think women should earn more than men. A teenage volunteer reports, however, that in 1992 more boys than girls have been coming to him for contraception and information, whereas in earlier years girls outnumbered the boys.
Wilson, Robert M.
2001-01-01
Since 1750, the number of cataclysmic volcanic eruptions (volcanic explosivity index (VEI)>=4) per decade spans 2-11, with 96 percent located in the tropics and extra-tropical Northern Hemisphere. A two-point moving average of the volcanic time series has higher values since the 1860's than before, being 8.00 in the 1910's (the highest value) and 6.50 in the 1980's, the highest since the 1910's peak. Because of the usual behavior of the first difference of the two-point moving averages, one infers that its value for the 1990's will measure approximately 6.50 +/- 1, implying that approximately 7 +/- 4 cataclysmic volcanic eruptions should be expected during the present decade (2000-2009). Because cataclysmic volcanic eruptions (especially those having VEI>=5) nearly always have been associated with short-term episodes of global cooling, the occurrence of even one might confuse our ability to assess the effects of global warming. Poisson probability distributions reveal that the probability of one or more events with a VEI>=4 within the next ten years is >99 percent. It is approximately 49 percent for an event with a VEI>=5, and 18 percent for an event with a VEI>=6. Hence, the likelihood that a climatically significant volcanic eruption will occur within the next ten years appears reasonably high.
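The quoted probabilities follow from the Poisson model: the chance of at least one event in a decade is P(N ≥ 1) = 1 − exp(−λ) for a decadal rate λ. A short sketch; the decadal rates used here are assumptions back-solved from the abstract's quoted probabilities, not reported values:

```python
import math

def prob_at_least_one(rate_per_decade):
    """P(N >= 1) over one decade for a Poisson process with the given rate."""
    return 1.0 - math.exp(-rate_per_decade)

# Assumed decadal eruption rates (illustrative): ~7 for VEI>=4 per the
# abstract's forecast, ~0.67 and ~0.2 for VEI>=5 and VEI>=6 respectively.
rates = {"VEI>=4": 7.0, "VEI>=5": 0.67, "VEI>=6": 0.2}
probs = {cls: prob_at_least_one(lam) for cls, lam in rates.items()}
```

With these rates the model reproduces the abstract's figures: >99 percent for VEI>=4, roughly 49 percent for VEI>=5, and about 18 percent for VEI>=6 over ten years.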
Policy Analysis Reaches Midlife
Directory of Open Access Journals (Sweden)
Beryl A. Radin
2013-07-01
Full Text Available The field of policy analysis that exists in the 21st century is quite different from that found in earlier phases. The world of the 1960s that gave rise to this field in the US often seems unrelated to the world we experience today. These shifts have occurred as a result of a range of developments: technological changes, changes in the structure and processes of government both internally and globally, new expectations about accountability and transparency, economic and fiscal problems, and increased political and ideological conflict. It is clear that globalization has had a significant impact on the field. Shifts in the type of decision-making have also created challenges for policy analysts, since analysts are now clearly in every nook and cranny of the decision-making world. Thus it is relevant to look at the work that they do, the skills that they require, and the background experience that is relevant to them.
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Reaching ignition in the tokamak
International Nuclear Information System (INIS)
Furth, H.P.
1985-06-01
This review covers the following areas: (1) the physics of burning plasmas, (2) plasma physics requirements for reaching ignition, (3) design studies for ignition devices, and (4) prospects for an ignition project
Energy Technology Data Exchange (ETDEWEB)
Kyle, Jennifer E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Casey, Cameron P. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Stratton, Kelly G. [National Security Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zink, Erika M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Kim, Young-Mo [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zheng, Xueyun [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Monroe, Matthew E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Weitz, Karl K. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Bloodsworth, Kent J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Orton, Daniel J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Ibrahim, Yehia M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Moore, Ronald J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Lee, Christine G. [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Research Service, Portland Veterans Affairs Medical Center, Portland OR USA; Pedersen, Catherine [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Orwoll, Eric [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Smith, Richard D. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Burnum-Johnson, Kristin E. 
[Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Baker, Erin S. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA
2017-02-05
The use of dried blood spots (DBS) has many advantages over traditional plasma and serum samples, such as the smaller blood volume required, storage at room temperature, and the ability to sample in remote locations. However, understanding the robustness of different analytes in DBS samples is essential, especially in older samples collected for longitudinal studies. Here we analyzed DBS samples collected in 2000-2001 and stored at room temperature, and compared them to matched serum samples stored at -80°C, to determine if they could be effectively used as specific time points in a longitudinal study following metabolic disease. Four hundred small molecules were identified in both the serum and DBS samples using gas chromatography-mass spectrometry (GC-MS), liquid chromatography-MS (LC-MS) and LC-ion mobility spectrometry-MS (LC-IMS-MS). The identified polar metabolites overlapped well between the sample types, though only one statistically significant polar metabolite in a case-control study was conserved, indicating that degradation occurring in the DBS samples affects quantitation. Differences in the lipid identifications indicated that some oxidation occurs in the DBS samples. However, thirty-six statistically significant lipids correlated in both sample types, indicating that lipid quantitation was more stable across the sample types.
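The lipid stability claim rests on paired correlation between the two sample types, the kind of check a Pearson correlation coefficient captures. A minimal stdlib-only sketch with hypothetical paired abundances (not data from the study):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient for two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired abundances of one lipid across six subjects
serum = [10.2, 12.1, 9.8, 14.5, 11.0, 13.3]
dbs   = [ 8.9, 10.8, 8.1, 13.0,  9.7, 12.1]
r = pearson_r(serum, dbs)  # near 1: DBS tracks serum despite a constant offset
```

A high r despite a systematic offset is exactly the pattern that would let DBS quantitation stand in for serum at the level of relative, rather than absolute, abundance.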
Directory of Open Access Journals (Sweden)
Madelaine Sarria Castro
2004-05-01
Full Text Available OBJECTIVE: To characterize the use of conventional tests of statistical significance, and the current trends shown by their use, in three biomedical journals of the Spanish-speaking world. METHODS: All descriptive or explanatory original articles published in the five-year period 1996-2000 were reviewed in three journals: Revista Cubana de Medicina General Integral, Revista Panamericana de Salud Pública/Pan American Journal of Public Health, and Medicina Clínica. RESULTS: In the three journals examined, various questionable features were detected in the use of hypothesis tests based on P values, together with a scant presence of the newer approaches proposed in their place: confidence intervals (CIs) and Bayesian inference. The main findings were: minimal presence of CIs, whether as a complement to significance tests or as the sole statistical device; mention of sample size as a possible explanation of the results; predominant use of rigid alpha values; lack of uniformity in the presentation of results; and undue reference in the conclusions of the research to the results of the hypothesis tests. CONCLUSIONS: The results reflect a lack of compliance by authors and editors with the accepted norms on the use of tests of statistical significance, and suggest that the routine use of these tests continues to occupy an important place in the biomedical literature of the Spanish-speaking world.
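One of the practices the review finds lacking, reporting confidence intervals alongside or instead of P values, is straightforward to illustrate. A minimal sketch of a Wald 95% CI for a proportion; the counts are hypothetical, and exact methods such as Clopper-Pearson give slightly different limits:

```python
import math

def wald_ci_proportion(successes, n, z=1.96):
    """Approximate 95% Wald confidence interval for a proportion
    (normal approximation; a sketch, not an exact method)."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# Hypothetical study: 30 of 100 patients respond
low, high = wald_ci_proportion(30, 100)  # about (0.21, 0.39)
```

Reporting (0.21, 0.39) conveys both the estimate and its precision, which is the informational gain over a bare "P < 0.05" that the authors advocate.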
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; the statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and nonideal lattice models; imperfect-gas theory of liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of macromolecular systems; and quantum statistics
RECORDS REACHING RECORDING DATA TECHNOLOGIES
Directory of Open Access Journals (Sweden)
G. W. L. Gresik
2013-07-01
Full Text Available The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of the resulting data. This is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. Low-vibration measurement is guaranteed by a gyroscopically controlled device that has been developed for the project. The data were acquired using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light-stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.
Records Reaching Recording Data Technologies
Gresik, G. W. L.; Siebe, S.; Drewello, R.
2013-07-01
The goal of RECORDS (Reaching Recording Data Technologies) is the digital capturing of buildings and cultural heritage objects in hard-to-reach areas and the combination of the resulting data. This is achieved by using a modified crane from the film industry, which is able to carry different measuring systems. Low-vibration measurement is guaranteed by a gyroscopically controlled device that has been developed for the project. The data were acquired using digital photography, UV-fluorescence photography, infrared reflectography, infrared thermography and shearography. A terrestrial 3D laser scanner and a light-stripe topography scanner have also been used. The combination of the recorded data should ensure a complementary analysis of monuments and buildings.
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again.
Andrić Marijana Mišić; Markov Slobodanka
2017-01-01
Underrepresentation of women in leadership positions in universities is a phenomenon present in most countries of the world, with some significant differences. In our work we focused on the obstacles that women professors at Novi Sad University (Serbia) faced in reaching leadership positions. The analysis is based on qualitative research using semi-structured interviews, statistical data and selected secondary sources. Obstacles, mentorship and networking have been researched from an idiographic per...
Virtual reality training improves turning capacity and functional reach in stroke patients
International Nuclear Information System (INIS)
Malik, A.N.; Masood, T.
2017-01-01
Objective: To determine the added effects of virtual reality training on the turning capacity, gait parameters and functional reach of stroke patients compared to task-oriented training alone. Methodology: A randomized controlled trial was conducted from February 2016 to July 2016 at the Physical Rehabilitation Department, Pakistan Railway Hospital, Rawalpindi, Pakistan. Twenty stroke patients were selected through purposive sampling. The patients were randomly assigned through the sealed-envelope method into two groups: a Task-Oriented Training (TOT) group and a Virtual Reality Training (VRT) group. Standardized tools were used for assessment. TOT was provided 4 days per week for 6 weeks, while the VRT group received additional exer-gaming training during sessions. Results: Significant improvement was observed in both groups in reaching forward, turning 360°, and gait pivot turn (p < 0.01), and in the functional reach test (p < 0.001). The two groups were statistically different from each other in terms of turning capacity, reaching forward, gait pivot turn and functional reach after 6 weeks of intervention (p < 0.05). Conclusion: The addition of virtual reality training further improves the significant gains produced by task-oriented training in turning capacity, reaching forward, gait pivot turn and functional reach in stroke patients. (author)
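Between-group comparisons like the VRT-vs-TOT contrast after 6 weeks are typically made with t-tests; a distribution-free alternative that can be sketched with only the standard library is a permutation test on the difference in group means. The functional-reach gains below are hypothetical, not the trial's data:

```python
import random

def permutation_test(group_a, group_b, n_perm=10_000, seed=7):
    """Two-sided permutation test on the absolute difference in group means.
    Returns the fraction of label shuffles at least as extreme as observed."""
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    rng = random.Random(seed)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a
                   - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return extreme / n_perm

# Hypothetical functional-reach gains (cm) after 6 weeks, 10 patients per arm
vrt = [7.1, 6.8, 7.5, 8.0, 6.9, 7.7, 7.2, 7.9, 6.5, 7.4]
tot = [4.2, 5.0, 4.8, 5.5, 4.1, 5.2, 4.6, 4.9, 5.1, 4.4]
p = permutation_test(vrt, tot)  # well below 0.05 for these separated groups
```

For small trials like this one (n = 20) the permutation approach avoids the normality assumption a t-test carries, at the cost of Monte Carlo noise in the p-value.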
REACH and nanomaterials: current status
International Nuclear Information System (INIS)
Alessandrelli, Maria; Di Prospero Fanghella, Paola; Polci, Maria Letizia; Castelli, Stefano; Pettirossi, Flavio
2015-01-01
New challenges are emerging for regulators concerning the specific assessment and appropriate management of the potential risks of nanomaterials. In the framework of European legislation on chemicals, Regulation (EC) No. 1907/2006 (REACH) aims to ensure the safety of human health and the environment through the collection of information on the physico-chemical characteristics of substances and on their (eco)toxicological profile, and through the identification of appropriate risk management measures for exposure to these substances, without impeding scientific progress or the competitiveness of industry. In order to cover the current shortage of information on the safety of nanomaterials and to tackle the acknowledged legal vacuum, a rich range of activities is under way, carried out both by regulators and by stakeholders, together with discussions on proposals for adapting the European regulatory framework for chemicals. The European Commission intends to strengthen the REACH Regulation by means of updates to its annexes. The importance of responding to the regulatory requirements has highlighted the need for cooperation between European organizations, scientists and industry to promote and ensure the safe use of nanomaterials.
Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...
David, L
1996-05-01
The distant shores of Mars were reached by numerous U.S. and Russian spacecraft from the 1960s to the mid-1970s. Nearly 20 years have passed since those successful missions which orbited and landed on the Martian surface. Two Soviet probes headed for the planet in July 1988, but later failed. In August 1993, the U.S. Mars Observer suddenly went silent just three days before it was to enter orbit around the planet and was never heard from again. In late 1996, there will be renewed activity on the launch pads with three probes departing for the red planet: 1) The U.S. Mars Global Surveyor will be launched in November on a Delta II rocket and will orbit the planet for global mapping purposes; 2) Russia's Mars '96 mission, scheduled to fly in November on a Proton launcher, consists of an orbiter, two small stations which will land on the Martian surface, and two penetrators that will plow into the terrain; and finally, 3) a U.S. Discovery-class spacecraft, the Mars Pathfinder, has a December launch date atop a Delta II booster. The mission features a lander and a microrover that will travel short distances over Martian territory. These missions usher in a new phase of Mars exploration, setting the stage for an unprecedented volley of spacecraft that will orbit around, land on, drive across, and perhaps fly at low altitudes over the planet.
Metasurface holograms reaching 80% efficiency.
Zheng, Guoxing; Mühlenbernd, Holger; Kenney, Mitchell; Li, Guixin; Zentgraf, Thomas; Zhang, Shuang
2015-04-01
Surfaces covered by ultrathin plasmonic structures--so-called metasurfaces--have recently been shown to be capable of completely controlling the phase of light, representing a new paradigm for the design of innovative optical elements such as ultrathin flat lenses, directional couplers for surface plasmon polaritons and wave plate vortex beam generation. Among the various types of metasurfaces, geometric metasurfaces, which consist of an array of plasmonic nanorods with spatially varying orientations, have shown superior phase control due to the geometric nature of their phase profile. Metasurfaces have recently been used to make computer-generated holograms, but the hologram efficiency remained too low at visible wavelengths for practical purposes. Here, we report the design and realization of a geometric metasurface hologram reaching diffraction efficiencies of 80% at 825 nm and a broad bandwidth between 630 nm and 1,050 nm. The 16-level-phase computer-generated hologram demonstrated here combines the advantages of a geometric metasurface for the superior control of the phase profile and of reflectarrays for achieving high polarization conversion efficiency. Specifically, the design of the hologram integrates a ground metal plane with a geometric metasurface that enhances the conversion efficiency between the two circular polarization states, leading to high diffraction efficiency without complicating the fabrication process. Because of these advantages, our strategy could be viable for various practical holographic applications.
Olefins and chemical regulation in Europe: REACH.
Penman, Mike; Banton, Marcy; Erler, Steffen; Moore, Nigel; Semmler, Klaus
2015-11-05
REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals) is the European Union's chemical regulation for the management of risk to human health and the environment (European Chemicals Agency, 2006). This regulation entered into force in June 2007 and required manufacturers and importers to register substances produced in annual quantities of 1000 tonnes or more by December 2010, with further deadlines for lower tonnages in 2013 and 2018. Depending on the type of registration, required information included the substance's identification, the hazards of the substance, the potential exposure arising from the manufacture or import, the identified uses of the substance, and the operational conditions and risk management measures applied or recommended to downstream users. Among the content developed to support this information were Derived No-Effect Levels or Derived Minimal Effect Levels (DNELs/DMELs) for human health hazard assessment, Predicted No Effect Concentrations (PNECs) for environmental hazard assessment, and exposure scenarios for exposure and risk assessment. Once registered, substances may undergo evaluation by the European Chemicals Agency (ECHA) or Member State authorities and be subject to requests for additional information or testing as well as additional risk reduction measures. To manage the REACH registration and related activities for the European olefins and aromatics industry, the Lower Olefins and Aromatics REACH Consortium was formed in 2008 with administrative and technical support provided by Penman Consulting. A total of 135 substances are managed by this group including 26 individual chemical registrations (e.g. benzene, 1,3-butadiene) and 13 categories consisting of 5-26 substances. This presentation will describe the content of selected registrations prepared for 2010 in addition to the significant post-2010 activities. Beyond REACH, content of the registrations may also be relevant to other European activities, for
Inverse statistical approach in heartbeat time series
International Nuclear Information System (INIS)
Ebadi, H; Shirazi, A H; Mani, Ali R; Jafari, G R
2011-01-01
We present an investigation of heart cycle time series using inverse statistical analysis, a concept borrowed from the study of turbulence. Using this approach, we studied the distribution of the exit times needed to achieve a predefined level of heart rate alteration. Such analysis uncovers the most likely waiting time needed to reach a certain change in heart rate. This analysis showed a significant difference between the raw data and shuffled data when the heart rate accelerates or decelerates to a rare event. We also report that inverse statistical analysis can distinguish between the electrocardiograms taken from healthy volunteers and patients with heart failure.
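The exit-time idea described in this abstract can be sketched in a few lines. The sketch below is illustrative only: it uses a synthetic random walk rather than heart-rate recordings, and the threshold values are arbitrary choices, not the paper's.

```python
import numpy as np

def exit_times(series, rho):
    """First-passage ('exit') times: for each starting point, the number of
    steps until the series has changed by at least rho in either direction."""
    times = []
    n = len(series)
    for i in range(n - 1):
        moved = np.abs(series[i + 1:] - series[i]) >= rho
        if moved.any():
            times.append(np.argmax(moved) + 1)  # index of first crossing
    return np.array(times)

# Toy example: a Gaussian random walk standing in for a heart-rate series.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=2000))
t_small = exit_times(walk, 0.5)
t_large = exit_times(walk, 2.0)
print(t_small.mean() < t_large.mean())  # larger alterations take longer to reach
```

The distribution of `times` (e.g., its mode) gives the most likely waiting time for a given level of alteration, which is the quantity the abstract compares between raw and shuffled data.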
Statistical Symbolic Execution with Informed Sampling
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
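A minimal sketch of the purely statistical part of this approach, under made-up assumptions: the "program" below is a toy branching model (not Symbolic PathFinder's path semantics), and the probability of reaching the target event is estimated by Monte Carlo sampling with a Beta prior, i.e., Bayesian estimation of a binomial proportion.

```python
import random

def sample_path_hits_target(rng):
    """Hypothetical program model: ten independent fair branch decisions;
    the 'target event' is reached if at least 8 branches go one way."""
    return sum(rng.random() < 0.5 for _ in range(10)) >= 8

def bayesian_hit_probability(n_samples, seed=1, a=1.0, b=1.0):
    """Monte Carlo sampling with a Beta(a, b) prior; returns the posterior
    mean of the probability of reaching the target event."""
    rng = random.Random(seed)
    hits = sum(sample_path_hits_target(rng) for _ in range(n_samples))
    # Posterior is Beta(a + hits, b + misses); its mean is:
    return (a + hits) / (a + b + n_samples)

est = bayesian_hit_probability(20000)
print(round(est, 3))  # true value is P(Binomial(10, 0.5) >= 8) ~ 0.055
```

Informed sampling would additionally prune already-quantified high-probability paths exactly and renormalize, which this sketch omits.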
De Vreese, L P; Gomiero, T; Uberti, M; De Bastiani, E; Weger, E; Mantesso, U; Marangoni, A
2015-04-01
(a) A psychometric validation of an Italian version of the Alzheimer's Functional Assessment Tool scale (AFAST-I), designed for informant-based assessment of the degree of impairment and of assistance required in seven basic daily activities in adult/elderly people with intellectual disabilities (ID) and (suspected) dementia; (b) a pilot analysis of its clinical significance with traditional statistical procedures and with an artificial neural network. AFAST-I was administered to the professional caregivers of 61 adults/seniors with ID with a mean age (± SD) of 53.4 (± 7.7) years (36% with Down syndrome). Internal consistency (Cronbach's α coefficient), inter/intra-rater reliabilities (intra-class coefficients, ICC) and concurrent, convergent and discriminant validity (Pearson's r coefficients) were computed. Clinical significance was probed by analysing the relationships among AFAST-I scores and the Sum of Cognitive Scores (SCS) and the Sum of Social Scores (SOS) of the Dementia Questionnaire for Persons with Intellectual Disabilities (DMR-I) after standardisation of their raw scores in equivalent scores (ES). An adaptive artificial system (AutoContractive Maps, AutoCM) was applied to all the variables recorded in the study sample, aimed at uncovering which variable occupies a central position and supports the entire network made up of the remaining variables interconnected among themselves with different weights. AFAST-I shows a high level of internal homogeneity with a Cronbach's α coefficient of 0.92. Inter-rater and intra-rater reliabilities were also excellent with ICC correlations of 0.96 and 0.93, respectively. The results of the analyses of the different AFAST-I validities all go in the expected direction: concurrent validity (r=-0.87 with ADL); convergent validity (r=0.63 with SCS; r=0.61 with SOS); discriminant validity (r=0.21 with the frequency of occurrence of dementia-related Behavioral Excesses of the Assessment for Adults with Developmental
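The internal-consistency measure reported here, Cronbach's α, is straightforward to compute from an item-score matrix. The sketch below uses simulated scores (not the AFAST-I data); the one-factor structure of the toy data is purely an assumption for illustration.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy data: 7 'items' driven by one latent severity factor plus noise,
# mimicking the seven AFAST-I daily-activity ratings.
rng = np.random.default_rng(42)
latent = rng.normal(size=(100, 1))
scores = latent + 0.5 * rng.normal(size=(100, 7))
print(round(cronbach_alpha(scores), 2))
```

With strongly correlated items, as in the toy data, α lands in the same high range (>0.9) that the abstract reports for AFAST-I.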
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Challenges of extension workers in reaching rural women farmers in ...
African Journals Online (AJOL)
The study examined the challenges of extension workers in reaching rural women farmers in Enugu State Nigeria. Questionnaire was used to collect data from a sample size of 52 extension workers. Data were analyzed using percentage, mean statistic, chart and factor analysis. Results revealed that training and visit ...
ALMA Telescope Reaches New Heights
2009-09-01
ball at a distance of nine miles, and to keep their smooth reflecting surfaces accurate to less than the thickness of a human hair. Once the transporter reached the high plateau it carried the antenna to a concrete pad -- a docking station with connections for power and fiber optics -- and positioned it with an accuracy of a small fraction of an inch. The transporter is guided by a laser steering system and, just like some cars, also has ultrasonic collision detectors. These sensors ensure the safety of the state-of-the-art antennas as the transporter drives them across what will soon be a rather crowded plateau. Ultimately, ALMA will have at least 66 antennas distributed over about 200 pads, spread over distances of up to 11.5 miles and operating as a single, giant telescope. Even when ALMA is fully operational, the transporters will be used to move the antennas between pads to reconfigure the telescope for different kinds of observations. This first ALMA antenna at the high site will soon be joined by others, and the ALMA team looks forward to making their first observations from the Chajnantor plateau. They plan to link three antennas by early 2010, and to make the first scientific observations with ALMA in the second half of 2011. ALMA will help astronomers answer important questions about our cosmic origins. The telescope will observe the Universe using light with millimeter and submillimeter wavelengths, between infrared light and radio waves in the electromagnetic spectrum. Light at these wavelengths comes from some of the coldest, and from some of the most distant objects in the cosmos. These include cold clouds of gas and dust where new stars are being born, or remote galaxies towards the edge of the observable universe. The Universe is relatively unexplored at submillimeter wavelengths, as the telescopes need extremely dry atmospheric conditions, such as those at Chajnantor, and advanced detector technology. The Atacama Large Millimeter/submillimeter Array
Directory of Open Access Journals (Sweden)
Andrić Marijana Mišić
2017-12-01
Full Text Available Underrepresentation of women in leadership positions in universities is a phenomenon present in most countries of the world, with some significant differences. In our work we focused on obstacles that women professors at Novi Sad University (Serbia) faced in reaching leadership positions. Analysis is based on qualitative research using a semi-structured interview, statistical data and selected secondary sources. Obstacles, mentorship and networking have been researched from an idiographic perspective (reflection and the personal experience of the women at Novi Sad University). Results indicate a significant underrepresentation of women in leadership positions at Novi Sad University. Findings point to a general pattern: the more power and authority the leadership position holds, the scarcer the number of women participating in it. According to interviewees’ statements the patriarchal value system makes the leadership positions difficult to attain for women. Interview analysis also suggests additional limiting factors, such as lack of mentorship and inadequate networking, acting as inhibitors in reaching leadership positions.
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
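The central object, a Poisson process on an interval of the positive half-line with harmonic intensity c/x, can be simulated by inversion, and the sketch below illustrates the scale invariance the abstract alludes to: the mean count over [a, sa] depends only on the scale factor s. All numerical choices are arbitrary.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method for Poisson variates; adequate for moderate lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def harmonic_poisson(a, b, c, rng):
    """Sample a Poisson process on [a, b] with intensity c/x.
    The cumulative intensity is c*ln(b/a); given the count, each point
    has density proportional to 1/x, i.e., x = a*(b/a)**U for uniform U."""
    n = poisson(c * math.log(b / a), rng)
    return sorted(a * (b / a) ** rng.random() for _ in range(n))

rng = random.Random(7)
# Scale invariance: counts over [1, 2] and [100, 200] share the mean c*ln 2.
n1 = sum(len(harmonic_poisson(1, 2, 5.0, rng)) for _ in range(2000)) / 2000
n2 = sum(len(harmonic_poisson(100, 200, 5.0, rng)) for _ in range(2000)) / 2000
print(round(n1, 1), round(n2, 1))  # both near 5 * ln 2 ~ 3.47
```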
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Operational Reach: Is Current Army Doctrine Adequate?
National Research Council Canada - National Science Library
Heintzelman, Scott
2003-01-01
The term operational reach, an element of operational design, is new to U.S. Army doctrine. Operational reach is not found in the previous edition of the Army's basic operational doctrine, Field Manual...
Stream Habitat Reach Summary - NCWAP [ds158
California Natural Resource Agency — The Stream Habitat - NCWAP - Reach Summary [ds158] shapefile contains in-stream habitat survey data summarized to the stream reach level. It is a derivative of the...
Efficacy of REACH Forgiveness across cultures.
Lin, Yin; Worthington, Everett L; Griffin, Brandon J; Greer, Chelsea L; Opare-Henaku, Annabella; Lavelock, Caroline R; Hook, Joshua N; Ho, Man Yee; Muller, Holly
2014-09-01
This study investigates the efficacy of the 6-hour REACH Forgiveness intervention among culturally diverse undergraduates. Female undergraduates (N = 102), of foreign extraction (46.2%) and domestic origin (43.8%), in the United States were randomly assigned to immediate treatment or waitlist conditions. Treatment efficacy and the effect of culture on treatment response were assessed using measures of emotional and decisional forgiveness across 3 time periods. Students in the treatment condition reported greater improvement in emotional forgiveness, but not decisional forgiveness, relative to those in the waitlist condition. Gains were maintained at a 1-week follow-up. Although culture did not moderate the effect of treatment, a main effect of culture on emotional forgiveness and a marginally significant interaction effect of culture on decisional forgiveness were found. The REACH Forgiveness intervention was efficacious for college students from different cultural backgrounds when conducted in the United States. However, some evidence may warrant development of culturally adapted forgiveness interventions. © 2014 Wiley Periodicals, Inc.
Scheck, Florian
2016-01-01
Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to its general approach, thermodynamics has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. Classical thermodynamics describes predominantly averaged properties of matter, ranging from few-particle systems and states of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with the quantum theory of multi-particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
REACH: impact on the US cosmetics industry?
Pouillot, Anne; Polla, Barbara; Polla, Ada
2009-03-01
The Registration, Evaluation, Authorization and restriction of Chemicals (REACH) is a recent European regulation on chemical substances meant to protect human health and the environment. REACH imposes the "precautionary principle" where additional data and definitive action are required when uncertainty is identified. The cosmetics industry is only partially concerned by REACH: while the stages of registration and evaluation apply to cosmetics, those of authorization and restriction most likely will not, as cosmetic ingredients are already subject to regulation by various agencies and directives. REACH has potential benefits to the industry including the possibility of reassuring consumers and improving their image of chemicals and cosmetics. However, REACH also has potential disadvantages, mainly with regard to impeding innovation. The American cosmetics industry will be affected by REACH, because all US manufacturers who export substances to Europe will have to fully comply with REACH.
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
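The matrix formulation of linear least squares and the variance-covariance matrix mentioned in this abstract can be sketched as follows. The data are synthetic and the model (a straight line) is chosen only for illustration; the same machinery applies to any linear-in-parameters fit.

```python
import numpy as np

# Linear least squares in matrix form: beta = (X^T X)^{-1} X^T y,
# with parameter covariance s^2 (X^T X)^{-1}.
rng = np.random.default_rng(3)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=x.size)  # true intercept 2, slope 0.5

X = np.column_stack([np.ones_like(x), x])         # design matrix
beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
dof = x.size - 2
s2 = res[0] / dof                                 # residual variance estimate
cov = s2 * np.linalg.inv(X.T @ X)                 # variance-covariance matrix
se = np.sqrt(np.diag(cov))                        # standard errors of intercept, slope

print(beta.round(2), se.round(3))
```

The off-diagonal entry of `cov` is the parameter correlation the abstract discusses in connection with the propagation of error.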
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size but possibly much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way for the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue with the expansion of the knowledge about the cosmos, the scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, the physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approach. (Edited abstract).
Directory of Open Access Journals (Sweden)
Sreeram V Ramagopalan
2015-04-01
Full Text Available Background: We and others have shown a significant proportion of interventional trials registered on ClinicalTrials.gov have their primary outcomes altered after the listed study start and completion dates. The objectives of this study were to investigate whether changes made to primary outcomes are associated with the likelihood of reporting a statistically significant primary outcome on ClinicalTrials.gov. Methods: A cross-sectional analysis of all interventional clinical trials registered on ClinicalTrials.gov as of 20 November 2014 was performed. The main outcome was any change made to the initially listed primary outcome and the time of the change in relation to the trial start and end date. Findings: 13,238 completed interventional trials were registered with ClinicalTrials.gov that also had study results posted on the website. 2555 (19.3% had one or more statistically significant primary outcomes. Statistical analysis showed that registration year, funding source and primary outcome change after trial completion were associated with reporting a statistically significant primary outcome. Conclusions: Funding source and primary outcome change after trial completion are associated with a statistically significant primary outcome report on clinicaltrials.gov.
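The kind of association reported in this cross-sectional analysis is commonly summarised by an odds ratio with a Wald confidence interval. The counts below are invented for illustration only and are not the study's actual cross-tabulation of outcome changes against significant results.

```python
import math

# Illustrative (made-up) counts: trials cross-classified by whether the
# primary outcome was changed after trial completion and whether a
# statistically significant primary outcome was reported.
sig_changed, nonsig_changed = 150, 350
sig_unchanged, nonsig_unchanged = 2405, 10333

# Odds ratio, with a 95% Wald confidence interval computed on the log scale.
odds_ratio = (sig_changed * nonsig_unchanged) / (nonsig_changed * sig_unchanged)
se_log = math.sqrt(sum(1 / n for n in
                       (sig_changed, nonsig_changed, sig_unchanged, nonsig_unchanged)))
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log)
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```

An interval excluding 1 corresponds to the statistically significant association the abstract reports between post-completion outcome changes and significant results.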
Impact of Austrian hydropower plants on the flood control safety of the Hungarian Danube reach
International Nuclear Information System (INIS)
Zsuffa, I.
1999-01-01
Statistical analysis of daily water level data from four gauging stations along the Hungarian Danube reach has been carried out with the purpose of analysing the impact of the Austrian hydropower plants on the floods of the river. Conditional probability distribution functions of annual flood load maxima and of the annual number of floods were generated for the periods 1957-1976 and 1977-1996. By comparing these distribution functions, it could be shown that the flood load maxima have decreased, while the number of small and medium floods has increased during the past forty years. These changes indicate a decreased rate of flood superposition resulting from the barrages constructed in this period. The significantly decreased flood load maxima indicate that the Austrian barrage system has a positive impact on the flood control safety of the Hungarian Danube reach.
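A comparison of annual maxima between two periods, as described above, can be sketched with the two-sample Kolmogorov-Smirnov statistic over empirical distribution functions. The flood-load numbers below are synthetic; the downward shift in the second period is an assumption mimicking the reported effect of the barrages.

```python
import random

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the two empirical distribution functions."""
    a, b = sorted(sample_a), sorted(sample_b)
    def ecdf(s, x):
        return sum(v <= x for v in s) / len(s)
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)

# Illustrative annual flood-load maxima for two 20-year periods.
rng = random.Random(11)
period_1 = [rng.gauss(100, 10) for _ in range(20)]  # pre-barrage period
period_2 = [rng.gauss(90, 10) for _ in range(20)]   # post-barrage, shifted down
print(round(ks_statistic(period_1, period_2), 2))
```

A large statistic signals that the two periods' maxima follow different distributions, which is the kind of change the abstract attributes to the barrage system.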
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
Statistical mechanics is presented here in a self-contained, lucid manner, written with the university examination system in mind. The need to study this subject and its relation to thermodynamics are discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, together with their ranges of application and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena (thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
REACH: Evaluation Report and Executive Summary
Sibieta, Luke
2016-01-01
REACH is a targeted reading support programme designed to improve reading accuracy and comprehension in pupils with reading difficulties in Years 7 and 8. It is based on research by the Centre for Reading and Language at York and is delivered by specially trained teaching assistants (TAs). This evaluation tested two REACH interventions, one based…
Decoding natural reach-and-grasp actions from human EEG
Schwarz, Andreas; Ofner, Patrick; Pereira, Joana; Sburlea, Andreea Ioana; Müller-Putz, Gernot R.
2018-02-01
Objective. Despite the high number of degrees of freedom of the human hand, most actions of daily life can be executed incorporating only palmar, pincer and lateral grasp. In this study we attempt to discriminate these three different executed reach-and-grasp actions utilizing their EEG neural correlates. Approach. In a cue-guided experiment, 15 healthy individuals were asked to perform these actions using daily life objects. We recorded 72 trials for each reach-and-grasp condition and from a no-movement condition. Main results. Using low-frequency time domain features from 0.3 to 3 Hz, we achieved binary classification accuracies of 72.4% (STD ±5.8%) between grasp types; for grasps versus the no-movement condition, peak performances of 93.5% (STD ±4.6%) could be reached. In an offline multiclass classification scenario which incorporated not only all reach-and-grasp actions but also the no-movement condition, the highest performance could be reached using a window of 1000 ms for feature extraction. Classification performance peaked at 65.9% (STD ±8.1%). Underlying neural correlates of the reach-and-grasp actions, investigated over the primary motor cortex, showed significant differences from approximately 800 ms to 1200 ms after movement onset, which is also the time frame in which classification performance reached its maximum. Significance. We could show that it is possible to discriminate three executed reach-and-grasp actions, prominent in everyday use, from non-invasive EEG. Underlying neural correlates showed significant differences between all tested conditions. These findings will eventually contribute to our attempt to control a neuroprosthesis in a natural and intuitive way, which could ultimately benefit motor-impaired end users in their daily life actions.
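The pipeline the abstract describes (low-frequency time-domain features feeding a classifier) can be sketched on synthetic data. Everything below is an assumption for illustration: the sampling rate, channel count, filter order, decimation step, and the use of scikit-learn's LDA stand in for the authors' actual implementation:

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 256                                  # assumed sampling rate (Hz)
n_trials, n_ch, n_samp = 72, 8, fs        # 72 trials per class, 1000 ms windows

# Two synthetic classes; class 1 carries a weak 1 Hz component that
# survives the 0.3-3 Hz band-pass used for feature extraction.
X = rng.standard_normal((2 * n_trials, n_ch, n_samp))
y = np.repeat([0, 1], n_trials)
t = np.arange(n_samp) / fs
X[y == 1] += 0.5 * np.sin(2 * np.pi * 1.0 * t)

# Low-frequency time-domain features: band-pass, then decimate in time.
b, a = butter(2, [0.3 / (fs / 2), 3.0 / (fs / 2)], btype="band")
X_filt = filtfilt(b, a, X, axis=-1)
features = X_filt[:, :, ::16].reshape(len(y), -1)

acc = cross_val_score(LinearDiscriminantAnalysis(), features, y, cv=5).mean()
print(f"cross-validated accuracy = {acc:.2f}")
```

The deterministic low-frequency component makes the two classes separable well above chance, which is the effect the 0.3-3 Hz features exploit in the real study.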
Indian Academy of Sciences (India)
inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.
Statistics: a Bayesian perspective
National Research Council Canada - National Science Library
Berry, Donald A
1996-01-01
...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the scientific process, it develops ideas...
The database for reaching experiments and models.
Directory of Open Access Journals (Sweden)
Ben Walker
Reaching is one of the central experimental paradigms in the field of motor control, and many computational models of reaching have been published. While most of these models try to explain subject data (such as movement kinematics, reaching performance, forces, etc.) from only a single experiment, distinct experiments often share experimental conditions and record similar kinematics. This suggests that reaching models could be applied to (and falsified by) multiple experiments. However, using multiple datasets is difficult because experimental data formats vary widely. Standardizing data formats promises to enable scientists to test model predictions against many experiments and to compare experimental results across labs. Here we report on the development of a new resource available to scientists: a database of reaching called the Database for Reaching Experiments And Models (DREAM). DREAM collects both experimental datasets and models and facilitates their comparison by standardizing formats. The DREAM project promises to be useful for experimentalists who want to understand how their data relate to models, for modelers who want to test their theories, and for educators who want to help students better understand reaching experiments, models, and data analysis.
Toxicological comments to the discussion about REACH.
Greim, Helmut; Arand, Michael; Autrup, Herman; Bolt, Hermann M; Bridges, James; Dybing, Erik; Glomot, Rémi; Foa, Vito; Schulte-Hermann, Rolf
2006-03-01
It is the ultimate goal of the intended REACH process (Registration, Evaluation and Authorization of Chemicals) of the European Union to identify substances of hazardous properties and to evaluate the risks of human and environmental exposure. During the last few months there has been a controversial discussion as to what extent in vitro studies and consideration of structure activity relationship provide sufficient information to waive repeated exposure studies. Industry as well as certain regulatory agencies or NGOs support this approach and propose that repeated dose studies may only be required beyond 100 t/a. From a toxicological point of view it has to be stressed that this discussion primarily considers the cost reduction and protection of animals, whereas protection of human health and the environment are secondary. In vitro studies only allow identification of specific hazardous properties which can be detected by the specific test system. Moreover, appropriate information on the dose response of adverse effects, identification of thresholds and NOELs that are essential for risk characterization cannot be obtained from these studies. Consequently, identification of all relevant hazardous properties and endpoints of adverse effects can only be determined in the intact animal by repeated dose studies such as 28-day or 90-day studies. In the absence of such information the hazard identification is incomplete and there is no basis for appropriate risk assessment of human exposure. Thus, any waiving of repeated dose studies in animals bears the probability of unforeseen effects in case of acute or continuous human exposure. From this the undersigning European Toxicologists conclude: 1. The intention of REACH is to identify hazardous properties in order that a reliable risk assessment can be made and measures taken to deal with chemicals posing a significant risk. 2. The recent debate has centered on ways in which the well established in vivo methods for risk
When Does the Warmest Water Reach Greenland?
Grist, J. P.; Josey, S. A.; Boehme, L.; Meredith, M. P.; Laidre, K. L.; Heide-Jørgensen, M. P.; Kovacs, K. M.; Lydersen, C.; Davidson, F. J. M.; Stenson, G. B.; Hammill, M. O.; Marsh, R.; Coward, A.
2016-02-01
The warmest water reaching the east and west coast of Greenland is found between 200 and 600 m, in the warm Atlantic Water Layer (WL). Temperature changes within the WL have been highlighted as a possible cause of accelerated melting of tidewater glaciers and therefore are an important consideration for understanding global sea level rise. However, a limited number of winter observations of the WL have prohibited determining its seasonal variability. To address this, temperature data from Argo profiling floats, a range of sources within the World Ocean Database, and unprecedented coverage from marine-mammal borne sensors have been analyzed for the period 2002-2011. A significant seasonal range in temperature (~1-2°C) is found in the warm layer, in contrast to most of the surrounding ocean. The magnitude of the seasonal cycle is thus comparable with the 1990s warming that was associated with an increased melt rate in a marine terminating glacier of West Greenland. The phase of the seasonal cycle exhibits considerable spatial variability, with high-resolution ocean model trajectory analysis suggesting it is determined by the time taken for waters to be advected from the subduction site in the Irminger Basin. For western Greenland, the annual temperature maximum occurs near or after the turn of the calendar year. This is significant because a recent study suggested that it is in the non-summer months when fjord-shelf exchanges allow the WL to most strongly influence glacier melt rate. However this is also the time of the year when the WL is least well observed. It is therefore clear that year-round subsurface temperature measurements are still required for a complete description of the WL seasonality, and in particular to ensure that the ice-melting potential of the WL is not underestimated.
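A seasonal cycle's range and phase can be estimated by a least-squares fit of an annual harmonic to monthly means. The series below is synthetic, built only to have a ~1.5 °C range peaking near the turn of the year as the abstract describes; it is not the WL data:

```python
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(12)                     # 0 = January

# Synthetic monthly layer temperatures (deg C): mean 4.0, amplitude 0.75
# (seasonal range 1.5), maximum in January, plus small measurement noise.
temps = 4.0 + 0.75 * np.cos(2 * np.pi * months / 12)
temps += 0.05 * rng.standard_normal(12)

# Fit T(m) = a0 + a1*cos(w*m) + b1*sin(w*m) by least squares.
w = 2 * np.pi / 12
A = np.column_stack([np.ones(12), np.cos(w * months), np.sin(w * months)])
a0, a1, b1 = np.linalg.lstsq(A, temps, rcond=None)[0]

seasonal_range = 2 * np.hypot(a1, b1)          # peak-to-trough range
peak_month = (np.arctan2(b1, a1) / w) % 12     # month index of the maximum
print(f"range = {seasonal_range:.2f} deg C, peak near month {peak_month:.1f}")
```

The recovered amplitude and phase are exactly the two quantities (seasonal range and timing of the temperature maximum) the study reports for the WL.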
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; and Excise taxes and turnover taxes included in consumer prices of some energy sources
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
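Of the voting rules the abstract mentions, the Borda rule is the simplest to state: each voter ranks the alternatives, and an alternative scores one point for every alternative ranked below it on each ballot. A minimal sketch; the ballots are invented:

```python
from collections import defaultdict

def borda_winner(rankings):
    """Return (winner, scores); each ranking lists alternatives best-first."""
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for position, alt in enumerate(ranking):
            scores[alt] += n - 1 - position   # points = alternatives below
    return max(scores, key=scores.get), dict(scores)

# Three voters, three alternatives: noisy reports of underlying utilities.
ballots = [["a", "b", "c"], ["a", "c", "b"], ["b", "a", "c"]]
winner, scores = borda_winner(ballots)
print(winner, scores)   # "a" collects 2 + 2 + 1 = 5 points and wins
```

The paper's point is that with many such noisy ballots, aggregates like this track the utilitarian optimum; this sketch shows only the aggregation step itself.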
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Enhancing US Operational Reach in Southeast Asia
National Research Council Canada - National Science Library
Hitchcock, David
2003-01-01
.... While this threat continues to exist, the US Pacific Command (PACOM) must also pursue a near-term methodology to expand its operational reach and ability to respond to contingencies throughout the East Asian littoral, especially within Southeast Asia...
Reaching the Overlooked Student in Physical Education
Esslinger, Keri; Esslinger, Travis; Bagshaw, Jarad
2015-01-01
This article describes the use of live action role-playing, or "LARPing," as a non-traditional activity that has the potential to reach students who are not interested in traditional physical education.
Compact muon solenoid magnet reaches full field
2006-01-01
Scientists of the U.S. Department of Energy's Fermilab and collaborators on the US/CMS project announced that the world's largest superconducting solenoid magnet has reached full field in tests at CERN. (1 page)
Intervention for Maltreating Fathers: Statistically and Clinically Significant Change
Scott, Katreena L.; Lishak, Vicky
2012-01-01
Objective: Fathers are seldom the focus of efforts to address child maltreatment and little is currently known about the effectiveness of intervention for this population. To address this gap, we examined the efficacy of a community-based group treatment program for fathers who had abused or neglected their children or exposed their children to…
The questioned p value: clinical, practical and statistical significance
Directory of Open Access Journals (Sweden)
Rosa Jiménez-Paneque
2016-09-01
The use of the p-value and statistical significance has been in question from the early 1980s to the present day. Much has been discussed in this regard in the field of statistics and its applications, in particular in epidemiology and public health. The p-value and its equivalent, statistical significance, are moreover difficult concepts to grasp for the many health professionals involved in some way in research applied to their areas of work. Nevertheless, their meaning should be clear in intuitive terms even though they are based on theoretical concepts from the field of mathematical statistics. This article attempts to present the p-value as a concept that applies to everyday life and is therefore intuitively simple, but whose proper use cannot be separated from theoretical and methodological elements of intrinsic complexity. The reasons behind the criticisms of the p-value and of its use in isolation are also explained intuitively, chiefly the need to distinguish statistical significance from clinical significance, and some of the remedies proposed for these problems are mentioned. The article closes by alluding to the current tendency to vindicate the p-value's use, appealing to its convenience in certain situations, and to the recent statement of the American Statistical Association on the matter.
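The distinction between statistical and clinical significance that the article stresses is easy to demonstrate by simulation: with a large enough sample, a clinically negligible effect yields a tiny p-value. A sketch with simulated data (the effect size and scales are arbitrary assumptions):

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)

def trial_pvalue(n, true_effect=0.1):
    """Two-arm trial with a tiny (clinically negligible) true effect."""
    control = rng.normal(0.0, 1.0, n)
    treated = rng.normal(true_effect, 1.0, n)
    return ttest_ind(treated, control).pvalue

p_small = trial_pvalue(50)        # small trial: effect usually undetected
p_large = trial_pvalue(100_000)   # huge trial: same effect, p << 0.05
print(f"n=50: p = {p_small:.3f};  n=100000: p = {p_large:.1e}")
```

The effect is identical in both runs; only the sample size changes, which is exactly why a p-value alone cannot certify that a difference matters clinically.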
Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance
Kramer, Karen L.; Veile, Amanda; Otárola-Castillo, Erik
2016-01-01
Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger s...
Gas revenue increasingly significant
International Nuclear Information System (INIS)
Megill, R.E.
1991-01-01
This paper briefly describes the wellhead prices of natural gas compared to crude oil over the past 70 years. Although natural gas prices have never reached price parity with crude oil, the relative value of a gas BTU has been increasing. It is one of the reasons that the total amount of money coming from natural gas wells is becoming more significant. From 1920 to 1955 the revenue at the wellhead for natural gas was only about 10% of the money received by producers. Most of the money needed for exploration, development, and production came from crude oil. At present, however, over 40% of the money from the upstream portion of the petroleum industry is from natural gas. As a result, in a few short years natural gas may become 50% of the money revenues generated from wellhead production facilities
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production
Do working environment interventions reach shift workers?
Nabe-Nielsen, Kirsten; Jørgensen, Marie Birk; Garde, Anne Helene; Clausen, Thomas
2016-01-01
Shift workers are exposed to more physical and psychosocial stressors in the working environment as compared to day workers. Despite the need for targeted prevention, it is likely that workplace interventions less frequently reach shift workers. The aim was therefore to investigate whether the reach of workplace interventions varied between shift workers and day workers and whether such differences could be explained by the quality of leadership exhibited at different times of the day. We used questionnaire data from 5361 female care workers in the Danish eldercare sector. The questions concerned usual working hours, quality of leadership, and self-reported implementation of workplace activities aimed at stress reduction, reorganization of the working hours, and participation in improvements of working procedures or qualifications. Compared with day workers, shift workers were less likely to be reached by workplace interventions. For example, night workers less frequently reported that they had got more flexibility (OR 0.5; 95 % CI 0.3-0.7) or that they had participated in improvements of the working procedures (OR 0.6; 95 % CI 0.5-0.8). Quality of leadership to some extent explained the lack of reach of interventions especially among fixed evening workers. In the light of the evidence of shift workers' stressful working conditions, we suggest that future studies focus on the generalizability of results of the present study and on how to reach this group and meet their needs when designing and implementing workplace interventions.
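The odds ratios the abstract reports (e.g., OR 0.5; 95 % CI 0.3-0.7) can be reproduced from a 2×2 table with the standard log-OR normal approximation. The counts below are hypothetical, chosen only so that the point estimate comes out at 0.5:

```python
import math

# Hypothetical counts (NOT the study's data):
# rows = night workers / day workers, cols = reached / not reached
a, b = 40, 160     # night workers: reached, not reached
c, d = 300, 600    # day workers:   reached, not reached

odds_ratio = (a * d) / (b * c)
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
lower = math.exp(math.log(odds_ratio) - 1.96 * se)
upper = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.2f}, 95% CI [{lower:.2f}, {upper:.2f}]")
```

An OR below 1 with a CI excluding 1, as here, is the pattern behind the paper's conclusion that night workers were less likely to be reached.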
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
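Several of the test-efficacy measures the review covers follow directly from a confusion matrix. A minimal sketch with invented counts (not taken from the review):

```python
# Hypothetical confusion matrix for a diagnostic test:
tp, fn = 90, 10    # diseased patients:  test positive / test negative
fp, tn = 30, 170   # healthy patients:   test positive / test negative

sensitivity = tp / (tp + fn)              # P(test+ | disease)
specificity = tn / (tn + fp)              # P(test- | no disease)
accuracy = (tp + tn) / (tp + fn + fp + tn)
lr_pos = sensitivity / (1 - specificity)  # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity  # negative likelihood ratio

print(f"sens={sensitivity:.2f} spec={specificity:.2f} "
      f"acc={accuracy:.2f} LR+={lr_pos:.2f} LR-={lr_neg:.2f}")
```

Sweeping a decision threshold and recording (1 - specificity, sensitivity) pairs is what produces the ROC curve the review also discusses.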
Directory of Open Access Journals (Sweden)
Kristine Lykens
BACKGROUND: In 2009 an estimated 5.3 million people in the United States were afflicted with Alzheimer's disease, a degenerative form of dementia. The impact of this disease is not limited to the patient but also has significant impact on the lives and health of their family caregivers. The Resources for Enhancing Alzheimer's Caregiver Health (REACH II) program was developed and tested in clinical studies. The REACH II program is now being delivered by community agencies in several locations. This study examines the impact of the REACH II program on caregiver lives and health in a city in north Texas. STUDY DESIGN: Family caregivers of Alzheimer's patients were assessed using an instrument covering the multi-item domains of Caregiver Burden, Depression, Self-Care, and Social Support upon enrollment in the program and at the completion of the 6-month intervention. The domain scores were analyzed using a multivariate paired t-test and Bonferroni confidence interval for the differences in pre- and post-service domain scores. RESULTS: A total of 494 families were enrolled in the program during the period January 1, 2011 through June 30, 2012. Of these families, 177 completed the 6-month program and have pre- and post-service domain scores. The median age for the caregivers was 62 years. The domain scores for Depression and Caregiver Burden demonstrated statistically significant improvements upon program completion. CONCLUSION: The REACH II intervention was successfully implemented by a community agency with comparable impacts to those of the clinical trial, warranting wider-scale implementation.
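The pre/post domain-score comparison can be sketched with a paired t-test on simulated scores (a univariate analogue of the multivariate test the study used). The effect size, spread, and scale below are invented; only the sample size of 177 completers comes from the text:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(3)
n = 177                                   # caregivers completing the program

pre = rng.normal(20.0, 5.0, n)            # hypothetical depression scores
post = pre - rng.normal(2.0, 3.0, n)      # modest average improvement

t, p = ttest_rel(pre, post)
print(f"paired t = {t:.2f}, p = {p:.2e}, mean change = {(pre - post).mean():.2f}")
```

Pairing each caregiver's pre score with their own post score removes between-person variation, which is why the paired design can detect a modest average change.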
Positive effects of robotic exoskeleton training of upper limb reaching movements after stroke
Directory of Open Access Journals (Sweden)
Frisoli Antonio
2012-06-01
This study, conducted in a group of nine chronic patients with right-side hemiparesis after stroke, investigated the effects of robot-assisted rehabilitation training with an upper limb robotic exoskeleton for the restoration of motor function in spatial reaching movements. The robot-assisted rehabilitation training was administered for a period of 6 weeks and included reaching and spatial antigravity movements. To assess the carry-over of the observed improvements in movement during training into improved function, a kinesiologic assessment of the effects of the training was performed by means of motion and dynamic electromyographic analysis of reaching movements performed before and after training. The same kinesiologic measurements were performed in a healthy control group of seven volunteers, to determine a benchmark for the experimental observations in the patients’ group. Moreover, the degree of functional impairment at enrolment and discharge was measured by clinical evaluation with the upper limb Fugl-Meyer Assessment scale (FMA, 0–66 points), the Modified Ashworth scale (MA, 0–60 pts) and active ranges of motion. The robot-aided training induced, independently of time since stroke, statistically significant improvements in kinesiologic (movement time, smoothness of motion) and clinical (4.6 ± 4.2 increase in FMA, 3.2 ± 2.1 decrease in MA) parameters, as a result of the increased active ranges of motion and improved co-contraction index for shoulder extension/flexion. Kinesiologic parameters correlated significantly with clinical assessment values, and their changes after the training were affected by the direction of motion (inward vs. outward movement) and the position of the target to be reached (ipsilateral, central and contralateral peripersonal space). These changes can be explained as a result of the motor recovery induced by the robotic training, in terms of regained ability to execute single joint movements and of improved…
Guiding Warfare to Reach Sustainable Peace
DEFF Research Database (Denmark)
Vestenskov, David; Drewes, Line
The conference report Guiding Warfare to Reach Sustainable Peace constitutes the primary outcome of the conference. It is based on excerpts from the conference presenters and workshop discussions. Furthermore, the report contains policy recommendations and key findings, with the ambition of developing best practices in the education and implementation of IHL in capacity building of security forces.
Do working environment interventions reach shift workers?
DEFF Research Database (Denmark)
Nabe-Nielsen, Kirsten; Jørgensen, Marie Birk; Garde, Anne Helene
2016-01-01
PURPOSE: Shift workers are exposed to more physical and psychosocial stressors in the working environment as compared to day workers. Despite the need for targeted prevention, it is likely that workplace interventions reach shift workers less frequently. The aim was therefore to investigate whether the reach of workplace interventions varied between shift workers and day workers, and whether such differences could be explained by the quality of leadership exhibited at different times of the day. METHODS: We used questionnaire data from 5361 female care workers in the Danish eldercare sector…
Telerobotic operation of structurally flexible, long-reach manipulators
International Nuclear Information System (INIS)
Kwon, D.S.; Hwang, D.H.; Babcock, S.M.
1994-01-01
As a part of the Department of Energy's Environmental Restoration and Waste Management Program, long-reach manipulators are being considered for the retrieval of waste from large storage tanks. Long-reach manipulators may have characteristics significantly different from those of typical industrial robots because of the flexibility of the long links needed to cover the large workspace. To avoid structural vibrations during operation, control algorithms employing various types of shaping filters were investigated. A new approach that uses embedded simulation was developed and compared with others. In the new approach, generation of joint trajectories considering link flexibility was also investigated.
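As a flavor of the shaping filters mentioned above, the classic zero-vibration (ZV) input shaper splits each motion command into two impulses timed against the dominant vibration mode of the flexible link. The sketch below is a generic textbook formulation, not the report's algorithm, and the mode parameters are illustrative:

```python
# Hedged sketch of a zero-vibration (ZV) input shaper for a flexible link.
# The frequency and damping ratio below are invented for illustration.
import math

def zv_shaper(freq_hz, damping_ratio):
    """Return (amplitudes, times) of the two-impulse ZV shaper."""
    # damped natural frequency of the vibration mode
    wd = 2 * math.pi * freq_hz * math.sqrt(1 - damping_ratio ** 2)
    k = math.exp(-damping_ratio * math.pi / math.sqrt(1 - damping_ratio ** 2))
    amps = [1 / (1 + k), k / (1 + k)]   # impulse amplitudes, summing to 1
    times = [0.0, math.pi / wd]          # second impulse at half the damped period
    return amps, times

amps, times = zv_shaper(freq_hz=0.8, damping_ratio=0.02)
# convolving the reference command with these impulses suppresses the
# residual vibration of the mode while preserving the final setpoint
```

Because the amplitudes sum to one, the shaped trajectory reaches the same final position as the unshaped command, only slightly delayed.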
Directory of Open Access Journals (Sweden)
Teresa Paolucci
2016-01-01
Background. The position sense of the shoulder joint is important during reaching. Objective. To examine the existence of additional competence of the shoulder with regard to the ability to measure extracorporeal space, through a novel approach, using the shoulder proprioceptive rehabilitation tool (SPRT), during reaching. Design. Observational case-control study. Methods. We examined 50 subjects: 25 healthy and 25 with impingement syndrome, with mean ages [years] of 64.52 ± 6.98 and 68.36 ± 6.54, respectively. Two parameters were evaluated using the SPRT: the integration of visual information and the proprioceptive afferents of the shoulder (Test 1) and the discriminative proprioceptive capacity of the shoulder, with the subject blindfolded (Test 2). These tasks assessed the spatial error (in centimeters) made by the shoulder joint in reaching movements on the sagittal plane. Results. The shoulder had proprioceptive features that allowed it to memorize a reaching position and reproduce it (error of 1.22 cm to 1.55 cm in healthy subjects). This ability was lower in the impingement group, with a statistically significant difference compared to the healthy group (p<0.05, Mann–Whitney test). Conclusions. The shoulder has specific expertise in the measurement of the extracorporeal space during reaching movements that gradually decreases in impingement syndrome.
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
Reaching Reluctant Students: Insights from Torey Hayden.
Marlowe, Mike
1999-01-01
Illustrates principles of reaching students who fight or avoid adults by using examples drawn from the writings of Torey Hayden. Presents ten concepts that can serve as guidelines for building relationships with resistant children, and gives excerpts from Hayden's works to illustrate each concept. Demonstrates how books provide teachers with…
ATLAS Barrel Toroid magnet reached nominal field
2006-01-01
On 9 November the barrel toroid magnet reached its nominal field of 4 teslas, with an electrical current of 21,000 amperes (21 kA) passing through the eight superconducting coils, as shown on this graph.
THE LONG REACH OF EDUCATION: EARLY RETIREMENT.
Venti, Steven; Wise, David A
2015-12-01
The goal of this paper is to draw attention to the long lasting effect of education on economic outcomes. We use the relationship between education and two routes to early retirement - the receipt of Social Security Disability Insurance (DI) and the early claiming of Social Security retirement benefits - to illustrate the long-lasting influence of education. We find that for both men and women with less than a high school degree the median DI participation rate is 6.6 times the participation rate for those with a college degree or more. Similarly, men and women with less than a high school education are over 25 percentage points more likely to claim Social Security benefits early than those with a college degree or more. We focus on four critical "pathways" through which education may indirectly influence early retirement - health, employment, earnings, and the accumulation of assets. We find that for women health is the dominant pathway through which education influences DI participation. For men, the health, earnings, and wealth pathways are of roughly equal magnitude. For both men and women the principal channel through which education influences early Social Security claiming decisions is the earnings pathway. We also consider the direct effect of education that does not operate through these pathways. The direct effect of education is much greater for early claiming of Social Security benefits than for DI participation, accounting for 72 percent of the effect of education for men and 67 percent for women. For women the direct effect of education on DI participation is not statistically significant, suggesting that the total effect may be through the four pathways.
Müller-Kirsten, Harald J W
2013-01-01
Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
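The different ways of counting arrangements in the three statistics, which the book contrasts, can be summarized by the standard occupancy formulas (textbook-standard results, not transcribed from this book) for $n_i$ elements distributed over $g_i$ degenerate states at level $i$:

```latex
W_{\mathrm{MB}} = N! \prod_i \frac{g_i^{\,n_i}}{n_i!}, \qquad
W_{\mathrm{BE}} = \prod_i \frac{(n_i + g_i - 1)!}{n_i!\,(g_i - 1)!}, \qquad
W_{\mathrm{FD}} = \prod_i \frac{g_i!}{n_i!\,(g_i - n_i)!}
```

Maximizing $W$ subject to fixed particle number and energy yields the Maxwell-Boltzmann, Bose-Einstein, and Fermi-Dirac distributions, respectively.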
Counihan, Timothy D; Waite, Ian R; Nilsen, Elena B; Hardiman, Jill M; Elias, Edwin; Gelfenbaum, Guy; Zaugg, Steven D
2014-06-15
While previous studies have documented contaminants in fish, sediments, water, and wildlife, few specifics are known about the spatial distribution of contaminants in the Columbia River Estuary (CRE). Our study goal was to characterize sediment contaminant detections and concentrations in reaches of the CRE that were concurrently being sampled to assess contaminants in water, invertebrates, fish, and osprey (Pandion haliaetus) eggs. Our objectives were to develop a survey design based on sedimentation characteristics and then assess whether sediment grain size, total organic carbon (TOC), and contaminant concentrations and detections varied between areas with different sedimentation characteristics. We used a sediment transport model to predict sedimentation characteristics of three 16 km river reaches in the CRE. We then compartmentalized the modeled change in bed mass after a two-week simulation to define sampling strata with depositional, stable, or erosional conditions. We collected and analyzed bottom sediments to assess whether substrate composition, organic matter composition, and contaminant concentrations and detections varied among strata within and between the reaches. We observed differences in grain size fractions between strata within and between reaches. We found that the fine sediment fraction was positively correlated with TOC. Contaminant concentrations were statistically different between depositional vs. erosional strata for the industrial compounds, personal care products and polycyclic aromatic hydrocarbons class (Indus-PCP-PAH). We also observed significant differences between strata in the number of detections of Indus-PCP-PAH (depositional vs. erosional; stable vs. erosional) and for the flame retardants, polychlorinated biphenyls, and pesticides class (depositional vs. erosional, depositional vs. stable). When we estimated mean contaminant concentrations by reach, we observed higher contaminant concentrations in the furthest downstream…
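The stratification step described above, compartmentalizing modeled change in bed mass into depositional, stable, or erosional strata, can be sketched as a simple classifier. The threshold and cell values below are hypothetical, not taken from the study:

```python
# Hypothetical sketch: label model cells by simulated change in bed mass
# (positive = net deposition, negative = net erosion). The threshold that
# separates "stable" from the other strata is illustrative.

def classify_strata(delta_bed_mass, threshold=0.5):
    """Return a stratum label for each cell's bed-mass change (kg/m^2)."""
    strata = []
    for delta in delta_bed_mass:
        if delta > threshold:
            strata.append("depositional")
        elif delta < -threshold:
            strata.append("erosional")
        else:
            strata.append("stable")
    return strata

cells = [2.1, 0.1, -1.4, 0.4, -0.6]
print(classify_strata(cells))
```

Sediment sampling sites can then be allocated within each stratum, which is the survey-design idea the abstract describes.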
Counihan, Timothy D.; Waite, Ian R.; Nilsen, Elena B.; Hardiman, Jill M.; Elias, Edwin; Gelfenbaum, Guy; Zaugg, Steven D.
2014-01-01
While previous studies have documented contaminants in fish, sediments, water, and wildlife, few specifics are known about the spatial distribution of contaminants in the Columbia River Estuary (CRE). Our study goal was to characterize sediment contaminant detections and concentrations in reaches of the CRE that were concurrently being sampled to assess contaminants in water, invertebrates, fish, and osprey (Pandion haliaetus) eggs. Our objectives were to develop a survey design based on sedimentation characteristics and then assess whether sediment grain size, total organic carbon (TOC), and contaminant concentrations and detections varied between areas with different sedimentation characteristics. We used a sediment transport model to predict sedimentation characteristics of three 16 km river reaches in the CRE. We then compartmentalized the modeled change in bed mass after a two week simulation to define sampling strata with depositional, stable, or erosional conditions. We collected and analyzed bottom sediments to assess whether substrate composition, organic matter composition, and contaminant concentrations and detections varied among strata within and between the reaches. We observed differences in grain size fractions between strata within and between reaches. We found that the fine sediment fraction was positively correlated with TOC. Contaminant concentrations were statistically different between depositional vs. erosional strata for the industrial compounds, personal care products and polycyclic aromatic hydrocarbons class (Indus–PCP–PAH). We also observed significant differences between strata in the number of detections of Indus–PCP–PAH (depositional vs. erosional; stable vs. erosional) and for the flame retardants, polychlorinated biphenyls, and pesticides class (depositional vs. erosional, depositional vs. stable). When we estimated mean contaminant concentrations by reach, we observed higher contaminant concentrations in the furthest
Directory of Open Access Journals (Sweden)
Joachim I. Krueger
2018-04-01
Full Text Available The practice of Significance Testing (ST remains widespread in psychological science despite continual criticism of its flaws and abuses. Using simulation experiments, we address four concerns about ST and for two of these we compare ST’s performance with prominent alternatives. We find the following: First, the 'p' values delivered by ST predict the posterior probability of the tested hypothesis well under many research conditions. Second, low 'p' values support inductive inferences because they are most likely to occur when the tested hypothesis is false. Third, 'p' values track likelihood ratios without raising the uncertainties of relative inference. Fourth, 'p' values predict the replicability of research findings better than confidence intervals do. Given these results, we conclude that 'p' values may be used judiciously as a heuristic tool for inductive inference. Yet, 'p' values cannot bear the full burden of inference. We encourage researchers to be flexible in their selection and use of statistical methods.
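The kind of simulation experiment the abstract describes can be sketched in a few lines. Here a one-sample z-test stands in for the authors' setups, and all parameters (effect size, sample size, repetition count) are illustrative assumptions, not values from the article:

```python
# Monte Carlo sketch of the claim that low p values occur mostly when the
# tested hypothesis is false. A one-sample z-test on a known-variance
# normal sample is used as a stand-in; parameters are illustrative.
import math
import random

def z_test_p(sample, mu0=0.0, sigma=1.0):
    """Two-sided p value of a one-sample z-test against mean mu0."""
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    # standard normal survival function via erf
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def rejection_rate(true_mu, reps=2000, n=25, alpha=0.05, seed=1):
    """Fraction of simulated studies with p < alpha when data ~ N(true_mu, 1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        sample = [rng.gauss(true_mu, 1.0) for _ in range(n)]
        if z_test_p(sample) < alpha:
            hits += 1
    return hits / reps

print(rejection_rate(0.0))   # close to alpha when the null is true
print(rejection_rate(0.6))   # much larger when the null is false
```

Because small p values are far more frequent when the tested hypothesis is false, observing one carries inductive weight, which is the heuristic use of p values the authors defend.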
Christopoulos, Vassilios N; Bonaiuto, James; Kagan, Igor; Andersen, Richard A
2015-08-19
The posterior parietal cortex (PPC) has traditionally been considered important for awareness, spatial perception, and attention. However, recent findings provide evidence that the PPC also encodes information important for making decisions. These findings have initiated a running argument about whether the PPC is critically involved in decision making. To examine this issue, we reversibly inactivated the parietal reach region (PRR), the area of the PPC that is specialized for reaching movements, while two monkeys performed a memory-guided reaching or saccade task. The task included choices between two equally rewarded targets presented simultaneously in opposite visual fields. Free-choice trials were interleaved with instructed trials, in which a single cue presented in the peripheral visual field defined the reach and saccade target unequivocally. We found that PRR inactivation led to a strong reduction of contralesional choices, but only for reaches. On the other hand, saccade choices were not affected by PRR inactivation. Importantly, reaching and saccade movements to single instructed targets remained largely intact. These results cannot be explained as an effector-nonspecific deficit in spatial attention or awareness, since the temporary "lesion" had an impact only on reach choices. Hence, the PRR is part of a network for reach decisions and not just reach planning. There has been an ongoing debate on whether the posterior parietal cortex (PPC) represents only spatial awareness, perception, and attention or whether it is also involved in decision making for actions. In this study we explore whether the parietal reach region (PRR), the region of the PPC that is specialized for reaches, is involved in the decision process. We inactivated the PRR while two monkeys performed reach and saccade choices between two targets presented simultaneously in both hemifields. We found that inactivation affected only the reach choices, while leaving saccade choices intact.
Improving exposure scenario definitions within REACH
DEFF Research Database (Denmark)
Lee, Jihyun; Pizzol, Massimo; Thomsen, Marianne
In recent years, the paradigm of chemical management systems has changed from being toxicity oriented and media based to being risk oriented and receptor based. This trend is evident not only regarding environmental quality standards, but also for industrial chemical regulations. Political instruments to support a precautionary chemicals management system and to protect receptors' health have also been increasing. Since 2007, the European Union has adopted REACH (the Regulation on Registration, Evaluation, Authorisation and Restriction of Chemicals): REACH makes industry responsible for assessing and managing the risks posed by industrial chemicals and providing appropriate safety information to their users (EC, 2007). However, to ensure a high level of protection of human health and the environment, there is a need to consider 'aggregate exposure', including background exposures from the environment, which…
Does workplace health promotion reach shift workers?
DEFF Research Database (Denmark)
Nabe-Nielsen, Kirsten; Garde, Anne Helene; Clausen, Thomas
2015-01-01
OBJECTIVES: One reason for health disparities between shift and day workers may be that workplace health promotion does not reach shift workers to the same extent as it reaches day workers. This study aimed to investigate the association between shift work and the availability of and participation in workplace health promotion. METHODS: We used cross-sectional questionnaire data from a large representative sample of all employed people in Denmark. We obtained information on the availability of and participation in six types of workplace health promotion. We also obtained information on working hours (…). RESULTS: In the general working population, fixed evening and fixed night workers, and employees working variable shifts including night work, reported a higher availability of health promotion, while employees working variable shifts without night work reported a lower availability of health promotion…
Performance reach in the LHC for 2012
International Nuclear Information System (INIS)
Arduini, G.
2012-01-01
Based on the 2011 experience and Machine Development study results, the performance reach of the LHC with 25 and 50 ns beams will be addressed for operation at 3.5 and 4 TeV. The possible scrubbing scenarios and potential intensity limitations resulting from vacuum and heating will be taken into account wherever possible. The paper mainly covers the performance of the two high luminosity regions in IR1 and IR5. (author)
Can a significance test be genuinely Bayesian?
Pereira, Carlos A. de B.; Stern, Julio Michael; Wechsler, Sergio
2008-01-01
The Full Bayesian Significance Test, FBST, is extensively reviewed. Its test statistic, a genuine Bayesian measure of evidence, is discussed in detail. Its behavior in some problems of statistical inference like testing for independence in contingency tables is discussed.
Whither Statistics Education Research?
Watson, Jane
2016-01-01
This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…
Riparian Vegetation Mapping Along the Hanford Reach
International Nuclear Information System (INIS)
FOGWELL, T.W.
2003-01-01
During the biological survey and inventory of the Hanford Site conducted in the mid-1990s (1995 and 1996), preliminary surveys of the riparian vegetation were conducted along the Hanford Reach. These preliminary data were reported to The Nature Conservancy (TNC), but were not included in any TNC reports to DOE or stakeholders. During the latter part of FY2001, PNNL contracted with SEE Botanical, the parties that performed the original surveys in the mid-1990s, to complete the data summaries and mapping associated with the earlier survey data. Those data sets were delivered to PNNL and the riparian mapping by vegetation type for the Hanford Reach is being digitized during the first quarter of FY2002. These mapping efforts provide the information necessary to create subsequent spatial data layers to describe the riparian zone according to plant functional types (trees, shrubs, grasses, sedges, forbs). Quantification of the riparian zone by vegetation types is important to a number of DOE's priority issues, including modeling contaminant transport and uptake in the near-riverine environment and the determination of ecological risk. This work included the identification of vegetative zones along the Reach by changes in dominant plant species covering the shoreline from just north of the 300 Area to China Bar near Vernita. Dominant and indicator species included Agropyron dasystachyum/A. smithii, Apocynum cannabinum, Aristida longiseta, Artemisia campestris ssp. borealis var. scouleriana, Artemisia dracunculus, Artemisia lindleyana, Artemisia tridentata, Bromus tectorum, Chrysothamnus nauseosus, Coreopsis atkinsoniana, Eleocharis palustris, Elymus cinereus, Equisetum hyemale, Eriogonum compositum, Juniperus trichocarpa, Phalaris arundinacea, Poa compressa, Salix exigua, Scirpus acutus, Solidago occidentalis, Sporobolus asper, and Sporobolus cryptandrus. This letter report documents the data received, the processing by PNNL staff, and additional data gathered in FY2002.
Long-reach manipulators for decommissioning
International Nuclear Information System (INIS)
Webster, D.A.; Challinor, S.F.
1993-01-01
A survey of redundant facilities at Sellafield has identified that in many cases the conventional means of deploying remote handling equipment are not appropriate and that novel means must be employed. However, decommissioning is not a value-adding activity, so expensive one-off designs must be avoided. The paper will describe BNFL's approach to the synthesis, from proprietary parts, of a manipulator which can lift 3 te at a horizontal reach of over 5 metres and yet can still perform the dextrous manipulation necessary to remove small items. It will also cover the development of the manipulator control systems and the adaptation of commercial hand tools to be manipulator friendly. (author)
Luminosity performance reach after LS1
International Nuclear Information System (INIS)
Herr, W.
2012-01-01
Based on past experience (2010/2011), in particular expected limitations from beam-beam effects, and taking into account the expected beam quality from the LHC injectors, the peak and integrated luminosity at top energy is discussed for different scenarios (e.g. bunch spacing, beta*). In particular, it will be shown which are the key parameters for reaching the nominal luminosity, and that peak luminosities two times larger than nominal (or higher) are possible. Possible tests in 2012 are discussed.
City Reach Code Technical Support Document
Energy Technology Data Exchange (ETDEWEB)
Athalye, Rahul A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chen, Yan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Jian [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Bing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Frankel, Mark [New Buildings Inst., Portland, OR (United States); Lyles, Mark [New Buildings Inst., Portland, OR (United States)
2017-10-31
This report describes and analyzes a set of energy efficiency measures that will save 20% energy over ASHRAE Standard 90.1-2013. The measures will be used to formulate a Reach Code for cities aiming to go beyond national model energy codes. A coalition of U.S. cities together with other stakeholders wanted to facilitate the development of voluntary guidelines and standards that can be implemented in stages at the city level to improve building energy efficiency. The coalition's efforts are being supported by the U.S. Department of Energy via Pacific Northwest National Laboratory (PNNL) and in collaboration with the New Buildings Institute.
Patterns of arm muscle activation involved in octopus reaching movements.
Gutfreund, Y; Flash, T; Fiorito, G; Hochner, B
1998-08-01
The extreme flexibility of the octopus arm allows it to perform many different movements, yet octopuses reach toward a target in a stereotyped manner using a basic invariant motor structure: a bend traveling from the base of the arm toward the tip (Gutfreund et al., 1996a). To study the neuronal control of these movements, arm muscle activation [electromyogram (EMG)] was measured together with the kinematics of reaching movements. The traveling bend is associated with a propagating wave of muscle activation, with maximal muscle activation slightly preceding the traveling bend. Tonic activation was occasionally maintained afterward. Correlation of the EMG signals with the kinematic variables (velocities and accelerations) reveals that a significant part of the kinematic variability can be explained by the level of muscle activation. Furthermore, the EMG level measured during the initial stages of movement predicts the peak velocity attained toward the end of the reaching movement. These results suggest that feed-forward motor commands play an important role in the control of movement velocity and that simple adjustment of the excitation levels at the initial stages of the movement can set the velocity profile of the whole movement. A simple model of octopus arm extension is proposed in which the driving force is set initially and is then decreased in proportion to arm diameter at the bend. The model qualitatively reproduces the typical velocity profiles of octopus reaching movements, suggesting a simple control mechanism for bend propagation in the octopus arm.
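The proposed model, a driving force set at movement onset and then decreased in proportion to arm diameter at the bend, can be sketched numerically. The taper, damping, and force constants below are invented for illustration and are not fitted to the EMG or kinematic data:

```python
# Toy numerical sketch of the proposed octopus arm-extension model: the drive
# is scaled by the local arm diameter at the bend and opposed by viscous
# resistance. All parameters are illustrative, not fitted values.

def simulate_extension(f0=1.0, mass=1.0, damping=2.0, dt=0.01, steps=400):
    s, v = 0.0, 0.0          # normalized bend position (0..1) and velocity
    velocities = []
    for _ in range(steps):
        diameter = max(0.0, 1.0 - s)   # arm tapers linearly toward the tip
        force = f0 * diameter          # drive shrinks with local diameter
        v += (force - damping * v) / mass * dt   # Euler step of m*dv/dt
        s += v * dt
        velocities.append(v)
    return velocities

v = simulate_extension()
peak_index = v.index(max(v))
# the velocity rises to an interior peak and then decays, qualitatively
# matching the bell-shaped profiles reported for octopus reaching
```

The point of the sketch is only qualitative: setting the drive once and letting the taper reduce it reproduces a bell-shaped velocity profile without ongoing feedback, the feed-forward control idea the abstract proposes.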
Decision support using nonparametric statistics
Beatty, Warren
2018-01-01
This concise volume covers nonparametric statistics topics that are most likely to be seen and used from a practical decision support perspective. While many degree programs require a course in parametric statistics, these methods are often inadequate for real-world decision making in business environments. Much of the data collected today by business executives (for example, customer satisfaction opinions) requires nonparametric statistics for valid analysis, and this book provides the reader with a set of tools that can be used to validly analyze all data, regardless of type. Through numerous examples and exercises, this book explains why nonparametric statistics will lead to better decisions and how they are used to reach a decision, with a wide array of business applications. Online resources include exercise data, spreadsheets, and solutions.
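As a flavor of the book's subject matter (not an excerpt from it), the Mann-Whitney U statistic is one standard nonparametric tool for exactly the kind of ordinal business data mentioned above, such as customer-satisfaction ratings. A minimal pure-Python computation:

```python
# Minimal sketch of the Mann-Whitney U statistic for two independent
# samples; ties contribute 0.5. The satisfaction ratings are made-up data.

def mann_whitney_u(x, y):
    """Return the U statistic for sample x versus sample y."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5   # ties split between the two samples
    return u

# customer-satisfaction ratings on an ordinal 1-5 scale for two branches
branch_a = [4, 5, 3, 5, 4]
branch_b = [2, 3, 1, 2, 3]
print(mann_whitney_u(branch_a, branch_b))  # 24.0 out of a maximum of 25
```

A U near its maximum of len(x)*len(y) indicates one sample stochastically dominates the other, without assuming normality of the ratings.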
Can donated media placements reach intended audiences?
Cooper, Crystale Purvis; Gelb, Cynthia A; Chu, Jennifer; Polonec, Lindsey
2013-09-01
Donated media placements for public service announcements (PSAs) can be difficult to secure, and may not always reach intended audiences. Strategies used by the Centers for Disease Control and Prevention's (CDC) Screen for Life: National Colorectal Cancer Action Campaign (SFL) to obtain donated media placements include producing a diverse mix of high-quality PSAs, co-branding with state and tribal health agencies, securing celebrity involvement, monitoring media trends to identify new distribution opportunities, and strategically timing the release of PSAs. To investigate open-ended recall of PSAs promoting colorectal cancer screening, CDC conducted 12 focus groups in three U.S. cities with men and women either nearing age 50 years, when screening is recommended to begin, or aged 50-75 years who were not in compliance with screening guidelines. In most focus groups, multiple participants recalled exposure to PSAs promoting colorectal cancer screening, and most of these individuals reported having seen SFL PSAs on television, in transit stations, or on the sides of public buses. Some participants reported exposure to SFL PSAs without prompting from the moderator, as they explained how they learned about the disease. Several participants reported learning key campaign messages from PSAs, including that colorectal cancer screening should begin at age 50 years and screening can find polyps so they can be removed before becoming cancerous. Donated media placements can reach and educate mass audiences, including millions of U.S. adults who have not been screened appropriately for colorectal cancer.
Thomas, Brandon J; Hawkins, Matthew M; Nalepka, Patrick
2017-03-30
Runeson (Scandinavian Journal of Psychology 18:172-179, 1977) suggested that the polar planimeter might serve as an informative model system of perceptual mechanism. The key aspect of the polar planimeter is that it registers a higher order property of the environment without computational mediation on the basis of lower order properties, detecting task-specific information only. This aspect was posited as a hypothesis for the perception of jumping and reaching affordances for the self and another person. The findings supported this hypothesis. The perception of reaching while jumping significantly differed from an additive combination of jump-without-reaching and reach-without-jumping perception. The results are consistent with Gibson's theory of information (The Senses Considered as Perceptual Systems, Houghton Mifflin, Boston, MA, 1966; The Ecological Approach to Visual Perception, Houghton Mifflin, Boston, MA, 1979): that aspects of the environment are specified by patterns in energetic media.
Riparian Vegetation Mapping Along the Hanford Reach
Energy Technology Data Exchange (ETDEWEB)
FOGWELL, T.W.
2003-07-11
During the biological survey and inventory of the Hanford Site conducted in the mid-1990s (1995 and 1996), preliminary surveys of the riparian vegetation were conducted along the Hanford Reach. These preliminary data were reported to The Nature Conservancy (TNC), but were not included in any TNC reports to DOE or stakeholders. During the latter part of FY2001, PNNL contracted with SEE Botanical, the parties that performed the original surveys in the mid-1990s, to complete the data summaries and mapping associated with the earlier survey data. Those data sets were delivered to PNNL and the riparian mapping by vegetation type for the Hanford Reach is being digitized during the first quarter of FY2002. These mapping efforts provide the information necessary to create subsequent spatial data layers to describe the riparian zone according to plant functional types (trees, shrubs, grasses, sedges, forbs). Quantification of the riparian zone by vegetation types is important to a number of DOE's priority issues, including modeling contaminant transport and uptake in the near-riverine environment and the determination of ecological risk. This work included the identification of vegetative zones along the Reach by changes in dominant plant species covering the shoreline from just north of the 300 Area to China Bar near Vernita. Dominant and indicator species included Agropyron dasystachyum/A. smithii, Apocynum cannabinum, Aristida longiseta, Artemisia campestris ssp. borealis var. scouleriana, Artemisia dracunculus, Artemisia lindleyana, Artemisia tridentata, Bromus tectorum, Chrysothamnus nauseosus, Coreopsis atkinsoniana, Eleocharis palustris, Elymus cinereus, Equisetum hyemale, Eriogonum compositum, Juniperus trichocarpa, Phalaris arundinacea, Poa compressa, Salix exigua, Scirpus acutus, Solidago occidentalis, Sporobolus asper, and Sporobolus cryptandrus. This letter report documents the data received, the processing by PNNL staff, and additional data gathered in FY2002.
Reach and get capability in a computing environment
Bouchard, Ann M [Albuquerque, NM]; Osbourn, Gordon C [Albuquerque, NM]
2012-06-05
A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
Unified communication to reach vulnerable mothers.
Tezcan, B; Von Rege, I; Henkson, H; Oteng-Ntim, E
2011-01-01
The feasibility of using mobile text messaging to reach vulnerable patient groups was assessed in this study. A total of 121 pregnant or postnatal women were randomly asked to complete a questionnaire. The questionnaire was given to them in the antenatal clinic, postnatal ward, antenatal ward, or in the day assessment unit at St Thomas' Hospital, London. The forms were collected and analysed using an Excel database. The results of this survey show that mobile technology is readily available to 97% of the obstetric population. Among mothers from vulnerable groups and mothers from deprived areas, 61% possessed 3rd-generation mobile technology. The majority of mothers surveyed wanted their care supplemented by the use of their mobile phones.
Validity of an Interactive Functional Reach Test.
Galen, Sujay S; Pardo, Vicky; Wyatt, Douglas; Diamond, Andrew; Brodith, Victor; Pavlov, Alex
2015-08-01
Videogaming platforms such as the Microsoft (Redmond, WA) Kinect(®) are increasingly being used in rehabilitation to improve balance performance and mobility. These gaming platforms do not have built-in clinical measures that offer clinically meaningful data. We have now developed software that enables the Kinect sensor to assess a patient's balance using an interactive functional reach test (I-FRT). The aim of the study was to test the concurrent validity of the I-FRT and to establish the feasibility of implementing the I-FRT in a clinical setting. The concurrent validity of the I-FRT was tested among 20 healthy adults (mean age, 25.8±3.4 years; 14 women). The Functional Reach Test (FRT) was measured simultaneously by both the Kinect sensor using the I-FRT software and the Optotrak Certus(®) 3D motion-capture system (Northern Digital Inc., Waterloo, ON, Canada). The feasibility of implementing the I-FRT in a clinical setting was assessed by performing the I-FRT in 10 participants with mild balance impairments recruited from the outpatient physical therapy clinic (mean age, 55.8±13.5 years; four women) and obtaining their feedback using a NASA Task Load Index (NASA-TLX) questionnaire. There was moderate to good agreement between FRT measures made by the two measurement systems. The greatest agreement between the two systems was found with the Kinect sensor placed at a distance of 2.5 m [intraclass correlation coefficient (2,k) = 0.786]. FRT measures made using the Kinect sensor I-FRT software provide a valid clinical measure that can be used with these gaming platforms.
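The agreement statistic reported above, ICC(2,k), can be computed from a subjects-by-systems score table via a two-way ANOVA decomposition. Below is a minimal sketch; the data layout and values are hypothetical, not the study's measurements:

```python
def icc_2k(data):
    """ICC(2,k): two-way random effects, absolute agreement, average of k measures.

    `data` is a list of rows, one per subject, each holding k scores
    (e.g. one reach distance per measurement system). Formula:
        ICC(2,k) = (MSR - MSE) / (MSR + (MSC - MSE)/n)
    where MSR, MSC, MSE are the row (subject), column (system), and
    residual mean squares from the two-way decomposition.
    """
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (msc - mse) / n)
```

For the study's setup, each row would hold one participant's FRT distance from the Kinect I-FRT software and from the Optotrak system; perfect agreement between the two columns yields an ICC of 1.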
Do older adults perceive postural constraints for reach estimation?
Cordova, Alberto; Gabbard, Carl
2014-01-01
BACKGROUND/STUDY CONTEXT: Recent evidence indicates that older persons have difficulty mentally representing intended movements. Furthermore, in an estimation-of-reach paradigm using motor imagery, a form of mental representation, older persons significantly overestimated their ability compared with young adults. The authors tested the notion that older adults may also have difficulty perceiving the postural constraints associated with reach estimation. The authors compared young (Mage = 22 years) and older (Mage = 67 years) adults on reach estimation while seated and in a more posturally demanding standing-and-leaning-forward position. The expectation was a significant postural effect in the standing condition, as evidenced by reduced overestimation. Whereas there was no difference between groups in the seated condition (both overestimated), older adults underestimated, whereas the younger group once again overestimated, in the standing condition. From one perspective, these results show that older adults do perceive postural constraints in light of their own physical capabilities. That is, that group perceived greater postural demands with the standing posture and elected to program a more conservative strategy, resulting in underestimation.
Worry, Intolerance of Uncertainty, and Statistics Anxiety
Williams, Amanda S.
2013-01-01
Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…
Key Design Requirements for Long-Reach Manipulators
Energy Technology Data Exchange (ETDEWEB)
Kwon, D.S.
2001-01-01
Long-reach manipulators differ from industrial robots and teleoperators typically used in the nuclear industry in that the aspect ratio (length to diameter) of links is much greater and link flexibility, as well as joint or drive train flexibility, is likely to be significant. Long-reach manipulators will be required for a variety of applications in the Environmental Restoration and Waste Management Program. While each application will present specific functional, kinematic, and performance requirements, an approach for determining the kinematic applicability and performance characteristics is presented, with a focus on waste storage tank remediation. Requirements are identified, kinematic configurations are considered, and a parametric study of link design parameters and their effects on performance characteristics is presented.
Key design requirements for long-reach manipulators
International Nuclear Information System (INIS)
Kwon, D.S.; March-Leuba, S.; Babcock, S.M.; Hamel, W.R.
1993-09-01
Long-reach manipulators differ from industrial robots and teleoperators typically used in the nuclear industry in that the aspect ratio (length to diameter) of links is much greater and link flexibility, as well as joint or drive train flexibility, is likely to be significant. Long-reach manipulators will be required for a variety of applications in the Environmental Restoration and Waste Management Program. While each application will present specific functional, kinematic, and performance requirements, an approach for determining the kinematic applicability and performance characteristics is presented, with a focus on waste storage tank remediation. Requirements are identified, kinematic configurations are considered, and a parametric study of link design parameters and their effects on performance characteristics is presented.
Key Design Requirements for Long-Reach Manipulators
International Nuclear Information System (INIS)
Kwon, D.S.
2001-01-01
Long-reach manipulators differ from industrial robots and teleoperators typically used in the nuclear industry in that the aspect ratio (length to diameter) of links is much greater and link flexibility, as well as joint or drive train flexibility, is likely to be significant. Long-reach manipulators will be required for a variety of applications in the Environmental Restoration and Waste Management Program. While each application will present specific functional, kinematic, and performance requirements, an approach for determining the kinematic applicability and performance characteristics is presented, with a focus on waste storage tank remediation. Requirements are identified, kinematic configurations are considered, and a parametric study of link design parameters and their effects on performance characteristics is presented.
Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...
Statistics for experimentalists
Cooper, B E
2014-01-01
Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and search approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...
... Standards Act and Program, MQSA Insights, MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Significance evaluation in factor graphs
DEFF Research Database (Denmark)
Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet
2017-01-01
With the growing scale of analyses in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. We here address the problem of evaluating the statistical significance of observations from factor graph models. Results: Two novel numerical approximations for the evaluation of statistical significance are presented: first, a method using importance sampling; second, a method based on a saddlepoint approximation. We develop algorithms to efficiently compute the approximations and compare them to naive sampling and the normal approximation. The individual merits of the methods are analysed. Conclusions: The applicability of the saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially improve computational cost without compromising accuracy. This contribution allows analyses of large datasets.
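The importance-sampling idea in this abstract can be illustrated in miniature. The sketch below is not the paper's algorithm; it is a generic tail-probability estimator for a standard normal test statistic, which shifts the sampling distribution toward the tail and reweights each hit by the likelihood ratio, so that rare significant events are sampled often enough to estimate accurately:

```python
import math
import random

def tail_prob_importance(threshold, n=100_000, shift=None, seed=0):
    """Estimate P(X > threshold) for X ~ N(0,1) by importance sampling.

    Samples are drawn from a proposal N(mu, 1) centred at the threshold,
    then reweighted by the likelihood ratio phi(x)/q(x). This keeps the
    estimator's variance manageable far in the tail, where naive Monte
    Carlo almost never produces a single exceedance.
    """
    rng = random.Random(seed)
    mu = threshold if shift is None else shift
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu, 1.0)  # draw from the shifted proposal N(mu, 1)
        if x > threshold:
            # likelihood ratio phi(x)/q(x) = exp(-mu*x + mu^2/2)
            total += math.exp(-mu * x + 0.5 * mu * mu)
    return total / n
```

For a threshold of 4, naive sampling would need on the order of tens of millions of draws to see even a handful of exceedances; the shifted proposal makes roughly half the draws informative.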
ESO telbib: Linking In and Reaching Out
Grothkopf, U.; Meakins, S.
2015-04-01
Measuring an observatory's research output is an integral part of its science operations. Like many other observatories, ESO tracks scholarly papers that use observational data from ESO facilities and uses state-of-the-art tools to create, maintain, and further develop the Telescope Bibliography database (telbib). While telbib started out as a stand-alone tool mostly used to compile lists of papers, it has by now developed into a multi-faceted, interlinked system. The core of the telbib database is links between scientific papers and observational data generated by the La Silla Paranal Observatory residing in the ESO archive. This functionality has also been deployed for ALMA data. In addition, telbib reaches out to several other systems, including ESO press releases, the NASA ADS Abstract Service, databases at the CDS Strasbourg, and impact scores at Altmetric.com. We illustrate these features to show how the interconnected telbib system enhances the content of the database as well as the user experience.
Using New Media to Reach Broad Audiences
Gay, P. L.
2008-06-01
The International Year of Astronomy New Media Working Group (IYA NMWG) has a singular mission: To flood the Internet with ways to learn about astronomy, interact with astronomers and astronomy content, and socially network with astronomy. Within each of these areas, we seek to build lasting programs and partnerships that will continue beyond 2009. Our weapon of choice is New Media. It is often easiest to define New Media by what it is not. Television, radio, print and their online redistribution of content are not New Media. Many forms of New Media start as user provided content and content infrastructures that answer that individual's creative whim in a way that is adopted by a broader audience. Classic examples include Blogs and Podcasts. This media is typically distributed through content specific websites and RSS feeds, which allow syndication. RSS aggregators (iTunes has audio and video aggregation abilities) allow subscribers to have content delivered to their computers automatically when they connect to the Internet. RSS technology is also being used in such creative ways as allowing automatically updating Google-maps that show the location of someone with an intelligent GPS system, and in sharing 100 word microblogs from anyone (Twitters) through a single feed. In this poster, we outline how the IYA NMWG plans to use New Media to reach target primary audiences of astronomy enthusiasts, image lovers, and amateur astronomers, as well as secondary audiences, including: science fiction fans, online gamers, and skeptics.
Media perspective - new opportunities for reaching audiences
Haswell, Katy
2007-08-01
The world of media is experiencing a period of extreme and rapid change with the rise of internet television and the download generation. Many young people no longer watch standard TV. Instead, they go on-line, talking to friends and downloading pictures, videos, music clips to put on their own websites and watch/ listen to on their laptops and mobile phones. Gone are the days when TV controllers determined what you watched and when you watched it. Now the buzzword is IPTV, Internet Protocol Television, with companies such as JOOST offering hundreds of channels on a wide range of subjects, all of which you can choose to watch when and where you wish, on your high-def widescreen with stereo surround sound at home or on your mobile phone on the train. This media revolution is changing the way organisations get their message out. And it is encouraging companies such as advertising agencies to be creative about new ways of accessing audiences. The good news is that we have fresh opportunities to reach young people through internet-based media and material downloaded through tools such as games machines, as well as through the traditional media. And it is important for Europlanet to make the most of these new and exciting developments.
Has Athletic Performance Reached its Peak?
Berthelot, Geoffroy; Sedeaud, Adrien; Marck, Adrien; Antero-Jacquemin, Juliana; Schipman, Julien; Saulière, Guillaume; Marc, Andy; Desgorces, François-Denis; Toussaint, Jean-François
2015-09-01
Limits to athletic performance have long been a topic of myth and debate. However, sport performance appears to have reached a state of stagnation in recent years, suggesting that the physical capabilities of humans and other athletic species, such as greyhounds and thoroughbreds, cannot progress indefinitely. Although the ultimate capabilities may be predictable, the exact path for the absolute maximal performance values remains difficult to assess and relies on technical innovations, sport regulation, and other parameters that depend on current societal and economic conditions. The aim of this literature review was to assess the possible plateau of top physical capabilities in various events and detail the historical backgrounds and sociocultural, anthropometrical, and physiological factors influencing the progress and regression of athletic performance. Time series of performances in Olympic disciplines, such as track and field and swimming events, from 1896 to 2012 reveal a major decrease in performance development. Such a saturation effect is simultaneous in greyhound, thoroughbred, and frog performances. The genetic condition, exhaustion of phenotypic pools, economic context, and the depletion of optimal morphological traits contribute to the observed limitation of physical capabilities. Present conditions prevailing, we approach absolute physical limits and endure a continued period of world record scarcity. Optional scenarios for further improvements will mostly depend on sport technology and modification competition rules.
LEP Dismantling Reaches Half-Way Stage
2001-01-01
Just seven months into the operation, LEP dismantling is forging ahead. Two of the eight arcs which form the tunnel have already been emptied and the last of the accelerator's radiofrequency (RF) cavities has just been raised to the surface. The 160 people working on LEP dismantling have reason to feel pleased with their progress. All of the accelerator's 72 superconducting RF modules have already been brought to the surface, with the last one being extracted on 2nd May. This represents an important step in the dismantling process, as head of the project, John Poole, explains. 'This was the most delicate part of the project, because the modules are very big and they could only come out at one place', he says. The shaft at point 1.8 through which the RF cavity modules pass is 18 metres in diameter, while each module is 11.5 metres long. Some modules had to travel more than 10 kilometres to reach the shaft.
CAST reaches milestone but keeps on searching
CERN Courier (September 2011 issue)
2011-01-01
After eight years of searching for the emission of a dark matter candidate particle, the axion, from the Sun, the CERN Axion Solar Telescope (CAST) has fulfilled its original physics programme. CAST, the world's most sensitive axion helioscope, points a recycled prototype LHC dipole magnet at the Sun at dawn and dusk, looking for the conversion of axions to X-rays. It incorporates four state-of-the-art X-ray detectors: three Micromegas detectors and a pn-CCD imaging camera attached to a focusing X-ray telescope that was recovered from the German space programme (see CERN Courier April 2010). Over the years, CAST has operated with the magnet bores - the location of the axion conversion - in different conditions: first in vacuum, covering axion masses up to 20 meV/c2, and then with a buffer gas (4He and later 3He) at various densities, finally reaching the goal of 1.17 eV/c2 on 22 ...
Important ATLAS Forward Calorimeter Milestone Reached
Loch, P.
The ATLAS Forward Calorimeter working group has reached an important milestone in the production of their detectors. The mechanical assembly of the first electromagnetic module (FCal1C) has been completed at the University of Arizona on February 25, 2002, only ten days after the originally scheduled date. The photo shows the University of Arizona FCal group in the clean room, together with the assembled FCal1C module. The module consists of a stack of 18 round copper plates, each about one inch thick. Each plate is about 90 cm in diameter, and has 12260 precision-drilled holes in it, to accommodate the tube/rod electrode assembly. The machining of the plates, which was done at the Science Technology Center (STC) at Carleton University, Ottawa, Canada, required high precision to allow for easy insertion of the electrode copper tube. The plates have been carefully cleaned at the University of Arizona, to remove any machining residue and metal flakes. This process alone took about eleven weeks. Exactly 122...
International reach of tobacco marketing among young children.
Borzekowski, Dina L G; Cohen, Joanna E
2013-10-01
Prosmoking messages, delivered through marketing and the media, can reach very young children and influence attitudes and behaviors around smoking. This study examined the reach of tobacco marketing to 5- and 6-year-olds in 6 low- and middle-income countries. Researchers worked one-on-one with 5- and 6-year-olds in Brazil, China, India, Nigeria, Pakistan, and Russia (N = 2423). The children were asked to match logos with pictures of products, including 8 logos for cigarette brands. Analyses examined, overall and by country, whether gender, age, location, household use of tobacco, and knowledge of media characters were associated with awareness of cigarette brand logos. Additional analyses considered the relationship between cigarette brand logo awareness and intentions to smoke. Overall, 68% of 5- and 6-year-olds could identify at least 1 cigarette brand logo, ranging from 50% in Russia to 86% in China. Across countries, being slightly older and having someone in the household who used tobacco were significantly associated with greater odds of being able to identify at least 1 cigarette brand logo. The majority of young children from low- and middle-income countries are familiar with cigarette brands. This study's findings suggest that more effective measures are needed to restrict the reach of tobacco marketing.
Planning of the Extended Reach well Dieksand 2; Planung der Extended Reach Bohrung Dieksand 2
Energy Technology Data Exchange (ETDEWEB)
Frank, U.; Berners, H. [RWE-DEA AG, Hamburg (Germany). Drilling Team Mittelplate und Dieksand; Hadow, A.; Klop, G.; Sickinger, W. [Wintershall AG Erdoelwerke, Barnstdorf (Germany); Sudron, K.
1998-12-31
The Mittelplate oil field lies at the southern edge of the Schleswig-Holstein Wadden Sea National Park, about 7 km west of the town of Friedrichskoog. Recoverable reserves are estimated at about 30 million tonnes of oil; at a production rate of 2,500 t/day, the field will produce for roughly 33 years. Because of the limited transport capacity from the artificial Mittelplate island, additional wells drilled from the island cannot significantly increase production capacity. Starting in summer 1996, the possibility of developing the reservoir from onshore was therefore investigated for the first time. A drilling team established in Hamburg in May 1997 was tasked with planning and drilling the extended reach well Dieksand 2. The planning phases for the Dieksand 2 well are described, the planning parameters critical to the success of an extended reach well are explained, and ways are shown in which technical and geological risks were taken into account during planning and continued to be managed after drilling began. (orig., translated from German)
Access to expert stroke care with telemedicine: REACH MUSC
Directory of Open Access Journals (Sweden)
Abby Swanson Kazley
2012-03-01
Stroke is a leading cause of death and disability, and rtPA can significantly reduce the long-term impact of acute ischemic stroke (AIS) if given within 3 hours of symptom onset. South Carolina is located in the stroke belt and has a high rate of stroke and stroke mortality. Many small rural SC hospitals do not maintain the expertise needed to treat AIS patients with rtPA. MUSC is an academic medical center using REACH MUSC telemedicine to deliver stroke care to 15 hospitals in the state, increasing the likelihood of timely treatment with rtPA. The purpose of this study is to determine the increase in access to rtPA through the use of telemedicine for AIS in the general population and in specific segments of the population based on age, gender, race, ethnicity, education, urban/rural residence, poverty, and stroke mortality. We used a retrospective cross-sectional design examining Census data from 2000 and Geographic Information Systems (GIS) analysis to identify South Carolina residents that live within 30 or 60 minutes of a Primary Stroke Center (PSC) or a REACH MUSC site. We include all South Carolina citizens in our analysis and specifically examine the population's age, gender, race, ethnicity, education, urban/rural residence, poverty, and stroke mortality. Our sample includes 4,012,012 South Carolinians. The main measure is access to expert stroke care at a Primary Stroke Center (PSC) or a REACH MUSC hospital within 30 or 60 minutes. We find that without REACH MUSC, only 38% of the population has potential access to expert stroke care in SC within sixty minutes, given that most PSCs will maintain expert stroke coverage. REACH MUSC allows 76% of the population to be within sixty minutes of expert stroke care, and 43% of the population to be within a 30-minute drive time of expert stroke care. These increases in access are especially significant for groups that have faced disparities in care and high rates of AIS. The use of telemedicine can
Can coronal hole spicules reach coronal temperatures?
Madjarska, M. S.; Vanninathan, K.; Doyle, J. G.
2011-08-01
Aims: The present study aims to provide observational evidence of whether coronal hole spicules reach coronal temperatures. Methods: We combine multi-instrument co-observations obtained with SUMER/SoHO and with EIS/SOT/XRT/Hinode. Results: The three large spicules analysed were found to be comprised of numerous thin spicules that rise, rotate, and descend simultaneously, forming a bush-like feature. Their rotation resembles the untwisting of a large flux rope. They show velocities ranging from 50 to 250 km s-1. We clearly associated the red- and blue-shifted emissions in transition region lines not only with rotating but also with rising and descending plasmas. Our main result is that these spicules, although very large and dynamic, are not present in the spectral lines formed at temperatures above 300 000 K. Conclusions: In this paper we present the analysis of three Ca ii H large spicules that are composed of numerous dynamic thin spicules but appear as macrospicules in lower resolution EUV images. We found no coronal counterpart of these and smaller spicules. We believe that the identification of phenomena that have very different origins as macrospicules is due to the interpretation of the transition region emission, and especially the He ii emission, wherein both chromospheric large spicules and coronal X-ray jets are present. We suggest that the recent observation of spicules in the coronal AIA/SDO 171 Å and 211 Å channels probably comes from the existence of transition region emission there. Movie is available in electronic form at http://www.aanda.org
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore it satisfies the requirements of the equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations, as those stemming from the Boltzmann-Gibbs statistics in this limit.
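The equivalence claimed in this abstract can be checked numerically for a simple discrete distribution: the Rényi entropy S_q = ln(Σ_i p_i^q)/(1 − q) tends to the Boltzmann-Gibbs (Shannon) entropy as q → 1. The sketch below uses the generic discrete formulas, not the paper's canonical-ensemble derivation for the ideal gas:

```python
import math

def renyi_entropy(p, q):
    """Rényi entropy S_q = ln(sum_i p_i^q) / (1 - q), defined for q != 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def shannon_entropy(p):
    """Boltzmann-Gibbs (Shannon) entropy S = -sum_i p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

Two sanity checks: taking q close to 1 recovers the Shannon value, and for a uniform distribution over n states the Rényi entropy equals ln n for every q, just as in the Boltzmann-Gibbs case.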
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58. General Article by Mohan Delampady and V R Padmawar.
Statistics Using Just One Formula
Rosenthal, Jeffrey S.
2018-01-01
This article advocates that introductory statistics be taught by basing all calculations on a single simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc) from that one formula. It is argued that this approach will…
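The abstract does not reproduce Rosenthal's single formula, but a common candidate for such a unifying formula is the familiar margin of error z·s/√n, from which confidence intervals and significance tests both follow. A hedged sketch of that idea (the choice of formula is an assumption, not taken from the article):

```python
import math

def margin_of_error(sd, n, z=1.96):
    """The one formula: MoE = z * sd / sqrt(n) (default z = 1.96 for ~95%)."""
    return z * sd / math.sqrt(n)

def confidence_interval(mean, sd, n, z=1.96):
    """A confidence interval derived from the one formula: mean +/- MoE."""
    m = margin_of_error(sd, n, z)
    return mean - m, mean + m

def is_significant(mean, null_value, sd, n, z=1.96):
    """A two-sided test at the matching level: reject if |mean - null| > MoE."""
    return abs(mean - null_value) > margin_of_error(sd, n, z)
```

The pedagogical point is the duality: a null value is rejected at level α exactly when it falls outside the corresponding (1 − α) confidence interval, so both concepts reduce to the same margin-of-error computation.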
Statistics Anxiety among Postgraduate Students
Koh, Denise; Zawi, Mohd Khairi
2014-01-01
Most postgraduate programmes, that have research components, require students to take at least one course of research statistics. Not all postgraduate programmes are science based, there are a significant number of postgraduate students who are from the social sciences that will be taking statistics courses, as they try to complete their…
A COMPARISON OF THE SIT-AND-REACH TEST AND THE BACK-SAVER SIT-AND-REACH TEST IN UNIVERSITY STUDENTS
Directory of Open Access Journals (Sweden)
Pedro A. López-Miñarro
2009-03-01
This study compares the forward reach score, spine and pelvis postures, and hamstring criterion-related validity (concurrent validity) between the sit-and-reach test (SR) and the back-saver sit-and-reach test (BS). Seventy-six men (mean age ± SD: 23.45 ± 3.96 years) and 67 women (mean age ± SD: 23.85 ± 5.36 years) were asked to perform three trials of SR, BS left (BSl) and right (BSr), and passive straight leg raise (PSLR) right and left (hamstring criterion measure) in a randomized order. The thoracic, lumbar, and pelvis angles (measured with a Uni-level inclinometer) and forward reach scores were recorded once the subjects reached forward as far as possible without flexing the knees. A repeated measures ANOVA was performed, followed by Bonferroni's post hoc test. Pearson correlation coefficients were used to define the relationships between SR and BS scores with respect to PSLR. In both men and women, the thoracic angle in BS was significantly greater than in SR (p<0.016). However, no significant differences were found between the tests in lumbar angle, pelvic angle, and forward reach scores. The concurrent validity of the forward reach score as a measure of hamstring extensibility was moderate in women (0.66-0.76) and weak to moderate in men (0.51-0.59). The concurrent validity was slightly higher in SR than in BS, although no significant differences between the correlation values were observed. There were significant differences in the thoracic angle between the SR and BS, but not in the forward reach score. There was no difference in concurrent validity between the two tests. However, the traditional SR was preferred because it reached better concurrent validity than the BS.
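The concurrent-validity coefficients in this abstract are Pearson correlations between each test's forward reach score and the PSLR criterion. A minimal sketch of that computation (the paired values in the test below are hypothetical illustrations, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between paired scores.

    Here x could be forward reach scores (SR or BS) and y the PSLR
    criterion measures for the same subjects.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)
```

Coefficients near ±1 indicate strong linear agreement with the criterion; the study's values of 0.51-0.76 correspond to the "weak to moderate" validity it reports.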
Anderson, Joe; Bingham, Geoffrey P
2010-09-01
We provide a solution to a major problem in visually guided reaching. Research has shown that binocular vision plays an important role in the online visual guidance of reaching, but the visual information and strategy used to guide a reach remains unknown. We propose a new theory of visual guidance of reaching including a new information variable, tau(alpha) (relative disparity tau), and a novel control strategy that allows actors to guide their reach trajectories visually by maintaining a constant proportion between tau(alpha) and its rate of change. The dynamical model couples the information to the reaching movement to generate trajectories characteristic of human reaching. We tested the theory in two experiments in which participants reached under conditions of darkness to guide a visible point either on a sliding apparatus or on their finger to a point-light target in depth. Slider apparatus controlled for a simple mapping from visual to proprioceptive space. When reaching with their finger, participants were forced, by perturbation of visual information used for feedforward control, to use online control with only binocular disparity-based information for guidance. Statistical analyses of trajectories strongly supported the theory. Simulations of the model were compared statistically to actual reaching trajectories. The results supported the theory, showing that tau(alpha) provides a source of information for the control of visually guided reaching and that participants use this information in a proportional rate control strategy.
Whisker and Nose Tactile Sense Guide Rat Behavior in a Skilled Reaching Task
Directory of Open Access Journals (Sweden)
Pierantonio Parmiani
2018-02-01
Full Text Available Skilled reaching is a complex movement in which a forelimb is extended to grasp food for eating. Video-recording analysis of control rats enables us to distinguish several components of skilled reaching: Orient, approaching the front wall of the reaching box and poking the nose into the slot to locate the food pellet; Transport, advancing the forelimb through the slot to reach and grasp the pellet; and Withdrawal of the grasped food to eat. Although food location and skilled reaching are guided by olfaction, the importance of the whisker/nose tactile sense in rats suggests that this too could play a role in reaching behavior. To test this hypothesis, we studied skilled reaching in rats trained in a single-pellet reaching task before and after bilateral whisker trimming and bilateral infraorbital nerve (ION) severing. During the task, bilaterally trimmed rats showed impaired Orient with respect to controls. Specifically, they detected the presence of the wall by hitting it with their nose (rather than their whiskers), and then located the slot through repetitive nose touches. The number of nose touches preceding poking was significantly higher in comparison to controls. On the other hand, macrovibrissae trimming resulted in no change in the reaching/grasping or withdrawal components of skilled reaching. Bilaterally ION-severed rats displayed a marked change in the structure of their skilled reaching. With respect to controls, in ION-severed rats: (a) approaches to the front wall were significantly reduced at 3–5 and 6–8 days; (b) nose pokes were significantly reduced at 3–5 days, and the slot was only located after many repetitive nose touches; (c) the reaching-grasping-retracting movement never appeared at 3–5 days; (d) explorative paw movements, equal to zero in controls, reached significance at 9–11 days; and (e) the restored reaching-grasping-retracting sequence was globally slower than in controls, but the success rate was the same. These findings
Reaching remote areas in Latin America.
Jaimes, R
1994-01-01
Poor communities in remote and inaccessible areas tend to not only be cut off from family planning education and services, but they are also deprived of basic primary health care services. Efforts to bring family planning to such communities and populations should therefore be linked with other services. The author presents three examples of programs to bring effective family planning services to remote communities in Central and South America. Outside of the municipal center in the Tuxtlas region of Mexico, education and health levels are low and people live according to ancient customs. Ten years ago with the help of MEXFAM, the IPPF affiliate in Mexico, two social promoters established themselves in the town of Catemaco to develop a community program of family planning and health care offering education and prevention to improve the quality of people's lives. Through their health brigades taking health services to towns without an established health center, the program has influenced an estimated 100,000 people in 50 villages and towns. The program also has a clinic. In Guatemala, the Family Welfare Association (APROFAM) gave bicycles to 240 volunteer health care workers to facilitate their outreach work in rural areas. APROFAM since 1988 has operated an integrated program to treat intestinal parasites and promote family planning in San Lucas de Toliman, an Indian town close to Lake Atitlan. Providing health care to more than 10,000 people, the volunteer staff has covered the entire department of Solola, reaching each family in the area. Field educators travel on motorcycles through the rural areas of Guatemala coordinating with the health volunteers the distribution of contraceptives at the community level. The Integrated Project's Clinic was founded in 1992 and currently carries out pregnancy and Pap tests, as well as general lab tests. Finally, Puna is an island in the middle of the Gulf of Guayaquil, Ecuador. Women on the island typically have 10
International Nuclear Information System (INIS)
Venkataraman, G.
1992-01-01
Treating the radiation gas as a classical gas, Einstein derived Planck's law of radiation by considering the dynamic equilibrium between atoms and radiation. Dissatisfied with this treatment, S.N. Bose derived Planck's law in another, original way. He treated the problem in full generality: he counted how many cells were available for the photon gas in phase space and distributed the photons into these cells. In this manner of distribution there were three radically new ideas: the indistinguishability of particles, the spin of the photon (with only two possible orientations), and the non-conservation of photon number. This gave rise to the new discipline of quantum statistical mechanics. The physics underlying Bose's discovery, its significance, and its role in the development of the concept of the ideal gas, the spin-statistics theorem, and spin particles are described. The book has been written in simple, direct language in an informal style, aiming to stimulate the curiosity of the reader. (M.G.B.)
Phenomena and characteristics of barrier river reaches in the middle and lower Yangtze River, China
You, Xingying; Tang, Jinwu
2017-06-01
Alluvial river self-adjustment describes the mechanism whereby a river that was originally in an equilibrium state of sediment transport encounters some disturbance that destroys the balance and results in responses such as riverbed deformation. A systematic study of historical and recent aerial photographs and topographic maps in the Middle and Lower Reaches of the Yangtze River (MLYR) shows that river self-adjustment has the distinguishing feature of transferring from upstream to downstream, which may affect flood safety, waterway morphology, bank stability, and aquatic environmental safety over relatively long reaches downstream. As a result, it is necessary to take measures to control or block this transfer. Using the relationship of the occurrence time of channel adjustments between the upstream and downstream, 34 single-thread river reaches in the MLYR were classified into four types: corresponding, basically corresponding, basically not corresponding, and not corresponding. The latter two types, because of their ability to prevent upstream channel adjustment from transferring downstream, are called barrier river reaches in this study. Statistics indicate that barrier river reaches are generally single thread and slightly curved, with a narrow and deep cross-sectional morphology, and without flow-deflecting nodes in the upper and middle parts of the reaches. Moreover, in the MLYR, barrier river reaches have a hydrogeometric coefficient of 1.2‱, a silty clay content of the concave bank >9.5%, and a median diameter of the bed sediment >0.158 mm. The mechanism of barrier river reaches lies in their ability to effectively centralise the planimetric position of the main stream arriving from different upstream directions, meaning that no matter how the upper channel adjusts, the main stream shows little change, providing relatively stable inflow conditions for the lower reaches. Regarding river regulation, it is necessary to optimise the benefits of barrier river reaches; long river
Functional reach and lateral reach tests adapted for aquatic physical therapy
Directory of Open Access Journals (Sweden)
Ana Angélica Ribeiro de Lima
Full Text Available Abstract Introduction: Functional reach (FR) and lateral reach (LR) tests are widely used in scientific research and clinical practice. Assessment tools are useful in assessing subjects with greater accuracy and are usually adapted according to the limitations of each condition. Objective: To adapt the FR and LR tests for use in an aquatic environment and assess the performance of healthy young adults. Methods: We collected anthropometric data and information on whether the participant exercised regularly or not. The FR and LR tests were adapted for use in an aquatic environment and administered to 47 healthy subjects aged 20-30 years. Each test was repeated three times. Results: Forty-one females and six males were assessed. The mean FR test score for men was 24.06 cm, whereas the mean value for right lateral reach (RLR) was 10.94 cm and for left lateral reach (LLR) was 9.78 cm. For females, the mean FR score was 17.57 cm, while the mean values for RLR and LLR were 8.84 cm and 7.76 cm, respectively. Men performed better than women in the FR (p < 0.001) and RLR (p = 0.037) tests. Individuals who exercised regularly showed no differences in performance level when compared with their counterparts. Conclusion: The FR and LR tests were adapted for use in an aquatic environment. Males performed better on the FR and RLR tests when compared to females. There was no correlation between the FR and LR tests and weight, height, Body Mass Index (BMI), foot length, or length of the dominant upper limb.
Differential Recruitment of Parietal Cortex during Spatial and Non-spatial Reach Planning
Directory of Open Access Journals (Sweden)
Pierre-Michel Bernier
2017-05-01
Full Text Available The planning of goal-directed arm reaching movements is associated with activity in the dorsal parieto-frontal cortex, within which multiple regions subserve the integration of arm- and target-related sensory signals to encode a motor goal. Surprisingly, many of these regions show sustained activity during reach preparation even when target location is not specified, i.e., when a motor goal cannot be unambiguously formed. The functional role of these non-spatial preparatory signals remains unresolved. Here this process was investigated in humans by comparing reach preparatory activity in the presence or absence of information regarding upcoming target location. In order to isolate the processes specific to reaching and to control for visuospatial attentional factors, the reaching task was contrasted with a finger movement task. Functional MRI and electroencephalography (EEG) were used to characterize the spatio-temporal pattern of reach-related activity in the parieto-frontal cortex. Reach planning with advance knowledge of target location induced robust blood-oxygen-level-dependent (BOLD) and EEG responses across parietal and premotor regions contralateral to the reaching arm. In contrast, reach preparation without knowledge of target location was associated with a significant BOLD response bilaterally in the parietal cortex. Furthermore, EEG alpha- and beta-band activity was restricted to parietal scalp sites, the magnitude of the latter being correlated with reach reaction times. These results suggest an intermediate stage of sensorimotor transformations in bilateral parietal cortex when target location is not specified.
Application of descriptive statistics in analysis of experimental data
Mirilović Milorad; Pejin Ivana
2008-01-01
Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variations in mass phenomena. In fact, statistics comprise a group of methods used for the collection, analysis, presentation, and interpretation of the data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistical analysis and inferential statistics. The values which represent the results of an experiment, and which are the subj...
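As a concrete illustration of the descriptive measures the abstract refers to, here is a minimal sketch using Python's standard library on a fabricated sample of experimental measurements:

```python
import statistics

# Hypothetical experimental data, for illustration only
sample = [4.1, 3.8, 4.5, 4.0, 3.9, 4.3, 4.2]

print("mean   =", round(statistics.mean(sample), 3))
print("median =", statistics.median(sample))
print("stdev  =", round(statistics.stdev(sample), 3))  # sample standard deviation
print("range  =", round(max(sample) - min(sample), 3))
```

Inferential statistics would then go beyond these summaries, e.g., testing whether two such samples differ significantly.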
Dynamic channel adjustments in the Jingjiang Reach of the Middle Yangtze River
Xia, Junqiang; Deng, Shanshan; Lu, Jinyou; Xu, Quanxi; Zong, Quanli; Tan, Guangming
2016-03-01
Significant channel adjustments have occurred in the Jingjiang Reach of the Middle Yangtze River because of the operation of the Three Gorges Project (TGP). The Jingjiang Reach is selected as the study area, covering the Upper Jingjiang Reach (UJR) and Lower Jingjiang Reach (LJR). The reach-scale bankfull channel dimensions in the study reach were calculated annually from 2002 to 2013 by means of a reach-averaged approach and surveyed post-flood profiles at 171 sections. We find from the calculated results that the reach-scale bankfull widths changed slightly in the UJR and LJR, with the corresponding depths increasing by 1.6 m and 1.0 m, respectively. The channel adjustments occurred mainly with respect to bankfull depth because of the construction of large-scale bank revetment works, although there were significant bank erosion processes in local regions without bank protection engineering. The reach-scale bankfull dimensions in the UJR and LJR generally responded to the previous five-year average fluvial erosion intensity during flood seasons, with higher correlations being obtained for the depth and cross-sectional area. It is concluded that these dynamic adjustments of the channel geometry are a direct result of recent human activities such as the TGP operation.
Statistical and theoretical research
International Nuclear Information System (INIS)
Anon.
1983-01-01
Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where, and how much of, a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists also have devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology.
Statistical fluctuation phenomenon of early growth fission chain
International Nuclear Information System (INIS)
Zheng Chun; Song Lingli
2008-01-01
The early growth of the neutron population within a supercritical system of fissile material is of a statistical nature and may depart significantly from the average time-dependent neutron population. The probability of a source neutron sponsoring a persistent fission chain was considered for a supercritical system. Then the probability distribution in time of the neutron population reaching a preset level was deduced, based on the probability P(n,t) of n neutrons at time t. By combining the above two probabilities, the probability that there were no neutrons in the system at time t after the system reached critical was derived. The P(t) of the Godiva neutron excursion at supercritical, and the pre-burst probability of BARS, were calculated with this model and found to agree with the experimental results. (authors)
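The persistent-chain probability discussed above can be illustrated with a toy branching-process Monte Carlo. This is not the authors' model: it assumes a Poisson-distributed number of next-generation neutrons per neutron with mean k_eff, and counts a chain as persistent once its population reaches a preset level before dying out.

```python
import math
import random

def poisson(lam, rng):
    """Poisson sampler (Knuth's method); adequate for small lam."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        p *= rng.random()
        k += 1
    return k - 1

def chain_persists(k_eff, preset=100, max_gen=400, rng=random):
    """True if a single source neutron's chain reaches `preset` neutrons."""
    pop = 1
    for _ in range(max_gen):
        if pop == 0:
            return False
        if pop >= preset:
            return True
        pop = sum(poisson(k_eff, rng) for _ in range(pop))
    return pop >= preset

random.seed(1)
trials = 1500
persist = sum(chain_persists(1.05) for _ in range(trials)) / trials
print(f"estimated persistence probability: {persist:.3f}")
# Branching-process theory: the survival probability q solves
# q = 1 - exp(-k_eff * q); for k_eff = 1.05 this gives q of roughly 0.09.
```

Once the population is large, extinction becomes exponentially unlikely, which is why a preset level is a reasonable proxy for persistence.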
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
State Transportation Statistics 2010
2011-09-14
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...
State Transportation Statistics 2012
2013-08-15
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...
Adrenal Gland Tumors: Statistics
... Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 03/ ... A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...
State Transportation Statistics 2011
2012-08-08
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...
Neuroendocrine Tumor: Statistics
... Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 01/ ... the body. It is important to remember that statistics on the survival rates for people with a ...
State Transportation Statistics 2013
2014-09-19
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Naganna, Sujay Raghavendra; Deka, Paresh Chandra
2018-07-01
The hydro-geological properties of the streambed, together with the hydraulic gradients, determine the fluxes of water, energy, and solutes between the stream and the underlying aquifer system. Dam-induced sedimentation affects hyporheic processes and alters substrate pore-space geometries in the course of progressive stabilization of the sediment layers. Uncertainty in stream-aquifer interactions arises from the inherently complex nested flow paths and the spatio-temporal variability of streambed hydraulic properties. A detailed field investigation of streambed hydraulic conductivity (Ks) using a Guelph Permeameter was carried out in an intermittent stream reach of the Pavanje river basin, located in the mountainous, forested tract of the Western Ghats of India. The present study reports the spatial and temporal variability of streambed hydraulic conductivity along a stream reach obstructed by two vented dams in sequence. Statistical tests, namely Levene's test and Welch's t-test, were employed to check various variability measures. The strength of spatial dependence and the presence of spatial autocorrelation among the streambed Ks samples were tested using Moran's I statistic. The measures of central tendency and dispersion pointed to reasonable spatial variability in the Ks distribution throughout the study reach during two consecutive years, 2016 and 2017. The streambed was heterogeneous with regard to hydraulic conductivity distribution, with high-Ks zones near the backwater areas of the vented dam and low-Ks zones particularly at the tailwater sections of the vented dams. Dam operational strategies were responsible for seasonal fluctuations in sedimentation and modifications to streambed substrate characteristics (such as porosity, grain size, packing, etc.), resulting in heterogeneous streambed Ks profiles. The channel downstream of the vented dams contained significantly more cohesive deposits of fine sediment due to the overflow of surplus suspended sediment-laden water at low velocity
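The spatial-autocorrelation test mentioned above, Moran's I, can be sketched directly from its definition. The Ks values and adjacency weights below are fabricated for illustration, not the study's measurements:

```python
def morans_i(x, w):
    """Moran's I: (n/W) * sum_ij w_ij (x_i - m)(x_j - m) / sum_i (x_i - m)^2."""
    n = len(x)
    mean = sum(x) / n
    dev = [v - mean for v in x]
    num = sum(w[i][j] * dev[i] * dev[j] for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    W = sum(sum(row) for row in w)
    return (n / W) * (num / den)

# Fabricated Ks values (m/day) at 6 stations along a reach: a high-Ks
# zone upstream and a low-Ks zone downstream, as the abstract describes.
ks = [2.1, 2.3, 2.0, 0.6, 0.5, 0.7]
n = len(ks)
# Binary adjacency weights: stations i and i+1 are neighbours
w = [[1 if abs(i - j) == 1 else 0 for j in range(n)] for i in range(n)]

print(f"Moran's I = {morans_i(ks, w):.2f}")
```

A clearly positive value indicates spatial clustering of similar Ks values (neighbouring stations resemble each other), consistent with distinct high- and low-Ks zones along the reach.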
Model for neural signaling leap statistics
International Nuclear Information System (INIS)
Chevrollier, Martine; Oria, Marcos
2011-01-01
We present a simple model for neural signaling leaps in the brain, considering only the thermodynamic (Nernst) potential in neuron cells and brain temperature. We numerically simulated connections between arbitrarily localized neurons and analyzed the frequency distribution of the distances reached. We observed a qualitative change between Normal statistics (T = 37.5 °C, awake regime) and Lévy statistics (T = 35.5 °C, sleeping period), characterized by rare events of long-range connections.
Model for neural signaling leap statistics
Chevrollier, Martine; Oriá, Marcos
2011-03-01
We present a simple model for neural signaling leaps in the brain, considering only the thermodynamic (Nernst) potential in neuron cells and brain temperature. We numerically simulated connections between arbitrarily localized neurons and analyzed the frequency distribution of the distances reached. We observed a qualitative change between Normal statistics (T = 37.5 °C, awake regime) and Lévy statistics (T = 35.5 °C, sleeping period), characterized by rare events of long-range connections.
Model for neural signaling leap statistics
Energy Technology Data Exchange (ETDEWEB)
Chevrollier, Martine; Oria, Marcos, E-mail: oria@otica.ufpb.br [Laboratorio de Fisica Atomica e Lasers Departamento de Fisica, Universidade Federal da ParaIba Caixa Postal 5086 58051-900 Joao Pessoa, Paraiba (Brazil)
2011-03-01
We present a simple model for neural signaling leaps in the brain, considering only the thermodynamic (Nernst) potential in neuron cells and brain temperature. We numerically simulated connections between arbitrarily localized neurons and analyzed the frequency distribution of the distances reached. We observed a qualitative change between Normal statistics (T = 37.5 °C, awake regime) and Lévy statistics (T = 35.5 °C, sleeping period), characterized by rare events of long-range connections.
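The qualitative contrast the three records above describe, Normal versus Lévy statistics of connection distances, can be illustrated with a toy sampling experiment. This is not the authors' neuron model; it simply uses the standard Cauchy distribution as a representative Lévy-stable law:

```python
import math
import random

random.seed(0)
N = 100_000
threshold = 10.0  # "long-range" cutoff, in units of the scale parameter

# Gaussian ("Normal") step lengths: essentially no mass beyond 10 sigma
gauss_long = sum(abs(random.gauss(0.0, 1.0)) > threshold for _ in range(N))

# Standard Cauchy (a Levy-stable law) via inverse-CDF sampling
cauchy_long = sum(
    abs(math.tan(math.pi * (random.random() - 0.5))) > threshold
    for _ in range(N)
)

print(f"Gaussian steps beyond {threshold}: {gauss_long} of {N}")
print(f"Cauchy steps beyond {threshold}: {cauchy_long} of {N}")
```

The heavy Cauchy tail produces a non-negligible fraction of very long jumps, the "rare events of long-range connections" of the Lévy regime, while the Gaussian tail produces essentially none.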
East-West European economic integration: Difficult to reach target
International Nuclear Information System (INIS)
D'Ermo, V.; Manca, S.
1993-01-01
The energy sector of Western Europe is now undergoing a slow growth period, due largely to the socio-economic upheavals of East and West German unification and the political-economic restructuring of the countries making up Eastern Europe and the former Soviet Union. This paper demonstrates this by tabulating and commenting on 1991-1992 coal, petroleum, natural gas, and electric power production/consumption/export statistical data representing energy sector activities in the former COMECON member countries. The poor performance of these countries can be attributed to the effects of energy market liberalization, the restructuring of utility assets, limited production capacities, and inflation. It is estimated that the adjustment time needed to reach economic parity with Western nations will be long, but that the waiting period could be shortened through the implementation of technology transfer and financial cooperation programs with the more prosperous countries capable of providing the investment capital and know-how needed for the restructuring of production systems and resource development
Guaranteed performance in reaching mode of sliding mode ...
Indian Academy of Sciences (India)
addresses the design of constant plus proportional rate reaching law-based SMC for second-order ... Reaching mode; sliding mode controlled systems; output tracking ... The uncertainty in the input distribution function g is expressed as.
Nanomaterials under REACH. Nanosilver as a case study
Pronk MEJ; Wijnhoven SWP; Bleeker EAJ; Heugens EHW; Peijnenburg WJGM; Luttik R; Hakkert BC; SEC; SIR; LER
2009-01-01
To assess and manage the risks of nanomaterials, some adjustments are needed in the European chemicals legislation REACH. The substance data that REACH requires by default are insufficient to determine the specific properties of nanomaterials. The same
Reaching Adolescents and Youth in Burkina Faso, Guinea-Bissau
African Journals Online (AJOL)
AJRH Managing Editor
typical profile of individuals in contact with peer educators or attending youth ... being reached (versus not reached) by programs ... characteristics in order to serve groups that may be ... places for counseling services but the frequency of.
Minetti, Andrea; Hurtado, Northan; Grais, Rebecca F.; Ferrari, Matthew
2014-01-01
Current mass vaccination campaigns in measles outbreak response are nonselective with respect to the immune status of individuals. However, the heterogeneity in immunity, due to previous vaccination coverage or infection, may lead to potential bias of such campaigns toward those with previous high access to vaccination and may result in a lower-than-expected effective impact. During the 2010 measles outbreak in Malawi, only 3 of the 8 districts where vaccination occurred achieved a measurable effective campaign impact (i.e., a reduction in measles cases in the targeted age groups greater than that observed in nonvaccinated districts). Simulation models suggest that selective campaigns targeting hard-to-reach individuals are of greater benefit, particularly in highly vaccinated populations, even for low target coverage and with late implementation. However, the choice between targeted and nonselective campaigns should be context specific, achieving a reasonable balance of feasibility, cost, and expected impact. In addition, it is critical to develop operational strategies to identify and target hard-to-reach individuals. PMID:24131555
[Comment on] Statistical discrimination
Chinn, Douglas
In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.
Interaction torque contributes to planar reaching at slow speed
Directory of Open Access Journals (Sweden)
Hoshi Fumihiko
2008-10-01
Full Text Available Abstract Background How the central nervous system (CNS) organizes the joint dynamics for multi-joint movement is a complex problem, because of the passive interaction among segmental movements. Previous studies have demonstrated that the CNS predictively compensates for interaction torque (INT), which arises from the movement of the adjacent joints. However, most of these studies have mainly examined quick movements, presumably because the current belief is that the effects of INT are not significant at slow speeds. The functional contribution of INT to multi-joint movements performed at various speeds is still unclear. The purpose of this study was to examine the contribution of INT to planar reaching over a wide range of motion speeds in healthy subjects. Methods Subjects performed reaching movements toward five targets under three different speed conditions. Joint position data were recorded using a 3-D motion analysis device (50 Hz). Torque components, muscle torque (MUS), interaction torque (INT), gravity torque (G), and net torque (NET), were calculated by solving the dynamic equations for the shoulder and elbow. NET at a joint, which produces the joint kinematics, is an algebraic sum of the torque components: NET = MUS - G - INT. Dynamic muscle torque (DMUS = MUS - G) was also calculated. The contributions of INT impulse and DMUS impulse to NET impulse were examined. Results The relative contribution of INT to NET was not dependent on speed for both joints at every target. INT was additive (same direction) to DMUS at the shoulder joint, while at the elbow DMUS counteracted (opposed) INT. The trajectory of the reach was linear, and the two-joint movements were coordinated with a specific combination at each target, regardless of motion speed. However, DMUS at the elbow was opposed to the direction of elbow movement, and its magnitude varied from trial to trial in order to compensate for the variability of INT. Conclusion Interaction torque was important at
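The torque decomposition in the abstract above (NET = MUS - G - INT, with DMUS = MUS - G) can be illustrated with a toy impulse calculation. The torque traces below are fabricated, not the study's data; impulses are computed by trapezoidal integration over time:

```python
def impulse(torque, dt):
    """Trapezoidal time integral of a torque series (the torque impulse)."""
    return sum((a + b) / 2.0 * dt for a, b in zip(torque, torque[1:]))

dt = 0.02  # 50 Hz sampling, as in the study
# Fabricated torque traces (N*m) over one short movement
tau_mus = [0.0, 1.0, 2.0, 2.5, 2.0, 1.0, 0.0]   # muscle torque MUS
tau_g   = [0.5] * 7                              # gravity torque G
tau_int = [0.0, 0.2, 0.5, 0.6, 0.5, 0.2, 0.0]   # interaction torque INT

dmus = [m - g for m, g in zip(tau_mus, tau_g)]               # DMUS = MUS - G
net  = [m - g - i for m, g, i in zip(tau_mus, tau_g, tau_int)]  # NET

J_net, J_dmus, J_int = impulse(net, dt), impulse(dmus, dt), impulse(tau_int, dt)
print(f"NET impulse  = {J_net:.4f} N*m*s")
print(f"DMUS impulse = {J_dmus:.4f} N*m*s")
print(f"relative INT contribution to NET = {-J_int / J_net:.2f}")
```

In this fabricated example the INT impulse opposes the net impulse (negative relative contribution), the situation the abstract describes for the elbow joint.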
Preventing statistical errors in scientific journals.
Nuijten, M.B.
2016-01-01
There is evidence for a high prevalence of statistical reporting errors in psychology and other scientific fields. These errors display a systematic preference for statistically significant results, distorting the scientific literature. There are several possible causes for this systematic error
Decoding Grasping Movements from the Parieto-Frontal Reaching Circuit in the Nonhuman Primate.
Nelissen, Koen; Fiave, Prosper Agbesi; Vanduffel, Wim
2018-04-01
Prehension movements typically include a reaching phase, guiding the hand toward the object, and a grip phase, shaping the hand around it. The dominant view posits that these components rely upon largely independent parieto-frontal circuits: a dorso-medial circuit involved in reaching and a dorso-lateral circuit involved in grasping. However, mounting evidence suggests a more complex arrangement, with dorso-medial areas contributing to both reaching and grasping. To investigate the role of the dorso-medial reaching circuit in grasping, we trained monkeys to reach-and-grasp different objects in the dark and determined whether hand configurations could be decoded from functional magnetic resonance imaging (fMRI) responses obtained from the reaching and grasping circuits. Indicative of their established role in grasping, object-specific grasp decoding was found in the anterior intraparietal (AIP) area, inferior parietal lobule area PFG, and ventral premotor region F5 of the lateral grasping circuit, as well as in primary motor cortex. Importantly, the medial reaching circuit also conveyed robust grasp-specific information, as evidenced by significant decoding in parietal reach regions (in particular V6A) and dorsal premotor region F2. These data support the proposed role of dorso-medial "reach" regions in controlling aspects of grasping and demonstrate the value of complementing univariate with more sensitive multivariate analyses of fMRI data in uncovering information coding in the brain.
Ibarra, Jose Luis; Agas, Jessica Marie; Lee, Melissa; Pan, Julia Lily; Buttenheim, Alison Meredith
2018-04-16
Recruiting hard-to-reach populations for health research is challenging. Web-based platforms offer one way to recruit specific samples for research purposes, but little is known about the feasibility of online recruitment and the representativeness and comparability of samples recruited through different Web-based platforms. The objectives of this study were to determine the feasibility of recruiting a hard-to-reach population (pregnant smokers) using 4 different Web-based platforms and to compare participants recruited through each platform. A screener and survey were distributed online through Qualtrics Panel, Soapbox Sample, Reddit, and Amazon Mechanical Turk (mTurk). Descriptive statistics were used to summarize the results of each recruitment platform, including eligibility yield, quality yield, income, race, age, and gestational age. Of the 3847 participants screened for eligibility across all 4 Web-based platforms, 535 were eligible and 308 completed the survey. Amazon mTurk yielded the fewest completed responses (n=9), 100% (9/9) of which passed several quality metrics verifying pregnancy and smoking status. Qualtrics Panel yielded 14 completed responses, 86% (12/14) of which passed the quality screening. Soapbox Sample produced 107 completed surveys, 67% (72/107) of which were found to be quality responses. Advertising through Reddit produced the most completed surveys (n=178), but only 29.2% (52/178) of those surveys passed the quality metrics. We found significant differences in eligibility yield, quality yield, age, number of previous pregnancies, age of smoking initiation, current smokers, race, education, and income. Although each platform successfully recruited pregnant smokers, results varied in quality, cost, and percentage of complete responses. Moving forward, investigators should pay careful attention to the percentage yield and cost of online recruitment platforms to maximize internal and external validity.
Statistics in Schools: Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data. Explore the site for standards-aligned, classroom-ready activities.
Transport Statistics - Transport - UNECE
IAEA Patient Protection Effort Reaches Key Milestone
International Nuclear Information System (INIS)
2012-01-01
Full text: An International Atomic Energy Agency (IAEA) effort to help people track their radiation exposure from medical procedures achieved a significant milestone this week. The Agency received the final approval from a group of medical oversight organizations for the 'Joint Position Statement on the IAEA Patient Radiation Exposure Tracking', a set of principles to guide patient protection efforts at the sub-national, national, and international level. The joint statement endorses the IAEA's three-year-old Smart Card/SmartRadTrack project, which aims to help nations develop systems to track medical radiation procedures and radiation doses. The statement has been agreed by the World Health Organization (WHO), the U.S. Food and Drug Administration (FDA), the European Society of Radiology (ESR), the International Organization for Medical Physics (IOMP), the International Society of Radiographers and Radiological Technologists (ISRRT), and the Conference of Radiation Control Program Directors, USA (CRCPD). 'This system is critical if the medical community is going to keep patients safe when they are being referred for more and more diagnostic scans. These scans, over the years, are made using more and more powerful machines', said Madan Rehani, Radiation Safety Specialist in the IAEA's Radiation Protection of Patients Unit. 'The tracking system will draw doctors' attention to previous radiological examinations, both in terms of clinical information and radiation dose and thus help them assess whether the 11th or 20th CT scan is really appropriate, whether it will do more good than harm.' Advances in radiation-based diagnostic technologies, such as the CT scan, have led to patients receiving such procedures more frequently. The convenience of CT with the added advantage of increased information has resulted in increased usage to the point that there are instances of patients getting tens of CT scans in a few years, not all of which may be justified, or getting CT
Directory of Open Access Journals (Sweden)
K. Geina
2015-03-01
Full Text Available. Purpose. To analyze the dynamics of the age structure of pike (Esox lucius Linnaeus, 1758) in the lower reaches of the Dnieper under changing fishing pressure. Methodology. The fishing situation was analyzed based on data from official fishery statistics. Fish sampling was done at control-observation posts of the Institute of Fisheries of the NAAS of Ukraine and directly in the fishery. Juvenile fish yield was determined using a complex of fry fishing gears with a stationary net station. Field and laboratory processing of the material followed generally accepted methods. Findings. A retrospective analysis of the situation in the lower Dnieper-Bug system clearly indicates a continuous decline in catches of a representative of the native fish fauna, the pike. While the "yield" of its juveniles was relatively uniform both before the impoundment of the Dnieper flow and at present, its commercial catches have dropped significantly. The current age structure of pike shows an increasing relative density of the age groups that recruit into the commercial portion of the population (1-1+) and a decreasing contribution from the older, right-hand part of the age series. The discrepancy between the observed changes in age structure and commercial harvest quantities indicates increased human pressure on this species. Originality. For the first time, we analyzed the dynamics of juvenile "yield" and the age structure of the commercial pike stock of the lower Dnieper during the transformation of the river flow. Practical value. A decrease in the ichthyomass of piscivorous fishes in the lower Dnieper shifts the fish populations of littoral biotopes toward dominance by coarse species, which deteriorates forage availability for a number of valuable commercial species. An increase in the number of pike can regulate the strain
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In this paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single-particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description, and the corresponding statistical mechanics
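The two limiting cases named in the abstract, FDS and BES, can be illustrated with the standard mean occupation numbers (a textbook sketch; the paper's generalized statistics itself is not reproduced here):

```python
import math

def occupancy(energy, mu, kT, statistics):
    """Mean occupation number of a single-particle level.

    statistics: +1 for Fermi-Dirac (Pauli exclusion), -1 for Bose-Einstein.
    """
    x = (energy - mu) / kT
    return 1.0 / (math.exp(x) + statistics)

# Fermi-Dirac occupation at the chemical potential is exactly 1/2
fd = occupancy(1.0, 1.0, 0.1, +1)
# Bose-Einstein occupation grows without bound as energy approaches mu
be = occupancy(1.1, 1.0, 0.1, -1)
```

The single sign in the denominator is what separates the two classical cases; a generalized statistics interpolates between or extends such occupation laws.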
Statistical analysis and data management
International Nuclear Information System (INIS)
Anon.
1981-01-01
This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship was found between variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine
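The kind of correlation test described can be sketched as follows; the data are simulated and the variable names are illustrative, since the report's actual model and data are not reproduced here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical monthly series: precipitation (mm) and small-mammal captures.
# The capture counts are generated independently of precipitation here, so a
# non-significant correlation is the expected outcome, as in the report.
precip = rng.gamma(shape=2.0, scale=15.0, size=36)
captures = rng.poisson(lam=20, size=36)

r, p = stats.pearsonr(precip, captures)
```

A significant relationship would show up as a large |r| together with a small p-value; with independent simulated series, neither is expected.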
Statistical Power in Plant Pathology Research.
Gent, David H; Esker, Paul D; Kriss, Alissa B
2018-01-01
In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
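The power definition above can be turned into a quick calculation; the following normal-approximation sketch for a two-sided two-sample test is illustrative, not taken from the article:

```python
from scipy.stats import norm

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test.

    d: standardized effect size (Cohen's d); n_per_group: sample size per arm.
    Normal approximation; an exact t-based calculation gives slightly lower power.
    """
    z_crit = norm.ppf(1 - alpha / 2)
    noncentrality = d * (n_per_group / 2) ** 0.5
    # Power = P(reject H0 | effect d): probability mass beyond either critical value
    return norm.cdf(noncentrality - z_crit) + norm.cdf(-noncentrality - z_crit)

# A medium effect (d = 0.5) with 64 subjects per group gives power just above 0.8,
# the conventional threshold mentioned above
power_medium = two_sample_power(0.5, 64)
```

Planning with such a calculation, rather than reporting power after the fact, is the use the article emphasizes.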
Detecting Novelty and Significance
Ferrari, Vera; Bradley, Margaret M.; Codispoti, Maurizio; Lang, Peter J.
2013-01-01
Studies of cognition often use an “oddball” paradigm to study effects of stimulus novelty and significance on information processing. However, an oddball tends to be perceptually more novel than the standard, repeated stimulus as well as more relevant to the ongoing task, making it difficult to disentangle effects due to perceptual novelty and stimulus significance. In the current study, effects of perceptual novelty and significance on ERPs were assessed in a passive viewing context by presenting repeated and novel pictures (natural scenes) that either signaled significant information regarding the current context or not. A fronto-central N2 component was primarily affected by perceptual novelty, whereas a centro-parietal P3 component was modulated by both stimulus significance and novelty. The data support an interpretation that the N2 reflects perceptual fluency and is attenuated when a current stimulus matches an active memory representation and that the amplitude of the P3 reflects stimulus meaning and significance. PMID:19400680
Significant NRC Enforcement Actions
Nuclear Regulatory Commission — This dataset provides a list of Nuclear Regulatory Commission (NRC) issued significant enforcement actions. These actions, referred to as "escalated", are issued by...
National Statistical Commission and Indian Official Statistics
Indian Academy of Sciences (India)
Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS) University of Hyderabad Campus Central University Post Office, Prof. C. R. Rao Road Hyderabad 500 046, AP, India.
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive, user-friendly practical guidance on the essential statistical methods applied in industry. Explores
Action plans can interact to hinder or facilitate reach performance.
Fournier, Lisa R; Wiediger, Matthew D; Taddese, Ezana F
2015-11-01
Executing a reach action can be delayed while retaining another action in working memory (WM) if the two action plans partly overlap rather than do not overlap. This delay (partial repetition cost) occurs when reach responses are under cognitive control. In this study, we investigated whether facilitation (a partial repetition benefit) occurs when reach responses are automatic. We also examined whether the hemisphere controlling the limb or selection of the preferred limb (based on a free-reach task) influences reach performance when the actions partly overlap. Left- and right-handers reached to different stimulus locations to the left and right of body midline with their ipsilateral hand while maintaining an action plan in WM that required the same or the different hand. The results showed a partial repetition benefit for spatially compatible reaches to left and right stimulus locations far from the body midline, but not for those near the body midline. Also, no partial repetition cost was found at any of the stimulus-reach locations. This indicates that automatic reach responses that partly overlap with an action plan maintained in WM are not delayed, but instead can be facilitated (partial repetition benefit). The roles of hemisphere and reach-hand preference in action control and the importance of the degree of feature overlap in obtaining a partial repetition benefit (and cost) are discussed.
International Nuclear Information System (INIS)
Dienes, J.K.
1993-01-01
Although it is possible to simulate the ground blast from a single explosive shot with a simple computer algorithm and appropriate constants, the most commonly used modelling methods do not account for major changes in geology or shot energy because mechanical features such as tectonic stresses, fault structure, microcracking, brittle-ductile transition, and water content are not represented in significant detail. An alternative approach for modelling called Statistical Crack Mechanics is presented in this paper. This method, developed in the seventies as a part of the oil shale program, accounts for crack opening, shear, growth, and coalescence. Numerous photographs and micrographs show that shocked materials tend to involve arrays of planar cracks. The approach described here provides a way to account for microstructure and give a representation of the physical behavior of a material at the microscopic level that can account for phenomena such as permeability, fragmentation, shear banding, and hot-spot formation in explosives
Graphene Statistical Mechanics
Bowick, Mark; Kosmrlj, Andrej; Nelson, David; Sknepnek, Rastko
2015-03-01
Graphene provides an ideal system to test the statistical mechanics of thermally fluctuating elastic membranes. The high Young's modulus of graphene means that thermal fluctuations over even small length scales significantly stiffen the renormalized bending rigidity. We study the effect of thermal fluctuations on graphene ribbons of width W and length L, pinned at one end, via coarse-grained Molecular Dynamics simulations and compare with analytic predictions of the scaling of width-averaged root-mean-squared height fluctuations as a function of distance along the ribbon. Scaling collapse as a function of W and L also allows us to extract the scaling exponent eta governing the long-wavelength stiffening of the bending rigidity. A full understanding of the geometry-dependent mechanical properties of graphene, including arrays of cuts, may allow the design of a variety of modular elements with desired mechanical properties starting from pure graphene alone. Supported by NSF grant DMR-1435794
[Big data in official statistics].
Zwick, Markus
2015-08-01
The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Before big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have agreed on a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.
Caçola, Priscila M; Pant, Mohan D
2014-10-01
The purpose was to use a multi-level statistical technique to analyze how children's age, motor proficiency, and cognitive styles interact to affect accuracy on reach estimation tasks via Motor Imagery and Visual Imagery. Results from the Generalized Linear Mixed Model analysis (GLMM) indicated that only the 7-year-old age group had significant random intercepts for both tasks. Motor proficiency predicted accuracy in reach tasks, and cognitive styles (object scale) predicted accuracy in the motor imagery task. GLMM analysis is suitable to explore age and other parameters of development. In this case, it allowed an assessment of motor proficiency interacting with age to shape how children represent, plan, and act on the environment.
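A random-intercept model of the kind described can be sketched with statsmodels; the data, variable names, and effect sizes below are invented for illustration and do not reproduce the study's analysis:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Hypothetical data: 60 children in three age groups, scored for accuracy on
# a reach-estimation task, with motor proficiency as a continuous predictor
n = 60
df = pd.DataFrame({
    "age_group": rng.choice(["5", "7", "9"], size=n),
    "motor_proficiency": rng.normal(50, 10, size=n),
})
df["accuracy"] = 0.4 + 0.005 * df["motor_proficiency"] + rng.normal(0, 0.05, n)

# Mixed model: fixed effect of motor proficiency, random intercept per age group
model = smf.mixedlm("accuracy ~ motor_proficiency", df, groups=df["age_group"])
result = model.fit()
```

Inspecting `result.summary()` shows the fixed-effect estimate alongside the group (random-intercept) variance, which is the quantity the study's GLMM analysis examined per age group.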
Recreational Boating Statistics 2012
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Recreational Boating Statistics 2013
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
Recreational Boating Statistics 2011
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Uterine Cancer Statistics: ... the most commonly diagnosed gynecologic cancer. U.S. Cancer Statistics Data Visualizations Tool: the Data Visualizations tool makes ...
Tuberculosis Data and Statistics
Data and Statistics: Decrease in Reported Tuberculosis Cases. Morbidity and Mortality Weekly Reports, MMWR 2010; 59 ( ...
National transportation statistics 2011
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...
National Transportation Statistics 2008
2009-01-08
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...
Statistics: Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics topics: Mental Illness, Any Anxiety Disorder ...
School Violence: Data & Statistics
Caregiver Statistics: Demographics
Selected Long-Term Care Statistics: What is ... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...
Alcohol Facts and Statistics: Alcohol Use in the United States: ... 1238–1245, 2004. PMID: 15010446. National Center for Statistics and Analysis. 2014 Crash Data Key Findings (Traffic ...
National Transportation Statistics 2009
2010-01-21
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
National transportation statistics 2010
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Principles of applied statistics
National Research Council Canada - National Science Library
Cox, D. R; Donnelly, Christl A
2011-01-01
.... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...
Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc
Interactive statistics with ILLMO
Martens, J.B.O.S.
2014-01-01
Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Youth Sports Safety Statistics
... 6):794-799. 31 American Heart Association. CPR statistics. www.heart.org/HEARTORG/CPRAndECC/WhatisCPR/CPRFactsandStats/CPR%20Statistics_ ... Mental Health Services Administration, Center for Behavioral Health Statistics and Quality. (January 10, 2013). The DAWN Report: ...
Reach/frequency for printed media: Personal probabilities or models
DEFF Research Database (Denmark)
Mortensen, Peter Stendahl
2000-01-01
The author evaluates two different ways of estimating the reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates the parameters of a model for reach/frequency. It is concluded ...... and estimates from such models are shown to be closer to panel data. The problem, however, is to get valid input for such models from readership surveys. Means for this are discussed....
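The first, simulation-based approach can be sketched directly; the media plan, probabilities, and respondent counts below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical plan: 3 titles with 4 insertions each, and per-respondent
# reading probabilities for each title (in practice these would come from
# a readership survey, grouped as the abstract describes)
n_respondents = 10_000
insertions = np.array([4, 4, 4])
read_prob = rng.beta(1.5, 6.0, size=(n_respondents, 3))  # skewed toward low

# Each insertion reaches respondent i independently with probability
# read_prob[i, title]; count total exposures per respondent
exposures = rng.binomial(insertions, read_prob).sum(axis=1)

reach = np.mean(exposures >= 1)                    # share exposed at least once
avg_frequency = exposures[exposures >= 1].mean()   # mean exposures among reached
```

The model-based alternative the abstract favors would instead fit a parametric exposure distribution (e.g. beta-binomial) and derive reach and frequency from the fitted parameters.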
On two methods of statistical image analysis
Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.
1999-01-01
The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,
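The shared core of both procedures, computing a statistic per voxel and assessing it, can be sketched minimally (this is not the CBA or SPM implementation; array sizes and values are invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical spatially standardized volumes: 8 activation and 8 rest scans,
# each a 16x16x16 voxel grid (real PET volumes are far larger)
activation = rng.normal(100, 10, size=(8, 16, 16, 16))
rest = rng.normal(100, 10, size=(8, 16, 16, 16))
activation[:, 8, 8, 8] += 25  # implant a "true" activation at one voxel

# Voxel-wise two-sample t test across subjects (axis 0)
t_map, p_map = stats.ttest_ind(activation, rest, axis=0)

# The implanted voxel will typically be at or near the peak of |t|
peak = np.unravel_index(np.abs(t_map).argmax(), t_map.shape)
```

Evaluating significance across tens of thousands of voxels then requires the multiple-comparison corrections that distinguish the two procedures.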
Investigation of PAM-4 for extending reach in data center interconnect applications
DEFF Research Database (Denmark)
Vegas Olmos, Juan José; Teipen, Brian; Eiselt, Nicklas
2015-01-01
Optical four-level pulse amplitude modulation (PAM-4) is being widely studied for various short-reach optical interfaces, motivated by the need to keep cost structure low, and to increase link capacity despite various constraints in component bandwidth. When considering PAM-4 in applications...... with reach significantly greater than 10km, such as in extended data center interconnects which require optical amplification, impairments such as chromatic dispersion, optical filtering, and ASE must be controlled. We investigate and report on requirements of PAM-4 for extended-reach, data center...
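PAM-4's level mapping itself can be shown in a few lines; the Gray-coded assignment below is the conventional one, not necessarily the mapping used in the reported experiments:

```python
# Gray-coded PAM-4: bit pairs map to 4 amplitude levels so that adjacent
# levels differ in a single bit, limiting bit errors from level mis-decisions
GRAY_MAP = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_modulate(bits):
    """Map an even-length bit sequence to a list of PAM-4 symbols."""
    pairs = zip(bits[0::2], bits[1::2])
    return [GRAY_MAP[p] for p in pairs]

symbols = pam4_modulate([0, 0, 0, 1, 1, 1, 1, 0])
# → [-3, -1, 1, 3]
```

Carrying two bits per symbol halves the baud rate for a given bit rate, which is why PAM-4 relaxes component bandwidth constraints; the cost is reduced level spacing, which is what makes the amplified, extended-reach impairments above matter.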
Evaluation of Juvenile Fall Chinook Stranding on the Hanford Reach, 1997-1999 Interim Report.
Energy Technology Data Exchange (ETDEWEB)
Wagner, Paul; Nugent, John; Price, William (Washington Department of Fish and Wildlife, Olympia, WA)
1999-02-15
Pilot work was conducted in 1997 to aid development of the study design for the 1998 Evaluation of Juvenile Fall Chinook Stranding on the Hanford Reach. The objectives of the 1997 work were to: (1) identify juvenile chinook production and rearing areas..., (2) identify sampling sites and develop the statistical parameters necessary to complete the study, (3) develop a study plan..., (4) conduct field sampling activities...
Parametric design studies of long-reach manipulators
International Nuclear Information System (INIS)
Kwon, D.S.; March-Leuba, S.; Babcock, S.M.; Burks, B.L.; Hamel, W.R.
1993-01-01
A number of different approaches have been studied for remediation of waste storage tanks at various sites. One of the most promising approaches is the use of a high-capacity, long-reach manipulation (LRM) system with a variety of end effectors for dislodging the waste. LRMs may have characteristics significantly different from those of industrial robots due to the long links needed to cover the large workspace. Because link lengths are much greater than their diameters, link flexibility, as well as joint or drive train flexibility, is likely to be significant. LRMs will be required for a variety of applications in the Environmental Restoration and Waste Management Program. While each application will present specific functional, kinematic, and performance requirements, a design approach for determining the kinematic applicability and performance characteristics considering link flexibility is presented with a focus on waste storage tank remediation. This paper addresses key design issues for LRM-based waste retrieval systems. It discusses the effects of parameters such as payload capacity, storage tanks size, and access port diameter on manipulator structural design. The estimated weight, fundamental natural frequency, and static deflection of the manipulator have been calculated for various parameter conditions
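The static deflection and fundamental frequency estimates mentioned can be sketched for a single link with Euler-Bernoulli beam formulas; all dimensions, loads, and material values below are hypothetical:

```python
import math

def cantilever_props(length, outer_d, wall_t, tip_load, E=200e9, rho=7850.0):
    """Static tip deflection and first natural frequency of a tubular
    cantilever, a crude stand-in for one flexible LRM link.

    SI units; E and rho default to steel. Ignores joint and drive-train
    compliance, which the paper notes are also likely to be significant.
    """
    inner_d = outer_d - 2 * wall_t
    I = math.pi * (outer_d**4 - inner_d**4) / 64        # second moment of area
    A = math.pi * (outer_d**2 - inner_d**2) / 4         # cross-section area
    deflection = tip_load * length**3 / (3 * E * I)     # Euler-Bernoulli tip sag
    # First bending mode of a uniform cantilever: beta1 * L = 1.875
    f1 = (1.875**2 / (2 * math.pi)) * math.sqrt(E * I / (rho * A * length**4))
    return deflection, f1

# Hypothetical 15 m link, 0.5 m diameter, 10 mm wall, 2 kN tip payload
defl, freq = cantilever_props(15.0, 0.5, 0.01, 2000.0)
```

Even this crude model shows the design tension: links slender enough to pass a small access port sag by centimeters and have a fundamental frequency of only a few hertz, which drives the control and performance requirements discussed above.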
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Kim, E.; Newton, A. P.
2012-04-01
One major problem in dynamo theory is the multi-scale nature of MHD turbulence, which requires a statistical theory in terms of probability distribution functions. In this contribution, we present the statistical theory of magnetic fields in a simplified mean-field α-Ω dynamo model by varying the statistical properties of alpha, including marginal stability and intermittency, and then utilize observational data of solar activity to fine-tune the mean-field dynamo model. Specifically, we first present a comprehensive investigation into the effect of the stochastic parameters in a simplified α-Ω dynamo model. By considering the manifold of marginal stability (the region of parameter space where the mean growth rate is zero), we show that stochastic fluctuations are conducive to dynamo action. Furthermore, by considering cases of fluctuating alpha that are periodic and Gaussian coloured random noise with identical characteristic time-scales and fluctuating amplitudes, we show that the transition to dynamo is significantly facilitated for stochastic alpha with random noise. We also show that probability density functions (PDFs) of the growth rate, magnetic field, and magnetic energy can provide a wealth of useful information regarding dynamo behaviour and intermittency. Finally, the precise statistical properties of the dynamo, such as temporal correlation and fluctuating amplitude, are found to depend on the distribution of the fluctuations of the stochastic parameters. We then use observations of solar activity to constrain parameters relating to the α effect in stochastic nonlinear α-Ω dynamo models. This is achieved by performing a comprehensive statistical comparison of PDFs of solar activity computed from observations and from our simulation of the mean-field dynamo model. The observational data used are the time history of solar activity inferred from C14 data over the past 11000 years on a long time scale and direct observations of the sun spot
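The growth-rate PDFs described above can be illustrated with a toy computation; the model, parameter values, and noise process below are schematic assumptions for illustration, not the authors' fitted model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stochastic mean-field model: dB/dt = (alpha(t) - alpha_c) * B, with
# alpha(t) an Ornstein-Uhlenbeck process (Gaussian colored noise) of mean
# alpha0, correlation time tau, and fluctuation amplitude sigma.
alpha0, alpha_c, tau, sigma = 1.0, 1.0, 0.5, 0.6   # marginal: alpha0 == alpha_c
dt, n_steps, n_runs = 0.01, 5000, 400

alpha = np.full(n_runs, alpha0)
log_b = np.zeros(n_runs)
for _ in range(n_steps):
    noise = rng.normal(size=n_runs)
    alpha += (alpha0 - alpha) * dt / tau + sigma * np.sqrt(2 * dt / tau) * noise
    log_b += (alpha - alpha_c) * dt      # evolve log B to avoid overflow

growth_rates = log_b / (n_steps * dt)
pdf, edges = np.histogram(growth_rates, bins=40, density=True)
```

The histogram approximates the growth-rate PDF; at marginal mean stability it is centered near zero, and its spread and shape carry the intermittency information the abstract refers to.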
Lectures on algebraic statistics
Drton, Mathias; Sullivant, Seth
2009-01-01
How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Baseline Statistics of Linked Statistical Data
Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe
2014-01-01
We are surrounded by an ever increasing ocean of information; everybody will agree with that. We build sophisticated strategies to govern this information: we design data models, develop infrastructures for data sharing, and build tools for data analysis. Statistical datasets curated by National
Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze
2014-08-01
Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental results showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.
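The pretest/posttest group comparison described above can be sketched with a two-sample test statistic. The following is a minimal illustration; the anxiety scores and group sizes are invented for the example and are not the study's data.

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Hypothetical posttest anxiety scores (lower = less anxious).
one_minute = [12, 14, 11, 13, 10, 12, 13]   # one-minute paper group
textbook   = [18, 17, 19, 16, 20, 18, 17]   # textbook-exercise group

t = welch_t(one_minute, textbook)
print(round(t, 2))  # strongly negative: the strategy group reports less anxiety
```

In practice the statistic would be compared against a t distribution with Welch-adjusted degrees of freedom to obtain a p-value.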
Dunbar, P. K.; Furtney, M.; McLean, S. J.; Sweeney, A. D.
2014-12-01
Tsunamis have inflicted death and destruction on the coastlines of the world throughout history. The occurrence of tsunamis and the resulting effects have been collected and studied as far back as the second millennium B.C. The knowledge gained from cataloging and examining these events has led to significant changes in our understanding of tsunamis, tsunami sources, and methods to mitigate the effects of tsunamis. The most significant, not surprisingly, are often the most devastating, such as the 2011 Tohoku, Japan earthquake and tsunami. The goal of this poster is to give a brief overview of the occurrence of tsunamis and then focus specifically on several significant tsunamis. There are various criteria to determine the most significant tsunamis: the number of deaths, the amount of damage, the maximum runup height, whether the event had a major impact on tsunami science or policy, etc. As a result, descriptions will include some of the most costly (2011 Tohoku, Japan), the most deadly (2004 Sumatra, 1883 Krakatau), and the highest runup ever observed (1958 Lituya Bay, Alaska). The discovery of the Cascadia subduction zone as the source of the 1700 Japanese "Orphan" tsunami, and a future tsunami threat to the U.S. northwest coast, contributed to the decision to form the U.S. National Tsunami Hazard Mitigation Program. The great Lisbon earthquake of 1755 marked the beginning of the modern era of seismology. Knowledge gained from the 1964 Alaska earthquake and tsunami helped confirm the theory of plate tectonics. The 1946 Alaska, 1952 Kuril Islands, 1960 Chile, 1964 Alaska, and 2004 Banda Aceh tsunamis all resulted in warning centers or systems being established. The data descriptions on this poster were extracted from NOAA's National Geophysical Data Center (NGDC) global historical tsunami database. Additional information about these tsunamis, as well as water level data, can be found by accessing the NGDC website www.ngdc.noaa.gov/hazard/
Should these potential CMR substances have been registered under REACH?
DEFF Research Database (Denmark)
Wedebye, Eva Bay; Nikolov, Nikolai Georgiev; Dybdahl, Marianne
2013-01-01
(Q)SAR models were applied to screen around 68,000 REACH pre-registered substances for CMR properties (carcinogenic, mutagenic or toxic to reproduction). Predictions from 14 relevant models were combined to reach overall calls for C, M and R. Combining predictions may reduce “noise” and increase...
Ma, Kai; Huang, Xiaorong; Guo, Biying; Wang, Yanqiu; Gao, Linyun
2018-06-01
Land use changes alter the hydrological characteristics of the land surface and have significant impacts on the hydrological cycle and water balance; analysing these complex effects on natural systems has become a major concern. In this study, we generated land use conversion matrices using ArcGIS and selected several landscape indexes (contagion index, CONTAG; Shannon's diversity index, SHDI; etc.) to evaluate the impact of land use/cover changes on hydrological processes in the upper reaches of the Minjiang River. We also used a statistical regression model, established from hydrology and precipitation data for the period 1959-2008, to simulate the impacts of different land use conditions on rainfall and runoff in different periods. Our results showed that the simulated annual mean flows for 1985-1995 and 1995-2008 are 9.19 and 1.04 m³ s⁻¹ lower than the measured values, respectively, which implies that ecological protection measures should be strengthened in the study area. Our study could provide a scientific basis for water resource management and proper land use planning in the upper reaches of the Minjiang River.
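A rainfall-runoff regression of the kind described can be sketched as follows: fit a linear relationship on a baseline period, then contrast simulated and measured flow in a later period. All numbers here are synthetic stand-ins, not the Minjiang data.

```python
def fit_line(x, y):
    """Ordinary least squares fit for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Baseline period: runoff follows the relationship y = 5 + 2*x exactly.
precip_base = [10, 20, 30, 40, 50]
runoff_base = [25, 45, 65, 85, 105]
a, b = fit_line(precip_base, runoff_base)

# Later period: same precipitation regime, but measured runoff sits uniformly
# above the baseline relationship, e.g. because land use change altered it.
precip_late = [15, 25, 35]
measured    = [41, 61, 81]
simulated   = [a + b * x for x in precip_late]

bias = sum(m - s for m, s in zip(measured, simulated)) / len(measured)
print(round(bias, 2))  # → 6.0, the mean gap between measured and simulated flow
```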
Conversion factors and oil statistics
International Nuclear Information System (INIS)
Karbuz, Sohbet
2004-01-01
World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention directed at statistical caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues: mass to volume equivalencies (barrels to tonnes) and broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers.
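The barrels-to-tonnes issue can be made concrete with a small calculation: the conversion factor depends on crude density, so applying one factor to crudes of different grades shifts the aggregate. The factors below are rough, commonly quoted magnitudes chosen for illustration, not official values.

```python
def barrels_to_tonnes(barrels, barrels_per_tonne):
    """Convert a volume in barrels to a mass in tonnes for a given factor."""
    return barrels / barrels_per_tonne

volume = 1_000_000  # one million barrels

# Lighter crude packs more barrels into a tonne than heavier crude does.
as_light = barrels_to_tonnes(volume, 7.6)   # illustrative light-crude factor
as_heavy = barrels_to_tonnes(volume, 6.8)   # illustrative heavy-crude factor

discrepancy = as_heavy - as_light
print(round(as_light), round(as_heavy), round(discrepancy))
```

Even this modest spread in factors moves the mass estimate by roughly fifteen thousand tonnes per million barrels, which compounds quickly in a world oil balance.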
Gabbard, Carl; Cordova, Alberto
2013-01-01
Recent studies indicate that the ability to mentally represent action using motor imagery declines with advanced age (>64 years). As the ability to represent action declines, the elderly may experience increasing difficulty with movement planning and execution. Here, we determined the association between estimation of reach via use of motor imagery and actual functional reach (FR). Young adults (M=22 years) and older adults (M=66 years) estimated reach while standing, with targets randomly presented in peripersonal (within actual reach) and extrapersonal (beyond reach) space. Imagined responses were compared to the individual's scaled maximum reach. FR, also while standing, was assessed using the standardized Functional Reach Test (FRT). Results for total score estimation accuracy showed no difference for age; however, results for mean bias and distribution of error revealed that the older group underestimated while the younger group overestimated. In reference to FR, younger adults outperformed older adults (30 versus 14 in.) and, most prominently, only the younger group showed a significant relationship between estimation and FR. In addition to gaining insight into the effects of advanced age on the ability to mentally represent action and its association with movement execution, these results, although preliminary, may have clinical implications based on the question of whether motor imagery training could improve movement estimations and how that might affect actual reach. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Bartsch, David A; Rodgers, Vicki K; Strong, Don
2013-01-01
Outcomes of older adults referred for care management and mental health services through the senior reach gatekeeper model of case finding were examined in this study and compared with the Spokane gatekeeper model. The Colorado Senior Reach and the Mid-Kansas Senior Outreach (MKSO) programs are the two Senior Reach Gatekeeper programs modeled after the Spokane program, employing the same community education and gatekeeper model, with mental health treatment for elderly adults in need of support. The three mature programs were compared on seniors served, isolation, and depression ratings. Nontraditional community gatekeepers were trained and referred seniors in need. Findings indicate that individuals served by the two Senior Reach Gatekeeper programs demonstrated significant improvements: indicators such as social isolation decreased, and depression symptoms and suicide ideation also decreased. These findings demonstrate that the gatekeeper approach to training community partners worked in referring at-risk seniors, in meeting their needs, and in having a positive impact on their lives.
Statistical Physics An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of the statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
Statistical symmetries in physics
International Nuclear Information System (INIS)
Green, H.S.; Adelaide Univ., SA
1994-01-01
Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of gl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Dolph, Christine L.; Eggert, Susan L.; Magner, Joe; Ferrington, Leonard C.; Vondracek, Bruce C.
2015-01-01
Recent studies suggest that stream restoration at the reach scale may not increase stream biodiversity, raising concerns about the utility of this conservation practice. We examined whether reach-scale restoration in disturbed agricultural streams was associated with changes in macroinvertebrate community structure (total macroinvertebrate taxon richness, total macroinvertebrate density, Ephemeroptera, Plecoptera, Trichoptera [EPT] taxon richness, % abundance of EPT taxa) or secondary production (macroinvertebrate biomass over time). We collected macroinvertebrate samples over the course of 1 y from restored and unrestored reaches of 3 streams in southern Minnesota and used generalized least-square (GLS) models to assess whether measures of community structure were related to reach type, stream site, or sampling month. After accounting for effects of stream site and time, we found no significant difference in total taxon richness or % abundance of EPT taxa between restored and unrestored reaches. However, the number of EPT taxa and macroinvertebrate density were significantly higher in restored than in unrestored reaches. We compared secondary production estimates among study reaches based on 95th-percentile confidence intervals generated via bootstrapping. In each study stream, secondary production was significantly (2–3×) higher in the restored than in the unrestored reach. Higher productivity in the restored reaches was largely a result of the disproportionate success of a few dominant, tolerant taxa. Our findings suggest that reach-scale restoration may have ecological effects that are not detected by measures of total taxon richness alone.
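The bootstrap confidence-interval comparison described above can be sketched as follows. The biomass samples are synthetic, and the interval construction is a plain percentile bootstrap of the mean rather than the authors' exact secondary-production procedure.

```python
import random

random.seed(42)

# Hypothetical macroinvertebrate biomass samples from paired reaches.
restored   = [random.gauss(30, 6) for _ in range(50)]
unrestored = [random.gauss(12, 4) for _ in range(50)]

def bootstrap_ci(data, n_boot=2000, alpha=0.05):
    """Percentile bootstrap confidence interval for the mean of `data`."""
    n = len(data)
    means = sorted(
        sum(random.choice(data) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    return means[int(n_boot * alpha / 2)], means[int(n_boot * (1 - alpha / 2))]

lo_r, hi_r = bootstrap_ci(restored)
lo_u, hi_u = bootstrap_ci(unrestored)

# Non-overlapping intervals are taken as evidence of a real difference.
print(hi_u < lo_r)  # → True
```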
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Mineral industry statistics 1975
Energy Technology Data Exchange (ETDEWEB)
1978-01-01
Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite and other minerals quarried in France, in 1975. Also accident statistics are included. Production statistics are presented of the Overseas Departments and territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Directory of Open Access Journals (Sweden)
Mirjam Nielen
2017-01-01
Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Mahalanobis, P C
1965-01-01
Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt
Reaching for the true overlay in advanced nodes
Koay, Chiew-seng; Hamieh, Bassem; Felix, Nelson; Gaudiello, John
2017-03-01
Traditionally, the total measurement uncertainty (TMU) of overlay metrology focuses on dynamic precision, tool-induced shift, and matching, while rarely examining inaccuracy. However, some researchers have recently shown that measurement inaccuracy can still be large despite optimized small TMU. Moreover, this inaccuracy can consume a significant portion of the overlay budget in the advanced nodes. In addition to qualifying the overlay error of inline wafers, overlay metrology is also used for improving on-product overlay, as it provides corrective feedback to the lithography scanner. The accuracy of the correction terms as a result depends directly upon the measurement accuracy. As such, enhanced overlay accuracy will improve the overlay performance of reworked wafers, or subsequently exposed wafers. We have previously shown that a segmented Blossom target is more prone to asymmetry-induced inaccuracy than a nonsegmented target is [1]. Since target segmentation is inevitable for SADP and SAQP patterning processes, their resulting overlay performance leaves a lot to be desired. In our quest to reach for the true overlay, this paper reports our investigations on accuracy enhancement techniques for image-based targets, such as redundancy and self-calibration, and on the use of simulation-optimized scatterometry-based targets.
Electromyographic activity of beating and reaching during simulated boardsailing.
Buchanan, M; Cunningham, P; Dyson, R J; Hurrion, P D
1996-04-01
This study examined the responses of six competitive boardsailors (three males, three females) during laboratory-based simulation tasks while the electromyographic activity of up to 13 muscles was recorded. A sailboard, mounted in a steel frame and resting on a waterbed, allowed simulation of roll and pitch movements. Wind force was simulated by attaching the boom to a weight stack with a hydraulically controlled buffered release phase. The progression of the simulation test was controlled by the sailor copying movements on an edited video of each subject boardsailing on the open water. Analysis of individual pumping movements for mean peak percentage of maximal enveloped voluntary contraction (%MEVC) in 'beating' and 'reaching' showed that muscular activity in the arm (flexor carpi ulnaris, extensor carpi radialis and biceps brachii) was greatest (66-94% MEVC), with considerable activity (58-75% MEVC) in the deltoid and trapezius shoulder muscles, but much less activity in the leg muscles (16-40% MEVC). For the combined upper and lower body muscles there was a significant difference (P reflecting the current dynamic nature of the sport.
Safety significance evaluation system
International Nuclear Information System (INIS)
Lew, B.S.; Yee, D.; Brewer, W.K.; Quattro, P.J.; Kirby, K.D.
1991-01-01
This paper reports that the Pacific Gas and Electric Company (PG&E), in cooperation with ABZ, Incorporated and Science Applications International Corporation (SAIC), investigated the use of artificial intelligence-based programming techniques to assist utility personnel in regulatory compliance problems. The result of this investigation is that artificial intelligence-based programming techniques can successfully be applied to this problem. To demonstrate this, a general methodology was developed and several prototype systems based on this methodology were developed. The prototypes address U.S. Nuclear Regulatory Commission (NRC) event reportability requirements, technical specification compliance based on plant equipment status, and quality assurance assistance. This collection of prototype modules is named the safety significance evaluation system
Predicting significant torso trauma.
Nirula, Ram; Talmor, Daniel; Brasel, Karen
2005-07-01
Identification of motor vehicle crash (MVC) characteristics associated with thoracoabdominal injury would advance the development of automatic crash notification systems (ACNS) by improving triage and response times. Our objective was to determine the relationships between MVC characteristics and thoracoabdominal trauma to develop a torso injury probability model. Drivers involved in crashes from 1993 to 2001 within the National Automotive Sampling System were reviewed. Relationships between torso injury and MVC characteristics were assessed using multivariate logistic regression. Receiver operating characteristic curves were used to compare the model to current ACNS models. There were a total of 56,466 drivers. Age, ejection, braking, avoidance, velocity, restraints, passenger-side impact, rollover, and vehicle weight and type were associated with injury (p < 0.05). The area under the receiver operating characteristic curve (83.9) was significantly greater than current ACNS models. We have developed a thoracoabdominal injury probability model that may improve patient triage when used with ACNS.
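The model comparison above rests on the area under the ROC curve. A minimal sketch of that evaluation step follows, with hypothetical injury labels and predicted probabilities rather than the NASS crash data.

```python
def roc_auc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney) formulation, counting ties as 1/2."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical torso-injury outcomes and model-predicted probabilities.
labels = [1, 1, 1, 0, 0, 0, 0, 1]
scores = [0.9, 0.8, 0.4, 0.3, 0.2, 0.5, 0.1, 0.7]
print(roc_auc(labels, scores))  # → 0.9375
```

An AUC of 0.5 corresponds to a model no better than chance, while 1.0 is perfect discrimination; the paper's 0.839 sits between those extremes.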
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
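The core idea described above, drawing samples from a posterior when only the shape of its density is known, can be illustrated with a toy random-walk Metropolis sampler for a coin-flip posterior. The data and tuning choices here are made up for the example.

```python
import math
import random

random.seed(1)
heads, tails = 7, 3   # hypothetical coin-flip data

def log_post(p):
    """Log of the unnormalised posterior: Beta(heads+1, tails+1) under a flat prior."""
    if not 0.0 < p < 1.0:
        return float("-inf")
    return heads * math.log(p) + tails * math.log(1.0 - p)

samples, p = [], 0.5
for _ in range(20000):
    prop = p + random.gauss(0.0, 0.1)                 # random-walk proposal
    if random.random() < math.exp(min(0.0, log_post(prop) - log_post(p))):
        p = prop                                      # accept the move
    samples.append(p)

post_mean = sum(samples[5000:]) / len(samples[5000:])  # discard burn-in
print(round(post_mean, 2))  # close to the analytic posterior mean 8/12 ≈ 0.67
```

Because the posterior here is a known Beta distribution, the Monte Carlo estimate can be checked against the exact answer, which is the point of toy examples like this one.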
Annual Statistical Supplement, 2002
Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2010
Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2007
Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2001
Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2016
Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2011
Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2005
Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2015
Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2003
Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2017
Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2008
Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2014
Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2004
Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2000
Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2009
Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2006
Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Statistical distribution sampling
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
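Determining the distribution of a statistic by sampling can be illustrated with a short Monte Carlo check of the sample mean against its central-limit-theorem prediction; the sample size and replication count below are arbitrary choices.

```python
import random
import statistics

random.seed(0)

n, reps = 30, 5000
# Repeatedly draw a sample of size n from Uniform(0, 1) and record its mean.
sample_means = [
    statistics.fmean(random.random() for _ in range(n))
    for _ in range(reps)
]

mu = statistics.fmean(sample_means)
sd = statistics.stdev(sample_means)

# CLT prediction: mean 0.5, standard deviation sqrt(1/12 / n) ≈ 0.053.
print(round(mu, 2), round(sd, 3))
```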
Directory of Open Access Journals (Sweden)
Chiara eBegliomini
2014-09-01
Full Text Available Experimental evidence suggests the existence of a sophisticated brain circuit specifically dedicated to reach-to-grasp planning and execution, in both human and non-human primates (Castiello, 2005). Studies accomplished by means of neuroimaging techniques suggest the hypothesis of a dichotomy between a reach-to-grasp circuit, involving the anterior intraparietal area (AIP) and the dorsal and ventral premotor cortices (PMd and PMv; Castiello and Begliomini, 2008; Filimon, 2010), and a reaching circuit involving the medial intraparietal area (mIP) and the Superior Parieto-Occipital Cortex (SPOC) (Culham et al., 2006). However, the time course characterizing the involvement of these regions during the planning and execution of these two types of movements has yet to be delineated. A functional magnetic resonance imaging (fMRI) study was conducted, including reach-to-grasp and reaching-only movements performed towards either a small or a large stimulus, and a Finite Impulse Response model (FIR; Henson, 2003) was adopted to monitor activation patterns from stimulus onset for a time window of 10 seconds duration. Data analysis focused on brain regions belonging either to the reaching or to the grasping network, as suggested by Castiello and Begliomini (2008). Results suggest that the planning and execution of reaching and grasping movements might share a common brain network, providing further confirmation of the idea that the neural underpinnings of reaching and grasping may overlap in both spatial and temporal terms (Verhagen et al., 2013).
Proprioceptive body illusions modulate the visual perception of reaching distance.
Directory of Open Access Journals (Sweden)
Agustin Petroni
Full Text Available The neurobiology of reaching has been extensively studied in human and non-human primates. However, the mechanisms that allow a subject to decide, without engaging in explicit action, whether an object is reachable are not fully understood. Some studies conclude that decisions near the reach limit depend on motor simulations of the reaching movement. Others have shown that the body schema plays a role in explicit and implicit distance estimation, especially after motor practice with a tool. In this study we evaluate the causal role of multisensory body representations in the perception of reachable space. We reasoned that if the body schema is used to estimate reach, an illusion of finger size induced by proprioceptive stimulation should propagate to the perception of reaching distances. To test this hypothesis we induced a proprioceptive illusion of extension or shrinkage of the right index finger while participants judged a series of LEDs as reachable or non-reachable without actual movement. Our results show that reach distance estimation depends on the illusory perceived size of the finger: illusory elongation produced a shift of reaching distance away from the body, whereas illusory shrinkage produced the opposite effect. Combining these results with previous findings, we suggest that deciding whether a target is reachable requires an integration of body inputs in high-order multisensory parietal areas that engage in movement simulations through connections with frontal premotor areas.
Environmental stressors afflicting tailwater stream reaches across the United States
Miranda, Leandro E.; Krogman, R. M.
2014-01-01
The tailwater is the reach of a stream immediately below an impoundment that is hydrologically, physicochemically and biologically altered by the presence and operation of a dam. The overall goal of this study was to gain a nationwide awareness of the issues afflicting tailwater reaches in the United States. Specific objectives included the following: (i) estimate the percentage of reservoirs that support tailwater reaches with environmental conditions suitable for fish assemblages throughout the year, (ii) identify and quantify major sources of environmental stress in those tailwaters that do support fish assemblages and (iii) identify environmental features of tailwater reaches that determine prevalence of key fish taxa. Data were collected through an online survey of fishery managers. Relative to objective 1, 42% of the 1306 reservoirs included in this study had tailwater reaches with sufficient flow to support a fish assemblage throughout the year. The surface area of the reservoir and catchment most strongly delineated reservoirs maintaining tailwater reaches with or without sufficient flow to support a fish assemblage throughout the year. Relative to objective 2, major sources of environmental stress generally reflected flow variables, followed by water quality variables. Relative to objective 3, zoogeography was the primary factor discriminating fish taxa in tailwaters, followed by a wide range of flow and water quality variables. Results for objectives 1–3 varied greatly among nine geographic regions distributed throughout the continental United States. Our results provide a large-scale view of the effects of reservoirs on tailwater reaches and may help guide research and management needs.
Testing statistical hypotheses of equivalence
Wellek, Stefan
2010-01-01
Equivalence testing has grown significantly in importance over the last two decades, especially as its relevance to a variety of applications has become understood. Yet published work on the general methodology remains scattered in specialists' journals, and for the most part, it focuses on the relatively narrow topic of bioequivalence assessment. With a far broader perspective, Testing Statistical Hypotheses of Equivalence provides the first comprehensive treatment of statistical equivalence testing. The author addresses a spectrum of specific, two-sided equivalence testing problems, from the
International Nuclear Information System (INIS)
Supe, S.J.; Nagalaxmi, K.V.; Meenakshi, L.
1983-01-01
In the practice of radiotherapy, various concepts like NSD, CRE, TDF, and BIR are used to evaluate the biological effectiveness of treatment schedules on the normal tissues. This has been accepted because the tolerance of the normal tissue is the limiting factor in the treatment of cancers. At present, when various schedules are tried, attention is therefore paid only to the biological damage of the normal tissues, and it is expected that the damage to the cancerous tissues will be extensive enough to control the cancer. An attempt is made in the present work to evaluate the concept of tumor significant dose (TSD), which represents the damage to the cancerous tissue. Strandquist, in the analysis of a large number of cases of squamous cell carcinoma, found that for the 5 fraction/week treatment, the total dose required to bring about the same damage to the cancerous tissue is proportional to T^(-0.22), where T is the overall time over which the dose is delivered. Using this finding, the TSD was defined as D×N^(-p)×T^(-q), where D is the total dose, N the number of fractions, T the overall time, and p and q exponents to be suitably chosen. The values of p and q are adjusted such that p+q ≤ 0.24, with p varying from 0.0 to 0.24 and q from 0.0 to 0.22. Cases of cancer of the cervix uteri treated between 1978 and 1980 in the V. N. Cancer Centre, Kuppuswamy Naidu Memorial Hospital, Coimbatore, India were analyzed on the basis of these formulations. These data, coupled with clinical experience, were used to choose a formula for the TSD. Further, the dose schedules used in the British Institute of Radiology fractionation studies were also used to propose that the tumor significant dose is represented by D×N^(-0.18)×T^(-0.06)
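The TSD formulation above is simple enough to sketch numerically. The exponents p = 0.18 and q = 0.06 are the values proposed in the abstract; the dose, fraction, and time values below are purely illustrative and not taken from the study.

```python
def tumor_significant_dose(D, N, T, p=0.18, q=0.06):
    """Tumor significant dose: TSD = D * N**(-p) * T**(-q).

    D: total dose, N: number of fractions, T: overall treatment
    time in days. The abstract requires p + q <= 0.24.
    """
    assert p + q <= 0.24, "exponents must satisfy p + q <= 0.24"
    return D * N ** (-p) * T ** (-q)

# Illustrative comparison of two schedules delivering the same total dose:
tsd_a = tumor_significant_dose(60.0, N=30, T=42)  # 30 fractions over 6 weeks
tsd_b = tumor_significant_dose(60.0, N=20, T=28)  # 20 fractions over 4 weeks
```

Because both exponents are negative, the shorter, more hypofractionated schedule yields the larger TSD for the same total dose, matching the intuition that compressing treatment increases biological effect on the tumor.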
Uranium chemistry: significant advances
International Nuclear Information System (INIS)
Mazzanti, M.
2011-01-01
The author reviews recent progress in uranium chemistry achieved in CEA laboratories. Like its neighbors in the Mendeleev chart, uranium undergoes hydrolysis, oxidation and disproportionation reactions which make the chemistry of these species in water highly complex. The study of the chemistry of uranium in an anhydrous medium has made it possible to correlate the structural and electronic differences observed in the interaction of uranium(III) and the lanthanides(III) with nitrogen or sulfur molecules and the effectiveness of these molecules in An(III)/Ln(III) separation via liquid-liquid extraction. Recent work on the redox reactivity of trivalent uranium U(III) in an organic medium with molecules such as water or the azide ion (N3-) in stoichiometric quantities led to extremely interesting uranium aggregates, in particular those involved in actinide migration in the environment or in aggregation problems in the fuel processing cycle. Another significant advance was the discovery of a compound containing the uranyl ion with a degree of oxidation (V), UO2+, obtained by oxidation of uranium(III). Recently chemists have succeeded in blocking the disproportionation reaction of uranyl(V) and in stabilizing polymetallic complexes of uranyl(V), opening the way to a systematic study of the reactivity and the electronic and magnetic properties of uranyl(V) compounds. (A.C.)
Directory of Open Access Journals (Sweden)
Ph D Student Roman Mihaela
2011-05-01
Full Text Available The concept of "public accountability" is a challenge for political science as a new concept in this area, in full debate and development both in theory and practice. This paper is a theoretical approach displaying some definitions, relevant meanings and significance of the concept in political science. The importance of this concept is that although it was originally used as a tool to improve the effectiveness and efficiency of public governance, it has gradually become an end in itself. "Accountability" has become an image of good governance, first in the United States of America and then in the European Union. Nevertheless, the concept is vaguely defined and provides ambiguous images of good governance. This paper begins with the presentation of some general meanings of the concept as they emerge from specialized dictionaries and encyclopaedias, and continues with the meanings developed in political science. The concept of "public accountability" is rooted in the economics and management literature, becoming increasingly relevant in today's political science, both in theory and discourse as well as in practice, in formulating and evaluating public policies. A first conclusion that emerges from the analysis of the evolution of this term is that it requires conceptual clarification in political science. A clear definition will then enable an appropriate model for improving the system of public accountability in formulating and assessing public policies, in order to implement a system of assessment and monitoring thereof.
Proximal and distal adjustments of reaching behavior in preterm infants.
de Toledo, Aline Martins; Soares, Daniele de Almeida; Tudella, Eloisa
2011-01-01
The authors aimed to investigate proximal and distal adjustments of reaching behavior and grasping in 5-, 6-, and 7-month-old preterm infants. Nine low-risk preterm and 10 full-term infants participated. Both groups showed the predominance of unimanual reaching, an age-related increase in the frequency of vertical-oriented and open hand movement, and also an increase in successful grasping from 6 to 7 months. The frequency of open hand was higher in the preterm group at 6 months. Intrinsic restrictions imposed by prematurity did not seem to have impaired reaching performance of preterm infants throughout the months of age.
Significant Radionuclides Determination
Energy Technology Data Exchange (ETDEWEB)
Jo A. Ziegler
2001-07-31
The purpose of this calculation is to identify radionuclides that are significant to offsite doses from potential preclosure events for spent nuclear fuel (SNF) and high-level radioactive waste expected to be received at the potential Monitored Geologic Repository (MGR). In this calculation, high-level radioactive waste is included in references to DOE SNF. A previous document, "DOE SNF DBE Offsite Dose Calculations" (CRWMS M&O 1999b), calculated the source terms and offsite doses for Department of Energy (DOE) and Naval SNF for use in design basis event analyses. This calculation reproduces only the DOE SNF work (i.e., no naval SNF work is included in this calculation) created in "DOE SNF DBE Offsite Dose Calculations" and expands the calculation to include DOE SNF expected to produce a high dose consequence (even though the quantity of the SNF is expected to be small) and SNF owned by commercial nuclear power producers. The calculation does not address any specific off-normal/DBE event scenarios for receiving, handling, or packaging of SNF. The results of this calculation are developed for comparative analysis to establish the important radionuclides and do not represent the final source terms to be used for license application. This calculation will be used as input to preclosure safety analyses and is performed in accordance with procedure AP-3.12Q, "Calculations", and is subject to the requirements of DOE/RW-0333P, "Quality Assurance Requirements and Description" (DOE 2000) as determined by the activity evaluation contained in "Technical Work Plan for: Preclosure Safety Analysis, TWP-MGR-SE-000010" (CRWMS M&O 2000b) in accordance with procedure AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities".
Indian Academy of Sciences (India)
IAS Admin
Fermi–Dirac Statistics: Derivation and Consequences. S Chaturvedi and Shyamal Biswas. Keywords: Pauli exclusion principle, Fermi–Dirac statistics, identical and indistinguishable particles, Fermi gas. Subhash Chaturvedi is at University of Hyderabad. His current research interests include phase space descriptions.
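The central result the article derives, the Fermi–Dirac occupation number n(E) = 1/(exp((E − μ)/kT) + 1), can be written down directly; the energies, chemical potential, and temperature below are arbitrary illustrative values in consistent units.

```python
import math

def fermi_dirac(E, mu, kT):
    """Mean occupation of a single-particle state of energy E:
    n(E) = 1 / (exp((E - mu)/kT) + 1).
    The Pauli exclusion principle bounds n between 0 and 1."""
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

# At the chemical potential the occupation is exactly 1/2,
# independent of temperature:
half = fermi_dirac(E=1.0, mu=1.0, kT=0.1)

# Well below mu the state is essentially filled; well above, empty:
low  = fermi_dirac(E=0.0, mu=1.0, kT=0.05)
high = fermi_dirac(E=2.0, mu=1.0, kT=0.05)
```

The step-like behavior of `low` versus `high` at small kT is the sharpening of the Fermi surface as temperature drops, one of the consequences the article's title refers to.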
Generalized interpolative quantum statistics
International Nuclear Information System (INIS)
Ramanathan, R.
1992-01-01
A generalized interpolative quantum statistics is presented by conjecturing a certain reordering of phase space due to the presence of possible exotic objects other than bosons and fermions. Such an interpolation, achieved through a Bose-counting strategy, predicts the existence of an infinite quantum Boltzmann-Gibbs statistics akin to the one recently discovered by Greenberg.
Handbook of Spatial Statistics
Gelfand, Alan E
2010-01-01
Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.
International Nuclear Information System (INIS)
2003-01-01
The energy statistical table is a selection of statistical data for energies and countries from 1997 to 2002. It concerns the petroleum, the natural gas, the coal, the electric power, the production, the external market, the consumption per sector, the energy accounting 2002 and graphs on the long-dated forecasting. (A.L.B.)
Bayesian statistical inference
Directory of Open Access Journals (Sweden)
Bruno De Finetti
2017-04-01
Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.
Practical statistics for educators
Ravid, Ruth
2014-01-01
Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.
DEFF Research Database (Denmark)
Lauritzen, Steffen Lilholt
This book studies the brilliant Danish 19th-century astronomer T.N. Thiele, who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.
Energy statistics yearbook 2002
International Nuclear Information System (INIS)
2005-01-01
The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Howard Stauffer; Nadav Nur
2005-01-01
The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...
Energy statistics yearbook 2001
International Nuclear Information System (INIS)
2004-01-01
The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Energy statistics yearbook 2000
International Nuclear Information System (INIS)
2002-01-01
The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities
Temperature dependent anomalous statistics
International Nuclear Information System (INIS)
Das, A.; Panda, S.
1991-07-01
We show that the anomalous statistics which arises in 2 + 1 dimensional Chern-Simons gauge theories can become temperature dependent in the most natural way. We analyze and show that a statistics-changing phase transition can happen in these theories only as T → ∞. (author). 14 refs
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts present only frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
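One of the introductory topics the book lists, Bayesian inference for a binomial proportion, reduces to a conjugate Beta update. This sketch is not taken from the book; it assumes an illustrative uniform Beta(1, 1) prior and made-up data.

```python
def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update: a Beta(alpha, beta) prior combined with
    binomial data gives a Beta(alpha + successes, beta + failures)
    posterior."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution: alpha / (alpha + beta)."""
    return alpha / (alpha + beta)

# Uniform prior Beta(1, 1); observe 7 successes in 10 trials:
a, b = beta_binomial_update(1.0, 1.0, successes=7, failures=3)
estimate = beta_mean(a, b)  # posterior mean (1+7)/(1+7+1+3) = 8/12
```

The posterior mean sits between the prior mean (0.5) and the sample proportion (0.7), the shrinkage behavior that distinguishes the Bayesian estimate from the frequentist one.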
Understanding advanced statistical methods
Westfall, Peter
2013-01-01
Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions; Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation; Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions; Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...
What can be learnt from an ecotoxicity database in the framework of the REACh regulation?
International Nuclear Information System (INIS)
Henegar, Adina; Mombelli, Enrico; Pandard, Pascal; Pery, Alexandre R.R.
2011-01-01
Since REACh applies throughout the EU, special emphasis has been put on the reduction of systematic ecotoxicity testing. In this context, it is important to extract a maximum of information from existing ecotoxicity databases in order to propose alternative methods aimed at replacing and reducing experimental testing. Consequently, we analyzed a database of new chemicals registered in France and Europe during the last twenty years reporting aquatic ecotoxicity data with respect to three trophic levels (i.e., algae EC50 72 h, Daphnia EC50 48 h and fish LC50 96 h). In order to ensure the relevance of the comparison between these three experimental tests, we performed a stringent data selection based on the pertinence and quality of available ecotoxicological information. At the end of this selection, less than 5% of the initial number of chemicals was retained for subsequent analysis. This analysis showed that fish was the least sensitive trophic level, whereas Daphnia had the highest sensitivity. Moreover, thanks to an analysis of the relative sensitivity of trophic levels, it was possible to establish that respective correction factors of 50 and 10 would be necessary if only one or two test values were available. From a physicochemical point of view, it was possible to characterize two significant correlations relating the sensitivity of the aforementioned trophic levels to the chemical structure of the retained substances. This analysis showed that algae displayed a higher sensitivity towards chemicals containing acid fragments, whereas fish presented a higher sensitivity towards chemicals containing aromatic ether fragments. Overall, our work suggests that statistical analysis of historical data combined with data yielded by the REACh regulation should permit the derivation of robust safety factors, testing strategies and mathematical models. These alternative methods, in turn, could allow a replacement and reduction of ecotoxicological testing.
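The correction-factor scheme the abstract reports (divide by 50 when only one trophic-level test value is available, by 10 when two are) can be sketched as follows. The function name, the no-factor case for three tests, and the example EC50 values are illustrative assumptions, not from the study or from regulatory guidance.

```python
def corrected_sensitivity(toxicity_values_mg_l):
    """Divide the most sensitive (lowest) available test value by the
    correction factor reported in the abstract: 50 for a single
    trophic-level test, 10 for two. With all three trophic levels
    covered, no factor is applied here (an illustrative assumption)."""
    factors = {1: 50.0, 2: 10.0, 3: 1.0}
    values = list(toxicity_values_mg_l)
    return min(values) / factors[len(values)]

# Hypothetical data: Daphnia EC50 = 0.5 mg/L only, vs. adding algae 2.0 mg/L
one_test  = corrected_sensitivity([0.5])        # 0.5 / 50
two_tests = corrected_sensitivity([2.0, 0.5])   # 0.5 / 10
```

The larger factor for sparser data encodes the extra uncertainty when untested trophic levels might be more sensitive than the ones measured.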
PNW River Reach Files -- 1:100k Watercourses (arcs)
Pacific States Marine Fisheries Commission — This feature class includes the ARC features from the 2001 version of the PNW River Reach files Arc/INFO coverage. Separate, companion feature classes are also...
PNW River Reach Files -- 1:100k Waterbodies (polygons)
Pacific States Marine Fisheries Commission — This feature class includes the POLYGON waterbody features from the 2001 version of the PNW River Reach files Arc/INFO coverage. Separate, companion feature classes...
Stream Habitat Reach Summary - North Coast [ds63
California Natural Resource Agency — The shapefile is based on habitat unit level data summarized at the stream reach level. The database represents salmonid stream habitat surveys from 645 streams of...
LTRM Fish Sampling Strata, UMRS La Grange Reach
Department of the Interior — The data set includes delineation of sampling strata for the six study reaches of the UMRR Program's LTRM element. Separate strata coverages exist for each of the...
LTRM Water Quality Sampling Strata, UMRS La Grange Reach
Department of the Interior — The data set includes delineation of sampling strata for the six study reaches of the UMRR Program's LTRM element. Separate strata coverages exist for each of the...
Annual summary of vital statistics: 2005.
Hamilton, Brady E; Miniño, Arialdi M; Martin, Joyce A; Kochanek, Kenneth D; Strobino, Donna M; Guyer, Bernard
2007-02-01
The general fertility rate in 2005 was 66.7 births per 1000 women aged 15 to 44 years, the highest level since 1993. The birth rate for teen mothers (aged 15 to 19 years) declined by 2% between 2004 and 2005, falling to 40.4 births per 1000 women, the lowest ever recorded in the 65 years for which there are consistent data. The birth rates for women ≥30 years of age rose in 2005 to levels not seen in almost 40 years. Childbearing by unmarried women also increased to historic record levels for the United States in 2005. The cesarean-delivery rate rose by 4% in 2005 to 30.2% of all births, another record high. The preterm birth rate continued to rise (to 12.7% in 2005), as did the rate for low birth weight births (8.2%). The infant mortality rate was 6.79 infant deaths per 1000 live births in 2004, not statistically different from the rate in 2003. Pronounced differences in infant mortality rates by race and Hispanic origin continue, with non-Hispanic black newborns more than twice as likely as non-Hispanic white and Hispanic infants to die within 1 year of birth. The expectation of life at birth reached a record high in 2004 of 77.8 years for all gender and race groups combined. Death rates in the United States continued to decline, with death rates decreasing for 9 of the 15 leading causes. The crude death rate for children aged 1 to 19 years did not decrease significantly between 2003 and 2004. Of the 10 leading causes of death for 2004 in this age group, only the rates for influenza and pneumonia showed a significant decrease. The death rates increased for intentional self-harm (suicide), whereas rates for other causes did not change significantly for children. A large proportion of childhood deaths continue to occur as a result of preventable injuries.
Østby-Deglum, Marie; Axcrona, Karol; Brennhovd, Bjørn; Dahl, Alv A
2016-06-01
To study the ability to reach orgasm after robot-assisted laparoscopic prostatectomy (RALP) in relation to demographic, cancer-related, and surgical variables, and the use of erectile aids. In this cross-sectional study at a mean of 3 years after RALP at Oslo University Hospital, 982 men were invited to complete a mailed questionnaire, and 777 responded. Respondents who reported postoperative radiotherapy or hormone treatment, or who did not report on orgasm, were omitted, leaving 609 patients for analysis. Ability to reach orgasm was rated on one question from the Expanded Prostate Cancer Index Composite 26-item version, and dichotomized into "good" or "poor." Overall, 27% of the men reported good ability to reach orgasm: 22% among those who did not use erectile aids and 34% among those who did (P = .001). Univariate analysis of men with good versus poor ability to reach orgasm showed many significant differences. In multivariate analysis, being older, having a reduced physical quality of life, and erectile dysfunction were significantly associated with poor ability to reach orgasm. Erectile dysfunction showed an odds ratio of 4.86 for poor orgasmic ability. The 48% of men who used erectile aids had significantly better orgasmic ability than the nonusers. In our sample, 27% had good ability to reach orgasm at a mean of 3 years after RALP. Poor orgasmic ability was associated with being older, poor erectile function, and a reduced physical quality of life. Using erectile aids increased the rate of good ability to reach orgasm. Copyright © 2016 Elsevier Inc. All rights reserved.
Directory of Open Access Journals (Sweden)
Giuseppe Bombino
2010-09-01
Full Text Available The complex hydrogeomorphological processes within the active channel of rivers strongly influence riparian vegetation development and organization, particularly in mountain streams where such processes can be remarkably impacted by engineering control works. In four mountain reaches of Calabrian fiumaras we analyze, through previously arranged methods (integrated with a multivariate statistical analysis), the relationships among hydrogeomorphological river characteristics and structure and the development of riparian vegetation within the active channel, in transects located in proximity of check dams and in less disturbed sites. The results of this study demonstrate clear and relevant contrasts, due to the presence of check dams, in the physical and vegetation properties of upstream, downstream and intermediate sites around check dams. The multivariate statistical approach through Principal Component Analysis (PCA) highlighted evident relationships in all transects between groups of physical and vegetation properties. The regression analysis performed between the vegetation properties and the width:depth ratio or the specific discharge showed very different relationships between groups of transects, due to evident changes in channel morphology and in flow regime locally induced by check dams. Overall, we have shown that check dams have far-reaching effects on the extent and development of riparian vegetation of mountain torrent reaches, which extend far beyond physical adjustments to changed morphological, hydraulic and sedimentary conditions.
REACH: next step to a sound chemicals management.
Van der Wielen, Arnold
2007-12-01
REACH is the new European Regulation for Registration, Evaluation, Authorisation and Restriction of Chemicals. It entered into force on 1st June 2007 to streamline and improve the former legislative framework on new and existing chemical substances of the European Union. Companies that manufacture or import more than 1 tonne of a substance per year will be required to register the substance at the new EU Chemicals Agency located in Helsinki. REACH places greater responsibility on industry to manage the risks that chemicals may pose to health and the environment and to provide safety information that will be passed down the supply chain. In principle, REACH applies to all chemicals: as such, as components in preparations, and as used in articles. REACH is a radical step forward in EU chemicals management. The onus will move from the authorities to industry. In addition, REACH will allow the further evaluation of substances where there are grounds for concern, foresees an authorisation system for the use of substances of very high concern, and a system of restrictions, where applicable, for substances of concern. The authorisation system will require companies to switch progressively to safer alternatives where a suitable alternative exists. Current use restrictions will remain in place under the REACH system.
Ector, Hugo
2010-12-01
I still remember my first book on statistics: "Elementary Statistics with Applications in Medicine and the Biological Sciences" by Frederick E. Croxton. For me, it was the start of a pursuit to understand statistics in daily life and in medical practice. It was the first volume in a long row of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to a "P < 0.05", with little grounding in the underlying statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in well understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of this seminar. It is a summary and a transcription of the best pages I have detected.
Do children perceive postural constraints when estimating reach or action planning?
Gabbard, Carl; Cordova, Alberto; Lee, Sunghan
2009-03-01
Estimation of whether an object is reachable from a specific body position constitutes an important aspect in effective motor planning. Researchers who estimate reachability by way of motor imagery with adults consistently report the tendency to overestimate, with some evidence of a postural effect (postural stability hypothesis). This idea suggests that perceived reaching limits depend on an individual's perceived postural constraints. Based on previous work with adults, the authors expected a significant postural effect with the Reach 2 condition, as evidenced by reduced overestimation. Furthermore, the authors hypothesized that the postural effect would be greater in younger children. They then tested these propositions among children aged 7, 9, and 11 years by asking them to estimate reach while seated (Reach 1) and in the more demanding posture of standing on 1 foot and leaning forward (Reach 2). Results indicated no age or condition difference, therefore providing no support for a postural effect. When the authors compared these data to a published report of adults, a developmental difference emerged. That is, adults recognize the perceived postural constraint of the standing position resulting in under- rather than overestimation, as displayed in the seated condition. Although preliminary, these observations suggest that estimates of reach (action planning) continue to be refined between late childhood and young adulthood.
DEFF Research Database (Denmark)
Jespersen, C M
1997-01-01
BACKGROUND: Angina pectoris accompanied by transient ST-segment changes during the in-hospital phase of acute myocardial infarction (AMI) is a well-established marker of subsequent cardiac death and reinfarction. HYPOTHESIS: This study was undertaken to record the prognostic significance of angina...... on study treatment 1 month after discharge. Of these patients, 311 (39%) reported chest pain during the first month following discharge. RESULTS: Patients with angina pectoris had a significantly increased risk of reinfarction [hazard ratio 1.71; 95% confidence limits (CL): 1.09, 2.69] and an increased mortality...... risk which, however, only reached borderline statistical significance (hazard ratio 1.52; 95%-CL: 0.96, 2.40). When patients were subdivided according to both angina pectoris and heart failure, those with one or both of these risk markers had significantly increased mortality (p < 0.03) and reinfarction (p < 0...
International Nuclear Information System (INIS)
Tonchev, N.; Shumovskij, A.S.
1986-01-01
The history of investigations conducted at JINR in the field of statistical mechanics is presented, beginning with the fundamental works by N.N. Bogolyubov on the microscopic theory of superconductivity. The ideas and methods introduced in these works have largely determined the development of statistical mechanics at JINR, and the Hartree-Fock-Bogolyubov variational principle has become an important method of modern nuclear theory. A brief review of the main achievements connected with the development of statistical mechanics methods and their application in different fields of physical science is given.
Wallis, W Allen
2014-01-01
Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. "W. Allen Wallis and Harry V. Roberts have made statistics fascinating." - The New York Times "The authors have set out, with considerable success, to write a text which would be of interest and value to the student who,
D'Alessio, Michael
2012-01-01
AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da
Mauro, John
2013-01-01
Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g
Liao, Tim Futing
2011-01-01
An incomparably useful examination of statistical methods for comparison. The nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
Mineral statistics yearbook 1994
International Nuclear Information System (INIS)
1994-01-01
A summary of mineral production in Saskatchewan was compiled and presented as a reference manual. Statistical information on fuel minerals such as crude oil, natural gas, liquefied petroleum gas and coal, and on industrial and metallic minerals, such as potash, sodium sulphate, salt and uranium, was provided in a wide variety of tables. Production statistics, disposition and value of sales of industrial and metallic minerals were also made available. Statistical data on drilling of oil and gas reservoirs and Crown land disposition were also included. figs., tabs
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Methods of statistical physics
Akhiezer, Aleksandr I
1981-01-01
Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be
Expanding the reach of heavy neutrino searches at the LHC
Directory of Open Access Journals (Sweden)
Andrés Flórez
2018-03-01
Full Text Available The observation of neutrino oscillations establishes that neutrinos have non-zero mass and provides one of the more compelling arguments for physics beyond the standard model (SM) of particle physics. We present a feasibility study to search for hypothetical Majorana neutrinos (N) with TeV scale masses, predicted by extensions of the SM to explain the small but non-zero SM neutrino mass, using vector boson fusion (VBF) processes at the 13 TeV LHC. In the context of the minimal Type-I seesaw mechanism (mTISM), the VBF production cross-section of a lepton (ℓ) and associated heavy Majorana neutrino (Nℓ) surpasses that of the Drell–Yan process at approximately mNℓ=1.4TeV. We consider second and third-generation heavy neutrino (Nμ or Nτ, where ℓ= muon (μ) or tau (τ) leptons) production through VBF processes, with subsequent Nμ and Nτ decays to a lepton and two jets, as benchmark cases to show the effectiveness of the VBF topology for Nℓ searches at the 13 TeV LHC. The requirement of a dilepton pair combined with four jets, two of which are identified as VBF jets with large separation in pseudorapidity and a TeV scale dijet mass, is effective at reducing the SM background. These criteria may provide expected exclusion bounds, at 95% confidence level, of mNℓ<1.7 (2.4) TeV, assuming 100 (1000) fb−1 of 13 TeV data from the LHC and mixing |VℓNℓ|2=1. The use of the VBF topology to search for mNℓ increases the discovery reach at the LHC, with expected significances greater than 5σ (3σ) for Nℓ masses up to 1.7 (2.05) TeV using 1000fb−1 of 13 TeV data from the LHC.
Expanding the reach of heavy neutrino searches at the LHC
Flórez, Andrés; Gui, Kaiwen; Gurrola, Alfredo; Patiño, Carlos; Restrepo, Diego
2018-03-01
The observation of neutrino oscillations establishes that neutrinos have non-zero mass and provides one of the more compelling arguments for physics beyond the standard model (SM) of particle physics. We present a feasibility study to search for hypothetical Majorana neutrinos (N) with TeV scale masses, predicted by extensions of the SM to explain the small but non-zero SM neutrino mass, using vector boson fusion (VBF) processes at the 13 TeV LHC. In the context of the minimal Type-I seesaw mechanism (mTISM), the VBF production cross-section of a lepton (ℓ) and associated heavy Majorana neutrino (Nℓ) surpasses that of the Drell-Yan process at approximately mNℓ = 1.4TeV. We consider second and third-generation heavy neutrino (Nμ or Nτ, where ℓ= muon (μ) or tau (τ) leptons) production through VBF processes, with subsequent Nμ and Nτ decays to a lepton and two jets, as benchmark cases to show the effectiveness of the VBF topology for Nℓ searches at the 13 TeV LHC. The requirement of a dilepton pair combined with four jets, two of which are identified as VBF jets with large separation in pseudorapidity and a TeV scale dijet mass, is effective at reducing the SM background. These criteria may provide expected exclusion bounds, at 95% confidence level, of mNℓ < 1.7 (2.4) TeV, assuming 100 (1000) fb-1 of 13 TeV data from the LHC and mixing |VℓNℓ|2 = 1. The use of the VBF topology to search for mNℓ increases the discovery reach at the LHC, with expected significances greater than 5σ (3σ) for Nℓ masses up to 1.7 (2.05) TeV using 1000fb-1 of 13 TeV data from the LHC.
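As a toy illustration of the selection described above (a dilepton pair, four jets, and two VBF-tagged jets with a large pseudorapidity separation and a TeV-scale dijet mass), the sketch below applies such cuts to a hypothetical event record. The event format and the exact thresholds (Δη > 4.0, m_jj > 1000 GeV) are assumptions for illustration, not the analysis's actual values.

```python
def passes_vbf_selection(event):
    """Toy VBF selection: two leptons, at least four jets, and two VBF-tagged
    jets with a large pseudorapidity gap and a TeV-scale dijet mass."""
    if len(event["leptons"]) < 2 or len(event["jets"]) < 4:
        return False
    j1, j2 = event["vbf_jets"]
    delta_eta = abs(j1["eta"] - j2["eta"])
    # Assumed illustrative cut values (GeV for the dijet mass)
    return delta_eta > 4.0 and event["dijet_mass"] > 1000.0

# A hypothetical event that passes all cuts
event = {
    "leptons": ["mu+", "mu-"],
    "jets": ["j1", "j2", "j3", "j4"],
    "vbf_jets": ({"eta": 3.1}, {"eta": -2.4}),
    "dijet_mass": 1500.0,
}
```

In a real analysis the same logic would run over reconstructed physics objects rather than a plain dictionary; the structure of the cuts is what carries over.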
Agreement reached on integrated safeguards in European Union
International Nuclear Information System (INIS)
2010-01-01
Full text: The International Atomic Energy Agency (IAEA), in cooperation with the European Commission, has reached agreement on arrangements to implement 'integrated safeguards' in all non-nuclear-weapon States of the European Union with significant nuclear activities. 'This important milestone is the result of the constructive common efforts of all parties concerned. It is a clear signal of the importance attributed by the EU and its Member States, as well as the IAEA, to the reinforcement of the nuclear non-proliferation regime,' said Andris Piebalgs, Member of the European Commission in charge of Energy. 'Once we have sufficient confidence that a State's nuclear activities are purely peaceful, we can apply safeguards measures in a less prescriptive, more customised manner. This reduces the inspection burden on the State and the inspection effort of the IAEA, while enabling the IAEA to maintain the conclusion that all nuclear material has remained in peaceful activities,' said Olli Heinonen, Deputy Director General and Head of IAEA Safeguards Department. Background: The Nuclear Non-Proliferation Treaty (NPT) is the main international Treaty prohibiting the spread of nuclear weapons. It entrusts the IAEA to verify that nuclear material is not diverted to nuclear weapons or other nuclear explosive devices through the application of 'safeguards'. IAEA safeguards include comprehensive safeguards agreements and additional protocols that enable the IAEA to conclude that all nuclear material has remained in peaceful activities in a State. Integrated safeguards refers to the optimum combination of all safeguards measures available to the Agency under comprehensive safeguards agreements and additional protocols to achieve maximum effectiveness and efficiency in meeting the Agency's safeguards obligations. In the European Union, nuclear safeguards are implemented on the basis of the Euratom Treaty and trilateral agreements between Euratom, its Member States and the IAEA
Solid expandable systems put deepwater targets within reach
Energy Technology Data Exchange (ETDEWEB)
Perez-Roca, Eduardo [Enventure Global Technology L.L.C., Houston, TX (United States). Latin America; Fristch, Jerry [Enventure Global Technology L.L.C., Houston, TX (United States)
2008-07-01
Enabling technologies that take drilling operations to deeper objectives have made a significant impact on the practicality of many projects, especially deep water offshore targets. Increasing vertical depth and lateral reach requires adequate hole size to attain the desired objectives of the well bore. Solid expandable technology can maintain and retain hole size to address both the physical limitations and the economic feasibility of deep water operations. With each and every casing point, the potential for adequate hole size at total depth (TD) decreases. Solid expandable open hole liners and single-diameter systems reduce and eliminate, respectively, the well bore tapering that dictates hole size at TD and subsequent completion size. Successful mitigation of this tapering, whether through the entire well bore or through select zones, enables operators to gain access to previously unreachable reserves. Solid expandable systems have proven to be reliable and effective with over 1,000 installations in a myriad of conditions and environments worldwide. To date, over 115 of those applications have been in deep water environments. The current operating envelope for solid expandable systems includes the deepest installation at ~28,750 ft (8,763 m) and the longest at 6,867 ft (2,083 m) in water depth over 3,150 ft (960 m). This record-length application consisted of an open hole liner installed and expanded in a single run. This paper will discuss the effectiveness of solid expandable systems in deep water operations and how the technology brings value to offshore projects, especially when planned into the initial design. Case histories will be used to further illustrate the features, advantages, and benefits of expandable technology. In addition, this paper will examine the state of the solid expandable technology and its continuing evolution to provide even more drilling solutions. (author)
Impulsivity modulates performance under response uncertainty in a reaching task.
Tzagarakis, C; Pellizzer, G; Rogers, R D
2013-03-01
We sought to explore the interaction of the impulsivity trait with response uncertainty. To this end, we used a reaching task (Pellizzer and Hedges in Exp Brain Res 150:276-289, 2003) where a motor response direction was cued at different levels of uncertainty (1 cue, i.e., no uncertainty, 2 cues or 3 cues). Data from 95 healthy adults (54 F, 41 M) were analysed. Impulsivity was measured using the Barratt Impulsiveness Scale version 11 (BIS-11). Behavioral variables recorded were reaction time (RT), errors of commission (referred to as 'early errors') and errors of precision. Data analysis employed generalised linear mixed models and generalised additive mixed models. For the early errors, there was an interaction of impulsivity with uncertainty and gender, with increased errors for high impulsivity in the one-cue condition for women and the three-cue condition for men. There was no effect of impulsivity on precision errors or RT. However, the analysis of the effect of RT and impulsivity on precision errors showed a different pattern for high versus low impulsives in the high uncertainty (3 cue) condition. In addition, there was a significant early error speed-accuracy trade-off for women, primarily in low uncertainty and a 'reverse' speed-accuracy trade-off for men in high uncertainty. These results extend those of past studies of impulsivity which help define it as a behavioural trait that modulates speed versus accuracy response styles depending on environmental constraints and highlight once more the importance of gender in the interplay of personality and behaviour.
Directory of Open Access Journals (Sweden)
Shameela .T .V
2015-12-01
Full Text Available Background: Among health care workers, nurses suffer the highest level of work-related back injuries. Many studies have assessed low back pain using different tools. This study aimed to identify the prevalence of low back pain disability among female nursing professionals and the association between BMI, functional reach tests and low back pain, so that a better tool can be used during the clinical examination for the betterment of the patient. The objective of the study is to identify the prevalence of low back pain disability and the association of low back pain (LBP) with BMI and functional reach tests among female nursing professionals. Methods: A total of 256 subjects were assessed for disability due to back pain using the Oswestry LBP Disability Questionnaire, and the prevalence of disability was determined. The sit and reach test, forward reach test and BMI were recorded for those who had a disability score of 20 and above (n=87). Results: Data was analyzed using Pearson's correlation. The study result showed a significant correlation (p=0.03) of the sit and reach test with low back pain disability scores. A negative correlation was seen between BMI and LBP disability score, between the forward reach test and LBP disability score, and between BMI and no low back pain disability score. Conclusion: The prevalence of LBP disability among nursing professionals was 33.9%. This study suggests that the sit and reach test can be used as an indicator of low back pain, whereas BMI and the forward reach test do not indicate low back pain.
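The Pearson correlation used in this study can be sketched with plain Python; the scores below are invented illustrative values (sit-and-reach distance in cm versus Oswestry disability score), not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))  # cross-deviation sum
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical paired observations: better sit-and-reach, lower disability
sit_reach = [18, 22, 25, 30, 33, 35]
oswestry = [48, 44, 40, 31, 27, 24]
r = pearson_r(sit_reach, oswestry)  # negative in this toy data
```

A value of r near -1 or +1 indicates a strong linear association; significance would still need a test against the sample size.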
Cancer Data and Statistics Tools
Cancer Statistics Tools - United States Cancer Statistics: Data Visualizations ...
Sanders, Jim; Guse, Clare E
2016-08-09
There is a significant disparity in hypertensive treatment rates between those with and without health insurance. If left untreated, hypertension leads to significant morbidity and mortality. The uninsured face numerous barriers to accessing chronic disease care. We developed the Community-based Chronic Disease Management (CCDM) clinics specifically for the uninsured with hypertension, utilizing nurse-led teams, community-based locations, and evidence-based clinical protocols. All services, including laboratory and medications, are provided on-site and free of charge. In order to ascertain if the CCDM model of care was as effective as traditional models of care in achieving blood pressure goals, we compared the CCDM clinics' hypertensive care outcomes with 2 traditional fee-for-service physician-led clinics. All the clinics are located near one another in poor urban neighborhoods of Milwaukee, Wisconsin. Patients seen at the CCDM clinics and at 1 of the 2 traditional clinics showed a statistically significant improvement in reaching blood pressure goal at 6 months (P < .05); outcomes did not significantly favor the fee-for-service clinics when compared with the CCDM clinics. The CCDM model of care is at least as effective in controlling hypertension as more traditional fee-for-service models caring for the same population. The CCDM model of care to treat hypertension may offer another approach for engaging the urban poor in chronic disease care. © The Author(s) 2016.
Pimenta, Rui; Nascimento, Ana; Vieira, Margarida; Costa, Elísio
2010-01-01
In previous works, we evaluated the statistical reasoning ability acquired by health sciences students carrying out their final undergraduate project. We found that these students achieved a good level of statistical literacy and reasoning in descriptive statistics. However, concerning inferential statistics, the students did not reach a similar level. Statistics educators therefore call for more effective ways to learn statistics, such as project-based investigations. These can be simulat...
Memory-guided reaching in a patient with visual hemiagnosia.
Cornelsen, Sonja; Rennig, Johannes; Himmelbach, Marc
2016-06-01
The two-visual-systems hypothesis (TVSH) postulates that memory-guided movements rely on intact functions of the ventral stream. Its particular importance for memory-guided actions was initially inferred from behavioral dissociations in the well-known patient DF. Despite rather accurate reaching and grasping movements to visible targets, she demonstrated grossly impaired memory-guided grasping as well as impaired memory-guided reaching. These dissociations were later complemented by apparently reversed dissociations in patients with dorsal damage and optic ataxia. However, grasping studies in DF and optic ataxia patients differed with respect to the retinotopic position of target objects, questioning the interpretation of the respective findings as a double dissociation. In contrast, the findings for reaching errors in both types of patients came from similar peripheral target presentations. However, new data on brain structural changes and visuomotor deficits in DF also questioned the validity of a double dissociation in reaching. A severe visuospatial short-term memory deficit in DF further questioned the specificity of her memory-guided reaching deficit. Therefore, we compared movement accuracy in visually guided and memory-guided reaching in a new patient who suffered confined unilateral damage to the ventral visual system due to stroke. Our results indeed support previous descriptions of memory-guided movements' inaccuracies in DF. Furthermore, our data suggest that recently discovered optic-ataxia-like misreaching in DF is most likely caused by her parieto-occipital and not by her ventral stream damage. Finally, multiple visuospatial memory measurements in HWS suggest that inaccuracies in memory-guided reaching tasks in patients with ventral damage cannot be explained by visuospatial short-term memory or perceptual deficits, but by a specific deficit in visuomotor processing. Copyright © 2016 Elsevier Ltd. All rights reserved.
Breast cancer statistics, 2011.
DeSantis, Carol; Siegel, Rebecca; Bandi, Priti; Jemal, Ahmedin
2011-01-01
In this article, the American Cancer Society provides an overview of female breast cancer statistics in the United States, including trends in incidence, mortality, survival, and screening. Approximately 230,480 new cases of invasive breast cancer and 39,520 breast cancer deaths are expected to occur among US women in 2011. Breast cancer incidence rates were stable among all racial/ethnic groups from 2004 to 2008. Breast cancer death rates have been declining since the early 1990s for all women except American Indians/Alaska Natives, among whom rates have remained stable. Disparities in breast cancer death rates are evident by state, socioeconomic status, and race/ethnicity. While significant declines in mortality rates were observed for 36 states and the District of Columbia over the past 10 years, rates for 14 states remained level. Analyses by county-level poverty rates showed that the decrease in mortality rates began later and was slower among women residing in poor areas. As a result, the highest breast cancer death rates shifted from the affluent areas to the poor areas in the early 1990s. Screening rates continue to be lower in poor women compared with non-poor women, despite much progress in increasing mammography utilization. In 2008, 51.4% of poor women had undergone a screening mammogram in the past 2 years compared with 72.8% of non-poor women. Encouraging patients aged 40 years and older to have annual mammography and a clinical breast examination is the single most important step that clinicians can take to reduce suffering and death from breast cancer. Clinicians should also ensure that patients at high risk of breast cancer are identified and offered appropriate screening and follow-up. Continued progress in the control of breast cancer will require sustained and increased efforts to provide high-quality screening, diagnosis, and treatment to all segments of the population. Copyright © 2011 American Cancer Society, Inc.
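The reported screening gap (51.4% of poor women versus 72.8% of non-poor women with a recent mammogram) is the kind of difference typically checked with a two-proportion z test. The sketch below shows that calculation; the sample sizes are invented for illustration, only the two rates come from the abstract.

```python
import math

def two_prop_z(x1, n1, x2, n2):
    """Two-sample z statistic for a difference in proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical sample sizes of 1000 per group; rates match the abstract
z = two_prop_z(514, 1000, 728, 1000)  # negative: poor women screened less often
```

With any realistic sample size, a gap this large yields |z| far beyond 1.96, i.e. a significant difference at the 5% level.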
On the statistical assessment of classifiers using DNA microarray data
Directory of Open Access Journals (Sweden)
Carella M
2006-08-01
decreases as the training set size increases, reaching its best performances with 35 training examples. In this case, RLS and SVM have error rates of e = 14% (p = 0.027) and e = 11% (p = 0.019). Concerning the number of genes, we found about 6000 genes (p ...) ... e = 16% (p ...). Conclusions: The method proposed provides statistically significant answers to precise questions relevant for the diagnosis and prognosis of cancer. We found that, with as few as 15 examples, it is possible to train statistically significant classifiers for colon cancer diagnosis. As for the definition of the number of genes sufficient for a reliable classification of colon cancer, our results suggest that it depends on the accuracy required.
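The experiment described, error rate as a function of training-set size, can be mimicked in a toy form. The sketch below replaces RLS/SVM and the real microarray data with a nearest-centroid classifier on synthetic two-class "expression" data; everything here (class shift, gene count, sizes) is an assumption chosen only to illustrate the shape of a learning-curve computation.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample(n_per_class, n_genes=20, shift=1.0):
    """Synthetic two-class data: class 1 has its mean shifted on every gene."""
    y = np.repeat([0, 1], n_per_class)
    X = rng.normal(loc=y[:, None] * shift, size=(2 * n_per_class, n_genes))
    return X, y

def nearest_centroid_error(n_train_per_class, n_test_per_class=100):
    """Test error of a nearest-centroid rule for a given training-set size."""
    Xtr, ytr = sample(n_train_per_class)
    Xte, yte = sample(n_test_per_class)
    c0 = Xtr[ytr == 0].mean(axis=0)
    c1 = Xtr[ytr == 1].mean(axis=0)
    pred = (np.linalg.norm(Xte - c1, axis=1) <
            np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return float((pred != yte).mean())

# Error typically shrinks as the training set grows, as in the abstract
errors = {n: nearest_centroid_error(n) for n in (5, 15, 35)}
```

A statistically careful version would repeat each point over many random splits and attach a p-value to each error estimate, which is essentially what the paper's method does.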
Elements of statistical thermodynamics
Nash, Leonard K
2006-01-01
Encompassing essentially all aspects of statistical mechanics that appear in undergraduate texts, this concise, elementary treatment shows how an atomic-molecular perspective yields new insights into macroscopic thermodynamics. 1974 edition.
Department of Veterans Affairs — National-level, VISN-level, and/or VAMC-level statistics on the numbers and percentages of users of VHA care from the Northeast Program Evaluation Center (NEPEC)....
International Nuclear Information System (INIS)
Hilaire, S.
2001-01-01
A review of the statistical model of nuclear reactions is presented. The main relations are described, together with the ingredients necessary to perform practical calculations. In addition, a substantial overview of the width fluctuation correction factor is given. (author)
Plague in the United States: Plague was first introduced ... them at higher risk. Reported Cases of Human Plague - United States, 1970-2016: Since the mid-20th ...
Statistical Measures of Marksmanship
National Research Council Canada - National Science Library
Johnson, Richard
2001-01-01
.... This report describes objective statistical procedures to measure both rifle marksmanship accuracy, the proximity of an array of shots to the center of mass of a target, and marksmanship precision...
Titanic: A Statistical Exploration.
Takis, Sandra L.
1999-01-01
Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
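The chi-square exploration this activity describes can be sketched directly. The survival counts by passenger class below are approximate illustrative figures, not an authoritative Titanic dataset; the code computes the chi-square statistic for independence between class and survival.

```python
# Hypothetical (survived, died) counts by passenger class, for illustration
observed = {"first": (203, 122), "second": (118, 167), "third": (178, 528)}

rows = list(observed.values())
row_tot = [sum(r) for r in rows]
col_tot = [sum(c) for c in zip(*rows)]
total = sum(row_tot)

chi2 = 0.0
for i, row in enumerate(rows):
    for j, obs in enumerate(row):
        exp = row_tot[i] * col_tot[j] / total   # expected count under independence
        chi2 += (obs - exp) ** 2 / exp

df = (len(rows) - 1) * (len(col_tot) - 1)       # degrees of freedom = 2 here
```

The statistic is then compared with the chi-square distribution on `df` degrees of freedom; a large value indicates survival was not independent of passenger class.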
Sickle cell ... 1999 through 2002. This drop coincided with the introduction in 2000 of a vaccine that protects against ...
U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
On quantum statistical inference
Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.
2003-01-01
Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have
CMS Statistics Reference Booklet
U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
Statistical electromagnetics: Complex cavities
Naus, H.W.L.
2008-01-01
A selection of the literature on the statistical description of electromagnetic fields and complex cavities is concisely reviewed. Some essential concepts, for example, the application of the central limit theorem and the maximum entropy principle, are scrutinized. Implicit assumptions, biased
Davison, Anthony C.; Huser, Raphaël
2015-01-01
Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event
Saffran, Jenny R.; Kirkham, Natasha Z.
2017-01-01
Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812
CSIR Research Space (South Africa)
Shepperson, L
1997-12-01
Full Text Available This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...
Genton, Marc G.; Castruccio, Stefano; Crippa, Paola; Dutta, Subhajit; Huser, Raphaël; Sun, Ying; Vettori, Sabrina
2015-01-01
This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online
Playing at Statistical Mechanics
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
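The distributions named in the abstract come from three ways of counting arrangements of n particles among g states. A minimal sketch of that counting (the setup of n particles in g states is assumed, not taken from the article) is:

```python
from math import comb

def maxwell_boltzmann(n, g):
    """Distinguishable particles, no occupancy limit: g**n arrangements."""
    return g ** n

def bose_einstein(n, g):
    """Indistinguishable bosons: stars-and-bars count, C(n + g - 1, n)."""
    return comb(n + g - 1, n)

def fermi_dirac(n, g):
    """Indistinguishable fermions, at most one per state: C(g, n)."""
    return comb(g, n)

n, g = 3, 4  # e.g. 3 particles sorted among 4 states
print(maxwell_boltzmann(n, g), bose_einstein(n, g), fermi_dirac(n, g))
# → 64 20 4
```

The same counts can be verified by hand with a small sorting game, which is exactly the pedagogical device the article discusses.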
Illinois travel statistics, 2008
2009-01-01
The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Illinois travel statistics, 2009
2010-01-01
The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
Cholesterol Facts and Statistics
... Deo R, et al. Heart disease and stroke statistics—2017 update: a report from the American Heart ...
Illinois travel statistics, 2010
2011-01-01
The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic information to the different users of traffic data. While most users of traffic data at this level of detail are within the Illinois Department of Transporta...
U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...
Information theory and statistics
Kullback, Solomon
1968-01-01
Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.
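Kullback's logarithmic measure of information is what is now called the Kullback-Leibler divergence. A minimal sketch of it for discrete distributions (the example distributions P and Q are made up for illustration):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats:
    sum over outcomes of p * log(p / q). Terms with p == 0
    contribute nothing, by the convention 0 * log 0 = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]        # illustrative observed distribution
q = [1/3, 1/3, 1/3]        # hypothesized uniform distribution
print(f"D(P||Q) = {kl_divergence(p, q):.4f} nats")
```

D(P||Q) is always nonnegative and equals zero only when P = Q, which is what makes it usable as a discrepancy measure when testing statistical hypotheses, the application the book develops.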
Department of Homeland Security — Accident statistics are available on the Coast Guard’s website, filterable by state, year, and one additional variable to obtain tables and/or graphs. Data from reports has been loaded for...
Medicaid Drug Claims Statistics
U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.