WorldWideScience

Sample records for statistically significant response

  1. Statistical determination of significant curved I-girder bridge seismic response parameters

    Science.gov (United States)

    Seo, Junwon

    2013-06-01

    Curved steel bridges are commonly used at interchanges in transportation networks, and more of these structures continue to be designed and built in the United States. Though the use of these bridges continues to increase in locations that experience high seismicity, the effects of curvature and other parameters on their seismic behavior have been neglected in current risk assessment tools. These tools can evaluate the seismic vulnerability of a transportation network using fragility curves. One critical component of fragility curve development for curved steel bridges is the completion of sensitivity analyses that help identify influential parameters related to their seismic response. In this study, an accessible inventory of existing curved steel girder bridges located primarily in the Mid-Atlantic United States (MAUS) was used to establish statistical characteristics used as inputs for a seismic sensitivity study. Critical seismic response quantities were captured using 3D nonlinear finite element models. Influential parameters for these quantities were identified using statistical tools that incorporate experimental Plackett-Burman Design (PBD), including Pareto optimal plots and prediction profiler techniques. The findings revealed that the influential parameters included the number of spans, radius of curvature, maximum span length, girder spacing, and cross-frame spacing, and that these parameters showed varying levels of influence on the critical bridge response.
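
To illustrate the screening idea behind a Plackett-Burman Design, the sketch below builds the standard 8-run PBD for up to seven two-level factors, computes main effects for a synthetic response, and ranks them Pareto-style. The factor labels and the response values are hypothetical stand-ins, not the study's actual inputs.

```python
# Screening with an 8-run Plackett-Burman design (illustrative sketch only).
import numpy as np

# Standard 8-run PB design for up to 7 two-level factors: cyclic shifts of
# a generator row, plus a closing row of all -1.
gen = np.array([1, 1, 1, -1, 1, -1, -1])
design = np.array([np.roll(gen, i) for i in range(7)] + [[-1] * 7])

factors = ["spans", "radius", "max_span", "girder_spacing",
           "xframe_spacing", "deck_width", "pier_height"]   # hypothetical labels

# Hypothetical response (e.g., a peak seismic demand) for each of the 8 runs.
rng = np.random.default_rng(0)
y = 1.0 + 0.8 * design[:, 1] + 0.5 * design[:, 0] + rng.normal(0, 0.1, 8)

# Main effect of each factor = mean(y at +1) - mean(y at -1); rank by |effect|,
# which is what a Pareto plot of effects displays.
effects = {f: y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
           for j, f in enumerate(factors)}
for f, e in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{f:>15s}: {e:+.3f}")
```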

  2. Test for the statistical significance of differences between ROC curves

    International Nuclear Information System (INIS)

    Metz, C.E.; Kronman, H.B.

    1979-01-01

    A test for the statistical significance of observed differences between two measured Receiver Operating Characteristic (ROC) curves has been designed and evaluated. The set of observer response data for each ROC curve is assumed to be independent and to arise from a ROC curve having a form which, in the absence of statistical fluctuations in the response data, graphs as a straight line on double normal-deviate axes. To test the significance of an apparent difference between two measured ROC curves, maximum likelihood estimates of the two parameters of each curve and the associated parameter variances and covariance are calculated from the corresponding set of observer response data. An approximate Chi-square statistic with two degrees of freedom is then constructed from the differences between the parameters estimated for each ROC curve and from the variances and covariances of these estimates. This statistic is known to be truly Chi-square distributed only in the limit of large numbers of trials in the observer performance experiments. Performance of the statistic for data arising from a limited number of experimental trials was evaluated. Independent sets of rating scale data arising from the same underlying ROC curve were paired, and the fraction of differences found (falsely) significant was compared to the significance level, α, used with the test. Although test performance was found to be somewhat dependent on both the number of trials in the data and the position of the underlying ROC curve in the ROC space, the results for various significance levels showed the test to be reliable under practical experimental conditions.
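
A minimal sketch of the test described above, assuming maximum-likelihood estimates of the binormal parameters (a, b) for each ROC curve and their covariance matrices are already in hand; the numbers here are hypothetical.

```python
# Two-degree-of-freedom chi-square test for the difference between two
# binormal ROC curves. Parameter estimates and covariances are hypothetical;
# in practice they come from maximum-likelihood ROC fitting.
import numpy as np
from scipy.stats import chi2

a1, b1 = 1.20, 0.90        # intercept/slope of curve 1 on normal-deviate axes
a2, b2 = 0.85, 1.05        # same for curve 2
cov1 = np.array([[0.020, 0.004], [0.004, 0.010]])  # Var/Cov of (a1, b1)
cov2 = np.array([[0.025, 0.005], [0.005, 0.012]])  # Var/Cov of (a2, b2)

d = np.array([a1 - a2, b1 - b2])
S = cov1 + cov2            # the two data sets are independent, so covariances add
x2 = d @ np.linalg.solve(S, d)
print(f"chi-square = {x2:.2f}, p = {chi2.sf(x2, df=2):.3f}")
```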

  3. Statistical significance of cis-regulatory modules

    Directory of Open Access Journals (Sweden)

    Smith Andrew D

    2007-01-01

    Background: It is becoming increasingly important for researchers to be able to scan through large genomic regions for transcription factor binding sites or clusters of binding sites forming cis-regulatory modules. Correspondingly, there has been a push to develop algorithms for the rapid detection and assessment of cis-regulatory modules. While various algorithms for this purpose have been introduced, most are not well suited for rapid, genome scale scanning. Results: We introduce methods designed for the detection and statistical evaluation of cis-regulatory modules, modeled as either clusters of individual binding sites or as combinations of sites with constrained organization. In order to determine the statistical significance of module sites, we first need a method to determine the statistical significance of single transcription factor binding site matches. We introduce a straightforward method of estimating the statistical significance of single site matches using a database of known promoters to produce data structures that can be used to estimate p-values for binding site matches. We next introduce a technique to calculate the statistical significance of the arrangement of binding sites within a module using a max-gap model. If the module scanned for has defined organizational parameters, the probability of the module is corrected to account for organizational constraints. The statistical significance of single site matches and the architecture of sites within the module can be combined to provide an overall estimation of statistical significance of cis-regulatory module sites. Conclusion: The methods introduced in this paper allow for the detection and statistical evaluation of single transcription factor binding sites and cis-regulatory modules. The features described are implemented in the Search Tool for Occurrences of Regulatory Motifs (STORM) and MODSTORM software.

  4. The thresholds for statistical and clinical significance

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Gluud, Christian; Winkel, Per

    2014-01-01

    BACKGROUND: Thresholds for statistical significance are insufficiently demonstrated by 95% confidence intervals or P-values when assessing results from randomised clinical trials. First, a P-value only shows the probability of getting a result assuming that the null hypothesis is true and does...... not reflect the probability of getting a result assuming an alternative hypothesis to the null hypothesis is true. Second, a confidence interval or a P-value showing significance may be caused by multiplicity. Third, statistical significance does not necessarily result in clinical significance. Therefore...... of the probability that a given trial result is compatible with a 'null' effect (corresponding to the P-value) divided by the probability that the trial result is compatible with the intervention effect hypothesised in the sample size calculation; (3) adjust the confidence intervals and the statistical significance...

  5. Measuring individual significant change on the Beck Depression Inventory-II through IRT-based statistics.

    NARCIS (Netherlands)

    Brouwer, D.; Meijer, R.R.; Zevalkink, D.J.

    2013-01-01

    Several researchers have emphasized that item response theory (IRT)-based methods should be preferred over classical approaches in measuring change for individual patients. In the present study we discuss and evaluate the use of IRT-based statistics to measure statistically significant individual

  6. The insignificance of statistical significance testing

    Science.gov (United States)

    Johnson, Douglas H.

    1999-01-01

    Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.

  7. Confidence intervals permit, but don't guarantee, better inference than statistical significance testing

    Directory of Open Access Journals (Sweden)

    Melissa Coulson

    2010-07-01

    A statistically significant result and a non-significant result may differ little, although their significance status may tempt an interpretation of difference. Two studies are reported that compared interpretation of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioural neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs, but, for full benefit, such highly desirable statistical reform also requires that researchers interpret CIs without recourse to NHST.

  8. Significance levels for studies with correlated test statistics.

    Science.gov (United States)

    Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S

    2008-07-01

    When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
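
The baseline permutation procedure that the authors refine can be sketched as follows: permute the sample labels, recompute the largest-magnitude test statistic each time, and read the global significance level off the permutation distribution. The data are simulated and the paper's conditioning step is not shown.

```python
# Permutation estimate of the global significance of the largest test
# statistic across many (possibly correlated) tests; toy data only.
import numpy as np

rng = np.random.default_rng(1)
n_tests, n_per_group = 500, 10
X = rng.normal(size=(n_tests, 2 * n_per_group))      # toy expression matrix
labels = np.array([0] * n_per_group + [1] * n_per_group)

def max_abs_t(X, labels):
    """Largest |two-sample t| over all rows (tests)."""
    g0, g1 = X[:, labels == 0], X[:, labels == 1]
    se = np.sqrt(g0.var(axis=1, ddof=1) / g0.shape[1] +
                 g1.var(axis=1, ddof=1) / g1.shape[1])
    return np.max(np.abs((g0.mean(axis=1) - g1.mean(axis=1)) / se))

observed = max_abs_t(X, labels)
perm = np.array([max_abs_t(X, rng.permutation(labels)) for _ in range(1000)])
print("global p-value:", np.mean(perm >= observed))
```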

  9. Caveats for using statistical significance tests in research assessments

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2013-01-01

    This article raises concerns about the advantages of using statistical significance tests in research assessments, as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice...... We argue that applying statistical significance tests and mechanically adhering to their results are highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators......

  10. Statistically significant relational data mining :

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.

    2014-02-01

    This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second is a set of statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor such models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.

  11. Common pitfalls in statistical analysis: "P" values, statistical significance and confidence intervals

    Directory of Open Access Journals (Sweden)

    Priya Ranganathan

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper.

  12. Statistical significance versus clinical relevance.

    Science.gov (United States)

    van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G

    2017-04-01

    In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  13. Health significance and statistical uncertainty. The value of P-value.

    Science.gov (United States)

    Consonni, Dario; Bertazzi, Pier Alberto

    2017-10-27

    The P-value is widely used as a summary statistic of scientific results. Unfortunately, there is a widespread tendency to dichotomize its value into "P < 0.05" ("statistically significant") and "P > 0.05" ("statistically not significant"), with the former implying a "positive" result and the latter a "negative" one. To show the unsuitability of such an approach when evaluating the effects of environmental and occupational risk factors, we provide examples of distorted use of the P-value and of the negative consequences for science and public health of such a black-and-white vision. The rigid interpretation of the P-value as a dichotomy favors the confusion between health relevance and statistical significance, discourages thoughtful thinking, and distracts attention from what really matters, the health significance. A much better way to express and communicate scientific results involves reporting effect estimates (e.g., risks, risk ratios or risk differences) and their confidence intervals (CI), which summarize and convey both health significance and statistical uncertainty. Unfortunately, many researchers do not usually consider the whole interval of the CI but only examine whether it includes the null value, thereby degrading this procedure to the same P-value dichotomy (statistical significance or not). In reporting statistical results of scientific research, present effect estimates with their confidence intervals and do not qualify the P-value as "significant" or "not significant".
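
As a minimal illustration of the reporting style the authors recommend, the sketch below computes a risk ratio and its 95% CI from hypothetical cohort counts instead of stopping at a P-value.

```python
# Effect estimate (risk ratio) with a 95% confidence interval,
# from a hypothetical 2x2 cohort table.
import math

a, n1 = 30, 200   # exposed: cases, total
b, n0 = 15, 200   # unexposed: cases, total

rr = (a / n1) / (b / n0)
se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n0)   # SE of log(RR)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```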

  14. Common pitfalls in statistical analysis: “P” values, statistical significance and confidence intervals

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the ‘P’ value, explain the importance of ‘confidence intervals’ and clarify the importance of including both values in a paper PMID:25878958

  15. Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.

    Science.gov (United States)

    Breunig, Nancy A.

    Despite the increasing criticism of statistical significance testing by researchers, particularly following the publication of the 1994 American Psychological Association style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…

  16. Swiss solar power statistics 2007 - Significant expansion

    International Nuclear Information System (INIS)

    Hostettler, T.

    2008-01-01

    This article presents and discusses the 2007 statistics for solar power in Switzerland. A significant number of new installations is noted, as are the high production figures from newer installations. The basics behind the compilation of the Swiss solar power statistics are briefly reviewed and an overview for the period 1989 to 2007 is presented, which includes figures on the number of photovoltaic plants in service and installed peak power. Typical production figures in kilowatt-hours (kWh) per installed kilowatt-peak (kWp) are presented and discussed for installations of various sizes. Increased production after inverter replacement in older installations is noted. Finally, the general political situation in Switzerland as far as solar power is concerned is briefly discussed, as are international developments.

  17. On detection and assessment of statistical significance of Genomic Islands

    Directory of Open Access Journals (Sweden)

    Chaudhuri Probal

    2008-04-01

    Background: Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results: Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements like phage-mediated genes, transposons, integrase and IS elements, confirming their horizontal acquirement. Conclusion: The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values, along with a technique for visualizing statistically significant islands. The performance of our method is better than many other well known methods in terms of their sensitivity and accuracy, and in terms of specificity, it is comparable to other methods.
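
A toy version of the Monte-Carlo testing idea: score a candidate window against randomly selected same-length segments of the chromosome and form an empirical P-value. Here the feature is GC content alone and the chromosome is simulated; Design-Island combines several features and adds a refinement phase.

```python
# Monte-Carlo test of a candidate window against random segments (toy example).
import random

random.seed(2)
genome = "".join(random.choice("ACGT") for _ in range(100_000))
window = genome[5_000:15_000]          # hypothetical candidate genomic island

def gc(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

obs, n_rand = gc(window), 1_000
null = []
for _ in range(n_rand):
    start = random.randrange(len(genome) - len(window))
    null.append(gc(genome[start:start + len(window)]))

mu = sum(null) / n_rand
p = sum(abs(g - mu) >= abs(obs - mu) for g in null) / n_rand   # two-sided
print("empirical P-value:", p)
```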

  18. Increasing the statistical significance of entanglement detection in experiments.

    Science.gov (United States)

    Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei

    2010-05-28

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.

  19. Testing the Difference of Correlated Agreement Coefficients for Statistical Significance

    Science.gov (United States)

    Gwet, Kilem L.

    2016-01-01

    This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…

  20. Statistical Significance for Hierarchical Clustering

    Science.gov (United States)

    Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.

    2017-01-01

    Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990
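
A greatly simplified Monte-Carlo test in this spirit: measure the strength of the top two-cluster split (within-cluster over total sum of squares) and compare it with splits found in null data. The null model here, independent normals matched to the data's means and variances, is a crude stand-in for the paper's procedure and its sequential testing scheme.

```python
# Monte-Carlo p-value for the top split of a hierarchical clustering
# (simplified illustration, not the published method).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_index(X):
    """Within-cluster SS of the 2-cluster partition over total SS (small = strong split)."""
    lab = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
    total = ((X - X.mean(0)) ** 2).sum()
    within = sum(((X[lab == k] - X[lab == k].mean(0)) ** 2).sum() for k in (1, 2))
    return within / total

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 1, (20, 5)),    # toy data with two real groups
               rng.normal(2.5, 1, (20, 5))])

obs = cluster_index(X)
null = [cluster_index(rng.normal(X.mean(0), X.std(0), X.shape)) for _ in range(200)]
print("Monte-Carlo p-value:", np.mean(np.array(null) <= obs))
```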

  1. Online Statistical Modeling (Regression Analysis) for Independent Responses

    Science.gov (United States)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of data and complex relationships. Rich varieties of advanced and recent statistical models are available in open source software (one of which is R). However, these advanced statistical modelling tools are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and shiny), so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We previously made an interface in the form of an e-tutorial for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov Chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare in order to find the most appropriate model for the data.

  2. Statistical significance of trends in monthly heavy precipitation over the US

    KAUST Repository

    Mahajan, Salil

    2011-05-11

    Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upward trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation, as some GCM simulations show a downward trend even in the twenty-first century projections when the greenhouse gas forcings are strong. © 2011 Springer-Verlag.
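
A sketch of the Monte-Carlo approach on a synthetic monthly-maximum series: compute Kendall's τ for the observed series, then compare it with τ values from permuted, trend-free copies of the same data.

```python
# Kendall's tau trend test plus a Monte-Carlo (permutation) check; synthetic data.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(4)
years = np.arange(1950, 2010)
series = 50 + 0.08 * (years - years[0]) + rng.gumbel(0, 4, years.size)

tau_obs, p_kendall = kendalltau(years, series)

# Permutation null: shuffling the series destroys any trend.
perm = np.array([kendalltau(years, rng.permutation(series))[0]
                 for _ in range(2000)])
p_mc = np.mean(np.abs(perm) >= abs(tau_obs))
print(f"tau = {tau_obs:.3f}, Kendall p = {p_kendall:.4f}, Monte-Carlo p = {p_mc:.4f}")
```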

  3. Auditory Magnetoencephalographic Frequency-Tagged Responses Mirror the Ongoing Segmentation Processes Underlying Statistical Learning.

    Science.gov (United States)

    Farthouat, Juliane; Franco, Ana; Mary, Alison; Delpouve, Julie; Wens, Vincent; Op de Beeck, Marc; De Tiège, Xavier; Peigneux, Philippe

    2017-03-01

    Humans are highly sensitive to statistical regularities in their environment. This phenomenon, usually referred to as statistical learning, is most often assessed using post-learning behavioural measures that are limited by a lack of sensitivity and do not monitor the temporal dynamics of learning. In the present study, we used magnetoencephalographic frequency-tagged responses to investigate the neural sources and temporal development of the ongoing brain activity that supports the detection of regularities embedded in auditory streams. Participants passively listened to statistical streams in which tones were grouped as triplets, and to random streams in which tones were randomly presented. Results show that during exposure to statistical (vs. random) streams, tritone frequency-related responses reflecting the learning of regularities embedded in the stream increased in the left supplementary motor area and left posterior superior temporal sulcus (pSTS), whereas tone frequency-related responses decreased in the right angular gyrus and right pSTS. Tritone frequency-related responses rapidly developed to reach significance after 3 min of exposure. These results suggest that the incidental extraction of novel regularities is subtended by a gradual shift from rhythmic activity reflecting individual tone succession toward rhythmic activity synchronised with triplet presentation, and that these rhythmic processes are subtended by distinct neural sources.

  4. Clinical, Radiologic, and Legal Significance of "Extensor Response" in Posttraumatic Coma.

    Science.gov (United States)

    Firsching, Raimund; Woischneck, Dieter; Langejürgen, Alexander; Parreidt, Andreas; Bondar, Imre; Skalej, Martin; Röhl, Friedrich; Voellger, Benjamin

    2015-11-01

    The timely detection of neurologic deterioration can be critical for the survival of a neurosurgical patient following head injury. Because little reliable evidence is available on the prognostic value of the clinical sign "extensor response" in comatose posttraumatic patients, we investigated the correlation of this clinical sign with outcome and with early radiologic findings from magnetic resonance imaging (MRI). This retrospective analysis of prospectively obtained data included 157 patients who had remained in a coma for a minimum of 24 hours after traumatic brain injury. All patients received a 1.5-T MRI within 10 days (median: 2 days) of the injury. The correlations between clinical findings 12 and 24 hours after the injury (in particular, extensor response and pupillary function), MRI findings, and outcome after 1 year were investigated. Statistical analysis included contingency tables, Fisher exact test, odds ratios (ORs) with confidence intervals (CIs), and weighted κ values. There were 48 patients with extensor response within the first 24 hours after the injury. Patients with extensor response (World Federation of Neurosurgical Societies coma grade III) were statistically significantly more likely to harbor MRI lesions in the brainstem when compared with patients in a coma who had no further deficiencies (coma grade I; p = 0.0004 by Fisher exact test, OR 10.8 with 95% CI 2.7-42.5) and patients with unilateral loss of pupil function (coma grade II; p = 0.0187, OR 2.8 with 95% CI 1.2-6.5). The correlation of brainstem lesions as found by MRI with outcome according to the Glasgow Outcome Scale after 1 year was also highly significant (p ≤ 0.016). The correlation of extensor response and loss of pupil function with an unfavorable outcome and with brainstem lesions revealed by MRI is highly significant. Their sudden onset may be associated with the sudden onset of brainstem dysfunction and should therefore be regarded as one of the most

  5. Statistical and low dose response

    International Nuclear Information System (INIS)

    Thorson, M.R.; Endres, G.W.R.

    1981-01-01

    The low dose response and the lower limit of detection of the Hanford dosimeter depend upon many factors, including the energy of the radiation, whether the exposure is to a single radiation type or to mixed fields, annealing cycles, environmental factors, and how well various batches of TLD materials are matched in the system. A careful statistical study and sensitivity analysis were performed to determine how these factors influence the response of the dosimeter system. Estimates have been included in this study of the standard deviation of calculated dose for various mixed field exposures from 0 to 1000 mrem.

  6. Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.

    Science.gov (United States)

    Kramer, Karen L; Veile, Amanda; Otárola-Castillo, Erik

    2016-01-01

    Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.

  7. Sibling Competition & Growth Tradeoffs. Biological vs. Statistical Significance.

    Directory of Open Access Journals (Sweden)

    Karen L Kramer

    Early childhood growth has many downstream effects on future health and reproduction and is an important measure of offspring quality. While a tradeoff between family size and child growth outcomes is theoretically predicted in high-fertility societies, empirical evidence is mixed. This is often attributed to phenotypic variation in parental condition. However, inconsistent study results may also arise because family size confounds the potentially differential effects that older and younger siblings can have on young children's growth. Additionally, inconsistent results might reflect that the biological significance associated with different growth trajectories is poorly understood. This paper addresses these concerns by tracking children's monthly gains in height and weight from weaning to age five in a high fertility Maya community. We predict that: 1) as an aggregate measure family size will not have a major impact on child growth during the post weaning period; 2) competition from young siblings will negatively impact child growth during the post weaning period; 3) however, because of their economic value, older siblings will have a negligible effect on young children's growth. Accounting for parental condition, we use linear mixed models to evaluate the effects that family size, younger and older siblings have on children's growth. Congruent with our expectations, it is younger siblings who have the most detrimental effect on children's growth. While we find statistical evidence of a quantity/quality tradeoff effect, the biological significance of these results is negligible in early childhood. Our findings help to resolve why quantity/quality studies have had inconsistent results by showing that sibling competition varies with sibling age composition, not just family size, and that biological significance is distinct from statistical significance.

  8. Increasing the statistical significance of entanglement detection in experiments

    Energy Technology Data Exchange (ETDEWEB)

    Jungnitsch, Bastian; Niekamp, Soenke; Kleinmann, Matthias; Guehne, Otfried [Institut fuer Quantenoptik und Quanteninformation, Innsbruck (Austria); Lu, He; Gao, Wei-Bo; Chen, Zeng-Bing [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Chen, Yu-Ao; Pan, Jian-Wei [Hefei National Laboratory for Physical Sciences at Microscale and Department of Modern Physics, University of Science and Technology of China, Hefei (China); Physikalisches Institut, Universitaet Heidelberg (Germany)

    2010-07-01

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. We show this to be the case for an error model in which the variance of an observable is interpreted as its error and for the standard error model in photonic experiments. Specifically, we demonstrate that the Mermin inequality yields a Bell test which is statistically more significant than the Ardehali inequality in the case of a photonic four-qubit state that is close to a GHZ state. Experimentally, we observe this phenomenon in a four-photon experiment, testing the above inequalities for different levels of noise.

  9. Reporting effect sizes as a supplement to statistical significance ...

    African Journals Online (AJOL)

    The purpose of the article is to review the statistical significance reporting practices in reading instruction studies and to provide guidelines for when to calculate and report effect sizes in educational research. A review of six readily accessible (online) and accredited journals publishing research on reading instruction ...

  10. Your Chi-Square Test Is Statistically Significant: Now What?

    Science.gov (United States)

    Sharpe, Donald

    2015-01-01

    Applied researchers have employed chi-square tests for more than one hundred years. This paper addresses the question of how one should follow a statistically significant chi-square test result in order to determine the source of that result. Four approaches were evaluated: calculating residuals, comparing cells, ransacking, and partitioning. Data…

  11. Testing statistical significance scores of sequence comparison methods with structure similarity

    Directory of Open Access Journals (Sweden)

    Leunissen Jack AM

    2006-10-01

    Background: In the past years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search is not only determined by the algorithm but also by the statistical significance testing for an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score as compared to the e-value, using simulated data. We were interested in whether this could be validated when applied to existing, evolutionarily related protein sequences. Results: All experiments are performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics is evaluated, using ROC, CVE and AP measures. The BLAST and FASTA algorithms are used as reference. We find that two out of three Smith-Waterman implementations with e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with Z-score. SSEARCH especially has very high scores. Conclusion: The compute-intensive Z-score does not have a clear advantage over the e-value. The Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.
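
The Z-score idea can be sketched as follows: align the two sequences once, then re-align against shuffled versions of one sequence and standardize the observed score. The scoring scheme below (match +2, mismatch/gap -1) is a toy substitute for the substitution matrices and gap penalties used in practice, and the sequences are hypothetical.

```python
# Monte-Carlo Z-score for a local alignment score (toy Smith-Waterman scoring).
import random

def sw_score(a, b, match=2, mismatch=-1, gap=-1):
    """Best local alignment score with a linear gap penalty (two-row DP)."""
    prev, best = [0] * (len(b) + 1), 0
    for ca in a:
        cur = [0]
        for j, cb in enumerate(b, 1):
            s = max(0,
                    prev[j - 1] + (match if ca == cb else mismatch),
                    prev[j] + gap,
                    cur[j - 1] + gap)
            cur.append(s)
            best = max(best, s)
        prev = cur
    return best

random.seed(5)
seq1, seq2 = "MKVLITGAGSGIGL", "MKVLVTGAGGGIGQ"   # hypothetical sequences
obs = sw_score(seq1, seq2)

# Null distribution: scores against shuffled copies of seq2.
null = [sw_score(seq1, "".join(random.sample(seq2, len(seq2)))) for _ in range(500)]
mean = sum(null) / len(null)
sd = (sum((x - mean) ** 2 for x in null) / (len(null) - 1)) ** 0.5
print(f"score = {obs}, Z = {(obs - mean) / sd:.2f}")
```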

  12. Statistical significance of epidemiological data. Seminar: Evaluation of epidemiological studies

    International Nuclear Information System (INIS)

    Weber, K.H.

    1993-01-01

    In stochastic damages, the numbers of events, e.g. the persons who are affected by or have died of cancer, and thus the relative frequencies (incidence or mortality) are binomially distributed random variables. Their statistical fluctuations can be characterized by confidence intervals. For epidemiologic questions, especially for the analysis of stochastic damages in the low dose range, the following issues are interesting: Is a sample (a group of persons) with a definite observed damage frequency part of the whole population? Is an observed frequency difference between two groups of persons random or statistically significant? Is an observed increase or decrease of the frequencies with increasing dose random or statistically significant, and how large is the regression coefficient (= risk coefficient) in this case? These problems can be solved by statistical tests. So-called distribution-free tests and tests which are not bound to the supposition of normal distribution are of particular interest, such as the χ²-independence test (test in contingency tables), the Fisher-Yates test, the trend test according to Cochran, and the rank correlation test given by Spearman. These tests are explained in terms of selected epidemiologic data, e.g. of leukaemia clusters, of the cancer mortality of the Japanese A-bomb survivors, especially in the low dose range, as well as on the sample of the cancer mortality in the high background area in Yangjiang (China). (orig.)
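
A minimal χ²-independence test on a 2x2 contingency table, of the kind described above; the counts are hypothetical.

```python
# Chi-square independence test for a 2x2 contingency table (hypothetical counts).
from scipy.stats import chi2_contingency

table = [[12, 388],   # exposed:   cases, non-cases
         [ 5, 395]]   # unexposed: cases, non-cases
stat, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {stat:.2f}, dof = {dof}, p = {p:.3f}")
```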

  13. Statistical Analysis of Human Body Movement and Group Interactions in Response to Music

    Science.gov (United States)

    Desmet, Frank; Leman, Marc; Lesaffre, Micheline; de Bruyn, Leen

    Quantification of time series that relate to physiological data is challenging for empirical music research. Up to now, most studies have focused on time-dependent responses of individual subjects in controlled environments. However, little is known about time-dependent responses of between-subject interactions in an ecological context. This paper provides new findings on the statistical analysis of group synchronicity in response to musical stimuli. Different statistical techniques were applied to time-dependent data obtained from an experiment on embodied listening in individual and group settings. Analyses of inter-group synchronicity are described. Dynamic Time Warping (DTW) and the Cross Correlation Function (CCF) were found to be valid methods to estimate group coherence of the resulting movements. It was found that synchronicity of movements between individuals (human-human interactions) increases significantly in the social context. Moreover, Analysis of Variance (ANOVA) revealed that the type of music is the predominant factor in both the individual and the social context.
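
A sketch of the cross-correlation measure on two synthetic movement signals; DTW, the paper's other method, is omitted here because it needs a dedicated implementation or library.

```python
# Cross-correlation function (CCF) as a simple synchronicity measure; synthetic signals.
import numpy as np

fs = 100                                   # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
a = np.sin(2 * np.pi * 0.5 * t)            # subject A movement signal
b = np.sin(2 * np.pi * 0.5 * (t - 0.3))    # subject B, lagging by 0.3 s

a0, b0 = a - a.mean(), b - b.mean()
ccf = np.correlate(a0, b0, mode="full") / (np.std(a) * np.std(b) * len(a))
lags = np.arange(-len(a) + 1, len(a)) / fs
k = int(np.argmax(ccf))
print(f"peak correlation {ccf[k]:.2f} at lag {lags[k]:.2f} s")
```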

  14. The significance of age and sex for the absence of immune response to hepatitis B vaccination

    Directory of Open Access Journals (Sweden)

    Rosić Ilija

    2008-01-01

    INTRODUCTION: Seroepidemiological investigations after the administration of hepatitis B vaccine have shown that as many as 15% of vaccinated healthy persons do not generate an immune response to the vaccines currently in use. OBJECTIVE: The aim of the research was to test the immunogenicity of hepatitis B vaccine in different age groups on an adult vaccinated population sample in Serbia. METHOD: The tested general population sample consisted of 154 adult subjects. Immunization was done using the recombinant vaccine obtained by genetic engineering (Euvax B vaccine, manufacturer LG, distributor Sanofi Pasteur). All tested subjects received 1 ml of hepatitis B vaccine administered intramuscularly into the deltoid muscle on a 0, 1, 6 month schedule. RESULTS: In the tested sample, 3.13% of persons aged up to 29 years, 6.25% of persons aged 30-39 years, and 19.23% of persons aged 40 years and older had no immune response. The relative risk of a "no response" finding was twice as high in the group aged 30-39 as in the population aged up to 29 years. The risk was six times higher for the population aged 40 years and older in comparison to the population aged up to 29 years. Also, the relative risk of a "no response" finding for the population aged 40 years and older was more than three times higher than for the group aged 30-39. The absence of immune response in relation to sex was higher in male subjects. CONCLUSION: The rates of the "no response" finding were as follows: 3.13% in the group aged up to 29 years, 6.25% in the group aged 30-39, and 19.23% in the group aged 40 years and older. Immune response in relation to age groups was statistically significantly different (p<0.001), and there was a statistically significant correlation (C=0.473; p<0.001) between the age of the subjects and the immune response. In relation to sex, the "no response" finding was higher in males, but without any statistically

  15. Statistical Significance and Effect Size: Two Sides of a Coin.

    Science.gov (United States)

    Fan, Xitao

    This paper suggests that statistical significance testing and effect size are two sides of the same coin; they complement each other, but do not substitute for one another. Good research practice requires that both should be taken into consideration to make sound quantitative decisions. A Monte Carlo simulation experiment was conducted, and a…

  16. Publication of statistically significant research findings in prosthodontics & implant dentistry in the context of other dental specialties.

    Science.gov (United States)

    Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos

    2015-10-01

    To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Significant Statistics: Viewed with a Contextual Lens

    Science.gov (United States)

    Tait-McCutcheon, Sandi

    2010-01-01

    This paper examines the pedagogical and organisational changes three lead teachers made to their statistics teaching and learning programs. The lead teachers posed the research question: What would the effect of contextually integrating statistical investigations and literacies into other curriculum areas be on student achievement? By finding the…

  18. Kappa statistic to measure agreement beyond chance in free-response assessments.

    Science.gov (United States)

    Carpentier, Marc; Combescure, Christophe; Merlini, Laura; Perneger, Thomas V

    2017-04-19

    The usual kappa statistic requires that all observations be enumerated. However, in free-response assessments, only positive (or abnormal) findings are notified, but negative (or normal) findings are not. This situation occurs frequently in imaging or other diagnostic studies. We propose here a kappa statistic that is suitable for free-response assessments. We derived the equivalent of Cohen's kappa statistic for two raters under the assumption that the number of possible findings for any given patient is very large, as well as a formula for sampling variance that is applicable to independent observations (for clustered observations, a bootstrap procedure is proposed). The proposed statistic was applied to a real-life dataset, and compared with the common practice of collapsing observations within a finite number of regions of interest. The free-response kappa is computed from the total numbers of discordant (b and c) and concordant positive (d) observations made in all patients, as 2d/(b + c + 2d). In 84 full-body magnetic resonance imaging procedures in children that were evaluated by 2 independent raters, the free-response kappa statistic was 0.820. Aggregation of results within regions of interest resulted in overestimation of agreement beyond chance. The free-response kappa provides an estimate of agreement beyond chance in situations where only positive findings are reported by raters.
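
The statistic itself is a one-liner; the sketch below applies the formula from the abstract to hypothetical totals of discordant (b, c) and concordant positive (d) findings.

```python
# Free-response kappa: 2d / (b + c + 2d), with counts summed over all patients.
def free_response_kappa(b, c, d):
    """b, c: findings reported by only one of the two raters; d: concordant positives."""
    return 2 * d / (b + c + 2 * d)

b, c, d = 14, 9, 106   # hypothetical totals
print(f"free-response kappa = {free_response_kappa(b, c, d):.3f}")
```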

  19. "What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"

    Science.gov (United States)

    Ozturk, Elif

    2012-01-01

    The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…

  20. Statistical vs. Economic Significance in Economics and Econometrics: Further comments on McCloskey & Ziliak

    DEFF Research Database (Denmark)

    Engsted, Tom

    I comment on the controversy between McCloskey & Ziliak and Hoover & Siegler on statistical versus economic significance, in the March 2008 issue of the Journal of Economic Methodology. I argue that while McCloskey & Ziliak are right in emphasizing 'real error', i.e. non-sampling error that cannot...... be eliminated through specification testing, they fail to acknowledge those areas in economics, e.g. rational expectations macroeconomics and asset pricing, where researchers clearly distinguish between statistical and economic significance and where statistical testing plays a relatively minor role in model...

  1. Statistical optimization of cultural conditions by response surface ...

    African Journals Online (AJOL)

    2009-08-04

    Statistical optimization of cultural conditions by response surface methodology for phenol degradation by a novel ... Phenol is a hydrocarbon compound that is highly toxic, ...

  2. Statistically significant faunal differences among Middle Ordovician age, Chickamauga Group bryozoan bioherms, central Alabama

    Energy Technology Data Exchange (ETDEWEB)

    Crow, C.J.

    1985-01-01

    Middle Ordovician age Chickamauga Group carbonates crop out along the Birmingham and Murphrees Valley anticlines in central Alabama. The macrofossil contents on exposed surfaces of seven bioherms have been counted to determine their various paleontologic characteristics. Twelve groups of organisms are present in these bioherms. Dominant organisms include bryozoans, algae, brachiopods, sponges, pelmatozoans, stromatoporoids and corals. Minor accessory fauna include predators, scavengers and grazers such as gastropods, ostracods, trilobites, cephalopods and pelecypods. Vertical and horizontal niche zonation has been detected for some of the bioherm dwelling fauna. No one bioherm of those studied exhibits all 12 groups of organisms; rather, individual bioherms display various subsets of the total diversity. Statistical treatment (G-test) of the diversity data indicates a lack of statistical homogeneity of the bioherms, both within and between localities. Between-locality population heterogeneity can be ascribed to differences in biologic responses to such gross environmental factors as water depth and clarity, and energy levels. At any one locality, gross aspects of the paleoenvironments are assumed to have been more uniform. Significant differences among bioherms at any one locality may have resulted from patchy distribution of species populations, differential preservation and other factors.
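
The G-test mentioned above can be run with scipy's chi2_contingency by selecting the log-likelihood-ratio statistic; the taxon counts below are hypothetical, not the study's data.

```python
# G-test (log-likelihood ratio) of faunal homogeneity across two bioherms.
from scipy.stats import chi2_contingency

counts = [[40, 25, 10],   # bioherm 1: bryozoans, algae, brachiopods
          [22, 38, 19]]   # bioherm 2
g, p, dof, _ = chi2_contingency(counts, lambda_="log-likelihood")
print(f"G = {g:.2f}, dof = {dof}, p = {p:.4f}")
```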

  3. Distinguishing between statistical significance and practical/clinical meaningfulness using statistical inference.

    Science.gov (United States)

    Wilkinson, Michael

    2014-03-01

    Decisions about support for predictions of theories in light of data are made using statistical inference. The dominant approach in sport and exercise science is the Neyman-Pearson (N-P) significance-testing approach. When applied correctly it provides a reliable procedure for making dichotomous decisions for accepting or rejecting zero-effect null hypotheses with known and controlled long-run error rates. Type I and type II error rates must be specified in advance and the latter controlled by conducting an a priori sample size calculation. The N-P approach does not provide the probability of hypotheses or indicate the strength of support for hypotheses in light of data, yet many scientists believe it does. Outcomes of analyses allow conclusions only about the existence of non-zero effects, and provide no information about the likely size of true effects or their practical/clinical value. Bayesian inference can show how much support data provide for different hypotheses, and how personal convictions should be altered in light of data, but the approach is complicated by formulating probability distributions about prior subjective estimates of population effects. A pragmatic solution is magnitude-based inference, which allows scientists to estimate the true magnitude of population effects and how likely they are to exceed an effect magnitude of practical/clinical importance, thereby integrating elements of subjective Bayesian-style thinking. While this approach is gaining acceptance, progress might be hastened if scientists appreciate the shortcomings of traditional N-P null hypothesis significance testing.

  4. Codon Deviation Coefficient: a novel measure for estimating codon usage bias and its statistical significance

    Directory of Open Access Journals (Sweden)

    Zhang Zhang

    2012-03-01

    Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure, the Codon Deviation Coefficient (CDC), that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions.

  5. Statistics Refresher for Molecular Imaging Technologists, Part 2: Accuracy of Interpretation, Significance, and Variance.

    Science.gov (United States)

    Farrell, Mary Beth

    2018-06-01

    This article is the second part of a continuing education series reviewing basic statistics that nuclear medicine and molecular imaging technologists should understand. In this article, the statistics for evaluating interpretation accuracy, significance, and variance are discussed. Throughout the article, actual statistics are pulled from the published literature. We begin by explaining 2 methods for quantifying interpretive accuracy: interreader and intrareader reliability. Agreement among readers can be expressed simply as a percentage. However, the Cohen κ-statistic is a more robust measure of agreement that accounts for chance. The higher the κ-statistic is, the higher is the agreement between readers. When 3 or more readers are being compared, the Fleiss κ-statistic is used. Significance testing determines whether the difference between 2 conditions or interventions is meaningful. Statistical significance is usually expressed using a number called a probability (P) value. Calculation of a P value is beyond the scope of this review. However, knowing how to interpret P values is important for understanding the scientific literature. Generally, a P value of less than 0.05 is considered significant and indicates that the results of the experiment are due to more than just chance. Variance, standard deviation (SD), confidence interval, and standard error (SE) explain the dispersion of data around a mean of a sample drawn from a population. SD is commonly reported in the literature. A small SD indicates that there is not much variation in the sample data. Many biologic measurements fall into what is referred to as a normal distribution, taking the shape of a bell curve. In a normal distribution, 68% of the data will fall within 1 SD, 95% will fall within 2 SDs, and 99.7% will fall within 3 SDs. Confidence interval defines the range of possible values within which the population parameter is likely to lie and gives an idea of the precision of the statistic being estimated.
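
    The dispersion measures described above are straightforward to reproduce. The sketch below, assuming Python with NumPy and SciPy and entirely hypothetical measurement values, computes a sample SD, SE, and a t-based 95% confidence interval, and empirically checks the 68-95-99.7 rule on a large normal sample.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        sample = rng.normal(loc=100.0, scale=15.0, size=50)  # hypothetical measurements

        mean = sample.mean()
        sd = sample.std(ddof=1)           # sample standard deviation
        se = sd / np.sqrt(sample.size)    # standard error of the mean

        # 95% confidence interval for the mean, using the t distribution
        t_crit = stats.t.ppf(0.975, df=sample.size - 1)
        ci = (mean - t_crit * se, mean + t_crit * se)

        # Empirical check of the 68-95-99.7 rule on a large normal sample
        big = rng.normal(size=1_000_000)
        for k in (1, 2, 3):
            print(f"within {k} SD: {np.mean(np.abs(big) <= k):.4f}")
        print(f"mean={mean:.1f}, SD={sd:.1f}, 95% CI=({ci[0]:.1f}, {ci[1]:.1f})")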

  6. Systematic reviews of anesthesiologic interventions reported as statistically significant

    DEFF Research Database (Denmark)

    Imberger, Georgina; Gluud, Christian; Boylan, John

    2015-01-01

    statistically significant meta-analyses of anesthesiologic interventions, we used TSA to estimate power and imprecision in the context of sparse data and repeated updates. METHODS: We conducted a search to identify all systematic reviews with meta-analyses that investigated an intervention that may......: From 11,870 titles, we found 682 systematic reviews that investigated anesthesiologic interventions. In the 50 sampled meta-analyses, the median number of trials included was 8 (interquartile range [IQR], 5-14), the median number of participants was 964 (IQR, 523-1736), and the median number...

  7. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    Science.gov (United States)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
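
    A minimal sketch of the proposed procedure, assuming Python with NumPy and hypothetical stacks of individual histograms (rows) over common bins: the Euclidean distance between the two normalized summary histograms is compared with its null distribution obtained by resampling, with replacement, from the pooled set of individual histograms.

        import numpy as np

        def summary_hist(hists):
            # Sum individual histograms into a summary histogram of frequencies
            h = hists.sum(axis=0).astype(float)
            return h / h.sum()

        def euclidean(h1, h2):
            return np.sqrt(((h1 - h2) ** 2).sum())

        def bootstrap_pvalue(hists_a, hists_b, n_boot=2000, seed=0):
            # Null hypothesis: both ensembles of individual histograms come
            # from the same population, so resample from the pooled set.
            rng = np.random.default_rng(seed)
            observed = euclidean(summary_hist(hists_a), summary_hist(hists_b))
            pooled = np.vstack([hists_a, hists_b])
            na, nb, n = len(hists_a), len(hists_b), len(pooled)
            exceed = 0
            for _ in range(n_boot):
                a = pooled[rng.integers(0, n, size=na)]
                b = pooled[rng.integers(0, n, size=nb)]
                if euclidean(summary_hist(a), summary_hist(b)) >= observed:
                    exceed += 1
            return (exceed + 1) / (n_boot + 1)

        rng = np.random.default_rng(1)
        hists_a = rng.poisson(5.0, size=(100, 20))   # 100 individual histograms, 20 bins
        hists_b = rng.poisson(5.5, size=(120, 20))
        print(bootstrap_pvalue(hists_a, hists_b))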

  8. P-Value, a true test of statistical significance? a cautionary note ...

    African Journals Online (AJOL)

    While it was never the intention of the founders of significance testing and hypothesis testing to have the two ideas intertwined as if they were complementary, the inconvenient marriage of the two practices into one coherent, convenient, incontrovertible and misinterpreted practice has dotted our standard statistics textbooks and ...

  9. Codon Deviation Coefficient: A novel measure for estimating codon usage bias and its statistical significance

    KAUST Repository

    Zhang, Zhang

    2012-03-22

    Background: Genetic mutation, selective pressure for translational efficiency and accuracy, level of gene expression, and protein function through natural selection are all believed to lead to codon usage bias (CUB). Therefore, informative measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have not statistically evaluated the significance of CUB in sequence analysis. Results: Here we propose a novel measure, the Codon Deviation Coefficient (CDC), that provides an informative measurement of CUB and its statistical significance without requiring any prior knowledge. Unlike previous measures, CDC estimates CUB by accounting for background nucleotide compositions tailored to codon positions and adopts bootstrapping to assess the statistical significance of CUB for any given sequence. We evaluate CDC by examining its effectiveness on simulated sequences and empirical data and show that CDC outperforms extant measures by achieving a more informative estimation of CUB and its statistical significance. Conclusions: As validated by both simulated and empirical data, CDC provides a highly informative quantification of CUB and its statistical significance, useful for determining comparative magnitudes and patterns of biased codon usage for genes or genomes with diverse sequence compositions. © 2012 Zhang et al.; licensee BioMed Central Ltd.
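
    The following sketch illustrates the general idea of bootstrapping the significance of codon usage bias against a background nucleotide composition tailored to codon positions. It is not the CDC statistic itself; the deviation measure and all inputs are simplified, illustrative assumptions, written in Python with NumPy.

        import itertools
        from collections import Counter

        import numpy as np

        def codon_bias_pvalue(codons, n_boot=1000, seed=0):
            # Illustrative only (not the CDC statistic): total absolute deviation
            # of observed codon frequencies from those expected under the
            # sequence's own per-position nucleotide composition, with a
            # bootstrap null distribution of the same deviation.
            rng = np.random.default_rng(seed)
            n = len(codons)
            pos = [Counter(c[i] for c in codons) for i in range(3)]
            bases = [sorted(p) for p in pos]
            probs = [np.array([pos[i][b] for b in bases[i]], float) / n for i in range(3)]
            space = ["".join(t) for t in itertools.product(*bases)]
            exp = np.array([np.prod([probs[i][bases[i].index(c[i])] for i in range(3)])
                            for c in space])

            def deviation(seq):
                cnt = Counter(seq)
                obs = np.array([cnt[c] for c in space], float) / n
                return np.abs(obs - exp).sum()

            d_obs = deviation(codons)
            exceed = 0
            for _ in range(n_boot):
                sim = ["".join(t) for t in zip(*(rng.choice(bases[i], size=n, p=probs[i])
                                                 for i in range(3)))]
                exceed += deviation(sim) >= d_obs
            return (exceed + 1) / (n_boot + 1)

        print(codon_bias_pvalue(["ATG", "GCG", "GCG", "GCA", "GCG", "TTT"] * 10))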

  10. Statistics of the Von Mises Stress Response For Structures Subjected To Random Excitations

    Directory of Open Access Journals (Sweden)

    Mu-Tsang Chen

    1998-01-01

    Finite element-based random vibration analysis is increasingly used in computer-aided engineering software for computing statistics (e.g., root-mean-square values) of structural responses such as displacements, stresses and strains. However, these statistics can often be computed only for Cartesian responses. For the design of metal structures, a failure criterion based on an equivalent stress response, commonly known as the von Mises stress, is more appropriate and often used. This paper presents an approach for computing the statistics of the von Mises stress response for structures subjected to random excitations. Random vibration analysis is first performed to compute covariance matrices of Cartesian stress responses. Monte Carlo simulation is then used to perform scatter and failure analyses using the von Mises stress response.
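
    A minimal sketch of the described approach, in Python with NumPy: the covariance matrix of the Cartesian stress components (here an invented plane-stress example) stands in for the output of the random vibration analysis, and Monte Carlo sampling yields statistics and an exceedance probability for the von Mises stress.

        import numpy as np

        def von_mises_stats(cov, yield_stress, n_samples=100_000, seed=0):
            # Monte Carlo statistics of the von Mises stress for a plane-stress
            # state whose Cartesian components (s_x, s_y, tau_xy) are zero-mean
            # Gaussian with covariance matrix `cov`.
            rng = np.random.default_rng(seed)
            sx, sy, txy = rng.multivariate_normal(np.zeros(3), cov, size=n_samples).T
            svm = np.sqrt(sx**2 - sx * sy + sy**2 + 3.0 * txy**2)
            return {
                "rms": np.sqrt(np.mean(svm**2)),
                "mean": svm.mean(),
                "p_failure": np.mean(svm > yield_stress),  # fraction exceeding yield
            }

        # Hypothetical covariance (MPa^2) of the stress components at one element
        cov = np.array([[400.0, 120.0, 30.0],
                        [120.0, 250.0, 20.0],
                        [ 30.0,  20.0, 90.0]])
        print(von_mises_stats(cov, yield_stress=80.0))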

  11. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample-size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…

  12. Strategies for Testing Statistical and Practical Significance in Detecting DIF with Logistic Regression Models

    Science.gov (United States)

    Fidalgo, Angel M.; Alavi, Seyed Mohammad; Amirian, Seyed Mohammad Reza

    2014-01-01

    This study examines three controversial aspects in differential item functioning (DIF) detection by logistic regression (LR) models: first, the relative effectiveness of different analytical strategies for detecting DIF; second, the suitability of the Wald statistic for determining the statistical significance of the parameters of interest; and…

  13. Thresholds for statistical and clinical significance in systematic reviews with meta-analytic methods

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Wetterslev, Jorn; Winkel, Per

    2014-01-01

    BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS......: Methodologies for assessing statistical and clinical significance of intervention effects in systematic reviews were considered. Balancing simplicity and comprehensiveness, an operational procedure was developed, based mainly on The Cochrane Collaboration methodology and the Grading of Recommendations...... Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews: (1) obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most...

  14. Dynamic Response to Pedestrian Loads with Statistical Frequency Distribution

    DEFF Research Database (Denmark)

    Krenk, Steen

    2012-01-01

    on the magnitude of the resulting response. A frequency representation of vertical pedestrian load is developed, and a compact explicit formula is developed for the magnitude of the resulting response, in terms of the damping ratio of the structure, the bandwidth of the pedestrian load, and the mean footfall...... frequency. The accuracy of the formula is verified by a statistical moment analysis using the Lyapunov equations....

  15. Model-generated air quality statistics for application in vegetation response models in Alberta

    International Nuclear Information System (INIS)

    McVehil, G.E.; Nosal, M.

    1990-01-01

    To test and apply vegetation response models in Alberta, air pollution statistics representative of various parts of the Province are required. At this time, air quality monitoring data of the requisite accuracy and time resolution are not available for most parts of Alberta. Therefore, there exists a need to develop appropriate air quality statistics. The objectives of the work reported here were to determine the applicability of model generated air quality statistics and to develop by modelling, realistic and representative time series of hourly SO2 concentrations that could be used to generate the statistics demanded by vegetation response models

  16. Multivariate statistical analyses demonstrate unique host immune responses to single and dual lentiviral infection.

    Directory of Open Access Journals (Sweden)

    Sunando Roy

    2009-10-01

    Feline immunodeficiency virus (FIV) and human immunodeficiency virus (HIV) are recently identified lentiviruses that cause progressive immune decline and ultimately death in infected cats and humans. It is of great interest to understand how to prevent immune system collapse caused by these lentiviruses. We recently described that disease caused by a virulent FIV strain in cats can be attenuated if animals are first infected with a feline immunodeficiency virus derived from a wild cougar. The detailed temporal tracking of cat immunological parameters in response to two viral infections resulted in high-dimensional datasets containing variables that exhibit strong co-variation. Initial analyses of these complex data using univariate statistical techniques did not account for interactions among immunological response variables and therefore potentially obscured significant effects between infection state and immunological parameters. Here, we apply a suite of multivariate statistical tools, including Principal Component Analysis, MANOVA and Linear Discriminant Analysis, to temporal immunological data resulting from FIV superinfection in domestic cats. We investigated the co-variation among immunological responses, the differences in immune parameters among four groups of five cats each (uninfected, single and dual infected animals), and the "immune profiles" that discriminate among them over the first four weeks following superinfection. Dual infected cats mount an immune response by 24 days post superinfection that is characterized by elevated levels of CD8 and CD25 cells and increased expression of IL4 and IFNgamma, and FAS. This profile discriminates dual infected cats from cats infected with FIV alone, which show high IL-10 and lower numbers of CD8 and CD25 cells. Multivariate statistical analyses demonstrate both the dynamic nature of the immune response to FIV single and dual infection and the development of a unique immunological profile in dual
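
    To make the analysis pipeline concrete, here is a toy sketch under stated assumptions (Python with NumPy and scikit-learn; the data are fabricated stand-ins for the immune measurements): co-varying variables are first reduced by PCA and the groups are then discriminated with LDA.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        # Fabricated stand-in data: 20 animals x 8 immune variables (CD8, CD25, IL4, ...)
        X = rng.normal(size=(20, 8))
        X[15:] += 1.5                         # give one group a shifted "immune profile"
        groups = np.repeat([0, 1, 2, 3], 5)   # four groups of five animals

        pcs = PCA(n_components=3).fit_transform(X)            # reduce co-varying measures
        lda = LinearDiscriminantAnalysis().fit(pcs, groups)   # discriminate the groups
        print("training accuracy:", lda.score(pcs, groups))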

  17. Response statistics of rotating shaft with non-linear elastic restoring forces by path integration

    Science.gov (United States)

    Gaidai, Oleg; Naess, Arvid; Dimentberg, Michael

    2017-07-01

    Extreme statistics of random vibrations are studied for a Jeffcott rotor under uniaxial white noise excitation. The restoring force is modelled as elastic and non-linear; a comparison is made with a linearized restoring force to see the effect of force non-linearity on the response statistics. While analytical solutions and stability conditions are available for the linear model, this is not generally the case for the non-linear system except in some special cases. The statistics of the non-linear case are studied by applying the path integration (PI) method, which is based on the Markov property of the coupled dynamic system. The Jeffcott rotor response statistics can be obtained by solving the Fokker-Planck (FP) equation of the 4D dynamic system. An efficient implementation of the PI algorithm is applied; namely, the fast Fourier transform (FFT) is used to simulate the additive noise of the dynamic system. The latter significantly reduces computational time compared to the classical PI. Excitation is modelled as Gaussian white noise; however, white noise with any distribution can be implemented with the same PI technique. Multidirectional Markov noise can also be modelled with PI in the same way as unidirectional noise. PI is accelerated by using a Monte Carlo (MC) estimate of the joint probability density function (PDF) as the initial input. The symmetry of the dynamic system was utilized to afford higher mesh resolution. Both internal (rotating) and external damping are included in the mechanical model of the rotor. The main advantage of using PI rather than MC is that PI offers high accuracy in the tail of the probability distribution. The latter is of critical importance for, e.g., extreme value statistics, system reliability, and first passage probability.

  18. Intensive inpatient treatment for bulimia nervosa: Statistical and clinical significance of symptom changes.

    Science.gov (United States)

    Diedrich, Alice; Schlegl, Sandra; Greetfeld, Martin; Fumi, Markus; Voderholzer, Ulrich

    2018-03-01

    This study examines the statistical and clinical significance of symptom changes during an intensive inpatient treatment program with a strong psychotherapeutic focus for individuals with severe bulimia nervosa. A total of 295 consecutively admitted bulimic patients were administered the Structured Interview for Anorexic and Bulimic Syndromes-Self-Rating (SIAB-S), the Eating Disorder Inventory-2 (EDI-2), the Brief Symptom Inventory (BSI), and the Beck Depression Inventory-II (BDI-II) at treatment intake and discharge. Results indicated statistically significant symptom reductions with large effect sizes regarding severity of binge eating and compensatory behavior (SIAB-S), overall eating disorder symptom severity (EDI-2), overall psychopathology (BSI), and depressive symptom severity (BDI-II), even when controlling for antidepressant medication. The majority of patients showed either reliable (EDI-2: 33.7%, BSI: 34.8%, BDI-II: 18.1%) or even clinically significant symptom changes (EDI-2: 43.2%, BSI: 33.9%, BDI-II: 56.9%). Patients with clinically significant improvement were less distressed at intake and less likely to suffer from a comorbid borderline personality disorder when compared with those who did not improve to a clinically significant extent. Findings indicate that intensive psychotherapeutic inpatient treatment may be effective in about 75% of severely affected bulimic patients. For the remaining non-responding patients, inpatient treatment might be improved through an even stronger focus on the reduction of comorbid borderline personality traits.

  19. Recent Literature on Whether Statistical Significance Tests Should or Should Not Be Banned.

    Science.gov (United States)

    Deegear, James

    This paper summarizes the literature regarding statistical significance testing, with an emphasis on recent literature in various disciplines and literature exploring why researchers have demonstrably failed to be influenced by the American Psychological Association publication manual's encouragement to report effect sizes. Also considered are…

  20. Statistics of Visual Responses to Image Object Stimuli from Primate AIT Neurons to DNN Neurons.

    Science.gov (United States)

    Dong, Qiulei; Wang, Hong; Hu, Zhanyi

    2018-02-01

    Under the goal-driven paradigm, Yamins et al. (2014; Yamins & DiCarlo, 2016) have shown that by optimizing only the final eight-way categorization performance of a four-layer hierarchical network, not only can its top output layer quantitatively predict IT neuron responses but its penultimate layer can also automatically predict V4 neuron responses. Currently, deep neural networks (DNNs) in the field of computer vision have reached image object categorization performance comparable to that of human beings on ImageNet, a data set that contains 1.3 million training images of 1000 categories. We explore whether the DNN neurons (units in DNNs) possess image object representational statistics similar to monkey IT neurons, particularly when the network becomes deeper and the number of image categories becomes larger, using VGG19, a typical and widely used deep network of 19 layers in the computer vision field. Following Lehky, Kiani, Esteky, and Tanaka (2011, 2014), where the response statistics of 674 IT neurons to 806 image stimuli are analyzed using three measures (kurtosis, Pareto tail index, and intrinsic dimensionality), we investigate the three issues in this letter using the same three measures: (1) the similarities and differences of the neural response statistics between VGG19 and primate IT cortex, (2) the variation trends of the response statistics of VGG19 neurons at different layers from low to high, and (3) the variation trends of the response statistics of VGG19 neurons when the numbers of stimuli and neurons increase. We find that the response statistics on both single-neuron selectivity and population sparseness of VGG19 neurons are fundamentally different from those of IT neurons in most cases; by increasing the number of neurons in different layers and the number of stimuli, the response statistics of neurons at different layers from low to high do not substantially change; and the estimated intrinsic dimensionality values at the low
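
    Response-statistics measures of this kind are easy to compute on a unit-by-stimulus response matrix. Below is a sketch assuming Python with NumPy/SciPy and random data in place of real recordings; the Treves-Rolls index is used here as an illustrative population-sparseness measure, not necessarily the one used in the paper.

        import numpy as np
        from scipy.stats import kurtosis

        rng = np.random.default_rng(0)
        # Hypothetical response matrix: rows = units (neurons), columns = image stimuli
        R = rng.gamma(shape=2.0, scale=1.0, size=(674, 806))

        # Single-unit selectivity: excess kurtosis of each unit's response profile
        selectivity = kurtosis(R, axis=1, fisher=True)

        # Population sparseness per stimulus (Treves-Rolls index; values near 1 = dense)
        def treves_rolls(col):
            return (col.mean() ** 2) / np.mean(col ** 2)

        sparseness = np.apply_along_axis(treves_rolls, 0, R)
        print(f"median kurtosis: {np.median(selectivity):.2f}, "
              f"median sparseness: {np.median(sparseness):.2f}")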

  1. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    Science.gov (United States)

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. To date, multiple challenges remain in protein MS data analysis: large-scale and complex data set management; MS peak identification and indexing; and high-dimensional peak differential analysis with false discovery rate (FDR) control based on concurrent statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets to identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The presented web application supports large-scale online MS data uploading and analysis with a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.

  2. Predicting Statistical Response and Extreme Events in Uncertainty Quantification through Reduced-Order Models

    Science.gov (United States)

    Qi, D.; Majda, A.

    2017-12-01

    A low-dimensional reduced-order statistical closure model is developed for quantifying the uncertainty in statistical sensitivity and intermittency in the principal model directions with the largest variability in high-dimensional turbulent systems and turbulent transport models. Imperfect model sensitivity is improved through a recent mathematical strategy for calibrating model errors in a training phase, where information theory and linear statistical response theory are combined in a systematic fashion to achieve the optimal model performance. The idea in the reduced-order method comes from a self-consistent mathematical framework for general systems with quadratic nonlinearity, where crucial high-order statistics are approximated by a systematic model calibration procedure. Model efficiency is improved through additional damping and noise corrections to replace the expensive energy-conserving nonlinear interactions. Model errors due to the imperfect nonlinear approximation are corrected by tuning the model parameters using linear response theory with an information metric in a training phase before prediction. A statistical energy principle is adopted to introduce a global scaling factor in characterizing the higher-order moments in a consistent way to improve model sensitivity. Stringent test models of barotropic and baroclinic turbulence are used to demonstrate the feasibility of the reduced-order methods. Principal statistical responses in mean and variance can be captured by the reduced-order models with accuracy and efficiency. In addition, the reduced-order models are used to capture the crucial passive tracer field that is advected by the baroclinic turbulent flow. It is demonstrated that crucial principal statistical quantities like the tracer spectrum and fat tails in the tracer probability density functions at the most important large scales can be captured efficiently with accuracy using the reduced-order tracer model in various dynamical regimes of the flow field with

  3. Statistical Analysis of Zebrafish Locomotor Response.

    Science.gov (United States)

    Liu, Yiwen; Carmer, Robert; Zhang, Gaonan; Venkatraman, Prahatha; Brown, Skye Ashton; Pang, Chi-Pui; Zhang, Mingzhi; Ma, Ping; Leung, Yuk Fai

    2015-01-01

    Zebrafish larvae display rich locomotor behaviour upon external stimulation. The movement can be simultaneously tracked from many larvae arranged in multi-well plates. The resulting time-series locomotor data have been used to reveal new insights into neurobiology and pharmacology. However, the data are of large scale, and the corresponding locomotor behavior is affected by multiple factors. These issues pose a statistical challenge for comparing larval activities. To address this gap, this study analyzed a visually driven locomotor behaviour named the visual motor response (VMR) with Hotelling's T-squared test. This test is suited to comparing locomotor profiles over a time period. Different wild-type (WT) strains were compared using the test, which shows that they responded differently to light change at different developmental stages. The performance of this test was evaluated by a power analysis, which shows that the test was sensitive for detecting differences between experimental groups with sample numbers that were commonly used in various studies. In addition, this study investigated the effects of various factors that might affect the VMR by multivariate analysis of variance (MANOVA). The results indicate that the larval activity was generally affected by stage, light stimulus, their interaction, and location in the plate. Nonetheless, different factors affected larval activity differently over time, as indicated by a dynamical analysis of the activity at each second. Intriguingly, this analysis also shows that biological and technical repeats had a negligible effect on larval activity. This finding is consistent with that from Hotelling's T-squared test, and suggests that experimental repeats can be combined to enhance statistical power. Together, these investigations have established a statistical framework for analyzing VMR data, a framework that should be generally applicable to other locomotor data with similar structure.
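
    A compact implementation of the two-sample Hotelling's T-squared test with its standard F approximation, assuming Python with NumPy/SciPy; the group sizes, time bins, and data below are hypothetical.

        import numpy as np
        from scipy import stats

        def hotelling_t2(X, Y):
            # Two-sample Hotelling's T-squared test comparing multivariate means,
            # e.g. per-time-bin locomotor activity profiles of two larval groups.
            n1, p = X.shape
            n2, _ = Y.shape
            d = X.mean(axis=0) - Y.mean(axis=0)
            S = ((n1 - 1) * np.cov(X, rowvar=False) +
                 (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)  # pooled covariance
            t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
            f = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2          # F approximation
            pval = stats.f.sf(f, p, n1 + n2 - p - 1)
            return t2, pval

        rng = np.random.default_rng(1)
        grp_a = rng.normal(0.0, 1.0, size=(24, 5))   # hypothetical: 24 larvae x 5 time bins
        grp_b = rng.normal(0.4, 1.0, size=(24, 5))
        print(hotelling_t2(grp_a, grp_b))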

  4. Power, effects, confidence, and significance: an investigation of statistical practices in nursing research.

    Science.gov (United States)

    Gaskin, Cadeyrn J; Happell, Brenda

    2014-05-01

    improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of Type I experiment-wise error inflation in their studies.

  5. Safety significance of ATR passive safety response attributes

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1990-01-01

    The Advanced Test Reactor (ATR) at the Idaho National Engineering Laboratory was designed with some passive safety response attributes which contribute to the safety of the facility. The three passive safety attributes being evaluated in the paper are: (1) in-core and in-vessel natural convection cooling, (2) a passive heat sink capability of the ATR primary coolant system (PCS) for the transfer of decay power from the uninsulated piping to the confinement, and (3) gravity feed of emergency coolant makeup. The safety significance of the ATR passive safety response attributes is that the reactor can passively respond to most transients, given a reactor scram, to provide adequate decay power removal and a significant time for operator action should the normal active heat removal systems and their backup systems both fail. The ATR Interim Level 1 Probabilistic Risk Assessment (PRA) models and results were used to evaluate the significance to ATR fuel damage frequency (or probability) of the above three passive response attributes. The results of the evaluation indicate that the first attribute is a major safety characteristic of the ATR. The second attribute has a noticeable but only minor safety significance. The third attribute has no significant influence on the ATR firewater injection system (emergency coolant system)

  6. Examining reproducibility in psychology : A hybrid method for combining a statistically significant original study and a replication

    NARCIS (Netherlands)

    Van Aert, R.C.M.; Van Assen, M.A.L.M.

    2018-01-01

    The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter

  7. A tutorial on hunting statistical significance by chasing N

    Directory of Open Access Journals (Sweden)

    Denes Szucs

    2016-09-01

    There is increasing concern about the replicability of studies in psychology and cognitive neuroscience. Hidden data dredging (also called p-hacking) is a major contributor to this crisis because it substantially increases the Type I error rate, resulting in a much larger proportion of false positive findings than the usually expected 5%. In order to build better intuition to avoid, detect and criticise some typical problems, here I systematically illustrate the large impact of some easy-to-implement and therefore perhaps frequent data dredging techniques on boosting false positive findings. I illustrate several forms of two special cases of data dredging. First, researchers may violate the data collection stopping rules of null hypothesis significance testing by repeatedly checking for statistical significance with various numbers of participants. Second, researchers may group participants post hoc along potential but unplanned independent grouping variables. The first approach 'hacks' the number of participants in studies; the second approach 'hacks' the number of variables in the analysis. I demonstrate the high number of false positive findings generated by these techniques with data from true null distributions. I also illustrate that it is extremely easy to introduce strong bias into data by very mild selection and re-testing. Similar, usually undocumented data dredging steps can easily lead to 20-50% or more false positives.
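
    The first form of data dredging, repeatedly testing while adding participants, is easy to simulate. A sketch assuming Python with NumPy/SciPy; the sample sizes and stopping rule are illustrative. Under a true null, the long-run false positive rate comes out well above the nominal 5%.

        import numpy as np
        from scipy import stats

        def optional_stopping_fpr(n_start=10, n_max=100, step=5, n_sim=2000, seed=0):
            # Simulate "chasing N": draw two equal groups from a true null, test
            # after every `step` added participants per group, and stop as soon
            # as p < 0.05 is observed.
            rng = np.random.default_rng(seed)
            false_positives = 0
            for _ in range(n_sim):
                a = list(rng.normal(size=n_start))
                b = list(rng.normal(size=n_start))
                while True:
                    if stats.ttest_ind(a, b).pvalue < 0.05:
                        false_positives += 1
                        break
                    if len(a) >= n_max:
                        break
                    a.extend(rng.normal(size=step))
                    b.extend(rng.normal(size=step))
            return false_positives / n_sim

        print(f"false positive rate with optional stopping: {optional_stopping_fpr():.3f}")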

  8. Human physiological benefits of viewing nature: EEG responses to exact and statistical fractal patterns.

    Science.gov (United States)

    Hagerhall, C M; Laike, T; Küller, M; Marcheschi, E; Boydston, C; Taylor, R P

    2015-01-01

    Psychological and physiological benefits of viewing nature have been extensively studied for some time. More recently it has been suggested that some of these positive effects can be explained by nature's fractal properties. Virtually all studies on human responses to fractals have used stimuli that represent the specific form of fractal geometry found in nature, i.e. statistical fractals, as opposed to fractal patterns which repeat exactly at different scales. This raises the question of whether human responses like preference and relaxation are being driven by fractal geometry in general or by the specific form of fractal geometry found in nature. In this study we consider both types of fractals (statistical and exact) and morph one type into the other. Based on the Koch curve, nine visual stimuli were produced in which curves of three different fractal dimensions evolve gradually from an exact to a statistical fractal. The patterns were shown for one minute each to thirty-five subjects while qEEG was continuously recorded. The results showed that the responses to statistical and exact fractals differ, and that the natural form of the fractal is important for inducing alpha responses, an indicator of a wakefully relaxed state and internalized attention.

  9. On a logical basis for division of responsibilities in statistical practice

    Science.gov (United States)

    Deming, W. Edwards

    1966-01-01

    The purpose of this paper is to explain principles for the division of responsibilities between the statistician and the people he works with, and the reasons why this division of responsibilities is important -- that is, how it improves the performance of both the statistician and the subject-matter expert. The aim is to find and illustrate principles of practice by which statisticians may make effective use of their knowledge of theory. The specialist in statistical methods may find himself applying the same basic theory in a dozen different fields in a week, rotating through the same projects the next week. Or, he may work day after day primarily in a single substantive field. Either way, he requires rules of practice. A statement of statistical reliability should present any information that might help the reader to form his own opinion concerning the validity of conclusions likely to be drawn from the results. The aim of a statistical report is to protect the client from seeing merely what he would like to see; to protect him from losses that could come from misuse of the results. A further aim is to forestall unwarranted claims of accuracy that the client's public might otherwise accept.

  10. Bayesian analysis applied to statistical uncertainties of extreme response distributions of offshore wind turbines

    NARCIS (Netherlands)

    Cheng, P.W.; Kuik, van G.A.M.; Bussel, van G.J.W.; Vrouwenvelder, A.C.W.M.

    2002-01-01

    Extreme response is an important design variable for wind turbines. The statistical uncertainties concerning the extreme response distribution are simulated here with data concerning physical characteristics obtained from measurements. The extreme responses are the flap moment at the blade root and

  11. Gender differences in binaural speech-evoked auditory brainstem response: are they clinically significant?

    Science.gov (United States)

    Jalaei, Bahram; Azmi, Mohd Hafiz Afifi Mohd; Zakaria, Mohd Normani

    2018-05-17

    Auditory evoked potentials elicited by binaural stimulation have good diagnostic value when testing subjects with central auditory deficits. The literature on the speech-evoked auditory brainstem response evoked by binaural stimulation is in fact limited. Gender disparities in speech-evoked auditory brainstem response results have been consistently noted, but the magnitude of the gender difference has not been reported. The present study aimed to compare the magnitude of gender difference in speech-evoked auditory brainstem response results between monaural and binaural stimulations. A total of 34 healthy Asian adults aged 19-30 years participated in this comparative study. Eighteen of them were females (mean age = 23.6 ± 2.3 years) and the remaining sixteen were males (mean age = 22.0 ± 2.3 years). For each subject, the speech-evoked auditory brainstem response was recorded with the synthesized syllable /da/ presented monaurally and binaurally. While latencies were not affected (p > 0.05), the binaural stimulation produced statistically higher speech-evoked auditory brainstem response amplitudes than the monaural stimulation (p < 0.05). With large effect sizes (> 0.80), substantive gender differences were noted in most speech-evoked auditory brainstem response peaks for both stimulation modes. The magnitude of gender difference between the two stimulation modes revealed some distinct patterns. Based on these clinically significant results, gender-specific normative data are highly recommended when using the speech-evoked auditory brainstem response for clinical and future applications. The preliminary normative data provided in the present study can serve as a reference for future studies on this test among Asian adults.

  12. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power-analysis in estimating...... the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement to NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis...

  13. Response to a spill of national significance

    International Nuclear Information System (INIS)

    Jensen, D.S.; Pond, R.; Johnson, M.H.

    1993-01-01

    Responding to a spill of national significance (SONS), such as the 1989 Exxon Valdez spill, requires an augmenting organization to support the local response organization. The US Coast Guard has developed SONS protocol to be better prepared to respond to these infrequent catastrophic spills. A flag-level Coast Guard officer assumes the role of national incident commander (NIC) and federal on-scene coordinator (OSC), and is supported by a national incident task force (NITF). The major role of the NITF is to develop a national response strategy, acquire response resources and allocate them efficiently, and effectively deal with many peripheral national issues. Unified command concepts have been incorporated into the NITF and its primary organizational elements. In addition, frequent training and exercising is essential to keep the SONS protocol's preparedness at an acceptable level

  14. Biological significance of sperm whale responses to sonar: Comparison with anti-predator responses

    NARCIS (Netherlands)

    Curé, C.; Isojunno, S.; Visser, F.; Wensveen, P.J.; Sivle, L.D.; Kvadsheim, P.H.; Lam, F.A.; Miller, P.J.O.

    2016-01-01

    A key issue when investigating effects of anthropogenic noise on cetacean behavior is to identify the biological significance of the responses. Predator presence can be considered a natural high-level disturbance stimulus to which prey animals have evolved adaptive response strategies to reduce

  15. Surprise responses in the human brain demonstrate statistical learning under high concurrent cognitive demand

    Science.gov (United States)

    Garrido, Marta Isabel; Teng, Chee Leong James; Taylor, Jeremy Alexander; Rowe, Elise Genevieve; Mattingley, Jason Brett

    2016-06-01

    The ability to learn about regularities in the environment and to make predictions about future events is fundamental for adaptive behaviour. We have previously shown that people can implicitly encode statistical regularities and detect violations therein, as reflected in neuronal responses to unpredictable events that carry a unique prediction error signature. In the real world, however, learning about regularities will often occur in the context of competing cognitive demands. Here we asked whether learning of statistical regularities is modulated by concurrent cognitive load. We compared electroencephalographic metrics associated with responses to pure-tone sounds with frequencies sampled from narrow or wide Gaussian distributions. We showed that outliers evoked a larger response than those in the centre of the stimulus distribution (i.e., an effect of surprise) and that this difference was greater for physically identical outliers in the narrow than in the broad distribution. These results demonstrate an early neurophysiological marker of the brain's ability to implicitly encode complex statistical structure in the environment. Moreover, we manipulated concurrent cognitive load by having participants perform a visual working memory task while listening to these streams of sounds. We again observed greater prediction error responses in the narrower distribution under both low and high cognitive load. Furthermore, there was no reliable reduction in prediction error magnitude under high relative to low cognitive load. Our findings suggest that statistical learning is not a capacity-limited process, and that it proceeds automatically even when cognitive resources are taxed by concurrent demands.

  16. Statistical significance estimation of a signal within the GooFit framework on GPUs

    Directory of Open Access Journals (Sweden)

    Cristella Leonardo

    2017-01-01

    In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks, with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performance with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service with a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks Theorem may or may not apply because its regularity conditions are not satisfied.
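
    The toy Monte Carlo logic can be illustrated without any fitting framework. The sketch below, in Python with NumPy, is a simplified stand-in for the likelihood-ratio statistic used in the paper: it estimates the global p-value of the largest bin excess over a known background across many pseudo-experiments; all counts are invented.

        import numpy as np

        def toy_mc_significance(observed, expected_bkg, n_toys=100_000, seed=0):
            # Estimate the global p-value of the largest bin excess over a known
            # background by generating background-only pseudo-experiments.
            rng = np.random.default_rng(seed)

            def z(counts):
                # largest per-bin Poisson z-score
                return np.max((counts - expected_bkg) / np.sqrt(expected_bkg))

            t_obs = z(np.asarray(observed, dtype=float))
            toys = rng.poisson(expected_bkg, size=(n_toys, len(expected_bkg)))
            t_toys = np.max((toys - expected_bkg) / np.sqrt(expected_bkg), axis=1)
            return t_obs, (np.sum(t_toys >= t_obs) + 1) / (n_toys + 1)

        expected = np.full(40, 50.0)    # invented flat background over 40 mass bins
        observed = expected.copy()
        observed[17] = 85               # invented peak-like excess in one bin
        print(toy_mc_significance(observed, expected))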

  17. Is statistical significance clinically important?--A guide to judge the clinical relevance of study findings

    NARCIS (Netherlands)

    Sierevelt, Inger N.; van Oldenrijk, Jakob; Poolman, Rudolf W.

    2007-01-01

    In this paper we describe several issues that influence the reporting of statistical significance in relation to clinical importance, since misinterpretation of p values is a common issue in orthopaedic literature. Orthopaedic research is tormented by the risks of false-positive (type I error) and

  18. Random cyclic stress-strain responses of a stainless steel pipe-weld metal. I. A statistical investigation

    International Nuclear Information System (INIS)

    Zhao, Y.X.; Wang, J.N.

    2000-01-01

    For pt. II see ibid., vol. 199, p. 315-26, 2000. This paper pays special attention to the significant scatter in the stress-strain responses of a nuclear engineering material, 1Cr18Ni9Ti stainless steel pipe-weld metal. A statistical investigation is made of the cyclic stress amplitudes of this material. Three considerations are applied: the overall fit, the consistency with fatigue physics, and the safety in practice of seven commonly used statistical distributions, namely Weibull (two- and three-parameter), normal, lognormal, extreme minimum value, extreme maximum value and exponential. Results reveal that the data can be described by all seven distributions, but the local behaviour of the distributions differs significantly. Any of the normal, lognormal, extreme minimum value and extreme maximum value distributions might be an appropriate assumed distribution for characterizing the data. The normal and extreme minimum value models are excellent. The other distributions do not fit the data, as they violate two or three of the mentioned considerations. (orig.)
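
    The distribution-comparison step can be sketched with SciPy's maximum-likelihood fitting plus a Kolmogorov-Smirnov check (Python; the stress amplitudes below are simulated stand-ins). Note that KS p-values are optimistic when the parameters are fitted from the same data, so this is a screening aid rather than a formal test.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical cyclic stress amplitudes (MPa) at a fixed life level
        amps = rng.normal(loc=210.0, scale=12.0, size=60)

        candidates = {
            "normal": stats.norm,
            "lognormal": stats.lognorm,
            "weibull_min": stats.weibull_min,
            "gumbel_l (extreme min)": stats.gumbel_l,
            "gumbel_r (extreme max)": stats.gumbel_r,
        }
        for name, dist in candidates.items():
            params = dist.fit(amps)                       # maximum-likelihood fit
            ks = stats.kstest(amps, dist.cdf, args=params)
            print(f"{name:24s} KS D={ks.statistic:.3f} p={ks.pvalue:.3f}")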

  19. Statistical significance of theoretical predictions: A new dimension in nuclear structure theories (I)

    International Nuclear Information System (INIS)

    DUDEK, J; SZPAK, B; FORNAL, B; PORQUET, M-G

    2011-01-01

    In this and the follow-up article we briefly discuss what we believe represents one of the most serious problems in contemporary nuclear structure: the question of statistical significance of parametrizations of nuclear microscopic Hamiltonians and the implied predictive power of the underlying theories. In the present Part I, we introduce the main lines of reasoning of the so-called Inverse Problem Theory, an important sub-field in the contemporary Applied Mathematics, here illustrated on the example of the Nuclear Mean-Field Approach.

  20. Confounding and Statistical Significance of Indirect Effects: Childhood Adversity, Education, Smoking, and Anxious and Depressive Symptomatology

    Directory of Open Access Journals (Sweden)

    Mashhood Ahmed Sheikh

    2017-08-01

    mediate the association between childhood adversity and ADS in adulthood. However, when education was excluded as a mediator-response confounding variable, the indirect effect of childhood adversity on ADS in adulthood was statistically significant (p < 0.05). This study shows that a careful inclusion of potential confounding variables is important when assessing mediation.

  1. Statistical Significance of the Contribution of Variables to the PCA Solution: An Alternative Permutation Strategy

    Science.gov (United States)

    Linting, Marielle; van Os, Bart Jan; Meulman, Jacqueline J.

    2011-01-01

    In this paper, the statistical significance of the contribution of variables to the principal components in principal components analysis (PCA) is assessed nonparametrically by the use of permutation tests. We compare a new strategy to a strategy used in previous research consisting of permuting the columns (variables) of a data matrix…
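
    A sketch of the one-variable-at-a-time permutation strategy, assuming Python with NumPy and scikit-learn: each variable's contribution is measured as the variance accounted for (VAF) by the retained components, and its null distribution is built by permuting that column alone. The VAF measure and all settings are illustrative choices, not necessarily those of the paper.

        import numpy as np
        from sklearn.decomposition import PCA

        def variable_vaf(X, n_components):
            # Variance of each variable accounted for by the retained components
            pca = PCA(n_components=n_components).fit(X)
            recon = pca.inverse_transform(pca.transform(X))
            return 1.0 - np.var(X - recon, axis=0) / np.var(X, axis=0)

        def permutation_pvalues(X, n_components=2, n_perm=200, seed=0):
            # Permute one column at a time, keeping the rest of the matrix intact,
            # and re-fit the PCA to build each variable's null distribution of VAF.
            rng = np.random.default_rng(seed)
            obs = variable_vaf(X, n_components)
            exceed = np.zeros(X.shape[1])
            for _ in range(n_perm):
                for j in range(X.shape[1]):
                    Xp = X.copy()
                    Xp[:, j] = rng.permutation(Xp[:, j])
                    exceed[j] += variable_vaf(Xp, n_components)[j] >= obs[j]
            return (exceed + 1) / (n_perm + 1)

        rng = np.random.default_rng(1)
        X = rng.normal(size=(50, 6))
        X[:, 0] = X[:, 1] + 0.1 * rng.normal(size=50)   # one genuinely contributing pair
        print(permutation_pvalues(X))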

  2. A Note on Comparing the Power of Test Statistics at Low Significance Levels.

    Science.gov (United States)

    Morris, Nathan; Elston, Robert

    2011-01-01

    It is an obvious fact that the power of a test statistic is dependent upon the significance (alpha) level at which the test is performed. It is perhaps a less obvious fact that the relative performance of two statistics in terms of power is also a function of the alpha level. Through numerous personal discussions, we have noted that even some competent statisticians have the mistaken intuition that relative power comparisons at traditional levels such as α = 0.05 will be roughly similar to relative power comparisons at very low levels, such as the level α = 5 × 10⁻⁸, which is commonly used in genome-wide association studies. In this brief note, we demonstrate that this notion is in fact quite wrong, especially with respect to comparing tests with differing degrees of freedom. In fact, at very low alpha levels the cost of additional degrees of freedom is often comparatively low. Thus we recommend that statisticians exercise caution when interpreting the results of power comparison studies which use alpha levels that will not be used in practice.
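
    The point is easy to verify numerically. Assuming Python with SciPy, the sketch below computes the power of chi-square tests with 1 versus 2 degrees of freedom at a traditional and a genome-wide alpha level for the same (hypothetical) noncentrality parameter; the relative cost of the extra degree of freedom shrinks as alpha decreases.

        from scipy import stats

        ncp = 30.0   # hypothetical noncentrality parameter (effect size x sample size)
        for alpha in (0.05, 5e-8):
            for df in (1, 2):
                crit = stats.chi2.isf(alpha, df)        # critical value at this alpha
                power = stats.ncx2.sf(crit, df, ncp)    # noncentral chi-square power
                print(f"alpha={alpha:g}, df={df}: power={power:.3f}")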

  3. Statistical Issues in Social Allocation Models of Intelligence: A Review and a Response

    Science.gov (United States)

    Light, Richard J.; Smith, Paul V.

    1971-01-01

    This is a response to Shockley (1971) which summarizes the original Light and Smith work; outlines Shockley's criticisms; responds to the statistical issues; and concludes with the methodological implications of the disagreement. (VW)

  4. ClusterSignificance: A bioconductor package facilitating statistical analysis of class cluster separations in dimensionality reduced data

    DEFF Research Database (Denmark)

    Serviss, Jason T.; Gådin, Jesper R.; Eriksson, Per

    2017-01-01

    , e.g. genes in a specific pathway, alone can separate samples into these established classes. Despite this, the evaluation of class separations is often subjective and performed via visualization. Here we present the ClusterSignificance package: a set of tools designed to assess the statistical significance of class separations downstream of dimensionality reduction algorithms. In addition, we demonstrate the design and utility of the ClusterSignificance package and utilize it to determine the importance of long non-coding RNA expression in the identity of multiple hematological malignancies.

  5. Does response distortion statistically affect the relations between self-report psychopathy measures and external criteria?

    NARCIS (Netherlands)

    Watts, A.L.; Lilienfeld, S.O.; Edens, J.F.; Douglas, K.S.; Skeem, J.L.; Verschuere, B.; LoPilato, A.C.

    2016-01-01

    Given that psychopathy is associated with narcissism, lack of insight, and pathological lying, the assumption that the validity of self-report psychopathy measures is compromised by response distortion has been widespread. We examined the statistical effects (moderation, suppression) of response

  6. Statistical significance versus clinical importance: trials on exercise therapy for chronic low back pain as example.

    NARCIS (Netherlands)

    van Tulder, M.W.; Malmivaara, A.; Hayden, J.; Koes, B.

    2007-01-01

    STUDY DESIGN. Critical appraisal of the literature. OBJECTIVES. The objective of this study was to assess whether the results of back pain trials are statistically significant and clinically important. SUMMARY OF BACKGROUND DATA. There seems to be a discrepancy between conclusions reported by authors and

  7. LOR-OSEM: statistical PET reconstruction from raw line-of-response histograms

    International Nuclear Information System (INIS)

    Kadrmas, Dan J

    2004-01-01

    Iterative statistical reconstruction methods are becoming the standard in positron emission tomography (PET). Conventional maximum-likelihood expectation-maximization (MLEM) and ordered-subsets (OSEM) algorithms act on data which have been pre-processed into corrected, evenly-spaced histograms; however, such pre-processing corrupts the Poisson statistics. Recent advances have incorporated attenuation, scatter and randoms compensation into the iterative reconstruction. The objective of this work was to incorporate the remaining pre-processing steps, including arc correction, to reconstruct directly from raw unevenly-spaced line-of-response (LOR) histograms. This exactly preserves Poisson statistics and full spatial information in a manner closely related to listmode ML, making full use of the ML statistical model. The LOR-OSEM algorithm was implemented using a rotation-based projector which maps directly to the unevenly-spaced LOR grid. Simulation and phantom experiments were performed to characterize resolution, contrast and noise properties for 2D PET. LOR-OSEM provided a beneficial noise-resolution tradeoff, outperforming AW-OSEM by about the same margin that AW-OSEM outperformed pre-corrected OSEM. The relationship between LOR-ML and listmode ML algorithms was explored, and implementation differences are discussed. LOR-OSEM is a viable alternative to AW-OSEM for histogram-based reconstruction with improved spatial resolution and noise properties
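
    For reference, the core MLEM update acting directly on raw Poisson counts can be written in a few lines. A toy sketch assuming Python with NumPy and an invented system matrix; real LOR-based reconstruction differs mainly in how the system matrix maps the unevenly spaced LOR grid.

        import numpy as np

        def mlem(A, y, n_iter=50):
            # Maximum-likelihood expectation-maximization for Poisson data y ~ A x:
            # the multiplicative update x <- x * A^T(y / Ax) / A^T(1) preserves
            # non-negativity and uses the raw (uncorrected) counts directly.
            x = np.ones(A.shape[1])
            sens = np.maximum(A.sum(axis=0), 1e-12)   # sensitivity image, A^T 1
            for _ in range(n_iter):
                proj = np.maximum(A @ x, 1e-12)       # forward projection, guarded
                x *= (A.T @ (y / proj)) / sens
            return x

        # Tiny invented system: 6 LOR bins viewing 4 image pixels
        rng = np.random.default_rng(0)
        A = rng.uniform(0.0, 1.0, size=(6, 4))
        x_true = np.array([5.0, 0.5, 3.0, 1.0])
        y = rng.poisson(A @ x_true)
        print(mlem(A, y))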

  8. Short separation regression improves statistical significance and better localizes the hemodynamic response obtained by near-infrared spectroscopy for tasks with differing autonomic responses.

    Science.gov (United States)

    Yücel, Meryem A; Selb, Juliette; Aasted, Christopher M; Petkov, Mike P; Becerra, Lino; Borsook, David; Boas, David A

    2015-07-01

    Autonomic nervous system response is known to be highly task-dependent. The sensitivity of near-infrared spectroscopy (NIRS) measurements to superficial layers, particularly to the scalp, makes it highly susceptible to systemic physiological changes. Thus, one critical step in NIRS data processing is to remove the contribution of superficial layers to the NIRS signal and to obtain the actual brain response. This can be achieved using short separation channels that are sensitive only to the hemodynamics in the scalp. We investigated the contribution of hemodynamic fluctuations due to autonomic nervous system activation during various tasks. Our results provide clear demonstrations of the critical role of using short separation channels in NIRS measurements to disentangle differing autonomic responses from the brain activation signal of interest.
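
    In its simplest form, the short separation correction reduces to regressing the short-channel (scalp-only) signal out of the long-channel signal. A minimal sketch assuming Python with NumPy and synthetic signals; real pipelines typically include the short-channel regressor within a GLM rather than applying it as a separate step.

        import numpy as np

        def short_separation_regression(long_ch, short_ch):
            # Least-squares fit of the short-separation signal (plus intercept)
            # estimates the superficial contribution; the residual approximates
            # the brain response.
            design = np.column_stack([short_ch, np.ones_like(short_ch)])
            beta, *_ = np.linalg.lstsq(design, long_ch, rcond=None)
            return long_ch - design @ beta

        rng = np.random.default_rng(0)
        t = np.linspace(0, 60, 600)
        scalp = 0.5 * np.sin(2 * np.pi * 0.1 * t) + 0.1 * rng.normal(size=t.size)  # systemic
        brain = 0.2 * np.exp(-0.5 * ((t - 30) / 3.0) ** 2)                         # evoked
        long_ch = brain + 0.8 * scalp + 0.05 * rng.normal(size=t.size)
        cleaned = short_separation_regression(long_ch, scalp)
        print(f"corr with brain before: {np.corrcoef(long_ch, brain)[0, 1]:.2f}, "
              f"after: {np.corrcoef(cleaned, brain)[0, 1]:.2f}")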

  9. Indirectional statistics and the significance of an asymmetry discovered by Birch

    International Nuclear Information System (INIS)

    Kendall, D.G.; Young, G.A.

    1984-01-01

    Birch (1982, Nature, 298, 451) reported an apparent 'statistical asymmetry of the Universe'. The authors here develop 'indirectional analysis' as a technique for investigating statistical effects of this kind and conclude that the reported effect (whatever may be its origin) is strongly supported by the observations. The estimated pole of the asymmetry is at RA 13h 30m, Dec. -37°. The angular error in its estimation is unlikely to exceed 20-30°. (author)

  10. Search for lesions in mammograms: Statistical characterization of observer responses

    International Nuclear Information System (INIS)

    Bochud, Francois O.; Abbey, Craig K.; Eckstein, Miguel P.

    2004-01-01

    We investigate human performance for visually detecting simulated microcalcifications and tumors embedded in x-ray mammograms as a function of signal contrast and the number of possible signal locations. Our results show that performance degradation with an increasing number of locations is well approximated by signal detection theory (SDT) with the usual Gaussian assumption. However, more stringent statistical analysis finds a departure from Gaussian assumptions for the detection of microcalcifications. We investigated whether these departures from the SDT Gaussian model could be accounted for by an increase in human internal response correlations arising from the image-pixel correlations present in 1/f spectrum backgrounds and/or observer internal response distributions that departed from the Gaussian assumption. Results were consistent with a departure from the Gaussian response distributions and suggested that the human observer internal responses were more compact than the Gaussian distribution. Finally, we conducted a free search experiment where the signal could appear anywhere within the image. Results show that human performance in a multiple-alternative forced-choice experiment can be used to predict performance in the clinically realistic free search experiment when the investigator takes into account the search area and the observers' inherent spatial imprecision to localize the targets

  11. Rapid Classification and Identification of Multiple Microorganisms with Accurate Statistical Significance via High-Resolution Tandem Mass Spectrometry.

    Science.gov (United States)

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y; Drake, Steven K; Gucek, Marjan; Sacks, David B; Yu, Yi-Kuo

    2018-06-05

    Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly disentangle candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 publicly available MS/MS data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html .

  12. Industrial commodity statistics yearbook 2001. Production statistics (1992-2001)

    International Nuclear Information System (INIS)

    2003-01-01

    This is the thirty-fifth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1992-2001 for about 200 countries and areas

  13. Industrial commodity statistics yearbook 2002. Production statistics (1993-2002)

    International Nuclear Information System (INIS)

    2004-01-01

    This is the thirty-sixth in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title 'The Growth of World Industry' and the next eight editions under the title 'Yearbook of Industrial Statistics'. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. The statistics refer to the ten-year period 1993-2002 for about 200 countries and areas

  14. Industrial commodity statistics yearbook 2000. Production statistics (1991-2000)

    International Nuclear Information System (INIS)

    2002-01-01

    This is the thirty-third in a series of annual compilations of statistics on world industry designed to meet both the general demand for information of this kind and the special requirements of the United Nations and related international bodies. Beginning with the 1992 edition, the title of the publication was changed to Industrial Commodity Statistics Yearbook as the result of a decision made by the United Nations Statistical Commission at its twenty-seventh session to discontinue, effective 1994, publication of the Industrial Statistics Yearbook, volume I, General Industrial Statistics by the Statistics Division of the United Nations. The United Nations Industrial Development Organization (UNIDO) has become responsible for the collection and dissemination of general industrial statistics while the Statistics Division of the United Nations continues to be responsible for industrial commodity production statistics. The previous title, Industrial Statistics Yearbook, volume II, Commodity Production Statistics, was introduced in the 1982 edition. The first seven editions in this series were published under the title The Growth of World Industry and the next eight editions under the title Yearbook of Industrial Statistics. This edition of the Yearbook contains annual quantity data on production of industrial commodities by country, geographical region, economic grouping and for the world. A standard list of about 530 commodities (about 590 statistical series) has been adopted for the publication. Most of the statistics refer to the ten-year period 1991-2000 for about 200 countries and areas

  15. Assessing Statistically Significant Heavy-Metal Concentrations in Abandoned Mine Areas via Hot Spot Analysis of Portable XRF Data.

    Science.gov (United States)

    Kim, Sung-Min; Choi, Yosoon

    2017-06-18

    To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1-4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
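
    A minimal sketch of the Getis-Ord Gi* statistic follows, assuming binary fixed-distance neighborhood weights; the coordinates and concentrations are synthetic stand-ins, not the Busan mine data.

        import numpy as np

        def getis_ord_gi_star(coords, x, dist):
            """z-scores of the Gi* statistic for point samples with values x."""
            n = len(x)
            xbar = x.mean()
            s = np.sqrt((x ** 2).mean() - xbar ** 2)
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
            w = (d <= dist).astype(float)   # binary weights; site i is its own neighbor (Gi*)
            wi = w.sum(axis=1)
            num = w @ x - xbar * wi
            den = s * np.sqrt((n * (w ** 2).sum(axis=1) - wi ** 2) / (n - 1))
            return num / den                # |z| > 1.96 ~ significant at the 5% level

        rng = np.random.default_rng(0)
        coords = rng.uniform(0, 100, size=(50, 2))
        values = rng.lognormal(mean=2.0, sigma=0.5, size=50)
        z = getis_ord_gi_star(coords, values, dist=15.0)
        print("hot spot sample indices:", np.where(z > 1.96)[0])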

  16. Assessing Statistically Significant Heavy-Metal Concentrations in Abandoned Mine Areas via Hot Spot Analysis of Portable XRF Data

    Directory of Open Access Journals (Sweden)

    Sung-Min Kim

    2017-06-01

    Full Text Available To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1–4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.

  17. On testing the significance of atmospheric response to smoke from the Kuwaiti oil fires using the Los Alamos general circulation model

    Energy Technology Data Exchange (ETDEWEB)

    Kao, C.J.; Glatzmaier, G.A.; Malone, R.C. [Los Alamos National Laboratory, Los Alamos, NM (United States)

    1994-07-01

    The response of the Los Alamos atmospheric general circulation model to the smoke from the Kuwaiti oil fires set in 1991 is examined. The model has an interactive soot transport module that uses a Lagrangian tracer particle scheme. The statistical significance of the results is evaluated using a methodology based on the classic Student's t test. Among various estimated smoke emission rates and associated visible absorption coefficients, the worst- and best-case scenarios are selected. In each of the scenarios, an ensemble of ten 30-day June simulations is conducted with the smoke and compared to the same 10 June simulations without the smoke. The results of the worst-case scenario show that a statistically significant wave train pattern propagates eastward-poleward downstream from the source. The signals compare favorably with the observed climate anomalies in summer 1991, albeit some possible El Nino-Southern Oscillation effects were involved in the actual climate. The results of the best-case (i.e., least-impact) scenario show that the significance is rather small but that its general pattern is quite similar to that in the worst-case scenario.
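
    For a single grid point or regional average, the significance test described above reduces to a two-sample Student's t test across ensemble members. The sketch below illustrates this with synthetic numbers; the temperature values are placeholders, not model output from the study.

        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(1)
        control = rng.normal(loc=288.0, scale=0.4, size=10)  # 10 no-smoke June means, K
        smoke = rng.normal(loc=287.4, scale=0.4, size=10)    # 10 smoke June means, cooled

        t_stat, p_value = ttest_ind(smoke, control)
        print(f"t = {t_stat:.2f}, p = {p_value:.3f}")        # p < 0.05 -> significant response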

  18. On testing the significance of atmospheric response to smoke from the Kuwaiti oil fires using the Los Alamos general circulation model

    Energy Technology Data Exchange (ETDEWEB)

    Chih-Yue Jim Kao; Glatzmaier, G.A.; Malone, R.C. [Los Alamos National Lab., NM (United States)

    1994-07-20

    The response of the Los Alamos atmospheric general circulation model to the smoke from the Kuwaiti oil fires set in 1991 is examined. The model has an interactive soot transport module that uses a Lagrangian tracer particle scheme. The statistical significance of the results is evaluated using a methodology based on the classic Student's t test. Among various estimated smoke emission rates and associated visible absorption coefficients, the worst- and best-case scenarios are selected. In each of the scenarios, an ensemble of ten 30-day June simulations is conducted with the smoke and compared to the same 10 June simulations without the smoke. The results of the worst-case scenario show that a statistically significant wave train pattern propagates eastward-poleward downstream from the source. The signals compare favorably with the observed climate anomalies in summer 1991, albeit some possible El Nino-Southern Oscillation effects were involved in the actual climate. The results of the best-case (i.e., least-impact) scenario show that the significance is rather small but that its general pattern is quite similar to that in the worst-case scenario. 24 refs., 5 figs.

  19. Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks

    Science.gov (United States)

    2016-04-26

    Abstract not available: the harvested text contains only report front matter. The report originates from Systems, Statistics & Management Science, University of Alabama, USA (Distribution A: approved for public release); its contents include a summary and an application to real networks, notably the 2012 FBS college football schedule network.

  20. Statistically significant changes in ground thermal conditions of alpine Austria during the last decade

    Science.gov (United States)

    Kellerer-Pirklbauer, Andreas

    2016-04-01

    Longer data series (e.g. >10 a) of ground temperatures in alpine regions are helpful to improve the understanding of the effects of present climate change on the distribution and thermal characteristics of seasonal frost- and permafrost-affected areas. Beginning in 2004 - and more intensively since 2006 - a permafrost and seasonal frost monitoring network was established in Central and Eastern Austria by the University of Graz. This network consists of c. 60 ground temperature (surface and near-surface) monitoring sites which are located at 1922-3002 m a.s.l., at latitude 46°55'-47°22'N and at longitude 12°44'-14°41'E. These data allow conclusions about general ground thermal conditions, potential permafrost occurrence, trends during the observation period, and regional patterns of change. Several different temperature-related parameters were calculated and analysed. At the annual scale, a region-wide statistically significant warming during the observation period was revealed by, for example, an increase in mean annual temperature values (mean, maximum) and a significant lowering of the surface frost number (F+). At the seasonal scale, no significant trend of any temperature-related parameter was revealed in most cases for spring (MAM) and autumn (SON). Winter (DJF) shows only a weak warming. In contrast, the summer (JJA) season reveals in general a significant warming, as confirmed by several different temperature-related parameters such as mean seasonal temperature, number of thawing degree days, number of freezing degree days, or days without night frost. On a monthly basis, August shows the statistically most robust and strongest warming of all months, although regional differences occur. Although the general ground temperature warming during the last decade is confirmed by the field data in the study region, complications in trend analyses arise from temperature anomalies (e.g. the warm winter 2006/07) or substantial variations in the winter

  1. Safety significance of ATR [Advanced Test Reactor] passive safety response attributes

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1989-01-01

    The Advanced Test Reactor (ATR) at the Idaho National Engineering Laboratory was designed with some passive safety response attributes which contribute to the safety posture of the facility. The three passive safety attributes being evaluated in the paper are: (1) In-core and in-vessel natural convection cooling, (2) a passive heat sink capability of the ATR primary coolant system (PCS) for the transfer of decay power from the uninsulated piping to the confinement, and (3) gravity feed of emergency coolant makeup. The safety significance of the ATR passive safety response attributes is that the reactor can passively respond for most transients, given a reactor scram, to provide adequate decay power removal and a significant time for operator action should the normal active heat removal systems and their backup systems both fail. The ATR Interim Level 1 Probabilistic Risk Assessment (PRA) model and results were used to evaluate the significance to ATR fuel damage frequency (or probability) of the above three passive response attributes. The results of the evaluation indicate that the first attribute is a major safety characteristic of the ATR. The second attribute has a noticeable but only minor safety significance. The third attribute has no significant influence on the ATR Level 1 PRA because of the diversity and redundancy of the ATR firewater injection system (emergency coolant system). 8 refs., 4 figs., 1 tab

  2. TIMP-1 Is Significantly Associated with Objective Response and Survival in Metastatic Colorectal Cancer Patients Receiving Combination of Irinotecan, 5-Fluorouracil, and Folinic Acid

    DEFF Research Database (Denmark)

    Sørensen, Nanna M; Byström, Per; Christensen, Ib Jarle

    2007-01-01

    that patients with low plasma TIMP-1 had higher probability of obtaining an objective response [odds ratio (OR), 3.5; 95% confidence interval (95% CI), 1.4-8.5, P = 0.007]. CEA treated as a continuous variable was also a statistically significant predictor of no response (OR, 1.3; 95% CI, 1.0-1.7, P = 0.02, area under......Purpose: Tissue inhibitor of metalloproteinase-1 (TIMP-1) is known to protect cells against apoptosis. We raised the hypothesis that elevated tumor tissue levels and thereby plasma levels of TIMP-1 would predict resistance to apoptosis-inducing chemotherapy. Experimental Design: Ninety patients...... including TTP instead of OS showed that only plasma TIMP-1 was retained in the model (HR, 1.5). CEA was not significantly associated with TTP or OS when TIMP-1 was included in the model. Conclusion: This study shows that plasma TIMP-1 levels are significantly and independently associated with objective response...

  3. Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science.

    Science.gov (United States)

    Veldkamp, Coosje L S; Nuijten, Michèle B; Dominguez-Alvarez, Linda; van Assen, Marcel A L M; Wicherts, Jelte M

    2014-01-01

    Statistical analysis is error prone. A best practice for researchers using statistics would therefore be to share data among co-authors, allowing double-checking of executed tasks just as co-pilots do in aviation. To document the extent to which this 'co-piloting' currently occurs in psychology, we surveyed the authors of 697 articles published in six top psychology journals and asked them whether they had collaborated on four aspects of analyzing data and reporting results, and whether the described data had been shared between the authors. We acquired responses for 49.6% of the articles and found that co-piloting on statistical analysis and reporting results is quite uncommon among psychologists, while data sharing among co-authors seems reasonably but not completely standard. We then used an automated procedure to study the prevalence of statistical reporting errors in the articles in our sample and examined the relationship between reporting errors and co-piloting. Overall, 63% of the articles contained at least one p-value that was inconsistent with the reported test statistic and the accompanying degrees of freedom, and 20% of the articles contained at least one p-value that was inconsistent to such a degree that it may have affected decisions about statistical significance. Overall, the probability that a given p-value was inconsistent was over 10%. Co-piloting was not found to be associated with reporting errors.
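
    The automated consistency check can be illustrated with a short statcheck-style sketch (an illustration of the idea, not the authors' exact procedure): recompute the p-value implied by a reported t statistic and its degrees of freedom, then compare it with the reported p-value.

        from scipy.stats import t as t_dist

        def check_t_test(t_value, df, reported_p, tol=0.005):
            """True if the reported two-sided p matches the recomputed one."""
            recomputed = 2 * t_dist.sf(abs(t_value), df)
            return abs(recomputed - reported_p) <= tol

        print(check_t_test(t_value=2.10, df=28, reported_p=0.045))  # consistent
        print(check_t_test(t_value=2.10, df=28, reported_p=0.020))  # inconsistent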

  4. Conducting tests for statistically significant differences using forest inventory data

    Science.gov (United States)

    James A. Westfall; Scott A. Pugh; John W. Coulston

    2013-01-01

    Many forest inventory and monitoring programs are based on a sample of ground plots from which estimates of forest resources are derived. In addition to evaluating metrics such as number of trees or amount of cubic wood volume, it is often desirable to make comparisons between resource attributes. To properly conduct statistical tests for differences, it is imperative...
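
    A hedged sketch of the kind of test implied here: comparing two design-based estimates via their standard errors with a two-sided, normal-approximation z test. The estimates and standard errors below are illustrative, not inventory data.

        from scipy.stats import norm

        def difference_test(est1, se1, est2, se2, alpha=0.05):
            """z test for the difference between two independent estimates."""
            z = (est1 - est2) / (se1 ** 2 + se2 ** 2) ** 0.5
            p = 2 * norm.sf(abs(z))
            return z, p, p < alpha

        # e.g., cubic volume per unit area from two inventory cycles
        z, p, significant = difference_test(est1=2450.0, se1=60.0, est2=2280.0, se2=75.0)
        print(f"z = {z:.2f}, p = {p:.4f}, significant: {significant}")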

  5. The significance of translation regulation in the stress response

    Science.gov (United States)

    2013-01-01

    Background The stress response in bacteria involves the multistage control of gene expression but is not entirely understood. To identify the translational response of bacteria in stress conditions and assess its contribution to the regulation of gene expression, the translational states of all mRNAs were compared under optimal growth condition and during nutrient (isoleucine) starvation. Results A genome-scale study of the translational response to nutritional limitation was performed in the model bacterium Lactococcus lactis. Two measures were used to assess the translational status of each individual mRNA: the fraction engaged in translation (ribosome occupancy) and ribosome density (number of ribosomes per 100 nucleotides). Under isoleucine starvation, half of the mRNAs considered were translationally down-regulated mainly due to decreased ribosome density. This pattern concerned genes involved in growth-related functions such as translation, transcription, and the metabolism of fatty acids, phospholipids and bases, contributing to the slowdown of growth. Only 4% of the mRNAs were translationally up-regulated, mostly related to prophagic expression in response to stress. The remaining genes exhibited antagonistic regulations of the two markers of translation. Ribosome occupancy increased significantly for all the genes involved in the biosynthesis of isoleucine, although their ribosome density had decreased. The results revealed complex translational regulation of this pathway, essential to cope with isoleucine starvation. To elucidate the regulation of global gene expression more generally, translational regulation was compared to transcriptional regulation under isoleucine starvation and to other post-transcriptional regulations related to mRNA degradation and mRNA dilution by growth. Translational regulation appeared to accentuate the effects of transcriptional changes for down-regulated growth-related functions under isoleucine starvation although m

  6. The distribution of P-values in medical research articles suggested selective reporting associated with statistical significance.

    Science.gov (United States)

    Perneger, Thomas V; Combescure, Christophe

    2017-07-01

    Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant (P < 0.001). Irregularities in the observed distribution included an excess of P-values equal to 1 and about twice as many P-values less than 0.05 as above 0.05; the latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results.
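
    The estimation of the relative frequencies of null and alternative hypotheses can be illustrated with a Storey-type estimator of the null proportion (a standard approach offered as an illustration, not the authors' exact method): p-values are uniform under the null, so the mass above a cutoff lambda is nearly all null.

        import numpy as np

        def estimate_pi0(p_values, lam=0.5):
            """pi0 ~ #{p > lam} / ((1 - lam) * m) under uniform null p-values."""
            m = len(p_values)
            return min(1.0, (p_values > lam).sum() / ((1 - lam) * m))

        rng = np.random.default_rng(2)
        null_p = rng.uniform(size=500)              # half the hypotheses truly null
        alt_p = rng.beta(a=0.3, b=4.0, size=500)    # alternatives concentrate near 0
        print(estimate_pi0(np.concatenate([null_p, alt_p])))  # close to 0.5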

  7. Using Cochran's Z Statistic to Test the Kernel-Smoothed Item Response Function Differences between Focal and Reference Groups

    Science.gov (United States)

    Zheng, Yinggan; Gierl, Mark J.; Cui, Ying

    2010-01-01

    This study combined the kernel smoothing procedure and a nonparametric differential item functioning statistic--Cochran's Z--to statistically test the difference between the kernel-smoothed item response functions for reference and focal groups. Simulation studies were conducted to investigate the Type I error and power of the proposed…

  8. Statistical Learning and Adaptive Decision-Making Underlie Human Response Time Variability in Inhibitory Control

    Directory of Open Access Journals (Sweden)

    Ning Ma

    2015-08-01

    Full Text Available Response time (RT) is an oft-reported behavioral measure in psychological and neurocognitive experiments, but the high level of observed trial-to-trial variability in this measure has often limited its usefulness. Here, we combine computational modeling and psychophysics to examine the hypothesis that fluctuations in this noisy measure reflect dynamic computations in human statistical learning and corresponding cognitive adjustments. We present data from the stop-signal task, in which subjects respond to a go stimulus on each trial, unless instructed not to by a subsequent, infrequently presented stop signal. We model across-trial learning of stop signal frequency, P(stop), and stop-signal onset time, SSD (stop-signal delay), with a Bayesian hidden Markov model, and within-trial decision-making with an optimal stochastic control model. The combined model predicts that RT should increase with both expected P(stop) and SSD. The human behavioral data (n=20) bear out this prediction, showing P(stop) and SSD both to be significant, independent predictors of RT, with P(stop) being a more prominent predictor in 75% of the subjects, and SSD being more prominent in the remaining 25%. The results demonstrate that humans indeed readily internalize environmental statistics and adjust their cognitive/behavioral strategy accordingly, and that subtle patterns in RT variability can serve as a valuable tool for validating models of statistical learning and decision-making. More broadly, the modeling tools presented in this work can be generalized to a large body of behavioral paradigms, in order to extract insights about cognitive and neural processing from apparently quite noisy behavioral measures. We also discuss how this behaviorally validated model can then be used to conduct model-based analysis of neural data, in order to help identify specific brain areas for representing and encoding key computational quantities in learning and decision-making.

  9. Statistical learning and adaptive decision-making underlie human response time variability in inhibitory control.

    Science.gov (United States)

    Ma, Ning; Yu, Angela J

    2015-01-01

    Response time (RT) is an oft-reported behavioral measure in psychological and neurocognitive experiments, but the high level of observed trial-to-trial variability in this measure has often limited its usefulness. Here, we combine computational modeling and psychophysics to examine the hypothesis that fluctuations in this noisy measure reflect dynamic computations in human statistical learning and corresponding cognitive adjustments. We present data from the stop-signal task (SST), in which subjects respond to a go stimulus on each trial, unless instructed not to by a subsequent, infrequently presented stop signal. We model across-trial learning of stop signal frequency, P(stop), and stop-signal onset time, SSD (stop-signal delay), with a Bayesian hidden Markov model, and within-trial decision-making with an optimal stochastic control model. The combined model predicts that RT should increase with both expected P(stop) and SSD. The human behavioral data (n = 20) bear out this prediction, showing P(stop) and SSD both to be significant, independent predictors of RT, with P(stop) being a more prominent predictor in 75% of the subjects, and SSD being more prominent in the remaining 25%. The results demonstrate that humans indeed readily internalize environmental statistics and adjust their cognitive/behavioral strategy accordingly, and that subtle patterns in RT variability can serve as a valuable tool for validating models of statistical learning and decision-making. More broadly, the modeling tools presented in this work can be generalized to a large body of behavioral paradigms, in order to extract insights about cognitive and neural processing from apparently quite noisy behavioral measures. We also discuss how this behaviorally validated model can then be used to conduct model-based analysis of neural data, in order to help identify specific brain areas for representing and encoding key computational quantities in learning and decision-making.
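
    As a simplified stand-in for the paper's Bayesian hidden Markov model, the sketch below tracks a trial-by-trial Beta-Bernoulli estimate of the expected stop-signal frequency P(stop), the quantity the combined model predicts should modulate go RT.

        import numpy as np

        def track_p_stop(stop_trials, a=1.0, b=1.0):
            """Sequential Beta-Bernoulli estimate of P(stop) before each trial."""
            estimates = []
            for is_stop in stop_trials:
                estimates.append(a / (a + b))        # prior mean before this trial
                a, b = a + is_stop, b + (1 - is_stop)
            return np.array(estimates)

        rng = np.random.default_rng(3)
        trials = (rng.random(200) < 0.25).astype(int)  # 25% stop trials
        p_stop = track_p_stop(trials)
        print(p_stop[:5], p_stop[-1])                  # estimate converges toward 0.25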

  10. The new statistics: why and how.

    Science.gov (United States)

    Cumming, Geoff

    2014-01-01

    We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research integrity. These include prespecification of studies whenever possible, avoidance of selection and other inappropriate data-analytic practices, complete reporting, and encouragement of replication. Second, in response to renewed recognition of the severe flaws of null-hypothesis significance testing (NHST), we need to shift from reliance on NHST to estimation and other preferred techniques. The new statistics refers to recommended practices, including estimation based on effect sizes, confidence intervals, and meta-analysis. The techniques are not new, but adopting them widely would be new for many researchers, as well as highly beneficial. This article explains why the new statistics are important and offers guidance for their use. It describes an eight-step new-statistics strategy for research with integrity, which starts with formulation of research questions in estimation terms, has no place for NHST, and is aimed at building a cumulative quantitative discipline.
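
    In the estimation spirit advocated above, the sketch below reports a mean difference with its confidence interval rather than a bare significance verdict; the pooled-degrees-of-freedom approximation is an assumption made for brevity (Welch's correction would be the more careful choice).

        import numpy as np
        from scipy.stats import t

        def mean_diff_ci(a, b, conf=0.95):
            """Mean difference between two groups with a t-based confidence interval."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            diff = a.mean() - b.mean()
            se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
            df = len(a) + len(b) - 2                 # simple approximation; see lead-in
            half = t.ppf((1 + conf) / 2, df) * se
            return diff, (diff - half, diff + half)

        group1 = [5.1, 4.8, 6.0, 5.5, 5.9]
        group2 = [4.2, 4.9, 4.4, 5.0, 4.6]
        print(mean_diff_ci(group1, group2))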

  11. Statistical mechanics of nonequilibrium liquids

    CERN Document Server

    Evans, Denis J; Craig, D P; McWeeny, R

    1990-01-01

    Statistical Mechanics of Nonequilibrium Liquids deals with theoretical rheology. The book discusses nonlinear response of systems and outlines the statistical mechanical theory. In discussing the framework of nonequilibrium statistical mechanics, the book explains the derivation of a nonequilibrium analogue of the Gibbsian basis for equilibrium statistical mechanics. The book reviews the linear irreversible thermodynamics, the Liouville equation, and the Irving-Kirkwood procedure. The text then explains the Green-Kubo relations used in linear transport coefficients, the linear response theory,

  12. The Global Statistical Response of the Outer Radiation Belt During Geomagnetic Storms

    Science.gov (United States)

    Murphy, K. R.; Watt, C. E. J.; Mann, I. R.; Jonathan Rae, I.; Sibeck, D. G.; Boyd, A. J.; Forsyth, C. F.; Turner, D. L.; Claudepierre, S. G.; Baker, D. N.; Spence, H. E.; Reeves, G. D.; Blake, J. B.; Fennell, J.

    2018-05-01

    Using the total radiation belt electron content calculated from Van Allen Probe phase space density, the time-dependent and global response of the outer radiation belt during storms is statistically studied. Using phase space density reduces the impacts of adiabatic changes in the main phase, allowing a separation of adiabatic and nonadiabatic effects and revealing a clear modality and repeatable sequence of events in storm time radiation belt electron dynamics. This sequence exhibits an important first adiabatic invariant (μ)-dependent behavior in the seed (150 MeV/G), relativistic (1,000 MeV/G), and ultrarelativistic (4,000 MeV/G) populations. The outer radiation belt statistically shows an initial phase dominated by loss followed by a second phase of rapid acceleration, while the seed population shows little loss and immediate enhancement. The time sequence of the transition to the acceleration is also strongly μ dependent and occurs at low μ first, appearing to be repeatable from storm to storm.

  13. Stationary and Transient Response Statistics

    DEFF Research Database (Denmark)

    Madsen, Peter Hauge; Krenk, Steen

    1982-01-01

    The covariance functions for the transient response of a linear MDOF-system due to stationary time limited excitation with an arbitrary frequency content are related directly to the covariance functions of the stationary response. For rational spectral density functions closed form expressions for

  14. Statistical uncertainty of response characteristic of building-appendage system for spectrum-compatible artificial earthquake motion

    International Nuclear Information System (INIS)

    Kurosaki, A.; Kozeki, M.

    1981-01-01

    Spectrum-compatible artificial time histories of ground motions are frequently used for the seismic design of nuclear power plant structures and components. However, statistical uncertainty in the responses of building structures and of mechanical components mounted on the building (building-appendage systems) is anticipated, since an artificial time history is no more than one sample from a population of such time histories that match a specified design response spectrum. This uncertainty may spoil the reliability of the seismic design, and therefore the extent of the uncertainty of the response characteristic is a matter of great concern. In this paper, the above-mentioned uncertainty of the dynamic response characteristics of the building-appendage system to the spectrum-compatible artificial earthquake is investigated. (orig./RW)
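
    The source of the uncertainty can be made concrete with a small sketch: many artificial time histories share one target amplitude spectrum but differ in random phase, so any peak response measure scatters from sample to sample. The spectrum shape below is hypothetical, not a design spectrum.

        import numpy as np

        rng = np.random.default_rng(4)
        n, n_samples = 2048, 50
        freqs = np.fft.rfftfreq(n, d=0.01)
        amplitude = np.exp(-((freqs - 2.0) ** 2))   # hypothetical target spectrum shape

        peaks = []
        for _ in range(n_samples):
            # same amplitude spectrum, independent random phases per realization
            phase = rng.uniform(0, 2 * np.pi, size=len(freqs))
            motion = np.fft.irfft(amplitude * np.exp(1j * phase), n=n)
            peaks.append(np.abs(motion).max())

        print(f"peak mean = {np.mean(peaks):.4f}, cov = {np.std(peaks) / np.mean(peaks):.2%}")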

  15. Dietary change: what are the responses and roles of significant others?

    Science.gov (United States)

    Paisley, Judy; Beanlands, Heather; Goldman, Joanne; Evers, Susan; Chappell, Janet

    2008-01-01

    This study examined the impact of one person's dietary change on the experiences of a significant other with whom they regularly shared meals. Qualitative constant comparison approach using semistructured interviews. Community-based. Forty-two participants were recruited using a stratified purposive sampling strategy. Verbatim transcripts were analyzed using NUD*IST, version 4.0 software (Qualitative Solutions and Research, Melbourne, Australia, 1997) and manual coding. Most dietary changers had modified their diets in response to a disease diagnosis (eg, cardiovascular disease, diabetes, cancer, hypoglycemia, acquired immunodeficiency syndrome (AIDS), ulcer, allergies). Others had changed their diets for personal reasons (eg, weight loss, vegetarian diets). The dietary changes included dietary fat reduction, conversion to vegetarian or vegan diets, restriction of total kilocalorie intake, and elimination or reduction of specific food items. Significant others described a range of emotional responses to the dietary change, including cooperation, encouragement, skepticism, and anger. Significant others' descriptions of the roles that they played in the dietary change were positive (enabling), neutral (neither enabling nor inhibiting), or negative (inhibiting). Most significant others played positive roles; few played neutral or negative roles. Understanding dietary change from the perspective of significant others can enable nutrition professionals to develop strategies to promote dietary modifications as a shared activity.

  16. Business Statistics Education: Content and Software in Undergraduate Business Statistics Courses.

    Science.gov (United States)

    Tabatabai, Manouchehr; Gamble, Ralph

    1997-01-01

    Survey responses from 204 of 500 business schools identified the topics covered most often in business statistics I and II courses. The most popular software at both levels was Minitab. Most schools required both statistics I and II. (SK)

  17. Sigsearch: a new term for post hoc unplanned search for statistically significant relationships with the intent to create publishable findings.

    Science.gov (United States)

    Hashim, Muhammad Jawad

    2010-09-01

    Post-hoc secondary data analysis with no prespecified hypotheses has been discouraged by textbook authors and journal editors alike. Unfortunately no single term describes this phenomenon succinctly. I would like to coin the term "sigsearch" to define this practice and bring it within the teaching lexicon of statistics courses. Sigsearch would include any unplanned, post-hoc search for statistical significance using multiple comparisons of subgroups. It would also include data analysis with outcomes other than the prespecified primary outcome measure of a study as well as secondary data analyses of earlier research.
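
    The hazard that sigsearch poses is easy to quantify: under the global null, the chance of at least one nominally significant result grows quickly with the number of unplanned comparisons.

        # P(at least one p < 0.05 among k independent null comparisons) = 1 - 0.95^k
        for k in (1, 5, 10, 20, 50):
            print(f"{k:2d} comparisons -> P(false positive) = {1 - 0.95 ** k:.2f}")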

  18. Significant linkage to airway responsiveness on chromosome 12q24 in families of children with asthma in Costa Rica.

    Science.gov (United States)

    Celedón, Juan C; Soto-Quiros, Manuel E; Avila, Lydiana; Lake, Stephen L; Liang, Catherine; Fournier, Eduardo; Spesny, Mitzi; Hersh, Craig P; Sylvia, Jody S; Hudson, Thomas J; Verner, Andrei; Klanderman, Barbara J; Freimer, Nelson B; Silverman, Edwin K; Weiss, Scott T

    2007-01-01

    Although asthma is a major public health problem in certain Hispanic subgroups in the United States and Latin America, only one genome scan for asthma has included Hispanic individuals. Because of small sample size, that study had limited statistical power to detect linkage to asthma and its intermediate phenotypes in Hispanic participants. To identify genomic regions that contain susceptibility genes for asthma and airway responsiveness in an isolated Hispanic population living in the Central Valley of Costa Rica, we conducted a genome-wide linkage analysis of asthma (n = 638) and airway responsiveness (n = 488) in members of eight large pedigrees of Costa Rican children with asthma. Nonparametric multipoint linkage analysis of asthma was conducted by the NPL-PAIR allele-sharing statistic, and variance component models were used for the multipoint linkage analysis of airway responsiveness as a quantitative phenotype. All linkage analyses were repeated after exclusion of the phenotypic data of former and current smokers. Chromosome 12q showed some evidence of linkage to asthma, particularly in nonsmokers, and significant evidence of linkage to airway responsiveness on 12q24 in Costa Ricans.

  19. Response Burden in Official Business Surveys: Measurement and Reduction Practices of National Statistical Institutes

    Directory of Open Access Journals (Sweden)

    Bavdaž Mojca

    2015-12-01

    Full Text Available Response burden in business surveys has long been a concern for National Statistical Institutes (NSIs) for three types of reasons: political reasons, because response burden is part of the total administrative burden governments impose on businesses; methodological reasons, because an excessive response burden may reduce data quality and increase data-collection costs; and strategic reasons, because it affects relations between the NSIs and the business community. This article investigates NSI practices concerning business response burden measurement and reduction actions based on a survey of 41 NSIs from 39 countries. Most NSIs monitor at least some burden aspects and have implemented some actions to reduce burden, but large differences exist between NSIs’ methodologies for burden measurement and actions taken to reduce burden. Future research should find ways to deal with methodological differences in burden conceptualization, operationalization, and measurement, and provide insights into the effectiveness and efficiency of burden-reduction actions.

  20. Intelligent system for statistically significant expertise knowledge on the basis of the model of self-organizing nonequilibrium dissipative system

    Directory of Open Access Journals (Sweden)

    E. A. Tatokchin

    2017-01-01

    Full Text Available The development of modern educational technologies, driven by the broad introduction of computer-based testing and of distance forms of education, makes a revision of the methods for examining students necessary. The work shows the need for a transition to mathematical criteria for the examination of knowledge, criteria that are free of subjectivity. The article reviews the problems arising in this task and proposes approaches to their solution. The greatest attention is paid to the problem of objectively transforming the expert's rating estimates onto the scale of student assessment. The general conclusion of this discussion is that the solution lies in the creation of specialized intelligent systems. The basis for constructing such an intelligent system is a mathematical model of a self-organizing nonequilibrium dissipative system, which here is a group of students. The article assumes that the dissipative character of the system is maintained by the constant influx of new test items from the expert, and the nonequilibrium character by the individual psychological characteristics of the students in the group. As a result, the system must self-organize into stable patterns. These patterns make it possible, relying on large amounts of data, to obtain a statistically significant assessment of each student. To justify the proposed approach, the work presents a statistical analysis of the results of testing a large sample of students (>90). The conclusions from this statistical analysis allowed the development of an intelligent system for statistically significant examination of student performance. It is based on a data clustering algorithm (k-means) over three key parameters. It is shown that this approach makes it possible to obtain a dynamic and objective expert evaluation.
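
    The clustering step named in the abstract can be sketched as follows; the three per-student features and the number of clusters are assumptions for illustration, not the authors' exact variables.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(5)
        # hypothetical columns: fraction correct, mean answer time, consistency
        students = rng.random((90, 3))

        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(students)
        print(np.bincount(labels))  # sizes of the stable student groups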

  1. The Importance of Integrating Clinical Relevance and Statistical Significance in the Assessment of Quality of Care--Illustrated Using the Swedish Stroke Register.

    Directory of Open Access Journals (Sweden)

    Anita Lindmark

    Full Text Available When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical

  2. The Importance of Integrating Clinical Relevance and Statistical Significance in the Assessment of Quality of Care--Illustrated Using the Swedish Stroke Register.

    Science.gov (United States)

    Lindmark, Anita; van Rompaye, Bart; Goetghebeur, Els; Glader, Eva-Lotta; Eriksson, Marie

    2016-01-01

    When profiling hospital performance, quality indicators are commonly evaluated through hospital-specific adjusted means with confidence intervals. When identifying deviations from a norm, large hospitals can have statistically significant results even for clinically irrelevant deviations while important deviations in small hospitals can remain undiscovered. We have used data from the Swedish Stroke Register (Riksstroke) to illustrate the properties of a benchmarking method that integrates considerations of both clinical relevance and level of statistical significance. The performance measure used was case-mix adjusted risk of death or dependency in activities of daily living within 3 months after stroke. A hospital was labeled as having outlying performance if its case-mix adjusted risk exceeded a benchmark value with a specified statistical confidence level. The benchmark was expressed relative to the population risk and should reflect the clinically relevant deviation that is to be detected. A simulation study based on Riksstroke patient data from 2008-2009 was performed to investigate the effect of the choice of the statistical confidence level and benchmark value on the diagnostic properties of the method. Simulations were based on 18,309 patients in 76 hospitals. The widely used setting, comparing 95% confidence intervals to the national average, resulted in low sensitivity (0.252) and high specificity (0.991). There were large variations in sensitivity and specificity for different requirements of statistical confidence. Lowering statistical confidence improved sensitivity with a relatively smaller loss of specificity. Variations due to different benchmark values were smaller, especially for sensitivity. This allows the choice of a clinically relevant benchmark to be driven by clinical factors without major concerns about sufficiently reliable evidence. The study emphasizes the importance of combining clinical relevance and level of statistical confidence when
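
    A hedged sketch of the benchmarking rule described above (simplified to an unadjusted risk, whereas the paper uses case-mix adjusted risks): flag a hospital as an outlier only when, at the chosen confidence level, its risk exceeds a clinically motivated benchmark expressed relative to the population risk.

        from scipy.stats import binomtest

        def flag_outlier(events, n, population_risk,
                         relative_benchmark=1.25, conf=0.95):
            """True if the hospital's risk significantly exceeds the benchmark."""
            benchmark = relative_benchmark * population_risk
            test = binomtest(events, n, p=benchmark, alternative="greater")
            return test.pvalue < 1 - conf

        # hypothetical hospitals against a 30% population risk
        print(flag_outlier(events=100, n=200, population_risk=0.30))  # 50% -> True
        print(flag_outlier(events=66, n=200, population_risk=0.30))   # 33% -> False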

  3. Estimates of statistical significance for comparison of individual positions in multiple sequence alignments

    Directory of Open Access Journals (Sweden)

    Sadreyev Ruslan I

    2004-08-01

    Full Text Available Abstract Background Profile-based analysis of multiple sequence alignments (MSA) allows for accurate comparison of protein families. Here, we address the problems of detecting statistically confident dissimilarities between (1) an MSA position and a set of predicted residue frequencies, and (2) two MSA positions. These problems are important for (i) evaluation and optimization of methods predicting residue occurrence at protein positions; (ii) detection of potentially misaligned regions in automatically produced alignments and their further refinement; and (iii) detection of sites that determine functional or structural specificity in two related families. Results For problems (1) and (2), we propose analytical estimates of P-value and apply them to the detection of significant positional dissimilarities in various experimental situations. (a) We compare structure-based predictions of residue propensities at a protein position to the actual residue frequencies in the MSA of homologs. (b) We evaluate our method by the ability to detect erroneous position matches produced by an automatic sequence aligner. (c) We compare MSA positions that correspond to residues aligned by automatic structure aligners. (d) We compare MSA positions that are aligned by high-quality manual superposition of structures. Detected dissimilarities reveal shortcomings of the automatic methods for residue frequency prediction and alignment construction. For the high-quality structural alignments, the dissimilarities suggest sites of potential functional or structural importance. Conclusion The proposed computational method is of significant potential value for the analysis of protein families.

  4. Soil-structure interaction effects on containment fragilities and floor response spectra statistics

    International Nuclear Information System (INIS)

    Pires, J.; Reich, M.; Chokshi, N.C.

    1987-01-01

    The probability-based method for the reliability evaluation of nuclear structures developed at Brookhaven National Laboratory (BNL) is extended to include soil-structure interaction effects. A reinforced concrete containment is analyzed in order to investigate the soil-structure interaction effects on: structural fragilities; floor response spectra statistics and acceleration response correlations. To include the effect of soil flexibility on the reliability assessment the following two step approach is used. In the first step, the lumped parameter method for soil-structure interaction analysis is used together with a stick model representation of the structure in order to obtain the motions of the foundation plate. These motions, which include both translations and rotations of the foundation plate, are expressed in terms of the power-spectral density of the free-field ground excitation and the transfer function of the total acceleration response of the foundation. The second step involves a detailed finite element model of the structure subjected to the interaction motions computed from step one. Making use of the structural model and interaction motion the reliability analysis method yields the limit state probabilities and fragility data for the structure

  5. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher’s statistic is clearly more sensitive to small p-values than to large ones, and a single small p-value may overrule the other p-values
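
    For reference, the baseline method the abstract critiques is straightforward to implement; note in the example how one very small p-value dominates the combined statistic even when the remaining p-values are unremarkable.

        import numpy as np
        from scipy.stats import chi2

        def fisher_combined(p_values):
            """Fisher's method: X^2 = -2 * sum(ln p_i) ~ chi2 with 2k df under the global null."""
            p = np.asarray(p_values, dtype=float)
            stat = -2 * np.log(p).sum()
            return stat, chi2.sf(stat, df=2 * len(p))

        # one tiny p-value among otherwise large ones still yields overall significance
        stat, p_combined = fisher_combined([0.001, 0.6, 0.7, 0.8])
        print(f"X^2 = {stat:.2f}, combined p = {p_combined:.3f}")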

  6. School Violence: Data & Statistics

    Science.gov (United States)

  7. Kappa statistic for the clustered dichotomous responses from physicians and patients

    Science.gov (United States)

    Kang, Chaeryon; Qaqish, Bahjat; Monaco, Jane; Sheridan, Stacey L.; Cai, Jianwen

    2013-01-01

    The bootstrap method for estimating the standard error of the kappa statistic in the presence of clustered data is evaluated. Such data arise, for example, in assessing agreement between physicians and their patients regarding their understanding of the physician-patient interaction and discussions. We propose a computationally efficient procedure for generating correlated dichotomous responses for physicians and assigned patients for simulation studies. The simulation results demonstrate that, with at least a moderately large number of clusters, the proposed bootstrap method produces a better estimate of the standard error and better coverage performance than the asymptotic standard error estimate that ignores dependence among patients within physicians. An example of an application to a coronary heart disease prevention study is presented. PMID:23533082
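
    A hedged sketch of the cluster bootstrap evaluated above: physicians (clusters) are resampled with replacement and kappa is recomputed on the pooled physician-patient ratings; the data generation at the end is a placeholder, not the authors' simulation design.

        import numpy as np

        def cohen_kappa(r1, r2):
            """Kappa for two dichotomous ratings of the same subjects."""
            r1, r2 = np.asarray(r1), np.asarray(r2)
            po = (r1 == r2).mean()
            p1, p2 = r1.mean(), r2.mean()
            pe = p1 * p2 + (1 - p1) * (1 - p2)   # chance agreement, dichotomous case
            return (po - pe) / (1 - pe)

        def cluster_bootstrap_kappa_se(clusters, n_boot=2000, seed=0):
            """clusters: list of (physician_ratings, patient_ratings) per physician."""
            rng = np.random.default_rng(seed)
            kappas = []
            for _ in range(n_boot):
                idx = rng.integers(0, len(clusters), size=len(clusters))
                r1 = np.concatenate([clusters[i][0] for i in idx])
                r2 = np.concatenate([clusters[i][1] for i in idx])
                kappas.append(cohen_kappa(r1, r2))
            return np.std(kappas, ddof=1)

        rng = np.random.default_rng(1)
        clusters = [(rng.integers(0, 2, 8), rng.integers(0, 2, 8)) for _ in range(30)]
        print(cluster_bootstrap_kappa_se(clusters))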

  8. Response to traumatic brain injury neurorehabilitation through an artificial intelligence and statistics hybrid knowledge discovery from databases methodology.

    Science.gov (United States)

    Gibert, Karina; García-Rudolph, Alejandro; García-Molina, Alberto; Roig-Rovira, Teresa; Bernabeu, Montse; Tormos, José María

    2008-01-01

    Develop a classificatory tool to identify different populations of patients with Traumatic Brain Injury based on the characteristics of deficit and response to treatment. A KDD framework was used in which, first, descriptive statistics of every variable were computed, together with data cleaning and selection of relevant variables. Data were then mined using a generalization of Clustering based on rules (CIBR), a hybrid AI and Statistics technique which combines inductive learning (AI) and clustering (Statistics). A prior Knowledge Base (KB) is considered to properly bias the clustering; semantic constraints implied by the KB hold in the final clusters, guaranteeing the interpretability of the results. A generalization (Exogenous Clustering based on rules, ECIBR) is presented, allowing the KB to be defined in terms of variables which will not be considered in the clustering process itself, to gain more flexibility. Several tools, such as the class panel graph, are introduced in the methodology to assist final interpretation. A set of 5 classes was recommended by the system, and interpretation permitted the labeling of profiles. From the medical point of view, the composition of the classes corresponds well with different patterns of increasing level of response to rehabilitation treatments. All the patients who were initially assessable form a single group. Severely impaired patients are subdivided into four profiles with clearly distinct response patterns. Particularly interesting is the partial-response profile, in which patients could not improve executive functions. Meaningful classes were obtained and, from a semantic point of view, the results were substantially improved relative to classical clustering, supporting our opinion that hybrid AI and Statistics techniques are more powerful for KDD than pure ones.

  9. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition.

    Science.gov (United States)

    Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin

    2014-06-05

    In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological
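
    The core idea, replacing a fixed-exposure-time readout with a growth-rate-based summary, can be sketched as below; this is an illustration of the principle under an exponential-growth assumption, not the authors' model.

        import numpy as np

        def growth_rate(times, counts):
            """Slope of log cell count vs. time under an exponential growth model."""
            slope, _ = np.polyfit(times, np.log(counts), deg=1)
            return slope

        times = np.array([0.0, 24.0, 48.0, 72.0])            # hours (illustrative)
        control_counts = 1000 * np.exp(0.03 * times)
        treated_counts = 1000 * np.exp(0.01 * times)

        # ratio of growth rates does not depend on the chosen assay length
        relative_rate = growth_rate(times, treated_counts) / growth_rate(times, control_counts)
        print(f"relative growth rate = {relative_rate:.2f}")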

  10. A Fuzzy Modeling Approach for Replicated Response Measures Based on Fuzzification of Replications with Descriptive Statistics and Golden Ratio

    Directory of Open Access Journals (Sweden)

    Özlem TÜRKŞEN

    2018-03-01

    Some of the experimental designs can be composed of replicated response measures in which the replications cannot be identified exactly and may have uncertainty different than randomness. Then, the classical regression analysis may not be proper to model the designed data because of the violation of probabilistic modeling assumptions. In this case, fuzzy regression analysis can be used as a modeling tool. In this study, the replicated response values are newly formed into fuzzy numbers by using descriptive statistics of replications and the golden ratio. The main aim of the study is to obtain the most suitable fuzzy model for replicated response measures through fuzzification of the replicated values by taking into account the data structure of the replications in a statistical framework. Here, the response and unknown model coefficients are considered as triangular type-1 fuzzy numbers (TT1FNs) whereas the inputs are crisp. Predicted fuzzy models are obtained according to the proposed fuzzification rules by using the Fuzzy Least Squares (FLS) approach. The performances of the predicted fuzzy models are compared by using the Root Mean Squared Error (RMSE) criterion. A data set from the literature, called the wheel cover component data set, is used to illustrate the performance of the proposed approach and the obtained results are discussed. The calculation results show that the combined formulation of the descriptive statistics and the golden ratio is the most preferable fuzzification rule according to the well-known decision making method, called TOPSIS, for the data set.
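
    The paper's exact fuzzification rules are not reproduced here; the sketch below shows one plausible way to turn a set of replications into a triangular type-1 fuzzy number using descriptive statistics and a golden-ratio-scaled spread, purely for illustration:

```python
# Illustrative sketch only: one way to build a triangular type-1 fuzzy number
# from replicated responses using descriptive statistics and the golden ratio.
# The paper's actual fuzzification rules may differ.
import numpy as np

PHI = (1 + 5 ** 0.5) / 2  # golden ratio

def triangular_fuzzy(replications):
    reps = np.asarray(replications, dtype=float)
    center = np.median(reps)             # peak of the triangle
    spread = np.std(reps, ddof=1) / PHI  # spread shrunk by the golden ratio
    return (center - spread, center, center + spread)  # (left, peak, right)

print(triangular_fuzzy([10.2, 9.8, 10.5, 10.1]))
```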

  11. Aging Affects Adaptation to Sound-Level Statistics in Human Auditory Cortex.

    Science.gov (United States)

    Herrmann, Björn; Maess, Burkhard; Johnsrude, Ingrid S

    2018-02-21

    Optimal perception requires efficient and adaptive neural processing of sensory input. Neurons in nonhuman mammals adapt to the statistical properties of acoustic feature distributions such that they become sensitive to sounds that are most likely to occur in the environment. However, whether human auditory responses adapt to stimulus statistical distributions and how aging affects adaptation to stimulus statistics is unknown. We used MEG to study how exposure to different distributions of sound levels affects adaptation in auditory cortex of younger (mean: 25 years; n = 19) and older (mean: 64 years; n = 20) adults (male and female). Participants passively listened to two sound-level distributions with different modes (either 15 or 45 dB sensation level). In a control block with long interstimulus intervals, allowing neural populations to recover from adaptation, neural response magnitudes were similar between younger and older adults. Critically, both age groups demonstrated adaptation to sound-level stimulus statistics, but adaptation was altered for older compared with younger people: in the older group, neural responses continued to be sensitive to sound level under conditions in which responses were fully adapted in the younger group. The lack of full adaptation to the statistics of the sensory environment may be a physiological mechanism underlying the known difficulty that older adults have with filtering out irrelevant sensory information. SIGNIFICANCE STATEMENT Behavior requires efficient processing of acoustic stimulation. Animal work suggests that neurons accomplish efficient processing by adjusting their response sensitivity depending on statistical properties of the acoustic environment. Little is known about the extent to which this adaptation to stimulus statistics generalizes to humans, particularly to older humans. We used MEG to investigate how aging influences adaptation to sound-level statistics. Listeners were presented with sounds drawn from

  12. Statistics available for site studies in registers and surveys at Statistics Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Haldorson, Marie [Statistics Sweden, Oerebro (Sweden)

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when

  13. Statistics available for site studies in registers and surveys at Statistics Sweden

    International Nuclear Information System (INIS)

    Haldorson, Marie

    2000-03-01

    Statistics Sweden (SCB) has produced this report on behalf of the Swedish Nuclear Fuel and Waste Management Company (SKB), as part of the data to be used by SKB in conducting studies of potential sites. The report goes over the statistics obtainable from SCB in the form of registers and surveys. The purpose is to identify the variables that are available, and to specify their degree of geographical detail and the time series that are available. Chapter two describes the statistical registers available at SCB, registers that share the common feature that they provide total coverage, i.e. they contain all 'objects' of a given type, such as population, economic activities (e.g. from statements of employees' earnings provided to the tax authorities), vehicles, enterprises or real estate. SCB has exclusive responsibility for seven of the nine registers included in the chapter, while two registers are ordered by public authorities with statistical responsibilities. Chapter three describes statistical surveys that are conducted by SCB, with the exception of the National Forest Inventory, which is carried out by the Swedish University of Agricultural Sciences. In terms of geographical breakdown, the degree of detail in the surveys varies, but all provide some possibility of reporting data at lower than the national level. The level involved may be county, municipality, yield district, coastal district or category of enterprises, e.g. aquaculture. Six of the nine surveys included in the chapter have been ordered by public authorities with statistical responsibilities, while SCB has exclusive responsibility for the others. Chapter four presents an overview of the statistics on land use maintained by SCB. This chapter does not follow the same pattern as chapters two and three but instead gives a more general account. The conclusion can be drawn that there are good prospects that SKB can make use of SCB's data as background information or in other ways when undertaking future

  15. Statistically significant dependence of the Xaa-Pro peptide bond conformation on secondary structure and amino acid sequence

    Directory of Open Access Journals (Sweden)

    Leitner Dietmar

    2005-04-01

    Background: A reliable prediction of the Xaa-Pro peptide bond conformation would be a useful tool for many protein structure calculation methods. We have analyzed the Protein Data Bank and show that the combined use of sequential and structural information has a predictive value for the assessment of the cis versus trans peptide bond conformation of Xaa-Pro within proteins. For the analysis of the data sets, different statistical methods, such as the calculation of the Chou-Fasman parameters and occurrence matrices, were used. Furthermore, we analyzed the relationship between the relative solvent accessibility and the relative occurrence of prolines in the cis and in the trans conformation. Results: One of the main results of the statistical investigations is the ranking of the secondary structure and sequence information with respect to the prediction of the Xaa-Pro peptide bond conformation. We observed a significant impact of secondary structure information on the occurrence of the Xaa-Pro peptide bond conformation, while the sequence information of amino acids neighboring proline is of little predictive value for the conformation of this bond. Conclusion: In this work, we present an extensive analysis of the occurrence of the cis and trans proline conformation in proteins. Based on the data set, we derived patterns and rules for a possible prediction of the proline conformation. Upon adoption of the Chou-Fasman parameters, we are able to derive statistically relevant correlations between the secondary structure of amino acid fragments and the Xaa-Pro peptide bond conformation.
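
    As a concrete illustration of a Chou-Fasman-style propensity parameter (with invented counts, not the paper's data), the ratio of a residue's frequency among cis Xaa-Pro bonds to its overall frequency can be computed as follows:

```python
# Minimal sketch of a Chou-Fasman-style propensity for the cis Xaa-Pro bond:
# frequency of residue Xaa among cis prolines divided by its frequency among
# all Xaa-Pro bonds. The counts below are made up for illustration.
from collections import Counter

def cis_propensities(all_xaa, cis_xaa):
    total_all, total_cis = len(all_xaa), len(cis_xaa)
    f_all, f_cis = Counter(all_xaa), Counter(cis_xaa)
    return {aa: (f_cis[aa] / total_cis) / (f_all[aa] / total_all)
            for aa in f_all if f_cis[aa] > 0}

all_xaa = list("GGGAAALLLSSSPPPFFF")   # residue preceding Pro, all bonds
cis_xaa = list("GGSAFP")               # residue preceding Pro, cis bonds only
print(cis_propensities(all_xaa, cis_xaa))  # values > 1 mean enriched in cis
```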

  16. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Estimating parameters with confidence intervals and testing statistical hypotheses are both used in statistical analysis to draw conclusions about a population from an extracted sample. The case study presented in this paper aims to highlight the importance of the sample size used in a study and how it is reflected in the results obtained from confidence intervals and hypothesis testing. Whereas statistical hypothesis testing gives only a "yes" or "no" answer to certain questions, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and frames findings as "marginally significant" or "almost significant" (p very close to 0.05).
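
    A hedged sketch of the contrast drawn above, using SciPy on toy data: the test reduces to a single p-value, while the confidence interval exposes how much uncertainty a small sample carries:

```python
# Sketch contrasting a bare hypothesis test with a confidence interval on the
# same small sample: the p-value says "yes/no", the wide CI shows uncertainty.
import numpy as np
from scipy import stats

sample = np.array([12.1, 14.3, 9.8, 13.5, 11.9, 15.2])  # toy data
mu0 = 10.0

t_stat, p_value = stats.ttest_1samp(sample, mu0)
mean = sample.mean()
sem = stats.sem(sample)
ci = stats.t.interval(0.95, len(sample) - 1, loc=mean, scale=sem)

print(f"p = {p_value:.3f}")                     # the yes/no answer
print(f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")   # the width shows uncertainty
```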

  17. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    Science.gov (United States)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    Urban catchments are typically characterised by a more flashy nature of the hydrological response compared to natural catchments. Predicting flow changes associated with urbanisation is not straightforward, as they are influenced by interactions between impervious cover, basin size, drainage connectivity and stormwater management infrastructure. In this study, we present an alternative approach to statistical analysis of hydrological response variability and basin flashiness, based on the distribution of inter-amount times. We analyse inter-amount time distributions of high-resolution streamflow time series for 17 (semi-)urbanised basins in North Carolina, USA, ranging from 13 to 238 km2 in size. We show that in the inter-amount-time framework, sampling frequency is tuned to the local variability of the flow pattern, resulting in a different representation and weighting of high and low flow periods in the statistical distribution. This leads to important differences in the way the distribution quantiles, mean, coefficient of variation and skewness vary across scales and results in lower mean intermittency and improved scaling. Moreover, we show that inter-amount-time distributions can be used to detect regulation effects on flow patterns, identify critical sampling scales and characterise flashiness of hydrological response. The possibility to use both the classical approach and the inter-amount-time framework to identify minimum observable scales and analyse flow data opens up interesting areas for future research.
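
    A toy implementation of the inter-amount-time idea (illustrative only; the study's processing pipeline is not reproduced here): accumulate discharge and record the time taken to collect each fixed amount:

```python
# Sketch of inter-amount times: instead of sampling flow at fixed time steps,
# record the time needed to accumulate each fixed amount of discharge.
import numpy as np

def inter_amount_times(t, flow, amount):
    """Times between successive crossings of multiples of `amount` in
    cumulative discharge (trapezoidal integration, linear interpolation)."""
    cum = np.concatenate(([0.0], np.cumsum(0.5 * (flow[1:] + flow[:-1])
                                           * np.diff(t))))
    levels = np.arange(amount, cum[-1], amount)
    crossing_times = np.interp(levels, cum, t)   # cum is non-decreasing
    return np.diff(np.concatenate(([t[0]], crossing_times)))

t = np.linspace(0, 100, 1001)                    # hours
flow = 5 + 4 * np.sin(2 * np.pi * t / 24)        # toy diurnal streamflow
iats = inter_amount_times(t, flow, amount=50.0)
print(iats.mean(), iats.std() / iats.mean())     # mean IAT and its CV
```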

  18. Response times of operators in a control room

    International Nuclear Information System (INIS)

    Platz, O.; Rasmussen, J.; Skanborg, P.Z.

    1982-12-01

    A statistical analysis was made of operator response times recorded in the control room of a research reactor during the years 1972-1974. A homogeneity test revealed that the data consist of a mixture of populations. A small but statistically significant difference is found between day and night response times. Lognormal distributions are found to provide the best fit to both the day and the night response times. (author)
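
    A small sketch of this kind of analysis on invented data: a lognormal fit is a normal fit on the logarithms, so a day/night comparison can be run on the log-transformed response times:

```python
# Sketch: fit lognormal distributions to day and night operator response
# times and test whether the log-scale means differ. Data are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
day = rng.lognormal(mean=1.0, sigma=0.5, size=200)    # seconds
night = rng.lognormal(mean=1.15, sigma=0.5, size=200)

# A lognormal fit is a normal fit on the logs
mu_d, sd_d = np.log(day).mean(), np.log(day).std(ddof=1)
mu_n, sd_n = np.log(night).mean(), np.log(night).std(ddof=1)
print(f"day: mu={mu_d:.2f} sd={sd_d:.2f}; night: mu={mu_n:.2f} sd={sd_n:.2f}")

# A small but possibly significant difference shows up in a t-test on logs
t, p = stats.ttest_ind(np.log(day), np.log(night))
print(f"t = {t:.2f}, p = {p:.4f}")
```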

  19. Statistical learning from a regression perspective

    CERN Document Server

    Berk, Richard A

    2016-01-01

    This textbook considers statistical learning applications when interest centers on the conditional distribution of the response variable, given a set of predictors, and when it is important to characterize how the predictors are related to the response. As a first approximation, this can be seen as an extension of nonparametric regression. This fully revised new edition includes important developments over the past 8 years. Consistent with modern data analytics, it emphasizes that a proper statistical learning data analysis derives from sound data collection, intelligent data management, appropriate statistical procedures, and an accessible interpretation of results. A continued emphasis on the implications for practice runs through the text. Among the statistical learning procedures examined are bagging, random forests, boosting, support vector machines and neural networks. Response variables may be quantitative or categorical. As in the first edition, a unifying theme is supervised learning that can be trea...

  20. Statistics Graduate Teaching Assistants' Beliefs, Practices and Preparation for Teaching Introductory Statistics

    Science.gov (United States)

    Justice, Nicola; Zieffler, Andrew; Garfield, Joan

    2017-01-01

    Graduate teaching assistants (GTAs) are responsible for the instruction of many statistics courses offered at the university level, yet little is known about these students' preparation for teaching, their beliefs about how introductory statistics should be taught, or the pedagogical practices of the courses they teach. An online survey to examine…

  1. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and the Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB
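
    An illustrative sketch of the two-step scheme (fit a response surface to precomputed code runs, then Monte Carlo over it); the quadratic form, input distributions, and numbers below are placeholders, not values from the report:

```python
# Sketch: fit a quadratic response surface to precomputed "code runs"
# (standing in for LYNX output), then propagate input uncertainties through
# it by Monte Carlo. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)

# Pretend code runs: DNBR evaluated on a grid of two normalized inputs
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
X = np.column_stack([x1.ravel(), x2.ravel()])
dnbr = 2.0 - 0.3 * X[:, 0] + 0.2 * X[:, 1] - 0.1 * X[:, 0] * X[:, 1]

def design(X):
    """Quadratic response-surface design matrix."""
    a, b = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), a, b, a * b, a**2, b**2])

coef, *_ = np.linalg.lstsq(design(X), dnbr, rcond=None)

# Monte Carlo propagation of input uncertainties through the surface
inputs = rng.normal(0.0, 0.3, size=(100_000, 2))
mc_dnbr = design(inputs) @ coef
print("P(DNBR < 1.7) =", np.mean(mc_dnbr < 1.7))
```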

  2. Generate floor response spectra, Part 2: Response spectra for equipment-structure resonance

    International Nuclear Information System (INIS)

    Li, Bo; Jiang, Wei; Xie, Wei-Chau; Pandey, Mahesh D.

    2015-01-01

    Highlights: • The concept of tRS is proposed to deal with tuning of equipment and structures. • Established statistical approaches for estimating tRS corresponding to given GRS. • Derived a new modal combination rule from the theory of random vibration. • Developed efficient and accurate direct method for generating floor response spectra. - Abstract: When generating floor response spectra (FRS) using the direct spectra-to-spectra method developed in the companion paper, probability distribution of t-response spectrum (tRS), which deals with equipment-structure resonance or tuning, corresponding to a specified ground response spectrum (GRS) is required. In this paper, simulation results using a large number of horizontal and vertical ground motions are employed to establish statistical relationships between tRS and GRS. It is observed that the influence of site conditions on horizontal statistical relationships is negligible, whereas the effect of site conditions on vertical statistical relationships cannot be ignored. Considering the influence of site conditions, horizontal statistical relationship suitable for all site conditions and vertical statistical relationships suitable for hard sites and soft sites, respectively, are established. The horizontal and vertical statistical relationships are suitable to estimate tRS for design spectra in USNRC R.G. 1.60 and NUREG/CR-0098, Uniform Hazard Spectra (UHS) in Western North America (WNA), or any GRS falling inside the valid coverage of the statistical relationship. For UHS with significant high frequency spectral accelerations, such as UHS in Central and Eastern North America (CENA), an amplification ratio method is proposed to estimate tRS. Numerical examples demonstrate that the statistical relationships and the amplification ratio method are acceptable to estimate tRS for given GRS and to generate FRS using the direct method in different practical situations.

  3. Generate floor response spectra, Part 2: Response spectra for equipment-structure resonance

    Energy Technology Data Exchange (ETDEWEB)

    Li, Bo, E-mail: b68li@uwaterloo.ca; Jiang, Wei, E-mail: w46jiang@uwaterloo.ca; Xie, Wei-Chau, E-mail: xie@uwaterloo.ca; Pandey, Mahesh D., E-mail: mdpandey@uwaterloo.ca

    2015-11-15

    Highlights: • The concept of tRS is proposed to deal with tuning of equipment and structures. • Established statistical approaches for estimating tRS corresponding to given GRS. • Derived a new modal combination rule from the theory of random vibration. • Developed efficient and accurate direct method for generating floor response spectra. - Abstract: When generating floor response spectra (FRS) using the direct spectra-to-spectra method developed in the companion paper, probability distribution of t-response spectrum (tRS), which deals with equipment-structure resonance or tuning, corresponding to a specified ground response spectrum (GRS) is required. In this paper, simulation results using a large number of horizontal and vertical ground motions are employed to establish statistical relationships between tRS and GRS. It is observed that the influence of site conditions on horizontal statistical relationships is negligible, whereas the effect of site conditions on vertical statistical relationships cannot be ignored. Considering the influence of site conditions, horizontal statistical relationship suitable for all site conditions and vertical statistical relationships suitable for hard sites and soft sites, respectively, are established. The horizontal and vertical statistical relationships are suitable to estimate tRS for design spectra in USNRC R.G. 1.60 and NUREG/CR-0098, Uniform Hazard Spectra (UHS) in Western North America (WNA), or any GRS falling inside the valid coverage of the statistical relationship. For UHS with significant high frequency spectral accelerations, such as UHS in Central and Eastern North America (CENA), an amplification ratio method is proposed to estimate tRS. Numerical examples demonstrate that the statistical relationships and the amplification ratio method are acceptable to estimate tRS for given GRS and to generate FRS using the direct method in different practical situations.

  4. Statistical learning of music- and language-like sequences and tolerance for spectral shifts.

    Science.gov (United States)

    Daikoku, Tatsuya; Yatomi, Yutaka; Yumoto, Masato

    2015-02-01

    In our previous study (Daikoku, Yatomi, & Yumoto, 2014), we demonstrated that the N1m response could be a marker for the statistical learning process of pitch sequence, in which each tone was ordered by a Markov stochastic model. The aim of the present study was to investigate how the statistical learning of music- and language-like auditory sequences is reflected in the N1m responses based on the assumption that both language and music share domain generality. By using vowel sounds generated by a formant synthesizer, we devised music- and language-like auditory sequences in which higher-ordered transitional rules were embedded according to a Markov stochastic model by controlling fundamental (F0) and/or formant frequencies (F1-F2). In each sequence, F0 and/or F1-F2 were spectrally shifted in the last one-third of the tone sequence. Neuromagnetic responses to the tone sequences were recorded from 14 right-handed normal volunteers. In the music- and language-like sequences with pitch change, the N1m responses to the tones that appeared with higher transitional probability were significantly decreased compared with the responses to the tones that appeared with lower transitional probability within the first two-thirds of each sequence. Moreover, the amplitude difference was even retained within the last one-third of the sequence after the spectral shifts. However, in the language-like sequence without pitch change, no significant difference could be detected. The pitch change may facilitate the statistical learning in language and music. Statistically acquired knowledge may be appropriated to process altered auditory sequences with spectral shifts. The relative processing of spectral sequences may be a domain-general auditory mechanism that is innate to humans. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Determining coding CpG islands by identifying regions significant for pattern statistics on Markov chains.

    Science.gov (United States)

    Singer, Meromit; Engström, Alexander; Schönhuth, Alexander; Pachter, Lior

    2011-09-23

    Recent experimental and computational work confirms that CpGs can be unmethylated inside coding exons, thereby showing that codons may be subjected to both genomic and epigenomic constraint. It is therefore of interest to identify coding CpG islands (CCGIs) that are regions inside exons enriched for CpGs. The difficulty in identifying such islands is that coding exons exhibit sequence biases determined by codon usage and constraints that must be taken into account. We present a method for finding CCGIs that showcases a novel approach we have developed for identifying regions of interest that are significant (with respect to a Markov chain) for the counts of any pattern. Our method begins with the exact computation of tail probabilities for the number of CpGs in all regions contained in coding exons, and then applies a greedy algorithm for selecting islands from among the regions. We show that the greedy algorithm provably optimizes a biologically motivated criterion for selecting islands while controlling the false discovery rate. We applied this approach to the human genome (hg18) and annotated CpG islands in coding exons. The statistical criterion we apply to evaluating islands reduces the number of false positives in existing annotations, while our approach to defining islands reveals significant numbers of undiscovered CCGIs in coding exons. Many of these appear to be examples of functional epigenetic specialization in coding exons.
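
    The exact Markov-chain tail computation is beyond a short example, but the two-stage structure (score candidate regions, then greedily select non-overlapping islands) can be sketched with a simplified binomial score; everything below is illustrative:

```python
# Toy sketch of the two-stage idea: score windows of an exon for CpG
# enrichment, then greedily pick non-overlapping high-scoring windows.
# The paper uses exact Markov-chain tail probabilities; the binomial
# score here is a simplification for illustration only.
from math import comb

def cpg_count(seq):
    return sum(1 for i in range(len(seq) - 1) if seq[i:i+2] == "CG")

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def greedy_islands(seq, win=20, p0=0.05, alpha=0.01):
    scored = []
    for start in range(len(seq) - win + 1):
        k = cpg_count(seq[start:start + win])
        pval = binom_tail(k, win - 1, p0)
        if pval < alpha:
            scored.append((pval, start))
    scored.sort()                        # best (smallest p-value) first
    chosen = []
    for pval, start in scored:           # greedily keep non-overlapping wins
        if all(abs(start - s) >= win for _, s in chosen):
            chosen.append((pval, start))
    return chosen

seq = "ATCGCGTACGCGCGATCGTACGATCGCGCGCGTATATATATGCGC" * 2
print(greedy_islands(seq))
```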

  6. TU-C-12A-02: Development of a Multiparametric Statistical Response Map for Quantitative Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Bosca, R [The University of Texas Graduate School of Biomedical Sciences, Houston, TX (United States); The University of Texas MD Anderson Cancer Center, Houston, TX (United States); Mahajan, A; Brown, PD; Stafford, RJ [The University of Texas MD Anderson Cancer Center, Houston, TX (United States); Johnson, VE [Texas A' M University, College Station, TX (United States); Dong, L [Scripps Proton Therapy Center, San Diego, CA (United States); Jackson, EF [University of Wisconsin, Madison, WI (United States)

    2014-06-15

    Purpose: Quantitative imaging biomarkers (QIB) are becoming increasingly utilized in early phase clinical trials as a means of non-invasively assessing treatment response and associated response heterogeneity. The aim of this study was to develop a flexible multiparametric statistical framework to predict voxel-by-voxel response of several potential MRI QIBs. Methods: Patients with histologically proven glioblastomas (n=11) were treated with chemoradiation (with/without bevacizumab) and underwent one baseline and two mid-treatment (3–4 wks) MRIs. Dynamic contrast-enhanced (3D FSPGR, 6.3 sec/phase, 0.1 mmol/kg Gd-DTPA), dynamic susceptibility contrast (2D GRE-EPI, 1.5 sec/phase, 0.2 mmol/kg Gd-DTPA), and diffusion tensor (2D DW-EPI, b=0, 1200 s/mm², 27 directions) imaging acquisitions were obtained during each study. Mid-treatment and pre-treatment images were rigidly aligned, and regions of partial response (PR), stable disease (SD), and progressive disease (PD) were contoured in consensus by two experienced radiation oncologists. Voxels in these categories were used to train ordinal (PR < SD < PD) regression models, providing a statistical framework for incorporating longitudinal multiparametric

  7. TU-C-12A-02: Development of a Multiparametric Statistical Response Map for Quantitative Imaging

    International Nuclear Information System (INIS)

    Bosca, R; Mahajan, A; Brown, PD; Stafford, RJ; Johnson, VE; Dong, L; Jackson, EF

    2014-01-01

    Purpose: Quantitative imaging biomarkers (QIB) are becoming increasingly utilized in early phase clinical trials as a means of non-invasively assessing treatment response and associated response heterogeneity. The aim of this study was to develop a flexible multiparametric statistical framework to predict voxel-by-voxel response of several potential MRI QIBs. Methods: Patients with histologically proven glioblastomas (n=11) were treated with chemoradiation (with/without bevacizumab) and underwent one baseline and two mid-treatment (3–4 wks) MRIs. Dynamic contrast-enhanced (3D FSPGR, 6.3 sec/phase, 0.1 mmol/kg Gd-DTPA), dynamic susceptibility contrast (2D GRE-EPI, 1.5 sec/phase, 0.2 mmol/kg Gd-DTPA), and diffusion tensor (2D DW-EPI, b=0, 1200 s/mm², 27 directions) imaging acquisitions were obtained during each study. Mid-treatment and pre-treatment images were rigidly aligned, and regions of partial response (PR), stable disease (SD), and progressive disease (PD) were contoured in consensus by two experienced radiation oncologists. Voxels in these categories were used to train ordinal (PR < SD < PD) regression models, providing a statistical framework for incorporating longitudinal multiparametric

  8. How Often Is the Misfit of Item Response Theory Models Practically Significant?

    Science.gov (United States)

    Sinharay, Sandip; Haberman, Shelby J.

    2014-01-01

    Standard 3.9 of the Standards for Educational and Psychological Testing (1999) demands evidence of model fit when item response theory (IRT) models are employed to data from tests. Hambleton and Han (2005) and Sinharay (2005) recommended the assessment of practical significance of misfit of IRT models, but…

  9. High order statistical signatures from source-driven measurements of subcritical fissile systems

    International Nuclear Information System (INIS)

    Mattingly, J.K.

    1998-01-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements
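
    One familiar low-order member of this family of signatures is the Feynman-Y excess variance of gate counts; the sketch below (with synthetic count data) shows how correlated counts separate from a pure Poisson source, the same mechanism that higher-order moments exploit with greater sensitivity:

```python
# Sketch of one low-order counting statistic: the Feynman-Y excess variance
# of detector counts per time gate. For a Poisson (uncorrelated) source
# Y ~ 0; correlated fission chains push Y above 0. Higher-order moments
# extend this idea with greater reactivity sensitivity. Data are synthetic.
import numpy as np

def feynman_y(counts_per_gate):
    c = np.asarray(counts_per_gate, dtype=float)
    return c.var(ddof=1) / c.mean() - 1.0

rng = np.random.default_rng(2)
poisson_gates = rng.poisson(10.0, size=5000)        # inherent-source-like
clustered = rng.poisson(10.0, 5000) + 3 * rng.poisson(0.5, 5000)  # toy chains
print(feynman_y(poisson_gates), feynman_y(clustered))
```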

  10. Habituation of the cold shock response may include a significant perceptual component.

    Science.gov (United States)

    Barwood, Martin J; Corbett, Jo; Wagstaff, Christopher R D

    2014-02-01

    Accidental immersion in cold water is a risk factor for many occupations. Habituation to cold-water immersion (CWI) is one practical means of reducing the cold shock response (CSR) on immersion. We investigated whether repeated thermoneutral water immersion (TWI) induced a perceptual habituation (i.e., could lessen perceived threat and anxiety) and consequently reduce the CSR on subsequent CWI. There were 12 subjects who completed seven 7-min head-out immersions. Immersions one and seven were CWIs [15.0 (0.1) degrees C], and immersions two to six were TWI [34.9 (0.10) degrees C]. Anxiety (20-cm visual analogue scale) and the cardiorespiratory responses [heart rate (f(C)), respiratory frequency (f(R)), tidal volume (V(T)), and minute ventilation (V(E))] to immersion were measured throughout. Data were compared within subject between conditions using ANOVA to an alpha level of 0.05. Acute anxiety was significantly reduced after repeated exposure to the immersion scenario (i.e., TWI): CWI-1: 6.3 (4.4) cm; and CWI-2: 4.5 (4.0) cm [condition mean (SD)]. These differences did not influence the peak of the CSR. The f(C), f(R), and V(E) responses were similar between CWI-1 and CWI-2. The V(T) response was significantly lower in CWI-2; mean (SD) across the immersion: CWI-1 1.27 (0.17) vs. CWI-2 1.11 (0.21) L. Repeated TWI lessened the anxiety associated with CWI (perceptual habituation). This had a negligible effect on the primary components of the CSR, but did lower V(T), which may reduce the volume of any aspirated water in an emergency situation. Reducing the threat appraisal of an environmental stressor may be a useful byproduct of survival training, thereby minimizing psychophysiological strain.

  11. Statistical analyses in the study of solar wind-magnetosphere coupling

    International Nuclear Information System (INIS)

    Baker, D.N.

    1985-01-01

    Statistical analyses provide a valuable method for establishing initially the existence (or lack of existence) of a relationship between diverse data sets. Statistical methods also allow one to make quantitative assessments of the strengths of observed relationships. This paper reviews the essential techniques and underlying statistical bases for the use of correlative methods in solar wind-magnetosphere coupling studies. Techniques of visual correlation and time-lagged linear cross-correlation analysis are emphasized, but methods of multiple regression, superposed epoch analysis, and linear prediction filtering are also described briefly. The long history of correlation analysis in the area of solar wind-magnetosphere coupling is reviewed with the assessments organized according to data averaging time scales (minutes to years). It is concluded that these statistical methods can be very useful first steps, but that case studies and various advanced analysis methods should be employed to understand fully the average response of the magnetosphere to solar wind input. It is clear that many workers have not always recognized underlying assumptions of statistical methods and thus the significance of correlation results can be in doubt. Long-term averages (greater than or equal to 1 hour) can reveal gross relationships, but only when dealing with high-resolution data (1 to 10 min) can one reach conclusions pertinent to magnetospheric response time scales and substorm onset mechanisms

  12. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    Science.gov (United States)

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from studies based on next-generation sequencing technology. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
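
    A simplified sketch of a local trend score with a permutation p-value (no time delay, invented data); the paper's contribution is precisely to replace the slow permutation step below with an analytic approximation, which is not reproduced here:

```python
# Simplified sketch of local trend analysis: turn each series into a
# sequence of up/down changes, then find the best run of agreement (a
# maximum-subarray problem) and assess it by permutation. Illustrative only.
import numpy as np

def trend_series(x):
    return np.sign(np.diff(x))

def local_trend_score(x, y):
    prod = trend_series(x) * trend_series(y)
    best = run = 0.0
    for v in prod:                       # Kadane's maximum-subarray scan
        run = max(0.0, run + v)
        best = max(best, run)
    return best / len(prod)              # normalize by series length

rng = np.random.default_rng(3)
x = np.cumsum(rng.normal(size=50))
y = x + rng.normal(0.5, 1.0, size=50)    # correlated toy series

obs = local_trend_score(x, y)
perms = [local_trend_score(x, rng.permutation(y)) for _ in range(2000)]
p = (1 + sum(s >= obs for s in perms)) / (1 + len(perms))
print(f"score = {obs:.2f}, permutation p = {p:.4f}")
```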

  13. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    OpenAIRE

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is clear that Fisher’s statistic is more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...
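
    For reference, Fisher's combined probability test itself is short to state and implement: X = -2·Σ ln p_i follows a chi-square distribution with 2k degrees of freedom under the global null of k independent p-values. The toy values below illustrate the sensitivity the authors criticize:

```python
# Fisher's combined probability test, for contrast with the ordered joint
# tail statistic proposed above: X = -2 * sum(log p_i) is chi-square
# distributed with 2k degrees of freedom under the global null.
import numpy as np
from scipy import stats

def fisher_combined(pvalues):
    p = np.asarray(pvalues, dtype=float)
    x = -2.0 * np.sum(np.log(p))
    return stats.chi2.sf(x, df=2 * len(p))

print(fisher_combined([1e-4, 0.8, 0.9]))  # ~0.004: one tiny p decides it
print(fisher_combined([0.5, 0.6, 0.7]))   # ~0.79: no overall signal
```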

  14. Development of a simplified statistical methodology for nuclear fuel rod internal pressure calculation

    International Nuclear Information System (INIS)

    Kim, Kyu Tae; Kim, Oh Hwan

    1999-01-01

    A simplified statistical methodology is developed in order both to reduce the over-conservatism of deterministic methodologies employed for PWR fuel rod internal pressure (RIP) calculation and to simplify the complicated calculation procedure of the widely used statistical methodology, which employs the response surface method and Monte Carlo simulation. The simplified statistical methodology employs the system moment method with a deterministic approach in determining the maximum variance of RIP. The maximum RIP variance is determined as the square sum of each maximum value of a mean RIP value times a RIP sensitivity factor over all input variables considered. This approach makes the simplified statistical methodology much more efficient in routine reload core design analysis, since it eliminates the numerous calculations required for the power-history-dependent RIP variance determination. This simplified statistical methodology is shown to be more conservative in generating the RIP distribution than the widely used statistical methodology. Comparison of the significances of each input variable to RIP indicates that the fission gas release model is the most significant input variable. (author). 11 refs., 6 figs., 2 tabs
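
    A minimal sketch of the square-sum variance rule described above, with illustrative sensitivity factors and deviations (not values from the paper):

```python
# Sketch of the simplified variance rule: the maximum RIP variance is taken
# as the square sum of (sensitivity factor x maximum deviation) over the
# input variables. All numbers are illustrative placeholders.
import math

# variable -> (sensitivity of RIP to variable, maximum deviation of variable)
inputs = {"fission_gas_release": (0.9, 0.08),
          "fill_pressure":       (0.5, 0.04),
          "clad_creep":          (0.3, 0.05)}

var_rip = sum((s * d) ** 2 for s, d in inputs.values())
print("max RIP standard deviation =", math.sqrt(var_rip))
```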

  15. Sb2Te3 and Its Superlattices: Optimization by Statistical Design.

    Science.gov (United States)

    Behera, Jitendra K; Zhou, Xilin; Ranjan, Alok; Simpson, Robert E

    2018-05-02

    The objective of this work is to demonstrate the usefulness of fractional factorial design for optimizing the crystal quality of chalcogenide van der Waals (vdW) crystals. We statistically analyze the growth parameters of highly c-axis-oriented Sb2Te3 crystals and Sb2Te3-GeTe phase change vdW heterostructured superlattices. The statistical significance of the growth parameters of temperature, pressure, power, buffer materials, and buffer layer thickness was found by fractional factorial design and response surface analysis. Temperature, pressure, power, and their second-order interactions are the major factors that significantly influence the quality of the crystals. Additionally, using tungsten rather than molybdenum as a buffer layer significantly enhances the crystal quality. Fractional factorial design minimizes the number of experiments that are necessary to find the optimal growth conditions, resulting in an order of magnitude improvement in the crystal quality. We highlight that statistical design of experiment methods, which are more commonly used in product design, should be considered more broadly by those designing and optimizing materials.

  16. Cisatracurium dose–response relationship in patients with chronic liver disease

    Directory of Open Access Journals (Sweden)

    Mohamed Z. Ali

    2014-04-01

    Results: The preoperative laboratory parameters showed statistically significant differences between the two groups regarding serum albumin, total bilirubin, ALT, AST, PT, PC and INR. The operative data showed statistically insignificant differences between the two groups regarding the 1st dose response (p = 0.152), the estimated ED80 (p = 0.886) and the calculated 2nd dose (p = 0.886), and statistically significant differences between the two groups regarding the 2nd dose response (p = 0.006), the measured ED50 (p = 0.010) and the measured ED95 (p = 0.001). In conclusion, the ED50 and ED95 measured with the two-dose dose–response curve technique were clinically indistinguishable from those obtained using the single-dose technique. The dose–response curve of cisatracurium in patients with chronic liver disease was clinically insignificant in comparison with that of healthy subjects.

  17. The large break LOCA evaluation method with the simplified statistic approach

    International Nuclear Information System (INIS)

    Kamata, Shinya; Kubo, Kazuo

    2004-01-01

    USNRC published the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology for large break LOCA, which supported the revised rule for Emergency Core Cooling System performance, in 1989. In USNRC Regulatory Guide 1.157, it is required that the peak cladding temperature (PCT) cannot exceed 2200 deg F with high probability (95th percentile). In recent years, overseas countries have developed statistical methodologies and best estimate codes with models which can provide more realistic simulation of the phenomena, based on the CSAU evaluation methodology. In order to calculate the PCT probability distribution by Monte Carlo trials, there are approaches such as the response surface technique using polynomials, the order statistics method, etc. For the purpose of performing rational statistical analysis, Mitsubishi Heavy Industries, LTD (MHI) tried to develop a statistical LOCA method using the best estimate LOCA code MCOBRA/TRAC and the simplified code HOTSPOT. HOTSPOT is a Monte Carlo heat conduction solver used to evaluate the uncertainties of the significant fuel parameters at the PCT positions of the hot rod. Direct uncertainty sensitivity studies can be performed without the response surface because the Monte Carlo simulation for key parameters can be performed in a short time using HOTSPOT. With regard to the parameter uncertainties, MHI established the treatment that bounding conditions are given for the LOCA boundary and plant initial conditions, and the Monte Carlo simulation using HOTSPOT is applied to the significant fuel parameters. The paper describes the large break LOCA evaluation method with the simplified statistic approach and the results of the application of the method to a representative four-loop nuclear power plant. (author)
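
    The abstract mentions the order statistics method among the available approaches; a minimal sketch of its first-order (Wilks) form, with toy PCT values, is:

```python
# Sketch of the order-statistics route to a 95/95 PCT estimate: with 59
# random code runs, the largest PCT bounds the 95th percentile with 95%
# confidence (first-order Wilks rule: 1 - 0.95**n >= 0.95 gives n >= 59).
# The PCT values below are toy numbers, not code results.
import numpy as np

rng = np.random.default_rng(4)
n = 59
pct_runs = rng.normal(1800.0, 120.0, size=n)   # toy PCT results, deg F
print("95/95 PCT estimate:", pct_runs.max())
print("Wilks check for n = 59:", 1 - 0.95 ** n >= 0.95)  # True
```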

  18. A Statistical Framework to Interpret Individual Response to Intervention: Paving the Way for Personalized Nutrition and Exercise Prescription

    Directory of Open Access Journals (Sweden)

    Paul A. Swinton

    2018-05-01

    The concept of personalized nutrition and exercise prescription represents a topical and exciting progression for the discipline, given the large inter-individual variability that exists in response to virtually all performance- and health-related interventions. Appropriate interpretation of intervention-based data from an individual or group of individuals requires practitioners and researchers to consider a range of concepts, including the confounding influence of measurement error and biological variability. In addition, the means to quantify likely statistical and practical improvements are facilitated by concepts such as confidence intervals (CIs) and the smallest worthwhile change (SWC). The purpose of this review is to provide accessible and applicable recommendations for practitioners and researchers who interpret and report personalized data. To achieve this, the review is structured in three sections that progressively develop a statistical framework. Section 1 explores fundamental concepts related to measurement error and describes how typical error and CIs can be used to express uncertainty in baseline measurements. Section 2 builds upon these concepts and demonstrates how CIs can be combined with the concept of the SWC to assess whether meaningful improvements occur post-intervention. Finally, section 3 introduces the concept of biological variability and discusses the subsequent challenges in identifying individual response and non-response to an intervention. Worked numerical examples and interactive Supplementary Material are incorporated to solidify concepts and assist with implementation in practice.
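
    A minimal sketch of the Section 2 step under stated assumptions (typical error known from reliability data, normally distributed measurement error; `interpret_change` and its thresholds are illustrative, not the review's code):

```python
# Sketch: place a confidence interval around an individual's observed change
# using the measurement typical error (TE), then compare it with the
# smallest worthwhile change (SWC). Thresholds and wording are illustrative.
import math
from scipy import stats

def interpret_change(pre, post, typical_error, swc, conf=0.90):
    change = post - pre
    # SD of a difference of two measurements, each carrying error TE
    sd_change = typical_error * math.sqrt(2)
    z = stats.norm.ppf(0.5 + conf / 2)
    lo, hi = change - z * sd_change, change + z * sd_change
    if lo > swc:
        verdict = "likely meaningful improvement"
    elif hi < swc:
        verdict = "unlikely to be a meaningful improvement"
    else:
        verdict = "unclear -- CI spans the SWC"
    return change, (lo, hi), verdict

print(interpret_change(pre=100.0, post=106.0, typical_error=2.0, swc=3.0))
```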

  19. After statistics reform : Should we still teach significance testing?

    NARCIS (Netherlands)

    A. Hak (Tony)

    2014-01-01

    In the longer term, null hypothesis significance testing (NHST) will disappear because p-values are not informative and not replicable. Should we continue to teach in the future the procedures of these then-abolished routines (i.e., NHST)? Three arguments are discussed for not teaching NHST in

  20. A statistical mechanical approach for the computation of the climatic response to general forcings

    Directory of Open Access Journals (Sweden)

    V. Lucarini

    2011-01-01

    The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space, averaged over the invariant measure of the unperturbed state. We choose as a test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value, as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of the Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow these properties to be defined as explicit functions of the unperturbed forcing
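
    For reference, the standard textbook linear-response relations assumed by this kind of analysis can be written as follows (hedged sketch; the paper's precise notation and operators differ):

```latex
% Textbook linear-response relations (assumed form, not the paper's notation):
% causal response of an observable \Phi to a forcing f(t), its susceptibility,
% and the Kramers-Kronig relation that causality imposes on it.
\delta\langle\Phi\rangle(t) = \int_{0}^{\infty} G_{\Phi}(\tau)\, f(t-\tau)\,\mathrm{d}\tau,
\qquad
\chi_{\Phi}(\omega) = \int_{0}^{\infty} G_{\Phi}(\tau)\, e^{i\omega\tau}\,\mathrm{d}\tau,
\qquad
\operatorname{Re}\chi_{\Phi}(\omega) = \frac{1}{\pi}\,
\mathcal{P}\!\int_{-\infty}^{\infty}
\frac{\operatorname{Im}\chi_{\Phi}(\omega')}{\omega' - \omega}\,\mathrm{d}\omega'.
```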

  1. Worry, Intolerance of Uncertainty, and Statistics Anxiety

    Science.gov (United States)

    Williams, Amanda S.

    2013-01-01

    Statistics anxiety is a problem for most graduate students. This study investigates the relationship between intolerance of uncertainty, worry, and statistics anxiety. Intolerance of uncertainty was significantly related to worry, and worry was significantly related to three types of statistics anxiety. Six types of statistics anxiety were…

  2. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    Science.gov (United States)

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  3. Evaluation of significantly modified water bodies in Vojvodina by using multivariate statistical techniques

    Directory of Open Access Journals (Sweden)

    Vujović Svetlana R.

    2013-01-01

    This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and the identification of pollution sources/factors, with a view to obtaining better information about water quality and the design of a monitoring network for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and for the interpretation of a water quality data set of natural water bodies obtained during the 2010 year of monitoring of 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors, which are not directly observable. Factor analysis is applied to the physico-chemical parameters of natural water bodies with the aim of classification and data summation, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, summing more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor F1, accounting for 28% of the total variance, represents the hydrochemical dimension of water quality. The second factor F2, accounting for 18% of the total variance, may be taken as a factor of water eutrophication. The third factor F3, accounting for 17% of the total variance, represents the influence of point sources of pollution on water quality. The fourth factor F4, accounting for 13% of the total variance, may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
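
    A compact sketch of the FA/PCA step with the thresholds quoted above (random placeholder data standing in for the 33 sites x 13 parameters):

```python
# Sketch of the FA/PCA step: standardize the water-quality variables, keep
# components with eigenvalues > 1 (Kaiser criterion), and flag loadings as
# strong (>0.75). Data are random placeholders, not the monitoring data.
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(33, 13))                    # 33 sites x 13 parameters
Z = (X - X.mean(0)) / X.std(0, ddof=1)           # standardize

corr = np.corrcoef(Z, rowvar=False)
eigval, eigvec = np.linalg.eigh(corr)
order = np.argsort(eigval)[::-1]                 # sort descending
eigval, eigvec = eigval[order], eigvec[:, order]

keep = eigval > 1.0                              # Kaiser criterion
loadings = eigvec[:, keep] * np.sqrt(eigval[keep])
print("variance explained:", eigval[keep] / eigval.sum())
print("strong loadings (variable, factor):",
      np.argwhere(np.abs(loadings) > 0.75))
```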

  4. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

    Science.gov (United States)

    Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

    1999-01-01

    Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40%, and five were predicted by M8-MSc in 13%, of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in the seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier

  5. Response Times of Operators in a Control Room

    DEFF Research Database (Denmark)

    Platz, O.; Rasmussen, Jens; Skanborg, Preben Zacho

    A statistical analysis was made of operator response times recorded in the control room of a research reactor during the years 1972-1974. A homogeneity test revealed that the data consist of a mixture of populations. A small but statistically significant difference is found between day and night...

  6. 75 FR 67776 - Comment Request; Review of Productivity Statistics

    Science.gov (United States)

    2010-11-03

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Comment Request; Review of Productivity Statistics... Statistics (BLS) is responsible for publishing measures of labor productivity and multifactor productivity..., Office of Productivity and Technology, Bureau of Labor Statistics, Room 2150, 2 Massachusetts Avenue, NE...

  7. Adjusting the Adjusted X²/df Ratio Statistic for Dichotomous Item Response Theory Analyses: Does the Model Fit?

    Science.gov (United States)

    Tay, Louis; Drasgow, Fritz

    2012-01-01

    Two Monte Carlo simulation studies investigated the effectiveness of the mean adjusted X²/df statistic proposed by Drasgow and colleagues and, because of problems with the method, a new approach for assessing the goodness of fit of an item response theory model was developed. It has been previously recommended that mean adjusted…

  8. Low-level contrast statistics of natural images can modulate the frequency of event-related potentials (ERPs) in humans

    Directory of Open Access Journals (Sweden)

    Masoud Ghodrati

    2016-12-01

    Humans are fast and accurate in categorizing complex natural images. It is, however, unclear what features of visual information are exploited by the brain to perceive the images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of the amplitude of event-related potentials (ERPs) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of ERPs best, compared to the other image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and the ERPs' power within the theta frequency band (~3-7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset and peaked at 138 ms. Our results show that not only the amplitude but also the frequency of neural responses can be modulated by low-level contrast statistics of natural images, and they highlight their potential role in scene perception.
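
    The abstract does not spell out how the Weibull statistics are computed; a common operationalization in the natural-image statistics literature summarizes the distribution of local contrast values by the shape and scale parameters of a fitted Weibull distribution. A sketch under that assumption (the contrast definition and fitting details here are illustrative, not taken from the paper):

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    def weibull_contrast_params(image):
        """Fit a Weibull distribution to an image's local-contrast
        values and return the (shape, scale) summary parameters."""
        gy, gx = np.gradient(image.astype(float))
        contrast = np.hypot(gx, gy).ravel()       # local contrast proxy
        contrast = contrast[contrast > 0]         # Weibull support is x > 0
        shape, _, scale = weibull_min.fit(contrast, floc=0)
        return shape, scale

    # Per-image (shape, scale) pairs could then be correlated with
    # per-image ERP theta-band power across the stimulus set.
    rng = np.random.default_rng(0)
    print(weibull_contrast_params(rng.random((64, 64))))
    ```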

  9. A novel complete-case analysis to determine statistical significance between treatments in an intention-to-treat population of randomized clinical trials involving missing data.

    Science.gov (United States)

    Liu, Wei; Ding, Jinhui

    2018-04-01

    The application of the principle of the intention-to-treat (ITT) to the analysis of clinical trials is challenged in the presence of missing outcome data. The consequences of stopping an assigned treatment in a withdrawn subject are unknown. It is difficult to make a single assumption about missing mechanisms for all clinical trials because there are complicated reactions in the human body to drugs due to the presence of complex biological networks, leading to data missing randomly or non-randomly. Currently there is no statistical method that can tell whether a difference between two treatments in the ITT population of a randomized clinical trial with missing data is significant at a pre-specified level. Making no assumptions about the missing mechanisms, we propose a generalized complete-case (GCC) analysis based on the data of completers. An evaluation of the impact of missing data on the ITT analysis reveals that a statistically significant GCC result implies a significant treatment effect in the ITT population at a pre-specified significance level unless, relative to the comparator, the test drug is poisonous to the non-completers as documented in their medical records. Applications of the GCC analysis are illustrated using literature data, and its properties and limits are discussed.

  10. Statistical Models and Methods for Lifetime Data

    CERN Document Server

    Lawless, Jerald F

    2011-01-01

    Praise for the First Edition: "An indispensable addition to any serious collection on lifetime data analysis and . . . a valuable contribution to the statistical literature. Highly recommended . . ." -Choice. "This is an important book, which will appeal to statisticians working on survival analysis problems." -Biometrics. "A thorough, unified treatment of statistical models and methods used in the analysis of lifetime data . . . this is a highly competent and agreeable statistical textbook." -Statistics in Medicine. The statistical analysis of lifetime or response time data is a key tool in engineering,

  11. The significance of reporting to the thousandths place: Figuring out the laboratory limitations

    Directory of Open Access Journals (Sweden)

    Joely A. Straseski

    2017-04-01

    Objectives: A request to report laboratory values to a specific number of decimal places represents a delicate balance between clinical interpretation of a true analytical change versus laboratory understanding of analytical imprecision and significant figures. Prostate specific antigen (PSA) was used as an example to determine whether an immunoassay routinely reported to the hundredths decimal place, based on significant figure assessment in our laboratory, was capable of providing analytically meaningful results when reported to the thousandths place at clinicians' request. Design and methods: Results of imprecision studies of a representative PSA assay (Roche MODULAR E170), employing two methods of statistical analysis, are reported. Sample pools were generated with target values of 0.01 and 0.20 μg/L PSA as determined by the E170. Intra-assay imprecision studies were conducted and the resultant data were analyzed using two independent statistical methods to evaluate reporting limits. Results: These statistical methods indicated that reporting results to the thousandths place at the two assessed concentrations was an appropriate reflection of the measurement imprecision for the representative assay. This approach used two independent statistical tests to determine the ability of an analytical system to support a desired reporting level. Importantly, the data were generated during a routine intra-assay imprecision study, so this approach does not require extra data collection by the laboratory. Conclusions: Independent statistical analysis must be used to determine appropriate significant figure limitations for clinically relevant analytes. Establishing these limits is the responsibility of the laboratory and should be done before clinical results are provided. Keywords: Significant figures, Imprecision, Prostate cancer, Prostate specific antigen, PSA
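
    The two statistical methods are not specified in the abstract. As a purely hypothetical illustration of the underlying question, one can ask whether the rounding increment at a candidate decimal place is small relative to the assay's intra-assay imprecision, so that the extra digit preserves rather than invents precision:

    ```python
    import numpy as np

    def reporting_place_supported(replicates, decimals):
        """Hypothetical check: a decimal place is worth reporting if
        the rounding increment at that place is smaller than the
        intra-assay SD, so rounding does not discard real precision."""
        sd = np.std(replicates, ddof=1)
        increment = 10.0 ** (-decimals)
        return increment < sd, sd

    # Replicates of a pool targeting 0.20 ug/L PSA (illustrative values).
    pool = [0.198, 0.203, 0.201, 0.196, 0.205, 0.199, 0.202, 0.200]
    ok, sd = reporting_place_supported(pool, decimals=3)
    print(f"SD = {sd:.4f} ug/L; thousandths place supported: {ok}")
    ```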

  12. The significance of translation regulation in the stress response

    OpenAIRE

    Picard, Flora; Loubière, Pascal; Girbal, Laurence; Bousquet, Muriel

    2013-01-01

    Background: The stress response in bacteria involves the multistage control of gene expression but is not entirely understood. To identify the translational response of bacteria in stress conditions and assess its contribution to the regulation of gene expression, the translational states of all mRNAs were compared under optimal growth condition and during nutrient (isoleucine) starvation. Results: A genome-scale study of the translational response to nutritional limitation was performed in t...

  13. A New Statistical Approach to Characterize Chemical-Elicited Behavioral Effects in High-Throughput Studies Using Zebrafish.

    Directory of Open Access Journals (Sweden)

    Guozhu Zhang

    Zebrafish have become an important alternative model for characterizing chemical bioactivity, partly due to the efficiency at which systematic, high-dimensional data can be generated. However, these new data present analytical challenges associated with scale and diversity. We developed a novel, robust statistical approach to characterize chemical-elicited effects in behavioral data from high-throughput screening (HTS) of all 1,060 Toxicity Forecaster (ToxCast™) chemicals across 5 concentrations at 120 hours post-fertilization (hpf). Taking advantage of the immense scale of data for a global view, we show that this new approach reduces bias introduced by extreme values yet allows for diverse response patterns that confound the application of traditional statistics. We have also shown that, as a summary measure of response for local tests of chemical-associated behavioral effects, it achieves a significant reduction in coefficient of variation compared to many traditional statistical modeling methods. This effective increase in signal-to-noise ratio augments statistical power and is observed across experimental periods (light/dark conditions) that display varied distributional response patterns. Finally, we integrated results with data from concomitant developmental endpoint measurements to show that appropriate statistical handling of HTS behavioral data can add important biological context that informs mechanistic hypotheses.
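
    The abstract does not name the statistic itself, so the following is only a generic illustration of the stated criterion: a robust summary (here a trimmed mean, one of many possible choices) can cut the bootstrap coefficient of variation relative to the plain mean when behavioral responses contain extreme values:

    ```python
    import numpy as np
    from scipy.stats import trim_mean

    rng = np.random.default_rng(9)

    def bootstrap_cv(stat, data, n_boot=2000):
        """Coefficient of variation of a summary statistic across
        bootstrap resamples of per-animal movement responses."""
        vals = np.array([stat(rng.choice(data, data.size))
                         for _ in range(n_boot)])
        return vals.std(ddof=1) / abs(vals.mean())

    # Illustrative movement responses with a few extreme values,
    # as behavioral tracking data often contains.
    resp = np.concatenate([rng.normal(10, 2, 45), [40, 55, 60, 75, 90]])

    print("CV of plain mean:  ", round(bootstrap_cv(np.mean, resp), 3))
    print("CV of trimmed mean:", round(bootstrap_cv(
        lambda x: trim_mean(x, 0.1), resp), 3))
    ```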

  14. Chromosomal radiosensitivity: a study of the chromosomal G2 assay in human blood lymphocytes indicating significant inter-individual variability

    International Nuclear Information System (INIS)

    Smart, V.; Curwen, G.B.; Whitehouse, C.A.; Edwards, A.; Tawn, E.J.

    2003-01-01

    The G2 chromosomal radiosensitivity assay is a technically demanding assay. To ensure that it is reproducible in our laboratory, we have examined the effects of storage and culture conditions by applying the assay to a group of healthy controls and determined the extent of intra- and inter-individual variations. Nineteen different individuals provided one or more blood samples, resulting in a total of 57 successful tests. Multiple cultures from a single blood sample showed no statistically significant difference in the number of chromatid-type aberrations between cultures. A 24 h delay prior to culturing the lymphocytes did not significantly affect the induced G2 score. Intra-individual variation was not statistically significant in seven out of nine individuals. Inter-individual variation was highly statistically significant (P<0.001), indicating that there is a real difference between individuals in the response to radiation using this assay.

  15. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: an SPSS method to analyze univariate data.

    Science.gov (United States)

    Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M

    2015-03-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty of applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.
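
    The article's procedure is SPSS-specific and not reproduced in the abstract. As a rough stand-in for the same question (does an intervention phase differ from baseline?), here is a simple randomization test; note that, unlike the regression method the authors describe, this sketch ignores autocorrelation between sessions:

    ```python
    import numpy as np

    def phase_permutation_test(baseline, intervention, n_perm=10000, seed=1):
        """Randomization test for a single-case AB design: is the
        phase mean difference larger than expected by chance?"""
        rng = np.random.default_rng(seed)
        data = np.concatenate([baseline, intervention])
        n_a = len(baseline)
        observed = np.mean(intervention) - np.mean(baseline)
        count = 0
        for _ in range(n_perm):
            perm = rng.permutation(data)
            diff = perm[n_a:].mean() - perm[:n_a].mean()
            if abs(diff) >= abs(observed):
                count += 1
        return observed, (count + 1) / (n_perm + 1)

    # Weekly symptom ratings (illustrative): baseline, then treatment.
    a = np.array([7, 8, 7, 9, 8])
    b = np.array([6, 5, 5, 4, 4, 3])
    print(phase_permutation_test(a, b))
    ```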

  16. Performance studies of GooFit on GPUs vs RooFit on CPUs while estimating the statistical significance of a new physical signal

    Science.gov (United States)

    Di Florio, Adriano

    2017-10-01

    In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented both in ROOT/RooFit and GooFit frameworks with the purpose to estimate the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is an open data-analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on NVIDIA GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by the CUDA Multi Process Service and a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks Theorem may or may not apply because its regularity conditions are not satisfied.
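
    The toy Monte Carlo idea is independent of the GPU machinery: generate pseudo-datasets under the null hypothesis, compute the likelihood-ratio statistic for each, and read the p-value off the empirical tail, which is exactly what is needed when Wilks' theorem does not apply. A minimal sketch on a deliberately simple case (a non-negative Poisson signal strength, where the boundary at zero breaks the chi-square asymptotics), not the CMS analysis itself:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    B = 100.0      # known expected background (an assumption of this toy)
    n_obs = 125    # observed event count

    def q_stat(n):
        """-2 ln[L(s=0)/L(s=s_hat)] for Poisson counts, signal s >= 0."""
        s_hat = max(n - B, 0.0)                  # MLE can sit on the boundary
        if s_hat == 0.0:
            return 0.0
        mu = B + s_hat
        return 2.0 * (n * np.log(mu / B) - (mu - B))

    q_obs = q_stat(n_obs)

    # Throw toy experiments under the null and count exceedances.
    q_toys = np.array([q_stat(k) for k in rng.poisson(B, size=200_000)])
    p = (q_toys >= q_obs).mean()
    print(f"q_obs = {q_obs:.2f}, toy p-value = {p:.1e}")
    # With s >= 0 the null lies on the parameter boundary, so Wilks'
    # chi^2(1) law does not hold; here q ~ 0.5*delta(0) + 0.5*chi^2(1).
    ```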

  17. The Precautionary Principle and statistical approaches to uncertainty

    DEFF Research Database (Denmark)

    Keiding, Niels; Budtz-Jørgensen, Esben

    2004-01-01

    is unhelpful, because lack of significance can be due either to uninformative data or to genuine lack of effect (the Type II error problem). Its inversion, bioequivalence testing, might sometimes be a model for the Precautionary Principle in its ability to "prove the null hypothesis". Current procedures...... for setting safe exposure levels are essentially derived from these classical statistical ideas, and we outline how uncertainties in the exposure and response measurements affect the no observed adverse effect level, the Benchmark approach and the "Hockey Stick" model. A particular problem concerns model...

  18. Using Person Fit Statistics to Detect Outliers in Survey Research

    Directory of Open Access Journals (Sweden)

    John M. Felt

    2017-05-01

    Context: When working with health-related questionnaires, outlier detection is important. However, traditional methods of outlier detection (e.g., boxplots) can miss participants with "atypical" responses to the questions who otherwise have similar total (subscale) scores. In addition to detecting outliers, it can be of clinical importance to determine the reason for the outlier status or "atypical" response. Objective: The aim of the current study was to illustrate how to derive person fit statistics for outlier detection through a statistical method examining person fit with a health-based questionnaire. Design and Participants: Patients treated for Cushing's syndrome (n = 394) were recruited from the Cushing's Support and Research Foundation's (CSRF) listserv and Facebook page. Main Outcome Measure: Patients were directed to an online survey containing the CushingQoL (English version). A two-dimensional graded response model was estimated, and person fit statistics were generated using the Zh statistic. Results: Conventional outlier detection methods revealed no outliers reflecting extreme scores on the subscales of the CushingQoL. However, person fit statistics identified 18 patients with "atypical" response patterns, which would otherwise have been missed (|Zh| > 2.00). Conclusion: While the conventional methods of outlier detection indicated no outliers, person fit statistics identified several patients with "atypical" response patterns who otherwise appeared average. Person fit statistics allow researchers to delve further into the underlying problems experienced by these "atypical" patients treated for Cushing's syndrome. Annotated code is provided to aid other researchers in using this method.
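
    The Zh statistic is the standardized log-likelihood person-fit index. The study applies it under a graded response model; the dichotomous special case below shows the mechanics (in practice the item probabilities would come from a fitted IRT model):

    ```python
    import numpy as np

    def zh_person_fit(responses, probs):
        """Standardized log-likelihood person-fit statistic (Zh/lz)
        for dichotomous items, given model-implied success
        probabilities for one person. Large negative values flag
        atypical ('misfitting') response patterns."""
        u, p = np.asarray(responses, float), np.asarray(probs, float)
        l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
        e = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
        v = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
        return (l0 - e) / np.sqrt(v)

    # A person who fails easy items (high p) but passes hard ones
    # (low p) yields a strongly negative Zh.
    p = np.array([0.9, 0.85, 0.8, 0.3, 0.2, 0.15])
    print(zh_person_fit([0, 0, 0, 1, 1, 1], p))  # atypical: very negative
    print(zh_person_fit([1, 1, 1, 0, 0, 0], p))  # typical: positive
    ```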

  19. Statistical analysis of earthquake ground motion parameters

    International Nuclear Information System (INIS)

    1979-12-01

    Several earthquake ground response parameters that define the strength, duration, and frequency content of the motions are investigated using regression analysis techniques; these techniques incorporate statistical significance testing to establish the terms in the regression equations. The parameters investigated are the peak acceleration, velocity, and displacement; Arias intensity; spectrum intensity; bracketed duration; Trifunac-Brady duration; and response spectral amplitudes. The study provides insight into how these parameters are affected by magnitude, epicentral distance, local site conditions, direction of motion (i.e., whether horizontal or vertical), and earthquake event type. The results are presented in a form that facilitates their use in the development of seismic input criteria for nuclear plants and other major structures. They are also compared with results from prior investigations that have been used in the past in criteria development for such facilities.

  20. Multilevel linear modelling of the response-contingent learning of young children with significant developmental delays.

    Science.gov (United States)

    Raab, Melinda; Dunst, Carl J; Hamby, Deborah W

    2018-02-27

    The purpose of the study was to isolate the sources of variations in the rates of response-contingent learning among young children with multiple disabilities and significant developmental delays randomly assigned to contrasting types of early childhood intervention. Multilevel, hierarchical linear growth curve modelling was used to analyze four different measures of child response-contingent learning where repeated child learning measures were nested within individual children (Level-1), children were nested within practitioners (Level-2), and practitioners were nested within the contrasting types of intervention (Level-3). Findings showed that sources of variations in rates of child response-contingent learning were associated almost entirely with type of intervention after the variance associated with differences in practitioners nested within groups was accounted for. Rates of child learning were greater among children whose existing behaviours were used as the building blocks for promoting child competence (asset-based practices) compared to children for whom the focus of intervention was promoting child acquisition of missing skills (needs-based practices). The methods of analysis illustrate a practical approach to clustered data analysis and the presentation of results in ways that highlight sources of variations in the rates of response-contingent learning among young children with multiple developmental disabilities and significant developmental delays. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
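
    The three-level structure described above (measurements in children, children in practitioners, practitioners in intervention type) is most naturally handled with nested random effects. A simplified two-level sketch on synthetic data, with hypothetical variable names, shows the growth-curve logic; the session-by-group interaction carries the intervention contrast:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(8)
    rows = []
    for child in range(40):
        group = "asset" if child < 20 else "needs"
        slope = 0.6 if group == "asset" else 0.3
        u0, u1 = rng.normal(0, 0.3), rng.normal(0, 0.1)  # child effects
        for session in range(8):
            y = 1.0 + u0 + (slope + u1) * session + rng.normal(0, 0.4)
            rows.append((child, group, session, y))
    df = pd.DataFrame(rows, columns=["child", "group", "session", "learning"])

    # Growth-curve model with random intercepts and slopes per child.
    m = smf.mixedlm("learning ~ session * group", df,
                    groups="child", re_formula="~session").fit()
    print(m.summary())
    ```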

  1. Statistical-Dynamical Seasonal Forecasts of Central-Southwest Asian Winter Precipitation.

    Science.gov (United States)

    Tippett, Michael K.; Goddard, Lisa; Barnston, Anthony G.

    2005-06-01

    Interannual precipitation variability in central-southwest (CSW) Asia has been associated with East Asian jet stream variability and western Pacific tropical convection. However, atmospheric general circulation models (AGCMs) forced by observed sea surface temperature (SST) poorly simulate the region's interannual precipitation variability. The statistical-dynamical approach uses statistical methods to correct systematic deficiencies in the response of AGCMs to SST forcing. Statistical correction methods linking model-simulated Indo-west Pacific precipitation and observed CSW Asia precipitation result in modest, but statistically significant, cross-validated simulation skill in the northeast part of the domain for the period from 1951 to 1998. The statistical-dynamical method is also applied to recent (winter 1998/99 to 2002/03) multimodel, two-tier December-March precipitation forecasts initiated in October. This period includes 4 yr (winter of 1998/99 to 2001/02) of severe drought. Tercile probability forecasts are produced using ensemble-mean forecasts and forecast error estimates. The statistical-dynamical forecasts show enhanced probability of below-normal precipitation for the four drought years and capture the return to normal conditions in part of the region during the winter of 2002/03. "May Kabul be without gold, but not without snow." —Traditional Afghan proverb
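
    The paper's specific correction operator is not given in the abstract; the sketch below illustrates the general recipe with the simplest possible bridge, a linear regression from a simulated predictor to the observed series, followed by tercile probabilities from a Gaussian forecast-error model (all data are synthetic placeholders):

    ```python
    import numpy as np
    from scipy.stats import norm

    # Placeholder arrays: AGCM-simulated Indo-west Pacific precipitation
    # index and observed CSW Asia precipitation over a training period.
    rng = np.random.default_rng(3)
    sim = rng.normal(size=40)
    obs = 0.6 * sim + rng.normal(scale=0.8, size=40)

    # Statistical correction: linear bridge from simulated to observed.
    b, a = np.polyfit(sim, obs, 1)
    resid_sd = np.std(obs - (a + b * sim), ddof=2)  # forecast error scale

    # Tercile probabilities for a new ensemble-mean simulated value,
    # assuming Gaussian forecast errors around the corrected forecast.
    lo, hi = np.quantile(obs, [1 / 3, 2 / 3])       # climatological terciles
    mu = a + b * 1.2                                # corrected forecast
    p_below = norm.cdf(lo, mu, resid_sd)
    p_above = norm.sf(hi, mu, resid_sd)
    print(p_below, 1 - p_below - p_above, p_above)
    ```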

  2. Population activity statistics dissect subthreshold and spiking variability in V1.

    Science.gov (United States)

    Bányai, Mihály; Koman, Zsombor; Orbán, Gergő

    2017-07-01

    Response variability, as measured by fluctuating responses upon repeated performance of trials, is a major component of neural responses, and its characterization is key to interpret high dimensional population recordings. Response variability and covariability display predictable changes upon changes in stimulus and cognitive or behavioral state, providing an opportunity to test the predictive power of models of neural variability. Still, there is little agreement on which model to use as a building block for population-level analyses, and models of variability are often treated as a subject of choice. We investigate two competing models, the doubly stochastic Poisson (DSP) model assuming stochasticity at spike generation, and the rectified Gaussian (RG) model tracing variability back to membrane potential variance, to analyze stimulus-dependent modulation of both single-neuron and pairwise response statistics. Using a pair of model neurons, we demonstrate that the two models predict similar single-cell statistics. However, DSP and RG models have contradicting predictions on the joint statistics of spiking responses. To test the models against data, we build a population model to simulate stimulus change-related modulations in pairwise response statistics. We use single-unit data from the primary visual cortex (V1) of monkeys to show that while model predictions for variance are qualitatively similar to experimental data, only the RG model's predictions are compatible with joint statistics. These results suggest that models using Poisson-like variability might fail to capture important properties of response statistics. We argue that membrane potential-level modeling of stochasticity provides an efficient strategy to model correlations. NEW & NOTEWORTHY Neural variability and covariability are puzzling aspects of cortical computations. For efficient decoding and prediction, models of information encoding in neural populations hinge on an appropriate model of
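
    A stripped-down version of the two competing models makes the contrast concrete: both simulated neurons below have roughly matched mean counts, yet the doubly stochastic Poisson pair shows weaker spike-count correlation than the rectified Gaussian pair for the same latent correlation, because conditionally independent Poisson generation dilutes the shared variability (parameters are illustrative, not fitted to V1 data):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials, mean_count, rho = 20000, 5.0, 0.3

    # Doubly stochastic Poisson (DSP): correlated trial-to-trial rate
    # fluctuations, then conditionally independent Poisson spiking.
    cov = 0.2 * np.array([[1.0, rho], [rho, 1.0]])
    log_rate = rng.multivariate_normal([-0.1, -0.1], cov, size=n_trials)
    dsp = rng.poisson(mean_count * np.exp(log_rate))

    # Rectified Gaussian (RG): correlated 'membrane potentials' passed
    # through a threshold-linear nonlinearity; all count variability is
    # inherited from the potentials.
    v = rng.multivariate_normal([1.0, 1.0], [[1.0, rho], [rho, 1.0]],
                                size=n_trials)
    rg = np.maximum(v, 0.0)
    rg *= mean_count / rg.mean()             # roughly match mean counts

    for name, x in (("DSP", dsp), ("RG", rg)):
        r = np.corrcoef(x[:, 0], x[:, 1])[0, 1]
        print(f"{name}: mean count = {x.mean():.2f}, "
              f"count correlation = {r:.3f}")
    ```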

  3. Significance and nature of bystander responses induced by various agents.

    Science.gov (United States)

    Verma, Neha; Tiku, Ashu Bhan

    2017-07-01

    Bystander effects in a biological system are the responses shown by non-targeted neighbouring cells/tissues/organisms. These responses are triggered by factors released from targeted cells when exposed to a stress-inducing agent. The biological response to stress-inducing agents is complex, owing to the diversity of mechanisms and pathways activated in directly targeted and bystander cells. These responses are highly variable and can be either beneficial or hazardous depending on the cell lines tested, the dose of the agent used, the experimental end points and the time course selected. Recently, non-targeted cells have even been reported to rescue directly exposed cells by releasing protective signals that might be induced by non-targeted bystander responses. The nature of the bystander signal(s) is not yet clear. However, there is evidence suggesting the involvement of ROS, RNS, protein factors and even DNA molecules leading to the activation of a number of signaling pathways. These can act independently or in a cascade to induce events leading to changes in gene expression patterns that could elicit detrimental or beneficial effects. Many review articles on radiation-induced bystander responses have been published. However, to the best of our knowledge, a comprehensive review on bystander responses induced by other genotoxic chemicals and stress-inducing agents has not been published so far. Therefore, the aim of the present review is to give an overview of the literature on different aspects of bystander responses: agents that induce these responses, factors that can modulate bystander responses and the mechanisms involved. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Robust statistical methods for significance evaluation and applications in cancer driver detection and biomarker discovery

    DEFF Research Database (Denmark)

    Madsen, Tobias

    2017-01-01

    In the present thesis I develop, implement and apply statistical methods for detecting genomic elements implicated in cancer development and progression. This is done in two separate bodies of work. The first uses the somatic mutation burden to distinguish cancer driver mutations from passenger m...

  5. Statistically accurate low-order models for uncertainty quantification in turbulent dynamical systems.

    Science.gov (United States)

    Sapsis, Themistoklis P; Majda, Andrew J

    2013-08-20

    A framework for low-order predictive statistical modeling and uncertainty quantification in turbulent dynamical systems is developed here. These reduced-order, modified quasilinear Gaussian (ROMQG) algorithms apply to turbulent dynamical systems in which there is significant linear instability or linear nonnormal dynamics in the unperturbed system and energy-conserving nonlinear interactions that transfer energy from the unstable modes to the stable modes where dissipation occurs, resulting in a statistical steady state; such turbulent dynamical systems are ubiquitous in geophysical and engineering turbulence. The ROMQG method involves constructing a low-order, nonlinear, dynamical system for the mean and covariance statistics in the reduced subspace that has the unperturbed statistics as a stable fixed point and optimally incorporates the indirect effect of non-Gaussian third-order statistics for the unperturbed system in a systematic calibration stage. This calibration procedure is achieved through information involving only the mean and covariance statistics for the unperturbed equilibrium. The performance of the ROMQG algorithm is assessed on two stringent test cases: the 40-mode Lorenz 96 model mimicking midlatitude atmospheric turbulence and two-layer baroclinic models for high-latitude ocean turbulence with over 125,000 degrees of freedom. In the Lorenz 96 model, the ROMQG algorithm with just a single mode captures the transient response to random or deterministic forcing. For the baroclinic ocean turbulence models, the inexpensive ROMQG algorithm with 252 modes, less than 0.2% of the total, captures the nonlinear response of the energy, the heat flux, and even the one-dimensional energy and heat flux spectra.

  6. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  7. Significance evaluation in factor graphs

    DEFF Research Database (Denmark)

    Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet

    2017-01-01

    in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. We here address the problem of evaluating statistical significance of observations from factor graph models. Results Two novel numerical approximations for evaluation of statistical...... significance are presented. First a method using importance sampling. Second a saddlepoint approximation based method. We develop algorithms to efficiently compute the approximations and compare them to naive sampling and the normal approximation. The individual merits of the methods are analysed both from....... Conclusions The applicability of saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially improve computational cost without compromising accuracy. This contribution allows analyses of large datasets...
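
    Of the two approximations, importance sampling is the easier to sketch. The toy below estimates a far-tail p-value for a sum of independent variables, the regime where naive Monte Carlo needs prohibitively many samples; it illustrates the idea only, not the paper's factor-graph implementation:

    ```python
    import numpy as np

    def tail_prob_importance(s_obs, n, theta, n_samples=100_000, seed=2):
        """Estimate P(S >= s_obs) for S = sum of n iid standard normals
        by sampling from a mean-shifted (exponentially tilted) proposal
        and reweighting; naive sampling would almost never hit the tail."""
        rng = np.random.default_rng(seed)
        x = rng.normal(loc=theta, size=(n_samples, n))   # tilted proposal
        s = x.sum(axis=1)
        # Likelihood ratio N(0,1)^n / N(theta,1)^n factorizes per term:
        log_w = -theta * s + n * theta ** 2 / 2
        return np.mean(np.exp(log_w) * (s >= s_obs))

    # P(S >= 20) for n = 25: exact value is 1 - Phi(4), about 3.2e-5.
    print(tail_prob_importance(s_obs=20.0, n=25, theta=20.0 / 25))
    ```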

  8. Optimization of Biodiesel-Diesel Blended Fuel Properties and Engine Performance with Ether Additive Using Statistical Analysis and Response Surface Methods

    Directory of Open Access Journals (Sweden)

    Obed M. Ali

    2015-12-01

    In this study, the fuel properties and engine performance of blended palm biodiesel-diesel using diethyl ether as an additive have been investigated. The properties of B30 blended palm biodiesel-diesel fuel were measured and analyzed statistically with the addition of 2%, 4%, 6% and 8% (by volume) diethyl ether additive. The engine tests were conducted at increasing engine speeds from 1500 rpm to 3500 rpm and under constant load. Optimization of the independent variables was performed using the desirability approach of the response surface methodology (RSM) with the goal of minimizing emissions and maximizing performance parameters. The experiments were designed using a statistical tool known as design of experiments (DoE) based on RSM.
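
    In the desirability approach, each fitted response is mapped onto a 0-1 desirability scale and the geometric mean of the desirabilities is maximized. The response surfaces and limits below are invented for illustration (the abstract does not report the fitted models):

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Hypothetical fitted RSM surfaces in the ether fraction x (% vol):
    power = lambda x: 30 + 1.2 * x - 0.12 * x**2   # response to maximize
    smoke = lambda x: 40 - 3.0 * x + 0.25 * x**2   # response to minimize

    def desirability(y, lo, hi, maximize=True):
        """Derringer-type linear desirability, clipped to [0, 1]."""
        d = (y - lo) / (hi - lo) if maximize else (hi - y) / (hi - lo)
        return float(np.clip(d, 0, 1))

    def overall(x):
        d1 = desirability(power(x), 28, 34, maximize=True)
        d2 = desirability(smoke(x), 25, 40, maximize=False)
        return -(d1 * d2) ** 0.5   # negative geometric mean (minimizer)

    res = minimize_scalar(overall, bounds=(2, 8), method="bounded")
    print(f"optimal ether fraction ~ {res.x:.2f}%, D = {-res.fun:.3f}")
    ```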

  9. Are Statistics Labs Worth the Effort?--Comparison of Introductory Statistics Courses Using Different Teaching Methods

    Directory of Open Access Journals (Sweden)

    Jose H. Guardiola

    2010-01-01

    This paper compares the academic performance of students in three similar elementary statistics courses taught by the same instructor, but with the lab component differing among the three. One course is traditionally taught without a lab component; the second with a lab component using scenarios and an extensive use of technology, but without explicit coordination between lab and lecture; and the third using a lab component with an extensive use of technology that carefully coordinates the lab with the lecture. Extensive use of technology means, in this context, using Minitab software in the lab section, doing homework and quizzes using MyMathLab©, and emphasizing interpretation of computer output during lectures. Initially, an online instrument based on Gardner's multiple intelligences theory is given to students to try to identify students' learning styles and intelligence types as covariates. An analysis of covariance is performed in order to compare differences in achievement. In this study there is no attempt to measure difference in student performance across the different treatments. The purpose of this study is to find indications of associations among variables that support the claim that statistics labs could be associated with superior academic achievement in one of these three instructional environments. Also, this study tries to identify individual student characteristics that could be associated with superior academic performance. This study did not find evidence of any individual student characteristics that could be associated with superior achievement. The response variable was computed as percentage of correct answers for the three exams during the semester added together. The results of this study indicate a significant difference across these three different instructional methods, showing significantly higher mean scores for the response variable on students taking the lab component that was carefully coordinated with
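
    The analysis of covariance described above compares section means on the exam-score response after adjusting for the intake covariates. A compact sketch with hypothetical data and variable names:

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical records: exam percentage, course section, and one
    # intake covariate score from the learning-styles instrument.
    df = pd.DataFrame({
        "score":   [62, 71, 68, 80, 84, 77, 90, 88, 93, 75, 69, 85],
        "section": ["no_lab"] * 4 + ["lab_uncoord"] * 4 + ["lab_coord"] * 4,
        "logical": [3.1, 4.0, 3.5, 4.4, 3.8, 3.2,
                    4.1, 3.9, 4.5, 3.0, 2.9, 4.2],
    })

    # ANCOVA: section effect on scores, adjusting for the covariate.
    model = smf.ols("score ~ C(section) + logical", data=df).fit()
    print(model.summary().tables[1])
    ```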

  10. Explorations in Statistics: The Analysis of Change

    Science.gov (United States)

    Curran-Everett, Douglas; Williams, Calvin L.

    2015-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This tenth installment of "Explorations in Statistics" explores the analysis of a potential change in some physiological response. As researchers, we often express absolute change as percent change so we can…

  11. Escherichia coli DinB inhibits replication fork progression without significantly inducing the SOS response.

    Science.gov (United States)

    Mori, Tetsuya; Nakamura, Tatsuro; Okazaki, Naoto; Furukohri, Asako; Maki, Hisaji; Akiyama, Masahiro Tatsumi

    2012-01-01

    The SOS response is readily triggered by replication fork stalling caused by DNA damage or a dysfunctional replicative apparatus in Escherichia coli cells. E. coli dinB encodes DinB DNA polymerase and its expression is upregulated during the SOS response. DinB catalyzes translesion DNA synthesis in place of a replicative DNA polymerase III that is stalled at a DNA lesion. We showed previously that DNA replication was suppressed without exogenous DNA damage in cells overproducing DinB. In this report, we confirm that this was due to a dose-dependent inhibition of ongoing replication forks by DinB. Interestingly, the DinB-overproducing cells did not significantly induce the SOS response even though DNA replication was perturbed. RecA protein is activated by forming a nucleoprotein filament with single-stranded DNA, which leads to the onset of the SOS response. In the DinB-overproducing cells, RecA was not activated to induce the SOS response. However, the SOS response was observed after heat-inducible activation in strain recA441 (encoding a temperature-sensitive RecA) and after replication blockage in strain dnaE486 (encoding a temperature-sensitive catalytic subunit of the replicative DNA polymerase III) at a non-permissive temperature when DinB was overproduced in these cells. Furthermore, since catalytically inactive DinB could avoid the SOS response to a DinB-promoted fork block, it is unlikely that overproduced DinB takes control of primer extension and thus limits single-stranded DNA. These observations suggest that DinB possesses a feature that suppresses DNA replication but does not abolish the cell's capacity to induce the SOS response. We conclude that DinB impedes replication fork progression in a way that does not activate RecA, in contrast to obstructive DNA lesions and dysfunctional replication machinery.

  12. Statistical lamb wave localization based on extreme value theory

    Science.gov (United States)

    Harley, Joel B.

    2018-04-01

    Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of these images typically represents an estimated damage location. Yet, it is often unclear if this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches to assess the statistical significance of guided wave localization images. As a result, we present statistical delay-and-sum and statistical matched field processing localization methods to create statistically significant images of damage. Our framework uses constant-false-alarm-rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75 cm diameter holes. Our results show an expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
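
    The extreme-value ingredient can be sketched independently of the imaging pipeline: because the maximum of a damage-free localization image is approximately Gumbel distributed, a detection threshold with a fixed false-alarm rate can be calibrated from baseline images. A toy version with Gaussian noise standing in for the baseline images:

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(4)

    # Baseline (damage-free) localization images: pixel values are
    # noise, and the image maximum over n pixels is ~ Gumbel.
    baseline_maxima = rng.normal(size=(200, 5000)).max(axis=1)
    loc, scale = gumbel_r.fit(baseline_maxima)

    # Constant-false-alarm-rate threshold at the 1% level: a test
    # image whose maximum exceeds it is a significant detection.
    threshold = gumbel_r.ppf(0.99, loc, scale)
    print(f"99% Gumbel threshold on image maxima: {threshold:.2f}")

    new_image_max = 5.1   # illustrative value from a test image
    print("damage detected:", new_image_max > threshold)
    ```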

  13. Critical Realism and Statistical Methods--A Response to Nash

    Science.gov (United States)

    Scott, David

    2007-01-01

    This article offers a defence of critical realism in the face of objections Nash (2005) makes to it in a recent edition of this journal. It is argued that critical and scientific realisms are closely related and that both are opposed to statistical positivism. However, the suggestion is made that scientific realism retains (from statistical…

  14. Advanced statistical methods in data science

    CERN Document Server

    Chen, Jiahua; Lu, Xuewen; Yi, Grace; Yu, Hao

    2016-01-01

    This book gathers invited presentations from the 2nd Symposium of the ICSA-CANADA Chapter held at the University of Calgary from August 4-6, 2015. The aim of this Symposium was to promote advanced statistical methods in big-data sciences, to allow researchers to exchange ideas on statistics and data science, and to embrace the challenges and opportunities of statistics and data science in the modern world. It addresses diverse themes in advanced statistical analysis in big-data sciences, including methods for administrative data analysis, survival data analysis, missing data analysis, high-dimensional and genetic data analysis, longitudinal and functional data analysis, the design and analysis of studies with response-dependent and multi-phase designs, time series and robust statistics, and statistical inference based on likelihood, empirical likelihood and estimating functions. The editorial group selected 14 high-quality presentations from this successful symposium and invited the presenters to prepare a fu...

  15. The significance of Good Chair as part of children’s school and home environment in the preventive treatment of body posture statics disorders

    OpenAIRE

    Mirosław Mrozkowiak; Hanna Żukowska

    2015-01-01

    Mrozkowiak Mirosław, Żukowska Hanna. Znaczenie Dobrego Krzesła, jako elementu szkolnego i domowego środowiska ucznia, w profilaktyce zaburzeń statyki postawy ciała = The significance of Good Chair as part of children’s school and home environment in the preventive treatment of body posture statics disorders. Journal of Education, Health and Sport. 2015;5(7):179-215. ISSN 2391-8306. DOI 10.5281/zenodo.19832 http://ojs.ukw.edu.pl/index.php/johs/article/view/2015%3B5%287%29%3A179-215 https:...

  16. Can a significance test be genuinely Bayesian?

    OpenAIRE

    Pereira, Carlos A. de B.; Stern, Julio Michael; Wechsler, Sergio

    2008-01-01

    The Full Bayesian Significance Test, FBST, is extensively reviewed. Its test statistic, a genuine Bayesian measure of evidence, is discussed in detail. Its behavior in some problems of statistical inference like testing for independence in contingency tables is discussed.

  17. Beyond imperviousness: A statistical approach to identifying functional differences between development morphologies on variable source area-type response in urbanized watersheds

    Science.gov (United States)

    Lim, T. C.

    2016-12-01

    Empirical evidence has shown linkages between urbanization, hydrological regime change, and degradation of water quality and aquatic habitat. Percent imperviousness has long been suggested as the dominant source of these negative changes. However, recent research identifying alternative pathways of runoff production at the watershed scale has called into question percent impervious surface area's primacy in urban runoff production compared to other aspects of urbanization, including change in vegetative cover, imported water and water leakages, and the presence of drainage infrastructure. In this research I show how a robust statistical methodology can detect evidence of variable source area (VSA)-type hydrologic response associated with incremental hydraulic connectivity in watersheds. I then use logistic regression to explore how evidence of VSA-type response relates to the physical and meteorological characteristics of the watershed. I find that impervious surface area is highly correlated with development, but does not add significant explanatory power beyond percent developed in predicting VSA-type response. Other aspects of development morphology, including percent developed open space and type of drainage infrastructure, also do not add to the explanatory power of undeveloped land in predicting VSA-type response. Within only developed areas, the effect of developed open space was found to be more similar to that of total impervious area than to undeveloped land. These findings were consistent when tested across a national cross-section of urbanized watersheds, a higher resolution dataset of Baltimore Metropolitan Area watersheds, and a subsample of watersheds confirmed not to be served by combined sewer systems. These findings suggest that land development policies that focus on lot coverage should be revisited, and more focus should be placed on preserving native vegetation and soil conditions alongside development.
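
    The "does imperviousness add explanatory power beyond development?" question is a nested-model comparison. A sketch on synthetic watershed data with hypothetical variable names, using a likelihood-ratio test between two logistic regressions:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy.stats import chi2

    rng = np.random.default_rng(10)
    n = 300

    # Hypothetical watershed attributes; imperviousness tracks development.
    developed = rng.uniform(0, 100, n)
    impervious = np.clip(0.4 * developed + rng.normal(0, 5, n), 0, 100)
    precip = rng.normal(1000, 150, n)

    # Simulate detection of VSA-type response driven by undeveloped land.
    p = 1 / (1 + np.exp(-(1.5 - 0.04 * developed)))
    df = pd.DataFrame(dict(vsa=rng.binomial(1, p), developed=developed,
                           impervious=impervious, precip=precip))

    # Does imperviousness add explanatory power beyond development?
    base = smf.logit("vsa ~ developed + precip", df).fit(disp=0)
    full = smf.logit("vsa ~ developed + impervious + precip", df).fit(disp=0)
    lr = 2 * (full.llf - base.llf)            # likelihood-ratio statistic
    print(f"LR = {lr:.2f}, p = {chi2.sf(lr, 1):.3f}")   # typically no gain
    ```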

  18. Computational algebraic geometry for statistical modeling FY09Q2 progress.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, David C.; Rojas, Joseph Maurice; Pebay, Philippe Pierre

    2009-03-01

    This is a progress report on polynomial system solving for statistical modeling. This quarter we have developed our first model of shock response data and an algorithm for identifying the chamber cone containing a polynomial system in n variables with n+k terms within polynomial time - a significant improvement over previous algorithms, all having exponential worst-case complexity. We have implemented and verified the chamber cone algorithm for n+3 and are working to extend the implementation to handle arbitrary k. Later sections of this report explain chamber cones in more detail; the next section provides an overview of the project and how the current progress fits into it.

  19. Possible uses of animal databases for further statistical evaluation and modeling

    International Nuclear Information System (INIS)

    Griffith, W.C.; Boecker, B.B.; Gerber, G.B.

    1995-01-01

    Many studies have been performed in animals which mimic potential exposures of people in order to understand how factors modify radiation dose-response relationships. Cooperative analyses by investigators in different laboratories have a large potential for strengthening the conclusions that can be drawn from individual studies. When information on each animal is combined, formal tests can be made to demonstrate that apparent consistencies or inconsistencies are statistically significant. Statistical methods must be carefully chosen so that differences between laboratories or studies can be controlled or described as part of the analysis in the interpretation of the conclusions. In this report, bone cancer is used as an example because of the large number of studies of modifying factors for bone cancer available from US and European laboratories

  20. Empirical Tryout of a New Statistic for Detecting Temporally Inconsistent Responders.

    Science.gov (United States)

    Kerry, Matthew J

    2018-01-01

    Statistical screening of self-report data is often advised to support the quality of analyzed responses, for example, by reducing insufficient effort responding (IER). One recently introduced index based on Mahalanobis's D for detecting outliers in cross-sectional designs replaces centered scores with difference scores between repeated-measure items, termed person temporal consistency (D²ptc). Although the adapted D²ptc index demonstrated usefulness in simulation datasets, it has not been applied to empirical data. The current study addresses D²ptc's low uptake by critically appraising its performance across three empirical applications. Independent samples were selected to represent a range of scenarios commonly encountered by organizational researchers. First, in Sample 1, a repeat-measure of future time perspective (FTP) in experienced working adults (age >40 years; n = 620) indicated that temporal inconsistency was significantly related to respondent age and item reverse-scoring. Second, in repeat-measure team efficacy aggregations, D²ptc successfully detected team-level inconsistency across repeat-performance cycles. Thirdly, in an experimental study dataset of subjective life expectancy, the index indicated significantly more stable responding in experimental conditions compared to controls. The empirical findings support D²ptc's flexible and useful application to distinct study designs. Discussion centers on current limitations and further extensions that may be of value to psychologists screening self-report data for strengthening response quality and meaningfulness of inferences from repeated-measures self-reports. Taken together, the findings support the usefulness of the newly devised statistic for detecting IER and other extreme response patterns.
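
    An implementation sketch of the index as described: compute item-level difference scores between the two occasions, then each respondent's squared Mahalanobis distance within the difference-score space (details such as the covariance estimator are assumptions here):

    ```python
    import numpy as np

    def d2_ptc(time1, time2):
        """Person temporal consistency index: squared Mahalanobis
        distance on item-level difference scores between two
        occasions. Large values flag temporally inconsistent
        responders (a sketch of the D2ptc idea described above)."""
        d = np.asarray(time2, float) - np.asarray(time1, float)
        centered = d - d.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(d, rowvar=False))
        return np.einsum("ij,jk,ik->i", centered, cov_inv, centered)

    rng = np.random.default_rng(5)
    t1 = rng.integers(1, 6, size=(200, 8)).astype(float)
    t2 = t1 + rng.normal(scale=0.5, size=t1.shape)  # mostly consistent
    t2[0] = 6 - t1[0]                               # one reversed responder
    scores = d2_ptc(t1, t2)
    print(scores[0], np.percentile(scores, 95))     # responder 0 stands out
    ```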

  1. Statistical Analysis of Compressive and Flexural Test Results on the Sustainable Adobe Reinforced with Steel Wire Mesh

    Science.gov (United States)

    Jokhio, Gul A.; Syed Mohsin, Sharifah M.; Gul, Yasmeen

    2018-04-01

    It has been established that Adobe provides, in addition to being sustainable and economical, better indoor air quality without the extensive energy expenditure of modern synthetic materials. The material, however, suffers from weak structural behaviour when subjected to adverse loading conditions. A wide range of mechanical properties has been reported in the literature owing to a lack of research and standardization. The present paper presents the statistical analysis of the results that were obtained through compressive and flexural tests on Adobe samples. Adobe specimens with and without wire mesh reinforcement were tested and the results were reported. The statistical analysis of these results yields several noteworthy findings. It has been found that the compressive strength of Adobe increases by about 43% after adding a single layer of wire mesh reinforcement. This increase is statistically significant. The flexural response of Adobe has also shown improvement with the addition of wire mesh reinforcement; however, the statistical significance of this improvement could not be established.
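
    The abstract does not report the raw strengths or the test used; the sketch below shows the kind of two-sample comparison that supports such a claim, with invented values of roughly the reported magnitude:

    ```python
    import numpy as np
    from scipy.stats import ttest_ind

    # Illustrative compressive strengths (MPa); the actual measured
    # values are not given in the abstract.
    plain      = np.array([1.8, 2.1, 1.9, 2.0, 1.7, 2.2])
    reinforced = np.array([2.7, 2.9, 2.8, 3.1, 2.6, 3.0])

    t, p = ttest_ind(reinforced, plain, equal_var=False)  # Welch's t-test
    gain = reinforced.mean() / plain.mean() - 1
    print(f"mean gain = {gain:.0%}, t = {t:.2f}, p = {p:.4f}")
    ```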

  2. Variability of Neuronal Responses: Types and Functional Significance in Neuroplasticity and Neural Darwinism.

    Science.gov (United States)

    Chervyakov, Alexander V; Sinitsyn, Dmitry O; Piradov, Michael A

    2016-01-01

    HIGHLIGHTS: We suggest classifying variability of neuronal responses as follows: false (associated with a lack of knowledge about the influential factors), "genuine harmful" (noise), "genuine neutral" (synonyms, repeats), and "genuine useful" (the basis of neuroplasticity and learning). The genuine neutral variability is considered in terms of the phenomenon of degeneracy. Of particular importance is the genuine useful variability, which is considered as a potential basis for neuroplasticity and learning. This type of variability is considered in terms of the neural Darwinism theory. In many cases, neural signals detected under the same external experimental conditions significantly change from trial to trial. The variability phenomenon, which complicates extraction of reproducible results and is ignored in many studies by averaging, has attracted the attention of researchers in recent years. In this paper, we classify possible types of variability based on its functional significance and describe features of each type. We describe the key adaptive significance of variability at the neural network level and the degeneracy phenomenon that may be important for learning processes in connection with the principle of neuronal group selection.

  3. Clinical Significance of Immunophenotypic Markers in Pediatric T-cell Acute Lymphoblastic Leukemia

    International Nuclear Information System (INIS)

    SIDHOM, I.; SHAABAN, Kh.; SOLIMAN, S.; HAMDY, N.; YASSIN, D.; SALEM, Sh.; HASSANEIN, H.; MANSOUR, M.T.; EZZAT, S.; EL-ANWAR, W.

    2008-01-01

    Background: Cell-marker profiling has led to conflicting conclusions about its prognostic significance in T-ALL. Aim: To investigate the prevalence of the expression of CD34, CD10 and myeloid-associated antigens (CD13/CD33) in childhood T-ALL and to relate their presence to initial clinical and biologic features and early response to therapy. Patients and Methods: This study included 67 consecutive patients with newly diagnosed T-ALL recruited from the Children's Cancer Hospital in Egypt during the time period from July 2007 to June 2008. Immunophenotypic markers and minimal residual disease (MRD) were studied by five-color flow cytometry. Results: The frequency of CD34 was 34.9%, CD10 33.3%, while CD13/CD33 was 18.8%. No significant association was encountered between CD34, CD10 or myeloid antigen positivity and the presenting clinical features such as age, sex, TLC and CNS leukemia. Only CD10+ expression had a significant association with initial CNS involvement (p=0.039). CD34 and CD13/CD33 expression was significantly associated with T-cell maturation stages (p<0.05). No relationship was observed for age, TLC, gender, NCI risk or CNS involvement with early response to therapy as reflected by BM as well as MRD on day 15 and day 42. CD34+, CD13/CD33+ and early T-cell stage had high MRD levels on day 15 that were statistically highly significant (p<0.01), but CD10+ had a statistically significant lower MRD level on day 15 (p=0.049). However, only CD34 retained its significance at an MRD cut-off level of 0.01%. Conclusion: CD34, CD10, CD13/CD33 expression, as well as T-cell maturation stages, may have prognostic significance in pediatric T-ALL as they have a significant impact on early clearance of leukemic cells detected by MRD on day 15.

  4. Characterizing Financial and Statistical Literacy

    DEFF Research Database (Denmark)

    Di Girolamo, Amalia; Harrison, Glenn W.; Lau, Morten

    We characterize the literacy of an individual in a domain by their elicited subjective belief distribution over the possible responses to a question posed in that domain. We consider literacy across several financial, economic and statistical domains. We find considerable demographic heterogeneity...

  5. The Significance of Community to Business Social Responsibility.

    Science.gov (United States)

    Besser, Terry L.

    1998-01-01

    Interviews with 1008 business owners and managers in 30 small Iowa communities found that the majority were committed to their community and provided support to youth programs, local schools, or community development activities. Business social responsibility was related to operator age, education, success, and perceptions of community collective…

  6. Statistics for experimentalists

    CERN Document Server

    Cooper, B E

    2014-01-01

    Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and search approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...

  7. Projection operator techniques in nonequilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Grabert, H.

    1982-01-01

    This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics the Fokker-Planck and the master equation approach are described together with the response theory. Then, as applications the damped harmonic oscillator, simple fluids, and the spin relaxation are considered. (HSI)

  8. Statistical modelling of transcript profiles of differentially regulated genes

    Directory of Open Access Journals (Sweden)

    Sergeant Martin J

    2008-07-01

    Background: The vast quantities of gene expression profiling data produced in microarray studies, and the more precise quantitative PCR, are often not statistically analysed to their full potential. Previous studies have summarised gene expression profiles using simple descriptive statistics, basic analysis of variance (ANOVA) and the clustering of genes based on simple models fitted to their expression profiles over time. We report the novel application of statistical non-linear regression modelling techniques to describe the shapes of expression profiles for the fungus Agaricus bisporus, quantified by PCR, and for E. coli and Rattus norvegicus, using microarray technology. The use of parametric non-linear regression models provides a more precise description of expression profiles, reducing the "noise" of the raw data to produce a clear "signal" given by the fitted curve, and describing each profile with a small number of biologically interpretable parameters. This approach then allows the direct comparison and clustering of the shapes of response patterns between genes and potentially enables a greater exploration and interpretation of the biological processes driving gene expression. Results: Quantitative reverse transcriptase PCR-derived time-course data of genes were modelled. "Split-line" or "broken-stick" regression identified the initial time of gene up-regulation, enabling the classification of genes into those with primary and secondary responses. Five-day profiles were modelled using the biologically-oriented, critical exponential curve, y(t) = A + (B + Ct)R^t + ε. This non-linear regression approach allowed the expression patterns for different genes to be compared in terms of curve shape, time of maximal transcript level and the decline and asymptotic response levels. Three distinct regulatory patterns were identified for the five genes studied. Applying the regression modelling approach to microarray-derived time course data
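
    The critical exponential curve can be fitted directly with non-linear least squares. A self-contained sketch on synthetic data (parameter values are invented; the original analysis used specialist statistical software):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def critical_exponential(t, A, B, C, R):
        """y(t) = A + (B + C*t) * R**t, the curve quoted above."""
        return A + (B + C * t) * R ** t

    # Illustrative qPCR-style five-day time course with noise.
    t = np.linspace(0, 5, 11)
    rng = np.random.default_rng(6)
    y = critical_exponential(t, 1.0, -0.5, 4.0, 0.45) \
        + rng.normal(0, 0.05, t.size)

    params, cov = curve_fit(critical_exponential, t, y,
                            p0=(1.0, 0.0, 1.0, 0.5), maxfev=10000)
    A, B, C, R = params
    print(f"A={A:.2f}  B={B:.2f}  C={C:.2f}  R={R:.2f}")

    # One biologically interpretable summary: the time of maximal
    # expression solves d/dt[(B + C*t)R^t] = 0, i.e. t* = -1/ln(R) - B/C.
    t_star = -1.0 / np.log(R) - B / C
    print(f"time of maximal transcript level ~ {t_star:.2f} days")
    ```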

  9. A preliminary study for investigating idiopathic normal pressure hydrocephalus by means of statistical parameters classification of intracranial pressure recordings.

    Science.gov (United States)

    Calisto, A; Bramanti, A; Galeano, M; Angileri, F; Campobello, G; Serrano, S; Azzerboni, B

    2009-01-01

    The objective of this study is to investigate Idiopathic Normal Pressure Hydrocephalus (INPH) through a multidimensional and multiparameter analysis of statistical data obtained from accurate analysis of Intracranial Pressure (ICP) recordings. Such a study could permit the detection of new factors, correlated with therapeutic response, which are able to establish the predictive significance of the infusion test. The algorithm developed by the authors computes 13 ICP parameter trends on each of the recordings; afterwards, 9 statistical descriptors are determined from each trend. All data are transferred to the data-mining software WEKA. According to the exploited feature-selection techniques, WEKA revealed that the most significant statistical parameter is the maximum of the Single-Wave-Amplitude; setting a 27 mmHg threshold leads to over 90% correct classification.

  10. Multiple-Choice versus Constructed-Response Tests in the Assessment of Mathematics Computation Skills.

    Science.gov (United States)

    Gadalla, Tahany M.

    The equivalence of multiple-choice (MC) and constructed-response discrete (CR-D) response formats as applied to mathematics computation at grade levels two to six was tested. The difference between total scores from the two response formats was tested for statistical significance, and the factor structure of items in both response formats was…

  11. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,

  12. Significance of novel bioinorganic anodic aluminum oxide nanoscaffolds for promoting cellular response

    Directory of Open Access Journals (Sweden)

    Gérrard Eddy Jai Poinern

    2011-01-01

    Full Text Available Gérrard Eddy Jai Poinern, Robert Shackleton, Shariful Islam Mamun, Derek Fawcett; Murdoch Applied Nanotechnology Research Group, Department of Physics, Energy Studies and Nanotechnology, School of Engineering and Energy, Murdoch University, Murdoch, Western Australia, Australia. Abstract: Tissue engineering is a multidisciplinary field that can directly benefit from the many advancements in nanotechnology and nanoscience. This article reviews a novel biocompatible anodic aluminum oxide (AAO, alumina) membrane in terms of tissue engineering. Cells respond and interact with their natural environment, the extracellular matrix, and the landscape of the substrate. The interaction with the topographical features of the landscape occurs both in the micrometer and nanoscales. If all these parameters are favorable to the cell, the cell will respond in terms of adhesion, proliferation, and migration. The role of the substrate/scaffold is crucial in soliciting a favorable response from the cell. The size and type of surface feature can directly influence the response and behavior of the cell. In the case of using an AAO membrane, the surface features and porosity of the membrane can be dictated at the nanoscale during the manufacturing stage. This is achieved by using general laboratory equipment to perform a relatively straightforward electrochemical process. During this technique, changing the operational parameters of the process directly controls the nanoscale features produced. For example, the pore size, pore density and, hence, porosity can be effectively controlled during the synthesis of the AAO membrane. In addition, being able to control the pore size and porosity of a biomaterial such as AAO significantly broadens its application in tissue engineering. Keywords: anodic aluminum oxide, nanoscaffolds, cellular response, tissue engineering

  13. Statistics Using Just One Formula

    Science.gov (United States)

    Rosenthal, Jeffrey S.

    2018-01-01

    This article advocates that introductory statistics be taught by basing all calculations on a single simple margin-of-error formula and deriving all of the standard introductory statistical concepts (confidence intervals, significance tests, comparisons of means and proportions, etc.) from that one formula. It is argued that this approach will…
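
    As a hedged sketch of the idea (the article's exact formula may differ), take the margin of error to be twice the standard error of the mean; confidence intervals and significance tests then both fall out of it:

        import math

        def margin_of_error(sd, n):
            # ~95% margin of error: twice the standard error of the mean
            return 2 * sd / math.sqrt(n)

        xbar, sd, n = 5.2, 1.8, 40              # hypothetical sample summary
        moe = margin_of_error(sd, n)
        ci = (xbar - moe, xbar + moe)           # 95% confidence interval
        rejected = not (ci[0] <= 5.0 <= ci[1])  # significance test of H0: mu = 5
        print(ci, rejected)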

  14. Statistics Anxiety among Postgraduate Students

    Science.gov (United States)

    Koh, Denise; Zawi, Mohd Khairi

    2014-01-01

    Most postgraduate programmes that have research components require students to take at least one course in research statistics. Not all postgraduate programmes are science based; a significant number of postgraduate students from the social sciences will be taking statistics courses as they try to complete their…

  15. The l_z(p)* Person-Fit Statistic in an Unfolding Model Context.

    Science.gov (United States)

    Tendeiro, Jorge N

    2017-01-01

    Although person-fit analysis has a long-standing tradition within item response theory, it has been applied in combination with dominance response models almost exclusively. In this article, a popular log likelihood-based parametric person-fit statistic under the framework of the generalized graded unfolding model is used. Results from a simulation study indicate that the person-fit statistic performed relatively well in detecting midpoint response style patterns and not so well in detecting extreme response style patterns.

  16. Attitude of teaching faculty towards statistics at a medical university in Karachi, Pakistan.

    Science.gov (United States)

    Khan, Nazeer; Mumtaz, Yasmin

    2009-01-01

    Statistics is mainly used in biological research to verify clinicians' and researchers' findings and impressions, and it gives scientific validity to their inferences. In Pakistan, the educational curriculum is structured so that students who wish to enter the biological sciences do not study mathematics after grade 10. Due to this fragile background in mathematical skills, Pakistani medical professionals feel that they lack an adequate base for understanding basic statistical concepts when they try to use them in their research or read a scientific article. The aim of the study was to assess the attitude of medical faculty towards statistics. A questionnaire containing 42 close-ended and 4 open-ended questions, related to attitudes towards and knowledge of statistics, was distributed among the teaching faculty of Dow University of Health Sciences (DUHS). One hundred and sixty-seven completed questionnaires were returned from 374 faculty members (response rate 44.7%). Forty-three percent of the respondents claimed that they had taken 'introductory' level statistics courses, 63% strongly agreed that a good researcher must have some training in statistics, and 82% of the faculty agreed or strongly agreed that statistics is really useful for research. Only 17% correctly stated that statistics is the science of uncertainty. Half of the respondents admitted having problems writing the statistical section of an article. 64% indicated that the way statistics is taught is the main reason it is perceived as difficult. 53% indicated that co-authorship for a statistician should depend upon his/her contribution to the study. Gender did not show any significant difference among the responses. However, senior faculty attached greater importance to the use of statistics and reported greater difficulty in writing the results section of

  17. ANALYSIS OF SUPPLY RESPONSE AND PRICE RISK ON RICE PRODUCTION IN NIGERIA

    Directory of Open Access Journals (Sweden)

    Opeyemi Eyitayo Ayinde

    2017-03-01

    Full Text Available Nigeria, like most African countries, has engaged in agricultural liberalization since 1986 in the hope that reforms emphasizing price incentives will encourage producers to respond. Thus far, the reforms seem to have introduced greater uncertainty into the market, given increasing rates of price volatility. This study therefore seeks, among other things, to determine and model the responsiveness of rice supply to price risk in Nigeria. Statistical information on domestic and imported quantities of rice was obtained for 41 years (1970 to 2011) from various sources, such as the Food and Agriculture Organization (FAO) database, Federal Ministry of Agriculture statistical bulletins, Central Bank of Nigeria statistical bulletins and the National Bureau of Statistics (NBS). Data were analyzed using an equilibrium output supply function, co-integration models, and a vector autoregressive distributed lag model. Rice importation was statistically significant, and changes in output were also responsive to changes in price. The results indicate that producers are responsive not only to price and non-price factors but also to price risk and the exchange rate. It is therefore imperative to reduce the effects of price risk so as to strengthen producers' supply response by bridging the gap in production.

  18. Stress and fear responses in the teleost pallium

    DEFF Research Database (Denmark)

    Silva, Patricia Isabel da Mota E.; Martins, C.I.M.; Khan, Uniza Wahid

    2015-01-01

    Evolution has resulted in behavioural responses to threat which show extensive similarities between different animal species. The reaction to predator cues is one example of such prevailing responses, and functional homologies to mammalian limbic regions involved in threat-sensitive behaviour hav...... to chemical alarm cues, but this effect did not reach the level of statistical significance. Hence, limbic responses to stress and fear, akin to those seen in extant mammals, are also present in the teleost lineage...

  19. Quantitative Analysis of Repertoire-Scale Immunoglobulin Properties in Vaccine-Induced B-Cell Responses

    Directory of Open Access Journals (Sweden)

    Ilja V. Khavrutskii

    2017-08-01

    Full Text Available Recent advances in the next-generation sequencing of B-cell receptors (BCRs) enable the characterization of humoral responses at a repertoire-wide scale and provide the capability for identifying unique features of immune repertoires in response to disease, vaccination, or infection. Immunosequencing now readily generates 10^3–10^5 sequences per sample; however, statistical analysis of these repertoires is challenging because of the high genetic diversity of BCRs and the elaborate clonal relationships among them. To date, most immunosequencing analyses have focused on reporting qualitative trends in immunoglobulin (Ig) properties, such as usage or somatic hypermutation (SHM) percentage of the Ig heavy chain variable (IGHV) gene segment family, and on reducing complex Ig property distributions to simple summary statistics. However, because Ig properties are typically not normally distributed, any approach that fails to assess the distribution as a whole may be inadequate in (1) properly assessing the statistical significance of repertoire differences, (2) identifying how two repertoires differ, and (3) determining appropriate confidence intervals for assessing the size of the differences and their potential biological relevance. To address these issues, we have developed a technique that uses Wilcox's robust statistics toolbox to identify statistically significant vaccine-specific differences between Ig repertoire properties. The advantage of this technique is that it can determine not only whether but also where the distributions differ, even when the Ig repertoire properties are non-normally distributed. We used this technique to characterize murine germinal center (GC) B-cell repertoires in response to a complex Ebola virus-like particle (eVLP) vaccine candidate with known protective efficacy. The eVLP-mediated GC B-cell responses were highly diverse, consisting of thousands of clonotypes. Despite this staggering diversity, we identified statistically
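
    As a hedged sketch of what a robust, Wilcox-style comparison can look like (a percentile-bootstrap interval for the difference in 20% trimmed means; the paper's actual procedure and data differ):

        import numpy as np
        from scipy.stats import trim_mean

        def trimmed_diff_ci(x, y, trim=0.2, n_boot=2000, alpha=0.05, seed=0):
            # Percentile-bootstrap CI for the difference in trimmed means
            rng = np.random.default_rng(seed)
            diffs = [trim_mean(rng.choice(x, x.size), trim)
                     - trim_mean(rng.choice(y, y.size), trim)
                     for _ in range(n_boot)]
            return tuple(np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

        rng = np.random.default_rng(1)
        shm_a = rng.gamma(2.0, 2.0, size=300)   # simulated SHM% in repertoire A
        shm_b = rng.gamma(2.0, 2.4, size=300)   # simulated SHM% in repertoire B
        lo, hi = trimmed_diff_ci(shm_a, shm_b)
        print(lo, hi, "differ" if lo > 0 or hi < 0 else "no clear difference")

    An interval that excludes zero gives both a significance judgement and an estimate of the size of the difference, which is the kind of confidence-interval information the abstract argues simple summary statistics cannot provide.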

  20. Rapid Statistical Learning Supporting Word Extraction From Continuous Speech.

    Science.gov (United States)

    Batterink, Laura J

    2017-07-01

    The identification of words in continuous speech, known as speech segmentation, is a critical early step in language acquisition. This process is partially supported by statistical learning, the ability to extract patterns from the environment. Given that speech segmentation represents a potential bottleneck for language acquisition, patterns in speech may be extracted very rapidly, without extensive exposure. This hypothesis was examined by exposing participants to continuous speech streams composed of novel repeating nonsense words. Learning was measured on-line using a reaction time task. After merely one exposure to an embedded novel word, learners demonstrated significant learning effects, as revealed by faster responses to predictable than to unpredictable syllables. These results demonstrate that learners gained sensitivity to the statistical structure of unfamiliar speech on a very rapid timescale. This ability may play an essential role in early stages of language acquisition, allowing learners to rapidly identify word candidates and "break in" to an unfamiliar language.

  1. Investigation of Millennial Students' Responses to a Shelter-in-Place Experience

    Science.gov (United States)

    Johnson, Thomas C.; Frick, Melodie H.

    2016-01-01

    This study investigated millennial students' responses to an armed gunman threat and shelter-in-place warnings that occurred on a university campus. Using descriptive statistics and quantitative analysis, several significant differences were found in students' responses to sheltering in place and engaging in protective behaviors. Baxter Magolda'…

  2. Confidence Intervals: From tests of statistical significance to confidence intervals, range hypotheses and substantial effects

    Directory of Open Access Journals (Sweden)

    Dominic Beaulieu-Prévost

    2006-03-01

    Full Text Available For the last 50 years of research in quantitative social sciences, the empirical evaluation of scientific hypotheses has been based on the rejection or not of the null hypothesis. However, more than 300 articles demonstrated that this method was problematic. In summary, null hypothesis testing (NHT) is unfalsifiable, its results depend directly on sample size and the null hypothesis is both improbable and not plausible. Consequently, alternatives to NHT such as confidence intervals (CI) and measures of effect size are starting to be used in scientific publications. The purpose of this article is, first, to provide the conceptual tools necessary to implement an approach based on confidence intervals, and second, to briefly demonstrate why such an approach is an interesting alternative to an approach based on NHT. As demonstrated in the article, the proposed CI approach avoids most problems related to a NHT approach and can often improve the scientific and contextual relevance of the statistical interpretations by testing range hypotheses instead of a point hypothesis and by defining the minimal value of a substantial effect. The main advantage of such a CI approach is that it replaces the notion of statistical power by an easily interpretable three-value logic (probable presence of a substantial effect, probable absence of a substantial effect and probabilistic undetermination). The demonstration includes a complete example.
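
    The three-value logic is easy to make concrete. A minimal sketch, with delta standing for the reader's chosen minimal substantial effect (names and thresholds are illustrative):

        def interpret_ci(lower, upper, delta):
            """Classify an effect from its CI and a minimal substantial value delta."""
            if lower >= delta or upper <= -delta:
                return "probable presence of a substantial effect"
            if -delta < lower and upper < delta:
                return "probable absence of a substantial effect"
            return "probabilistic undetermination"

        print(interpret_ci(0.35, 0.90, delta=0.20))   # probable presence
        print(interpret_ci(-0.05, 0.10, delta=0.20))  # probable absence
        print(interpret_ci(-0.10, 0.60, delta=0.20))  # undetermination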

  3. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
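
    Several of the quantities the review covers can be computed from a single 2x2 confusion matrix. A small illustrative sketch (the counts are hypothetical):

        def diagnostic_summary(tp, fp, fn, tn):
            sensitivity = tp / (tp + fn)               # true-positive rate
            specificity = tn / (tn + fp)               # true-negative rate
            accuracy = (tp + tn) / (tp + fp + fn + tn)
            lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
            lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
            return sensitivity, specificity, accuracy, lr_pos, lr_neg

        print(diagnostic_summary(tp=90, fp=20, fn=10, tn=180))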

  4. Preventing statistical errors in scientific journals.

    NARCIS (Netherlands)

    Nuijten, M.B.

    2016-01-01

    There is evidence for a high prevalence of statistical reporting errors in psychology and other scientific fields. These errors display a systematic preference for statistically significant results, distorting the scientific literature. There are several possible causes for this systematic error

  5. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...

  6. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Using exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamic potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.

  7. Race, Sex, and Their Influences on Introductory Statistics Education

    Science.gov (United States)

    van Es, Cindy; Weaver, Michelle M.

    2018-01-01

    The Survey of Attitudes Toward Statistics or SATS was administered for three consecutive years to students in an Introductory Statistics course at Cornell University. Questions requesting demographic information and expected final course grade were added. Responses were analyzed to investigate possible differences between sexes and racial/ethnic…

  8. Significance of novel bioinorganic anodic aluminum oxide nanoscaffolds for promoting cellular response

    Science.gov (United States)

    Poinern, Gérrard Eddy Jai; Shackleton, Robert; Mamun, Shariful Islam; Fawcett, Derek

    2011-01-01

    Tissue engineering is a multidisciplinary field that can directly benefit from the many advancements in nanotechnology and nanoscience. This article reviews a novel biocompatible anodic aluminum oxide (AAO, alumina) membrane in terms of tissue engineering. Cells respond and interact with their natural environment, the extracellular matrix, and the landscape of the substrate. The interaction with the topographical features of the landscape occurs both in the micrometer and nanoscales. If all these parameters are favorable to the cell, the cell will respond in terms of adhesion, proliferation, and migration. The role of the substrate/scaffold is crucial in soliciting a favorable response from the cell. The size and type of surface feature can directly influence the response and behavior of the cell. In the case of using an AAO membrane, the surface features and porosity of the membrane can be dictated at the nanoscale during the manufacturing stage. This is achieved by using general laboratory equipment to perform a relatively straightforward electrochemical process. During this technique, changing the operational parameters of the process directly controls the nanoscale features produced. For example, the pore size, pore density, and, hence, porosity can be effectively controlled during the synthesis of the AAO membrane. In addition, being able to control the pore size and porosity of a biomaterial such as AAO significantly broadens its application in tissue engineering. PMID:24198483

  9. Significance of novel bioinorganic anodic aluminum oxide nanoscaffolds for promoting cellular response.

    Science.gov (United States)

    Poinern, Gérrard Eddy Jai; Shackleton, Robert; Mamun, Shariful Islam; Fawcett, Derek

    2011-01-14

    Tissue engineering is a multidisciplinary field that can directly benefit from the many advancements in nanotechnology and nanoscience. This article reviews a novel biocompatible anodic aluminum oxide (AAO, alumina) membrane in terms of tissue engineering. Cells respond and interact with their natural environment, the extracellular matrix, and the landscape of the substrate. The interaction with the topographical features of the landscape occurs both in the micrometer and nanoscales. If all these parameters are favorable to the cell, the cell will respond in terms of adhesion, proliferation, and migration. The role of the substrate/scaffold is crucial in soliciting a favorable response from the cell. The size and type of surface feature can directly influence the response and behavior of the cell. In the case of using an AAO membrane, the surface features and porosity of the membrane can be dictated at the nanoscale during the manufacturing stage. This is achieved by using general laboratory equipment to perform a relatively straightforward electrochemical process. During this technique, changing the operational parameters of the process directly controls the nanoscale features produced. For example, the pore size, pore density, and, hence, porosity can be effectively controlled during the synthesis of the AAO membrane. In addition, being able to control the pore size and porosity of a biomaterial such as AAO significantly broadens its application in tissue engineering.

  10. Predicting Statistical Distributions of Footbridge Vibrations

    DEFF Research Database (Denmark)

    Pedersen, Lars; Frier, Christian

    2009-01-01

    The paper considers vibration response of footbridges to pedestrian loading. Employing Newmark and Monte Carlo simulation methods, a statistical distribution of bridge vibration levels is calculated modelling walking parameters such as step frequency and stride length as random variables...

  11. The relationship between procrastination, learning strategies and statistics anxiety among Iranian college students: a canonical correlation analysis.

    Science.gov (United States)

    Vahedi, Shahrum; Farrokhi, Farahman; Gahramani, Farahnaz; Issazadegan, Ali

    2012-01-01

    Approximately 66-80% of graduate students experience statistics anxiety, and some researchers propose that many students identify statistics courses as the most anxiety-inducing courses in their academic curriculums. As such, it is likely that statistics anxiety is, in part, responsible for many students delaying enrollment in these courses for as long as possible. This paper proposes a canonical model treating academic procrastination (AP) and learning strategies (LS) as predictor variables and statistics anxiety (SA) as the explained variable. A questionnaire survey was used for data collection, and 246 female college students participated in this study. To examine the mutually independent relations between the procrastination, learning-strategy and statistics-anxiety variables, a canonical correlation analysis was computed. Findings show that two canonical functions were statistically significant. The set of variables (metacognitive self-regulation, source management, preparing homework, preparing for tests and preparing term papers) helped predict changes in statistics anxiety with respect to fearful behavior, attitude towards math and class, and performance, but not anxiety. These findings could be used in educational and psychological interventions in the context of statistics anxiety reduction.
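
    A canonical correlation analysis of this kind can be reproduced in outline with standard libraries. A hedged sketch on simulated data (the variable names stand in for the study's scales and are not its data):

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        n = 246
        X = rng.normal(size=(n, 5))                   # predictor set: AP and LS scores
        Y = 0.4 * X[:, :3] + rng.normal(size=(n, 3))  # anxiety subscales, correlated with X

        cca = CCA(n_components=2)
        Xc, Yc = cca.fit_transform(X, Y)
        canonical_corrs = [np.corrcoef(Xc[:, i], Yc[:, i])[0, 1] for i in range(2)]
        print(canonical_corrs)  # strength of the two canonical functions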

  12. Gene Expression Programs in Response to Hypoxia: Cell Type Specificity and Prognostic Significance in Human Cancers.

    Directory of Open Access Journals (Sweden)

    2006-01-01

    Full Text Available BACKGROUND: Inadequate oxygen (hypoxia) triggers a multifaceted cellular response that has important roles in normal physiology and in many human diseases. A transcription factor, hypoxia-inducible factor (HIF), plays a central role in the hypoxia response; its activity is regulated by the oxygen-dependent degradation of the HIF-1alpha protein. Despite the ubiquity and importance of hypoxia responses, little is known about the variation in the global transcriptional response to hypoxia among different cell types or how this variation might relate to tissue- and cell-specific diseases. METHODS AND FINDINGS: We analyzed the temporal changes in global transcript levels in response to hypoxia in primary renal proximal tubule epithelial cells, breast epithelial cells, smooth muscle cells, and endothelial cells with DNA microarrays. The extent of the transcriptional response to hypoxia was greatest in the renal tubule cells. This heightened response was associated with a uniquely high level of HIF-1alpha RNA in renal cells, and it could be diminished by reducing HIF-1alpha expression via RNA interference. A gene-expression signature of the hypoxia response, derived from our studies of cultured mammary and renal tubular epithelial cells, showed coordinated variation in several human cancers, and was a strong predictor of clinical outcomes in breast and ovarian cancers. In an analysis of a large, published gene-expression dataset from breast cancers, we found that the prognostic information in the hypoxia signature was virtually independent of that provided by the previously reported wound signature and more predictive of outcomes than any of the clinical parameters in current use. CONCLUSIONS: The transcriptional response to hypoxia varies among human cells. Some of this variation is traceable to variation in expression of the HIF1A gene. A gene-expression signature of the cellular response to hypoxia is associated with a significantly poorer prognosis

  13. Utilization of statistical table for waves in North-west Pacific Ocean and a long-term estimation on hull responses; Seihoku Taiheiyo haro tokeihyo no riyo to sentai oto choki yosoku

    Energy Technology Data Exchange (ETDEWEB)

    Shinkai, A [Kyushu University, Fukuoka (Japan). Faculty of Engineering

    1997-10-01

    Designing a vessel to sail the oceans for an extended period of time requires statistical estimation of various hull responses to waves. To meet this requirement, it is necessary to accurately identify the hydrographic conditions (particularly waves) which the vessel is expected to encounter. This paper clarifies the statistical characteristics of the wave statistics table presented by Fang et al. and compares them with other sources. This statistics collection is based on data collected in China, Hong Kong and Japan, including those collected in the Sea of Japan, the Yellow Sea, the North Sea, the East China Sea and the South China Sea. These data were found to give results slightly lower than the long-term estimation values derived from the global wave statistics (GWS) data prepared by Hogben et al. The cause was found to be attributable to the format of the statistical data, especially the setting of wave height classes. However, since the data provided by Fang et al. include detailed information on small sea areas near the Chinese coastline, they are thought to be a useful information source for investigating long-term hull response characteristics relative to the spatial spread of sea areas in the North-west Pacific Ocean. 15 refs., 5 figs., 2 tabs.

  14. Damage detection of engine bladed-disks using multivariate statistical analysis

    Science.gov (United States)

    Fang, X.; Tang, J.

    2006-03-01

    The timely detection of damage in aero-engine bladed-disks is an extremely important and challenging research topic. Bladed-disks have high modal density and, particularly, their vibration responses are subject to significant uncertainties due to manufacturing tolerance (blade-to-blade difference, or mistuning), operating condition change and sensor noise. In this study, we present a new methodology for the on-line damage detection of engine bladed-disks using their vibratory responses during spin-up or spin-down operations, which can be measured by the blade-tip-timing sensing technique. We apply a principal component analysis (PCA)-based approach for data compression, feature extraction, and denoising. Non-model-based damage detection is achieved by analyzing the change between the response features of the healthy structure and those of the damaged one. We facilitate this comparison by incorporating Hotelling's T^2 statistic, which yields a damage declaration with a given confidence level. The effectiveness of the method is demonstrated by case studies.
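
    The detection step can be sketched compactly: project each response onto the principal components of healthy data and declare damage when Hotelling's T^2 exceeds an F-based control limit. A hedged toy example (dimensions, threshold and data are illustrative, not the paper's):

        import numpy as np
        from scipy.stats import f as f_dist

        rng = np.random.default_rng(1)
        healthy = rng.normal(size=(200, 12))    # baseline response features
        test = healthy[0] + 0.8                 # a shifted, possibly damaged sample

        mean = healthy.mean(axis=0)
        cov = np.cov(healthy - mean, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
        k = 3
        P, lam = eigvecs[:, -k:], eigvals[-k:]  # top-k principal directions

        scores = (test - mean) @ P
        T2 = np.sum(scores**2 / lam)            # Hotelling's T^2 in the PC subspace

        n = healthy.shape[0]
        limit = k * (n - 1) * (n + 1) / (n * (n - k)) * f_dist.ppf(0.99, k, n - k)
        print(T2, limit, "damage declared" if T2 > limit else "healthy")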

  15. Selection of a design for response surface

    Science.gov (United States)

    Ranade, Shruti Sunil; Thiagarajan, Padma

    2017-11-01

    Box-Behnken, Central-Composite, D-optimal and I-optimal designs were compared using statistical tools. Experimental trials were generated for all designs, and random uniform responses were simulated for all models. R-square, Akaike Information Criterion and Bayesian Information Criterion values for the fitted models were noted. One-way ANOVA and Tukey's multiple comparison test were performed on these parameters. The models were evaluated based on the number of experimental trials generated, in addition to the results of the statistical analyses. The D-optimal design generated 12 trials, fewer than both the Central-Composite and Box-Behnken designs. The R-square values of the fitted models differed significantly (P<0.0001). The D-optimal design not only had the highest mean R-square value (0.7231) but also the lowest means for both the Akaike and Bayesian Information Criteria. The D-optimal design was therefore recommended for the generation of response surfaces, based on the assessment of the above parameters.
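
    The comparison pipeline can be sketched for one candidate design: fit a quadratic response-surface model to simulated uniform responses and record the fit criteria. A hedged illustration with statsmodels (the design points are toy values, not the study's):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        # Toy two-factor design points in coded units
        X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0],
                      [-1, 0], [1, 0], [0, -1], [0, 1], [0.5, 0.5], [-0.5, -0.5]])
        y = rng.uniform(size=len(X))          # random uniform responses, as in the study

        x1, x2 = X[:, 0], X[:, 1]
        terms = np.column_stack([x1, x2, x1 * x2, x1**2, x2**2])
        fit = sm.OLS(y, sm.add_constant(terms)).fit()
        print(fit.rsquared, fit.aic, fit.bic)  # criteria compared across designs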

  16. Beyond Statistics: The Economic Content of Risk Scores

    Science.gov (United States)

    Einav, Liran; Finkelstein, Amy; Kluender, Raymond

    2016-01-01

    “Big data” and statistical techniques to score potential transactions have transformed insurance and credit markets. In this paper, we observe that these widely-used statistical scores summarize a much richer heterogeneity, and may be endogenous to the context in which they get applied. We demonstrate this point empirically using data from Medicare Part D, showing that risk scores confound underlying health and endogenous spending response to insurance. We then illustrate theoretically that when individuals have heterogeneous behavioral responses to contracts, strategic incentives for cream skimming can still exist, even in the presence of “perfect” risk scoring under a given contract. PMID:27429712

  17. Statistics: a Bayesian perspective

    National Research Council Canada - National Science Library

    Berry, Donald A

    1996-01-01

    ...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the significant process, it develops ideas...

  18. Statistical Analysis and Evaluation of the Depth of the Ruts on Lithuanian State Significance Roads

    Directory of Open Access Journals (Sweden)

    Erinijus Getautis

    2011-04-01

    Full Text Available The aim of this work is to gather information about rut depth on national flexible-pavement roads, to determine its statistical dispersion indices and to check their conformity with the applicable requirements. An analysis of scientific works on the appearance of ruts in asphalt and their influence on driving is presented, together with dynamic models of rutting in asphalt. Experimental data on the dispersion of rut depth on the Lithuanian national highway Vilnius–Kaunas are presented. Conclusions are formulated and presented. Article in Lithuanian

  19. Perceived Statistical Knowledge Level and Self-Reported Statistical Practice Among Academic Psychologists

    Directory of Open Access Journals (Sweden)

    Laura Badenes-Ribera

    2018-06-01

    Full Text Available Introduction: Publications arguing against the null hypothesis significance testing (NHST) procedure and in favor of good statistical practices have increased. The most frequently mentioned alternatives to NHST are effect size statistics (ES), confidence intervals (CIs), and meta-analyses. A recent survey conducted in Spain found that academic psychologists have poor knowledge about effect size statistics, confidence intervals, and graphic displays for meta-analyses, which might lead to a misinterpretation of the results. In addition, it also found that, although the use of ES is becoming generalized, the same thing is not true for CIs. Finally, academics with greater knowledge about ES statistics presented a profile closer to good statistical practice and research design. Our main purpose was to analyze the extension of these results to a different geographical area through a replication study. Methods: For this purpose, we elaborated an on-line survey that included the same items as the original research, and we asked academic psychologists to indicate their level of knowledge about ES, their CIs, and meta-analyses, and how they use them. The sample consisted of 159 Italian academic psychologists (54.09% women, mean age of 47.65 years). The mean number of years in the position of professor was 12.90 (SD = 10.21). Results: As in the original research, the results showed that, although the use of effect size estimates is becoming generalized, an under-reporting of CIs for ES persists. The most frequent ES statistics mentioned were Cohen's d and R²/η², which can have outliers or show non-normality or violate statistical assumptions. In addition, academics showed poor knowledge about meta-analytic displays (e.g., forest plot and funnel plot) and quality checklists for studies. Finally, academics with higher-level knowledge about ES statistics seem to have a profile closer to good statistical practices. Conclusions: Changing statistical practice is not

  20. Consecutive Acupuncture Stimulations Lead to Significantly Decreased Neural Responses

    NARCIS (Netherlands)

    Yeo, S.; Choe, I.H.; Noort, M.W.M.L. van den; Bosch, M.P.C.; Lim, S.

    2010-01-01

    Objective: Functional magnetic resonance imaging (fMRI), in combination with block design paradigms with consecutive acupuncture stimulations, has often been used to investigate the neural responses to acupuncture. In this study, we investigated whether previous acupuncture stimulations can affect

  1. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  2. Funding source and primary outcome changes in clinical trials registered on ClinicalTrials.gov are associated with the reporting of a statistically significant primary outcome: a cross-sectional study [v2; ref status: indexed, http://f1000r.es/5bj]

    Directory of Open Access Journals (Sweden)

    Sreeram V Ramagopalan

    2015-04-01

    Full Text Available Background: We and others have shown that a significant proportion of interventional trials registered on ClinicalTrials.gov have their primary outcomes altered after the listed study start and completion dates. The objectives of this study were to investigate whether changes made to primary outcomes are associated with the likelihood of reporting a statistically significant primary outcome on ClinicalTrials.gov. Methods: A cross-sectional analysis of all interventional clinical trials registered on ClinicalTrials.gov as of 20 November 2014 was performed. The main outcome was any change made to the initially listed primary outcome and the time of the change in relation to the trial start and end date. Findings: 13,238 completed interventional trials were registered with ClinicalTrials.gov that also had study results posted on the website. 2555 (19.3%) had one or more statistically significant primary outcomes. Statistical analysis showed that registration year, funding source and primary outcome change after trial completion were associated with reporting a statistically significant primary outcome. Conclusions: Funding source and primary outcome change after trial completion are associated with a statistically significant primary outcome report on ClinicalTrials.gov.

  3. Osteoarthritis of the thumb carpometacarpal joint: Correlation of ultrasound appearances to disability and treatment response

    International Nuclear Information System (INIS)

    Mallinson, P.I.; Tun, J.K.; Farnell, R.D.; Campbell, D.A.; Robinson, P.

    2013-01-01

    Aim: To evaluate the grading of thumb carpometacarpal joint (CMCJ) osteoarthritis (OA) using ultrasound, correlating findings with disability and treatment response. Materials and methods: Patients with symptomatic thumb OA attending for ultrasound-guided CMCJ steroid injection and a group of asymptomatic controls were recruited prospectively. Thumb CMCJ ultrasound features were graded (osteophytes, joint-space narrowing, capsule size, and measured capsule size), and a Disabilities of the Arm, Shoulder and Hand (DASH) questionnaire was completed for each patient. Symptomatic patients then underwent injection, with the DASH repeated 6 weeks post-treatment. Ultrasound features were correlated with the initial DASH disability score and with response, as defined by the change in DASH 6 weeks after treatment. Results: Thirty-one patients with symptomatic OA and 37 asymptomatic controls were recruited. With the exception of osteophytes (p = 0.017), no statistically significant correlation was demonstrated between the severity of ultrasound features and patient disability. However, all features demonstrated statistically significantly higher grades in the symptomatic group compared to controls. Ultrasound grading did not correlate significantly with treatment response. Conclusion: No correlation was found between the majority of ultrasound features and the clinical severity of OA or the likely response to treatment. However, these features are significantly more common in the symptomatic population

  4. Statistical analysis and data management

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and the problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship was found between the variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine.

  5. New applications of statistical tools in plant pathology.

    Science.gov (United States)

    Garrett, K A; Madden, L V; Hughes, G; Pfender, W F

    2004-09-01

    ABSTRACT The series of papers introduced by this one address a range of statistical applications in plant pathology, including survival analysis, nonparametric analysis of disease associations, multivariate analyses, neural networks, meta-analysis, and Bayesian statistics. Here we present an overview of additional applications of statistics in plant pathology. An analysis of variance based on the assumption of normally distributed responses with equal variances has been a standard approach in biology for decades. Advances in statistical theory and computation now make it convenient to appropriately deal with discrete responses using generalized linear models, with adjustments for overdispersion as needed. New nonparametric approaches are available for analysis of ordinal data such as disease ratings. Many experiments require the use of models with fixed and random effects for data analysis. New or expanded computing packages, such as SAS PROC MIXED, coupled with extensive advances in statistical theory, allow for appropriate analyses of normally distributed data using linear mixed models, and discrete data with generalized linear mixed models. Decision theory offers a framework in plant pathology for contexts such as the decision about whether to apply or withhold a treatment. Model selection can be performed using Akaike's information criterion. Plant pathologists studying pathogens at the population level have traditionally been the main consumers of statistical approaches in plant pathology, but new technologies such as microarrays supply estimates of gene expression for thousands of genes simultaneously and present challenges for statistical analysis. Applications to the study of the landscape of the field and of the genome share the risk of pseudoreplication, the problem of determining the appropriate scale of the experimental unit and of obtaining sufficient replication at that scale.

  6. Explorations in statistics: the log transformation.

    Science.gov (United States)

    Curran-Everett, Douglas

    2018-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This thirteenth installment of Explorations in Statistics explores the log transformation, an established technique that rescales the actual observations from an experiment so that the assumptions of some statistical analysis are better met. A general assumption in statistics is that the variability of some response Y is homogeneous across groups or across some predictor variable X. If the variability-the standard deviation-varies in rough proportion to the mean value of Y, a log transformation can equalize the standard deviations. Moreover, if the actual observations from an experiment conform to a skewed distribution, then a log transformation can make the theoretical distribution of the sample mean more consistent with a normal distribution. This is important: the results of a one-sample t test are meaningful only if the theoretical distribution of the sample mean is roughly normal. If we log-transform our observations, then we want to confirm the transformation was useful. We can do this if we use the Box-Cox method, if we bootstrap the sample mean and the statistic t itself, and if we assess the residual plots from the statistical model of the actual and transformed sample observations.
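
    A minimal sketch of the installment's core moves, on simulated skewed data (scipy assumed):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        y = rng.lognormal(mean=1.0, sigma=0.6, size=50)  # skewed observations

        log_y = np.log(y)                                # the log transformation
        print(stats.skew(y), stats.skew(log_y))          # skewness shrinks markedly

        # Box-Cox check: a fitted lambda near 0 supports the log transformation
        _, lam = stats.boxcox(y)
        print("Box-Cox lambda:", lam)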

  7. [Big data in official statistics].

    Science.gov (United States)

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  8. The extended statistical analysis of toxicity tests using standardised effect sizes (SESs): a comparison of nine published papers.

    Science.gov (United States)

    Festing, Michael F W

    2014-01-01

    The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretations of the results as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data pose problems due to the large number of statistical tests which are involved. Often, it is not clear whether a "statistically significant" effect is real or a false positive (type I error) due to sampling variation. The authors' conclusions appear to be reached somewhat subjectively by the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs), a range of graphical methods and an over-all assessment of the mean absolute response can be made. The approach is an extension of, not a replacement for, existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the authors. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A "bootstrap" test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated.
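
    The core computation is easy to sketch: standardise each biomarker difference by its pooled SD, average the absolute values, and judge the result against a resampling null. A hedged toy version (a permutation variant of the paper's bootstrap test, on simulated data):

        import numpy as np

        rng = np.random.default_rng(3)
        control = rng.normal(0.0, 1.0, size=(10, 8))  # 10 animals x 8 biomarkers
        treated = rng.normal(0.3, 1.0, size=(10, 8))  # small shift on every biomarker

        def mean_abs_ses(a, b):
            pooled_sd = np.sqrt((a.var(axis=0, ddof=1) + b.var(axis=0, ddof=1)) / 2)
            return np.mean(np.abs((b.mean(axis=0) - a.mean(axis=0)) / pooled_sd))

        observed = mean_abs_ses(control, treated)

        pooled = np.vstack([control, treated])
        null = []
        for _ in range(2000):
            idx = rng.permutation(pooled.shape[0])    # shuffle group labels
            null.append(mean_abs_ses(pooled[idx[:10]], pooled[idx[10:]]))
        p_value = np.mean(np.array(null) >= observed)
        print(observed, p_value)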

  9. The extended statistical analysis of toxicity tests using standardised effect sizes (SESs): a comparison of nine published papers.

    Directory of Open Access Journals (Sweden)

    Michael F W Festing

    Full Text Available The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretations of the results as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data pose problems due to the large number of statistical tests which are involved. Often, it is not clear whether a "statistically significant" effect is real or a false positive (type I error) due to sampling variation. The authors' conclusions appear to be reached somewhat subjectively by the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs), a range of graphical methods and an over-all assessment of the mean absolute response can be made. The approach is an extension of, not a replacement for, existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the authors. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A "bootstrap" test compares the mean absolute differences across dose groups. In four out of seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated.

  10. A Response to White and Gorard: Against Inferential Statistics: How and Why Current Statistics Teaching Gets It Wrong

    Science.gov (United States)

    Nicholson, James; Ridgway, Jim

    2017-01-01

    White and Gorard make important and relevant criticisms of some of the methods commonly used in social science research, but go further by criticising the logical basis for inferential statistical tests. This paper comments briefly on matters we broadly agree on with them and more fully on matters where we disagree. We agree that too little…

  11. The Relationship Between Procrastination, Learning Strategies and Statistics Anxiety Among Iranian College Students: A Canonical Correlation Analysis

    Science.gov (United States)

    Vahedi, Shahrum; Farrokhi, Farahman; Gahramani, Farahnaz; Issazadegan, Ali

    2012-01-01

    Objective: Approximately 66-80% of graduate students experience statistics anxiety, and some researchers propose that many students identify statistics courses as the most anxiety-inducing courses in their academic curriculums. As such, it is likely that statistics anxiety is, in part, responsible for many students delaying enrollment in these courses for as long as possible. This paper proposes a canonical model treating academic procrastination (AP) and learning strategies (LS) as predictor variables and statistics anxiety (SA) as the explained variable. Methods: A questionnaire survey was used for data collection, and 246 female college students participated in this study. To examine the mutually independent relations between the procrastination, learning-strategy and statistics-anxiety variables, a canonical correlation analysis was computed. Results: Findings show that two canonical functions were statistically significant. The set of variables (metacognitive self-regulation, source management, preparing homework, preparing for tests and preparing term papers) helped predict changes in statistics anxiety with respect to fearful behavior, attitude towards math and class, and performance, but not anxiety. Conclusion: These findings could be used in educational and psychological interventions in the context of statistics anxiety reduction. PMID:24644468

  12. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more ... statistical agencies and institutions to provide details of statistical activities ... several training programmes ... successful completion of Indian Statistical Service examinations, the ...

  13. Statistical aspects of the cleanup of Enewetak Atoll

    International Nuclear Information System (INIS)

    Giacomini, J.J.; Miller, F.L. Jr.

    1981-01-01

    The Desert Research Institute participated in the Enewetak Atoll Radiological Cleanup by providing data-base management and statistical analysis support for the Department of Energy team. The data-base management responsibilities included both design and implementation of a system for recording (in machine-retrievable form) all radiological measurements made during the cleanup, excluding personnel dosimetry. Statistical analyses were performed throughout the cleanup and were used to guide excavation activities

  14. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  15. The response of clouds and aerosols to cosmic ray decreases

    DEFF Research Database (Denmark)

    Svensmark, J.; Enghoff, Martin Andreas Bødker; Shaviv, N. J.

    2016-01-01

    A method is developed to rank Forbush Decreases (FDs) in the galactic cosmic ray radiation according to their expected impact on the ionization of the lower atmosphere. Then a Monte Carlo bootstrap based statistical test is formulated to estimate the significance of the apparent response in physical and micro-physical cloud parameters to FDs. The test is subsequently applied to one ground based and three satellite based datasets. Responses (> 95%) to FDs are found in the following parameters of the analyzed datasets. AERONET: Ångström exponent (cloud condensation nuclei changes), SSM...... with the strength of the FDs, and the signs and magnitudes of the responses agree with model based expectations. The effect is mainly seen in liquid clouds. An impact through changes in UV driven photo chemistry is shown to be negligible and an impact via UV absorption in the stratosphere is found to have no effect...

  16. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect the accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2(1, n=254) = 54.45, p < 0.001), consistent with the substitution of subjective for statistical probability. One of the causes of this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited.
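
    The gap between the two kinds of probability is a base-rate effect, which a one-line Bayes calculation makes vivid (the numbers are illustrative assumptions, not the study's):

        prevalence = 0.01                 # base rate of the condition
        p_typical_given_disease = 0.90    # "typical" features if disease present
        p_typical_given_healthy = 0.10    # same features without the disease

        # Bayes' rule: P(disease | typical features)
        num = p_typical_given_disease * prevalence
        posterior = num / (num + p_typical_given_healthy * (1 - prevalence))
        print(round(posterior, 3))  # ~0.083, far below the "looks typical" intuition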

  17. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)

    Science.gov (United States)

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497
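
    The one-parameter Rasch model used in these analyses scores each item with a single difficulty parameter b, so a respondent of ability theta answers correctly with probability 1/(1 + exp(-(theta - b))). A minimal sketch with invented abilities and difficulties, not SRBCI calibrations:

```python
import numpy as np

def rasch_prob(theta, b):
    """Probability of a correct response under the 1-parameter Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Hypothetical abilities for 5 students and difficulties for 4 items:
theta = np.array([-1.0, -0.5, 0.0, 0.5, 1.5])
b = np.array([-0.8, 0.0, 0.6, 1.2])
P = rasch_prob(theta[:, None], b[None, :])  # 5x4 matrix of probabilities
print(np.round(P, 2))
```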

  18. Parametric analysis of the statistical model of the stick-slip process

    Science.gov (United States)

    Lima, Roberta; Sampaio, Rubens

    2017-06-01

    In this paper a parametric analysis is performed of the statistical model of the response of a dry-friction oscillator. The oscillator is a spring-mass system which moves over a base with a rough surface. Due to this roughness, the mass is subject to a dry-frictional force modeled as Coulomb friction. The system is stochastically excited by an imposed bang-bang base motion. The base velocity is modeled by a Poisson process for which a probabilistic model is fully specified. The excitation induces stochastic stick-slip oscillations in the system. The system response is composed of a random sequence alternating between stick- and slip-modes. From realizations of the system, a statistical model is constructed for this sequence. In this statistical model, the variables of interest of the sequence are modeled as random variables, for example the number of time intervals in which stick or slip occurs, the instants at which they begin, and their durations. Samples of the system response are computed by integration of the dynamic equation of the system using independent samples of the base motion. Statistics and histograms of the random variables which characterize the stick-slip process are estimated from the generated samples. The objective of the paper is to analyze how these estimated statistics and histograms vary with the system parameters, i.e., to make a parametric analysis of the statistical model of the stick-slip process.

  19. Response surface methodology: A non-conventional statistical tool to maximize the throughput of Streptomyces species biomass and their bioactive metabolites.

    Science.gov (United States)

    Latha, Selvanathan; Sivaranjani, Govindhan; Dhanasekaran, Dharumadurai

    2017-09-01

    Among diverse actinobacteria, Streptomyces is a renowned ongoing source for the production of a large number of secondary metabolites, furnishing immeasurable pharmacological and biological activities. Hence, to meet the demand for new lead compounds for human and animal use, research is constantly targeting the bioprospecting of Streptomyces. Optimization of media components and physicochemical parameters is a plausible approach for the exploration of intensified production of novel as well as existing bioactive metabolites from various microbes, and is usually achieved by a range of classical techniques including one-factor-at-a-time (OFAT). However, the major drawbacks of conventional optimization methods have directed the use of statistical optimization approaches in fermentation process development. Response surface methodology (RSM) is one of the empirical techniques extensively used for modeling, optimization and analysis of fermentation processes. To date, several researchers have implemented RSM in different bioprocess optimizations for the production of assorted natural substances from Streptomyces, and the results are very promising. This review summarizes some recent studies that adopted RSM for the enhanced production of antibiotics, enzymes and probiotics using Streptomyces, with the intention of highlighting the significance of Streptomyces as well as RSM to the research community and industries.
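
    As a concrete illustration of the RSM workflow the review describes, the sketch below fits a full second-order polynomial to a small central composite design and solves for the stationary point of the fitted surface. The factors, coded levels and responses are invented for illustration and are not taken from any study reviewed here.

```python
import numpy as np

# Hypothetical coded factors: x1 = glucose level, x2 = pH; y = antibiotic titre.
x1 = np.array([-1, -1,  1,  1, 0, 0, 0, -1.41, 1.41,  0,    0])
x2 = np.array([-1,  1, -1,  1, 0, 0, 0,  0,    0,   -1.41, 1.41])
y  = np.array([55, 62, 60, 58, 75, 74, 76, 52,  57,  59,   63])

# Design matrix for the full quadratic model:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12, b11, b22 = beta

# Stationary point of the fitted surface (set the gradient to zero):
A = np.array([[2 * b11, b12], [b12, 2 * b22]])
x_opt = np.linalg.solve(A, -np.array([b1, b2]))
print("coefficients:", np.round(beta, 3))
print("stationary point (coded units):", np.round(x_opt, 3))
```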

  20. Brake response time is significantly impaired after total knee arthroplasty: investigation of performing an emergency stop while driving a car.

    Science.gov (United States)

    Jordan, Maurice; Hofmann, Ulf-Krister; Rondak, Ina; Götze, Marco; Kluba, Torsten; Ipach, Ingmar

    2015-09-01

    The objective of this study was to investigate whether total knee arthroplasty (TKA) impairs the ability to perform an emergency stop. An automatic transmission brake simulator was developed to evaluate total brake response time. A prospective repeated-measures design was used. Forty patients (20 left/20 right) were measured 8 days and 6, 12, and 52 wks after surgery. Eight days postoperatively, total brake response time had increased significantly, by 30%, in right TKA and non-significantly, by 2%, in left TKA. Brake force significantly decreased by 35% in right TKA and by 25% in left TKA during this period. Baseline values were reached at week 12 in right TKA; the impairment of outcome measures, however, was no longer significant at week 6 compared with preoperative values. Total brake response time and brake force in left TKA fell below baseline values at weeks 6 and 12. Brake force in left TKA was the only outcome measure significantly impaired 8 days postoperatively. This study highlights that categorical statements cannot be provided. The study's findings on automatic transmission driving suggest that right TKA patients may resume driving 6 wks postoperatively. Fitness to drive in left TKA is not fully recovered 8 days postoperatively. If testing is not available, patients should refrain from driving until they return from rehabilitation.

  1. Comparison of pure and 'Latinized' centroidal Voronoi tessellation against various other statistical sampling methods

    International Nuclear Information System (INIS)

    Romero, Vicente J.; Burkardt, John V.; Gunzburger, Max D.; Peterson, Janet S.

    2006-01-01

    A recently developed centroidal Voronoi tessellation (CVT) sampling method is investigated here to assess its suitability for use in statistical sampling applications. CVT efficiently generates a highly uniform distribution of sample points over arbitrarily shaped M-dimensional parameter spaces. On several 2-D test problems CVT has recently been found to provide exceedingly effective and efficient point distributions for response surface generation. Additionally, for statistical function integration and estimation of response statistics associated with uniformly distributed random-variable inputs (uncorrelated), CVT has been found in initial investigations to provide superior point sets when compared against Latin hypercube and simple random Monte Carlo methods and Halton and Hammersley quasi-random sequence methods. In this paper, the performance of all these sampling methods and a new variant ('Latinized' CVT) are further compared for non-uniform input distributions. Specifically, given uncorrelated normal inputs in a 2-D test problem, statistical sampling efficiencies are compared for resolving various statistics of response: mean, variance, and exceedance probabilities
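
    A minimal comparison in the spirit of the study: estimate the mean of a smooth test response over the unit square with simple random, Latin hypercube, and Halton sampling. CVT generators are not available in SciPy and are omitted here; the test function and sample size are arbitrary choices.

```python
import numpy as np
from scipy.stats import qmc

def f(x):  # arbitrary smooth test response on [0, 1]^2
    return np.sin(np.pi * x[:, 0]) * np.exp(x[:, 1])

n, d, seed = 256, 2, 42
samplers = {
    "simple random":   np.random.default_rng(seed).random((n, d)),
    "Latin hypercube": qmc.LatinHypercube(d=d, seed=seed).random(n),
    "Halton":          qmc.Halton(d=d, seed=seed).random(n),
}
exact = (2 / np.pi) * (np.e - 1)  # analytic mean of f over the unit square
for name, pts in samplers.items():
    est = f(pts).mean()
    print(f"{name:15s} mean = {est:.4f} (error {abs(est - exact):.4f})")
```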

  2. Development of free statistical software enabling researchers to calculate confidence levels, clinical significance curves and risk-benefit contours

    International Nuclear Information System (INIS)

    Shakespeare, T.P.; Mukherjee, R.K.; Gebski, V.J.

    2003-01-01

    Confidence levels, clinical significance curves, and risk-benefit contours are tools for improving the analysis of clinical studies and minimizing misinterpretation of published results; however, no software has been available for their calculation. The objective was to develop software to help clinicians utilize these tools. Excel 2000 spreadsheets were designed using only built-in functions, without macros. The workbook was protected and encrypted so that users can modify only input cells. The workbook has 4 spreadsheets for use in studies comparing two patient groups. Sheet 1 comprises instructions and graphic examples for use. Sheet 2 allows the user to input the main study results (e.g. survival rates) into a 2-by-2 table. Confidence intervals (95%), the p-value and the confidence level for Treatment A being better than Treatment B are automatically generated. An additional input cell allows the user to determine the confidence associated with a specified level of benefit. For example, if the user wishes to know the confidence that Treatment A is at least 10% better than B, 10% is entered. Sheet 2 automatically displays clinical significance curves, graphically illustrating confidence levels for all possible benefits of one treatment over the other. Sheet 3 allows input of toxicity data, and calculates the confidence that one treatment is more toxic than the other. It also determines the confidence that the relative toxicity of the most effective arm does not exceed user-defined tolerability. Sheet 4 automatically calculates risk-benefit contours, displaying the confidence associated with a specified scenario of minimum benefit and maximum risk of one treatment arm over the other. The spreadsheet is freely downloadable at www.ontumor.com/professional/statistics.htm. A simple, self-explanatory, freely available spreadsheet calculator was developed using Excel 2000. The incorporated decision-making tools can be used for data analysis and improve the reporting of results of any …
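
    The "confidence level that Treatment A is better than Treatment B" can be approximated outside a spreadsheet as well. The sketch below uses a normal approximation to the difference of two proportions, with invented arm sizes and success counts; it is one plausible reading of the workbook's calculation, not a reconstruction of its exact formulas.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical 2x2 outcome: successes / totals in each arm.
sA, nA = 45, 60   # Treatment A
sB, nB = 35, 60   # Treatment B
pA, pB = sA / nA, sB / nB
se = np.sqrt(pA * (1 - pA) / nA + pB * (1 - pB) / nB)

# Confidence that A is better than B by at least `delta`:
for delta in (0.0, 0.10):
    conf = norm.cdf(((pA - pB) - delta) / se)
    print(f"P(A - B > {delta:.0%}) = {conf:.1%}")
```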

  3. Statistical image analysis of cerebral blood flow in moyamoya disease

    International Nuclear Information System (INIS)

    Yamada, Masaru; Yuzawa, Izumi; Suzuki, Sachio; Kurata, Akira; Fujii, Kiyotaka; Asano, Yuji

    2007-01-01

    To investigate the pathophysiology of moyamoya disease, we analyzed brain single photon emission computed tomography (SPECT) images of patients with this disease using interface software for a 3-dimensional (3D) data extraction format. Presenting symptoms were transient ischemic attack (TIA) in 21 patients and hemorrhage in 6 patients. All the patients underwent brain SPECT scanning with 123I-iofetamine (IMP) at rest and after acetazolamide challenge (17 mg/kg iv, 2-day method). Cerebral blood flow (CBF) was quantitatively measured using arterial blood sampling and an autoradiography model. The group of patients who presented with TIAs showed decreased CBF in the frontal lobe at rest compared to that of patients with hemorrhage, but the Z-score ((mean - patient data)/standard deviation (SD)) did not reach statistical significance. A significant CBF decrease after acetazolamide challenge was observed in a wider cerebral cortical area in the TIA group than in the hemorrhagic group. The brain region of hemodynamic ischemia (stage II) correlated well with the cortical area responsible for the clinical symptoms of TIA. A hemodynamic ischemia stage image clearly represented the recovery of reserve capacity after bypass surgery. Statistical evaluation of SPECT may be useful to understand and clarify the pathophysiology of this disease. (author)

  4. Statistical modeling of the response characteristics of mechanosensitive stimuli in the human esophagus

    DEFF Research Database (Denmark)

    Drewes, A.M.; Reddy, H.; Staahl, C.

    2005-01-01

    … by using a statistical model based on correlation analysis. The esophagus was distended with a bag in 32 healthy subjects by using an inflation rate of 25 mL/min. The luminal cross-sectional areas and sensory ratings were determined during the distentions. The stimuli were repeated after relaxation … of mechanical gut stimuli in human beings. This might increase our understanding of visceral pain in health and disease and guide the statistical analysis of experimental data obtained in the gastrointestinal tract.

  5. Statistical learning of multisensory regularities is enhanced in musicians: An MEG study.

    Science.gov (United States)

    Paraskevopoulos, Evangelos; Chalas, Nikolas; Kartsidis, Panagiotis; Wollbrink, Andreas; Bamidis, Panagiotis

    2018-07-15

    The present study used magnetoencephalography (MEG) to identify the neural correlates of audiovisual statistical learning, while disentangling the differential contributions of uni- and multi-modal statistical mismatch responses in humans. The applied paradigm was based on a combination of a statistical learning paradigm and a multisensory oddball one, combining an audiovisual, an auditory and a visual stimulation stream, along with the corresponding deviances. Plasticity effects due to musical expertise were investigated by comparing the behavioral and MEG responses of musicians to non-musicians. The behavioral results indicated that the learning was successful for both musicians and non-musicians. The unimodal MEG responses are consistent with previous studies, revealing the contribution of Heschl's gyrus for the identification of auditory statistical mismatches and the contribution of medial temporal and visual association areas for the visual modality. The cortical network underlying audiovisual statistical learning was found to be partly common and partly distinct from the corresponding unimodal networks, comprising right temporal and left inferior frontal sources. Musicians showed enhanced activation in superior temporal and superior frontal gyrus. Connectivity and information processing flow amongst the sources comprising the cortical network of audiovisual statistical learning, as estimated by transfer entropy, was reorganized in musicians, indicating enhanced top-down processing. This neuroplastic effect showed a cross-modal stability between the auditory and audiovisual modalities. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Difference in method of administration did not significantly impact item response

    DEFF Research Database (Denmark)

    Bjorner, Jakob B; Rose, Matthias; Gandek, Barbara

    2014-01-01

    PURPOSE: To test the impact of method of administration (MOA) on the measurement characteristics of items developed in the Patient-Reported Outcomes Measurement Information System (PROMIS). METHODS: Two non-overlapping parallel 8-item forms from each of three PROMIS domains (physical function …) were completed by … personal digital assistant (PDA), or personal computer (PC) on the Internet, and a second form by PC, in the same administration. Structural invariance, equivalence of item responses, and measurement precision were evaluated using confirmatory factor analysis and item response theory methods. RESULTS: Multigroup … levels in IVR, PQ, or PDA administration as compared to PC. Availability of large item response theory-calibrated PROMIS item banks allowed for innovations in study design and analysis.

  7. Blood phenylalanine concentrations in patients with PAH-deficient hyperphenylalaninaemia off diet without and with three different single oral doses of tetrahydrobiopterin: assessing responsiveness in a model of statistical process control.

    Science.gov (United States)

    Lindner, M; Gramer, G; Garbade, S F; Burgard, P

    2009-08-01

    Tetrahydrobiopterin (BH4) cofactor loading is a standard procedure to differentiate defects of BH4 metabolism from phenylalanine hydroxylase (PAH) deficiency. BH4 responsiveness also exists in PAH-deficient patients with high residual PAH activity. Unexpectedly, single cases with presumed nil residual PAH activity have been reported to be BH4 responsive, too. BH4 responsiveness has been defined either by a ≥30% reduction of blood Phe concentration after a single BH4 dose or by a decline greater than the individual circadian Phe level variation. Since both methods have methodological disadvantages, we present a model of statistical process control (SPC) to assess BH4 responsiveness. Phe levels in 17 adult PKU patients of three phenotypic groups off diet were compared without and with three different single oral dosages of BH4, applied in a double-blind randomized cross-over design. Results are compared for the ≥30% reduction criterion and SPC. The effect of BH4 by ≥30% reduction was significant for groups (p < 0.01) but not for dose (p = 0.064), with no interaction of group with dose (p = 0.24). SPC revealed significant effects for group (p < 0.01) and for the interaction of group with dose (p < 0.05) but not for dose alone (p = 0.87). After one or more loadings, seven patients would be judged to be BH4 responsive either by the 30% criterion or by the SPC model, but only three by both. Results for patients with identical PAH genotype were not very consistent within (for different BH4 doses) and between the two models. We conclude that a comparison of protein loadings without and with BH4, combined with a standardized procedure for data analysis and decision making, would increase the reliability of diagnostic results.
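
    A minimal sketch of the SPC idea: derive control limits from a patient's baseline Phe variation, then flag post-BH4 measurements that fall below the lower limit. The 3-sigma individuals-chart rule and all numbers below are generic illustrative choices, not the authors' exact model.

```python
import numpy as np

# Hypothetical baseline blood Phe levels (umol/L) without BH4:
baseline = np.array([1180, 1225, 1160, 1300, 1245, 1210, 1190, 1270])
center = baseline.mean()
sigma = baseline.std(ddof=1)
lcl = center - 3 * sigma  # lower control limit (individuals chart)

# Hypothetical Phe levels after a single BH4 dose:
post_dose = np.array([1150, 980, 870, 910])
print(f"baseline mean {center:.0f}, LCL {lcl:.0f}")
for v in post_dose:
    flag = "below LCL -> responsive signal" if v < lcl else "within limits"
    print(f"  Phe {v}: {flag}")
```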

  8. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with the statistical sciences to gain deeper insight into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  9. Review of the Statistical Techniques in Medical Sciences | Okeh ...

    African Journals Online (AJOL)

    ... medical researcher in selecting the appropriate statistical techniques. Of course, all statistical techniques have certain underlying assumptions, which must be checked before the technique is applied. Keywords: Variable, Prospective Studies, Retrospective Studies, Statistical significance. Bio-Research Vol. 6 (1) 2008: pp.

  10. Non-linear signal response functions and their effects on the statistical and noise cancellation properties of isotope ratio measurements by multi-collector plasma mass spectrometry

    International Nuclear Information System (INIS)

    Doherty, W.

    2013-01-01

    A nebulizer-centric response function model of the analytical inductively coupled argon plasma ion source was used to investigate the statistical frequency distributions and noise reduction factors of simultaneously measured flicker-noise-limited isotope ion signals and their ratios. The response function model was extended by assuming (i) a single gaussian-distributed random noise source (nebulizer gas pressure fluctuations) and (ii) that the isotope ion signal response is a parabolic function of the nebulizer gas pressure. Model calculations of ion signal and signal ratio histograms were obtained by applying the statistical method of translation to the non-linear response function model of the plasma. Histograms of Ni, Cu, Pr, Tl and Pb isotope ion signals measured using a multi-collector plasma mass spectrometer were, without exception, negatively skewed. Histograms of the corresponding isotope ratios of Ni, Cu, Tl and Pb were either positively or negatively skewed. There was complete agreement between the measured and model-calculated histogram skew properties. The nebulizer-centric response function model was also used to investigate the effect of non-linear response functions on the effectiveness of noise cancellation by signal division. An alternative noise correction procedure suitable for parabolic signal response functions was derived and applied to measurements of isotope ratios of Cu, Ni, Pb and Tl. The largest noise reduction factors were always obtained when the non-linearity of the response functions was taken into account by the isotope ratio calculation. Possible applications of the nebulizer-centric response function model to other types of analytical instrumentation, large-amplitude signal noise sources (e.g., lasers, pumped nebulizers) and analytical error in isotope ratio measurements by multi-collector plasma mass spectrometry are discussed. - Highlights: ► Isotope ion signal noise is modelled as a parabolic transform of a gaussian variable. ► Flicker …
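
    The paper's central statistical point, that a parabolic response function turns a gaussian noise source into skewed signal and ratio distributions, is easy to demonstrate numerically. The curvature coefficients below are arbitrary illustrations, not fitted instrument parameters.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)
g = rng.normal(0.0, 1.0, 200_000)   # common noise source (nebulizer pressure)

# Two isotope signals as parabolic functions of the same noise variable;
# different curvature coefficients stand in for different response functions.
s1 = 1.0 + 0.10 * g - 0.020 * g**2
s2 = 1.0 + 0.10 * g - 0.005 * g**2
ratio = s1 / s2

print(f"skew(s1)    = {skew(s1):+.3f}")    # negatively skewed, as observed
print(f"skew(s2)    = {skew(s2):+.3f}")
print(f"skew(ratio) = {skew(ratio):+.3f}")  # sign set by the curvature mismatch
```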

  11. Spectral and cross-spectral analysis of uneven time series with the smoothed Lomb-Scargle periodogram and Monte Carlo evaluation of statistical significance

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.

    2012-12-01

    Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) it explicitly adjusts the statistical significance for any bias introduced by variance-reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric, computer-intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both of the programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is in the public domain, provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/). Different examples (with simulated and …
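
    The two ideas these programs combine, a periodogram for unevenly sampled data plus a permutation test for significance, can be sketched in a few lines of Python. This is an independent illustration, not the authors' Fortran 77 code; the smoothing and cross-spectral features are omitted.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 100, 120))            # uneven sampling times
y = np.sin(2 * np.pi * t / 7.5) + rng.normal(0, 1, t.size)
y -= y.mean()

freqs = np.linspace(0.01, 1.0, 500) * 2 * np.pi  # angular frequencies
pgram = lombscargle(t, y, freqs)

# Permutation test: shuffle y over the fixed times to build the null
# distribution of the *maximum* periodogram ordinate.
n_perm = 500
null_max = np.empty(n_perm)
for i in range(n_perm):
    null_max[i] = lombscargle(t, rng.permutation(y), freqs).max()
conf95 = np.quantile(null_max, 0.95)

peak = freqs[pgram.argmax()] / (2 * np.pi)
print(f"peak at {peak:.3f} cycles/unit (true 1/7.5 = {1/7.5:.3f})")
print(f"peak power {pgram.max():.1f} vs 95% null level {conf95:.1f}")
```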

  12. Dose Response Model of Biological Reaction to Low Dose Rate Gamma Radiation

    International Nuclear Information System (INIS)

    Magae, J.; Furikawa, C.; Hoshi, Y.; Kawakami, Y.; Ogata, H.

    2004-01-01

    It is necessary to use reproducible and stable indicators to evaluate biological responses to long-term irradiation at low dose rate. They should be simple and quantitative enough to produce statistically accurate results, because we have to analyze subtle changes in biological responses around background level at low dose. For these purposes we chose micronucleus formation in U2OS, a human osteosarcoma cell line, as an indicator of biological response. Cells were exposed to gamma rays in an irradiation room bearing a 50,000 Ci 60Co source. After irradiation, they were cultured for 24 h in the presence of cytochalasin B to block cytokinesis, and the cytoplasm and nucleus were stained with DAPI and propidium iodide, respectively. The number of binuclear cells bearing micronuclei was counted under a fluorescence microscope. Dose rate in the irradiation room was measured with PLD. The dose response of PLD is linear from 1 mGy to 10 Gy, and the standard deviation of triplicate counts was several percent of the mean value. We statistically fitted dose-response curves to the data and plotted them on coordinates of linearly scaled response versus dose. The results followed a straight line passing through the origin of the coordinate axes between 0.1-5 Gy, and the dose and dose-rate effectiveness factor (DDREF) was less than 2 when cells were irradiated for 1-10 min. The difference in the percentage of binuclear cells bearing micronuclei between irradiated and control cells was not statistically significant at doses below 0.1 Gy when 5,000 binuclear cells were analyzed. In contrast, dose-response curves never followed LNT when cells were irradiated for 7 to 124 days. The difference in the percentage of binuclear cells bearing micronuclei between irradiated and control cells was not statistically significant at doses below 6 Gy when cells were continuously irradiated for 124 days. These results suggest that the dose-response curve of biological reactions is remarkably affected by exposure …

  13. Significance of psychological stress response and health-related quality of life in spouses of cancer patients when given bad news

    Directory of Open Access Journals (Sweden)

    Toyoko Kugimoto

    2017-01-01

    Objective: This study illuminates the degree of psychological stress response experienced by spouses of cancer patients when given bad news at three different times (notification of the name of the disease, notification of recurrence, and notification of terminality), as well as the factors that influence the response and the health status of the spouse as measured by health-related quality of life (QOL). Methods: A total of 203 individuals (57 men and 146 women) who had received the three types of news were surveyed using a self-report questionnaire on psychological stress response, marital satisfaction, and health-related QOL scales. Results: The degree of the psychological stress response was the highest for notification of terminality, followed by notification of the name of the disease, and notification of recurrence. The influencing factors varied depending on the notification period. Although no significant difference was observed for health-related QOL among the three notification types, significant differences were observed for certain items when compared with national standard values. Conclusions: When a notification of terminality, which produced the highest psychological stress response, is given, providing care that considers health-related QOL is necessary not only for patients but also for their spouses.

  14. Foetal response to music and voice.

    Science.gov (United States)

    Al-Qahtani, Noura H

    2005-10-01

    To examine whether prenatal exposure to music and voice alters foetal behaviour and whether the foetal response to music differs from that to the human voice. A prospective observational study was conducted in 20 normal term pregnant mothers. Ten foetuses were exposed to music and voice for 15 s at different sound pressure levels to find the optimal setting for the auditory stimulation. Music, voice and sham were played to another 10 foetuses via a headphone on the maternal abdomen. The sound pressure level was 105 dB and 94 dB for music and voice, respectively. Computerised assessments of foetal heart rate and activity were recorded, and 90 actocardiograms were obtained for the whole group. One-way ANOVA followed by post hoc analysis (Student-Newman-Keuls method) was used to determine whether there was a significant difference in foetal response to music and voice versus sham. Foetuses responded with heart rate acceleration and a motor response to both music and voice. This was statistically significant compared to sham. There was no significant difference between the foetal heart rate acceleration to music and voice. Prenatal exposure to music and voice alters foetal behaviour. No difference was detected in foetal response to music and voice.

  15. The significance of environmental responsibility on airline customers' intention to purchase

    OpenAIRE

    Merilä, Outi

    2015-01-01

    Operating in an energy intensive industry, airlines' environmental performance is under constant scrutiny from regulators and authorities. By contrast, it seems that not many airlines have considered gaining competitive advantage by differentiating as an environmentally responsible carrier. The commissioning company for this Master's Thesis was Finnair, and the intention of this study was to find out whether factors related to environmental responsibility affect Swedish …

  16. Impact of Damping Uncertainty on SEA Model Response Variance

    Science.gov (United States)

    Schiller, Noah; Cabell, Randolph; Grosveld, Ferdinand

    2010-01-01

    Statistical Energy Analysis (SEA) is commonly used to predict high-frequency vibroacoustic levels. This statistical approach provides the mean response over an ensemble of random subsystems that share the same gross system properties such as density, size, and damping. Recently, techniques have been developed to predict the ensemble variance as well as the mean response. However these techniques do not account for uncertainties in the system properties. In the present paper uncertainty in the damping loss factor is propagated through SEA to obtain more realistic prediction bounds that account for both ensemble and damping variance. The analysis is performed on a floor-equipped cylindrical test article that resembles an aircraft fuselage. Realistic bounds on the damping loss factor are determined from measurements acquired on the sidewall of the test article. The analysis demonstrates that uncertainties in damping have the potential to significantly impact the mean and variance of the predicted response.
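
    A toy version of the analysis: propagate damping-loss-factor uncertainty through an SEA-style prediction by Monte Carlo. The single-subsystem power balance E = P_in/(omega*eta) and the lognormal spread on eta are illustrative assumptions, not the paper's cylindrical test-article model.

```python
import numpy as np

rng = np.random.default_rng(7)
omega = 2 * np.pi * 1000.0   # analysis band centre frequency (rad/s)
P_in = 1e-3                  # injected power (W), assumed known

# Damping loss factor eta: lognormal spread about a nominal mean, standing
# in for measured bounds on the loss factor.
eta_mean, eta_cov = 0.02, 0.30
mu = np.log(eta_mean) - 0.5 * np.log(1 + eta_cov**2)
sigma = np.sqrt(np.log(1 + eta_cov**2))
eta = rng.lognormal(mu, sigma, 100_000)

# Single-subsystem SEA power balance: stored energy E = P_in / (omega * eta).
E = P_in / (omega * eta)
lo, hi = np.percentile(E, [2.5, 97.5])
print(f"mean energy {E.mean():.3e} J, 95% bounds [{lo:.3e}, {hi:.3e}] J")
```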

  17. Poly(methacrylic acid-ran-2-vinylpyridine) Statistical Copolymer and Derived Dual pH-Temperature Responsive Block Copolymers by Nitroxide-Mediated Polymerization

    Directory of Open Access Journals (Sweden)

    Milan Marić

    2017-02-01

    Nitroxide-mediated polymerization using the succinimidyl ester functional unimolecular alkoxyamine initiator (NHS-BlocBuilder) was used to first copolymerize tert-butyl methacrylate/2-vinylpyridine (tBMA/2VP) with low dispersity (Đ = 1.30–1.41) and controlled growth (linear number average molecular weight Mn versus conversion, Mn = 3.8–10.4 kg·mol−1) across a wide range of compositions (initial mol fraction 2VP, f2VP,0 = 0.10–0.90). The resulting statistical copolymers were first de-protected to give statistical polyampholytic copolymers comprised of methacrylic acid/2VP (MAA/2VP) units. These copolymers exhibited tunable water-solubility due to the different pKas of the acidic MAA and basic 2VP units, being soluble at very low pH < 3 and high pH > 8. One of the tBMA/2VP copolymers was used as a macroinitiator for a 4-acryloylmorpholine/4-acryloylpiperidine (4AM/4AP) mixture, to provide a second block with thermo-responsive behavior and a tunable cloud point temperature (CPT), depending on the ratio of 4AM:4AP. Dynamic light scattering of the block copolymer at various pHs (3, 7 and 10) as a function of temperature indicated a rapid increase in particle size to >2000 nm at 22–27 °C, corresponding to the 4AM/4AP segment's thermo-responsiveness, followed by a leveling in particle size to about 500 nm at higher temperatures.

  18. Prognostic significance of pathological response of primary tumor and metastatic axillary lymph nodes after neoadjuvant chemotherapy for locally advanced breast carcinoma.

    Science.gov (United States)

    Machiavelli, M R; Romero, A O; Pérez, J E; Lacava, J A; Domínguez, M E; Rodríguez, R; Barbieri, M R; Romero Acuña, L A; Romero Acuña, J M; Langhi, M J; Amato, S; Ortiz, E H; Vallejo, C T; Leone, B A

    1998-01-01

    The prognostic significance of pathological response of primary tumor and metastatic axillary lymph nodes after neoadjuvant chemotherapy was assessed in patients with noninflammatory locally advanced breast carcinoma. Between January 1989 and April 1995, 148 consecutive patients with locally advanced breast carcinoma participated in the study. Of these, 140 fully evaluable patients (67, stage IIIA; 73, stage IIIB) were treated with three courses of 5-fluorouracil, doxorubicin, and cyclophosphamide (FAC), followed by modified radical mastectomy when technically feasible or definitive radiation therapy. The median age was 53 years (range, 26 to 75 years); 55% of patients were postmenopausal. Objective response was recorded in 99 of 140 patients (71%; 95% confidence interval, 63% to 79%). Complete response occurred in 11 patients (8%), and partial response occurred in 88 patients (63%). No change was recorded in 37 patients (26%), and progressive disease occurred in 4 patients (3%). One hundred and thirty-six patients underwent the planned surgery. Maximal pathological response of the primary tumor (in situ carcinoma or minimal microscopic residual tumor) was observed in 24 (18%); 112 patients (82%) presented minimal pathological response of the primary tumor (gross residual tumor). The number of metastatic axillary nodes after neoadjuvant chemotherapy was as follows: N0, 39 patients (29%); N1-N3, 35 patients (26%); > N3, 62 patients (45%). Considering the initial TNM status, 75% of the patients had decreases in tumor compartment after neoadjuvant chemotherapy. Also, 31% and 23% of patients with clinical N1 and N2, respectively, showed uninvolved axillary lymph nodes. A significant correlation was noted between pathological response of primary tumor and the number of metastatic axillary lymph nodes. Median disease-free survival was 34 months, whereas median overall survival was 66 months. Pathological responses of both primary tumor and metastatic axillary lymph nodes …

  19. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; the statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect gas theory applied to liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of macromolecular systems; and quantum statistics.

  20. Dose response relationship in local radiotherapy for hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Park, Hee Chul; Seong, Jin Sil; Han, Kwang Hyub; Chon, Chae Yoon; Moon, Young Myoung; Song, Jae Seok; Suh, Chang Ok

    2001-01-01

    In this study, we investigated whether a dose-response relationship exists in local radiotherapy for primary hepatocellular carcinoma. From January 1992 to March 2000, 158 patients were included in the present study. Exclusion criteria included the presence of extrahepatic metastasis, liver cirrhosis of Child's class C, tumors occupying more than two thirds of the entire liver, and a performance status on the ECOG scale of more than 3. Radiotherapy was given to a field including the tumor with a generous margin, using 6- and 10-MV X-rays. The mean tumor dose was 48.2±7.9 Gy in daily 1.8 Gy fractions. Tumor response was assessed by diagnostic radiologic examinations such as CT scan, MR imaging and hepatic artery angiography at 4-8 weeks following completion of treatment. Statistical analysis was done to investigate the existence of a dose-response relationship when local radiotherapy was applied to the treatment of primary hepatocellular carcinoma. An objective response was observed in 106 of 158 patients, giving a response rate of 67.1%. Statistical analysis revealed that total dose was the most significant factor in relation to tumor response. Only 29.2% of patients treated with doses less than 40 Gy showed an objective response, while 68.6% and 77.1% of patients given 40-50 Gy and more than 50 Gy, respectively, showed a major response. Child-Pugh classification was a significant factor in the development of ascites, overt radiation-induced liver disease and gastroenteritis. Radiation dose was an important factor for the development of radiation-induced gastroduodenal ulcer. The present study showed the existence of a dose-response relationship in local radiotherapy for primary hepatocellular carcinoma. Only radiotherapy dose was a significant factor predicting the objective response. Further study is required to predict the maximal tolerance dose in consideration of liver function and non-irradiated liver …

  1. Simulating Durum Wheat (Triticum turgidum L.) Response to Root Zone Salinity based on Statistics and Macroscopic Models

    Directory of Open Access Journals (Sweden)

    Vahid Reza Jalali

    2017-10-01

    Introduction: Salinity as an abiotic stress can cause excessive disturbance to seed germination and sustainable plant production. Salinity, through three different mechanisms (reduction of osmotic potential, ionic toxicity, and disturbance of the plant's nutritional balance), can reduce the performance of the final product. Planning for the optimal use of available water, including saline water of poor quality, in agricultural activities is of great importance. Wheat is one of the eight main food sources, together with rice, corn, sugar beet, cattle, sorghum, millet and cassava, which provide 70-90% of all calories and 66-90% of the protein consumed in developing countries. Durum wheat (Triticum turgidum L.) is an important crop grown in some arid and semi-arid areas of the world such as the Middle East and North Africa. In these regions, in addition to soil salinity, the sharp decline in rainfall and the sharp drop in groundwater levels in recent years have emphasized the need for efficient use of limited soil and water resources. Consequently, in order to use brackish water for agricultural production, its quantitative response to salinity stress must be analyzed with simulation models in those regions. The objective of this study is to assess the capability of statistical and macroscopic simulation models of yield under saline conditions. Materials and methods: In this study, two general simulation approaches, process-physical models and statistical-experimental models, were investigated. For this purpose, in order to quantify the salinity effect on the relative seed yield of durum wheat (Behrang variety) at different levels of soil salinity, the process-physical models of Maas & Hoffman, van Genuchten & Hoffman, Dirksen et al. and Homaee et al. were used, along with the statistical-experimental models of the Modified Gompertz Function, the Bi-Exponential Function and the Modified Weibull Function. In order to get closer to the real growth conditions in saline soils, a natural saline …
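
    Of the macroscopic models listed above, Maas & Hoffman is the simplest: relative yield stays at 100% up to a threshold salinity a and then declines linearly with slope b. A minimal sketch follows; the default a and b are illustrative values for wheat-like crops, not parameters calibrated for the Behrang variety.

```python
import numpy as np

def maas_hoffman(ece, a=5.9, b=3.8):
    """Relative yield (%) versus root-zone salinity ECe (dS/m).
    a = threshold salinity (dS/m); b = % yield loss per dS/m above a.
    Defaults are illustrative, not calibrated for Behrang durum wheat."""
    ece = np.asarray(ece, dtype=float)
    yr = np.where(ece <= a, 100.0, 100.0 - b * (ece - a))
    return np.clip(yr, 0.0, 100.0)

for ec in (2, 6, 10, 14):
    print(f"ECe = {ec:2d} dS/m -> relative yield {float(maas_hoffman(ec)):5.1f} %")
```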

  2. Space radiation-induced bystander effect: kinetics of biologic responses, mechanisms, and significance of secondary radiations

    International Nuclear Information System (INIS)

    Gonon, Geraldine

    2011-01-01

    Widespread evidence indicates that exposure of cell cultures to α particles results in significant biological changes in both the irradiated and non-irradiated bystander cells in the population. The induction of non-targeted biological responses in cell cultures exposed to low fluences of high charge (Z) and high energy (E) particles is relevant to estimates of the health risks of space radiation and to radiotherapy. Here, we investigated the mechanisms underlying the induction of stressful effects in confluent normal human fibroblast cultures exposed to low fluences of 1000 MeV/u iron ions (linear energy transfer (LET) 151 keV/μm), 600 MeV/u silicon ions (LET 50 keV/μm) or 290 MeV/u carbon ions (LET 13 keV/μm). We compared the results with those obtained in cell cultures exposed, in parallel, to low fluences of 0.92 MeV/u α particles (LET 109 keV/μm). Induction of DNA damage, changes in gene expression, protein carbonylation and lipid peroxidation during the 24 h after exposure of confluent cultures to mean doses as low as 0.2 cGy of iron or silicon ions strongly supported the propagation of stressful effects from irradiated to bystander cells. At a mean dose of 0.2 cGy, only 1% and 3% of the cells would be targeted through the nucleus by an iron or silicon ion, respectively. Within 24 h post-irradiation, immunoblot analyses revealed significant increases in the levels of phospho-TP53 (serine 15), p21Waf1 (also known as CDKN1A), HDM2, phospho-ERK1/2, protein carbonylation and lipid peroxidation. The magnitude of the responses suggested participation of non-targeted cells in the response. Furthermore, when the irradiated cell populations were subcultured in fresh medium shortly after irradiation, greater than expected increases in the levels of these markers were also observed during the following 24 h. Together, the results imply a rapidly propagated and persistent bystander effect. In situ analyses in confluent cultures showed 53BP1 foci formation, a marker of DNA damage, in …

  3. [Statistics for statistics?--Thoughts about psychological tools].

    Science.gov (United States)

    Berger, Uwe; Stöbel-Richter, Yve

    2007-12-01

    Statistical methods occupy a prominent place in psychologists' educational programs. Known as difficult to understand and laborious to learn, this content is feared by students. Those who do not aspire to a research career at a university quickly forget the drilled material. Furthermore, because it does not appear applicable to work with patients and other target groups at first glance, the methodological education as a whole has often been questioned. For many psychological practitioners, statistical education seems to make sense only for commanding respect from other professions, namely physicians. For their own practice, statistics is rarely taken seriously as a professional tool. The reason seems clear: statistics treats numbers, while psychotherapy treats subjects. So, is statistics an end in itself? With this article, we try to answer the question of whether and how statistical methods are represented within psychotherapeutic and psychological research. We therefore analyzed 46 original articles from a complete volume of the journal Psychotherapy, Psychosomatics, Psychological Medicine (PPmP). Within the volume, 28 different analysis methods were applied, of which 89 percent were directly based on statistics. Being able to write and critically read original articles, as the backbone of research, presupposes a high degree of statistical education. To ignore statistics means to ignore research, and ultimately to expose one's own professional work to arbitrariness.

  4. Functional significance of the electrocorticographic auditory responses in the premotor cortex

    Directory of Open Access Journals (Sweden)

    Kazuyo eTanji

    2015-03-01

    Other than the well-known motor activities of the precentral gyrus, functional magnetic resonance imaging (fMRI) studies have found that the ventral part of the precentral gyrus is activated in response to linguistic auditory stimuli. It has been proposed that the premotor cortex in the precentral gyrus is responsible for the comprehension of speech, but the precise function of this area is still debated, because patients with frontal lesions that include the precentral gyrus do not exhibit disturbances in speech comprehension. We report on a patient who underwent resection of a tumor in the precentral gyrus with electrocorticographic recordings while she performed a verb generation task during awake craniotomy. Consistent with previous fMRI studies, high-gamma band auditory activity was observed in the precentral gyrus. Due to the location of the tumor, the patient underwent resection of the auditory-responsive precentral area, which resulted in the post-operative expression of a characteristic articulatory disturbance known as apraxia of speech (AOS). The language function of the patient was otherwise preserved, and she exhibited intact comprehension of both spoken and written language. The present findings demonstrate that a lesion restricted to the ventral precentral gyrus is sufficient for the expression of AOS and suggest that the auditory-responsive area plays an important role in the execution of fluent speech rather than in the comprehension of speech. These findings also confirm that the function of the premotor area is predominantly motor in nature and that its sensory responses are more consistent with the 'sensory theory of speech production', in which it was proposed that sensory representations are used to guide motor-articulatory processes.

  5. Statistical optimization for alkali pretreatment conditions of narrow-leaf cattail by response surface methodology

    Directory of Open Access Journals (Sweden)

    Arrisa Ruangmee

    2013-08-01

    Response surface methodology with central composite design was applied to optimize alkali pretreatment of narrow-leaf cattail (Typha angustifolia). Joint effects of three independent variables, namely NaOH concentration (1-5%), temperature (60-100 ºC), and reaction time (30-150 min), were investigated to evaluate the increase in and improvement of the cellulosic components contained in the raw material after pretreatment. The combined optimum condition based on the cellulosic content obtained from this study is: a concentration of 5% NaOH, a reaction time of 120 min, and a temperature of 100 ºC. This result was analyzed employing ANOVA with a second-order polynomial equation. The model was found to be significant and was able to predict accurately the response at less than 5% error. Under this combined optimal condition, the desirable cellulosic content in the sample increased from 38.5 to 68.3%, while the unfavorable hemicellulosic content decreased from 37.6 to 7.3%.

  6. No priming of the immune response in newborn Brown Norway rats dosed with ovalbumin in the mouth

    DEFF Research Database (Denmark)

    Madsen, Charlotte Bernhard; Pilegaard, Kirsten

    2003-01-01

    … with ovalbumin, and whether this method could be used in an animal model for food allergy. Methods: Newborn Brown Norway rats were dosed with ovalbumin in the mouth (100 μg or 6 mg). As young adults, the animals were dosed by gavage for 35 days with 1 mg ovalbumin/day or once intraperitoneally with 100 μg. Control … IgE and IgG responses were decreased compared to the control groups, however not always reaching statistical significance. A statistically significant decrease in the specific immune response was found in young adult rats dosed in the mouth as compared to by gavage. Conclusions: Dosing Brown Norway rats …

  7. On a curvature-statistics theorem

    International Nuclear Information System (INIS)

    Calixto, M; Aldaya, V

    2008-01-01

    The spin-statistics theorem in quantum field theory relates the spin of a particle to the statistics obeyed by that particle. Here we investigate an interesting correspondence or connection between curvature (κ = ±1) and quantum statistics (Fermi-Dirac and Bose-Einstein, respectively). The interrelation between both concepts is established through vacuum coherent configurations of zero modes in quantum field theory on the compact O(3) and noncompact O(2; 1) (spatial) isometry subgroups of de Sitter and Anti de Sitter spaces, respectively. The high frequency limit is retrieved as a (zero curvature) group contraction to the Newton-Hooke (harmonic oscillator) group. We also make some comments on the physical significance of the vacuum energy density and the cosmological constant problem.

  8. On a curvature-statistics theorem

    Energy Technology Data Exchange (ETDEWEB)

    Calixto, M [Departamento de Matematica Aplicada y Estadistica, Universidad Politecnica de Cartagena, Paseo Alfonso XIII 56, 30203 Cartagena (Spain); Aldaya, V [Instituto de Astrofisica de Andalucia, Apartado Postal 3004, 18080 Granada (Spain)], E-mail: Manuel.Calixto@upct.es

    2008-08-15

    The spin-statistics theorem in quantum field theory relates the spin of a particle to the statistics obeyed by that particle. Here we investigate an interesting correspondence or connection between curvature (κ = ±1) and quantum statistics (Fermi-Dirac and Bose-Einstein, respectively). The interrelation between both concepts is established through vacuum coherent configurations of zero modes in quantum field theory on the compact O(3) and noncompact O(2; 1) (spatial) isometry subgroups of de Sitter and Anti de Sitter spaces, respectively. The high frequency limit is retrieved as a (zero curvature) group contraction to the Newton-Hooke (harmonic oscillator) group. We also make some comments on the physical significance of the vacuum energy density and the cosmological constant problem.

  9. Spontaneous Responses to Neoliberalism, and their Significance for Education

    Directory of Open Access Journals (Sweden)

    Johannes L van der Walt

    2017-05-01

    This paper is a sequel to the keynote address at the 2017 BCES Conference. The keynote address concluded with the thought that some educationists respond intuitively and spontaneously to neoliberalism and its impact on education, whereas others reject neoliberalist precepts and their pedagogical implications on definite principled grounds. This paper deals with the former response; it offers pedagogical insights gleaned from an overview of intuitive, spontaneous reactions to neoliberalism.

  10. Testing statistical hypotheses of equivalence

    CERN Document Server

    Wellek, Stefan

    2010-01-01

    Equivalence testing has grown significantly in importance over the last two decades, especially as its relevance to a variety of applications has become understood. Yet published work on the general methodology remains scattered in specialists' journals, and for the most part, it focuses on the relatively narrow topic of bioequivalence assessment. With a far broader perspective, Testing Statistical Hypotheses of Equivalence provides the first comprehensive treatment of statistical equivalence testing. The author addresses a spectrum of specific, two-sided equivalence testing problems, from the …
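
    The workhorse of equivalence testing is the two one-sided tests (TOST) procedure: equivalence is declared only if the effect is shown to be both above the lower margin and below the upper margin. A minimal sketch for two independent samples, with invented data, invented margins, and a deliberately simplified degrees-of-freedom choice:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
a = rng.normal(10.0, 2.0, 40)       # e.g., reference formulation
b = rng.normal(10.3, 2.0, 40)       # e.g., test formulation
low, high = -1.0, 1.0               # pre-specified equivalence margins

diff = b.mean() - a.mean()
se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
df = a.size + b.size - 2            # simplified df, for illustration only

# Two one-sided tests: H0a: diff <= low, H0b: diff >= high.
p_lower = stats.t.sf((diff - low) / se, df)
p_upper = stats.t.cdf((diff - high) / se, df)
p_tost = max(p_lower, p_upper)
verdict = "equivalent" if p_tost < 0.05 else "not shown equivalent"
print(f"diff = {diff:+.3f}, TOST p = {p_tost:.4f} ({verdict} at margins +/-1)")
```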

  11. Texture classification by texton: statistical versus binary.

    Directory of Open Access Journals (Sweden)

    Zhenhua Guo

    Using statistical textons for texture classification has shown great success recently. The maximal response 8 (Statistical_MR8), image patch (Statistical_Joint) and locally invariant fractal (Statistical_Fractal) are typical statistical texton algorithms and state-of-the-art texture classification methods. However, there are two limitations when using these methods. First, a training stage is needed to build a texton library, so recognition accuracy is highly dependent on the training samples; second, during feature extraction, each local feature is assigned to a texton by searching for the nearest texton in the whole library, which is time consuming when the library size is big and the dimension of the feature is high. To address the above two issues, three binary texton counterpart methods are proposed in this paper: Binary_MR8, Binary_Joint, and Binary_Fractal. These methods do not require any training step but encode the local feature into a binary representation directly. The experimental results on the CUReT, UIUC and KTH-TIPS databases show that binary textons achieve sound results with fast feature extraction, especially when the image size is not big and the quality of the image is not poor.

  12. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    International Nuclear Information System (INIS)

    Dai, Wu-Sheng; Xie, Mi

    2013-01-01

    In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose-Einstein and Fermi-Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers which are determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion on calculating statistical distributions from relations of creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► Arguing that many results of q-deformation distributions in the literature are inaccurate or incomplete
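
    For reference, the Gentile distribution with maximum occupation number q has the standard form below (stated from general knowledge of Gentile statistics, not quoted from this paper):

```latex
% Mean occupation number of a single-particle state of energy E under
% Gentile statistics with maximum occupation q, writing x = \beta (E - \mu):
\langle n \rangle = \frac{1}{e^{x} - 1} - \frac{q + 1}{e^{(q+1)x} - 1}
% Checks: q = 1 reduces to Fermi-Dirac, 1/(e^{x} + 1);
% q -> \infty recovers Bose-Einstein, 1/(e^{x} - 1).
```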

  13. Survey of editors and reviewers of high-impact psychology journals: statistical and research design problems in submitted manuscripts.

    Science.gov (United States)

    Harris, Alex; Reeder, Rachelle; Hyun, Jenny

    2011-01-01

    The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.

  14. Inferential Statistics from Black Hispanic Breast Cancer Survival Data

    Directory of Open Access Journals (Sweden)

    Hafiz M. R. Khan

    2014-01-01

    In this paper we test statistical probability models for breast cancer survival data by race and ethnicity. Data were collected from breast cancer patients diagnosed in the United States during the years 1973-2009. We selected a stratified random sample of Black Hispanic female patients from the Surveillance Epidemiology and End Results (SEER) database to derive the statistical probability models. We used three common model building criteria, Akaike Information Criteria (AIC), Bayesian Information Criteria (BIC), and Deviance Information Criteria (DIC), to measure goodness of fit, and found that the Black Hispanic female patients' survival data better fit the exponentiated exponential probability model. A novel Bayesian method was used to derive the posterior density function for the model parameters as well as the predictive inference for future response. We specifically focused on the Black Hispanic race. The Markov Chain Monte Carlo (MCMC) method was used to obtain summary results for the posterior parameters. Additionally, we report predictive intervals for future survival times. These findings would be of great significance in treatment planning and healthcare resource allocation.
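
    The information criteria named above are simple functions of the maximized log-likelihood, AIC = 2k - 2 ln L and BIC = k ln n - 2 ln L; DIC additionally requires posterior samples and is omitted here. The sketch below compares common lifetime distributions on synthetic data; the exponentiated exponential model is not built into SciPy, and these are not SEER data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
t = stats.expon.rvs(scale=60, size=300, random_state=rng)  # synthetic survival times

candidates = {
    "exponential": stats.expon,
    "Weibull":     stats.weibull_min,
    "lognormal":   stats.lognorm,
}
n = t.size
for name, dist in candidates.items():
    params = dist.fit(t, floc=0)            # fix location at 0 for lifetimes
    loglik = dist.logpdf(t, *params).sum()
    k = len(params) - 1                     # free parameters (location fixed)
    aic = 2 * k - 2 * loglik
    bic = k * np.log(n) - 2 * loglik
    print(f"{name:12s} AIC = {aic:8.1f}  BIC = {bic:8.1f}")
```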

  15. Sustained Magnetic Responses in Temporal Cortex Reflect Instantaneous Significance of Approaching and Receding Sounds.

    Directory of Open Access Journals (Sweden)

    Dominik R Bach

    Full Text Available Rising sound intensity often signals an approaching sound source and can serve as a powerful warning cue, eliciting phasic attention, perception biases, and emotional responses. How the evaluation of approaching sounds unfolds over time remains elusive. Here, we capitalised on the temporal resolution of magnetoencephalography (MEG) to investigate the dynamic encoding of approaching and receding sounds in humans. We compared magnetic responses to intensity envelopes of complex sounds with those to white noise sounds, in which intensity change is not perceived as approaching. Sustained magnetic fields over temporal sensors tracked intensity change in complex sounds in an approximately linear fashion, an effect not seen for intensity change in white noise sounds, or for overall intensity. Hence, these fields are likely to track approach/recession, but not the apparent (instantaneous) distance of the sound source, or its intensity as such. The bilateral inferior temporal gyrus and the right temporo-parietal junction emerged as likely sources of this activity. Our results indicate that discrete temporal cortical areas parametrically encode the behavioural significance of moving sound sources, with the signal unfolding in a manner reminiscent of evidence accumulation. This may further our understanding of how acoustic percepts are evaluated as behaviourally relevant, and highlights a crucial role of these cortical areas.

  16. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., the Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
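
    To illustrate the kind of Monte Carlo check on nonlinear least-squares parameter distributions this abstract describes: a hedged sketch, with an invented exponential-decay model, that refits many simulated data sets and inspects the spread and skewness of one parameter (the "10% rule of thumb" concerns its relative standard error).

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def model(x, a, k):
    return a * np.exp(-k * x)   # simple nonlinear response function

x = np.linspace(0, 4, 20)
a_true, k_true, sigma = 10.0, 0.8, 0.4
k_hats = []
for _ in range(2000):                      # Monte Carlo replicate data sets
    y = model(x, a_true, k_true) + rng.normal(0, sigma, x.size)
    popt, _ = curve_fit(model, x, y, p0=[8, 1])
    k_hats.append(popt[1])

k_hats = np.array(k_hats)
print(f"relative std error of k: {k_hats.std() / k_hats.mean():.1%}")
skew = ((k_hats - k_hats.mean()) ** 3).mean() / k_hats.std() ** 3
print(f"skewness of k estimates: {skew:+.2f}")   # nonzero => nonnormal
```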

  17. Statistical and theoretical research

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    Significant accomplishments include the creation of field designs to detect population impacts, new census procedures for small mammals, and methods for designing studies to determine where, and how much of, a contaminant is present over certain landscapes. A book describing these statistical methods is currently being written and will apply to a variety of environmental contaminants, including radionuclides. PNL scientists have also devised an analytical method for predicting the success of field experiments on wild populations. Two highlights of current research are the discoveries that populations of free-roaming horse herds can double in four years and that grizzly bear populations may be substantially smaller than once thought. As stray horses become a public nuisance at DOE and other large Federal sites, it is important to determine their number. Similar statistical theory can be readily applied to other situations where wild animals are a problem of concern to other government agencies. Another book, on statistical aspects of radionuclide studies, is written specifically for researchers in radioecology.

  18. Significance analysis of lexical bias in microarray data

    Directory of Open Access Journals (Sweden)

    Falkow Stanley

    2003-04-01

    Full Text Available Abstract. Background: Genes that are determined to be significantly differentially regulated in microarray analyses often appear to have functional commonalities, such as being components of the same biochemical pathway. This results in certain words being under- or overrepresented in the list of genes. Distinguishing between biologically meaningful trends and artifacts of annotation and analysis procedures is of the utmost importance, as only true biological trends are of interest for further experimentation. A number of sophisticated methods for identification of significant lexical trends are currently available, but these methods are generally too cumbersome for practical use by most microarray users. Results: We have developed a tool, LACK, for calculating the statistical significance of apparent lexical bias in microarray datasets. The frequency of a user-specified list of search terms in a list of genes which are differentially regulated is assessed for statistical significance by comparison to randomly generated datasets. The simplicity of the input files and user interface targets the average microarray user who wishes to have a statistical measure of apparent lexical trends in analyzed datasets without the need for bioinformatics skills. The software is available as Perl source or a Windows executable. Conclusion: We have used LACK in our laboratory to generate biological hypotheses based on our microarray data. We demonstrate the program's utility using an example in which we confirm significant upregulation of the SPI-2 pathogenicity island of Salmonella enterica serovar Typhimurium by the cation chelator dipyridyl.
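
    LACK itself is distributed as Perl source or a Windows executable; purely as a toy re-implementation of the underlying idea (the frequency of a term in a regulated-gene list assessed against randomly drawn gene sets), here is a hedged Python sketch with invented gene names and annotations.

```python
import random

def lexical_bias_p(term, regulated, all_genes, annotations,
                   n_perm=10_000, seed=7):
    """One-sided permutation p-value for over-representation of `term`
    in the annotations of the regulated gene list (a toy illustration
    of the idea behind LACK, not the tool itself)."""
    rng = random.Random(seed)

    def count(genes):
        return sum(term in annotations[g].lower() for g in genes)

    observed = count(regulated)
    hits = sum(count(rng.sample(all_genes, len(regulated))) >= observed
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)

annotations = {"g1": "SPI-2 secretion system", "g2": "ribosomal protein",
               "g3": "secretion apparatus SPI-2", "g4": "unknown",
               "g5": "flagellar motor", "g6": "SPI-2 effector"}
print(lexical_bias_p("spi-2", regulated=["g1", "g3", "g6"],
                     all_genes=list(annotations),
                     annotations=annotations, n_perm=2000))
```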

  19. Statistical language learning in neonates revealed by event-related brain potentials

    Directory of Open Access Journals (Sweden)

    Näätänen Risto

    2009-03-01

    Full Text Available Abstract. Background: Statistical learning is a candidate for one of the basic prerequisites underlying the expeditious acquisition of spoken language. Infants from 8 months of age exhibit this form of learning to segment fluent speech into distinct words. To test the statistical learning skills at birth, we recorded event-related brain responses of sleeping neonates while they were listening to a stream of syllables containing statistical cues to word boundaries. Results: We found evidence that sleeping neonates are able to automatically extract statistical properties of the speech input and thus detect the word boundaries in a continuous stream of syllables containing no morphological cues. Syllable-specific event-related brain responses found in two separate studies demonstrated that the neonatal brain treated the syllables differently according to their position within pseudowords. Conclusion: These results demonstrate that neonates can efficiently learn transitional probabilities or frequencies of co-occurrence between different syllables, enabling them to detect word boundaries and in this way isolate single words out of fluent natural speech. The ability to adopt statistical structures from speech may play a fundamental role as one of the earliest prerequisites of language acquisition.
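
    The statistical cue exploited in such segmentation studies is the transitional probability between adjacent syllables, P(next | current) = freq(pair)/freq(current), which is high within pseudowords and drops at word boundaries. A minimal sketch with invented pseudowords:

```python
import random
from collections import Counter

def transitional_probs(syllables):
    """P(next | current) = freq(pair) / freq(current): the statistical
    cue to word boundaries in a continuous syllable stream."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

# Stream built from two pseudowords: within-word transitions are
# deterministic (TP = 1), transitions across word boundaries are not.
rng = random.Random(3)
words = [["tu", "pi", "ro"], ["go", "la", "bu"]]
stream = [syll for _ in range(200) for syll in rng.choice(words)]
tps = transitional_probs(stream)
print(tps[("tu", "pi")])   # 1.0: within a pseudoword
print(tps[("ro", "go")])   # ~0.5: across a word boundary
```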

  20. Statistical Efficiency of Double-Bounded Dichotomous Choice Contingent Valuation

    OpenAIRE

    Michael Hanemann; John Loomis; Barbara Kanninen

    1991-01-01

    The statistical efficiency of conventional dichotomous choice contingent valuation surveys can be improved by asking each respondent a second dichotomous choice question which depends on the response to the first question—if the first response is "yes," the second bid is some amount greater than the first bid, while if the first response is "no," the second bid is some amount smaller. This "double-bounded" approach is shown to be asymptotically more efficient than the conventional, "single-bounded" approach.
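
    A hedged sketch of the double-bounded idea this abstract describes, under the common (but here assumed) specification of logistically distributed willingness to pay; all bid designs and parameter values below are invented for illustration. Each pair of yes/no answers brackets the respondent's WTP into an interval, and the interval probabilities enter the log-likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import logistic

def nll_double_bounded(params, bid1, bid2, ans1, ans2):
    """Negative log-likelihood of a double-bounded model with
    WTP ~ Logistic(mu, s). ans1/ans2 are True for 'yes'; bid2 is the
    follow-up bid (higher after 'yes', lower after 'no')."""
    mu, s = params
    if s <= 0:
        return np.inf
    F = lambda b: logistic.cdf(b, loc=mu, scale=s)
    p = np.where(ans1 & ans2, 1 - F(bid2),              # yes, yes
        np.where(ans1 & ~ans2, F(bid2) - F(bid1),       # yes, no
        np.where(~ans1 & ans2, F(bid1) - F(bid2),       # no, yes
                 F(bid2))))                             # no, no
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

rng = np.random.default_rng(4)
n = 500
wtp = logistic.rvs(loc=50, scale=15, size=n, random_state=rng)
bid1 = rng.choice([20, 40, 60, 80], size=n).astype(float)
ans1 = wtp >= bid1
bid2 = np.where(ans1, 2 * bid1, bid1 / 2)
ans2 = wtp >= bid2
fit = minimize(nll_double_bounded, x0=[40.0, 10.0],
               args=(bid1, bid2, ans1, ans2), method="Nelder-Mead")
print(fit.x)   # estimates of (mu, s); mu is the mean WTP here
```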

  1. Whither Statistics Education Research?

    Science.gov (United States)

    Watson, Jane

    2016-01-01

    This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement in this country to include "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…

  2. Innovation, ENBIS and the Importance of Practice in the Development of Statistics.

    NARCIS (Netherlands)

    Bisgaard, S.

    2005-01-01

    The theme of this paper is innovation - in statistics and, more generally, in business and industry. Innovations in statistical theory have frequently occurred in direct response to demands for solutions to important practical problems. A major motivation for the establishment of the

  3. Statistical modeling of nitrogen-dependent modulation of root system architecture in Arabidopsis thaliana.

    Science.gov (United States)

    Araya, Takao; Kubo, Takuya; von Wirén, Nicolaus; Takahashi, Hideki

    2016-03-01

    Plant root development is strongly affected by nutrient availability. Despite the importance of the structure and function of roots in nutrient acquisition, statistical modeling approaches to evaluate dynamic and temporal modulations of root system architecture in response to nutrient availability have remained widely open, exploratory areas in root biology. In this study, we developed a statistical modeling approach to investigate modulations of root system architecture in response to nitrogen availability. Mathematical models were designed for quantitative assessment of root growth and root branching phenotypes and their dynamic relationships, based on the hierarchical configuration of primary and lateral roots formulating the fishbone-shaped root system architecture in Arabidopsis thaliana. Time-series datasets reporting dynamic changes in root developmental traits on different nitrate or ammonium concentrations were generated for statistical analyses. Regression analyses unraveled key parameters associated with: (i) inhibition of primary root growth under nitrogen limitation or on ammonium; (ii) rapid progression of lateral root emergence in response to ammonium; and (iii) inhibition of lateral root elongation in the presence of excess nitrate or ammonium. This study provides a statistical framework for interpreting dynamic modulation of root system architecture, supported by meta-analysis of datasets displaying morphological responses of roots to diverse nitrogen supplies. © 2015 Institute of Botany, Chinese Academy of Sciences.

  4. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  5. Effects of Environmental Context on Physiological Response During Team Handball Small Sided Games.

    Science.gov (United States)

    Bělka, Jan; Hulka, Karel; Machová, Iva; Šafář, Michal; Weisser, Radim; Bellar, David M; Hoover, Donald L; Judge, Lawrence W

    2017-01-01

    This study examined the distance covered and physiological effects of altering the number of players during small-sided games (SSG) in team handball. Twelve professional female handball players [24.6 ± 3.7 years, 172 ± 6.2 cm, 68.2 ± 9.9 kg, 22.7 ± 2 kg/m²] participated in this study. The SSG were played first with five players on each side (SSG 5), then four (SSG 4), then three (SSG 3). Each game was four minutes long, followed by three minutes of rest. The distance covered and time spent in four speed zones (based on player movement speed) were selected for analysis: Zone 1 (0-1.4 m/s), Zone 2 (1.5-3.4 m/s), Zone 3 (3.5-5.2 m/s), and Zone 4 (>5.2 m/s). Statistically significant differences were found in Zone 2 between conditions SSG 3 and SSG 4 (p = .049, ω² = .32). The highest average heart rate (HR) occurred during SSG 3. Average HR during SSG 3 (89.7% HRmax) and SSG 5 (87.8% HRmax) also differed significantly (p = .04, ω² = .26). Participant HR response between the speed zones was not statistically significant. HR response was negatively correlated with the number of players within the SSG condition. Statistically significant results were found for RPE between SSG 3 and the other two SSG conditions (SSG 4, p = .01, and SSG 5, p = .00). These results indicate that changing the number of SSG players can be used to manipulate the physiological response during handball training.

  6. Statistics of the electromagnetic response of a chaotic reverberation chamber

    Directory of Open Access Journals (Sweden)

    J.-B. Gros

    2015-11-01

    Full Text Available This article presents a study of the electromagnetic response of a chaotic reverberation chamber (RC) in the presence of losses. By means of simulations and of experiments, the fluctuations in the maxima of the field obtained in a conventional mode-stirred RC are compared with those in a chaotic RC in the neighborhood of the Lowest Usable Frequency (LUF). The present work illustrates that the universal spectral and spatial statistical properties of chaotic RCs make it possible to meet more adequately the criteria required by the Standard IEC 61000-4-21 to perform tests of electromagnetic compatibility.

  7. Assessment of statistic analysis in non-radioisotopic local lymph node assay (non-RI-LLNA) with α-hexylcinnamic aldehyde as an example

    International Nuclear Information System (INIS)

    Takeyoshi, Masahiro; Sawaki, Masakuni; Yamasaki, Kanji; Kimber, Ian

    2003-01-01

    The murine local lymph node assay (LLNA) is used for the identification of chemicals that have the potential to cause skin sensitization. However, it requires specific facilities and handling procedures to accommodate a radioisotopic (RI) endpoint. We have developed a non-radioisotopic (non-RI) endpoint for the LLNA, based on BrdU incorporation, to avoid the use of RI. Although this alternative method appears viable in principle, it is somewhat less sensitive than the standard assay. In this study, we report investigations into the use of statistical analysis to improve the sensitivity of a non-RI LLNA procedure with α-hexylcinnamic aldehyde (HCA) in two separate experiments. The alternative non-RI method required HCA concentrations of greater than 25% to elicit a positive response based on the criterion for classification as a skin sensitizer in the standard LLNA. Nevertheless, dose responses to HCA in the alternative method were consistent in both experiments, and we examined whether an endpoint based upon the statistical significance of induced changes in LNC turnover, rather than an SI of 3 or greater, might provide additional sensitivity. The results reported here demonstrate that, with HCA at least, significant responses were recorded in each of two experiments following exposure of mice to 25% HCA. These data suggest that this approach may be more satisfactory, at least when BrdU incorporation is measured. However, this modification of the LLNA is rather less sensitive than the standard method even when employing a statistical endpoint. Taken together, the data reported here suggest that a modified LLNA in which BrdU is used in place of radioisotope incorporation shows some promise, but that in its present form, even with the use of a statistical endpoint, it lacks some of the sensitivity of the standard method. The challenge is to develop strategies for further refinement of this approach.

  8. Assessment of statistic analysis in non-radioisotopic local lymph node assay (non-RI-LLNA) with alpha-hexylcinnamic aldehyde as an example.

    Science.gov (United States)

    Takeyoshi, Masahiro; Sawaki, Masakuni; Yamasaki, Kanji; Kimber, Ian

    2003-09-30

    The murine local lymph node assay (LLNA) is used for the identification of chemicals that have the potential to cause skin sensitization. However, it requires specific facilities and handling procedures to accommodate a radioisotopic (RI) endpoint. We have developed a non-radioisotopic (non-RI) endpoint for the LLNA, based on BrdU incorporation, to avoid the use of RI. Although this alternative method appears viable in principle, it is somewhat less sensitive than the standard assay. In this study, we report investigations into the use of statistical analysis to improve the sensitivity of a non-RI LLNA procedure with alpha-hexylcinnamic aldehyde (HCA) in two separate experiments. The alternative non-RI method required HCA concentrations of greater than 25% to elicit a positive response based on the criterion for classification as a skin sensitizer in the standard LLNA. Nevertheless, dose responses to HCA in the alternative method were consistent in both experiments, and we examined whether an endpoint based upon the statistical significance of induced changes in LNC turnover, rather than an SI of 3 or greater, might provide additional sensitivity. The results reported here demonstrate that, with HCA at least, significant responses were recorded in each of two experiments following exposure of mice to 25% HCA. These data suggest that this approach may be more satisfactory, at least when BrdU incorporation is measured. However, this modification of the LLNA is rather less sensitive than the standard method even when employing a statistical endpoint. Taken together, the data reported here suggest that a modified LLNA in which BrdU is used in place of radioisotope incorporation shows some promise, but that in its present form, even with the use of a statistical endpoint, it lacks some of the sensitivity of the standard method. The challenge is to develop strategies for further refinement of this approach.

  9. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT.

    Science.gov (United States)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-06-01

    The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, a significantly lower tube current was used compared with the FBP group, 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P < 0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P < 0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P < 0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.

  10. Combining censored and uncensored data in a U-statistic: design and sample size implications for cell therapy research.

    Science.gov (United States)

    Moyé, Lemuel A; Lai, Dejian; Jing, Kaiyan; Baraniuk, Mary Sarah; Kwak, Minjung; Penn, Marc S; Wu, Colon O

    2011-01-01

    The assumptions that anchor large clinical trials are rooted in smaller, Phase II studies. In addition to specifying the target population, intervention delivery, and patient follow-up duration, physician-scientists who design these Phase II studies must select the appropriate response variables (endpoints). However, endpoint measures can be problematic. If the endpoint assesses the change in a continuous measure over time, then the occurrence of an intervening significant clinical event (SCE), such as death, can preclude the follow-up measurement. Moreover, the ideal continuous endpoint measurement may be contraindicated in a fraction of the study patients, requiring a less precise substitution in this subset of participants. A score function based on the U-statistic can address both of these issues: (1) intercurrent SCEs, and (2) response variable ascertainments that use different measurements of different precision. The scoring statistic is easy to apply, clinically relevant, and provides flexibility for the investigators' prospective design decisions. Sample size and power formulations for this statistic are provided as functions of clinical event rates and effect size estimates that are easy for investigators to identify and discuss. Examples are provided from current cardiovascular cell therapy research.
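
    A minimal sketch in the spirit of the score function described above (not the authors' exact statistic): each treated/control pair is compared first on the significant clinical event and, only when that does not decide the pair and both measurements exist, on the continuous endpoint.

```python
import itertools
import numpy as np

def pairwise_score(group_a, group_b):
    """U-statistic-style score: every cross-group pair is compared first
    on the clinical event (death), then on the continuous endpoint when
    both measurements exist. Returns the mean pairwise score: +1 favours
    group A, -1 favours group B, 0 is a tie or indeterminate.
    Each subject is a (died, change) tuple; change may be None (missing)."""
    def compare(a, b):
        if a[0] != b[0]:                   # death trumps the measurement
            return -1 if a[0] else 1
        if a[1] is None or b[1] is None:   # no usable measurement
            return 0
        return (a[1] > b[1]) - (a[1] < b[1])
    return np.mean([compare(a, b)
                    for a, b in itertools.product(group_a, group_b)])

treated = [(False, 5.2), (False, 3.1), (True, None), (False, None)]
control = [(False, 1.0), (True, None), (True, None), (False, 2.5)]
print(pairwise_score(treated, control))
```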

  11. Macro-indicators of citation impacts of six prolific countries: InCites data and the statistical significance of trends.

    Directory of Open Access Journals (Sweden)

    Lutz Bornmann

    Full Text Available Using the InCites tool of Thomson Reuters, this study compares normalized citation impact values calculated for China, Japan, France, Germany, the United States, and the UK throughout the time period from 1981 to 2010. InCites offers a unique opportunity to study the normalized citation impacts of countries using (i) a long publication window (1981 to 2010), (ii) a differentiation in broad or more narrow subject areas, and (iii) statistical procedures that allow an insightful investigation of national citation trends across the years. Using four broad categories, our results show significantly increasing trends in citation impact values for France, the UK, and especially Germany across the last thirty years in all areas. The citation impact of papers from China is still at a relatively low level (mostly below the world average), but the country follows an increasing trend line. The USA exhibits a stable pattern of high citation impact values across the years. With small impact differences between the publication years, the US trend is increasing in engineering and technology but decreasing in medical and health sciences as well as in agricultural sciences. Similar to the USA, Japan follows increasing as well as decreasing trends in different subject areas, but the variability across the years is small. In most of the years, papers from Japan perform below or approximately at the world average in each subject area.

  12. Sensitivity and responsiveness of the patient-reported TED-QOL to rehabilitative surgery in thyroid eye disease.

    Science.gov (United States)

    Fayers, Tessa; Fayers, Peter M; Dolman, Peter J

    2016-12-01

    We tested the sensitivity and responsiveness of the TED-QOL to rehabilitative surgery in thyroid eye disease (TED). The 3-item TED-QOL and 16-item GO-QOL, which assess quality of life (QoL) in TED, were administered to consecutive patients undergoing rehabilitative surgery. The questionnaires were completed pre- and post-operatively to assess sensitivity (ability to discriminate between different surgical groups) and responsiveness (ability to detect within-patient changes over time). Fifty-six patients underwent 69 procedures for TED (29 orbital decompressions, 15 strabismus operations, 25 eyelid procedures). The differences in scores between the three types of surgery (a measure of sensitivity) were statistically significant at the 5% level pre-operatively and post-operatively for all 3 TED-QOL scales and for both GO-QOL scales, but much more so for the TED-QOL scales in each case. The within-patient changes between the pre- and post-operative scores for the same subjects (a measure of responsiveness) were statistically very highly significant for the TED-QOL overall and appearance scales for each of the surgeries. The pre- and post-operative difference for the TED-QOL functioning scale was highly statistically significant for strabismus surgery but not for decompression or lid surgery. The change between the pre- and post-operative scores for the GO-QOL was significant for the functioning scale with strabismus and lid surgery, and was highly significant for the appearance scale with lid surgery but not for strabismus surgery or decompression. The 3-item TED-QOL is sensitive and responsive to rehabilitative surgery in TED and compares favorably with the lengthier GO-QOL on these parameters.

  13. Random Forest Segregation of Drug Responses May define Regions of Biological Significance

    Directory of Open Access Journals (Sweden)

    Qasim eBukhari

    2016-03-01

    Full Text Available The ability to assess brain responses in an unsupervised manner based on fMRI measures has remained a challenge. Here we have applied the Random Forest (RF) method to detect differences in the pharmacological MRI (phMRI) response in rats to treatment with an analgesic drug (buprenorphine) as compared to control (saline). Three groups of animals were studied: two groups treated with different doses of the opioid buprenorphine, low dose (LD) and high dose (HD), and one receiving saline. PhMRI responses were evaluated in 45 brain regions, and RF analysis was applied to allocate rats to the individual treatment groups. RF analysis was able to identify drug effects based on differential phMRI responses in the hippocampus, amygdala, nucleus accumbens, superior colliculus, and the lateral and posterior thalamus for drug vs. saline. These structures have high levels of mu opioid receptors. In addition, these regions are involved in aversive signaling, which is inhibited by mu opioids. The results demonstrate that buprenorphine-mediated phMRI responses comprise characteristic features that allow an unsupervised differentiation from placebo-treated rats, as well as the proper allocation to the respective drug dose group, using the RF method, a method that has been successfully applied in clinical studies.
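
    A hedged sketch of the analysis pattern described here (RF classification of treatment groups from regional response features), using synthetic data with 45 "regions" standing in for the phMRI measurements; the region indices and effect sizes below are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_per_group, n_regions = 20, 45          # 45 brain regions, as in the study
# Synthetic response amplitudes; the "drug" shifts a few regions only.
saline = rng.normal(0, 1, (n_per_group, n_regions))
drug = rng.normal(0, 1, (n_per_group, n_regions))
drug[:, [3, 8, 12]] += 1.5               # invented drug-sensitive regions

X = np.vstack([saline, drug])
y = np.array([0] * n_per_group + [1] * n_per_group)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
print(cross_val_score(rf, X, y, cv=5).mean())    # group-allocation accuracy
rf.fit(X, y)
print(np.argsort(rf.feature_importances_)[-3:])  # most discriminative regions
```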

  14. Childhood-compared to adolescent-onset bipolar disorder has more statistically significant clinical correlates.

    Science.gov (United States)

    Holtzman, Jessica N; Miller, Shefali; Hooshmand, Farnaz; Wang, Po W; Chang, Kiki D; Hill, Shelley J; Rasgon, Natalie L; Ketter, Terence A

    2015-07-01

    The strengths and limitations of considering childhood- and adolescent-onset bipolar disorder (BD) separately versus together remain to be established. We assessed this issue. BD patients referred to the Stanford Bipolar Disorder Clinic during 2000-2011 were assessed with the Systematic Treatment Enhancement Program for BD Affective Disorders Evaluation. Patients with childhood- and adolescent-onset were compared to those with adult-onset for 7 unfavorable bipolar illness characteristics with replicated associations with early-onset patients. Among 502 BD outpatients, those with childhood- (<13 years, N=110) and adolescent- (13-18 years, N=218) onset had significantly higher rates for 4/7 unfavorable illness characteristics, including lifetime comorbid anxiety disorder, at least ten lifetime mood episodes, lifetime alcohol use disorder, and prior suicide attempt, than those with adult-onset (>18 years, N=174). Childhood- but not adolescent-onset BD patients also had significantly higher rates of first-degree relative with mood disorder, lifetime substance use disorder, and rapid cycling in the prior year. Patients with pooled childhood/adolescent- compared to adult-onset had significantly higher rates for 5/7 of these unfavorable illness characteristics, while patients with childhood- compared to adolescent-onset had significantly higher rates for 4/7 of these unfavorable illness characteristics. The Caucasian, insured, suburban, low-substance-abuse, American specialty clinic-referred sample limits generalizability. Onset age is based on retrospective recall. Childhood- compared to adolescent-onset BD was more robustly related to unfavorable bipolar illness characteristics, so pooling these groups attenuated such relationships. Further study is warranted to determine the extent to which adolescent-onset BD represents an intermediate phenotype between childhood- and adult-onset BD. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. The Relationship of Instructional Methods with Student Responses to the Survey of Attitudes Toward Statistics.

    Science.gov (United States)

    Faghihi, Foroozandeh; Rakow, Ernest A.

    This study, conducted at the University of Memphis (Tennessee), compared the effects of a self-paced method of instruction on the attitudes and perceptions of students enrolled in an undergraduate statistics course with those of a comparable group of students taking statistics in a traditional lecture setting. The non-traditional course used a…

  16. A Reanalysis of Curvature in the Dose Response for Cancer and Modifications by Age at Exposure Following Radiation Therapy for Benign Disease

    Energy Technology Data Exchange (ETDEWEB)

    Little, Mark P., E-mail: mark.little@nih.gov [Radiation Epidemiology Branch, National Cancer Institute, Rockville, Maryland (United States); Stovall, Marilyn; Smith, Susan A. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Kleinerman, Ruth A. [Radiation Epidemiology Branch, National Cancer Institute, Rockville, Maryland (United States)

    2013-02-01

    Purpose: To assess the shape of the dose response for various cancer endpoints, and modifiers by age and time. Methods and Materials: Reanalysis of the US peptic ulcer data testing for heterogeneity of radiogenic risk by cancer endpoint (stomach, pancreas, lung, leukemia, all other). Results: There are statistically significant (P<.05) excess risks for all cancer and for lung cancer, and borderline statistically significant risks for stomach cancer (P=.07) and leukemia (P=.06), with excess relative risks per Gy of 0.024 (95% confidence interval [CI] 0.011, 0.039), 0.559 (95% CI 0.221, 1.021), 0.042 (95% CI -0.002, 0.119), and 1.087 (95% CI -0.018, 4.925), respectively. There is a statistically significant (P=.007) excess risk of pancreatic cancer when adjusted for dose-response curvature. General downward curvature is apparent in the dose response, statistically significant (P<.05) for all cancers, pancreatic cancer, and all other cancers (i.e., other than stomach, pancreas, lung, and leukemia). There are indications of reduction in relative risk with increasing age at exposure (for all cancers, pancreatic cancer), but no evidence for quadratic variations in relative risk with age at exposure. If a linear-exponential dose response is used, there is no significant heterogeneity in the dose response among the 5 endpoints considered or in the speed of variation of relative risk with age at exposure. The risks are generally consistent with those observed in the Japanese atomic bomb survivors and in groups of nuclear workers. Conclusions: There are excess risks for various malignancies in this data set. Generally there is a marked downward curvature in the dose response and a significant reduction in relative risk with increasing age at exposure. The consistency of risks with those observed in the Japanese atomic bomb survivors and in groups of nuclear workers implies that there may be little sparing effect of fractionation of dose or low-dose-rate exposure.

  17. Encounter Probability of Significant Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    The determination of the design wave height (often given as the significant wave height) is usually based on statistical analysis of long-term extreme wave height measurement or hindcast. The result of such extreme wave height analysis is often given as the design wave height corresponding to a c...
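
    The encounter probability referred to in this record's title has a standard closed form under the assumption of independent years: the probability that the T-year design event (e.g., a significant wave height with return period T) is exceeded at least once during a structure lifetime of L years is p = 1 − (1 − 1/T)^L. A one-function sketch:

```python
def encounter_probability(return_period_years, lifetime_years):
    """Probability that the T-year event is exceeded at least once
    during a lifetime of L years, assuming independent years:
        p = 1 - (1 - 1/T) ** L
    """
    return 1 - (1 - 1 / return_period_years) ** lifetime_years

print(f"{encounter_probability(100, 50):.1%}")   # ~39.5% for T=100 y, L=50 y
```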

  18. Statistical evaluation of diagnostic performance topics in ROC analysis

    CERN Document Server

    Zou, Kelly H; Bandos, Andriy I; Ohno-Machado, Lucila; Rockette, Howard E

    2016-01-01

    Statistical evaluation of diagnostic performance in general and Receiver Operating Characteristic (ROC) analysis in particular are important for assessing the performance of medical tests and statistical classifiers, as well as for evaluating predictive models or algorithms. This book presents innovative approaches in ROC analysis, which are relevant to a wide variety of applications, including medical imaging, cancer research, epidemiology, and bioinformatics. Statistical Evaluation of Diagnostic Performance: Topics in ROC Analysis covers areas including monotone-transformation techniques in parametric ROC analysis, ROC methods for combined and pooled biomarkers, Bayesian hierarchical transformation models, sequential designs and inferences in the ROC setting, predictive modeling, multireader ROC analysis, and free-response ROC (FROC) methodology. The book is suitable for graduate-level students and researchers in statistics, biostatistics, epidemiology, public health, biomedical engineering, radiology, medi...

  19. Evaluation of Theoretical and Empirical Characteristics of the Communication, Language, and Statistics Survey (CLASS)

    Science.gov (United States)

    Wagler, Amy E.; Lesser, Lawrence M.

    2018-01-01

    The interaction between language and the learning of statistical concepts has been receiving increased attention. The Communication, Language, And Statistics Survey (CLASS) was developed in response to the need to focus on dynamics of language in light of the culturally and linguistically diverse environments of introductory statistics classrooms.…

  20. Whole-body MRI quantitative biomarkers are associated significantly with treatment response in patients with newly diagnosed symptomatic multiple myeloma following bortezomib induction

    Energy Technology Data Exchange (ETDEWEB)

    Latifoltojar, Arash; Dikaios, Nikolaos [University College London, Centre for Medical Imaging, London (United Kingdom); Hall-Craggs, Margaret; Taylor, Stuart A.; Halligan, Steve; Punwani, Shonit [University College London, Centre for Medical Imaging, London (United Kingdom); University College London Hospital, Department of Radiology, London (United Kingdom); Bainbridge, Alan; Sokolska, Magdalena [University College London Hospital, Department of Medical Physics and Bioengineering, London (United Kingdom); Rabin, Neil; Popat, Rakesh; Rismani, Ali; D'Sa, Shirley; Yong, Kwee [University College London Hospital, Department of Haematology, London (United Kingdom); Antonelli, Michela; Ourselin, Sebastien [University College London, Translational Imaging Group, Centre for Medical Imaging Computing, London (United Kingdom)

    2017-12-15

    To evaluate whole-body MRI (WB-MRI) parameters significantly associated with treatment response in multiple myeloma (MM). Twenty-one MM patients underwent WB-MRI at diagnosis and after two cycles of chemotherapy. Scans acquired at 3.0 T included T2, diffusion-weighted imaging (DWI) and mDixon pre- and post-contrast. Twenty focal lesions (FLs) matched on DWI and post-contrast mDixon were selected for each time point. Estimated tumour volume (eTV), apparent diffusion coefficient (ADC), enhancement ratio (ER) and signal fat fraction (sFF) were derived. Clinical treatment response to chemotherapy was assessed using conventional criteria. Significance of temporal parameter change was assessed by the paired t test, and receiver operating characteristic/area under the curve (AUC) analysis was performed. Parameter repeatability was assessed by intraclass correlation (ICC) and Bland-Altman analysis of 10 healthy volunteers scanned at two time points. Fifteen of 21 patients responded to treatment. Of 254 FLs analysed, sFF (p < 0.0001) and ADC (p = 0.001) significantly increased in responders but not in non-responders. eTV significantly decreased in 19/21 cases. Focal lesion sFF was the best discriminator of treatment response (AUC 1.0). Bone sFF repeatability was excellent (ICC 0.98) and better than bone ADC (ICC 0.47). WB-MRI-derived focal lesion sFF shows promise as an imaging biomarker of treatment response in newly diagnosed MM. (orig.)

  1. A Statistical Perspective on Running with Prosthetic Lower-Limbs: An Advantage or Disadvantage?

    Directory of Open Access Journals (Sweden)

    Hossein Hassani

    2014-11-01

    Full Text Available Technological developments have led to the increased use of carbon fiber and prosthetic lower-limbs in running events at the Paralympic Games. This study exploits a series of statistical techniques to address the question of whether utilizing prosthetic feet affects an athlete's ability when running competitively at the Paralympic Games, comparing both within and between different classifications. The study also considers the differences between running on biological limbs and on prosthetic lower-limbs from a mechanical point of view. The results from the male 100 m, 200 m and 400 m events at the 2012 London Paralympic Games are the source of this investigation. The investigation provides statistical evidence to propose that the number of prosthetic limbs used and the structure of such limbs have a significant impact on the outcome of track events at the Paralympic Games.

  2. Applied statistics in agricultural, biological, and environmental sciences.

    Science.gov (United States)

    Agronomic research often involves measurement and collection of multiple response variables in an effort to understand the more complex nature of the system being studied. Multivariate statistical methods encompass the simultaneous analysis of all random variables measured on each experimental or s...

  3. Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).

    Science.gov (United States)

    Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur

    2016-01-01

    We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. © 2016 T. Deane et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
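
    For readers unfamiliar with the one-parameter model used here: under the Rasch model, the probability of a correct response depends only on the difference between student ability θ and item difficulty b. A minimal sketch (ability and difficulty values invented for illustration):

```python
import numpy as np

def rasch_prob(theta, b):
    """One-parameter (Rasch) IRT model: probability that a student with
    ability theta answers an item of difficulty b correctly."""
    return 1 / (1 + np.exp(-(theta - b)))

abilities = np.array([-1.0, 0.0, 1.5])          # three students
difficulties = np.array([-0.5, 0.3, 1.0, 2.0])  # four inventory items
# Probability matrix: rows = students, columns = items
print(np.round(rasch_prob(abilities[:, None], difficulties[None, :]), 2))
```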

  4. The lz(p)* Person-Fit Statistic in an Unfolding Model Context

    NARCIS (Netherlands)

    Tendeiro, Jorge N.

    2017-01-01

    Although person-fit analysis has a long-standing tradition within item response theory, it has been applied almost exclusively in combination with dominance response models. In this article, a popular log-likelihood-based parametric person-fit statistic is studied under the framework of the generalized graded unfolding model.

  5. submitter Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  6. Statistical problems in nuclear regulation: introduction and overview

    International Nuclear Information System (INIS)

    Moore, R.H.; Easterling, R.G.

    1978-01-01

    The U.S. Nuclear Regulatory Commission (NRC) was organized formally in January 1975. The Commission's responsibilities can be categorized into four broad areas involving the licensing and use of nuclear materials and facilities: protecting public health and safety; protecting the environment; safeguarding nuclear materials and facilities; and assuring conformity with antitrust laws. A large variety of statistical problems are related to these basic responsibilities. They arise from the data-based nature of many of the issues to be resolved in making regulatory decisions. Hence, they are reflected in interactions among the NRC staff and licensees, vendors, and the public. This paper identifies and outlines some of these problems, providing a spectrum for comparison with the other presentations in this session. These problems are linked by the need for clear and objective treatment of data; their articulation and solution will benefit from insights and contributions from an informed statistical community

  7. [Comment on] Statistical discrimination

    Science.gov (United States)

    Chinn, Douglas

    In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.

  8. Design and statistical optimization of glipizide loaded lipospheres using response surface methodology.

    Science.gov (United States)

    Shivakumar, Hagalavadi Nanjappa; Patel, Pragnesh Bharat; Desai, Bapusaheb Gangadhar; Ashok, Purnima; Arulmozhi, Sinnathambi

    2007-09-01

    A 3² factorial design was employed to produce glipizide lipospheres by the emulsification phase separation technique, using paraffin wax and stearic acid as retardants. The effects of critical formulation variables, namely the level of paraffin wax (X1) and the proportion of stearic acid in the wax (X2), on geometric mean diameter (dg), percent encapsulation efficiency (% EE), release at the end of 12 h (rel12) and time taken for 50% of drug release (t50) were evaluated using the F-test. Mathematical models containing only the significant terms were generated for each response parameter using multiple linear regression analysis (MLRA) and analysis of variance (ANOVA). Both formulation variables studied exerted a significant influence (p < 0.05) on the response parameters. Optimization using the desirability approach was employed to develop an optimized formulation by setting constraints on the dependent and independent variables. The experimental values of dg, % EE, rel12 and t50 for the optimized formulation were found to be 57.54 ± 1.38 μm, 86.28 ± 1.32%, 77.23 ± 2.78% and 5.60 ± 0.32 h, respectively, which were in close agreement with those predicted by the mathematical models. The drug release from lipospheres followed first-order kinetics and was characterized by the Higuchi diffusion model. The optimized liposphere formulation developed was found to produce sustained anti-diabetic activity following oral administration in rats.
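
    A hedged sketch of the design-and-regression machinery this abstract relies on: a coded 3² factorial design fitted by multiple linear regression with interaction and quadratic terms. The response values below are simulated, not the study's data.

```python
import itertools
import numpy as np

# Coded 3x3 factorial design: X1 = wax level, X2 = stearic acid
# proportion, each at levels -1, 0, +1 (a hypothetical re-creation).
design = np.array(list(itertools.product([-1, 0, 1], repeat=2)), dtype=float)
x1, x2 = design[:, 0], design[:, 1]
rng = np.random.default_rng(6)
dg = 50 + 6 * x1 + 3 * x2 + 2 * x1 * x2 + rng.normal(0, 0.5, 9)  # response

# Multiple linear regression with interaction and quadratic terms
X = np.column_stack([np.ones(9), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, dg, rcond=None)
print(np.round(beta, 2))   # fitted coefficients for the response dg
```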

  9. Visual classification of very fine-grained sediments: Evaluation through univariate and multivariate statistics

    Science.gov (United States)

    Hohn, M. Ed; Nuhfer, E.B.; Vinopal, R.J.; Klanderman, D.S.

    1980-01-01

    Classifying very fine-grained rocks through fabric elements provides information about depositional environments, but is subject to the biases of visual taxonomy. To evaluate the statistical significance of an empirical classification of very fine-grained rocks, samples from Devonian shales in four cored wells in West Virginia and Virginia were measured for 15 variables: quartz, illite, pyrite and expandable clays determined by X-ray diffraction; total sulfur, organic content, inorganic carbon, matrix density, bulk density, porosity, and silt; as well as density, sonic travel time, resistivity, and gamma-ray response measured from well logs. The four lithologic types comprised: (1) sharply banded shale, (2) thinly laminated shale, (3) lenticularly laminated shale, and (4) nonbanded shale. Univariate and multivariate analyses of variance showed that the lithologic classification reflects significant differences in the variables measured, differences that can be detected independently of stratigraphic effects. Little-known statistical methods found useful in this work included: the multivariate analysis of variance with more than one effect, simultaneous plotting of samples and variables on canonical variates, and the use of parametric ANOVA and MANOVA on ranked data. © 1980 Plenum Publishing Corporation.

  10. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  11. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
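
    A minimal simulation sketch of the harmonic Poisson process described in these two records, restricted to a finite window [a, b] of the positive half-line; the windowing and inverse-CDF sampling scheme below are standard constructions, not taken from the paper. The printout illustrates the scale invariance noted in the abstract: every decade carries roughly c·ln(10) points on average.

```python
import numpy as np

def harmonic_poisson(c, a, b, rng):
    """Sample a Poisson process with harmonic intensity lambda(x) = c / x,
    restricted to the window [a, b] of the positive half-line."""
    mean_count = c * np.log(b / a)        # integral of c/x over [a, b]
    n = rng.poisson(mean_count)
    u = rng.random(n)
    return a * (b / a) ** u               # inverse-CDF sampling

rng = np.random.default_rng(8)
pts = harmonic_poisson(c=100, a=1.0, b=1000.0, rng=rng)
for lo in (1, 10, 100):                   # counts per decade ~ c * ln(10)
    print(lo, np.sum((pts >= lo) & (pts < 10 * lo)))
```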

  12. Statistical methods for evaluating the attainment of cleanup standards

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, R.O.; Simpson, J.C.

    1992-12-01

    This document is the third volume in a series of volumes sponsored by the US Environmental Protection Agency (EPA), Statistical Policy Branch, that provide statistical methods for evaluating the attainment of cleanup standards at Superfund sites. Volume 1 (USEPA 1989a) provides sampling designs and tests for evaluating attainment of risk-based standards for soils and solid media. Volume 2 (USEPA 1992) provides designs and tests for evaluating attainment of risk-based standards for groundwater. The purpose of this third volume is to provide statistical procedures for designing sampling programs and conducting statistical tests to determine whether pollution parameters in remediated soils and solid media at Superfund sites attain site-specific reference-based standards. This document is written for individuals who may not have extensive training or experience with statistical methods. The intended audience includes EPA regional remedial project managers, Superfund-site potentially responsible parties, state environmental protection agencies, and contractors for these groups.

  13. Statistical mechanics for a class of quantum statistics

    International Nuclear Information System (INIS)

    Isakov, S.B.

    1994-01-01

    Generalized statistical distributions for identical particles are introduced for the case where filling a single-particle quantum state by particles depends on filling states of different momenta. The system of one-dimensional bosons with a two-body potential that can be solved by means of the thermodynamic Bethe ansatz is shown to be equivalent thermodynamically to a system of free particles obeying statistical distributions of the above class. The quantum statistics arising in this way are completely determined by the two-particle scattering phases of the corresponding interacting systems. An equation determining the statistical distributions for these statistics is derived

  14. Systematic review of statistically-derived models of immunological response in HIV-infected adults on antiretroviral therapy in Sub-Saharan Africa.

    Science.gov (United States)

    Sempa, Joseph B; Ujeneza, Eva L; Nieuwoudt, Martin

    2017-01-01

    In Sub-Saharan African (SSA) resource-limited settings, Cluster of Differentiation 4 (CD4) counts continue to be used for clinical decision making in antiretroviral therapy (ART). Here, HIV-infected people often remain with low CD4 counts even on ART, so continued immunological monitoring is necessary. Due to varying statistical modeling methods, comparing immune response to ART across different cohorts is difficult. We systematically review such models and detail the similarities, differences and problems. 'Preferred Reporting Items for Systematic Reviews and Meta-Analyses' guidelines were used. Only studies of immune response after ART initiation in adults from SSA were included. Data were extracted from each study and tabulated. Outcomes were categorized into 3 groups: 'slope', 'survival', and 'asymptote' models. Wordclouds were drawn wherein the frequency of variables occurring in the reviewed models is indicated by their size and color. 69 covariates were identified in the final models of 35 studies. Effect sizes of covariates were not directly quantitatively comparable in view of the combination of differing variables and scale transformation methods across models. Wordclouds enabled the identification of qualitative and semi-quantitative covariate sets for each outcome category. Comparison across categories identified sex, baseline age, baseline log viral load, baseline CD4, ART initiation regimen and ART duration as a minimal consensus set. Most models differed with respect to covariates included, variable transformations and scales, model assumptions, modelling strategies and reporting methods, even for the same outcomes. To enable comparison across cohorts, statistical models would benefit from the application of more uniform modelling techniques. Historic efforts have produced results that are anecdotal to individual cohorts only. This study was able to define 'prior' knowledge in the Bayesian sense. Such information has value for prospective modelling efforts.

  15. Gender differences in customer expectations and perceptions of corporate social responsibility

    DEFF Research Database (Denmark)

    Calabrese, Armando; Costa, Roberta; Rosati, Francesco

    2016-01-01

    The literature on business ethics, corporate social responsibility and sustainability includes many studies on gender differences, however the results are often contrasting. In particular, there has not yet been full agreement on the role and significance of gender differences in customer...... the statistical and the substantive significance of gender differences in customer expectations and perceptions of corporate responsibility, also examining the influence of age and education. The analysis is carried out on a remarkably large sample of 908 clients, pertaining to 12 of the largest Italian banks...... strategies in designing, planning, implementing and assessing sustainability initiatives....

  16. Data-driven inference for the spatial scan statistic.

    Science.gov (United States)

    Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C

    2011-08-02

    Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is done, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based in this inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
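
    A hedged sketch of the modified inference step proposed above: given a precomputed Monte Carlo null distribution of log-likelihood ratios together with the size of each replication's most likely cluster, the observed statistic is compared only against replications with the same cluster size. The null values below are invented placeholders for real scan-statistic output.

```python
import numpy as np

def conditional_p_value(observed_llr, observed_size, null_llrs, null_sizes):
    """Size-conditioned inference for the spatial scan statistic: the
    observed log-likelihood ratio (whose most likely cluster spans
    `observed_size` areas) is compared only against null replications
    whose most likely cluster has the same size."""
    null_llrs, null_sizes = np.asarray(null_llrs), np.asarray(null_sizes)
    same_size = null_llrs[null_sizes == observed_size]
    if same_size.size == 0:
        return np.nan   # no comparable replications; enlarge the simulation
    return (np.sum(same_size >= observed_llr) + 1) / (same_size.size + 1)

rng = np.random.default_rng(9)
# Hypothetical precomputed nulls from 9999 Monte Carlo case maps:
null_llrs = rng.gamma(2.0, 1.5, 9999)
null_sizes = rng.integers(1, 30, 9999)
print(conditional_p_value(observed_llr=8.2, observed_size=5,
                          null_llrs=null_llrs, null_sizes=null_sizes))
```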

  17. Medical Statistics – Mathematics or Oracle? Farewell Lecture

    Directory of Open Access Journals (Sweden)

    Gaus, Wilhelm

    2005-06-01

    Certainty is rare in medicine. This is a direct consequence of the individuality of each and every human being and the reason why we need medical statistics. However, statistics have their pitfalls, too. Fig. 1 shows that the suicide rate peaks in youth, while in Fig. 2 the rate is highest in midlife and Fig. 3 in old age. Which of these contradictory messages is right? After an introduction to the principles of statistical testing, this lecture examines the probability with which statistical test results are correct. For this purpose the level of significance and the power of the test are compared with the sensitivity and specificity of a diagnostic procedure. The probability of obtaining correct statistical test results is the same as that for the positive and negative correctness of a diagnostic procedure and therefore depends on prevalence. The focus then shifts to the problem of multiple statistical testing. The lecture demonstrates that for each data set of reasonable size at least one test result proves to be significant - even if the data set is produced by a random number generator. It is extremely important that a hypothesis is generated independently from the data used for its testing. These considerations enable us to understand the gradation of "lame excuses, lies and statistics" and the difference between pure truth and the full truth. Finally, two historical oracles are cited.
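
    The point about multiple testing is easy to reproduce. This small illustration (not from the lecture itself) runs one hundred significance tests on pure noise from a random number generator; about five come out 'significant' at the 5% level, and the probability of at least one false positive is close to certainty:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# 100 purely random "variables" for 50 "patients": no real effects anywhere
data = rng.normal(size=(50, 100))

# test each variable for a nonzero mean and count nominal significances
pvals = np.array([stats.ttest_1samp(data[:, j], 0.0).pvalue for j in range(100)])
print(f"{(pvals < 0.05).sum()} of 100 tests 'significant' at alpha = 0.05")

# with m independent tests, at least one false positive is almost guaranteed
m = 100
print(f"P(at least one false positive) = {1 - 0.95 ** m:.4f}")
```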

  18. Comparison of Classical Test Theory and Item Response Theory in Individual Change Assessment

    NARCIS (Netherlands)

    Jabrayilov, Ruslan; Emons, Wilco H. M.; Sijtsma, Klaas

    2016-01-01

    Clinical psychologists are advised to assess clinical and statistical significance when assessing change in individual patients. Individual change assessment can be conducted using either the methodologies of classical test theory (CTT) or item response theory (IRT). Researchers have been optimistic

  19. Testing for significance of phase synchronisation dynamics in the EEG.

    Science.gov (United States)

    Daly, Ian; Sweeney-Reed, Catherine M; Nasuto, Slawomir J

    2013-06-01

    A number of tests exist to check for statistical significance of phase synchronisation within the Electroencephalogram (EEG); however, the majority suffer from a lack of generality and applicability. They may also fail to account for temporal dynamics in the phase synchronisation, regarding synchronisation as a constant state instead of a dynamical process. Therefore, a novel test is developed for identifying the statistical significance of phase synchronisation based upon a combination of work characterising temporal dynamics of multivariate time-series and Markov modelling. We show how this method is better able to assess the significance of phase synchronisation than a range of commonly used significance tests. We also show how the method may be applied to identify and classify significantly different phase synchronisation dynamics in both univariate and multivariate datasets.
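
    For orientation, the sketch below shows a common baseline surrogate test for phase synchronisation: the phase-locking value (PLV) of two signals compared against circularly time-shifted surrogates. This is exactly the kind of stationary test the paper argues is insufficient, since it treats synchronisation as a constant state; the signals and parameters here are synthetic stand-ins, not the authors' Markov-based method:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(2)
fs, n = 250, 4 * 250
b, a = butter(4, [8, 12], btype="band", fs=fs)

# narrowband stochastic signals sharing a common component (EEG stand-ins)
common = filtfilt(b, a, rng.normal(size=n))
x = common + 0.7 * filtfilt(b, a, rng.normal(size=n))
y = common + 0.7 * filtfilt(b, a, rng.normal(size=n))

def plv(u, v):
    """Phase-locking value: consistency of the instantaneous phase difference."""
    dphi = np.angle(hilbert(u)) - np.angle(hilbert(v))
    return np.abs(np.exp(1j * dphi).mean())

obs = plv(x, y)

# null distribution: circular time shifts destroy the alignment of the
# aperiodic shared component while preserving each signal's spectrum
null = np.array([plv(x, np.roll(y, rng.integers(fs, n - fs)))
                 for _ in range(499)])
p = (1 + (null >= obs).sum()) / (1 + null.size)
print(f"PLV = {obs:.3f}, surrogate p = {p:.3f}")
```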

  20. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT

    International Nuclear Information System (INIS)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-01-01

    The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, a significantly lower tube current was used compared with the FBP group, 550 mA (450–600) vs. 650 mA (500–711.25) (median (interquartile range)), respectively, P<0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84–6.02) vs. 5.84 mSv (3.88–8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93±10.22 vs. 37.63±18.79 (mean ± standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.

  1. Conversion factors and oil statistics

    International Nuclear Information System (INIS)

    Karbuz, Sohbet

    2004-01-01

    World oil statistics, in scope and accuracy, are often far from perfect. They can easily lead to misguided conclusions regarding the state of market fundamentals. Without proper attention directed at statistical caveats, the ensuing interpretation of oil market data opens the door to unnecessary volatility, and can distort perception of market fundamentals. Among the numerous caveats associated with the compilation of oil statistics, conversion factors, used to produce aggregated data, play a significant role. Interestingly enough, little attention is paid to conversion factors, i.e. to the relation between different units of measurement for oil. Additionally, the underlying information regarding the choice of a specific factor when trying to produce measurements of aggregated data remains scant. The aim of this paper is to shed some light on the impact of conversion factors for two commonly encountered issues, mass to volume equivalencies (barrels to tonnes) and for broad energy measures encountered in world oil statistics. This paper will seek to demonstrate how inappropriate and misused conversion factors can yield wildly varying results and ultimately distort oil statistics. Examples will show that while discrepancies in commonly used conversion factors may seem trivial, their impact on the assessment of a world oil balance is far from negligible. A unified and harmonised convention for conversion factors is necessary to achieve accurate comparisons and aggregate oil statistics for the benefit of both end-users and policy makers.
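
    A small numerical illustration of the barrels-to-tonnes issue (hypothetical volumes; the barrel-per-tonne factors are typical published magnitudes and vary by crude stream and reporting agency) shows how one flat conversion factor distorts an aggregate:

```python
# Stream-specific vs. flat barrel-per-tonne factors (illustrative values only;
# real factors vary by crude stream and reporting convention).
volumes_kb_day = {"light crude": 2000, "heavy crude": 1500}   # hypothetical
bbl_per_tonne = {"light crude": 7.6, "heavy crude": 6.8}      # density-dependent

# correct aggregation: convert each stream with its own factor
tonnes_specific = sum(1000 * v / bbl_per_tonne[s] for s, v in volumes_kb_day.items())

# common shortcut: one flat factor applied to the total volume
flat_factor = 7.33   # a widely quoted "average crude" value
tonnes_flat = 1000 * sum(volumes_kb_day.values()) / flat_factor

gap = 100 * (tonnes_flat - tonnes_specific) / tonnes_specific
print(f"stream-specific: {tonnes_specific:,.0f} t/day")
print(f"flat factor:     {tonnes_flat:,.0f} t/day  ({gap:+.1f}%)")
```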

  2. Stable statistical representations facilitate visual search.

    Science.gov (United States)

    Corbett, Jennifer E; Melcher, David

    2014-10-01

    Observers represent the average properties of object ensembles even when they cannot identify individual elements. To investigate the functional role of ensemble statistics, we examined how modulating statistical stability affects visual search. We varied the mean and/or individual sizes of an array of Gabor patches while observers searched for a tilted target. In "stable" blocks, the mean and/or local sizes of the Gabors were constant over successive displays, whereas in "unstable" baseline blocks they changed from trial to trial. Although there was no relationship between the context and the spatial location of the target, observers found targets faster (as indexed by faster correct responses and fewer saccades) as the global mean size became stable over several displays. Building statistical stability also facilitated scanning the scene, as measured by larger saccadic amplitudes, faster saccadic reaction times, and shorter fixation durations. These findings suggest a central role for peripheral visual information, creating context to free resources for detailed processing of salient targets and maintaining the illusion of visual stability.

  3. Proper interpretation of chronic toxicity studies and their statistics: A critique of "Which level of evidence does the US National Toxicology Program provide? Statistical considerations using the Technical Report 578 on Ginkgo biloba as an example".

    Science.gov (United States)

    Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol

    2015-09-02

    A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, and that 209 of them were statistically significant (p<0.05) compared with 240 (4800×0.05) expected by chance alone; thus, the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p<0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors. Published by Elsevier Ireland Ltd.
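
    The arithmetic can be checked directly. Under Gaus' own (flawed) assumptions of 4800 independent comparisons each operating at exactly the 0.05 level, 209 significant results is in fact fewer than chance alone would predict, and no plausible number of comparisons can explain a p-value near 10^-13; a quick sketch:

```python
from scipy import stats

# Gaus' assumptions: 4800 independent comparisons, each at exactly alpha = 0.05
m, alpha, observed = 4800, 0.05, 209
print(f"expected false positives: {m * alpha:.0f}")                 # 240
print(f"P(X <= {observed}) = {stats.binom.cdf(observed, m, alpha):.3f}")

# even a crude Bonferroni bound cannot turn the reported mouse liver
# tumor effect (p < 1e-13) into a multiplicity artifact
print(f"Bonferroni-adjusted bound: {m * 1e-13:.1e}")
```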

  4. Statistics with JMP graphs, descriptive statistics and probability

    CERN Document Server

    Goos, Peter

    2015-01-01

    Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering and University of Antwerp, Faculty of Applied Economics, Belgium; David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. A thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic...

  5. Analysis of room transfer function and reverberant signal statistics

    DEFF Research Database (Denmark)

    Georganti, Eleftheria; Mourjopoulos, John; Jacobsen, Finn

    2008-01-01

    For some time now, statistical analysis has been a valuable tool in analyzing room transfer functions (RTFs). This work examines existing statistical time-frequency models and techniques for RTF analysis (e.g., Schroeder's stochastic model and the standard deviation over frequency bands for the RTF...... magnitude and phase). RTF fractional octave smoothing, as with 1/3-octave analysis, may lead to RTF simplifications that can be useful for several audio applications, like room compensation, room modeling, auralisation purposes. The aim of this work is to identify the relationship of optimal response...... and the corresponding ratio of the direct and reverberant signal. In addition, this work examines the statistical quantities for speech and audio signals prior to their reproduction within rooms and when recorded in rooms. Histograms and other statistical distributions are used to compare RTF minima of typical...

  6. Testing University Rankings Statistically: Why this Perhaps is not such a Good Idea after All. Some Reflections on Statistical Power, Effect Size, Random Sampling and Imaginary Populations

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2012-01-01

    In this paper we discuss and question the use of statistical significance tests in relation to university rankings as recently suggested. We outline the assumptions behind and interpretations of statistical significance tests and relate this to examples from the recent SCImago Institutions Rankin...

  7. Statistical Seismology and Induced Seismicity

    Science.gov (United States)

    Tiampo, K. F.; González, P. J.; Kazemian, J.

    2014-12-01

    While seismicity triggered or induced by natural resources production such as mining or water impoundment in large dams has long been recognized, the recent increase in the unconventional production of oil and gas has been linked to a rapid rise in seismicity in many places, including central North America (Ellsworth et al., 2012; Ellsworth, 2013). Worldwide, induced events of M~5 have occurred and, although rare, have resulted in both damage and public concern (Horton, 2012; Keranen et al., 2013). In addition, over the past twenty years, the increase in both number and coverage of seismic stations has resulted in an unprecedented ability to precisely record the magnitude and location of large numbers of small magnitude events. The increase in the number and type of seismic sequences available for detailed study has revealed differences in their statistics that were previously difficult to quantify. For example, seismic swarms that produce significant numbers of foreshocks as well as aftershocks have been observed in different tectonic settings, including California, Iceland, and the East Pacific Rise (McGuire et al., 2005; Shearer, 2012; Kazemian et al., 2014). Similarly, smaller events have been observed prior to larger induced events in several occurrences from energy production. The field of statistical seismology has long focused on the question of triggering and the mechanisms responsible (Stein et al., 1992; Hill et al., 1993; Steacy et al., 2005; Parsons, 2005; Main et al., 2006). For example, in most cases the associated stress perturbations are much smaller than the earthquake stress drop, suggesting an inherent sensitivity to relatively small stress changes (Nalbant et al., 2005). Induced seismicity provides the opportunity to investigate triggering and, in particular, the differences between long- and short-range triggering. Here we investigate the statistics of induced seismicity sequences from around the world, including central North America and Spain, and

  8. Dysfunctions of the contemporary family from the perspective of judicial statistics

    Directory of Open Access Journals (Sweden)

    DANUTA KOWALCZYK

    2017-10-01

    The problems connected with the dysfunctions of the contemporary family are the subject of this study. Judicial statistics, relating mostly to the proceedings of family courts, have been employed in the analysis. Emphasis has been placed on problems related to the marital bond and parental responsibility. The statistical data suggest that there are negative phenomena in both of these realms. Divorce and separation still constitute causes of the incompleteness of family environments. The level of court interference in how parental responsibility is exercised, which serves to protect children from the hazards of inadequate care, has still not decreased.

  9. Statistical analysis of the effect of deposition parameters on the preferred orientation of sputtered AlN thin films

    International Nuclear Information System (INIS)

    Pantojas, V.M.; Otano-Rivera, W.; Caraballo, Jose N.

    2005-01-01

    A response surface statistical method was used to study the effects of deposition pressure, power and substrate temperature on the degree of preferred orientation of aluminum nitride films grown on Si (111) by dc magnetron sputtering. The AlN films were deposited at gas pressures ranging from 0.66 to 1.33 Pa, substrate temperature from 300 to 400 deg. C and power from 100 to 200 W. The degree of preferred orientation was evaluated and quantified using two-dimensional X-ray diffraction, which provides information on the out-of-plane (002) crystal alignment. The statistical method yielded a response surface in the parameter space, and a correlation equation between the deposition parameters was obtained. Substrate temperature showed no significant effect upon texture quality for the temperature range studied. A surface response graph as a function of pressure and power was obtained. The main factor affecting texture quality was found to be a pressure-power interaction. The possible mechanisms that contribute to such correlation are discussed. Our best films yielded a rocking curve with full width at half maximum of 6.3 deg.
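
    As an illustration of the approach (hypothetical coded settings and responses, constructed with a dominant pressure-power interaction to mirror the study's finding), a full second-order response surface can be fitted by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(9)

# coded deposition settings in [-1, 1]: pressure, power, temperature
X = rng.uniform(-1, 1, size=(20, 3))
p, w, T = X.T

# hypothetical texture-quality response with a dominant pressure-power
# interaction and an inert temperature, mirroring the study's finding
y = 5.0 - 1.2 * p * w + 0.3 * w + rng.normal(0, 0.1, size=20)

# full second-order (quadratic) response surface model, fitted by least squares
A = np.column_stack([np.ones(20), p, w, T,
                     p * w, p * T, w * T, p**2, w**2, T**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for name, c in zip(["1", "p", "w", "T", "pw", "pT", "wT", "p^2", "w^2", "T^2"], coef):
    print(f"{name:4s} {c:+.3f}")
```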

  10. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  11. CUSUM-based person-fit statistics for adaptive testing

    NARCIS (Netherlands)

    van Krimpen-Stoop, Edith; Meijer, R.R.

    2001-01-01

    Item scores that do not fit an assumed item response theory model may cause the latent trait value to be inaccurately estimated. Several person-fit statistics for detecting nonfitting score patterns for paper-and-pencil tests have been proposed. In the context of computerized adaptive tests (CAT),

  12. CUSUM-based person-fit statistics for adaptive testing

    NARCIS (Netherlands)

    van Krimpen-Stoop, Edith; Meijer, R.R.

    1999-01-01

    Item scores that do not fit an assumed item response theory model may cause the latent trait value to be estimated inaccurately. Several person-fit statistics for detecting nonfitting score patterns for paper-and-pencil tests have been proposed. In the context of computerized adaptive tests (CAT),

  13. Disaster response team FAST skills training with a portable ultrasound simulator compared to traditional training: pilot study.

    Science.gov (United States)

    Paddock, Michael T; Bailitz, John; Horowitz, Russ; Khishfe, Basem; Cosby, Karen; Sergel, Michelle J

    2015-03-01

    Pre-hospital focused assessment with sonography in trauma (FAST) has been effectively used to improve patient care in multiple mass casualty events throughout the world. Although requisite FAST knowledge may now be learned remotely by disaster response team members, traditional live instructor and model hands-on FAST skills training remains logistically challenging. The objective of this pilot study was to compare the effectiveness of a novel portable ultrasound (US) simulator with traditional FAST skills training for a deployed mixed provider disaster response team. We randomized participants into one of three training groups stratified by provider role: Group A. Traditional Skills Training, Group B. US Simulator Skills Training, and Group C. Traditional Skills Training Plus US Simulator Skills Training. After skills training, we measured participants' FAST image acquisition and interpretation skills using a standardized direct observation tool (SDOT) with healthy models and review of FAST patient images. Pre- and post-course US and FAST knowledge were also assessed using a previously validated multiple-choice evaluation. We used the ANOVA procedure to determine the statistical significance of differences between the means of each group's skills scores. Paired sample t-tests were used to determine the statistical significance of pre- and post-course mean knowledge scores within groups. We enrolled 36 participants, 12 randomized to each training group. Randomization resulted in similar distribution of participants between training groups with respect to provider role, age, sex, and prior US training. For the FAST SDOT image acquisition and interpretation mean skills scores, there was no statistically significant difference between training groups. For US and FAST mean knowledge scores, there was a statistically significant improvement between pre- and post-course scores within each group, but again there was not a statistically significant difference between
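
    The statistical workflow described in this record is straightforward to sketch with scipy. The scores below are hypothetical placeholders, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# hypothetical FAST skills scores for the three training groups (12 each)
trad = rng.normal(78, 8, size=12)        # Group A: traditional
sim = rng.normal(80, 8, size=12)         # Group B: US simulator
both = rng.normal(81, 8, size=12)        # Group C: traditional + simulator

# one-way ANOVA for between-group differences in mean skills scores
f_stat, p_between = stats.f_oneway(trad, sim, both)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_between:.3f}")

# paired t-test for pre- vs. post-course knowledge within one group
pre = rng.normal(60, 10, size=12)
post = pre + rng.normal(12, 6, size=12)  # simulated within-subject gain
t_stat, p_within = stats.ttest_rel(post, pre)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_within:.4f}")
```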

  14. Statistical margin to DNB safety analysis approach for LOFT

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1982-01-01

    A method was developed and used for LOFT thermal safety analysis to estimate the statistical margin to DNB for the hot rod, and to base safety analysis on desired DNB probability limits. This method is an advanced approach using response surface analysis methods, a very efficient experimental design, and a 2nd-order response surface equation with a 2nd-order error propagation analysis to define the MDNBR probability density function. Calculations for limiting transients were used in the response surface analysis thereby including transient interactions and trip uncertainties in the MDNBR probability density

  15. The antigravity suit in neurosurgery. Cardiovascular responses in seated neurosurgical patients.

    Science.gov (United States)

    Brodrick, P M; Ingram, G S

    1988-09-01

    The haemodynamic responses associated with inflation of the antigravity suit (G suit, aviation type) to 8.0 kPa were studied in a series of 40 patients who underwent neurosurgical operations in the sitting position. The study showed statistically significant increases in systolic arterial pressure (p less than 0.005) and mean central venous pressure (p less than 0.001) with inflation of the suit. The systolic arterial and mean central venous pressures remained significantly elevated immediately before deflation of the suit at the end of the operation (p less than 0.001 and p less than 0.005 respectively). The addition of 0.8-1.0 kPa positive end expiratory pressure during suit inflation was also investigated. A further increase in central venous pressure occurred but this did not achieve statistical significance.

  16. Statistical behavior of high doses in medical radiodiagnosis

    International Nuclear Information System (INIS)

    Barboza, Adriana Elisa

    2014-01-01

    The main purpose of this work is to statistically estimate occupational exposure in medical diagnostic radiology for the cases of high doses recorded in 2011 at the national level. For the statistical survey of this study, the doses of 372 occupationally exposed individuals (IOEs) in diagnostic radiology in different Brazilian states were evaluated. Data were extracted from the monograph 'Research Methodology of High Doses in Medical Radiodiagnosis', which contains information from the dose management database of the IRD/CNEN-RJ, Brazil. The identification of these states allows the responsible Sanitary Surveillance (VISA) bodies to become aware of such events and to work with programs to reduce them. (author)

  17. Statistical process control in nursing research.

    Science.gov (United States)

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
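
    A minimal sketch of the idea: an individuals (XmR) control chart with hypothetical weekly measurements, where the centre line and 3-sigma limits are estimated from the baseline moving range in the standard Shewhart fashion:

```python
import numpy as np

rng = np.random.default_rng(3)

# weekly measurements: a stable baseline, then an intervention effect
baseline = rng.normal(50, 5, size=20)      # common-cause variation only
post = rng.normal(58, 5, size=10)          # shifted mean after the innovation
series = np.concatenate([baseline, post])

# individuals (XmR) chart: sigma estimated from the baseline moving range
centre = baseline.mean()
sigma = np.abs(np.diff(baseline)).mean() / 1.128   # d2 constant for n = 2
ucl, lcl = centre + 3 * sigma, centre - 3 * sigma

for week, value in enumerate(series, start=1):
    flag = "  <-- special cause" if not (lcl <= value <= ucl) else ""
    print(f"week {week:2d}: {value:6.1f}{flag}")
print(f"CL = {centre:.1f}, LCL = {lcl:.1f}, UCL = {ucl:.1f}")
```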

  18. Quantifying spatial and temporal trends in beach-dune volumetric changes using spatial statistics

    Science.gov (United States)

    Eamer, Jordan B. R.; Walker, Ian J.

    2013-06-01

    Spatial statistics are generally underutilized in coastal geomorphology, despite offering great potential for identifying and quantifying spatial-temporal trends in landscape morphodynamics. In particular, local Moran's Ii provides a statistical framework for detecting clusters of significant change in an attribute (e.g., surface erosion or deposition) and quantifying how this changes over space and time. This study analyzes and interprets spatial-temporal patterns in sediment volume changes in a beach-foredune-transgressive dune complex following removal of invasive marram grass (Ammophila spp.). Results are derived by detecting significant changes in post-removal repeat DEMs derived from topographic surveys and airborne LiDAR. The study site was separated into discrete, linked geomorphic units (beach, foredune, transgressive dune complex) to facilitate sub-landscape scale analysis of volumetric change and sediment budget responses. Difference surfaces derived from a pixel-subtraction algorithm between interval DEMs and the LiDAR baseline DEM were filtered using the local Moran's Ii method and two different spatial weights (1.5 and 5 m) to detect statistically significant change. Moran's Ii results were compared with those derived from a more spatially uniform statistical method that uses a simpler Student's t distribution threshold for change detection. Morphodynamic patterns and volumetric estimates were similar between the uniform geostatistical method and Moran's Ii at a spatial weight of 5 m, while the smaller spatial weight (1.5 m) consistently indicated volumetric changes of less magnitude. The larger 5 m spatial weight was most representative of broader site morphodynamics and spatial patterns, while the smaller spatial weight provided volumetric changes consistent with field observations. All methods showed foredune deflation immediately following removal with increased sediment volumes into the spring via deposition at the crest and on lobes in the lee
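
    The core computation is compact. The sketch below uses a toy change surface, queen-contiguity weights on a grid (with wrap-around edges for brevity) and a simple permutation threshold rather than the study's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(4)

def local_moran(z):
    """Local Moran's I_i on a grid with row-standardised queen (3x3) weights.

    np.roll wraps at the edges (a toroidal simplification kept for brevity).
    """
    lag = np.zeros_like(z)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di or dj:
                lag += np.roll(np.roll(z, di, axis=0), dj, axis=1)
    return z * lag / 8.0

# toy surface-change raster (m): noise plus one deposition patch
dz = rng.normal(0.0, 0.05, size=(40, 40))
dz[10:18, 10:18] += 0.30
z = (dz - dz.mean()) / dz.std()

obs_I = local_moran(z)

# conditional permutation null: shuffle the surface, recompute I_i
flat = z.ravel().copy()
null = []
for _ in range(199):
    rng.shuffle(flat)
    null.append(local_moran(flat.reshape(z.shape)).ravel())
thresh = np.quantile(np.concatenate(null), 0.95)

print(f"{(obs_I > thresh).sum()} pixels flagged as significant clusters "
      f"(I_i > {thresh:.3f})")
```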

  19. Statistics Anxiety and Business Statistics: The International Student

    Science.gov (United States)

    Bell, James A.

    2008-01-01

    Does the international student suffer from statistics anxiety? To investigate this, the Statistics Anxiety Rating Scale (STARS) was administered to sixty-six beginning statistics students, including twelve international students and fifty-four domestic students. Due to the small number of international students, nonparametric methods were used to…

  20. Changing world extreme temperature statistics

    Science.gov (United States)

    Finkel, J. M.; Katz, J. I.

    2018-04-01

    We use the Global Historical Climatology Network--daily database to calculate a nonparametric statistic that describes the rate at which all-time daily high and low temperature records have been set in nine geographic regions (continents or major portions of continents) during periods mostly from the mid-20th Century to the present. This statistic was defined in our earlier work on temperature records in the 48 contiguous United States. In contrast to this earlier work, we find that in every region except North America all-time high records were set at a rate significantly (at least $3\\sigma$) higher than in the null hypothesis of a stationary climate. Except in Antarctica, all-time low records were set at a rate significantly lower than in the null hypothesis. In Europe, North Africa and North Asia the rate of setting new all-time highs increased suddenly in the 1990's, suggesting a change in regional climate regime; in most other regions there was a steadier increase.
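
    The stationary-climate null hypothesis is easy to simulate: for independent, identically distributed annual values, the probability that year n sets a new all-time record is 1/n, which is the benchmark against which the observed record rates are tested. A sketch:

```python
import numpy as np

rng = np.random.default_rng(5)

# null hypothesis: stationary climate, iid annual extremes at many stations
n_years, n_stations = 60, 2000
sims = rng.normal(size=(n_stations, n_years))

# a year sets an all-time record when it equals the running maximum so far
running_max = np.maximum.accumulate(sims, axis=1)
is_record = sims >= running_max

# under stationarity the expected record rate in year n is exactly 1/n
for n in (1, 10, 30, 60):
    print(f"year {n:2d}: simulated rate {is_record[:, n - 1].mean():.4f}, "
          f"theory {1 / n:.4f}")
```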

  1. Applying Statistical Mechanics to pixel detectors

    International Nuclear Information System (INIS)

    Pindo, Massimiliano

    2002-01-01

    Pixel detectors, being made of a large number of active cells of the same kind, can be considered as significant sets to which Statistical Mechanics variables and methods can be applied. By properly redefining well known statistical parameters in order to let them match the ones that actually characterize pixel detectors, an analysis of the way they work can be performed in a totally new perspective. A deeper understanding of pixel detectors is attained, helping in the evaluation and comparison of their intrinsic characteristics and performance

  2. Introductory statistics for the behavioral sciences

    CERN Document Server

    Welkowitz, Joan; Cohen, Jacob

    1971-01-01

    Introductory Statistics for the Behavioral Sciences provides an introduction to statistical concepts and principles. This book emphasizes the robustness of parametric procedures, wherein such significance tests as t and F yield accurate results even if such assumptions as equal population variances and normal population distributions are not well met. Organized into three parts encompassing 16 chapters, this book begins with an overview of the rationale upon which much of behavioral science research is based, namely, drawing inferences about a population based on data obtained from a samp

  3. Extending multivariate distance matrix regression with an effect size measure and the asymptotic null distribution of the test statistic.

    Science.gov (United States)

    McArtor, Daniel B; Lubke, Gitta H; Bergeman, C S

    2017-12-01

    Person-centered methods are useful for studying individual differences in terms of (dis)similarities between response profiles on multivariate outcomes. Multivariate distance matrix regression (MDMR) tests the significance of associations of response profile (dis)similarities and a set of predictors using permutation tests. This paper extends MDMR by deriving and empirically validating the asymptotic null distribution of its test statistic, and by proposing an effect size for individual outcome variables, which is shown to recover true associations. These extensions alleviate the computational burden of permutation tests currently used in MDMR and render more informative results, thus making MDMR accessible to new research domains.

  4. Mathematical background and attitudes toward statistics in a sample of Spanish college students.

    Science.gov (United States)

    Carmona, José; Martínez, Rafael J; Sánchez, Manuel

    2005-08-01

    To examine the relation of mathematical background and initial attitudes toward statistics in Spanish college students in the social sciences, the Survey of Attitudes Toward Statistics was given to 827 students. Multivariate analyses tested the effects of two indicators of mathematical background (amount of exposure and achievement in previous courses) on the four subscales. The analysis suggested that grades in previous courses are more related to initial attitudes toward statistics than the number of mathematics courses taken. Mathematical background was related to students' affective responses to statistics but not to their valuing of statistics. Implications for possible research are discussed.

  5. Spreadsheets as tools for statistical computing and statistics education

    OpenAIRE

    Neuwirth, Erich

    2000-01-01

    Spreadsheets are a ubiquitous program category, and we will discuss their use in statistics and statistics education at various levels, ranging from very basic examples to extremely powerful methods. Since the spreadsheet paradigm is very familiar to many potential users, using it as the interface to statistical methods can make statistics more easily accessible.

  6. Statistical competencies for medical research learners: What is fundamental?

    Science.gov (United States)

    Enders, Felicity T; Lindsell, Christopher J; Welty, Leah J; Benn, Emma K T; Perkins, Susan M; Mayo, Matthew S; Rahbar, Mohammad H; Kidwell, Kelley M; Thurston, Sally W; Spratt, Heidi; Grambow, Steven C; Larson, Joseph; Carter, Rickey E; Pollock, Brad H; Oster, Robert A

    2017-06-01

    It is increasingly essential for medical researchers to be literate in statistics, but the requisite degree of literacy is not the same for every statistical competency in translational research. Statistical competency can range from 'fundamental' (necessary for all) to 'specialized' (necessary for only some). In this study, we determine the degree to which each competency is fundamental or specialized. We surveyed members of 4 professional organizations, targeting doctorally trained biostatisticians and epidemiologists who taught statistics to medical research learners in the past 5 years. Respondents rated 24 educational competencies on a 5-point Likert scale anchored by 'fundamental' and 'specialized.' There were 112 responses. Nineteen of 24 competencies were fundamental. The competencies considered most fundamental were assessing sources of bias and variation (95%), recognizing one's own limits with regard to statistics (93%), and identifying the strengths and limitations of study designs (93%). The least endorsed items were meta-analysis (34%) and stopping rules (18%). We have identified the statistical competencies needed by all medical researchers. These competencies should be considered when designing statistical curricula for medical researchers and should inform which topics are taught in graduate programs and evidence-based medicine courses where learners need to read and understand the medical research literature.

  7. An Exploration of the Perceived Usefulness of the Introductory Statistics Course and Students’ Intentions to Further Engage in Statistics

    Directory of Open Access Journals (Sweden)

    Rossi Hassad

    2018-01-01

    Students' attitude, including perceived usefulness, is generally associated with academic success. The related research in statistics education has focused almost exclusively on the role of attitude in explaining and predicting academic learning outcomes, hence there is a paucity of research evidence on how attitude (particularly perceived usefulness) impacts students' intentions to use and stay engaged in statistics beyond the introductory course. This study explored the relationship between college students' perception of the usefulness of an introductory statistics course, their beliefs about where statistics will be most useful, and their intentions to take another statistics course. A cross-sectional study of 106 students was conducted. The mean rating for usefulness was 4.7 (out of 7), with no statistically significant differences based on gender and age. Sixty-four percent reported that they would consider taking another statistics course, and this subgroup rated the course as more useful (p = .01). The majority (67%) reported that statistics would be most useful for either graduate school or research, whereas 14% indicated their job, and 19% were undecided. The 'undecided' students had the lowest mean rating for usefulness of the course (p = .001). Addressing data, in the context of real-world problem-solving and decision-making, could facilitate students to better appreciate the usefulness and practicality of statistics. Qualitative research methods could help to elucidate these findings.

  8. Quantum statistics and liquid helium 3 - helium 4 mixtures

    International Nuclear Information System (INIS)

    Cohen, E.G.D.

    1979-01-01

    The behaviour of liquid helium 3-helium 4 mixtures is considered from the point of view of the manifestation of quantum statistics effects in macrophysics. Bose-Einstein statistics is shown to be of great importance for understanding superfluid helium-4 properties, whereas Fermi-Dirac statistics is of importance for understanding helium-3 properties. Without taking into consideration the interaction between the helium atoms, it is impossible to understand the basic properties of liquid helium 3 - helium 4 mixtures at constant pressure. A simple model of the liquid helium 3-helium 4 mixture is proposed, namely a binary mixture consisting of hard spheres of two types obeying Fermi-Dirac and Bose-Einstein statistics respectively. This model correctly predicts the most surprising peculiarities of the phase diagrams of concentration versus temperature for helium solutions. In particular, the helium-4 Bose-Einstein statistics is responsible for the phase separation of helium solutions at low temperatures, which begins at a peculiar critical point. The helium-3 Fermi-Dirac statistics results in incomplete phase separation close to absolute zero temperature, which permits the operation of a powerful cooling facility, namely a refrigerating machine running on the helium solution.

  9. Data-driven inference for the spatial scan statistic

    Directory of Open Access Journals (Sweden)

    Duczmal Luiz H

    2011-08-01

    Background: Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. Results: A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is proposed, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, with regard to the correctness of the decision based on this inference. Conclusions: A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.

  10. Therapeutic Touch Has Significant Effects on Mouse Breast Cancer Metastasis and Immune Responses but Not Primary Tumor Size.

    Science.gov (United States)

    Gronowicz, Gloria; Secor, Eric R; Flynn, John R; Jellison, Evan R; Kuhn, Liisa T

    2015-01-01

    Evidence-based integrative medicine therapies have been introduced to promote wellness and offset side-effects from cancer treatment. Energy medicine is an integrative medicine technique using the human biofield to promote well-being. The biofield therapy chosen for study was Therapeutic Touch (TT). Breast cancer tumors were initiated in mice by injection of metastatic 66cl4 mammary carcinoma cells. The control group received only vehicle. TT or mock treatments were performed twice a week for 10 minutes. Two experienced TT practitioners alternated treatments. At 26 days, metastasis to popliteal lymph nodes was determined by clonogenic assay. Changes in immune function were measured by analysis of serum cytokines and by fluorescence-activated cell sorting (FACS) of immune cells from the spleen and lymph nodes. No significant differences were found in body weight gain or tumor size. Metastasis was significantly reduced in the TT-treated mice compared to mock-treated mice. Cancer significantly elevated eleven cytokines. TT significantly reduced IL-1-a, MIG, IL-1b, and MIP-2 to control/vehicle levels. FACS demonstrated that TT significantly reduced specific splenic lymphocyte subsets and macrophages that were significantly elevated with cancer. Human biofield therapy had no significant effect on the primary tumor but produced significant effects on metastasis and immune responses in a mouse breast cancer model.

  11. Register-based statistics statistical methods for administrative data

    CERN Document Server

    Wallgren, Anders

    2014-01-01

    This book provides a comprehensive and up to date treatment of  theory and practical implementation in Register-based statistics. It begins by defining the area, before explaining how to structure such systems, as well as detailing alternative approaches. It explains how to create statistical registers, how to implement quality assurance, and the use of IT systems for register-based statistics. Further to this, clear details are given about the practicalities of implementing such statistical methods, such as protection of privacy and the coordination and coherence of such an undertaking. Thi

  12. Intensity response function of the photopic negative response (PhNR): effect of age and test-retest reliability.

    Science.gov (United States)

    Joshi, Nabin R; Ly, Emma; Viswanathan, Suresh

    2017-08-01

    To assess the effect of age and test-retest reliability of the intensity response function of the full-field photopic negative response (PhNR) in normal healthy human subjects. Full-field electroretinograms (ERGs) were recorded from one eye of 45 subjects, and 39 of these subjects were tested on two separate days with a Diagnosys Espion System (Lowell, MA, USA). The visual stimuli consisted of brief flashes (...); test-retest reliability was assessed with the Wilcoxon signed-rank test and Bland-Altman analysis. Holm's correction was applied to account for multiple comparisons. V max of BT was significantly smaller than that of PT and b-wave, and the V max of PT and b-wave was not significantly different from each other. The slope parameter n was smallest for BT and the largest for b-wave, and the difference between the slopes of all three measures was statistically significant. Small differences observed in the mean values of K for the different measures did not reach statistical significance. The Wilcoxon signed-rank test indicated no significant differences between the two test visits for any of the Naka-Rushton parameters for the three ERG measures, and the Bland-Altman plots indicated that the mean difference between test and retest measurements of the different fit parameters was close to zero and within 6% of the average of the test and retest values of the respective parameters for all three ERG measurements, indicating minimal bias. While the coefficient of reliability (COR, defined as 1.96 times the standard deviation of the test and retest difference) of each fit parameter was more or less comparable across the three ERG measurements, the %COR (COR normalized to the mean test and retest measures) was generally larger for BT compared to both PT and b-wave for each fit parameter. The Naka-Rushton fit parameters did not show statistically significant changes with age for any of the ERG measures when corrections were applied for multiple comparisons. However, the V max of
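
    The Naka-Rushton intensity-response function referred to above is V(I) = Vmax * I^n / (I^n + K^n). A minimal fitting sketch with hypothetical PhNR amplitudes (the study's data are not reproduced):

```python
import numpy as np
from scipy.optimize import curve_fit

def naka_rushton(I, Vmax, K, n):
    """Saturating intensity-response function V(I) = Vmax * I**n / (I**n + K**n)."""
    return Vmax * I**n / (I**n + K**n)

# hypothetical PhNR amplitudes (uV) at increasing flash intensities (cd*s/m^2)
intensity = np.array([0.0625, 0.125, 0.25, 0.5, 1.0, 2.0, 4.0])
amplitude = np.array([4.1, 7.9, 12.5, 17.0, 20.2, 22.1, 22.8])

popt, pcov = curve_fit(naka_rushton, intensity, amplitude,
                       p0=[25.0, 0.5, 1.0], maxfev=5000)
Vmax, K, n = popt
se = np.sqrt(np.diag(pcov))
print(f"Vmax = {Vmax:.1f} +/- {se[0]:.1f} uV, "
      f"K = {K:.2f} +/- {se[1]:.2f}, n = {n:.2f} +/- {se[2]:.2f}")
```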

  13. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  14. Accelerator driven reactors, - the significance of the energy distribution of spallation neutrons on the neutron statistics

    Energy Technology Data Exchange (ETDEWEB)

    Fhager, V

    2000-01-01

    In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We proceed towards the requested results in two steps. First, models of the number dependent energy distributions of the neutrons that are ejected in the spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the distribution of energy of the spallation neutrons. The other model utilizes Monte Carlo methods for tracking the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogenous mixture of core material. Our results indicate that the number dependence of the energy

  15. Accelerator driven reactors, - the significance of the energy distribution of spallation neutrons on the neutron statistics

    International Nuclear Information System (INIS)

    Fhager, V.

    2000-01-01

    In order to make correct predictions of the second moment of statistical nuclear variables, such as the number of fissions and the number of thermalized neutrons, the dependence of the energy distribution of the source particles on their number should be considered. It has been pointed out recently that neglecting this number dependence in accelerator driven systems might result in bad estimates of the second moment, and this paper contains qualitative and quantitative estimates of the size of these effects. We proceed towards the requested results in two steps. First, models of the number dependent energy distributions of the neutrons that are ejected in the spallation reactions are constructed, both by simple assumptions and by extracting energy distributions of spallation neutrons from a high-energy particle transport code. Then, the second moment of nuclear variables in a sub-critical reactor, into which spallation neutrons are injected, is calculated. The results from second moment calculations using number dependent energy distributions for the source neutrons are compared to those where only the average energy distribution is used. Two physical models are employed to simulate the neutron transport in the reactor. One is analytical, treating only slowing down of neutrons by elastic scattering in the core material. For this model, equations are written down and solved for the second moment of thermalized neutrons that include the distribution of energy of the spallation neutrons. The other model utilizes Monte Carlo methods for tracking the source neutrons as they travel inside the reactor material. Fast and thermal fission reactions are considered, as well as neutron capture and elastic scattering, and the second moment of the number of fissions, the number of neutrons that leaked out of the system, etc. are calculated. Both models use a cylindrical core with a homogenous mixture of core material. Our results indicate that the number dependence of the energy

  16. In vivo evaluation of the effect of stimulus distribution on FIR statistical efficiency in event-related fMRI.

    Science.gov (United States)

    Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L

    2013-05-15

    Technical developments in MRI have improved signal to noise, allowing the use of analysis methods such as finite impulse response (FIR) modelling of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while more design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo. Thus, it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, whose level was a function of multicollinearity. Experiment protocols varied up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. Published by Elsevier B.V.
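
    The link between stimulus distribution and FIR efficiency can be illustrated directly. The sketch below builds FIR design matrices for a fixed-ISI and a jittered protocol (illustrative parameters, not the study's protocols) and compares the classical efficiency and the multicollinearity of the lag regressors:

```python
import numpy as np

rng = np.random.default_rng(6)

def fir_design(onsets, n_scans, n_lags):
    """FIR design matrix: one regressor per post-stimulus lag (in TR units)."""
    X = np.zeros((n_scans, n_lags))
    for t in onsets:
        for lag in range(n_lags):
            if t + lag < n_scans:
                X[t + lag, lag] = 1.0
    return np.column_stack([X, np.ones(n_scans)])   # plus an intercept

def efficiency(X):
    """Classical design efficiency: inverse of the summed FIR estimator variances."""
    return 1.0 / np.trace(np.linalg.inv(X.T @ X)[:-1, :-1])

n_scans, n_lags = 400, 12

# fixed ISI: an onset every 8 scans makes the lag regressors nearly collinear
fixed = np.arange(0, n_scans - n_lags, 8)

# jittered ISI with the same mean spacing decorrelates the lags
gaps = rng.integers(4, 13, size=fixed.size)
jittered = np.cumsum(gaps)
jittered = jittered[jittered < n_scans - n_lags]

for name, onsets in (("fixed ISI", fixed), ("jittered ISI", jittered)):
    X = fir_design(onsets, n_scans, n_lags)
    print(f"{name:12s}: efficiency = {efficiency(X):.4f}, "
          f"cond(X'X) = {np.linalg.cond(X.T @ X):.2e}")
```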

  17. Definition of simulated driving tests for the evaluation of drivers' reactions and responses.

    Science.gov (United States)

    Bartolozzi, Riccardo; Frendo, Francesco

    2014-01-01

    This article aims at identifying the most significant measures in 2 perception-response (PR) tests performed at a driving simulator: a braking test and a lateral skid test, which were developed in this work. Forty-eight subjects (26 females and 22 males) with a mean age of 24.9 ± 3.0 years were enrolled for this study. They were asked to perform a drive on the driving simulator at the University of Pisa (Italy) following a specific test protocol, including 8-10 braking tests and 8-10 lateral skid tests. Driver input signals and vehicle model signals were recorded during the drives and analyzed to extract measures such as the reaction time, first response time, etc. Following a statistical procedure (based on analysis of variance [ANOVA] and post hoc tests), all test measures (3 for the braking test and 8 for the lateral skid test) were analyzed in terms of statistically significant differences among different drivers. The presented procedure allows evaluation of the capability of a given test to distinguish among different drivers. In the braking test, the reaction time showed a high dispersion among single drivers, leading to just 4.8 percent of statistically significant driver pairs (using the Games-Howell post hoc test), whereas the pedal transition time scored 31.9 percent. In the lateral skid test, 28.5 percent of the 2 × 2 comparisons showed significantly different reaction times, 19.5 percent had different response times, 35.2 percent had a different second peak of the steering wheel signal, and 33 percent showed different values of the integral of the steering wheel signal. For the braking test, which has been widely employed in similar forms in the literature, it was shown how the reaction time, with respect to the pedal transition time, can have a higher dispersion due to the influence of external factors. For the lateral skid test, the following measures were identified as the most significant for application studies: the reaction time for the reaction

  18. Analysis of photon statistics with Silicon Photomultiplier

    International Nuclear Information System (INIS)

    D'Ascenzo, N.; Saveliev, V.; Wang, L.; Xie, Q.

    2015-01-01

    The Silicon Photomultiplier (SiPM) is a novel silicon-based photodetector, which represents a modern approach to low-photon-flux detection. The aim of this paper is to provide an introduction to the statistical analysis methods needed to understand and estimate, in a quantitative way, the correct features and description of the response of the SiPM to a coherent source of light.

  19. Method for determining appropriate statistical models of the random cyclic stress amplitudes of a stainless pipe weld metal

    International Nuclear Information System (INIS)

    Wang Jinnuo; Zhao Yongxiang; Wang Shaohua

    2001-01-01

    A strain-controlled fatigue test reveals a significant scatter in the cyclic stress-strain responses of a nuclear engineering material, 1Cr18Ni9Ti stainless steel pipe-weld metal. This implies that the existing deterministic analysis might be non-conservative. Taking this scatter into account, a method for determining appropriate statistical models of material cyclic stress amplitudes is presented by considering the total fit, consistency with fatigue physics, and safety of design of seven commonly used distributions fitted to the test data. The seven distributions are Weibull (two- and three-parameter), normal, lognormal, extreme minimum value, extreme maximum value, and exponential. In the method, statistical parameters of the distributions are evaluated by a linear regression technique. Statistical tests are made by a transformation from the t-distribution function to the Pearson statistical parameter, i.e. the linear relationship coefficient. The total fit is assessed by a parameter called the fitted relationship coefficient of the empirical and theoretical failure probabilities. Consistency with fatigue physics is analyzed via the hazard rate curves of the distributions. Safety of design is measured by examining the change of predicted errors in the tail regions of the distributions.
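
    A sketch of the distribution-comparison step, using a probability-plot correlation coefficient as a stand-in for the paper's 'fitted relationship coefficient' (the stress amplitudes below are simulated placeholders, not the test data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# stand-in sample of cyclic stress amplitudes (MPa); real data would come
# from the strain-controlled fatigue test described above
sample = stats.weibull_min(c=2.2, scale=300).rvs(size=60, random_state=rng)

candidates = {
    "weibull_min (2-par)": stats.weibull_min,
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "gumbel_l (extreme min)": stats.gumbel_l,
    "gumbel_r (extreme max)": stats.gumbel_r,
    "exponential": stats.expon,
}

# rank the candidates by the probability-plot correlation coefficient, a
# linear-regression measure of how well each fitted distribution matches
# the empirical distribution of the data
for name, dist in candidates.items():
    params = dist.fit(sample)
    (osm, osr), (slope, icept, r) = stats.probplot(sample, dist=dist,
                                                   sparams=params)
    print(f"{name:24s} r = {r:.4f}")
```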

  20. Software Used to Generate Cancer Statistics - SEER Cancer Statistics

    Science.gov (United States)

    Videos that highlight topics and trends in cancer statistics and definitions of statistical terms. Also software tools for analyzing and reporting cancer statistics, which are used to compile SEER's annual reports.

  1. Understanding Statistics and Statistics Education: A Chinese Perspective

    Science.gov (United States)

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  2. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  3. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: An SPSS method to analyze univariate data

    NARCIS (Netherlands)

    Maric, M.; de Haan, M.; Hogendoorn, S.M.; Wolters, L.H.; Huizenga, H.M.

    2015-01-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a

  4. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: an SPSS method to analyze univariate data

    NARCIS (Netherlands)

    Maric, Marija; de Haan, Else; Hogendoorn, Sanne M.; Wolters, Lidewij H.; Huizenga, Hilde M.

    2015-01-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a

  5. Statistical inference and visualization in scale-space for spatially dependent images

    KAUST Repository

    Vaughan, Amy

    2012-03-01

    SiZer (SIgnificant ZERo crossing of the derivatives) is a graphical scale-space visualization tool that allows for statistical inferences. In this paper we develop a spatial SiZer for finding significant features and conducting goodness-of-fit tests for spatially dependent images. The spatial SiZer utilizes a family of kernel estimates of the image and provides not only exploratory data analysis but also statistical inference with spatial correlation taken into account. It is also capable of comparing the observed image with a specific null model being tested by adjusting the statistical inference using an assumed covariance structure. Pixel locations having statistically significant differences between the image and a given null model are highlighted by arrows. The spatial SiZer is compared with the existing independent SiZer via the analysis of simulated data with and without signal on both planar and spherical domains. We apply the spatial SiZer method to the decadal temperature change over some regions of the Earth. © 2011 The Korean Statistical Society.
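
    A one-dimensional toy version of the SiZer idea may help make the method concrete. The sketch below smooths synthetic data across a family of bandwidths and flags grid locations whose local-linear slope differs significantly from zero; the paper's spatial SiZer additionally accounts for spatial correlation, which this sketch deliberately ignores.

```python
# Toy 1D SiZer: local-linear slope estimates over a grid of locations and
# bandwidths, flagged where the slope is significantly nonzero. All data
# and tuning choices are assumptions for illustration.
import numpy as np

def slope_and_se(x, y, x0, h):
    """Weighted least-squares slope at x0 with a Gaussian kernel of width h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    A = X.T @ (w[:, None] * X)
    beta = np.linalg.solve(A, X.T @ (w * y))
    resid = y - X @ beta
    sigma2 = np.sum(w * resid ** 2) / max(np.sum(w) - 2.0, 1.0)
    Ainv = np.linalg.inv(A)
    cov = Ainv @ (X.T @ ((w ** 2)[:, None] * X)) @ Ainv * sigma2
    return beta[1], np.sqrt(cov[1, 1])

def sizer_map(x, y, grid, bandwidths, z=1.96):
    """+1 / -1 / 0 where the curve is significantly rising / falling / flat."""
    out = np.zeros((len(bandwidths), len(grid)), dtype=int)
    for i, h in enumerate(bandwidths):
        for j, x0 in enumerate(grid):
            slope, se = slope_and_se(x, y, x0, h)
            if slope - z * se > 0:
                out[i, j] = 1
            elif slope + z * se < 0:
                out[i, j] = -1
    return out

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, 200)   # synthetic signal
grid = np.linspace(0.05, 0.95, 30)
print(sizer_map(x, y, grid, bandwidths=[0.05, 0.1, 0.2]))
```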

  6. The Role of Statistics in Business and Industry

    CERN Document Server

    Hahn, Gerald J

    2011-01-01

    An insightful guide to the use of statistics for solving key problems in modern-day business and industry This book has been awarded the Technometrics Ziegel Prize for the best book reviewed by the journal in 2010. Technometrics is a journal of statistics for the physical, chemical and engineering sciences, published jointly by the American Society for Quality and the American Statistical Association. Criteria for the award include that the book brings together in one volume a body of material previously only available in scattered research articles and having the potential to significantly im

  7. Statistical analysis of the factors that influenced the mechanical properties improvement of cassava starch films

    Science.gov (United States)

    Monteiro, Mayra; Oliveira, Victor; Santos, Francisco; Barros Neto, Eduardo; Silva, Karyn; Silva, Rayane; Henrique, João; Chibério, Abimaelle

    2017-08-01

    In order to obtain cassava starch films with improved mechanical properties relative to the synthetic polymers used in packaging production, a complete 2³ factorial design was carried out to investigate which factors significantly influence the tensile strength of the biofilm. The factors investigated were the cassava starch, glycerol, and modified clay contents. Modified bentonite clay was used as the filler of the biofilm, and glycerol was the plasticizer used to thermoplasticize the cassava starch. The factorial analysis suggested a regression model capable of predicting the optimal mechanical property of the cassava starch film from the maximization of the tensile strength. The reliability of the regression model was tested by the correlation established with the experimental data through the following statistical analysis: a Pareto chart. The modified clay was the factor of greatest statistical significance for the observed response variable, being the factor that contributed most to the improvement of the mechanical property of the starch film. The factorial experiments showed that the interaction of glycerol with both modified clay and cassava starch was significant for the reduction of biofilm ductility. Modified clay and cassava starch contributed to the maximization of biofilm ductility, while glycerol contributed to its minimization.
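
    For readers unfamiliar with two-level factorial analysis, the effect-estimation and Pareto-ranking steps can be sketched in a few lines. The factor names and response values below are invented placeholders, not the study's data.

```python
# Sketch of effect estimation for a 2^3 factorial design: eight coded runs,
# main effects and two-factor interactions, ranked Pareto-style by
# absolute magnitude. Response values are hypothetical.
import itertools
import numpy as np

factors = ["starch", "glycerol", "clay"]
design = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 coded runs
y = np.array([4.1, 5.0, 3.9, 4.8, 6.2, 7.4, 6.0, 7.9])        # tensile strength, MPa

effects = {}
for k, name in enumerate(factors):                       # main effects
    effects[name] = 2 * np.mean(y * design[:, k])
for a, b in itertools.combinations(range(3), 2):         # two-factor interactions
    effects[f"{factors[a]}x{factors[b]}"] = 2 * np.mean(y * design[:, a] * design[:, b])

# Pareto-style ranking: largest absolute effect first
for name, e in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name:18s} {e:+.3f}")
```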

  8. Sampling, Probability Models and Statistical Reasoning - Statistical Inference

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning - Statistical Inference. Mohan Delampady and V. R. Padmawar. General Article, Resonance - Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  9. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose informed sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
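
    The Bayesian estimation step can be illustrated in isolation. In the sketch below, path sampling is reduced to a Bernoulli draw and a Beta posterior over the probability of reaching the target event is updated until its credible interval is narrow enough; in the actual technique the samples come from Monte Carlo symbolic execution, and informed sampling prunes high-probability paths. The prior and stopping rule are illustrative assumptions.

```python
# Sketch: Bayesian estimation of the probability of reaching a target
# event from sampled program paths. sample_path_hits stands in for one
# symbolic-execution sample; the Beta(1, 1) prior and stopping tolerance
# are assumptions.
import numpy as np
from scipy import stats

def estimate_event_probability(sample_path_hits, alpha=1.0, beta=1.0,
                               tol=0.01, max_samples=100_000):
    hits = 0
    for n in range(1, max_samples + 1):
        hits += int(sample_path_hits())
        a, b = alpha + hits, beta + (n - hits)
        lo, hi = stats.beta.ppf([0.025, 0.975], a, b)
        if hi - lo < tol:                      # credible interval narrow enough
            break
    return a / (a + b), (lo, hi), n

rng = np.random.default_rng(7)
true_p = 0.12                                  # unknown in practice
p_hat, ci, n = estimate_event_probability(lambda: rng.random() < true_p)
print(f"p_hat={p_hat:.3f}, 95% CI=({ci[0]:.3f}, {ci[1]:.3f}), samples={n}")
```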

  10. Statistical shape analysis with applications in R

    CERN Document Server

    Dryden, Ian L

    2016-01-01

    A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia; investigating protein molecules in bioinformatics; and describing growth of organisms in biology. This book is a significant update of the highly-regarded `Statistical Shape Analysis’ by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while reta...

  11. Brain SPECT analysis using statistical parametric mapping in patients with posttraumatic stress disorder

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Euy Neyng; Sohn, Hyung Sun; Kim, Sung Hoon; Chung, Soo Kyo; Yang, Dong Won [College of Medicine, The Catholic Univ. of Korea, Seoul (Korea, Republic of)

    2001-07-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with posttraumatic stress disorder (PTSD) using statistical parametric mapping (SPM99). Noninvasive rCBF measurements using {sup 99m}Tc-ethyl cysteinate dimer (ECD) SPECT were performed on 23 patients with PTSD and 21 age-matched normal controls without re-exposure to accident-related stimuli. The relative rCBF maps of patients with PTSD and controls were compared. In patients with PTSD, significantly increased rCBF was found along the limbic system of the brain. There were a few foci of decreased rCBF in the superior frontal gyrus and the parietal and temporal regions. PTSD is associated with increased rCBF in limbic areas compared with age-matched normal controls. These findings implicate regions of the limbic brain, which may mediate the response to aversive stimuli in healthy individuals, as playing an important role in patients suffering from PTSD, and suggest ongoing hyperfunction of an 'overlearned survival response', or flashback response, in these regions after painful, life-threatening, or horrifying events, even without re-exposure to the same traumatic stimulus.

  12. Brain SPECT analysis using statistical parametric mapping in patients with posttraumatic stress disorder

    International Nuclear Information System (INIS)

    Kim, Euy Neyng; Sohn, Hyung Sun; Kim, Sung Hoon; Chung, Soo Kyo; Yang, Dong Won

    2001-01-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with posttraumatic stress disorder (PTSD) using statistical parametric mapping (SPM99). Noninvasive rCBF measurements using 99mTc-ethyl cysteinate dimer (ECD) SPECT were performed on 23 patients with PTSD and 21 age-matched normal controls without re-exposure to accident-related stimuli. The relative rCBF maps of patients with PTSD and controls were compared. In patients with PTSD, significantly increased rCBF was found along the limbic system of the brain. There were a few foci of decreased rCBF in the superior frontal gyrus and the parietal and temporal regions. PTSD is associated with increased rCBF in limbic areas compared with age-matched normal controls. These findings implicate regions of the limbic brain, which may mediate the response to aversive stimuli in healthy individuals, as playing an important role in patients suffering from PTSD, and suggest ongoing hyperfunction of an 'overlearned survival response', or flashback response, in these regions after painful, life-threatening, or horrifying events, even without re-exposure to the same traumatic stimulus.

  13. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  14. Statistical applications for chemistry, manufacturing and controls (CMC) in the pharmaceutical industry

    CERN Document Server

    Burdick, Richard K; Pfahler, Lori B; Quiroz, Jorge; Sidor, Leslie; Vukovinsky, Kimberly; Zhang, Lanju

    2017-01-01

    This book examines statistical techniques that are critically important to Chemistry, Manufacturing, and Control (CMC) activities. Statistical methods are presented with a focus on applications unique to the CMC in the pharmaceutical industry. The target audience consists of statisticians and other scientists who are responsible for performing statistical analyses within a CMC environment. Basic statistical concepts are addressed in Chapter 2 followed by applications to specific topics related to development and manufacturing. The mathematical level assumes an elementary understanding of statistical methods. The ability to use Excel or statistical packages such as Minitab, JMP, SAS, or R will provide more value to the reader. The motivation for this book came from an American Association of Pharmaceutical Scientists (AAPS) short course on statistical methods applied to CMC applications presented by four of the authors. One of the course participants asked us for a good reference book, and the only book recomm...

  15. A Reanalysis of Curvature in the Dose Response for Cancer and Modifications by Age at Exposure Following Radiation Therapy for Benign Disease

    International Nuclear Information System (INIS)

    Little, Mark P.; Stovall, Marilyn; Smith, Susan A.; Kleinerman, Ruth A.

    2013-01-01

    Purpose: To assess the shape of the dose response for various cancer endpoints and its modification by age and time. Methods and Materials: Reanalysis of the US peptic ulcer data, testing for heterogeneity of radiogenic risk by cancer endpoint (stomach, pancreas, lung, leukemia, all other). Results: There are statistically significant excess risks, with risk coefficients Gy−1 of 0.024 (95% confidence interval [CI] 0.011, 0.039), 0.559 (95% CI 0.221, 1.021), 0.042 (95% CI −0.002, 0.119), and 1.087 (95% CI −0.018, 4.925). There is statistically significant (P=.007) excess risk of pancreatic cancer when adjusted for dose-response curvature. General downward curvature is apparent in the dose response, statistically significant (P<.05) for all cancers, pancreatic cancer, and all other cancers (ie, other than stomach, pancreas, lung, leukemia). There are indications of reduction in relative risk with increasing age at exposure (for all cancers, pancreatic cancer), but no evidence for quadratic variations in relative risk with age at exposure. If a linear-exponential dose response is used, there is no significant heterogeneity in the dose response among the 5 endpoints considered or in the speed of variation of relative risk with age at exposure. The risks are generally consistent with those observed in the Japanese atomic bomb survivors and in groups of nuclear workers. Conclusions: There are excess risks for various malignancies in this data set. Generally there is a marked downward curvature in the dose response and significant reduction in relative risk with increasing age at exposure. The consistency of risks with those observed in the Japanese atomic bomb survivors and in groups of nuclear workers implies that there may be little sparing effect of fractionation of dose or low-dose-rate exposure.

  16. Frog Statistics

    Science.gov (United States)

    Web-server access statistics for the Whole Frog Project and Virtual Frog Dissection pages; the reported counts are filtered for duplicate or extraneous accesses and under-represent the total bytes requested.

  17. Teaching biology through statistics: application of statistical methods in genetics and zoology courses.

    Science.gov (United States)

    Colon-Berlingeri, Migdalisel; Burrowes, Patricia A

    2011-01-01

    Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, addition of statistics and precalculus as prerequisites to core science courses, and incorporating interdisciplinary (math-biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We assessed the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology.

  18. Effects of quantum coherence on work statistics

    Science.gov (United States)

    Xu, Bao-Ming; Zou, Jian; Guo, Li-Sha; Kong, Xiang-Mu

    2018-05-01

    In the conventional two-point measurement scheme of quantum thermodynamics, quantum coherence is destroyed by the first measurement. However, coherence plays an important role in quantum thermodynamic processes, and how to describe work statistics for a quantum coherent process is still an open question. In this paper, we use the full counting statistics method to investigate the effects of quantum coherence on work statistics. First, we give a general discussion and show that for a quantum coherent process, work statistics are very different from those of the two-point measurement scheme; specifically, the average work is increased or decreased and the work fluctuation can be decreased by quantum coherence, which strongly depends on the relative phase, the energy level structure, and the external protocol. Then, we concretely consider a quenched one-dimensional transverse Ising model and show that quantum coherence has a more significant influence on work statistics in the ferromagnetic regime than in the paramagnetic regime, so that due to the presence of quantum coherence the work statistics can exhibit critical phenomena even at high temperature.

  19. Demand Response to Advertising in the Australian Meat Industry

    OpenAIRE

    Nicholas E. Piggott; James A. Chalfant; Julian M. Alston; Garry R. Griffith

    1996-01-01

    The implications of model specification choices for the measurement of demand response to advertising are examined using Australian data. Single-equation models versus complete systems and alternative corrections for autocorrelation are evaluated. Competing advertising efforts by two producer bodies are included. Across all specifications, the evidence on advertising effects is fairly consistent. In the preferred model, the only statistically significant effects of advertising are for Austral...

  20. Statistical learning and selective inference.

    Science.gov (United States)

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
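
    A small simulation makes the cherry-picking problem vivid. The sketch below selects, among many pure-noise predictors, the one most correlated with the response; the naive p-value is badly anti-conservative, while simple data splitting (select on one half, test on the other) restores validity. This illustrates the problem the authors address, not their selective-inference methodology itself.

```python
# Simulation: testing the cherry-picked strongest association inflates
# false positives; data splitting is one crude remedy. All data are noise.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p, reps = 100, 50, 1000
naive, split = [], []
for _ in range(reps):
    X = rng.normal(size=(n, p))
    y = rng.normal(size=n)                    # no true association anywhere
    full = [stats.pearsonr(X[:, j], y) for j in range(p)]
    j_best = max(range(p), key=lambda j: abs(full[j][0]))
    naive.append(full[j_best][1])             # p-value of the cherry-picked test
    half = n // 2                             # select on first half, test on second
    j_sel = max(range(p),
                key=lambda j: abs(stats.pearsonr(X[:half, j], y[:half])[0]))
    split.append(stats.pearsonr(X[half:, j_sel], y[half:])[1])

print("rate of p < 0.05, naive          :", np.mean(np.array(naive) < 0.05))
print("rate of p < 0.05, data splitting :", np.mean(np.array(split) < 0.05))
```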

  1. Immune response and anamnestic immune response in children after a 3-dose primary hepatitis b vaccination

    International Nuclear Information System (INIS)

    Afzal, M.F.; Sultan, M.A.; Saleemi, A.I.

    2017-01-01

    Diseases caused by Hepatitis B virus (HBV) have a worldwide distribution. Pakistan adopted the recommendations of the World Health Organization (WHO) for routine universal infant vaccination against hepatitis B in 2002, currently administered at 6, 10, and 14 weeks of age in a combination vaccine. This study was conducted to determine the immune response and anamnestic immune response in children, 9 months to 10 years of age, after a 3-dose primary Hepatitis B vaccination. Methods: This cross-sectional study was conducted in the Department of Paediatrics, King Edward Medical University/Mayo Hospital, Lahore, Pakistan, from January to June, 2014. A total of 200 children of either sex between the ages of 9 months and 10 years, documented to have received 3 doses of hepatitis B vaccine according to the Expanded Program of Immunization (6, 10, 14 weeks) schedule in infancy, were recruited by consecutive sampling. The level of serum anti-HBs antibody was measured by ELISA. Children with anti-HBs titers ≥10 mIU/mL were considered to be immune. Those with anti-HBs levels <10 mIU/mL were offered a booster dose of infant recombinant hepatitis B vaccine. The second serum sample was obtained 21-28 days following the administration of the booster dose and the anamnestic immune response was measured. Data were analysed using SPSS 17 to determine the relation between time interval since last vaccination and antibody titer. The Chi-square test was applied. Results: Of the 200 children, a protective antibody response was found in 58 percent. Median serological response was 18.60 (range 2.82-65.15). Antibody levels were found to have a statistically significant (p-value 0.019) negative correlation with the time since last administration of vaccine. A booster dose of Hepatitis B vaccine was administered to all non-responders, with each registering a statistically significant (p-value 0.00) anamnestic response. Conclusion: The vaccination schedule with short dosage interval was unable to provide

  2. The Corporate Social Responsibility of Family Businesses: An International Approach

    Directory of Open Access Journals (Sweden)

    Gérard Hirigoyen

    2014-07-01

    Full Text Available This study analyzes the links between listed family businesses and social responsibility. On the theoretical level, it establishes a relationship between socioemotional wealth, proactive stakeholder engagement, and the social responsibility of family businesses. On a practical level, our results (obtained from a sample of 363 companies) show that family businesses do not differ from non-family businesses in many dimensions of social responsibility. Moreover, family businesses have statistically significant lower ratings for four sub-dimensions of “corporate governance”, namely “balance of power and effectiveness of the Board”, “audit and control mechanisms”, “engagement with shareholders and shareholder structure”, and “executive compensation”.

  3. Applying Item Response Theory Methods to Examine the Impact of Different Response Formats

    Science.gov (United States)

    Hohensinn, Christine; Kubinger, Klaus D.

    2011-01-01

    In aptitude and achievement tests, different response formats are usually used. A fundamental distinction must be made between the class of multiple-choice formats and the constructed response formats. Previous studies have examined the impact of different response formats applying traditional statistical approaches, but these influences can also…

  4. Statistical reporting inconsistencies in experimental philosophy.

    Science.gov (United States)

    Colombo, Matteo; Duev, Georgi; Nuijten, Michèle B; Sprenger, Jan

    2018-01-01

    Experimental philosophy (x-phi) is a young field of research in the intersection of philosophy and psychology. It aims to make progress on philosophical questions by using experimental methods traditionally associated with the psychological and behavioral sciences, such as null hypothesis significance testing (NHST). Motivated by recent discussions about a methodological crisis in the behavioral sciences, questions have been raised about the methodological standards of x-phi. Here, we focus on one aspect of this question, namely the rate of inconsistencies in statistical reporting. Previous research has examined the extent to which published articles in psychology and other behavioral sciences present statistical inconsistencies in reporting the results of NHST. In this study, we used the R package statcheck to detect statistical inconsistencies in x-phi, and compared rates of inconsistencies in psychology and philosophy. We found that rates of inconsistencies in x-phi are lower than in the psychological and behavioral sciences. From the point of view of statistical reporting consistency, x-phi seems to do no worse, and perhaps even better, than psychological science.
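
    statcheck itself is an R package, but the core consistency check it performs is easy to sketch: recompute a p-value from the reported test statistic and degrees of freedom, and compare it with the reported p-value up to rounding. The reported values below are invented.

```python
# Sketch of a statcheck-style consistency check for a reported t-test.
# The tolerance assumes p-values are reported to three decimal places.
from scipy import stats

def check_t_report(t, df, reported_p, two_tailed=True, tol=0.0005):
    recomputed = stats.t.sf(abs(t), df) * (2 if two_tailed else 1)
    return recomputed, abs(recomputed - reported_p) <= tol

for t, df, p in [(2.05, 28, 0.050), (2.05, 28, 0.010)]:   # second one is wrong
    recomputed, ok = check_t_report(t, df, p)
    print(f"t({df}) = {t}, reported p = {p}: recomputed p = {recomputed:.4f}"
          f" -> {'consistent' if ok else 'INCONSISTENT'}")
```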

  5. Statistical reporting inconsistencies in experimental philosophy

    Science.gov (United States)

    Colombo, Matteo; Duev, Georgi; Nuijten, Michèle B.; Sprenger, Jan

    2018-01-01

    Experimental philosophy (x-phi) is a young field of research in the intersection of philosophy and psychology. It aims to make progress on philosophical questions by using experimental methods traditionally associated with the psychological and behavioral sciences, such as null hypothesis significance testing (NHST). Motivated by recent discussions about a methodological crisis in the behavioral sciences, questions have been raised about the methodological standards of x-phi. Here, we focus on one aspect of this question, namely the rate of inconsistencies in statistical reporting. Previous research has examined the extent to which published articles in psychology and other behavioral sciences present statistical inconsistencies in reporting the results of NHST. In this study, we used the R package statcheck to detect statistical inconsistencies in x-phi, and compared rates of inconsistencies in psychology and philosophy. We found that rates of inconsistencies in x-phi are lower than in the psychological and behavioral sciences. From the point of view of statistical reporting consistency, x-phi seems to do no worse, and perhaps even better, than psychological science. PMID:29649220

  6. Statistical Analysis of Big Data on Pharmacogenomics

    Science.gov (United States)

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrices for understanding correlation structure, inverse covariance matrices for network modeling, large-scale simultaneous tests for selecting significantly differentially expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecular mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
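
    One of the ingredients mentioned above, large-scale simultaneous testing, is easy to illustrate with the Benjamini-Hochberg procedure on simulated gene-level p-values; the simulation below plants a small block of true signals among many nulls. This is a generic illustration, not a method from the paper itself.

```python
# Benjamini-Hochberg FDR control over many simulated gene-level p-values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
m, m_signal = 10_000, 200
z = rng.normal(size=m)
z[:m_signal] += 3.0                           # genes with a real effect
pvals = 2 * stats.norm.sf(np.abs(z))

def benjamini_hochberg(p, q=0.05):
    """Boolean mask of discoveries at FDR level q."""
    m = len(p)
    order = np.argsort(p)
    passed = p[order] <= q * np.arange(1, m + 1) / m
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    keep = np.zeros(m, dtype=bool)
    keep[order[:k]] = True                    # reject the k smallest p-values
    return keep

discoveries = benjamini_hochberg(pvals, q=0.05)
print("discoveries:", discoveries.sum(),
      "| false discoveries:", discoveries[m_signal:].sum())
```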

  7. Experimental design techniques in statistical practice a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

    Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training for the use of designs and teaches statistical and non-statistical skills in the design and analysis of project studies throughout science and industry. Discusses one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among other topics.

  8. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. This skill can be developed at various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and use formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of a mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If a minimum value of 65 is taken as the standard achievement of course competence, the students' mean values are lower than the standard. The results of the misconception study indicate which subtopics should be given attention. Based on the assessment results, students' misconceptions occurred in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining the concept to be used in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  9. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation for complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of a small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some good characteristics of the OSE, such as monotonic behaviour between two neighbouring points and independence of the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)

  10. Statistical Inference at Work: Statistical Process Control as an Example

    Science.gov (United States)

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  11. What can we learn from noise? - Mesoscopic nonequilibrium statistical physics.

    Science.gov (United States)

    Kobayashi, Kensuke

    2016-01-01

    Mesoscopic systems - small electric circuits working in the quantum regime - offer us a unique experimental stage to explore quantum transport in a tunable and precise way. The purpose of this Review is to show how they can contribute to statistical physics. We introduce the significance of fluctuation, or equivalently noise, as noise measurement enables us to address the fundamental aspects of a physical system. The significance of the fluctuation theorem (FT) in statistical physics is noted. We explain what information can be deduced from current noise measurements in mesoscopic systems. As an important application of noise measurement to statistical physics, we describe our experimental work on the current and current noise in an electron interferometer, which is the first experimental test of the FT in the quantum regime. Our attempt will shed new light on the research field of mesoscopic quantum statistical physics.

  12. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

    As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...

  13. Solution of the statistical bootstrap with Bose statistics

    International Nuclear Information System (INIS)

    Engels, J.; Fabricius, K.; Schilling, K.

    1977-01-01

    A brief and transparent way to introduce Bose statistics into the statistical bootstrap of Hagedorn and Frautschi is presented. The resulting bootstrap equation is solved by a cluster expansion for the grand canonical partition function. The shift of the ultimate temperature due to Bose statistics is determined through an iteration process. We discuss two-particle spectra of the decaying fireball (with given mass) as obtained from its grand microcanonical level density

  14. Comparing Simulated and Theoretical Sampling Distributions of the U3 Person-Fit Statistic.

    Science.gov (United States)

    Emons, Wilco H. M.; Meijer, Rob R.; Sijtsma, Klaas

    2002-01-01

    Studied whether the theoretical sampling distribution of the U3 person-fit statistic is in agreement with the simulated sampling distribution under different item response theory models and varying item and test characteristics. Simulation results suggest that the use of standard normal deviates for the standardized version of the U3 statistic may…

  15. Descriptive and inferential statistical methods used in burns research.

    Science.gov (United States)

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-squared test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals

  16. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  17. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  18. Effects of inbreeding on potential and realized immune responses in Tenebrio molitor.

    Science.gov (United States)

    Rantala, Markus J; Viitaniemi, Heidi; Roff, Derek A

    2011-06-01

    Although numerous studies on vertebrates suggest that inbreeding reduces their resistance against parasites and pathogens, studies in insects have found contradictory evidence. In this study we tested the effect of 1 generation of brother-sister mating (inbreeding) on potential and realized immune responses and other life-history traits in Tenebrio molitor. We found that inbreeding reduced adult mass and pre-adult survival and increased development time, suggesting that inbreeding reduced the condition of the adults and thus potentially made them more susceptible to physiological stress. However, we found no significant effect of inbreeding on the potential immune response (encapsulation response), but inbreeding reduced the realized immune response (resistance against the entomopathogenic fungus Beauveria bassiana). There was a significant family effect on encapsulation response, but no family effect on the resistance against the entomopathogenic fungus. Given that this latter trait showed significant inbreeding depression and that the sample size for the family-effect analysis was small, it is likely that the lack of a significant family effect is due to reduced statistical power, rather than to the lack of a heritable basis to the trait. Our study highlights the importance of using pathogens and parasites in immunoecological studies.

  19. On the Relationship Between Transfer Function-derived Response Times and Hydrograph Analysis Timing Parameters: Are there Similarities?

    Science.gov (United States)

    Bansah, S.; Ali, G.; Haque, M. A.; Tang, V.

    2017-12-01

    The proportion of precipitation that becomes streamflow is a function of internal catchment characteristics - which include geology, landscape characteristics and vegetation - that influence overall storage dynamics. The timing and quantity of water discharged by a catchment are indeed embedded in event hydrographs. Event hydrograph timing parameters, such as the response lag and time of concentration, are important descriptors of how long it takes the catchment to respond to input precipitation and how long it takes the latter to filter through the catchment. However, the extent to which hydrograph timing parameters relate to average response times derived from fitting transfer functions to annual hydrographs is unknown. In this study, we used a gamma transfer function to determine catchment average response times as well as event-specific hydrograph parameters across a network of eight nested prairie watersheds, ranging from 0.19 km2 to 74.6 km2, located in south-central Manitoba (Canada). Various statistical analyses were then performed to correlate average response times - estimated using the parameters of the fitted gamma transfer function - to event-specific hydrograph parameters. Preliminary results show significant interannual variations in response times and hydrograph timing parameters: the former were on the order of a few hours to days, while the latter ranged from a few days to weeks. Some statistically significant relationships were detected between response times and event-specific hydrograph parameters. Future analyses will involve comparing the statistical distributions of event-specific hydrograph parameters with those of runoff response times and baseflow transit times in order to quantify catchment storage dynamics across a range of temporal scales.
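
    The transfer-function side of this comparison can be sketched as follows: rainfall is convolved with a gamma-shaped unit response, and the fitted shape and scale imply a mean response time (their product). The data, the least-squares criterion, and the optimizer below are all assumptions for illustration; the study's actual fitting procedure may differ.

```python
# Sketch: fit a gamma transfer function to a rainfall-runoff pair and read
# off the mean response time as shape * scale. Data are synthetic.
import numpy as np
from scipy import stats, optimize

def gamma_kernel(t, shape, scale):
    return stats.gamma.pdf(t, a=shape, scale=scale)

def simulate_flow(rain, shape, scale, dt=1.0):
    t = np.arange(len(rain)) * dt
    return np.convolve(rain, gamma_kernel(t, shape, scale) * dt)[: len(rain)]

rng = np.random.default_rng(3)
rain = rng.gamma(0.3, 2.0, size=365) * (rng.random(365) < 0.2)   # daily rain
observed = simulate_flow(rain, shape=2.0, scale=3.0) + rng.normal(0, 0.01, 365)

def loss(params):
    return np.sum((simulate_flow(rain, *params) - observed) ** 2)

res = optimize.minimize(loss, x0=[1.0, 1.0], bounds=[(0.1, 20), (0.1, 50)])
shape, scale = res.x
print(f"fitted shape={shape:.2f}, scale={scale:.2f}, "
      f"mean response time ~ {shape * scale:.1f} days")
```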

  20. Statistical Process Control: Going to the Limit for Quality.

    Science.gov (United States)

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)
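
    The "specific standard levels" mentioned above are, in the classic Shewhart formulation, control limits computed from the process data themselves. A minimal X-bar chart sketch with simulated measurements (the data and specification are invented):

```python
# Minimal Shewhart X-bar control chart: subgroup means compared against
# limits derived from the average subgroup range. Data are simulated.
import numpy as np

rng = np.random.default_rng(5)
subgroups = rng.normal(10.0, 0.2, size=(25, 5))      # 25 subgroups of 5 parts
xbar = subgroups.mean(axis=1)
rbar = np.ptp(subgroups, axis=1).mean()              # average subgroup range
A2 = 0.577                                           # standard constant for n=5
center = xbar.mean()
ucl, lcl = center + A2 * rbar, center - A2 * rbar

for i, m in enumerate(xbar):
    if not lcl <= m <= ucl:
        print(f"subgroup {i}: mean {m:.3f} outside [{lcl:.3f}, {ucl:.3f}]")
print(f"center={center:.3f}, LCL={lcl:.3f}, UCL={ucl:.3f}")
```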

  1. Statistical tests for person misfit in computerized adaptive testing

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Meijer, R.R.; van Krimpen-Stoop, Edith

    1998-01-01

    Recently, several person-fit statistics have been proposed to detect nonfitting response patterns. This study is designed to generalize an approach followed by Klauer (1995) to an adaptive testing system using the two-parameter logistic model (2PL) as a null model. The approach developed by Klauer

  2. Cortical Surround Interactions and Perceptual Salience via Natural Scene Statistics.

    Directory of Open Access Journals (Sweden)

    Ruben Coen-Cagli

    Full Text Available Spatial context in images induces perceptual phenomena associated with salience and modulates the responses of neurons in primary visual cortex (V1. However, the computational and ecological principles underlying contextual effects are incompletely understood. We introduce a model of natural images that includes grouping and segmentation of neighboring features based on their joint statistics, and we interpret the firing rates of V1 neurons as performing optimal recognition in this model. We show that this leads to a substantial generalization of divisive normalization, a computation that is ubiquitous in many neural areas and systems. A main novelty in our model is that the influence of the context on a target stimulus is determined by their degree of statistical dependence. We optimized the parameters of the model on natural image patches, and then simulated neural and perceptual responses on stimuli used in classical experiments. The model reproduces some rich and complex response patterns observed in V1, such as the contrast dependence, orientation tuning and spatial asymmetry of surround suppression, while also allowing for surround facilitation under conditions of weak stimulation. It also mimics the perceptual salience produced by simple displays, and leads to readily testable predictions. Our results provide a principled account of orientation-based contextual modulation in early vision and its sensitivity to the homogeneity and spatial arrangement of inputs, and lends statistical support to the theory that V1 computes visual salience.
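
    The core computation being generalized here, divisive normalization, is compact enough to sketch. In the sketch below the normalization weights are fixed numbers; the model's novelty is precisely that such weights follow from the inferred statistical dependence between target and surround, which is not implemented here.

```python
# Schematic divisive normalization: each unit's squared drive is divided
# by a weighted pool of surrounding drives. Weights are placeholders.
import numpy as np

def divisive_normalization(drive, weights, sigma=1.0):
    """drive: filter responses; weights[i, j]: influence of unit j on unit i."""
    pooled = weights @ (drive ** 2)
    return drive ** 2 / (sigma ** 2 + pooled)

drive = np.array([3.0, 1.0, 0.5])            # target plus two surround units
uniform = np.full((3, 3), 0.5)               # strong, homogeneous surround pool
independent = np.eye(3) * 0.5                # surround judged statistically independent
print("suppressed:", divisive_normalization(drive, uniform))
print("released  :", divisive_normalization(drive, independent))
```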

  3. All of statistics a concise course in statistical inference

    CERN Document Server

    Wasserman, Larry

    2004-01-01

    This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...

  4. What type of statistical model to choose for the analysis of radioimmunoassays

    International Nuclear Information System (INIS)

    Huet, S.

    1984-01-01

    The current techniques used for the statistical analysis of radioimmunoassays are not very satisfactory for either the statistician or the biologist. They are based on an attempt to make the response curve linear in order to avoid complicated computations. The present article shows that this practice has considerable (and often neglected) effects on the statistical assumptions which must be formulated. A stricter analysis is proposed, applying the four-parameter logistic model. The advantages of this method are that the statistical assumptions formulated are based on observed data, and that the model can be applied to almost all radioimmunoassays [fr]
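
    The four-parameter logistic model advocated here is straightforward to fit with a generic least-squares routine. The calibration data below are invented, and scipy's curve_fit stands in for whatever estimation scheme the paper develops.

```python
# Sketch: fit a four-parameter logistic (4PL) curve to radioimmunoassay
# calibration data. Dose-response values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, d, c, b):
    """a: response at zero dose, d: at infinite dose, c: EC50, b: slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
counts = np.array([9800, 9200, 7600, 5100, 2700, 1400, 900])   # e.g. CPM

params, _ = curve_fit(four_pl, dose, counts, p0=[10_000, 800, 3.0, 1.0])
a, d, c, b = params
print(f"zero-dose={a:.0f}, plateau={d:.0f}, EC50={c:.2f}, slope={b:.2f}")
```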

  5. Computed statistics at streamgages, and methods for estimating low-flow frequency statistics and development of regional regression equations for estimating low-flow frequency statistics at ungaged locations in Missouri

    Science.gov (United States)

    Southard, Rodney E.

    2013-01-01

    located in Region 1, 120 were located in Region 2, and 10 were located in Region 3. Streamgages located outside of Missouri were selected to extend the range of data used for the independent variables in the regression analyses. Streamgages included in the regression analyses had 10 or more years of record and were considered to be affected minimally by anthropogenic activities or trends. Regional regression analyses identified three characteristics as statistically significant for the development of regional equations. For Region 1, drainage area, longest flow path, and streamflow-variability index were statistically significant. The range in the standard error of estimate for Region 1 is 79.6 to 94.2 percent. For Region 2, drainage area and streamflow variability index were statistically significant, and the range in the standard error of estimate is 48.2 to 72.1 percent. For Region 3, drainage area and streamflow-variability index also were statistically significant with a range in the standard error of estimate of 48.1 to 96.2 percent. Limitations on the use of estimating low-flow frequency statistics at ungaged locations are dependent on the method used. The first method outlined for use in Missouri, power curve equations, were developed to estimate the selected statistics for ungaged locations on 28 selected streams with multiple streamgages located on the same stream. A second method uses a drainage-area ratio to compute statistics at an ungaged location using data from a single streamgage on the same stream with 10 or more years of record. Ungaged locations on these streams may use the ratio of the drainage area at an ungaged location to the drainage area at a streamgage location to scale the selected statistic value from the streamgage location to the ungaged location. This method can be used if the drainage area of the ungaged location is within 40 to 150 percent of the streamgage drainage area. The third method is the use of the regional regression equations
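
    The second method described above, the drainage-area ratio, amounts to a one-line computation with an applicability check, sketched below with invented numbers and the 40 to 150 percent window stated in the record.

```python
# Drainage-area-ratio estimate of a low-flow statistic at an ungaged site,
# valid only when the ungaged area is within 40-150% of the gaged area.
def drainage_area_ratio(stat_gaged, area_gaged, area_ungaged):
    ratio = area_ungaged / area_gaged
    if not 0.40 <= ratio <= 1.50:
        raise ValueError(f"area ratio {ratio:.2f} outside the 40-150% range; "
                         "use the regional regression equations instead")
    return stat_gaged * ratio

# hypothetical: 7Q10 of 12.0 cfs at a gage draining 210 mi^2, ungaged site 260 mi^2
print(drainage_area_ratio(12.0, 210.0, 260.0))   # -> about 14.9 cfs
```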

  6. Double sliding-window technique: a new method to calculate the neuronal response onset latency.

    Science.gov (United States)

    Berényi, Antal; Benedek, György; Nagy, Attila

    2007-10-31

    Neuronal response onset latency provides important data on information processing within the central nervous system. In order to enhance the quality of onset latency estimation, we have developed a 'double sliding-window' technique, which combines the advantages of mathematical methods with the reliability of standard statistical processes. This method is based on repetitive series of statistical probes between two virtual time windows. The layout of the significance curve reveals the starting points of changes in neuronal activity in the form of break-points between linear segments. A second-order difference function is applied to determine the position of maximum slope change, which corresponds to the onset of the response. In comparison with Poisson spike-train analysis, the cumulative sum technique, and the method of Falzett et al., this 'double sliding-window' technique seems to be a more accurate automated procedure for calculating the response onset latency over a broad range of neuronal response characteristics.
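
    A minimal version of the procedure is sketched below: two adjacent windows slide over a spike-count series, a significance score for their difference is recorded, and the maximum of the second-order difference of that score curve is taken as the onset. The window length, the Mann-Whitney probe, and the simulated spike counts are all assumptions for illustration.

```python
# Sketch of a double sliding-window onset estimate on simulated spike counts.
import numpy as np
from scipy import stats

def onset_latency(counts, win=10):
    scores = np.zeros(len(counts))
    for t in range(win, len(counts) - win):
        before, after = counts[t - win:t], counts[t:t + win]
        p = stats.mannwhitneyu(before, after, alternative='two-sided').pvalue
        scores[t] = -np.log(p + 1e-300)        # significance curve
    d2 = np.diff(scores, n=2)                  # second-order difference
    return np.argmax(d2) + 1                   # position of maximum slope change

rng = np.random.default_rng(11)
baseline = rng.poisson(2.0, 120)               # 120 bins of background firing
response = rng.poisson(8.0, 80)                # elevated firing from bin 120
print("estimated onset bin:", onset_latency(np.concatenate([baseline, response])))
```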

  7. A Statistical Method for Aggregated Wind Power Plants to Provide Secondary Frequency Control

    DEFF Research Database (Denmark)

    Hu, Junjie; Ziras, Charalampos; Bindner, Henrik W.

    2017-01-01

    The increasing penetration of wind power brings significant challenges to power system operators due to the wind's inherent uncertainty and variability. Traditionally, power plants and more recently demand response have been used to balance the power system. However, the use of wind power as a balancing-power source has also been investigated, especially for wind power dominated power systems such as Denmark. The main drawback is that wind power must be curtailed by setting a lower operating point in order to offer upward regulation. We propose a statistical approach to reduce wind power curtailment for aggregated wind power plants providing secondary frequency control (SFC) to the power system. By using historical SFC signals and wind speed data, we calculate metrics for the reserve provision error as a function of the scheduled wind power. We show that wind curtailment can be significantly...

  8. MQSA National Statistics

    Science.gov (United States)

    Mammography Quality Standards Act (MQSA) national statistics, with links to archived scorecard statistics for 2016, 2017, and 2018.

  9. Effect of Two Different Doses of Dexmedetomidine on Stress Response in Laparoscopic Pyeloplasty: A Randomized Prospective Controlled Study.

    Science.gov (United States)

    Shamim, Rafat; Srivastava, Shashi; Rastogi, Amit; Kishore, Kamal; Srivastava, Aneesh

    2017-01-01

    Clonidine, opioids, β-blockers, and dexmedetomidine have been tried to attenuate stress responses during laparoscopic surgery. We evaluated the efficacy of dexmedetomidine in two different doses in attenuating stress responses in patients undergoing laparoscopic pyeloplasty. Ninety patients were assigned to one of three groups: Group A, Group B, and Group C. Group B received dexmedetomidine 1 mcg/kg as a loading dose, followed by 0.7 mcg/kg/h for maintenance; Group C received dexmedetomidine 0.7 mcg/kg as a loading dose, followed by 0.5 mcg/kg/h for maintenance. Group A received normal saline. Stress responses were assessed by the variations in heart rate (HR), mean arterial pressure (MAP), blood glucose levels, and serum cortisol levels. A one-way analysis of variance test was applied. Multiple comparisons between groups were done with the post hoc Bonferroni test. The HR and MAP were found to be higher in Group A. The difference was statistically significant ( P < 0.05) during intubation, carbon dioxide insufflation, and extubation when compared with Groups B and C. Blood glucose levels at postintubation and at extubation were higher in Group A and statistically significant ( P < 0.05) when compared with Groups B and C. Serum cortisol levels at postintubation, during midsurgery, and 2 h after extubation were higher in Group A and statistically significant ( P < 0.05) when compared with Groups B and C. However, HR, MAP, blood glucose levels, and serum cortisol levels were similar in the dexmedetomidine groups. Dexmedetomidine decreases the stress response and provides good conditions for maintenance of anesthesia. Dexmedetomidine used at the lower dose in Group C decreased the stress response comparably to the higher dose in Group B.

  10. Inverse Statistics in the Foreign Exchange Market

    OpenAIRE

    Jensen, M. H.; Johansen, A.; Petroni, F.; Simonsen, I.

    2004-01-01

    We investigate intra-day foreign exchange (FX) time series using the inverse statistics analysis developed in [1,2]. Specifically, we study the time-averaged distributions of waiting times needed to obtain a certain increase (decrease) ρ in the price of an investment. The analysis is performed for the Deutsche mark (DM) against the US$ for the full year of 1998, but similar results are obtained for the Japanese yen against the US$. With high statistical significance, the presence of "reson...
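
    The waiting-time computation at the heart of inverse statistics can be sketched directly; a Gaussian random walk stands in for the FX series below, and the loop is kept deliberately simple rather than efficient.

```python
# Sketch: inverse-statistics waiting times -- for each starting tick, the
# first-passage time until the log-price has risen by rho. Synthetic data.
import numpy as np

def waiting_times(logprice, rho):
    out = []
    for t in range(len(logprice) - 1):
        gains = logprice[t + 1:] - logprice[t]
        hit = np.nonzero(gains >= rho)[0]
        if hit.size:
            out.append(hit[0] + 1)            # ticks until the gain rho is reached
    return np.array(out)

rng = np.random.default_rng(2)
logprice = np.cumsum(rng.normal(0.0, 0.001, 5_000))   # stand-in for an FX series
tau = waiting_times(logprice, rho=0.005)
hist, edges = np.histogram(tau, bins=50)
print("modal waiting time ~", edges[np.argmax(hist)], "ticks")
```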

  11. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics

  12. Statistical and Economic Techniques for Site-specific Nematode Management.

    Science.gov (United States)

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.

  13. Predicting meaningful outcomes to medication and self-help treatments for binge-eating disorder in primary care: The significance of early rapid response.

    Science.gov (United States)

    Grilo, Carlos M; White, Marney A; Masheb, Robin M; Gueorguieva, Ralitza

    2015-04-01

    We examined rapid response among obese patients with binge-eating disorder (BED) in a randomized clinical trial testing antiobesity medication and self-help cognitive-behavioral therapy (shCBT), alone and in combination, in primary-care settings. One hundred four obese patients with BED were randomly assigned to 1 of 4 treatments: sibutramine, placebo, shCBT + sibutramine, or shCBT + placebo. Treatments were delivered by generalist primary-care physicians and the medications were given double-blind. Independent assessments were performed by trained and monitored doctoral research clinicians monthly throughout treatment, posttreatment (4 months), and at 6- and 12-month follow-ups (i.e., 16 months after randomization). Rapid response, defined as ≥65% reduction in binge eating by the fourth treatment week, was used to predict outcomes. Rapid response characterized 47% of patients, was unrelated to demographic and baseline clinical characteristics, and was significantly associated, prospectively, with remission from binge eating at posttreatment (51% vs. 9% for nonrapid responders), 6-month (53% vs. 23.6%), and 12-month (46.9% vs. 23.6%) follow-ups. Mixed-effects model analyses revealed that rapid response was significantly associated with greater decreases in binge-eating or eating-disorder psychopathology, depression, and percent weight loss. Our findings, based on a diverse obese patient group receiving medication and shCBT for BED in primary-care settings, indicate that patients who have a rapid response achieve good clinical outcomes through 12-month follow-ups after ending treatment. Rapid response represents a strong prognostic indicator of clinically meaningful outcomes, even in low-intensity medication and self-help interventions. Rapid response has important clinical implications for stepped-care treatment models for BED. clinicaltrials.gov: NCT00537810 (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  14. A method for statistical steady state thermal analysis of reactor cores

    International Nuclear Information System (INIS)

    Whetton, P.A.

    1980-01-01

    This paper presents a method for performing a statistical steady state thermal analysis of a reactor core. The technique is only outlined here since detailed thermal equations are dependent on the core geometry. The method has been applied to a pressurised water reactor core and the results are presented for illustration purposes. Random hypothetical cores are generated using the Monte-Carlo method. The technique shows that by splitting the parameters into two types, denoted core-wise and in-core, the Monte Carlo method may be used inexpensively. The idea of using extremal statistics to characterise the low probability events (i.e. the tails of a distribution) is introduced together with a method of forming the final probability distribution. After establishing an acceptable probability of exceeding a thermal design criterion, the final probability distribution may be used to determine the corresponding thermal response value. If statistical and deterministic (i.e. conservative) thermal response values are compared, information on the degree of pessimism in the deterministic method of analysis may be inferred and the restrictive performance limitations imposed by this method relieved. (orig.)
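
    The core idea (sampling random hypothetical cores and reading a design value off the tail of the resulting response distribution) can be illustrated generically. A sketch under invented assumptions: the thermal response below is an arbitrary placeholder function of one core-wise and one in-core parameter, not the paper's equations, which depend on core geometry:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_cores = 100_000  # number of random hypothetical cores

    # Core-wise parameter: one draw per hypothetical core (e.g., inlet temperature)
    inlet_temp = rng.normal(290.0, 2.0, n_cores)            # degC, illustrative
    # In-core parameter: within-core variability, summarized by its hot-channel draw
    power_peaking = rng.lognormal(np.log(1.5), 0.05, n_cores)

    # Placeholder thermal response (NOT the paper's model): hot-channel temperature
    response = inlet_temp + 35.0 * power_peaking

    # Design value at an accepted exceedance probability, read from the tail
    p_exceed = 1e-3
    design_value = np.quantile(response, 1.0 - p_exceed)
    print(f"Response exceeded with probability {p_exceed}: {design_value:.1f} degC")
    ```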

  15. Significant Correlation Between the Infant Gut Microbiome and Rotavirus Vaccine Response in Rural Ghana.

    Science.gov (United States)

    Harris, Vanessa C; Armah, George; Fuentes, Susana; Korpela, Katri E; Parashar, Umesh; Victor, John C; Tate, Jacqueline; de Weerth, Carolina; Giaquinto, Carlo; Wiersinga, Willem Joost; Lewis, Kristen D C; de Vos, Willem M

    2017-01-01

     Rotavirus (RV) is the leading cause of diarrhea-related death in children worldwide and 95% of RV-associated deaths occur in Africa and Asia where RV vaccines (RVVs) have lower efficacy. We hypothesize that differences in intestinal microbiome composition correlate with the decreased RVV efficacy observed in poor settings.  We conducted a nested, case-control study comparing prevaccination, fecal microbiome compositions between 6-week old, matched RVV responders and nonresponders in rural Ghana. These infants' microbiomes were then compared with 154 age-matched, healthy Dutch infants' microbiomes, assumed to be RVV responders. Fecal microbiome analysis was performed in all groups using the Human Intestinal Tract Chip.  We analyzed findings in 78 Ghanaian infants, including 39 RVV responder and nonresponder pairs. The overall microbiome composition was significantly different between RVV responders and nonresponders (FDR, 0.12), and Ghanaian responders were more similar to Dutch infants than nonresponders (P = .002). RVV response correlated with an increased abundance of Streptococcus bovis and a decreased abundance of the Bacteroidetes phylum in comparisons between both Ghanaian RVV responders and nonresponders (FDR, 0.008 vs 0.003) and Dutch infants and Ghanaian nonresponders (FDR, 0.002 vs 0.009).  The intestinal microbiome composition correlates significantly with RVV immunogenicity and may contribute to the diminished RVV immunogenicity observed in developing countries. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America.

  16. Multiparametric statistical investigation of seismicity occurred at El Hierro (Canary Islands) from 2011 to 2014

    Science.gov (United States)

    Telesca, Luciano; Lovallo, Michele; Lopez, Carmen; Marti Molist, Joan

    2016-03-01

    A detailed statistical investigation of the seismicity that occurred at El Hierro volcano (Canary Islands) from 2011 to 2014 has been performed by analysing the time variation of four parameters: the Gutenberg-Richter b-value, the local coefficient of variation, the scaling exponent of the magnitude distribution, and the main periodicity of the earthquake sequence calculated using Schuster's test. These four parameters are good descriptors of the time and magnitude distributions of the seismic sequence, and their variation indicates dynamical changes in the volcanic system. These variations can be attributed to the causes and types of seismicity, making it possible to distinguish between different host-rock fracturing processes caused by intrusions of magma at different depths and overpressures. The statistical patterns observed among the studied unrest episodes, and between them and the eruptive episode of 2011-2012, indicate that the response of the host rock to the deformation imposed by magma intrusion did not differ significantly from one episode to the other, suggesting that no significant local stress changes induced by magma intrusion occurred. Therefore, although the studied unrest episodes were caused by intrusions of magma at different depths and locations below El Hierro island, the mechanical response of the lithosphere was similar in all cases. This suggests that the reason why the first unrest culminated in an eruption while the others did not may be related to the regional and local tectonics acting at that moment, rather than to the forcefulness of the magma intrusion.
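
    Two of the four descriptors are straightforward to compute from an earthquake catalogue: the maximum-likelihood (Aki) estimate of the Gutenberg-Richter b-value and the coefficient of variation of interevent times. A sketch; the completeness magnitude and the synthetic catalogue are assumptions for illustration:

    ```python
    import numpy as np

    def b_value_aki(magnitudes, mc, dm=0.0):
        """Maximum-likelihood Gutenberg-Richter b-value (Aki 1965).
        Set dm to the magnitude binning width for Utsu's correction."""
        m = magnitudes[magnitudes >= mc]
        return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

    def local_cv(event_times):
        """Coefficient of variation of interevent times:
        ~1 for Poissonian seismicity, >1 for clustered sequences."""
        dt = np.diff(np.sort(event_times))
        return dt.std() / dt.mean()

    rng = np.random.default_rng(1)
    # Synthetic catalogue (illustrative): exponential magnitudes above Mc = 1.0,
    # so the true b-value is 1.0 by construction
    mags = 1.0 + rng.exponential(scale=1.0 / np.log(10), size=2000)
    times = np.cumsum(rng.exponential(scale=0.5, size=2000))  # days, Poissonian

    print(f"b-value = {b_value_aki(mags, mc=1.0):.2f}")       # ~1.0
    print(f"CV of interevent times = {local_cv(times):.2f}")  # ~1.0
    ```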

  17. Statistical optimization of beta-carotene production by Arthrobacter agilis A17 using response surface methodology and Box-Behnken design

    Science.gov (United States)

    Özdal, Murat; Özdal, Özlem Gür; Gürkök, Sümeyra

    2017-04-01

    β-carotene is a commercially important natural pigment and has been widely applied in the medicine, pharmaceutical, food, feed and cosmetic industries. The current study aimed to investigate the usability of molasses for β-carotene production by Arthrobacter agilis A17 (KP318146) and to optimize the production process. A Box-Behnken Design of Response Surface Methodology was used to determine the optimum levels and the interactions of three independent variables, namely molasses, yeast extract and KH2PO4, at three different levels. The β-carotene yield in the optimized medium containing 70 g/l molasses, 25 g/l yeast extract and 0.96 g/l KH2PO4 reached up to 100 mg/l, approximately 2.5-fold higher than the yield obtained from the control cultivation. Remarkable β-carotene production on an inexpensive carbon source was achieved with the use of statistical optimization.
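
    The Box-Behnken workflow amounts to running a small three-level design and fitting a second-order polynomial response surface to the results. A minimal sketch with numpy; the design is the standard three-factor Box-Behnken layout, but the yields are invented for illustration, not the study's measurements:

    ```python
    import itertools
    import numpy as np

    # Box-Behnken design for 3 factors: all (+/-1, +/-1) pairs on each pair of
    # factors with the third held at 0, plus replicated centre points
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            x = [0, 0, 0]
            x[i], x[j] = a, b
            runs.append(x)
    runs += [[0, 0, 0]] * 3
    X = np.array(runs, dtype=float)  # 15 runs in coded levels

    # Hypothetical beta-carotene yields (mg/l) for the 15 runs -- illustrative only
    y = np.array([62, 70, 68, 81, 55, 74, 60, 88, 58, 66, 72, 90, 99, 101, 100.])

    # Second-order model: intercept, linear, interaction and quadratic terms
    def features(X):
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1 * x2, x1 * x3, x2 * x3,
                                x1 ** 2, x2 ** 2, x3 ** 2])

    beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)
    print("fitted response-surface coefficients:", np.round(beta, 2))
    ```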

  18. Fabrication of Hyperbranched Block-Statistical Copolymer-Based Prodrug with Dual Sensitivities for Controlled Release.

    Science.gov (United States)

    Zheng, Luping; Wang, Yunfei; Zhang, Xianshuo; Ma, Liwei; Wang, Baoyan; Ji, Xiangling; Wei, Hua

    2018-01-17

    Dendrimers, with their hyperbranched structure and multivalent surface, are regarded as among the most promising candidates for ideal drug delivery systems, but the clinical translation and scale-up production of dendrimers have been hampered significantly by synthetic difficulties. There is therefore considerable scope for the development of novel hyperbranched polymers that not only address the drawbacks of dendrimers but also maintain their advantages. The reversible addition-fragmentation chain transfer self-condensing vinyl polymerization (RAFT-SCVP) technique has enabled facile preparation of segmented hyperbranched polymers (SHPs) using chain transfer monomer (CTM)-based double-head agents during the past decade. Meanwhile, the design and development of block-statistical copolymers has been proven in our recent studies to be a simple yet effective way to address the dilemma of extracellular stability versus high intracellular delivery efficacy. To integrate the advantages of both hyperbranched and block-statistical structures, we herein report the fabrication of a hyperbranched block-statistical copolymer-based prodrug with pH and reduction dual sensitivities using RAFT-SCVP and post-polymerization click coupling. The external homo oligo(ethylene glycol methyl ether methacrylate) (OEGMA) block provides sufficient extracellular colloidal stability for the nanocarriers by steric hindrance, and the interior OEGMA units incorporated by statistical copolymerization promote intracellular drug release by facilitating the permeation of GSH and H+ for the cleavage of the reduction-responsive disulfide bond and the pH-labile carbonate link, as well as by weakening the hydrophobic encapsulation of drug molecules. The delivery efficacy of the target hyperbranched block-statistical copolymer-based prodrug was evaluated in terms of in vitro drug release and cytotoxicity studies, which confirm both acidic pH- and reduction-triggered drug release for inhibiting proliferation of He

  19. Significance of fluid-structure interaction phenomena for containment response to ex-vessel steam explosions

    Energy Technology Data Exchange (ETDEWEB)

    Almstroem, H.; Sundel, T. [National Defence Research Establishment, Stockholm (Sweden); Frid, W.; Engelbrektson, A.

    1998-01-01

    When studying the structural response of a containment building to ex-vessel steam explosion loads, a two-step procedure is often used. In the first step of this procedure the structures are treated as rigid and the pressure-time history generated by the explosion at the rigid wall is calculated. In the second step the calculated pressure is applied to the structures. The obvious weakness of the two-step procedure is that it does not correspond to the real dynamic behaviour of the fluid-structure system. The purpose of this paper is to identify and evaluate the relevant fluid-structure interaction phenomena. This is achieved through direct treatment of the explosion process and the structural response. The predictions of a direct and two-step treatment are compared for a BWR Mark II containment design, consisting of two concentric walls interacting with water masses in the central and annular pools. It is shown that the two-step approach leads to unrealistic energy transfer in the containment system studied, and to significant overestimation of the deflection of the containment wall. As regards the pedestal wall, the direct method analysis shows that the flexibility of this wall affects the pressure-time history considerably. Three load types have been identified for this wall namely shock load, water blow as a result of water cavitation, and hydrodynamic load. Reloading impulse due to cavitation phenomena plays an important role as it amounts to about 40% of the total impulse load. Investigation of the generality of the cavitation phenomena in the context of ex-vessel steam explosion loads was outside the scope of this work. (author)

  20. Significance of fluid-structure interaction phenomena for containment response to ex-vessel steam explosions

    Energy Technology Data Exchange (ETDEWEB)

    Almstroem, H.; Sundel, T. (Nat. Defence Res. Establ., Tumba (Sweden)); Frid, W. (Swedish Nuclear Power Inspectorate, SE-10658, Stockholm (Sweden)); Engelbrektson, A. (VBB/SWECO, Box 34044, SE-10026, Stockholm (Sweden))

    1999-05-01

    When studying the structural response of a containment building to ex-vessel steam explosion loads, a two-step procedure is often used. In the first step of this procedure the structures are treated as rigid and the pressure-time history generated by the explosion, at the rigid wall, is calculated. In the second step the calculated pressure is applied to the structures. The obvious weakness of the two-step procedure is that it does not correspond to the real dynamic behaviour of the fluid-structure system. The purpose of this paper is to identify and evaluate the relevant fluid-structure interaction phenomena. This is achieved through direct treatment of the explosion process and the structural response. The predictions of a direct and two-step treatment are compared for a BWR Mark II containment design, consisting of two concentric walls interacting with water masses in the central and annular pools. It is shown that the two-step approach leads to unrealistic energy transfer in the containment system studied and to significant overestimation of the deflection of the containment wall. As regards the pedestal wall, the direct method analysis shows that the flexibility of this wall affects the pressure-time history considerably. Three load types have been identified for this wall namely shock load, water blow as a result of water cavitation, and hydrodynamic load. Reloading impulse due to cavitation phenomena plays an important role as it amounts to ~40% of the total impulse load. Investigation of the generality of the cavitation phenomena in the context of ex-vessel steam explosion loads was outside the scope of this work. (orig.) 5 refs.

  1. Significance of fluid-structure interaction phenomena for containment response to ex-vessel steam explosions

    International Nuclear Information System (INIS)

    Almstroem, H.; Sundel, T.; Frid, W.; Engelbrektson, A.

    1999-01-01

    When studying the structural response of a containment building to ex-vessel steam explosion loads, a two-step procedure is often used. In the first step of this procedure the structures are treated as rigid and the pressure-time history generated by the explosion, at the rigid wall, is calculated. In the second step the calculated pressure is applied to the structures. The obvious weakness of the two-step procedure is that it does not correspond to the real dynamic behaviour of the fluid-structure system. The purpose of this paper is to identify and evaluate the relevant fluid-structure interaction phenomena. This is achieved through direct treatment of the explosion process and the structural response. The predictions of a direct and two-step treatment are compared for a BWR Mark II containment design, consisting of two concentric walls interacting with water masses in the central and annular pools. It is shown that the two-step approach leads to unrealistic energy transfer in the containment system studied and to significant overestimation of the deflection of the containment wall. As regards the pedestal wall, the direct method analysis shows that the flexibility of this wall affects the pressure-time history considerably. Three load types have been identified for this wall namely shock load, water blow as a result of water cavitation, and hydrodynamic load. Reloading impulse due to cavitation phenomena plays an important role as it amounts to ∼40% of the total impulse load. Investigation of the generality of the cavitation phenomena in the context of ex-vessel steam explosion loads was outside the scope of this work. (orig.)

  2. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou

  3. A novel statistic for genome-wide interaction analysis.

    Directory of Open Access Journals (Sweden)

    Xuesen Wu

    2010-09-01

    Although great progress in genome-wide association studies (GWAS) has been made, the significant SNP associations identified by GWAS account for only a few percent of the genetic variance, leading many to question where and how we can find the missing heritability. There is increasing interest in genome-wide interaction analysis as a possible source of the heritability unexplained by current GWAS. However, the existing statistics for testing interaction have low power for genome-wide interaction analysis. To meet the challenges raised by genome-wide interaction analysis, we have developed a novel statistic for testing interaction between two loci (either linked or unlinked). The null distribution and the type I error rates of the new statistic for testing interaction are validated using simulations. Extensive power studies show that the developed statistic has much higher power to detect interaction than classical logistic regression. The results identified 44 and 211 pairs of SNPs showing significant evidence of interaction with FDR < 0.001 and 0.001 ≤ FDR < 0.05, respectively. The new statistic is able to search for significant interactions between SNPs across the genome. Real data analysis showed that the results of genome-wide interaction analysis can be replicated in two independent studies.

  4. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.

  5. Statistical Power in Plant Pathology Research.

    Science.gov (United States)

    Gent, David H; Esker, Paul D; Kriss, Alissa B

    2018-01-01

    In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
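
    The relationships this record describes (power as a function of effect size, α, and sample size) can be explored directly. A sketch using statsmodels' power routines for a two-sample t-test; the effect size and sample sizes are illustrative choices:

    ```python
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    # Sample size per group needed to detect a medium effect (d = 0.5)
    # at alpha = 0.05 with the conventional 0.8 power threshold
    n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
    print(f"n per group for power 0.8: {n_per_group:.1f}")  # ~64

    # Achieved power of an underpowered design: n = 15 per group
    power = analysis.solve_power(effect_size=0.5, alpha=0.05, nobs1=15)
    print(f"power with n = 15 per group: {power:.2f}")  # well below 0.5
    ```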

  6. Non-response to sad mood induction: implications for emotion research.

    Science.gov (United States)

    Rottenberg, Jonathan; Kovacs, Maria; Yaroslavsky, Ilya

    2018-05-01

    Experimental induction of sad mood states is a mainstay of laboratory research on affect and cognition, mood regulation, and mood disorders. Typically, the success of such mood manipulations is reported as a statistically significant pre- to post-induction change in the self-rated intensity of the target affect. The present commentary was motivated by an unexpected finding in one of our studies concerning the response rate to a well-validated sad mood induction. Using the customary statistical approach, we found a significant mean increase in self-rated sadness intensity with a moderate effect size, verifying the "success" of the mood induction. However, that "success" masked that, between one-fifth and about one-third of our samples (adolescents who had histories of childhood-onset major depressive disorder and healthy controls) reported absolutely no sadness in response to the mood induction procedure. We consider implications of our experience for emotion research by (1) commenting upon the typically overlooked phenomenon of nonresponse, (2) suggesting changes in reporting practices regarding mood induction success, and (3) outlining future directions to help scientists determine why some subjects do not respond to experimental mood induction.

  7. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
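
    A POD curve of the kind described (method response as a continuous function of concentration) is commonly modeled with a binomial GLM with a logit link; this sketch follows that generic approach rather than the paper's exact formulation, and the detection counts are invented:

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical validation data: analyte level, replicates, detections
    level = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # e.g., CFU/g (assumed units)
    trials = np.array([12, 12, 12, 12, 12])
    detected = np.array([2, 5, 9, 11, 12])

    # Binomial GLM with logit link: POD as a continuous function of log level
    X = sm.add_constant(np.log(level))
    model = sm.GLM(np.column_stack([detected, trials - detected]), X,
                   family=sm.families.Binomial()).fit()

    # Predicted POD across the working range
    grid = np.linspace(0.5, 8.0, 5)
    pod = model.predict(sm.add_constant(np.log(grid)))
    for c, p in zip(grid, pod):
        print(f"level {c:4.2f}: POD = {p:.2f}")
    ```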

  8. Nutritional factors as predictors of response to radio-chemotherapy and survival in unresectable squamous head and neck carcinoma

    International Nuclear Information System (INIS)

    Salas, Sebastien; Deville, Jean-Laurent; Giorgi, Roch; Pignon, Thierry; Bagarry, Danielle; Barrau, Karine; Zanaret, Michel; Giovanni, Antoine; Bourgeois, Aude; Favre, Roger; Duffaud, Florence

    2008-01-01

    Background and purpose: This study sought to evaluate nutritional prognostic factors before treatment in patients with unresectable head and neck cancer treated by concomitant radio-chemotherapy. Methods and materials: Seventy-two consecutive patients were treated. We studied the potential effects of CRP, Alb, preAlb, orosomucoid, weight, weight history, BMI, PINI, OPR and NRI on response to treatment, Event-Free Survival (EFS) and Overall Survival (OS). Effects of potential risk factors on OS and on EFS were analyzed by computing Kaplan-Meier estimates, and curves were compared using the log-rank test. Results: All biological nutritional factors were statistically correlated with the response to radio-chemotherapy. In multivariate analysis, only CRP (p = 0.004) remained statistically significant. A statistical correlation was found between Alb and EFS in multivariate analysis (p = 0.04). The factors influencing OS in univariate analysis were Alb (p = 0.008), CRP (p = 0.004), orosomucoid (p = 0.01) and NRI (p = 0.01), response to radio-chemotherapy (p < 0.001) and staging (p = 0.04). In multivariate analysis, only the response to radio-chemotherapy (p < 0.001) remained significant. Conclusions: This study illustrates the prognostic value of nutritional status. CRP and Alb may be useful in the assessment of advanced head and neck cancer patients at diagnosis and for stratifying patients taking part in randomized trials
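
    The survival workflow used here (Kaplan-Meier estimates compared by the log-rank test) looks like this with Python's lifelines package. The durations and event flags below are invented placeholders for two strata (e.g., low versus high baseline CRP), not the study's data:

    ```python
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(7)
    # Hypothetical overall-survival data in months -- illustrative values only
    t_low = rng.exponential(24, 40)    # durations, low-CRP group
    t_high = rng.exponential(12, 32)   # durations, high-CRP group
    e_low = rng.random(40) < 0.8       # True = death observed, False = censored
    e_high = rng.random(32) < 0.8

    kmf = KaplanMeierFitter()
    kmf.fit(t_low, event_observed=e_low, label="low CRP")
    print(f"median OS (low CRP): {kmf.median_survival_time_:.1f} months")

    # Log-rank comparison of the two survival curves
    result = logrank_test(t_low, t_high,
                          event_observed_A=e_low, event_observed_B=e_high)
    print(f"log-rank p = {result.p_value:.4f}")
    ```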

  9. Statistical inference for template aging

    Science.gov (United States)

    Schuckers, Michael E.

    2006-04-01

    A change in classification error rates for a biometric device is often referred to as template aging. Here we offer two methods for determining whether the effect of time is statistically significant. The first is the use of a generalized linear model to determine whether these error rates change linearly over time; this approach generalizes previous work assessing the impact of covariates using generalized linear models. The second approach uses likelihood ratio test methodology. The focus here is on statistical methods for estimation, not on the underlying cause of the change in error rates over time. These methodologies are applied to data from the National Institute of Standards and Technology Biometric Score Set Release 1. The results of these applications are discussed.
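
    Both approaches can be sketched with statsmodels: fit a binomial GLM of error rate on time, then compare nested models with a likelihood ratio test. The monthly error counts below are hypothetical, not the NIST data:

    ```python
    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import chi2

    # Hypothetical match-attempt data: months since enrollment, false-reject
    # counts, and attempts per month (illustrative only)
    months = np.arange(1, 13, dtype=float)
    attempts = np.full(12, 500)
    errors = np.array([11, 12, 14, 13, 16, 15, 18, 17, 20, 19, 22, 23])

    endog = np.column_stack([errors, attempts - errors])

    # Full model: error rate depends linearly (on the logit scale) on time
    full = sm.GLM(endog, sm.add_constant(months),
                  family=sm.families.Binomial()).fit()
    # Null model: constant error rate over time
    null = sm.GLM(endog, np.ones((12, 1)),
                  family=sm.families.Binomial()).fit()

    # Likelihood ratio test for a time effect (1 degree of freedom)
    lr = 2 * (full.llf - null.llf)
    print(f"LR = {lr:.2f}, p = {chi2.sf(lr, df=1):.4g}")
    print(f"slope (logit scale) = {full.params[1]:.4f}")
    ```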

  10. Testing statistical self-similarity in the topology of river networks

    Science.gov (United States)

    Troutman, Brent M.; Mantilla, Ricardo; Gupta, Vijay K.

    2010-01-01

    Recent work has demonstrated that the topological properties of real river networks deviate significantly from predictions of Shreve's random model. At the same time the property of mean self-similarity postulated by Tokunaga's model is well supported by data. Recently, a new class of network model called random self-similar networks (RSN) that combines self-similarity and randomness has been introduced to replicate important topological features observed in real river networks. We investigate if the hypothesis of statistical self-similarity in the RSN model is supported by data on a set of 30 basins located across the continental United States that encompass a wide range of hydroclimatic variability. We demonstrate that the generators of the RSN model obey a geometric distribution, and self-similarity holds in a statistical sense in 26 of these 30 basins. The parameters describing the distribution of interior and exterior generators are tested to be statistically different and the difference is shown to produce the well-known Hack's law. The inter-basin variability of RSN parameters is found to be statistically significant. We also test generator dependence on two climatic indices, mean annual precipitation and radiative index of dryness. Some indication of climatic influence on the generators is detected, but this influence is not statistically significant with the sample size available. Finally, two key applications of the RSN model to hydrology and geomorphology are briefly discussed.

  11. Gene cluster statistics with gene families.

    Science.gov (United States)

    Raghupathy, Narayanan; Durand, Dannie

    2009-05-01

    Identifying genomic regions that descended from a common ancestor is important for understanding the function and evolution of genomes. In distantly related genomes, clusters of homologous gene pairs are evidence of candidate homologous regions. Demonstrating the statistical significance of such "gene clusters" is an essential component of comparative genomic analyses. However, currently there are no practical statistical tests for gene clusters that model the influence of the number of homologs in each gene family on cluster significance. In this work, we demonstrate empirically that failure to incorporate gene family size in gene cluster statistics results in overestimation of significance, leading to incorrect conclusions. We further present novel analytical methods for estimating gene cluster significance that take gene family size into account. Our methods do not require complete genome data and are suitable for testing individual clusters found in local regions, such as contigs in an unfinished assembly. We consider pairs of regions drawn from the same genome (paralogous clusters), as well as regions drawn from two different genomes (orthologous clusters). Determining cluster significance under general models of gene family size is computationally intractable. By assuming that all gene families are of equal size, we obtain analytical expressions that allow fast approximation of cluster probabilities. We evaluate the accuracy of this approximation by comparing the resulting gene cluster probabilities with cluster probabilities obtained by simulating a realistic, power-law distributed model of gene family size, with parameters inferred from genomic data. Surprisingly, despite the simplicity of the underlying assumption, our method accurately approximates the true cluster probabilities. It slightly overestimates these probabilities, yielding a conservative test. We present additional simulation results indicating the best choice of parameter values for data

  12. Understanding Short-Term Nonmigrating Tidal Variability in the Ionospheric Dynamo Region from SABER Using Information Theory and Bayesian Statistics

    Science.gov (United States)

    Kumari, K.; Oberheide, J.

    2017-12-01

    Nonmigrating tidal diagnostics of SABER temperature observations in the ionospheric dynamo region reveal a large amount of variability on time scales of a few days to weeks. In this paper, we discuss the physical reasons for the observed short-term tidal variability using a novel approach based on information theory and Bayesian statistics. We diagnose short-term tidal variability as a function of season, QBO, ENSO, solar cycle and other drivers using time-dependent probability density functions, Shannon entropy and Kullback-Leibler divergence. The statistical significance of the approach and its predictive capability are exemplified using SABER tidal diagnostics, with emphasis on the responses to the QBO and solar cycle. Implications for F-region plasma density will be discussed.
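
    The two information-theoretic quantities named here are one-liners once a diagnostic is binned into probability distributions. A sketch with scipy; the amplitude samples are synthetic stand-ins for SABER tidal diagnostics under two assumed driver states:

    ```python
    import numpy as np
    from scipy.stats import entropy

    rng = np.random.default_rng(3)
    # Synthetic stand-ins for a tidal amplitude diagnostic under two conditions
    # (e.g., opposite QBO phases) -- illustrative only
    amp_east = rng.normal(10.0, 2.0, 5000)
    amp_west = rng.normal(12.0, 3.0, 5000)

    # Common bins, then normalized histograms as probability distributions
    bins = np.linspace(0, 25, 26)
    p = np.histogram(amp_east, bins=bins)[0].astype(float) + 1e-12  # floor avoids log(0)
    q = np.histogram(amp_west, bins=bins)[0].astype(float) + 1e-12
    p /= p.sum()
    q /= q.sum()

    print(f"Shannon entropy (east): {entropy(p, base=2):.2f} bits")
    print(f"Kullback-Leibler divergence D(p||q): {entropy(p, q, base=2):.2f} bits")
    ```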

  13. Statistical versus Musical Significance: Commentary on Leigh VanHandel's 'National Metrical Types in Nineteenth Century Art Song'

    Directory of Open Access Journals (Sweden)

    Justin London

    2010-01-01

    In “National Metrical Types in Nineteenth Century Art Song” Leigh Van Handel gives a sympathetic critique of William Rothstein’s claim that in western classical music of the late 18th and 19th centuries there are discernible differences in the phrasing and metrical practice of German versus French and Italian composers. This commentary (a) examines just what Rothstein means in terms of his proposed metrical typology, (b) questions Van Handel on how she has applied it to a purely melodic framework, (c) amplifies Van Handel’s critique of Rothstein, and then (d) concludes with a rumination on the reach of quantitative (i.e., statistically driven) versus qualitative claims regarding such things as “national metrical types.”

  14. Statistical characterization report for Single-Shell Tank 241-T-107

    International Nuclear Information System (INIS)

    Cromar, R.D.; Wilmarth, S.R.; Jensen, L.

    1994-01-01

    This report contains the results of the statistical analysis of data from three core samples obtained from single-shell tank 241-T-107 (T-107). Four specific topics are addressed. They are summarized below. Section 3.0 contains mean concentration estimates of analytes found in T-107. The estimates of "error" associated with the concentration estimates are given as 95% confidence intervals (CI) on the mean. The results given are based on three types of samples: core composite samples, core segment samples, and drainable liquid samples. Section 4.0 contains estimates of the spatial variability (variability between cores and between segments) and the analytical variability (variability between the primary and the duplicate analysis). Statistical tests were performed to test the hypothesis that the between cores and the between segments spatial variability is zero. The results of the tests are as follows. Based on the core composite data, the between cores variance is significantly different from zero for 35 out of 74 analytes; i.e., for 53% of the analytes there is no statistically significant difference between the concentration means for two cores. Based on core segment data, the between segments variance is significantly different from zero for 22 out of 24 analytes and the between cores variance is significantly different from zero for 4 out of 24 analytes; i.e., for 8% of the analytes there is no statistically significant difference between segment means and for 83% of the analytes there is no difference between the means from the three cores. Section 5.0 contains the results of the application of multiple comparison methods to the core composite data, the core segment data, and the drainable liquid data. Section 6.0 contains the results of a statistical test conducted to determine the 222-S Analytical Laboratory's ability to homogenize solid core segments

  15. Statistics in Schools

    Science.gov (United States)

    Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data. Explore the site for standards-aligned, classroom-ready activities.

  16. Significance of chronic toxoplasmosis in epidemiology of road traffic accidents in Russian Federation.

    Science.gov (United States)

    Stepanova, Ekaterina V; Kondrashin, Anatoly V; Sergiev, Vladimir P; Morozova, Lola F; Turbabina, Natalia A; Maksimova, Maria S; Brazhnikov, Alexey I; Shevchenko, Sergei B; Morozov, Evgeny N

    2017-01-01

    Studies carried out in Moscow residents have revealed that the prevalence of chronic toxoplasmosis is very close to that in countries of Eastern and Central Europe. Our findings also demonstrated a statistically significant relationship between the rate of traffic accidents and the seroprevalence of chronic toxoplasmosis in drivers who were held responsible for accidents; the latter was 2.37 times higher in drivers who were involved in road accidents compared with control groups. These results suggest that the consequences of chronic toxoplasmosis (particularly a slower reaction time and decreased concentration) might contribute to the peculiarities of the epidemiology of road traffic accidents in the Russian Federation and might interfere with the successful implementation of the Federal Programme "Increase road traffic safety". Suggestions for how to overcome this problem are discussed in this paper.

  17. Dose-response analysis using R

    DEFF Research Database (Denmark)

    Ritz, Christian; Baty, Florent; Streibig, Jens Carl

    2015-01-01

    Dose-response analysis can be carried out using multi-purpose commercial statistical software, but except for a few special cases the analysis easily becomes cumbersome as relevant, non-standard output requires manual programming. The extension package drc for the statistical environment R provides...
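
    The drc package fits parametric dose-response models such as the four-parameter log-logistic in R. The same kind of model can be sketched in Python with scipy; the parametrization below is analogous to (not identical with) drc's LL.4(), and the dose-response data are simulated for illustration:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def ll4(dose, lower, upper, ed50, slope):
        """Four-parameter log-logistic dose-response model (decreasing curve)."""
        return lower + (upper - lower) / (1.0 + (dose / ed50) ** slope)

    rng = np.random.default_rng(5)
    dose = np.array([0.1, 0.3, 1, 3, 10, 30, 100], dtype=float)
    # Simulated responses around a known curve, plus noise (illustrative only)
    resp = ll4(dose, 5, 100, 4.0, 1.2) + rng.normal(0, 3, dose.size)

    params, _ = curve_fit(ll4, dose, resp, p0=[0, 100, 5, 1])
    lower, upper, ed50, slope = params
    print(f"ED50 = {ed50:.2f}, slope = {slope:.2f}")
    ```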

  18. Cluster size statistic and cluster mass statistic: two novel methods for identifying changes in functional connectivity between groups or conditions.

    Science.gov (United States)

    Ing, Alex; Schwarzbauer, Christian

    2014-01-01

    Functional connectivity has become an increasingly important area of research in recent years. At a typical spatial resolution, approximately 300 million connections link each voxel in the brain with every other. This pattern of connectivity is known as the functional connectome. Connectivity is often compared between experimental groups and conditions. Standard methods used to control the type 1 error rate are likely to be insensitive when comparisons are carried out across the whole connectome, due to the huge number of statistical tests involved. To address this problem, two new cluster-based methods, the cluster size statistic (CSS) and the cluster mass statistic (CMS), are introduced to control the family-wise error rate across all connectivity values. These methods operate within a statistical framework similar to the cluster-based methods used in conventional task-based fMRI. Both methods are data driven, permutation based and require minimal statistical assumptions. Here, the performance of each procedure is evaluated in a receiver operator characteristic (ROC) analysis, utilising a simulated dataset. The relative sensitivity of each method is also tested on real data: BOLD (blood oxygen level dependent) fMRI scans were carried out on twelve subjects under normal conditions and during the hypercapnic state (induced through the inhalation of 6% CO2 in 21% O2 and 73% N2). Both CSS and CMS detected significant changes in connectivity between normal and hypercapnic states. A family-wise error correction carried out at the individual connection level exhibited no significant changes in connectivity.

  19. The response of thrombosis in the portal vein or hepatic vein in hepatocellular carcinoma to radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Bong Kyung; Kim, Jae Chul [Dept. of Radiation Oncology, Kyungpook National University School of Medicine, Daegu (Korea, Republic of)

    2016-09-15

    The purpose of the current study is to evaluate the response of patients with portal vein thrombosis (PVT) or hepatic vein thrombosis (HVT) in hepatocellular carcinoma (HCC) treated with three-dimensional conformal radiation therapy (3D-CRT), and to evaluate patient survival and potential prognostic factors for survival. Forty-seven patients with PVT or HVT in HCC, referred to our department for radiotherapy, were retrospectively reviewed. For 3D-CRT plans, the gross tumor volume (GTV) was defined as a hypodense filling-defect area in the portal vein (PV) or hepatic vein (HV). Survival of patients and response to radiation therapy (RT) were analyzed, and potential prognostic factors for survival and response to RT were evaluated. The median survival time of the 47 patients was 8 months, with a 1-year survival rate of 15% and a response rate of 40%. Changes in Child-Pugh score, response to RT, Eastern Cooperative Oncology Group performance status (ECOG PS), hepatitis C antibody (HCVAb) positivity, and additional post-RT treatment were statistically significant prognostic factors for survival in univariate analysis (p = 0.000, p = 0.018, p = 0.000, p = 0.013, and p = 0.047, respectively). Of these factors, changes in Child-Pugh score and response to RT remained significant for patient prognosis in multivariate analysis (p = 0.001 and p = 0.035, respectively). RT could constitute a reasonable treatment option for patients with PVT or HVT in HCC, with acceptable toxicity. Changes in Child-Pugh score and response to RT were statistically significant factors for patient survival.

  20. Statistical Damage Detection of Civil Engineering Structures using ARMAV Models

    DEFF Research Database (Denmark)

    Andersen, P.; Kirkegaard, Poul Henning

    In this paper a statistically based damage detection of a lattice steel mast is performed. By estimating the modal parameters and their uncertainties it is possible to detect whether some of the modal parameters have changed with statistical significance. The estimation of the uncertainties ...

  1. Clinicopathological significance of c-MYC in esophageal squamous cell carcinoma.

    Science.gov (United States)

    Lian, Yu; Niu, Xiangdong; Cai, Hui; Yang, Xiaojun; Ma, Haizhong; Ma, Shixun; Zhang, Yupeng; Chen, Yifeng

    2017-07-01

    Esophageal squamous cell carcinoma is one of the most common malignant tumors. The oncogene c-MYC is thought to be important in the initiation, promotion, and therapy resistance of cancer. In this study, we aimed to investigate the clinicopathologic roles of c-MYC in esophageal squamous cell carcinoma tissue by discovering and analyzing c-MYC expression in a series of human esophageal tissues. A total of 95 esophageal squamous cell carcinoma samples were analyzed by western blotting and immunohistochemistry. Correlation of c-MYC expression with clinicopathological features of esophageal squamous cell carcinoma patients was then statistically analyzed. In most esophageal squamous cell carcinoma cases, c-MYC expression was positive in tumor tissues. The positive rate of c-MYC expression in tumor tissues was 61.05%, clearly higher than in the adjacent normal tissues (8.42%, 8/92) and atypical hyperplasia tissues (19.75%, 16/95); the difference among adjacent normal tissues, atypical hyperplasia tissues, and tumor tissues was statistically significant. Overexpression of c-MYC was detected in 61.05% (58/95) of esophageal squamous cell carcinomas and was significantly correlated with the degree of differentiation (p = 0.004). The positive rate of c-MYC expression was 40.0% in well-differentiated esophageal tissues, a statistically significant difference (p = 0.004). The positive rate of c-MYC was 41.5% in T1 + T2 esophageal tissues and 74.1% in T3 + T4 esophageal tissues, a statistically significant difference (p = 0.001). The positive rate of c-MYC was 45.0% in stage I + II esophageal tissues and 72.2% in stage III + IV esophageal tissues, a statistically significant difference (p = 0.011). c-MYC expression strongly correlated with clinical staging (p = 0.011), differentiation degree (p = 0.004), lymph node metastasis (p = 0.003), and invasion depth (p = 0.001) of patients with esophageal squamous cell carcinoma. The c-MYC was

  2. Optimization of significant insolation distribution parameters - A new approach towards BIPV system design

    Energy Technology Data Exchange (ETDEWEB)

    Paul, D. [SSBB and Senior Member-ASQ, Kolkata (India); Mandal, S.N. [Kalyani Govt Engg College, Kalyani (India); Mukherjee, D.; Bhadra Chaudhuri, S.R. [Dept of E. and T. C. Engg, B.E.S.U., Shibpur (India)

    2010-10-15

    System efficiency and payback time are yet to attain a commercially viable level for solar photovoltaic energy projects. Despite huge development in the prediction of solar radiation data, there is a gap in the extraction of pertinent information from such data, so the available data cannot be effectively utilized for engineering application. This is acting as a barrier for the emerging technology. For making accurate engineering and financial calculations for any solar energy project, it is crucial to identify and optimize the most significant statistic(s) representing the insolation available to the photovoltaic setup at the installation site. The Quality Function Deployment (QFD) technique has been applied to identify the statistic(s) of high significance from a project designer's point of view. A MATLAB™ program has been used to build the annual frequency distribution of hourly insolation over any module plane at a given location, and descriptive statistical analysis of such distributions is done in MINITAB™. For a Building Integrated Photovoltaic (BIPV) installation, similar statistical analysis has been carried out for the composite frequency distribution, formed by weighted summation of the insolation distributions for the different module planes used in the installation. The most influential statistic(s) of the composite distribution have been optimized through Artificial Neural Network computation. This approach is expected to open up a new horizon in BIPV system design. (author)

  3. PRIS-STATISTICS: Power Reactor Information System Statistical Reports. User's Manual

    International Nuclear Information System (INIS)

    2013-01-01

    The IAEA developed the Power Reactor Information System (PRIS)-Statistics application to assist PRIS end users with generating statistical reports from PRIS data. Statistical reports provide an overview of the status, specification and performance results of every nuclear power reactor in the world. This user's manual was prepared to facilitate the use of the PRIS-Statistics application and to provide guidelines and detailed information for each report in the application. Statistical reports support analyses of nuclear power development and strategies, and the evaluation of nuclear power plant performance. The PRIS database can be used for comprehensive trend analyses and benchmarking against best performers and industrial standards.

  4. Adaptive statistical iterative reconstruction-applied ultra-low-dose CT with radiography- comparable radiation dose: Usefulness for lung nodule detection

    International Nuclear Information System (INIS)

    Yoon, Hyun Jung; Chung, Myung Jin; Hwang, Hye Sun; Lee, Kyung Soo; Moon, Jung Won

    2015-01-01

    To assess the performance of adaptive statistical iterative reconstruction (ASIR)-applied ultra-low-dose CT (ULDCT) in detecting small lung nodules. Thirty patients underwent both ULDCT and standard dose CT (SCT). After determining the reference standard nodules, five observers, blinded to the reference standard reading results, independently evaluated SCT and both subsets of ASIR- and filtered back projection (FBP)-driven ULDCT images. Data assessed by observers were compared statistically. Converted effective doses in SCT and ULDCT were 2.81 ± 0.92 and 0.17 ± 0.02 mSv, respectively. A total of 114 lung nodules were detected on SCT as a standard reference. There was no statistically significant difference in sensitivity between ASIR-driven ULDCT and SCT for three out of the five observers (p = 0.678, 0.735, < 0.01, 0.038, and < 0.868 for observers 1, 2, 3, 4, and 5, respectively). The sensitivity of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT in three out of the five observers (p < 0.01 for three observers, and p = 0.064 and 0.146 for two observers). In jackknife alternative free-response receiver operating characteristic analysis, the mean values of figure-of-merit (FOM) for FBP, ASIR-driven ULDCT, and SCT were 0.682, 0.772, and 0.821, respectively, and there were no significant differences in FOM values between ASIR-driven ULDCT and SCT (p = 0.11), but the FOM value of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT and SCT (p = 0.01 and 0.00). Adaptive statistical iterative reconstruction-driven ULDCT delivering a radiation dose of only 0.17 mSv offers acceptable sensitivity in nodule detection compared with SCT and has better performance than FBP-driven ULDCT

  5. Adaptive statistical iterative reconstruction-applied ultra-low-dose CT with radiography- comparable radiation dose: Usefulness for lung nodule detection

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Hyun Jung; Chung, Myung Jin; Hwang, Hye Sun; Lee, Kyung Soo [Dept. of Radiology and Center for Imaging Science, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of); Moon, Jung Won [Dept. of Radiology, Kangbuk Samsung Hospital, Seoul (Korea, Republic of)

    2015-10-15

    To assess the performance of adaptive statistical iterative reconstruction (ASIR)-applied ultra-low-dose CT (ULDCT) in detecting small lung nodules. Thirty patients underwent both ULDCT and standard dose CT (SCT). After determining the reference standard nodules, five observers, blinded to the reference standard reading results, independently evaluated SCT and both subsets of ASIR- and filtered back projection (FBP)-driven ULDCT images. Data assessed by observers were compared statistically. Converted effective doses in SCT and ULDCT were 2.81 ± 0.92 and 0.17 ± 0.02 mSv, respectively. A total of 114 lung nodules were detected on SCT as a standard reference. There was no statistically significant difference in sensitivity between ASIR-driven ULDCT and SCT for three out of the five observers (p = 0.678, 0.735, < 0.01, 0.038, and < 0.868 for observers 1, 2, 3, 4, and 5, respectively). The sensitivity of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT in three out of the five observers (p < 0.01 for three observers, and p = 0.064 and 0.146 for two observers). In jackknife alternative free-response receiver operating characteristic analysis, the mean values of figure-of-merit (FOM) for FBP, ASIR-driven ULDCT, and SCT were 0.682, 0.772, and 0.821, respectively, and there were no significant differences in FOM values between ASIR-driven ULDCT and SCT (p = 0.11), but the FOM value of FBP-driven ULDCT was significantly lower than that of ASIR-driven ULDCT and SCT (p = 0.01 and 0.00). Adaptive statistical iterative reconstruction-driven ULDCT delivering a radiation dose of only 0.17 mSv offers acceptable sensitivity in nodule detection compared with SCT and has better performance than FBP-driven ULDCT.

  6. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  7. Complex patterns of response to oral hygiene instructions: longitudinal evaluation of periodontal patients.

    Science.gov (United States)

    Amoo-Achampong, Felice; Vitunac, David E; Deeley, Kathleen; Modesto, Adriana; Vieira, Alexandre R

    2018-05-02

    Oral hygiene instruction is a widely practiced intervention, but increased knowledge about oral health does not necessarily have a dramatic impact on oral disease prevalence in populations. We aimed to measure plaque and bleeding in periodontal patients over time to determine patterns of patient response to oral hygiene instructions. Longitudinal plaque and bleeding index data were evaluated in 227 periodontal patients to determine the impact of oral hygiene instructions. Over multiple visits, we determined relative plaque accumulation and gingival bleeding for each patient. Subsequently, we grouped patients into three types of oral hygiene status in response to initial instructions, using the longitudinal data over the period they were treated and followed for their periodontal needs. These patterns of oral hygiene, based on the plaque and gingival bleeding indexes, were evaluated against age, sex, ethnic background, interleukin 1 alpha and beta genotypes, diabetes status, smoking habits, and other concomitant diseases. Chi-square and Fisher's exact tests were used to determine whether any differences between these variables were statistically significant, with alpha set at 0.05. Three patterns in response to oral hygiene instructions emerged: plaque and gingival bleeding indexes improved, worsened, or fluctuated over time in the periodontal patients studied. Of all the confounders considered, only ethnic background showed statistically significant differences; White individuals fluctuated in oral hygiene quality after instructions more often than other ethnic groups. There are different responses to professional oral hygiene instructions, and these responses may be related to ethnicity.
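
    The record's tests of response pattern against patient characteristics reduce to contingency-table tests. A sketch with scipy on an invented pattern-by-ethnicity table (the counts are illustrative, not the study's data):

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency, fisher_exact

    # Hypothetical counts of response patterns (improved / worsened / fluctuated)
    # by ethnic group -- illustrative only
    table = np.array([[40, 25, 60],    # White patients
                      [30, 28, 24]])   # other ethnic groups

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")  # alpha = 0.05

    # Fisher's exact test applies to 2x2 tables, e.g., fluctuated vs not
    table_2x2 = np.array([[60, 65],
                          [24, 58]])
    odds_ratio, p_exact = fisher_exact(table_2x2)
    print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_exact:.4f}")
    ```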

  8. Behavioral investment strategy matters: a statistical arbitrage approach

    OpenAIRE

    Sun, David; Tsai, Shih-Chuan; Wang, Wei

    2011-01-01

    In this study, we employ a statistical arbitrage approach to demonstrate that momentum investment strategies tend to work better over periods longer than six months, a result that differs from findings in the past literature. Compared with standard parametric tests, the statistical arbitrage method shows more clearly that momentum strategies work only with longer formation and holding periods. They also yield significantly positive returns in an up market, but negative yet insignificant returns in a down...

  9. Comparing identified and statistically significant lipids and polar metabolites in 15-year old serum and dried blood spot samples for longitudinal studies: Comparing lipids and metabolites in serum and DBS samples

    Energy Technology Data Exchange (ETDEWEB)

    Kyle, Jennifer E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Casey, Cameron P. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Stratton, Kelly G. [National Security Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zink, Erika M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Kim, Young-Mo [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Zheng, Xueyun [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Monroe, Matthew E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Weitz, Karl K. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Bloodsworth, Kent J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Orton, Daniel J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Ibrahim, Yehia M. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Moore, Ronald J. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Lee, Christine G. [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Research Service, Portland Veterans Affairs Medical Center, Portland OR USA; Pedersen, Catherine [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Orwoll, Eric [Department of Medicine, Bone and Mineral Unit, Oregon Health and Science University, Portland OR USA; Smith, Richard D. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Burnum-Johnson, Kristin E. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA; Baker, Erin S. [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland WA USA

    2017-02-05

    The use of dried blood spots (DBS) has many advantages over traditional plasma and serum samples, such as the smaller blood volume required, storage at room temperature, and the ability to sample in remote locations. However, understanding the robustness of different analytes in DBS samples is essential, especially in older samples collected for longitudinal studies. Here we analyzed DBS samples collected in 2000-2001 and stored at room temperature, and compared them to matched serum samples stored at -80°C, to determine whether they could be effectively used as specific time points in a longitudinal study following metabolic disease. Four hundred small molecules were identified in both the serum and DBS samples using gas chromatography-mass spectrometry (GC-MS), liquid chromatography-MS (LC-MS) and LC-ion mobility spectrometry-MS (LC-IMS-MS). The identified polar metabolites overlapped well between the sample types, though only one statistically significant polar metabolite in a case-control study was conserved, indicating that degradation in the DBS samples affects quantitation. Differences in the lipid identifications indicated that some oxidation occurs in the DBS samples. However, thirty-six statistically significant lipids correlated in both sample types, indicating that lipid quantitation was more stable across the sample types.

  10. Security of statistical data bases: invasion of privacy through attribute correlational modeling

    Energy Technology Data Exchange (ETDEWEB)

    Palley, M.A.

    1985-01-01

    This study develops, defines, and applies a statistical technique for the compromise of confidential information in a statistical database. Attribute Correlational Modeling (ACM) recognizes that the information contained in a statistical database represents real-world statistical phenomena. As such, ACM assumes correlational behavior among the database attributes. ACM proceeds to compromise confidential information through the creation of a regression model in which the confidential attribute is treated as the dependent variable. The typical statistical database may preclude the direct application of regression. In this scenario, the research introduces the notion of a synthetic database, created through legitimate queries of the actual database and through proportional random variation of the responses to these queries. The synthetic database is constructed to resemble the actual database as closely as possible in a statistical sense. ACM then applies regression analysis to the synthetic database and uses the derived model to estimate confidential information in the actual database.
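
    The compromise procedure lends itself to a compact illustration. A minimal sketch, with invented attribute names and data, of regressing a confidential attribute on a proportionally perturbed "synthetic" table:

        # Sketch of Attribute Correlational Modeling: build a synthetic table from
        # query responses perturbed by proportional random variation, then regress
        # the confidential attribute on the public ones. Names and data are invented.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        n = 500
        age = rng.uniform(25, 65, n)                       # public attribute
        tenure = age - 25 + rng.normal(0, 3, n)            # public attribute
        salary = 20_000 + 900 * age + 400 * tenure + rng.normal(0, 5_000, n)  # confidential

        # proportional random variation, as applied to legitimate query responses
        perturb = lambda x: x * rng.normal(1.0, 0.05, n)
        X_syn = np.column_stack([perturb(age), perturb(tenure)])
        y_syn = perturb(salary)

        model = LinearRegression().fit(X_syn, y_syn)       # the correlational model
        print(model.predict([[45.0, 20.0]]))               # inferred confidential salary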

  11. Dose-Response Calculator for ArcGIS

    Science.gov (United States)

    Hanser, Steven E.; Aldridge, Cameron L.; Leu, Matthias; Nielsen, Scott E.

    2011-01-01

    The Dose-Response Calculator for ArcGIS is a tool that extends the Environmental Systems Research Institute (ESRI) ArcGIS 10 Desktop application to aid with the visualization of relationships between two raster GIS datasets. A dose-response curve is a line graph commonly used in medical research to examine the effects of different dosage rates of a drug or chemical (for example, carcinogen) on an outcome of interest (for example, cell mutations) (Russell and others, 1982). Dose-response curves have recently been used in ecological studies to examine the influence of an explanatory dose variable (for example, percentage of habitat cover, distance to disturbance) on a predicted response (for example, survival, probability of occurrence, abundance) (Aldridge and others, 2008). These dose curves have been created by calculating the predicted response value from a statistical model at different levels of the explanatory dose variable while holding values of other explanatory variables constant. Curves (plots) developed using the Dose-Response Calculator overcome the need to hold variables constant by using values extracted from the predicted response surface of a spatially explicit statistical model fit in a GIS, which include the variation of all explanatory variables, to visualize the univariate response to the dose variable. Application of the Dose-Response Calculator can be extended beyond the assessment of statistical model predictions and may be used to visualize the relationship between any two raster GIS datasets (see example in tool instructions). This tool generates tabular data for use in further exploration of dose-response relationships and a graph of the dose-response curve.
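
    Conceptually, the calculator pairs cell values from two co-registered rasters, bins them by the dose values, and plots the mean response per bin. A minimal sketch with NumPy arrays standing in for the GIS rasters; the variable names and the logistic response surface are invented:

        # Pair the cells of a "dose" raster with a "response" raster, bin by dose,
        # and plot the mean response per bin (arrays stand in for GIS rasters).
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(2)
        dose = rng.uniform(0, 100, size=(200, 200))          # e.g. % habitat cover
        response = 1 / (1 + np.exp(-(dose - 50) / 10))       # e.g. predicted occurrence
        response += rng.normal(0, 0.05, dose.shape)          # spatial noise

        bins = np.linspace(0, 100, 21)
        idx = np.digitize(dose.ravel(), bins)
        means = [response.ravel()[idx == i].mean() for i in range(1, len(bins))]

        plt.plot(bins[:-1] + 2.5, means, marker="o")         # bin centers, width 5
        plt.xlabel("dose (% habitat cover)")
        plt.ylabel("mean predicted response")
        plt.title("Dose-response curve from paired rasters")
        plt.show()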

  12. Bose and his statistics

    International Nuclear Information System (INIS)

    Venkataraman, G.

    1992-01-01

    Treating the radiation gas as a classical gas, Einstein derived Planck's law of radiation by considering the dynamic equilibrium between atoms and radiation. Dissatisfied with this treatment, S.N. Bose derived Planck's law in another, original way. He treated the problem in full generality: he counted how many cells were available for the photon gas in phase space and distributed the photons into these cells. In this manner of distribution there were three radically new ideas: the indistinguishability of particles, the spin of the photon (with only two possible orientations) and the non-conservation of photon number. This gave rise to a new discipline of quantum statistical mechanics. The physics underlying Bose's discovery, its significance and its role in the development of the concept of the ideal gas, the spin-statistics theorem and the physics of particles with spin are described. The book has been written in simple, direct language and in an informal style, aiming to stimulate the curiosity of the reader. (M.G.B.)
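
    For reference, the counting at the heart of Bose's derivation can be reconstructed as follows (a standard textbook form, not quoted from the book). Distributing N_s indistinguishable photons among the g_s phase-space cells of the s-th frequency interval gives

        % number of microstates for cell occupancies {N_s}
        W = \prod_s \frac{(N_s + g_s - 1)!}{N_s!\,(g_s - 1)!}

        % maximizing \ln W at fixed total energy, with no constraint on the
        % photon number (photons are not conserved), yields Planck's law
        \bar{n}(\nu) = \frac{1}{e^{h\nu / k_B T} - 1}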

  13. Statistical analysis of surface roughness in turning based on cutting parameters and tool vibrations with response surface methodology (RSM)

    Science.gov (United States)

    Touati, Soufiane; Mekhilef, Slimane

    2018-03-01

    In this paper, we present an experimental study of the effect of cutting conditions and tool vibration on surface roughness in finish turning of 32CrMoV12-28 steel with a YT15 carbide cutting tool. For this purpose, a linear quadratic model with interactions is developed, relating surface roughness (Ra, Rz) to combinations of cutting parameters (cutting speed, feed rate, depth of cut) and tool vibration in the radial (Vy) and tangential (Vz) cutting force directions. To quantify the interactions between cutting parameters and tool vibration, multiple linear regression and response surface methodology are adopted. Application of this statistical technique to predicting surface roughness shows that feed rate is the dominant factor, followed by cutting speed, while depth of cut and tool vibrations have secondary effects. The presented models are of practical interest because they can be used to optimize the cutting process.
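
    A second-order fit of this kind is straightforward to reproduce. A minimal sketch with two of the factors and hypothetical measurements; the factor ranges and coefficients are invented, chosen so that feed rate dominates, as the study reports:

        # Quadratic response-surface fit of roughness Ra on feed rate f and
        # cutting speed v (hypothetical data).
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 30
        df = pd.DataFrame({
            "f": rng.uniform(0.08, 0.30, n),   # feed rate, mm/rev
            "v": rng.uniform(100, 350, n),     # cutting speed, m/min
        })
        # synthetic roughness dominated by feed rate
        df["Ra"] = 0.5 + 8.0 * df.f + 0.002 * df.v + 15.0 * df.f**2 + rng.normal(0, 0.05, n)

        # second-order model with interaction, in the spirit of RSM
        model = smf.ols("Ra ~ f + v + I(f**2) + I(v**2) + f:v", data=df).fit()
        print(model.summary().tables[1])   # coefficients and p-values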

  14. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical trend analysis and probabilistic risk assessment (PRA) are discussed in terms of a categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
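
    One common way to test trend significance in rare-event counts is Poisson regression on time; this is a generic sketch, not the paper's specific methodology, and the yearly failure counts are hypothetical:

        # Test for a trend in rare failure counts with Poisson regression.
        import numpy as np
        import statsmodels.api as sm

        years = np.arange(1970, 1983)
        failures = np.array([4, 3, 5, 2, 3, 2, 1, 2, 1, 0, 1, 1, 0])  # rare events/yr

        X = sm.add_constant(years - years[0])
        fit = sm.GLM(failures, X, family=sm.families.Poisson()).fit()
        print(f"log-rate trend {fit.params[1]:+.3f}/yr, p = {fit.pvalues[1]:.3f}")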

  15. Statistical distribution for generalized ideal gas of fractional-statistics particles

    International Nuclear Information System (INIS)

    Wu, Y.

    1994-01-01

    We derive the occupation-number distribution in a generalized ideal gas of particles obeying fractional statistics, including mutual statistics, by adopting a state-counting definition. When there is no mutual statistics, the statistical distribution interpolates between bosons and fermions, and respects a fractional exclusion principle (except for bosons). Anyons in a strong magnetic field at low temperatures constitute such a physical system. Applications to the thermodynamic properties of quasiparticle excitations in the Laughlin quantum Hall fluid are discussed
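
    The resulting distribution is usually quoted in the following form (the standard statement of this result in the literature, beyond what the abstract itself gives; g is the exclusion-statistics parameter):

        % average occupation of a state with energy \epsilon_i
        \bar{n}_i = \frac{1}{w\!\left(e^{(\epsilon_i - \mu)/k_B T}\right) + g},
        \qquad w(x)^{g}\,[1 + w(x)]^{1 - g} = x

        % g = 0 gives 1 + w = x, the Bose-Einstein distribution;
        % g = 1 gives w = x, the Fermi-Dirac distribution.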

  16. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates …

  17. Social responsibility in e-commerce: Reflection on customer's satisfaction and loyalty in internet promotion of tourist services

    Directory of Open Access Journals (Sweden)

    Marić Tijana

    2015-01-01

    The aim of this paper is to determine, on the basis of theoretical and empirical research, whether and to what extent the application of the social responsibility concept in e-commerce and marketing affects the satisfaction and loyalty of customers opting for online purchases. The main task is to use the case of global online buying and selling of tourist services to test the hypothesis that there is a statistically significant correlation between the concept of social responsibility and customer satisfaction and loyalty. Empirical research was conducted on a sample of 409 respondents from selected countries: Serbia, Turkey, Egypt, Italy and Spain. The contingency coefficient and Pearson's correlation coefficient were used to interpret the results and determine the degree of correlation between these variables. The results indicated a statistically significant correlation between the concept of social responsibility and customer satisfaction and loyalty. These results served as a basis for proposing the measures necessary for defining a model of corporate social responsibility in e-commerce that would be generally binding for all online advertisers. Suggestions for future research are provided in the paper.

  18. Are studies reporting significant results more likely to be published?

    Science.gov (United States)

    Koletsi, Despina; Karagianni, Anthi; Pandis, Nikolaos; Makou, Margarita; Polychronopoulou, Argy; Eliades, Theodore

    2009-11-01

    Our objective was to assess the hypothesis that the proportion of articles reporting a significant effect varies, with a higher percentage of such articles published in journals with impact factors. The contents of 5 orthodontic journals (American Journal of Orthodontics and Dentofacial Orthopedics, Angle Orthodontist, European Journal of Orthodontics, Journal of Orthodontics, and Orthodontics and Craniofacial Research), published between 2004 and 2008, were hand-searched. Articles with statistical analysis of data were included in the study and classified into 4 categories: behavior and psychology, biomaterials and biomechanics, diagnostic procedures and treatment, and craniofacial growth, morphology, and genetics. In total, 2622 articles were examined, with 1785 included in the analysis. Univariate and multivariate logistic regression analyses were applied with statistical significance as the dependent variable and the journal's impact-factor status, the subject category, and the year as independent predictors. A higher percentage of articles showed significant results relative to those without significant associations (on average, 88% vs 12%). Overall, these journals published significantly more studies with significant results, ranging from 75% to 90% (P = 0.02). Multivariate modeling showed that journals with impact factors had a 100% increased probability of publishing a statistically significant result compared with journals with no impact factor (odds ratio [OR], 1.99; 95% CI, 1.19-3.31). Compared with articles on biomaterials and biomechanics, all other subject categories showed lower probabilities of significant results. Nonsignificant findings in behavior and psychology and in diagnosis and treatment were 1.8 (OR, 1.75; 95% CI, 1.51-2.67) and 3.5 (OR, 3.50; 95% CI, 2.27-5.37) times more likely to be published, respectively. Journals seem to prefer reporting significant results; this might be because of authors…
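
    The headline odds ratio is a standard logistic-regression quantity. A minimal sketch of how such an OR and its confidence interval are obtained, using hypothetical counts rather than the study's data:

        # Model the probability that an article reports a significant result as a
        # function of whether the journal has an impact factor (invented counts).
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.DataFrame({
            "significant":   [1] * 450 + [0] * 50 + [1] * 400 + [0] * 100,
            "impact_factor": [1] * 500 + [0] * 500,
        })
        fit = smf.logit("significant ~ impact_factor", data=df).fit(disp=0)
        or_if = np.exp(fit.params["impact_factor"])
        ci = np.exp(fit.conf_int().loc["impact_factor"])
        print(f"OR = {or_if:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")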

  19. The Relationship Between Radiative Forcing and Temperature. What Do Statistical Analyses of the Instrumental Temperature Record Measure?

    International Nuclear Information System (INIS)

    Kaufmann, R.K.; Kauppi, H.; Stock, J.H.

    2006-01-01

    Comparing statistical estimates of the long-run temperature effect of doubled CO2 with those generated by climate models begs the question: is the long-run temperature effect of doubled CO2 estimated from the instrumental temperature record using statistical techniques consistent with the transient climate response, the equilibrium climate sensitivity, or the effective climate sensitivity? Here we attempt to answer the question of what statistical analyses of the observational record measure by using these same statistical techniques to estimate the temperature effect of a doubling in the atmospheric concentration of carbon dioxide from seventeen simulations run for the Coupled Model Intercomparison Project 2 (CMIP2). The results indicate that the temperature effect estimated by the statistical methodology is consistent with the transient climate response, and that this consistency is relatively unaffected by sample size or by the increase in radiative forcing in the sample.
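
    In its simplest form, the statistical estimate at issue scales a fitted temperature-forcing coefficient to the forcing of a CO2 doubling. A deliberately simplified sketch on synthetic data; the paper's actual estimation machinery is not reproduced, and the forcing value for doubled CO2 is an assumed approximation:

        # Regress temperature anomalies on radiative forcing, then scale the
        # slope to the forcing of a CO2 doubling (synthetic data).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n_years = 150
        forcing = np.linspace(0.0, 2.5, n_years) + rng.normal(0, 0.1, n_years)  # W m^-2
        temp = 0.5 * forcing + rng.normal(0, 0.1, n_years)                      # deg C anomaly

        fit = sm.OLS(temp, sm.add_constant(forcing)).fit()
        F_2x = 3.7  # approximate radiative forcing of doubled CO2, W m^-2 (assumed)
        print(f"implied warming per CO2 doubling: {fit.params[1] * F_2x:.2f} deg C")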

  20. Critical analysis of adsorption data statistically

    Science.gov (United States)

    Kaushal, Achla; Singh, S. K.

    2017-10-01

    Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test and chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of the calculated and tabulated values of t and χ² showed the results to be in favour of the data collected from the experiment, and this has been shown on probability charts. The K value obtained for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725, both for mango leaf powder.
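
    The tests named here are one-liners in SciPy. A minimal sketch with invented measurements; the dose levels, removal percentages and target pH are assumptions, not the study's data:

        # Paired t test comparing zinc removal at two adsorbent doses, and a
        # one-sample t test against a target pH (hypothetical measurements).
        import numpy as np
        from scipy import stats

        removal_dose1 = np.array([71.2, 69.8, 72.5, 70.1, 71.9])  # % removal, dose 1
        removal_dose2 = np.array([78.4, 77.1, 79.0, 76.8, 78.2])  # % removal, dose 2

        t, p = stats.ttest_rel(removal_dose1, removal_dose2)
        print(f"paired t = {t:.2f}, p = {p:.4f}")      # small p -> dose effect is real

        ph_measured = np.array([5.9, 6.1, 6.0, 5.8, 6.2])
        t, p = stats.ttest_1samp(ph_measured, popmean=6.0)
        print(f"one-sample t = {t:.2f}, p = {p:.4f}")  # large p -> pH near the optimum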

  1. Can Money Buy Happiness? A Statistical Analysis of Predictors for User Satisfaction

    Science.gov (United States)

    Hunter, Ben; Perret, Robert

    2011-01-01

    2007 data from LibQUAL+[TM] and the ACRL Library Trends and Statistics database were analyzed to determine if there is a statistically significant correlation between library expenditures and usage statistics and library patron satisfaction across 73 universities. The results show that users of larger, better funded libraries have higher…

  2. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  3. Statistics for Research

    CERN Document Server

    Dowdy, Shirley; Chilko, Daniel

    2011-01-01

    Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f

  4. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  5. Application of statistical experimental design to study the formulation variables influencing the coating process of lidocaine liposomes.

    Science.gov (United States)

    González-Rodríguez, M L; Barros, L B; Palma, J; González-Rodríguez, P L; Rabasco, A M

    2007-06-07

    In this paper, we have used statistical experimental design to investigate the effect of several factors on the coating of lidocaine hydrochloride (LID) liposomes with a biodegradable polymer (chitosan, CH). These variables were the concentration of the CH coating solution, the dripping rate of this solution onto the liposome colloidal dispersion, the stirring rate, the time from liposome production to liposome coating and, finally, the amount of drug entrapped in the liposomes. The selected response variables were drug encapsulation efficiency (EE, %), coating efficiency (CE, %) and zeta potential. Liposomes were obtained by the thin-layer evaporation method. They were subsequently coated with CH according to the experimental plan provided by a fractional factorial 2^(5-1) screening matrix. We used spectroscopic methods to determine the zeta potential values. The EE (%) assay was carried out in dialysis bags, and the brilliant red probe was used to determine CE (%) owing to its property of forming molecular complexes with CH. Graphic analysis of the effects allowed identification of the main formulation and technological factors from the selected responses and determination of the proper levels of these factors for response improvement. Moreover, the fractional design allowed the interactions between factors to be quantified, which will be considered in subsequent experiments. The results obtained pointed out that the LID amount was the predominant factor increasing the drug entrapment capacity (EE). The CE (%) response was mainly affected by the concentration of the CH solution and the stirring rate, although all the interactions between the main factors had statistical significance.
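
    A 2^(5-1) half-fraction design is easy to construct and analyze. A minimal sketch with the five factors named in the abstract and an invented response; the effect sizes are assumptions, chosen so that the drug amount dominates, as the study found:

        # Build a 2^(5-1) fractional factorial screening design and estimate
        # main effects for encapsulation efficiency EE (hypothetical response).
        import itertools
        import numpy as np

        factors = ["CH_conc", "drip_rate", "stir_rate", "wait_time", "LID_amount"]

        # half fraction via the generator E = ABCD: 2^4 full factorial in the
        # first four factors, fifth column = product of the other four
        base = np.array(list(itertools.product([-1, 1], repeat=4)))
        design = np.column_stack([base, base.prod(axis=1)])   # 16 runs x 5 factors

        rng = np.random.default_rng(5)
        ee = 60 + 8 * design[:, 4] + 2 * design[:, 0] + rng.normal(0, 1, 16)  # EE, %

        # main effect of each factor = mean(response at +1) - mean(response at -1)
        for name, col in zip(factors, design.T):
            print(f"{name:10s} effect = {ee[col == 1].mean() - ee[col == -1].mean():+.2f}")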

  6. Changing response of the North Atlantic/European winter climate to the 11 year solar cycle

    Science.gov (United States)

    Ma, Hedi; Chen, Haishan; Gray, Lesley; Zhou, Liming; Li, Xing; Wang, Ruili; Zhu, Siguang

    2018-03-01

    Recent studies have presented conflicting results regarding the 11 year solar cycle (SC) influences on winter climate over the North Atlantic/European region. Analyses of only the most recent decades suggest a synchronized North Atlantic Oscillation (NAO)-like response pattern to the SC. Analyses of long-term climate data sets dating back to the late 19th century, however, suggest a mean sea level pressure (mslp) response that lags the SC by 2-4 years in the southern node of the NAO (i.e. Azores region). To understand the conflicting nature and cause of these time dependencies in the SC surface response, the present study employs a lead/lag multi-linear regression technique with a sliding window of 44 years over the period 1751-2016. Results confirm previous analyses, in which the average response for the whole time period features a statistically significant 2-4 year lagged mslp response centered over the Azores region. Overall, the lagged nature of Azores mslp response is generally consistent in time. Stronger and statistically significant SC signals tend to appear in the periods when the SC forcing amplitudes are relatively larger. Individual month analysis indicates the consistent lagged response in December-January-February average arises primarily from early winter months (i.e. December and January), which has been associated with ocean feedback processes that involve reinforcement by anomalies from the previous winter. Additional analysis suggests that the synchronous NAO-like response in recent decades arises primarily from late winter (February), possibly reflecting a result of strong internal noise.

  7. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    Science.gov (United States)

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics; that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
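
    A minimal sketch of the kind of sequence used here: symbols generated from a first-order Markov process, with a "maximizing" predictor scored against the true transition probabilities. The transition matrix is invented:

        # Generate a context-based symbol sequence and score the maximizing strategy.
        import numpy as np

        rng = np.random.default_rng(6)
        T = np.array([[0.1, 0.6, 0.2, 0.1],   # P(next symbol | current = A)
                      [0.5, 0.1, 0.3, 0.1],   # ... current = B
                      [0.2, 0.2, 0.1, 0.5],   # ... current = C
                      [0.4, 0.3, 0.2, 0.1]])  # ... current = D

        seq = [0]
        for _ in range(999):
            seq.append(rng.choice(4, p=T[seq[-1]]))

        # "maximizing": always predict the most probable next symbol
        hits = sum(T[seq[t]].argmax() == seq[t + 1] for t in range(len(seq) - 1))
        print(f"maximizing accuracy: {hits / (len(seq) - 1):.2%}")
        # "matching" would sample predictions from T[seq[t]] and score lower on average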

  8. Statistical methods for analysing responses of wildlife to human disturbance.

    Science.gov (United States)

    Haiganoush K. Preisler; Alan A. Ager; Michael J. Wisdom

    2006-01-01

    1. Off-road recreation is increasing rapidly in many areas of the world, and effects on wildlife can be highly detrimental. Consequently, we have developed methods for studying wildlife responses to off-road recreation with the use of new technologies that allow frequent and accurate monitoring of human-wildlife interactions. To illustrate these methods, we studied the...

  9. The Effect of "Clickers" on Attendance in an Introductory Statistics Course: An Action Research Study

    Science.gov (United States)

    Amstelveen, Raoul H.

    2013-01-01

    The purpose of this study was to design and implement a Classroom Response System, also known as a "clicker," to increase attendance in introductory statistics courses at an undergraduate university. Since 2010, non-attendance had been prevalent in introductory statistics courses. Moreover, non-attendance created undesirable classrooms…

  10. A Nineteenth Century Statistical Society that Abandoned Statistics

    NARCIS (Netherlands)

    Stamhuis, I.H.

    2007-01-01

    In 1857, a Statistical Society was founded in the Netherlands. Within this society, statistics was considered a systematic, quantitative, and qualitative description of society. In the course of time, the society attracted a wide and diverse membership, although the number of physicians on its rolls

  11. IGESS: a statistical approach to integrating individual-level genotype data and summary statistics in genome-wide association studies.

    Science.gov (United States)

    Dai, Mingwei; Ming, Jingsi; Cai, Mingxuan; Liu, Jin; Yang, Can; Wan, Xiang; Xu, Zongben

    2017-09-15

    Results from genome-wide association studies (GWAS) suggest that a complex phenotype is often affected by many variants with small effects, known as 'polygenicity'. Tens of thousands of samples are often required to ensure the statistical power to identify these variants with small effects. However, it is often the case that a research group can only get approval for access to individual-level genotype data with a limited sample size (e.g. a few hundred or thousand). Meanwhile, summary statistics generated using single-variant-based analysis are becoming publicly available, and the sample sizes associated with these summary-statistics datasets are usually quite large. How to make the most efficient use of existing abundant data resources largely remains an open question. In this study, we propose a statistical approach, IGESS, for increasing the statistical power of identifying risk variants and improving the accuracy of risk prediction by integrating individual-level genotype data and summary statistics. An efficient algorithm based on variational inference is developed to handle the genome-wide analysis. Through comprehensive simulation studies, we demonstrated the advantages of IGESS over methods which take either individual-level data or summary-statistics data as input. We applied IGESS to perform an integrative analysis of Crohn's disease using individual-level data from WTCCC and summary statistics from other studies. IGESS was able to significantly increase the statistical power of identifying risk variants and to improve the risk prediction accuracy from 63.2% (±0.4%) to 69.4% (±0.1%) using about 240 000 variants. The IGESS software is available at https://github.com/daviddaigithub/IGESS . Contact: zbxu@xjtu.edu.cn or xwan@comp.hkbu.edu.hk or eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online.
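
    The intuition behind integrating the two data sources can be shown with the simplest possible case: an inverse-variance weighted combination of two effect estimates for a single variant. This is a stand-in illustration only, not the variational algorithm IGESS implements, and all numbers are invented:

        # Inverse-variance weighted combination of an individual-level estimate
        # with a published summary-statistics estimate (illustrative only).
        import numpy as np

        beta_ind, se_ind = 0.12, 0.08   # estimate from a small individual-level cohort
        beta_sum, se_sum = 0.09, 0.02   # estimate from large published summary statistics

        w = np.array([1 / se_ind**2, 1 / se_sum**2])
        beta = np.average([beta_ind, beta_sum], weights=w)
        se = np.sqrt(1 / w.sum())
        print(f"combined beta = {beta:.3f} +/- {se:.3f}")  # dominated by the larger study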

  12. Hemispheric Asymmetry of Visual Cortical Response by Means of Functional Transcranial Doppler

    Science.gov (United States)

    Roje-Bedeković, Marina; Lovrenčić-Huzjan, Arijana; Bosnar-Puretić, Marijana; Šerić, Vesna; Demarin, Vida

    2012-01-01

    We assessed the visual evoked response and investigated side-to-side differences in mean blood flow velocities (MBFVs) by means of functional transcranial Doppler (fTCD) in 49 right-handed patients with severe internal carotid artery (ICA) stenosis and in 30 healthy volunteers, recording simultaneously from both posterior cerebral arteries (PCAs) with 2 MHz probes, successively in the dark and during white-light stimulation. A statistically significant correlation (P = 0.001) was shown in healthy subjects and in patients (P < 0.05). The correlation with the ipsilateral left PCA was significantly higher than that with the contralateral right PCA (P < 0.05). There is a clear trend towards lateralisation of the visual evoked response in the right PCA. PMID:22135771

  13. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  14. Active Learning with Rationales for Identifying Operationally Significant Anomalies in Aviation

    Science.gov (United States)

    Sharma, Manali; Das, Kamalika; Bilgic, Mustafa; Matthews, Bryan; Nielsen, David Lynn; Oza, Nikunj C.

    2016-01-01

    A major focus of the commercial aviation community is the discovery of unknown safety events in flight operations data. Data-driven unsupervised anomaly detection methods are better at capturing unknown safety events than rule-based methods, which only look for known violations. However, not all statistical anomalies discovered by these unsupervised anomaly detection methods are operationally significant (i.e., represent a safety concern). Subject Matter Experts (SMEs) have to spend significant time reviewing these statistical anomalies individually to identify the few operationally significant ones. In this paper we propose an active learning algorithm that incorporates SME feedback in the form of rationales to build a classifier that can distinguish between uninteresting and operationally significant anomalies. Experimental evaluation on real aviation data shows that our approach improves detection of operationally significant events by as much as 75% compared to the state-of-the-art. The learnt classifier also generalizes well to additional validation data sets.
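
    The core loop of such an approach is pool-based active learning; the rationale mechanism is the paper's own contribution and is not reproduced here. A minimal uncertainty-sampling sketch on synthetic data, with the oracle standing in for SME review:

        # Minimal pool-based active learning with uncertainty sampling.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
        labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])
        pool = [i for i in range(len(X)) if i not in labeled]

        clf = LogisticRegression(max_iter=1000)
        for _ in range(50):                                    # 50 SME queries
            clf.fit(X[labeled], y[labeled])
            probs = clf.predict_proba(X[pool])[:, 1]
            query = pool[int(np.argmin(np.abs(probs - 0.5)))]  # most uncertain case
            labeled.append(query)                              # SME supplies the label
            pool.remove(query)

        print(f"accuracy on the remaining pool: {clf.score(X[pool], y[pool]):.2%}")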

  15. CORPORATE SOCIAL RESPONSIBILITY AND ITS FINANCIAL PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Raluca Miruna Zapciu

    2015-06-01

    The field of corporate social responsibility (CSR) has grown exponentially in the last two decades. There are different views of the role of the firm in society, and disagreement as to whether wealth maximization should be the sole goal of a corporation. Nevertheless, there remains a debate about the legitimacy and value of corporate responses to CSR concerns. This paper examines the effect of CSR on financial performance: whether CSR-related shareholder proposals lead to positive announcement returns and superior accounting performance, and the channels through which companies benefit from CSR. The paper finds that CSR improves employee satisfaction and helps companies cater to customers who are responsive to sustainable practices, and that the adoption of CSR proposals is associated with an increase in labor productivity and sales growth. The results indicate a positive and statistically significant relationship between corporate social responsibility and financial performance, supporting the view that socially responsible corporate performance can be associated with a series of bottom-line benefits.

  16. Prevalence of significant bacteriuria among symptomatic and ...

    African Journals Online (AJOL)

    Data were analyzed using the Statistical Package for Social Sciences (SPSS) version 16.0 (SPSS, Inc., Chicago, Ill). Results: A total of 100 consenting participants were recruited into the study. The mean age was: 23.42 ± 8.31 years and a range of 14‑50 years. Only 9% (9/100) had significant bacteriuria while 44.4% (4/9) ...

  17. How Narrative Focus and a Statistical Map Shape Health Policy Support Among State Legislators.

    Science.gov (United States)

    Niederdeppe, Jeff; Roh, Sungjong; Dreisbach, Caitlin

    2016-01-01

    This study advances theorizing about health policy advocacy by combining narrative focus with a statistical map in messages designed to increase state legislators' support for policies that address obesity by reducing food deserts. Specifically, we examine state legislators' responses to variations in narrative focus (individual vs. community) about the causes of and solutions for food deserts in U.S. communities, and to a statistical map (presence vs. absence) depicting the prevalence of food deserts across the United States. Using a Web-based randomized experiment (N=496), we show that narrative focus and the statistical map interact to produce different patterns of cognitive response and support for policies to reduce the prevalence of food deserts. The presence of a statistical map showing the prevalence of food deserts in the United States appeared to matter only when combined with an individual narrative, offsetting the fact that the individual narrative in isolation produced fewer thoughts consistent with the story's persuasive goal, and more counterarguments in opposition to environmental causes and solutions for obesity, than the other message conditions. The image did not have an impact when combined with a story describing a community at large. Cognitive responses fully mediated message effects on the intended persuasive outcomes. We conclude by discussing the study's contributions to communication theory and practice.

  18. Predicting survey responses: how and why semantics shape survey statistics on organizational behaviour.

    Directory of Open Access Journals (Sweden)

    Jan Ketil Arnulf

    Some disciplines in the social sciences rely heavily on collecting survey responses to detect empirical relationships among variables. We explored whether these relationships are a priori predictable from the semantic properties of the survey items, using language-processing algorithms that are now available as new research methods. The algorithms were used to calculate the semantic similarity among all items in state-of-the-art surveys from organisational behaviour research, covering areas such as transformational leadership, work motivation and work outcomes. This information was used to explain and predict the response patterns of real subjects. Semantic algorithms explained 60-86% of the variance in the response patterns and allowed remarkably precise prediction of survey responses from humans, except in a personality test. Even the relationships between independent variables and their purported dependent variables were accurately predicted. This raises concern about the empirical nature of data collected through some surveys, if the results are already given a priori by the way subjects are asked. Survey response patterns seem heavily determined by semantics, and language algorithms may reveal this prior to administering a survey. This study suggests that semantic algorithms are becoming new tools for the social sciences, opening perspectives on survey responses that prevalent psychometric theory cannot explain.
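
    A toy version of the semantic-similarity step, using TF-IDF cosine similarity as a simple stand-in for the study's language-processing algorithms; the survey items are invented:

        # Score pairwise semantic similarity between survey items.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        items = [
            "My leader inspires me to perform beyond expectations.",
            "My supervisor motivates me to exceed what is expected.",
            "I am satisfied with my current salary.",
        ]
        sim = cosine_similarity(TfidfVectorizer().fit_transform(items))
        print(sim.round(2))  # the two leadership items share more terms, so they score higher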

  19. Understanding and forecasting polar stratospheric variability with statistical models

    Directory of Open Access Journals (Sweden)

    C. Blume

    2012-07-01

    The variability of the north-polar stratospheric vortex is a prominent aspect of the middle atmosphere. This work investigates a wide class of statistical models with respect to their ability to model geopotential and temperature anomalies, representing variability in the polar stratosphere. Four partly nonstationary, nonlinear models are assessed: linear discriminant analysis (LDA); a cluster method based on finite elements (FEM-VARX); a neural network, namely the multi-layer perceptron (MLP); and support vector regression (SVR). These methods model the time series by incorporating all significant external factors simultaneously, including ENSO, the QBO, the solar cycle and volcanic eruptions, and then quantify their statistical importance. We show that variability in reanalysis data from 1980 to 2005 is successfully modeled. The period from 2005 to 2011 can be hindcasted to a certain extent, with the MLP performing significantly better than the remaining models. However, variability remains that cannot be statistically hindcasted within the current framework, such as the unexpected major warming in January 2009. Finally, the statistical model with the best generalization performance is used to predict a winter 2011/12 with warm and weak vortex conditions. A vortex breakdown is predicted for late January or early February 2012.

  20. Statistical Analysis of Clinical Data on a Pocket Calculator, Part 2

    CERN Document Server

    Cleophas, Ton J

    2012-01-01

    The first part of this title contained all the statistical tests relevant to starting clinical investigations, including tests for continuous and binary data, power, sample size, multiple testing, variability, confounding, interaction, and reliability. The current part 2 reviews methods for handling missing data, manipulated data, multiple confounders, predictions beyond observation, uncertainty of diagnostic tests, and the problem of outliers. Also covered are robust tests, non-linear modeling, goodness-of-fit testing, Bhattacharya models, item response modeling, superiority testing, variab…